Oct 11 00:51:44 crc systemd[1]: Starting Kubernetes Kubelet...
Oct 11 00:51:44 crc restorecon[4728]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 11 00:51:44 crc restorecon[4728]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 11 00:51:44 crc restorecon[4728]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc 
restorecon[4728]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 11 00:51:44 crc restorecon[4728]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 11 00:51:44 crc restorecon[4728]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 11 00:51:44 crc restorecon[4728]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 11 00:51:44 crc 
restorecon[4728]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 11 
00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 11 00:51:44 crc restorecon[4728]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 11 00:51:44 crc 
restorecon[4728]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 11 00:51:44 crc restorecon[4728]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 11 00:51:44 crc restorecon[4728]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 11 00:51:44 crc restorecon[4728]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 11 00:51:44 crc 
restorecon[4728]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 11 00:51:44 crc restorecon[4728]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:44
crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 11 00:51:44 crc restorecon[4728]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 11 00:51:44 crc restorecon[4728]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 11 00:51:44 crc restorecon[4728]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 11 00:51:44 crc restorecon[4728]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 11 
00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 11 00:51:44 crc restorecon[4728]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 11 00:51:44 crc restorecon[4728]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 11 00:51:44 crc 
restorecon[4728]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc 
restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 11 00:51:44 crc restorecon[4728]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 11 00:51:44 crc restorecon[4728]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 11 00:51:44 crc restorecon[4728]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 11 00:51:44 crc restorecon[4728]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 11 00:51:44 crc restorecon[4728]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 11 00:51:44 crc restorecon[4728]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 11 00:51:44 crc restorecon[4728]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 
11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:44 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 
crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc 
restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc 
restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc 
restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 11 00:51:45 crc restorecon[4728]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 11 00:51:45 crc 
restorecon[4728]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 11 00:51:45 crc restorecon[4728]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 11 00:51:45 crc restorecon[4728]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 11 00:51:45 crc restorecon[4728]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Oct 11 00:51:45 crc kubenswrapper[4743]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 11 00:51:45 crc kubenswrapper[4743]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Oct 11 00:51:45 crc kubenswrapper[4743]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 11 00:51:45 crc kubenswrapper[4743]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Oct 11 00:51:45 crc kubenswrapper[4743]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Oct 11 00:51:45 crc kubenswrapper[4743]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.796287 4743 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.803557 4743 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.803591 4743 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.803600 4743 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.803610 4743 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.803619 4743 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.803630 4743 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.803640 4743 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.803652 4743 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.803662 4743 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.803671 4743 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.803682 4743 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.803691 4743 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.803701 4743 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.803709 4743 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.803751 4743 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.803760 4743 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.803768 4743 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.803777 4743 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.803786 4743 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.803793 4743 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.803801 4743 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.803809 4743 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 11 
00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.803816 4743 feature_gate.go:330] unrecognized feature gate: Example Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.803824 4743 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.803831 4743 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.803840 4743 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.803847 4743 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.803855 4743 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.803889 4743 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.803897 4743 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.803905 4743 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.803913 4743 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.803922 4743 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.803930 4743 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.803942 4743 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.803953 4743 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.803962 4743 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.803971 4743 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.803980 4743 feature_gate.go:330] unrecognized feature gate: OVNObservability
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.803988 4743 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.803996 4743 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.804004 4743 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.804011 4743 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.804018 4743 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.804030 4743 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.804039 4743 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.804048 4743 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.804065 4743 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.804073 4743 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.804081 4743 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.804089 4743 feature_gate.go:330] unrecognized feature gate: SignatureStores
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.804097 4743 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.804104 4743 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.804112 4743 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.804120 4743 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.804127 4743 feature_gate.go:330] unrecognized feature gate: PinnedImages
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.804135 4743 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.804143 4743 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.804151 4743 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.804158 4743 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.804165 4743 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.804172 4743 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.804180 4743 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.804191 4743 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.804198 4743 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.804205 4743 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.804213 4743 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.804220 4743 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.804227 4743 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.804235 4743 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.804242 4743 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806027 4743 flags.go:64] FLAG: --address="0.0.0.0"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806053 4743 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806082 4743 flags.go:64] FLAG: --anonymous-auth="true"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806094 4743 flags.go:64] FLAG: --application-metrics-count-limit="100"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806107 4743 flags.go:64] FLAG: --authentication-token-webhook="false"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806117 4743 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806137 4743 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806153 4743 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806163 4743 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806172 4743 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806181 4743 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806191 4743 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806212 4743 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806221 4743 flags.go:64] FLAG: --cgroup-root=""
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806229 4743 flags.go:64] FLAG: --cgroups-per-qos="true"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806238 4743 flags.go:64] FLAG: --client-ca-file=""
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806247 4743 flags.go:64] FLAG: --cloud-config=""
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806256 4743 flags.go:64] FLAG: --cloud-provider=""
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806265 4743 flags.go:64] FLAG: --cluster-dns="[]"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806281 4743 flags.go:64] FLAG: --cluster-domain=""
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806290 4743 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806299 4743 flags.go:64] FLAG: --config-dir=""
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806308 4743 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806318 4743 flags.go:64] FLAG: --container-log-max-files="5"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806331 4743 flags.go:64] FLAG: --container-log-max-size="10Mi"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806340 4743 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806350 4743 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806361 4743 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806370 4743 flags.go:64] FLAG: --contention-profiling="false"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806379 4743 flags.go:64] FLAG: --cpu-cfs-quota="true"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806388 4743 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806397 4743 flags.go:64] FLAG: --cpu-manager-policy="none"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806406 4743 flags.go:64] FLAG: --cpu-manager-policy-options=""
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806416 4743 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806425 4743 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806434 4743 flags.go:64] FLAG: --enable-debugging-handlers="true"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806443 4743 flags.go:64] FLAG: --enable-load-reader="false"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806451 4743 flags.go:64] FLAG: --enable-server="true"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806461 4743 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806479 4743 flags.go:64] FLAG: --event-burst="100"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806488 4743 flags.go:64] FLAG: --event-qps="50"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806497 4743 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806506 4743 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806515 4743 flags.go:64] FLAG: --eviction-hard=""
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806528 4743 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806537 4743 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806546 4743 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806555 4743 flags.go:64] FLAG: --eviction-soft=""
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806565 4743 flags.go:64] FLAG: --eviction-soft-grace-period=""
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806574 4743 flags.go:64] FLAG: --exit-on-lock-contention="false"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806583 4743 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806591 4743 flags.go:64] FLAG: --experimental-mounter-path=""
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806600 4743 flags.go:64] FLAG: --fail-cgroupv1="false"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806609 4743 flags.go:64] FLAG: --fail-swap-on="true"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806618 4743 flags.go:64] FLAG: --feature-gates=""
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806629 4743 flags.go:64] FLAG: --file-check-frequency="20s"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806639 4743 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806648 4743 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806658 4743 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806667 4743 flags.go:64] FLAG: --healthz-port="10248"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806676 4743 flags.go:64] FLAG: --help="false"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806686 4743 flags.go:64] FLAG: --hostname-override=""
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806696 4743 flags.go:64] FLAG: --housekeeping-interval="10s"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806705 4743 flags.go:64] FLAG: --http-check-frequency="20s"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806716 4743 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806726 4743 flags.go:64] FLAG: --image-credential-provider-config=""
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806735 4743 flags.go:64] FLAG: --image-gc-high-threshold="85"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806745 4743 flags.go:64] FLAG: --image-gc-low-threshold="80"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806754 4743 flags.go:64] FLAG: --image-service-endpoint=""
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806763 4743 flags.go:64] FLAG: --kernel-memcg-notification="false"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806772 4743 flags.go:64] FLAG: --kube-api-burst="100"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806780 4743 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806790 4743 flags.go:64] FLAG: --kube-api-qps="50"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806798 4743 flags.go:64] FLAG: --kube-reserved=""
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806808 4743 flags.go:64] FLAG: --kube-reserved-cgroup=""
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806816 4743 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806828 4743 flags.go:64] FLAG: --kubelet-cgroups=""
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806837 4743 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806845 4743 flags.go:64] FLAG: --lock-file=""
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806884 4743 flags.go:64] FLAG: --log-cadvisor-usage="false"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806894 4743 flags.go:64] FLAG: --log-flush-frequency="5s"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806903 4743 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806917 4743 flags.go:64] FLAG: --log-json-split-stream="false"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806926 4743 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806935 4743 flags.go:64] FLAG: --log-text-split-stream="false"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806945 4743 flags.go:64] FLAG: --logging-format="text"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806954 4743 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806964 4743 flags.go:64] FLAG: --make-iptables-util-chains="true"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806972 4743 flags.go:64] FLAG: --manifest-url=""
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806981 4743 flags.go:64] FLAG: --manifest-url-header=""
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.806995 4743 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.807005 4743 flags.go:64] FLAG: --max-open-files="1000000"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.807016 4743 flags.go:64] FLAG: --max-pods="110"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.807025 4743 flags.go:64] FLAG: --maximum-dead-containers="-1"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.807035 4743 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.807044 4743 flags.go:64] FLAG: --memory-manager-policy="None"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.807052 4743 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.807061 4743 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.807070 4743 flags.go:64] FLAG: --node-ip="192.168.126.11"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.807079 4743 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.807106 4743 flags.go:64] FLAG: --node-status-max-images="50"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.807116 4743 flags.go:64] FLAG: --node-status-update-frequency="10s"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.807128 4743 flags.go:64] FLAG: --oom-score-adj="-999"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.807139 4743 flags.go:64] FLAG: --pod-cidr=""
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.807149 4743 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.807166 4743 flags.go:64] FLAG: --pod-manifest-path=""
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.807176 4743 flags.go:64] FLAG: --pod-max-pids="-1"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.807187 4743 flags.go:64] FLAG: --pods-per-core="0"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.807198 4743 flags.go:64] FLAG: --port="10250"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.807211 4743 flags.go:64] FLAG: --protect-kernel-defaults="false"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.807223 4743 flags.go:64] FLAG: --provider-id=""
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.807232 4743 flags.go:64] FLAG: --qos-reserved=""
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.807241 4743 flags.go:64] FLAG: --read-only-port="10255"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.807250 4743 flags.go:64] FLAG: --register-node="true"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.807259 4743 flags.go:64] FLAG: --register-schedulable="true"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.807267 4743 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.807283 4743 flags.go:64] FLAG: --registry-burst="10"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.807292 4743 flags.go:64] FLAG: --registry-qps="5"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.807301 4743 flags.go:64] FLAG: --reserved-cpus=""
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.807309 4743 flags.go:64] FLAG: --reserved-memory=""
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.807323 4743 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.807333 4743 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.807342 4743 flags.go:64] FLAG: --rotate-certificates="false"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.807351 4743 flags.go:64] FLAG: --rotate-server-certificates="false"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.807360 4743 flags.go:64] FLAG: --runonce="false"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.807368 4743 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.807378 4743 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.807387 4743 flags.go:64] FLAG: --seccomp-default="false"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.807396 4743 flags.go:64] FLAG: --serialize-image-pulls="true"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.807404 4743 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.807414 4743 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.807423 4743 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.807433 4743 flags.go:64] FLAG: --storage-driver-password="root"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.807441 4743 flags.go:64] FLAG: --storage-driver-secure="false"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.807450 4743 flags.go:64] FLAG: --storage-driver-table="stats"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.807459 4743 flags.go:64] FLAG: --storage-driver-user="root"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.807467 4743 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.807477 4743 flags.go:64] FLAG: --sync-frequency="1m0s"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.807486 4743 flags.go:64] FLAG: --system-cgroups=""
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.807495 4743 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.807508 4743 flags.go:64] FLAG: --system-reserved-cgroup=""
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.807517 4743 flags.go:64] FLAG: --tls-cert-file=""
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.807526 4743 flags.go:64] FLAG: --tls-cipher-suites="[]"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.807550 4743 flags.go:64] FLAG: --tls-min-version=""
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.807559 4743 flags.go:64] FLAG: --tls-private-key-file=""
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.807567 4743 flags.go:64] FLAG: --topology-manager-policy="none"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.807576 4743 flags.go:64] FLAG: --topology-manager-policy-options=""
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.807585 4743 flags.go:64] FLAG: --topology-manager-scope="container"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.807593 4743 flags.go:64] FLAG: --v="2"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.807605 4743 flags.go:64] FLAG: --version="false"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.807616 4743 flags.go:64] FLAG: --vmodule=""
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.807627 4743 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.807636 4743 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.807955 4743 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.807969 4743 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.807978 4743 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.807990 4743 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.808001 4743 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.808009 4743 feature_gate.go:330] unrecognized feature gate: Example
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.808017 4743 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.808025 4743 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.808033 4743 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.808041 4743 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.808049 4743 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.808056 4743 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.808063 4743 feature_gate.go:330] unrecognized feature gate: PinnedImages
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.808071 4743 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.808078 4743 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.808086 4743 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.808093 4743 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.808100 4743 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.808111 4743 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.808121 4743 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.808130 4743 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.808138 4743 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.808145 4743 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.808155 4743 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.808165 4743 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.808174 4743 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.808181 4743 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.808190 4743 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.808199 4743 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.808209 4743 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.808218 4743 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.808227 4743 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.808235 4743 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.808243 4743 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.808254 4743 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.808263 4743 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.808273 4743 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.808281 4743 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.808289 4743 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.808298 4743 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.808306 4743 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.808313 4743 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.808321 4743 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.808328 4743 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.808336 4743 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.808343 4743 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.808351 4743 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.808359 4743 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.808366 4743 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.808374 4743 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.808381 4743 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.808390 4743 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.808398 4743 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.808405 4743 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.808412 4743 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.808419 4743 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.808427 4743 feature_gate.go:330] unrecognized feature gate: OVNObservability
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.808434 4743 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.808442 4743 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.808450 4743 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.808457 4743 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.808464 4743 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.808472 4743 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.808480 4743 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.808487 4743 feature_gate.go:330] unrecognized feature gate: SignatureStores
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.808494 4743 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.808502 4743 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.808510 4743 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.808517 4743 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.808525 4743 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.808532 4743 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.808556 4743 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.824464 4743 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.824532 4743 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.824713 4743 feature_gate.go:330] unrecognized feature gate: PinnedImages
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.824748 4743 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.824761 4743 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.824774 4743 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.824788 4743 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.824800 4743 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.824809 4743 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.824818 4743 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.824827 4743 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.824839 4743 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.824849 4743 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.824897 4743 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.824909 4743 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.824921 4743 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.824932 4743 feature_gate.go:330] unrecognized feature gate: Example
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.824947 4743 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 11 00:51:45 crc
kubenswrapper[4743]: W1011 00:51:45.824958 4743 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.824972 4743 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.824985 4743 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.824998 4743 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.825012 4743 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.825024 4743 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.825036 4743 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.825052 4743 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.825070 4743 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.825084 4743 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.825096 4743 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.825108 4743 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.825123 4743 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.825138 4743 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.825151 4743 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.825165 4743 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.825177 4743 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.825189 4743 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.825199 4743 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.825213 4743 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.825227 4743 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.825240 4743 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.825251 4743 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.825264 4743 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.825274 4743 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.825286 4743 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.825297 4743 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.825309 4743 
feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.825319 4743 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.825330 4743 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.825346 4743 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.825360 4743 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.825373 4743 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.825386 4743 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.825397 4743 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.825409 4743 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.825420 4743 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.825432 4743 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.825443 4743 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.825454 4743 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.825464 4743 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.825475 4743 feature_gate.go:330] unrecognized feature 
gate: MixedCPUsAllocation Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.825486 4743 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.825497 4743 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.825508 4743 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.825518 4743 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.825529 4743 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.825539 4743 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.825548 4743 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.825556 4743 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.825564 4743 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.825573 4743 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.825581 4743 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.825589 4743 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.825598 4743 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.825613 4743 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true 
DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.825982 4743 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.826008 4743 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.826022 4743 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.826035 4743 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.826047 4743 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.826059 4743 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.826070 4743 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.826084 4743 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.826098 4743 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.826109 4743 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.826119 4743 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.826130 4743 feature_gate.go:330] 
unrecognized feature gate: VSphereMultiVCenters Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.826141 4743 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.826152 4743 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.826164 4743 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.826175 4743 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.826186 4743 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.826197 4743 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.826208 4743 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.826219 4743 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.826230 4743 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.826240 4743 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.826252 4743 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.826262 4743 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.826271 4743 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.826279 4743 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 11 00:51:45 crc kubenswrapper[4743]: 
W1011 00:51:45.826288 4743 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.826299 4743 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.826310 4743 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.826321 4743 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.826331 4743 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.826342 4743 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.826353 4743 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.826364 4743 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.826374 4743 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.826390 4743 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.826404 4743 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.826414 4743 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.826424 4743 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.826434 4743 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.826444 4743 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.826453 4743 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.826462 4743 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.826471 4743 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.826483 4743 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.826494 4743 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.826507 4743 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.826517 4743 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.826528 4743 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.826537 4743 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.826547 4743 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.826556 4743 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.826566 4743 feature_gate.go:330] unrecognized feature gate: Example Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.826575 4743 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.826586 4743 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.826672 4743 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.826684 4743 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.826695 4743 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.826705 4743 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.826715 4743 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.826725 4743 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.826734 4743 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.826746 4743 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.826759 4743 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.826772 4743 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.826784 4743 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.826796 4743 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.826807 4743 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.826819 4743 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.826833 4743 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 11 00:51:45 crc kubenswrapper[4743]: W1011 00:51:45.826844 4743 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 
00:51:45.826890 4743 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.827274 4743 server.go:940] "Client rotation is on, will bootstrap in background" Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.833752 4743 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.833939 4743 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.835955 4743 server.go:997] "Starting client certificate rotation" Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.836068 4743 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.837310 4743 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-16 20:32:35.988747833 +0000 UTC Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.837647 4743 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 1603h40m50.151108861s for next certificate rotation Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.872805 4743 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.876455 4743 
dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.900267 4743 log.go:25] "Validated CRI v1 runtime API" Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.940399 4743 log.go:25] "Validated CRI v1 image API" Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.942750 4743 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.948946 4743 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-10-11-00-43-09-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.948994 4743 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.978162 4743 manager.go:217] Machine: {Timestamp:2025-10-11 00:51:45.974421557 +0000 UTC m=+0.627401994 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:407eb137-47d1-41e8-9c72-65f09e76d21a BootID:4a117022-09d2-46e0-826b-22308ec25890 Filesystems:[{Device:/dev/shm DeviceMajor:0 
DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:bf:58:4e Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:bf:58:4e Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:a3:b4:21 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:b4:b7:12 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:4a:02:89 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:7e:41:ec Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:9b:03:d7 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:3a:f2:a8:13:b0:2b Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:32:bd:61:37:81:05 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 
Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data 
Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.979192 4743 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.979554 4743 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.981599 4743 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.982116 4743 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.982175 4743 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.982744 4743 topology_manager.go:138] "Creating topology manager with none policy" Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.982763 4743 container_manager_linux.go:303] "Creating device plugin manager" Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.983511 4743 manager.go:142] "Creating Device Plugin manager" 
path="/var/lib/kubelet/device-plugins/kubelet.sock" Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.983609 4743 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.984018 4743 state_mem.go:36] "Initialized new in-memory state store" Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.984188 4743 server.go:1245] "Using root directory" path="/var/lib/kubelet" Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.988113 4743 kubelet.go:418] "Attempting to sync node with API server" Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.988173 4743 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.988215 4743 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.988249 4743 kubelet.go:324] "Adding apiserver pod source" Oct 11 00:51:45 crc kubenswrapper[4743]: I1011 00:51:45.988335 4743 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.001363 4743 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Oct 11 00:51:46 crc kubenswrapper[4743]: W1011 00:51:46.002121 4743 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.106:6443: connect: connection refused Oct 11 00:51:46 crc kubenswrapper[4743]: E1011 00:51:46.002295 4743 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get 
\"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.106:6443: connect: connection refused" logger="UnhandledError" Oct 11 00:51:46 crc kubenswrapper[4743]: W1011 00:51:46.002426 4743 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.106:6443: connect: connection refused Oct 11 00:51:46 crc kubenswrapper[4743]: E1011 00:51:46.002533 4743 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.106:6443: connect: connection refused" logger="UnhandledError" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.003097 4743 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.005061 4743 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.006946 4743 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.006992 4743 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.007007 4743 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.007020 4743 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.007072 4743 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.007088 4743 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.007103 4743 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.007126 4743 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.007145 4743 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.007160 4743 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.007178 4743 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.007192 4743 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.008814 4743 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/csi" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.009722 4743 server.go:1280] "Started kubelet" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.010729 4743 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.010926 4743 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.011384 4743 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.106:6443: connect: connection refused Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.011568 4743 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Oct 11 00:51:46 crc systemd[1]: Started Kubernetes Kubelet. Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.016884 4743 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.016952 4743 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.017069 4743 volume_manager.go:287] "The desired_state_of_world populator starts" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.017096 4743 volume_manager.go:289] "Starting Kubelet Volume Manager" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.017483 4743 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 20:30:15.213037735 +0000 UTC Oct 11 00:51:46 crc kubenswrapper[4743]: W1011 00:51:46.017961 4743 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.106:6443: connect: connection refused Oct 11 00:51:46 crc kubenswrapper[4743]: E1011 00:51:46.018155 4743 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.106:6443: connect: connection refused" logger="UnhandledError" Oct 11 00:51:46 crc kubenswrapper[4743]: E1011 00:51:46.017189 4743 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.017897 4743 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.018099 4743 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 1003h38m29.194948519s for next certificate rotation Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.019247 4743 server.go:460] "Adding debug handlers to kubelet server" Oct 11 00:51:46 crc kubenswrapper[4743]: E1011 00:51:46.020069 4743 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.106:6443: connect: connection refused" interval="200ms" Oct 11 00:51:46 crc kubenswrapper[4743]: E1011 00:51:46.023040 4743 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.106:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186d498e8f468534 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-11 00:51:46.009679156 +0000 UTC m=+0.662659593,LastTimestamp:2025-10-11 00:51:46.009679156 +0000 UTC m=+0.662659593,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.026513 4743 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.026748 4743 factory.go:55] Registering systemd factory Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.026998 4743 factory.go:221] Registration of the systemd container factory successfully Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.027836 4743 factory.go:153] Registering CRI-O factory Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.027904 4743 factory.go:221] Registration of the crio container factory successfully Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.027946 4743 factory.go:103] Registering Raw factory Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.027976 4743 manager.go:1196] Started watching for new ooms in manager Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.029015 4743 manager.go:319] Starting recovery of all containers Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.041282 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Oct 11 00:51:46 crc 
kubenswrapper[4743]: I1011 00:51:46.041369 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.041392 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.041414 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.041439 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.041459 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.041480 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.041501 4743 reconstruct.go:130] "Volume 
is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.041527 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.041584 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.041604 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.041626 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.041646 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.041669 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.041690 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.041712 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.041731 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.041751 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.041770 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.041790 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.041812 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.041831 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.041852 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.041902 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.041922 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.041943 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" 
seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.041997 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.042060 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.042082 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.042104 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.042122 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.042178 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 
00:51:46.042203 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.042223 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.042243 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.042267 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.042288 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.042306 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.042327 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.042437 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.042463 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.042482 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.042504 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.042524 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.042543 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.042564 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.042584 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.042603 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.042623 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.042644 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.042663 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" 
volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.042687 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.044623 4743 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.044674 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.044698 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.044722 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.044744 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.044764 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.044782 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.044800 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.044821 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.044842 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.044889 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" 
volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.044910 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.044929 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.044948 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.044966 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.044987 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.045008 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" 
volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.045038 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.045057 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.045076 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.045095 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.045115 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.045133 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" 
volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.045152 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.045171 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.045192 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.045213 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.045238 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.045260 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" 
seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.045297 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.045320 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.045344 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.045365 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.045386 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.045405 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Oct 11 00:51:46 crc 
kubenswrapper[4743]: I1011 00:51:46.045426 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.045448 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.045467 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.045487 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.045508 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.045529 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.045550 4743 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.045570 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.045591 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.045610 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.045629 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.045649 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.045670 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.045691 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.045709 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.045731 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.045754 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.045774 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.045803 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" 
volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.045824 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.045847 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.045898 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.045919 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.045943 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.045965 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.045991 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.046014 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.046035 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.046058 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.046079 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.046108 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" 
volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.046128 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.046149 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.046170 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.046225 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.046244 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.046264 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" 
volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.046284 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.046330 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.046352 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.046373 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.046393 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.046411 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" 
volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.046431 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.046452 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.046470 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.046489 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.046509 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.046527 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.046547 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.046567 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.046585 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.046607 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.046629 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.046649 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" 
seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.046668 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.046688 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.046708 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.046726 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.046747 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.046773 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.046793 4743 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.046812 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.046831 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.046852 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.046897 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.046916 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.046937 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.046957 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.046976 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.046995 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.047015 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.047035 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.047056 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" 
volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.047077 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.047095 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.047115 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.047133 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.047152 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.047171 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" 
seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.047191 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.047210 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.047231 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.047250 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.047269 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.047288 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 
00:51:46.047307 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.047326 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.047349 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.047368 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.047391 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.047417 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.047448 4743 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.047475 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.047499 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.047522 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.047540 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.047576 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.047595 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" 
volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.047613 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.047632 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.047652 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.047670 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.047691 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.047710 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.047728 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.047748 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.047767 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.047787 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.047808 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.047826 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.047845 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.047932 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.047954 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.047994 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.048011 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.048030 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" 
volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.048047 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.048065 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.048083 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.048102 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.048121 4743 reconstruct.go:97] "Volume reconstruction finished" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.048135 4743 reconciler.go:26] "Reconciler: start to sync state" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.069550 4743 manager.go:324] Recovery completed Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.085948 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.086639 4743 
kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.088943 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.088993 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.089009 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.089833 4743 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.090045 4743 status_manager.go:217] "Starting to sync pod status with apiserver" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.090244 4743 kubelet.go:2335] "Starting kubelet main sync loop" Oct 11 00:51:46 crc kubenswrapper[4743]: E1011 00:51:46.090529 4743 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Oct 11 00:51:46 crc kubenswrapper[4743]: W1011 00:51:46.092395 4743 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.106:6443: connect: connection refused Oct 11 00:51:46 crc kubenswrapper[4743]: E1011 00:51:46.093286 4743 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.106:6443: connect: connection refused" logger="UnhandledError" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 
00:51:46.093391 4743 cpu_manager.go:225] "Starting CPU manager" policy="none" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.093635 4743 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.093766 4743 state_mem.go:36] "Initialized new in-memory state store" Oct 11 00:51:46 crc kubenswrapper[4743]: E1011 00:51:46.119180 4743 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.119643 4743 policy_none.go:49] "None policy: Start" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.121145 4743 memory_manager.go:170] "Starting memorymanager" policy="None" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.121184 4743 state_mem.go:35] "Initializing new in-memory state store" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.189385 4743 manager.go:334] "Starting Device Plugin manager" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.189503 4743 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.189559 4743 server.go:79] "Starting device plugin registration server" Oct 11 00:51:46 crc kubenswrapper[4743]: E1011 00:51:46.191148 4743 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.191333 4743 eviction_manager.go:189] "Eviction manager: starting control loop" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.191379 4743 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.191789 4743 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 
00:51:46.191996 4743 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.192024 4743 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Oct 11 00:51:46 crc kubenswrapper[4743]: E1011 00:51:46.205270 4743 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 11 00:51:46 crc kubenswrapper[4743]: E1011 00:51:46.221497 4743 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.106:6443: connect: connection refused" interval="400ms" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.292026 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.293759 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.293813 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.293833 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.293906 4743 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 11 00:51:46 crc kubenswrapper[4743]: E1011 00:51:46.294625 4743 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.106:6443: connect: connection refused" node="crc" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.392115 4743 kubelet.go:2421] "SyncLoop ADD" source="file" 
pods=["openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc"] Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.392370 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.394710 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.394782 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.394802 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.395078 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.395299 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.395374 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.396557 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.396601 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.396619 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.396694 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.396736 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.396759 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.396803 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.397024 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.397088 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.398176 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.398226 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.398246 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.398401 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.398449 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.398480 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.398498 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.398526 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.398564 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.399928 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.399981 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.399999 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.399932 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.400109 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.400128 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.400312 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.400504 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.400570 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.401653 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.401712 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.401737 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.402141 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.402217 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.402252 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.402292 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.402312 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.403606 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.403714 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 
00:51:46.403735 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.458285 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.458397 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.458472 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.458636 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.458697 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " 
pod="openshift-etcd/etcd-crc" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.458811 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.458913 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.458962 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.459020 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.459063 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 11 
00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.459096 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.459146 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.459200 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.459295 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.459416 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.495014 4743 kubelet_node_status.go:401] "Setting 
node annotation to enable volume controller attach/detach" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.497117 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.497267 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.497287 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.497328 4743 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 11 00:51:46 crc kubenswrapper[4743]: E1011 00:51:46.498126 4743 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.106:6443: connect: connection refused" node="crc" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.561183 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.561245 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.561286 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: 
\"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.561322 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.561356 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.561389 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.561419 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.561449 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 
00:51:46.561488 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.561552 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.562012 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.562171 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.561482 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.562299 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: 
\"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.562386 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.562478 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.562569 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.562597 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.562602 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 
00:51:46.562596 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.562767 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.562903 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.562940 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.563038 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.563043 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.563093 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.563137 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.563223 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.563152 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.561548 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 11 00:51:46 crc 
kubenswrapper[4743]: E1011 00:51:46.623314 4743 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.106:6443: connect: connection refused" interval="800ms" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.735204 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.751786 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.782081 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 11 00:51:46 crc kubenswrapper[4743]: W1011 00:51:46.803052 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-eaaf9ecb0ae2166cd6cba73f8b0c06167dfad4630557a5ca87d76dfb64efbf90 WatchSource:0}: Error finding container eaaf9ecb0ae2166cd6cba73f8b0c06167dfad4630557a5ca87d76dfb64efbf90: Status 404 returned error can't find the container with id eaaf9ecb0ae2166cd6cba73f8b0c06167dfad4630557a5ca87d76dfb64efbf90 Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.804364 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 11 00:51:46 crc kubenswrapper[4743]: W1011 00:51:46.804729 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-180d0feaac03fc77b1840c7c96f1dface63eefc8831e37b90e81f9bfc15be82e WatchSource:0}: Error finding container 180d0feaac03fc77b1840c7c96f1dface63eefc8831e37b90e81f9bfc15be82e: Status 404 returned error can't find the container with id 180d0feaac03fc77b1840c7c96f1dface63eefc8831e37b90e81f9bfc15be82e Oct 11 00:51:46 crc kubenswrapper[4743]: W1011 00:51:46.815140 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-400de0f082d20bb9db6d8caf82fea8c310d449d24d4ff1cac803770e7931966a WatchSource:0}: Error finding container 400de0f082d20bb9db6d8caf82fea8c310d449d24d4ff1cac803770e7931966a: Status 404 returned error can't find the container with id 400de0f082d20bb9db6d8caf82fea8c310d449d24d4ff1cac803770e7931966a Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.815258 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 11 00:51:46 crc kubenswrapper[4743]: W1011 00:51:46.818847 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-869d41cd375a29e6120c025e5add9e965b43b234ed3216d977e070a539edf88a WatchSource:0}: Error finding container 869d41cd375a29e6120c025e5add9e965b43b234ed3216d977e070a539edf88a: Status 404 returned error can't find the container with id 869d41cd375a29e6120c025e5add9e965b43b234ed3216d977e070a539edf88a Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.898838 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.901328 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.901397 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.901419 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:51:46 crc kubenswrapper[4743]: I1011 00:51:46.901464 4743 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 11 00:51:46 crc kubenswrapper[4743]: E1011 00:51:46.902158 4743 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.106:6443: connect: connection refused" node="crc" Oct 11 00:51:47 crc kubenswrapper[4743]: I1011 00:51:47.012786 4743 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.106:6443: connect: connection refused 
Oct 11 00:51:47 crc kubenswrapper[4743]: W1011 00:51:47.092270 4743 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.106:6443: connect: connection refused Oct 11 00:51:47 crc kubenswrapper[4743]: E1011 00:51:47.092421 4743 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.106:6443: connect: connection refused" logger="UnhandledError" Oct 11 00:51:47 crc kubenswrapper[4743]: I1011 00:51:47.098711 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"869d41cd375a29e6120c025e5add9e965b43b234ed3216d977e070a539edf88a"} Oct 11 00:51:47 crc kubenswrapper[4743]: I1011 00:51:47.101500 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"400de0f082d20bb9db6d8caf82fea8c310d449d24d4ff1cac803770e7931966a"} Oct 11 00:51:47 crc kubenswrapper[4743]: I1011 00:51:47.103079 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"eaaf9ecb0ae2166cd6cba73f8b0c06167dfad4630557a5ca87d76dfb64efbf90"} Oct 11 00:51:47 crc kubenswrapper[4743]: I1011 00:51:47.104782 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"180d0feaac03fc77b1840c7c96f1dface63eefc8831e37b90e81f9bfc15be82e"} Oct 11 00:51:47 crc kubenswrapper[4743]: I1011 00:51:47.105916 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"283bbb22fbd45cd0517da4418c5ff9d14de31715c35d8f2750c5c22c29658f81"} Oct 11 00:51:47 crc kubenswrapper[4743]: W1011 00:51:47.173992 4743 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.106:6443: connect: connection refused Oct 11 00:51:47 crc kubenswrapper[4743]: E1011 00:51:47.174129 4743 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.106:6443: connect: connection refused" logger="UnhandledError" Oct 11 00:51:47 crc kubenswrapper[4743]: W1011 00:51:47.293018 4743 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.106:6443: connect: connection refused Oct 11 00:51:47 crc kubenswrapper[4743]: E1011 00:51:47.293115 4743 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.106:6443: connect: connection refused" logger="UnhandledError" Oct 11 00:51:47 crc kubenswrapper[4743]: E1011 
00:51:47.424085 4743 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.106:6443: connect: connection refused" interval="1.6s" Oct 11 00:51:47 crc kubenswrapper[4743]: W1011 00:51:47.500842 4743 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.106:6443: connect: connection refused Oct 11 00:51:47 crc kubenswrapper[4743]: E1011 00:51:47.501019 4743 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.106:6443: connect: connection refused" logger="UnhandledError" Oct 11 00:51:47 crc kubenswrapper[4743]: I1011 00:51:47.702666 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 00:51:47 crc kubenswrapper[4743]: I1011 00:51:47.705708 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:51:47 crc kubenswrapper[4743]: I1011 00:51:47.705774 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:51:47 crc kubenswrapper[4743]: I1011 00:51:47.705798 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:51:47 crc kubenswrapper[4743]: I1011 00:51:47.706088 4743 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 11 00:51:47 crc kubenswrapper[4743]: E1011 00:51:47.709235 4743 kubelet_node_status.go:99] "Unable to register node with API server" err="Post 
\"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.106:6443: connect: connection refused" node="crc" Oct 11 00:51:48 crc kubenswrapper[4743]: I1011 00:51:48.012787 4743 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.106:6443: connect: connection refused Oct 11 00:51:48 crc kubenswrapper[4743]: I1011 00:51:48.113043 4743 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="ceb1b6fd70dcc7f3acfd121861698011e37b5edd0a022432ed140d9508ed1f1d" exitCode=0 Oct 11 00:51:48 crc kubenswrapper[4743]: I1011 00:51:48.113192 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 00:51:48 crc kubenswrapper[4743]: I1011 00:51:48.113236 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"ceb1b6fd70dcc7f3acfd121861698011e37b5edd0a022432ed140d9508ed1f1d"} Oct 11 00:51:48 crc kubenswrapper[4743]: I1011 00:51:48.115053 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:51:48 crc kubenswrapper[4743]: I1011 00:51:48.115098 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:51:48 crc kubenswrapper[4743]: I1011 00:51:48.115117 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:51:48 crc kubenswrapper[4743]: I1011 00:51:48.118574 4743 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="c3532ffcf956621ad56477cbb9ff70c2855091a6ad996d3a15fa3c9a28943cea" exitCode=0 Oct 11 00:51:48 crc kubenswrapper[4743]: I1011 00:51:48.118692 
4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 00:51:48 crc kubenswrapper[4743]: I1011 00:51:48.118697 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"c3532ffcf956621ad56477cbb9ff70c2855091a6ad996d3a15fa3c9a28943cea"} Oct 11 00:51:48 crc kubenswrapper[4743]: I1011 00:51:48.120447 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:51:48 crc kubenswrapper[4743]: I1011 00:51:48.120475 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:51:48 crc kubenswrapper[4743]: I1011 00:51:48.120491 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:51:48 crc kubenswrapper[4743]: I1011 00:51:48.124539 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"bc8d989925344208a60f51034c7c486de0165d5c5a9a1f966df45ac5685c89f6"} Oct 11 00:51:48 crc kubenswrapper[4743]: I1011 00:51:48.124589 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"45ed857549a3f84b624b8bf411ac4359bad74bbf8eea28bda8ebeda3b12dd34e"} Oct 11 00:51:48 crc kubenswrapper[4743]: I1011 00:51:48.124610 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"748faceccab50f961795c16fb7c31665af75d570094d1dd2c765bb2215b65c31"} Oct 11 00:51:48 crc kubenswrapper[4743]: I1011 
00:51:48.127593 4743 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d411bef56dd0ccdad159fc018c15ad4f0df6adee56150b1143e2dcf442a0fe18" exitCode=0 Oct 11 00:51:48 crc kubenswrapper[4743]: I1011 00:51:48.127698 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"d411bef56dd0ccdad159fc018c15ad4f0df6adee56150b1143e2dcf442a0fe18"} Oct 11 00:51:48 crc kubenswrapper[4743]: I1011 00:51:48.127817 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 00:51:48 crc kubenswrapper[4743]: I1011 00:51:48.130143 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:51:48 crc kubenswrapper[4743]: I1011 00:51:48.130225 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:51:48 crc kubenswrapper[4743]: I1011 00:51:48.130253 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:51:48 crc kubenswrapper[4743]: I1011 00:51:48.131657 4743 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="818ca392b918c66a8ca013f3ec5b938595fc78b726af7ae31524c09dbe9302f6" exitCode=0 Oct 11 00:51:48 crc kubenswrapper[4743]: I1011 00:51:48.131710 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"818ca392b918c66a8ca013f3ec5b938595fc78b726af7ae31524c09dbe9302f6"} Oct 11 00:51:48 crc kubenswrapper[4743]: I1011 00:51:48.131910 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 00:51:48 crc kubenswrapper[4743]: I1011 00:51:48.134268 4743 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:51:48 crc kubenswrapper[4743]: I1011 00:51:48.134359 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:51:48 crc kubenswrapper[4743]: I1011 00:51:48.134390 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:51:48 crc kubenswrapper[4743]: I1011 00:51:48.134565 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 00:51:48 crc kubenswrapper[4743]: I1011 00:51:48.135684 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:51:48 crc kubenswrapper[4743]: I1011 00:51:48.135728 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:51:48 crc kubenswrapper[4743]: I1011 00:51:48.135746 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:51:49 crc kubenswrapper[4743]: I1011 00:51:49.012737 4743 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.106:6443: connect: connection refused Oct 11 00:51:49 crc kubenswrapper[4743]: E1011 00:51:49.025467 4743 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.106:6443: connect: connection refused" interval="3.2s" Oct 11 00:51:49 crc kubenswrapper[4743]: I1011 00:51:49.139098 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f1c0227825fa743bb310a1dfb08333cea5ee16a2a2f30b77c97e2f28a2319542"} Oct 11 00:51:49 crc kubenswrapper[4743]: I1011 00:51:49.139163 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f2616059cd6d57ff41cdaa2e1b35e0e8b474398c434022ba6b9e926e19e64c0f"} Oct 11 00:51:49 crc kubenswrapper[4743]: I1011 00:51:49.139179 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7f0392332521983f1ea8939df48e05618035ca97895122067daef61ee1fdb7af"} Oct 11 00:51:49 crc kubenswrapper[4743]: I1011 00:51:49.139222 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e1558bde7a3b8b01c9ce7f6907ecc43abc6e27587236be7b41487968effa33bb"} Oct 11 00:51:49 crc kubenswrapper[4743]: I1011 00:51:49.141153 4743 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="44506333e8bd3a20777cf0c382b7652d6d3fa1cdc13f056412cf4899e25115e4" exitCode=0 Oct 11 00:51:49 crc kubenswrapper[4743]: I1011 00:51:49.141201 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"44506333e8bd3a20777cf0c382b7652d6d3fa1cdc13f056412cf4899e25115e4"} Oct 11 00:51:49 crc kubenswrapper[4743]: I1011 00:51:49.141319 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 00:51:49 crc kubenswrapper[4743]: I1011 00:51:49.142458 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:51:49 crc 
kubenswrapper[4743]: I1011 00:51:49.142486 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:51:49 crc kubenswrapper[4743]: I1011 00:51:49.142496 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:51:49 crc kubenswrapper[4743]: I1011 00:51:49.150802 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"496005e7eab2c6d4e670411965f6761742df49342e67085345d2a0bc7edb484c"} Oct 11 00:51:49 crc kubenswrapper[4743]: I1011 00:51:49.150990 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 00:51:49 crc kubenswrapper[4743]: I1011 00:51:49.152572 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:51:49 crc kubenswrapper[4743]: I1011 00:51:49.152612 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:51:49 crc kubenswrapper[4743]: I1011 00:51:49.152625 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:51:49 crc kubenswrapper[4743]: I1011 00:51:49.155217 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"a2bc6f9e58ae313b026b5597d2569b343931bc066ac8b3751c62f39c8d849bf4"} Oct 11 00:51:49 crc kubenswrapper[4743]: I1011 00:51:49.155255 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"895152fd89f4c35c7d379d3a93c1b5f185275cd151d27b442b8f06280f3f74a0"} Oct 11 
00:51:49 crc kubenswrapper[4743]: I1011 00:51:49.155275 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"4335571969a3cb66f2fefdad63b77b4a31fc2631aba5ba427b2af4db8b6c6f0b"} Oct 11 00:51:49 crc kubenswrapper[4743]: I1011 00:51:49.155418 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 00:51:49 crc kubenswrapper[4743]: I1011 00:51:49.156784 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:51:49 crc kubenswrapper[4743]: I1011 00:51:49.156823 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:51:49 crc kubenswrapper[4743]: I1011 00:51:49.156837 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:51:49 crc kubenswrapper[4743]: I1011 00:51:49.158724 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"34de48136b54818c60eb929d4b5374689378190620a6502448759f172856d6af"} Oct 11 00:51:49 crc kubenswrapper[4743]: I1011 00:51:49.158880 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 00:51:49 crc kubenswrapper[4743]: I1011 00:51:49.159669 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:51:49 crc kubenswrapper[4743]: I1011 00:51:49.159714 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:51:49 crc kubenswrapper[4743]: I1011 00:51:49.159727 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Oct 11 00:51:49 crc kubenswrapper[4743]: W1011 00:51:49.233175 4743 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.106:6443: connect: connection refused Oct 11 00:51:49 crc kubenswrapper[4743]: E1011 00:51:49.233320 4743 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.106:6443: connect: connection refused" logger="UnhandledError" Oct 11 00:51:49 crc kubenswrapper[4743]: I1011 00:51:49.314937 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 00:51:49 crc kubenswrapper[4743]: I1011 00:51:49.316418 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:51:49 crc kubenswrapper[4743]: I1011 00:51:49.316451 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:51:49 crc kubenswrapper[4743]: I1011 00:51:49.316461 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:51:49 crc kubenswrapper[4743]: I1011 00:51:49.316485 4743 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 11 00:51:49 crc kubenswrapper[4743]: E1011 00:51:49.317183 4743 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.106:6443: connect: connection refused" node="crc" Oct 11 00:51:50 crc kubenswrapper[4743]: I1011 00:51:50.127212 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 11 00:51:50 crc kubenswrapper[4743]: I1011 00:51:50.170125 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"530c39a66f06ca33cba6a3ab75cb68148425f5a732b503d054b6c74a9d48fabf"} Oct 11 00:51:50 crc kubenswrapper[4743]: I1011 00:51:50.170298 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 00:51:50 crc kubenswrapper[4743]: I1011 00:51:50.172013 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:51:50 crc kubenswrapper[4743]: I1011 00:51:50.172089 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:51:50 crc kubenswrapper[4743]: I1011 00:51:50.172112 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:51:50 crc kubenswrapper[4743]: I1011 00:51:50.173573 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"e870a1289513fe9e864779c13e63c0991fdaded247b35ac75b7810ab27683ebe"} Oct 11 00:51:50 crc kubenswrapper[4743]: I1011 00:51:50.173595 4743 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="e870a1289513fe9e864779c13e63c0991fdaded247b35ac75b7810ab27683ebe" exitCode=0 Oct 11 00:51:50 crc kubenswrapper[4743]: I1011 00:51:50.173768 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 00:51:50 crc kubenswrapper[4743]: I1011 00:51:50.173801 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 00:51:50 crc kubenswrapper[4743]: I1011 
00:51:50.174052 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 00:51:50 crc kubenswrapper[4743]: I1011 00:51:50.174065 4743 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 11 00:51:50 crc kubenswrapper[4743]: I1011 00:51:50.174358 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 00:51:50 crc kubenswrapper[4743]: I1011 00:51:50.180350 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:51:50 crc kubenswrapper[4743]: I1011 00:51:50.180385 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:51:50 crc kubenswrapper[4743]: I1011 00:51:50.180426 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:51:50 crc kubenswrapper[4743]: I1011 00:51:50.180451 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:51:50 crc kubenswrapper[4743]: I1011 00:51:50.180449 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:51:50 crc kubenswrapper[4743]: I1011 00:51:50.180665 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:51:50 crc kubenswrapper[4743]: I1011 00:51:50.180402 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:51:50 crc kubenswrapper[4743]: I1011 00:51:50.180815 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:51:50 crc kubenswrapper[4743]: I1011 00:51:50.180839 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:51:50 crc 
kubenswrapper[4743]: I1011 00:51:50.180993 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:51:50 crc kubenswrapper[4743]: I1011 00:51:50.181041 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:51:50 crc kubenswrapper[4743]: I1011 00:51:50.181065 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:51:51 crc kubenswrapper[4743]: I1011 00:51:51.184940 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"97b59a1efc62279df4a9766fae0e700a435812160cf3a40a915c967f7a99c9b9"} Oct 11 00:51:51 crc kubenswrapper[4743]: I1011 00:51:51.185010 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"9a46e8adc9aa9255991b607e98eaf6fc411aeb2a6b0edf45e3e9a7935a2eae5b"} Oct 11 00:51:51 crc kubenswrapper[4743]: I1011 00:51:51.185030 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"2f14caad841305b82f04cf14411e2308edbcdd8ea2261ff8401edc95900855ac"} Oct 11 00:51:51 crc kubenswrapper[4743]: I1011 00:51:51.185060 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 00:51:51 crc kubenswrapper[4743]: I1011 00:51:51.185107 4743 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 11 00:51:51 crc kubenswrapper[4743]: I1011 00:51:51.185174 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 00:51:51 crc kubenswrapper[4743]: I1011 00:51:51.186363 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 11 00:51:51 crc kubenswrapper[4743]: I1011 00:51:51.186405 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:51:51 crc kubenswrapper[4743]: I1011 00:51:51.186425 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:51:51 crc kubenswrapper[4743]: I1011 00:51:51.186664 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:51:51 crc kubenswrapper[4743]: I1011 00:51:51.186741 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:51:51 crc kubenswrapper[4743]: I1011 00:51:51.186766 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:51:51 crc kubenswrapper[4743]: I1011 00:51:51.767461 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 11 00:51:52 crc kubenswrapper[4743]: I1011 00:51:52.035086 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 11 00:51:52 crc kubenswrapper[4743]: I1011 00:51:52.193790 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"ddf3820ae4ab59d420d2e7f001b2953f751c6dd75f90b5fcc76fdaee8c8df722"} Oct 11 00:51:52 crc kubenswrapper[4743]: I1011 00:51:52.193934 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 00:51:52 crc kubenswrapper[4743]: I1011 00:51:52.193966 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"fe9d2bc75aa162a0cab825a1f13cf35f20491a0317000d5a9775977fb7f6b556"} Oct 11 00:51:52 crc kubenswrapper[4743]: I1011 00:51:52.193906 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 00:51:52 crc kubenswrapper[4743]: I1011 00:51:52.195506 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:51:52 crc kubenswrapper[4743]: I1011 00:51:52.195599 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:51:52 crc kubenswrapper[4743]: I1011 00:51:52.195674 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:51:52 crc kubenswrapper[4743]: I1011 00:51:52.195603 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:51:52 crc kubenswrapper[4743]: I1011 00:51:52.195850 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:51:52 crc kubenswrapper[4743]: I1011 00:51:52.195906 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:51:52 crc kubenswrapper[4743]: I1011 00:51:52.456426 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 11 00:51:52 crc kubenswrapper[4743]: I1011 00:51:52.517636 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 00:51:52 crc kubenswrapper[4743]: I1011 00:51:52.519625 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:51:52 crc kubenswrapper[4743]: I1011 00:51:52.519716 4743 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:51:52 crc kubenswrapper[4743]: I1011 00:51:52.519752 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:51:52 crc kubenswrapper[4743]: I1011 00:51:52.519804 4743 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 11 00:51:52 crc kubenswrapper[4743]: I1011 00:51:52.585069 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 11 00:51:52 crc kubenswrapper[4743]: I1011 00:51:52.585481 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 00:51:52 crc kubenswrapper[4743]: I1011 00:51:52.587676 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:51:52 crc kubenswrapper[4743]: I1011 00:51:52.587741 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:51:52 crc kubenswrapper[4743]: I1011 00:51:52.587763 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:51:52 crc kubenswrapper[4743]: I1011 00:51:52.798641 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 11 00:51:52 crc kubenswrapper[4743]: I1011 00:51:52.805194 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 11 00:51:53 crc kubenswrapper[4743]: I1011 00:51:53.127736 4743 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while 
awaiting headers)" start-of-body= Oct 11 00:51:53 crc kubenswrapper[4743]: I1011 00:51:53.128214 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 11 00:51:53 crc kubenswrapper[4743]: I1011 00:51:53.196779 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 00:51:53 crc kubenswrapper[4743]: I1011 00:51:53.196846 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 00:51:53 crc kubenswrapper[4743]: I1011 00:51:53.197922 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 00:51:53 crc kubenswrapper[4743]: I1011 00:51:53.198403 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:51:53 crc kubenswrapper[4743]: I1011 00:51:53.198462 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:51:53 crc kubenswrapper[4743]: I1011 00:51:53.198484 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:51:53 crc kubenswrapper[4743]: I1011 00:51:53.198505 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:51:53 crc kubenswrapper[4743]: I1011 00:51:53.198539 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:51:53 crc kubenswrapper[4743]: I1011 00:51:53.198555 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:51:53 
crc kubenswrapper[4743]: I1011 00:51:53.200267 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:51:53 crc kubenswrapper[4743]: I1011 00:51:53.200317 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:51:53 crc kubenswrapper[4743]: I1011 00:51:53.200337 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:51:53 crc kubenswrapper[4743]: I1011 00:51:53.777944 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 11 00:51:54 crc kubenswrapper[4743]: I1011 00:51:54.199295 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 00:51:54 crc kubenswrapper[4743]: I1011 00:51:54.199281 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 00:51:54 crc kubenswrapper[4743]: I1011 00:51:54.201275 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:51:54 crc kubenswrapper[4743]: I1011 00:51:54.201304 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:51:54 crc kubenswrapper[4743]: I1011 00:51:54.201357 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:51:54 crc kubenswrapper[4743]: I1011 00:51:54.201379 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:51:54 crc kubenswrapper[4743]: I1011 00:51:54.201323 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:51:54 crc kubenswrapper[4743]: I1011 00:51:54.201487 4743 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:51:54 crc kubenswrapper[4743]: I1011 00:51:54.505056 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Oct 11 00:51:54 crc kubenswrapper[4743]: I1011 00:51:54.505387 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 00:51:54 crc kubenswrapper[4743]: I1011 00:51:54.507154 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:51:54 crc kubenswrapper[4743]: I1011 00:51:54.507237 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:51:54 crc kubenswrapper[4743]: I1011 00:51:54.507258 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:51:55 crc kubenswrapper[4743]: I1011 00:51:55.011406 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 11 00:51:55 crc kubenswrapper[4743]: I1011 00:51:55.011613 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 00:51:55 crc kubenswrapper[4743]: I1011 00:51:55.013021 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:51:55 crc kubenswrapper[4743]: I1011 00:51:55.013074 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:51:55 crc kubenswrapper[4743]: I1011 00:51:55.013092 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:51:55 crc kubenswrapper[4743]: I1011 00:51:55.202641 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 00:51:55 crc kubenswrapper[4743]: I1011 
00:51:55.203850 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:51:55 crc kubenswrapper[4743]: I1011 00:51:55.203941 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:51:55 crc kubenswrapper[4743]: I1011 00:51:55.203964 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:51:56 crc kubenswrapper[4743]: E1011 00:51:56.205421 4743 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 11 00:51:59 crc kubenswrapper[4743]: W1011 00:51:59.616044 4743 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Oct 11 00:51:59 crc kubenswrapper[4743]: I1011 00:51:59.616190 4743 trace.go:236] Trace[2061914000]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (11-Oct-2025 00:51:49.614) (total time: 10001ms): Oct 11 00:51:59 crc kubenswrapper[4743]: Trace[2061914000]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (00:51:59.616) Oct 11 00:51:59 crc kubenswrapper[4743]: Trace[2061914000]: [10.00176227s] [10.00176227s] END Oct 11 00:51:59 crc kubenswrapper[4743]: E1011 00:51:59.616221 4743 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Oct 11 00:51:59 crc kubenswrapper[4743]: I1011 00:51:59.626660 4743 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Oct 11 00:51:59 crc kubenswrapper[4743]: I1011 00:51:59.627067 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 00:51:59 crc kubenswrapper[4743]: I1011 00:51:59.628895 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:51:59 crc kubenswrapper[4743]: I1011 00:51:59.629331 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:51:59 crc kubenswrapper[4743]: I1011 00:51:59.629358 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:51:59 crc kubenswrapper[4743]: W1011 00:51:59.733698 4743 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Oct 11 00:51:59 crc kubenswrapper[4743]: I1011 00:51:59.733839 4743 trace.go:236] Trace[768402566]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (11-Oct-2025 00:51:49.731) (total time: 10002ms): Oct 11 00:51:59 crc kubenswrapper[4743]: Trace[768402566]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10002ms (00:51:59.733) Oct 11 00:51:59 crc kubenswrapper[4743]: Trace[768402566]: [10.002166107s] [10.002166107s] END Oct 11 00:51:59 crc kubenswrapper[4743]: E1011 00:51:59.733917 4743 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Oct 11 00:52:00 
crc kubenswrapper[4743]: I1011 00:52:00.014368 4743 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Oct 11 00:52:00 crc kubenswrapper[4743]: W1011 00:52:00.056302 4743 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Oct 11 00:52:00 crc kubenswrapper[4743]: I1011 00:52:00.056420 4743 trace.go:236] Trace[581844752]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (11-Oct-2025 00:51:50.055) (total time: 10001ms): Oct 11 00:52:00 crc kubenswrapper[4743]: Trace[581844752]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (00:52:00.056) Oct 11 00:52:00 crc kubenswrapper[4743]: Trace[581844752]: [10.001235966s] [10.001235966s] END Oct 11 00:52:00 crc kubenswrapper[4743]: E1011 00:52:00.056454 4743 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Oct 11 00:52:00 crc kubenswrapper[4743]: I1011 00:52:00.841082 4743 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path 
\"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 11 00:52:00 crc kubenswrapper[4743]: I1011 00:52:00.841169 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 11 00:52:00 crc kubenswrapper[4743]: I1011 00:52:00.848097 4743 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 11 00:52:00 crc kubenswrapper[4743]: I1011 00:52:00.848185 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 11 00:52:02 crc kubenswrapper[4743]: I1011 00:52:02.463907 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 11 00:52:02 crc kubenswrapper[4743]: I1011 00:52:02.464955 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 00:52:02 crc kubenswrapper[4743]: I1011 00:52:02.466895 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:02 crc kubenswrapper[4743]: I1011 00:52:02.466969 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:02 crc kubenswrapper[4743]: I1011 00:52:02.466992 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Oct 11 00:52:02 crc kubenswrapper[4743]: I1011 00:52:02.472726 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 11 00:52:02 crc kubenswrapper[4743]: I1011 00:52:02.591846 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 11 00:52:02 crc kubenswrapper[4743]: I1011 00:52:02.592371 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 00:52:02 crc kubenswrapper[4743]: I1011 00:52:02.594013 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:02 crc kubenswrapper[4743]: I1011 00:52:02.594191 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:02 crc kubenswrapper[4743]: I1011 00:52:02.594323 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:03 crc kubenswrapper[4743]: I1011 00:52:03.128407 4743 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 11 00:52:03 crc kubenswrapper[4743]: I1011 00:52:03.128472 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 11 00:52:03 crc 
kubenswrapper[4743]: I1011 00:52:03.224175 4743 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 11 00:52:03 crc kubenswrapper[4743]: I1011 00:52:03.224249 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 11 00:52:03 crc kubenswrapper[4743]: I1011 00:52:03.225614 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:03 crc kubenswrapper[4743]: I1011 00:52:03.225678 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:03 crc kubenswrapper[4743]: I1011 00:52:03.225703 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:03 crc kubenswrapper[4743]: I1011 00:52:03.547715 4743 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Oct 11 00:52:04 crc kubenswrapper[4743]: I1011 00:52:04.271682 4743 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Oct 11 00:52:05 crc kubenswrapper[4743]: E1011 00:52:05.836350 4743 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Oct 11 00:52:05 crc kubenswrapper[4743]: I1011 00:52:05.841140 4743 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Oct 11 00:52:05 crc kubenswrapper[4743]: I1011 00:52:05.841224 4743 trace.go:236] Trace[586781370]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (11-Oct-2025 00:51:53.255) (total time: 12585ms): Oct 11 00:52:05 crc kubenswrapper[4743]: Trace[586781370]: ---"Objects listed" error: 12585ms (00:52:05.841) Oct 11 00:52:05 crc kubenswrapper[4743]: Trace[586781370]: [12.585729195s] [12.585729195s] 
END Oct 11 00:52:05 crc kubenswrapper[4743]: I1011 00:52:05.841268 4743 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Oct 11 00:52:05 crc kubenswrapper[4743]: I1011 00:52:05.847531 4743 kubelet_node_status.go:115] "Node was previously registered" node="crc" Oct 11 00:52:05 crc kubenswrapper[4743]: I1011 00:52:05.847960 4743 kubelet_node_status.go:79] "Successfully registered node" node="crc" Oct 11 00:52:05 crc kubenswrapper[4743]: I1011 00:52:05.850148 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:05 crc kubenswrapper[4743]: I1011 00:52:05.850746 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:05 crc kubenswrapper[4743]: I1011 00:52:05.850784 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:05 crc kubenswrapper[4743]: I1011 00:52:05.850821 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:05 crc kubenswrapper[4743]: I1011 00:52:05.850841 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:05Z","lastTransitionTime":"2025-10-11T00:52:05Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]"} Oct 11 00:52:05 crc kubenswrapper[4743]: I1011 00:52:05.872945 4743 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Oct 11 00:52:05 crc kubenswrapper[4743]: E1011 00:52:05.877524 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:05Z\\\",\\\"message\\\":\\\"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file 
in /etc/kubernetes/cni/net.d/. Has your network provider started?, CSINode is not yet initialized]\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\
\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"
sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":48599861
6},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4a117022-09d2-46e0-826b-22308ec25890\\\",\\\"systemUUID\
\\":\\\"407eb137-47d1-41e8-9c72-65f09e76d21a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 00:52:05 crc kubenswrapper[4743]: I1011 00:52:05.883763 4743 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:45864->192.168.126.11:17697: read: connection reset by peer" start-of-body= Oct 11 00:52:05 crc kubenswrapper[4743]: I1011 00:52:05.883843 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:45864->192.168.126.11:17697: read: connection reset by peer" Oct 11 00:52:05 crc kubenswrapper[4743]: I1011 00:52:05.884296 4743 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Oct 11 00:52:05 crc kubenswrapper[4743]: I1011 00:52:05.884395 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Oct 11 00:52:05 crc kubenswrapper[4743]: I1011 00:52:05.886229 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 11 00:52:05 crc kubenswrapper[4743]: I1011 00:52:05.886268 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:05 crc kubenswrapper[4743]: I1011 00:52:05.886285 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:05 crc kubenswrapper[4743]: I1011 00:52:05.886314 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:05 crc kubenswrapper[4743]: I1011 00:52:05.886329 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:05Z","lastTransitionTime":"2025-10-11T00:52:05Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]"} Oct 11 00:52:05 crc kubenswrapper[4743]: E1011 00:52:05.910152 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:05Z\\\",\\\"message\\\":\\\"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/red
hat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"n
ames\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4a117022-09d2-46e0-826b-22308ec25890\\\",\\\"systemUUID\\\":\\\"407eb137-47d1-41e8-9c72
-65f09e76d21a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 00:52:05 crc kubenswrapper[4743]: I1011 00:52:05.921068 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:05 crc kubenswrapper[4743]: I1011 00:52:05.921116 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:05 crc kubenswrapper[4743]: I1011 00:52:05.921130 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:05 crc kubenswrapper[4743]: I1011 00:52:05.921151 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:05 crc kubenswrapper[4743]: I1011 00:52:05.921161 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:05Z","lastTransitionTime":"2025-10-11T00:52:05Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]"} Oct 11 00:52:05 crc kubenswrapper[4743]: E1011 00:52:05.935498 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:05Z\\\",\\\"message\\\":\\\"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/red
hat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"n
ames\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4a117022-09d2-46e0-826b-22308ec25890\\\",\\\"systemUUID\\\":\\\"407eb137-47d1-41e8-9c72
-65f09e76d21a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 11 00:52:05 crc kubenswrapper[4743]: I1011 00:52:05.940770 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 11 00:52:05 crc kubenswrapper[4743]: I1011 00:52:05.940810 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 11 00:52:05 crc kubenswrapper[4743]: I1011 00:52:05.940821 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 11 00:52:05 crc kubenswrapper[4743]: I1011 00:52:05.940867 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 11 00:52:05 crc kubenswrapper[4743]: I1011 00:52:05.940879 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:05Z","lastTransitionTime":"2025-10-11T00:52:05Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]"}
Oct 11 00:52:05 crc kubenswrapper[4743]: E1011 00:52:05.960368 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:05Z\\\",\\\"message\\\":\\\"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/red
hat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"n
ames\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4a117022-09d2-46e0-826b-22308ec25890\\\",\\\"systemUUID\\\":\\\"407eb137-47d1-41e8-9c72
-65f09e76d21a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 11 00:52:05 crc kubenswrapper[4743]: I1011 00:52:05.967737 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 11 00:52:05 crc kubenswrapper[4743]: I1011 00:52:05.968117 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 11 00:52:05 crc kubenswrapper[4743]: I1011 00:52:05.968193 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 11 00:52:05 crc kubenswrapper[4743]: I1011 00:52:05.968352 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 11 00:52:05 crc kubenswrapper[4743]: I1011 00:52:05.968424 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:05Z","lastTransitionTime":"2025-10-11T00:52:05Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]"}
Oct 11 00:52:05 crc kubenswrapper[4743]: E1011 00:52:05.992191 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:05Z\\\",\\\"message\\\":\\\"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/red
hat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"n
ames\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4a117022-09d2-46e0-826b-22308ec25890\\\",\\\"systemUUID\\\":\\\"407eb137-47d1-41e8-9c72
-65f09e76d21a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 00:52:05 crc kubenswrapper[4743]: E1011 00:52:05.992661 4743 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 11 00:52:05 crc kubenswrapper[4743]: I1011 00:52:05.994369 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:05 crc kubenswrapper[4743]: I1011 00:52:05.994400 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:05 crc kubenswrapper[4743]: I1011 00:52:05.994412 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:05 crc kubenswrapper[4743]: I1011 00:52:05.994428 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:05 crc kubenswrapper[4743]: I1011 00:52:05.994438 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:05Z","lastTransitionTime":"2025-10-11T00:52:05Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]"} Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.004973 4743 apiserver.go:52] "Watching apiserver" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.007916 4743 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.008216 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c"] Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.008542 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 00:52:06 crc kubenswrapper[4743]: E1011 00:52:06.008600 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.008630 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.008891 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 00:52:06 crc kubenswrapper[4743]: E1011 00:52:06.008951 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.009208 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.009258 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.009385 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 00:52:06 crc kubenswrapper[4743]: E1011 00:52:06.009474 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.011040 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.011568 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.011656 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.011731 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.011866 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.012028 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.012084 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.012317 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.019420 4743 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.023401 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Oct 11 00:52:06 crc kubenswrapper[4743]: 
I1011 00:52:06.041803 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.041850 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.041911 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.041940 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.041965 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.041987 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.042010 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.042035 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.042058 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.042089 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.042113 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 11 00:52:06 crc 
kubenswrapper[4743]: I1011 00:52:06.042138 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.042161 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.042156 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.042184 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.042206 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.042229 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.042252 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.042272 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.042297 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.042305 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.042321 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.042372 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.042387 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.042412 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.042430 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.042451 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.042470 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.042493 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.042510 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.042527 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.042545 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.042563 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.042569 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). 
InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.042581 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.042601 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.042608 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.042619 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.042660 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.042681 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.042700 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.042716 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.042731 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: 
\"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.042732 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.042747 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.042765 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.042815 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.042833 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod 
\"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.042850 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.042880 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.042895 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.042910 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.042926 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.042941 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.042963 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.042986 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.043008 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.043032 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.043053 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.043075 4743 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.043096 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.043108 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.043117 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.043167 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.043194 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.043216 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.043261 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.043278 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.043296 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.043313 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.043329 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.043344 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.043360 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" 
(UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.043396 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.043410 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.043425 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.043440 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.043456 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.043472 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: 
\"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.043490 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.043506 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.043523 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.044103 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.044145 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 11 00:52:06 crc 
kubenswrapper[4743]: I1011 00:52:06.044169 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.044204 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.044239 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.044270 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.044305 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.044341 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod 
\"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.044369 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.043294 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.043470 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.044072 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.047596 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.044234 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.044295 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.044567 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.044660 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.044844 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.044944 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.045134 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.045327 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.045369 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.045603 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.045632 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.046136 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.046180 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.046531 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.046737 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.047161 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.047558 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.047885 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.048142 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.048496 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.048665 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.048885 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.049028 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.049174 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.049442 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.049528 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.049827 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.050079 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.050105 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.050423 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.050529 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.051344 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.051608 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.051686 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.051827 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.051841 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.052122 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.052166 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.052158 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.052201 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.052376 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.052372 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.052389 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.052482 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.052596 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.052637 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.052828 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.052845 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.053575 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.053618 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.053652 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.053673 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.053698 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.053719 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.053738 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.053761 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.053783 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.053801 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.053820 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.053840 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.053883 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.053903 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.053924 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.053943 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.053962 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.053980 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.054651 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.057215 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.057367 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.057456 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.057532 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.057603 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.057682 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.060050 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.060094 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.060114 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.060135 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.060156 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.060240 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.060261 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.060280 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.060301 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.060320 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.060337 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.060357 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.060412 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.060432 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.060452 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.060470 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.060488 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.060505 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.060524 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.060543 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.060560 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.060578 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.060599 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.060619 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.060639 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.060657 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.060679 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.060698 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.060725 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.060743 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.060761 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.060780 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.060799 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.060817 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.060836 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.060871 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.060892 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.060914 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.060934 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.060955 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.060988 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.061008 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.061028 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.061043 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.061061 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.061079 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.061098 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.061117 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.061138 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.061154 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.061173 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.061191 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.061208 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.061231 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.061256 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.061276 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.061292 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.061312 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.061329 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.061345 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.061363 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.061379 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.061398 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.061452 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.061468 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.061486 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.061505 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.061522 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.061541 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.061558 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.061577 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.061595 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.061610 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.061682 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.061701 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.061718 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.061738 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.061754 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.061773 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.061797 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.061820 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.061841 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.061911 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.061937 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.061959 4743 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.061979 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.062016 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.062099 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.062130 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.062156 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.062181 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.062212 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.062241 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.062265 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " 
pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.062300 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.062325 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.062350 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.062380 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.062408 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod 
\"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.062436 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.062462 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.062574 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.062589 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.062602 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.062617 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.062630 4743 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.062643 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.062657 4743 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.062670 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.062684 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.062699 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.062714 4743 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on 
node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.062728 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.062744 4743 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.062758 4743 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.062771 4743 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.062784 4743 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.062798 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.062812 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: 
I1011 00:52:06.062825 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.062838 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.062868 4743 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.062883 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.062896 4743 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.062910 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.062924 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.062936 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: 
\"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.062948 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.062961 4743 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.062974 4743 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.062987 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.063001 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.063014 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.063027 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 
crc kubenswrapper[4743]: I1011 00:52:06.063040 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.063053 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.063068 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.063080 4743 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.063092 4743 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.063105 4743 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.063117 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc 
kubenswrapper[4743]: I1011 00:52:06.063130 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.063143 4743 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.063157 4743 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.063170 4743 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.063182 4743 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.063195 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.063207 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.063219 4743 reconciler_common.go:293] "Volume detached for volume 
\"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.063232 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.063245 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.063255 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.063266 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.063276 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.063285 4743 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.063296 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.063305 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.063314 4743 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.063324 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.063333 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.053978 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.069053 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.054000 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.054289 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.054519 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.054562 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.054600 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.054887 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.054945 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.054956 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.055088 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.055254 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.055387 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.055611 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.056241 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.056335 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.057170 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.057179 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.057389 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.057670 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.057930 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.057982 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.058228 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.056569 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.058402 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.058701 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.059077 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.059179 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.059194 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.060019 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.060570 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.061545 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.061724 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.061974 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.062253 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.062702 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.062949 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.063333 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.064244 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.064267 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.064489 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.064605 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.064710 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.064944 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.065204 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.065320 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.065714 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.066145 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.066367 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.067192 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.067491 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.067690 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.067682 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.067738 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.067986 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.068132 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.068191 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.068214 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.068210 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.068369 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.068453 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.068475 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.068491 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.068527 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.068526 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.068716 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.068963 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.069018 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.069135 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.069180 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.069323 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.069377 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.069495 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.069514 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.069675 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.069970 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.070008 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.070040 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.070299 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.070395 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.070515 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.070893 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.070803 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.070686 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.071117 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: E1011 00:52:06.071135 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 00:52:06.571112144 +0000 UTC m=+21.224092541 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.071978 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.072355 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.072730 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.072907 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.072951 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.073575 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.073589 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.073954 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.074387 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.074652 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.075289 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: E1011 00:52:06.075717 4743 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 11 00:52:06 crc kubenswrapper[4743]: E1011 00:52:06.075772 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-11 00:52:06.575756835 +0000 UTC m=+21.228737232 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 11 00:52:06 crc kubenswrapper[4743]: E1011 00:52:06.076002 4743 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 11 00:52:06 crc kubenswrapper[4743]: E1011 00:52:06.076048 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-10-11 00:52:06.576031712 +0000 UTC m=+21.229012109 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.076495 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.076873 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.077408 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.079119 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.079229 4743 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.081667 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.082143 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.083483 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.084796 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.084952 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.085125 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.086953 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.087100 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.088948 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.089264 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.090101 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.090190 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.090337 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.090464 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.090592 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: E1011 00:52:06.090607 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 11 00:52:06 crc kubenswrapper[4743]: E1011 00:52:06.090626 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 11 00:52:06 crc kubenswrapper[4743]: E1011 00:52:06.090639 4743 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 11 00:52:06 crc kubenswrapper[4743]: E1011 00:52:06.090704 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-11 00:52:06.590687333 +0000 UTC m=+21.243667730 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.091554 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.091693 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.092260 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.094250 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.094403 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.094585 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.094633 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.094636 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.094623 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.095153 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.095586 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.095771 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.095957 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.095987 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.098400 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.098430 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.098472 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.098490 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.098502 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:06Z","lastTransitionTime":"2025-10-11T00:52:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.102161 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.103779 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.105590 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Oct 11 00:52:06 crc kubenswrapper[4743]: E1011 00:52:06.106210 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 11 00:52:06 crc kubenswrapper[4743]: E1011 00:52:06.106246 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 11 00:52:06 crc kubenswrapper[4743]: E1011 00:52:06.106267 4743 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: 
[object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 11 00:52:06 crc kubenswrapper[4743]: E1011 00:52:06.106347 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-11 00:52:06.606317419 +0000 UTC m=+21.259298036 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.106398 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.107335 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.108381 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" 
Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.108631 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.109632 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.109658 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.111887 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.112087 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.112344 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.113309 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.114525 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.114999 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.115845 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.116977 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.119327 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.119985 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.120741 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.126093 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.126775 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.127961 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.131037 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.131286 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.132844 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.134125 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.135709 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.136399 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.137846 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.139569 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.139660 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.140496 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.141753 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.141929 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.142149 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.142763 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.145271 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.146032 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.146636 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.147584 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.148284 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.149181 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" 
path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.149820 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.150973 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.151455 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.151555 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.152592 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.153537 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.154066 4743 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" 
podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.154173 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.156626 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.157311 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.158450 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.160251 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.160580 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.161473 4743 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.162440 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.163069 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.164090 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.164141 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.164251 4743 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.164270 4743 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath 
\"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.164280 4743 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.164290 4743 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.164301 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.164317 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.164326 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.164335 4743 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.164344 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.164354 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: 
\"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.164362 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.164370 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.164384 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.164403 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.164290 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.165086 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 11 00:52:06 crc 
kubenswrapper[4743]: I1011 00:52:06.165427 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.165463 4743 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.165477 4743 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.165488 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.165674 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.166008 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.166081 4743 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.166098 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" 
(UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.166111 4743 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.166129 4743 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.166141 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.166152 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.166166 4743 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.166182 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.166216 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: 
\"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.166241 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.166258 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.166271 4743 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.166283 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.166295 4743 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.166311 4743 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.166323 4743 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.166334 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.166345 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.166360 4743 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.166373 4743 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.166386 4743 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.166397 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.166411 4743 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Oct 11 
00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.166423 4743 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.166435 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.166451 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.166461 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.166471 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.166482 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.166497 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.166508 4743 reconciler_common.go:293] "Volume detached for volume 
\"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.166518 4743 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.166531 4743 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.166547 4743 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.166557 4743 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.166568 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.166579 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.166593 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Oct 11 
00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.166604 4743 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.166615 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.166631 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.166643 4743 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.166654 4743 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.166665 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.166680 4743 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.166690 4743 reconciler_common.go:293] "Volume detached for volume \"images\" 
(UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.167140 4743 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.167154 4743 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.167192 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.167203 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.167215 4743 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.167231 4743 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.167242 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on 
node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.167318 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.167330 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.167367 4743 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.167379 4743 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.167392 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.167406 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.167424 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node 
\"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.167437 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.167447 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.167460 4743 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.167475 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.167486 4743 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.167497 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.167511 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 
00:52:06.167522 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.167533 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.167543 4743 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.167763 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.167806 4743 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.167833 4743 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.167850 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.167878 4743 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" 
(UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.167890 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.167894 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.167907 4743 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.167920 4743 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.167958 4743 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.167970 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.167986 4743 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Oct 11 
00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.167999 4743 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.168014 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.168032 4743 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.168047 4743 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.168059 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.168070 4743 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.168082 4743 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.168092 4743 
reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.168106 4743 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.168119 4743 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.168136 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.168151 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.168165 4743 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.168188 4743 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.168203 4743 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.168222 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.168236 4743 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.168254 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.168271 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.168284 4743 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.168298 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.168316 4743 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node 
\"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.168329 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.168344 4743 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.168356 4743 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.168374 4743 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.168387 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.168400 4743 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.168420 4743 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.168436 4743 
reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.168449 4743 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.168519 4743 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.168538 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.168553 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.168568 4743 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.168584 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.169181 4743 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.169998 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.171093 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.171159 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.171683 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.173991 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" 
path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.176324 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.176981 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.177878 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.178871 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.179561 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.180902 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.181482 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.183033 4743 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.192061 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.201193 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.204374 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.204402 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.204411 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.204427 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.204442 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:06Z","lastTransitionTime":"2025-10-11T00:52:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.211123 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.234332 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.239111 4743 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="530c39a66f06ca33cba6a3ab75cb68148425f5a732b503d054b6c74a9d48fabf" exitCode=255 Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.239179 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"530c39a66f06ca33cba6a3ab75cb68148425f5a732b503d054b6c74a9d48fabf"} Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.250090 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.263518 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.274665 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.283941 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.294795 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.301903 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.302266 4743 scope.go:117] "RemoveContainer" containerID="530c39a66f06ca33cba6a3ab75cb68148425f5a732b503d054b6c74a9d48fabf" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.306571 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.306600 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.306613 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.306632 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.306645 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:06Z","lastTransitionTime":"2025-10-11T00:52:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.307187 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.323896 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.335166 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.340764 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 11 00:52:06 crc kubenswrapper[4743]: W1011 00:52:06.343519 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-adfcf56ab7ea6808a46c836b426a202958a03636c274317429754ed20ea214ca WatchSource:0}: Error finding container adfcf56ab7ea6808a46c836b426a202958a03636c274317429754ed20ea214ca: Status 404 returned error can't find the container with id adfcf56ab7ea6808a46c836b426a202958a03636c274317429754ed20ea214ca Oct 11 00:52:06 crc kubenswrapper[4743]: W1011 00:52:06.351697 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-6e8dfbbc782885d1479cb460bb5dc40e0b7432d5cbb99ce95437a44285326a45 WatchSource:0}: Error finding container 6e8dfbbc782885d1479cb460bb5dc40e0b7432d5cbb99ce95437a44285326a45: Status 404 returned error can't find the container with id 6e8dfbbc782885d1479cb460bb5dc40e0b7432d5cbb99ce95437a44285326a45 Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.410297 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.410543 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.410561 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.410637 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.410649 4743 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:06Z","lastTransitionTime":"2025-10-11T00:52:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.514105 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.514142 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.514152 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.514167 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.514177 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:06Z","lastTransitionTime":"2025-10-11T00:52:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.571175 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 00:52:06 crc kubenswrapper[4743]: E1011 00:52:06.571691 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 00:52:07.571667234 +0000 UTC m=+22.224647651 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.616590 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.616658 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.616688 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.616710 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:06 crc kubenswrapper[4743]: 
I1011 00:52:06.616724 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:06Z","lastTransitionTime":"2025-10-11T00:52:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.672017 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.672070 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.672097 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.672128 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 00:52:06 crc kubenswrapper[4743]: E1011 00:52:06.672258 4743 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 11 00:52:06 crc kubenswrapper[4743]: E1011 00:52:06.672286 4743 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 11 00:52:06 crc kubenswrapper[4743]: E1011 00:52:06.672346 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 11 00:52:06 crc kubenswrapper[4743]: E1011 00:52:06.672388 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 11 00:52:06 crc kubenswrapper[4743]: E1011 00:52:06.672404 4743 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 11 00:52:06 crc kubenswrapper[4743]: E1011 00:52:06.672320 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 11 00:52:06 crc kubenswrapper[4743]: E1011 00:52:06.672456 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 11 00:52:06 crc kubenswrapper[4743]: E1011 00:52:06.672468 4743 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 11 00:52:06 crc kubenswrapper[4743]: E1011 00:52:06.672433 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-11 00:52:07.672406505 +0000 UTC m=+22.325386892 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 11 00:52:06 crc kubenswrapper[4743]: E1011 00:52:06.672544 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-11 00:52:07.672515507 +0000 UTC m=+22.325495904 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 11 00:52:06 crc kubenswrapper[4743]: E1011 00:52:06.672562 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-11 00:52:07.672553798 +0000 UTC m=+22.325534185 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 11 00:52:06 crc kubenswrapper[4743]: E1011 00:52:06.672572 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-11 00:52:07.672567749 +0000 UTC m=+22.325548136 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.719667 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.719707 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.719716 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.719730 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.719742 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:06Z","lastTransitionTime":"2025-10-11T00:52:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.822662 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.822707 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.822717 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.822733 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.822742 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:06Z","lastTransitionTime":"2025-10-11T00:52:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.925036 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.925086 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.925107 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.925125 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:06 crc kubenswrapper[4743]: I1011 00:52:06.925135 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:06Z","lastTransitionTime":"2025-10-11T00:52:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.027478 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.027519 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.027530 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.027547 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.027557 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:07Z","lastTransitionTime":"2025-10-11T00:52:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.129330 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.129372 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.129382 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.129398 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.129408 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:07Z","lastTransitionTime":"2025-10-11T00:52:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.217842 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.218316 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-cvm72"] Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.218631 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-6wcnk"] Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.218876 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.219101 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-6wcnk" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.219801 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-48ljj"] Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.222593 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-vlxgw"] Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.222938 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-9jfxn"] Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.223463 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.223790 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-9jfxn" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.224098 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-vlxgw" Oct 11 00:52:07 crc kubenswrapper[4743]: W1011 00:52:07.225331 4743 reflector.go:561] object-"openshift-machine-config-operator"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-machine-config-operator": no relationship found between node 'crc' and this object Oct 11 00:52:07 crc kubenswrapper[4743]: E1011 00:52:07.225375 4743 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-config-operator\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-machine-config-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 11 00:52:07 crc kubenswrapper[4743]: W1011 00:52:07.225440 4743 reflector.go:561] object-"openshift-multus"/"cni-copy-resources": failed to list *v1.ConfigMap: configmaps "cni-copy-resources" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.225449 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.225459 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Oct 11 00:52:07 crc kubenswrapper[4743]: W1011 00:52:07.225501 4743 reflector.go:561] object-"openshift-machine-config-operator"/"proxy-tls": failed to list *v1.Secret: secrets "proxy-tls" is forbidden: User "system:node:crc" cannot list resource "secrets" in API 
group "" in the namespace "openshift-machine-config-operator": no relationship found between node 'crc' and this object Oct 11 00:52:07 crc kubenswrapper[4743]: E1011 00:52:07.225519 4743 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-config-operator\"/\"proxy-tls\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"proxy-tls\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-machine-config-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 11 00:52:07 crc kubenswrapper[4743]: E1011 00:52:07.225453 4743 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"cni-copy-resources\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"cni-copy-resources\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 11 00:52:07 crc kubenswrapper[4743]: W1011 00:52:07.225564 4743 reflector.go:561] object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz": failed to list *v1.Secret: secrets "multus-ancillary-tools-dockercfg-vnmsz" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object Oct 11 00:52:07 crc kubenswrapper[4743]: E1011 00:52:07.225576 4743 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-vnmsz\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"multus-ancillary-tools-dockercfg-vnmsz\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 11 00:52:07 crc 
kubenswrapper[4743]: I1011 00:52:07.225651 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Oct 11 00:52:07 crc kubenswrapper[4743]: W1011 00:52:07.225818 4743 reflector.go:561] object-"openshift-machine-config-operator"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-machine-config-operator": no relationship found between node 'crc' and this object Oct 11 00:52:07 crc kubenswrapper[4743]: E1011 00:52:07.225843 4743 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-config-operator\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-machine-config-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 11 00:52:07 crc kubenswrapper[4743]: W1011 00:52:07.225909 4743 reflector.go:561] object-"openshift-multus"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object Oct 11 00:52:07 crc kubenswrapper[4743]: E1011 00:52:07.225925 4743 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 11 00:52:07 crc kubenswrapper[4743]: W1011 00:52:07.225978 4743 reflector.go:561] 
object-"openshift-multus"/"default-cni-sysctl-allowlist": failed to list *v1.ConfigMap: configmaps "default-cni-sysctl-allowlist" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object Oct 11 00:52:07 crc kubenswrapper[4743]: E1011 00:52:07.225996 4743 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"default-cni-sysctl-allowlist\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 11 00:52:07 crc kubenswrapper[4743]: W1011 00:52:07.226044 4743 reflector.go:561] object-"openshift-machine-config-operator"/"kube-rbac-proxy": failed to list *v1.ConfigMap: configmaps "kube-rbac-proxy" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-machine-config-operator": no relationship found between node 'crc' and this object Oct 11 00:52:07 crc kubenswrapper[4743]: E1011 00:52:07.226059 4743 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-config-operator\"/\"kube-rbac-proxy\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-rbac-proxy\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-machine-config-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.226188 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Oct 11 00:52:07 crc kubenswrapper[4743]: W1011 00:52:07.226355 4743 reflector.go:561] 
object-"openshift-multus"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object Oct 11 00:52:07 crc kubenswrapper[4743]: E1011 00:52:07.226380 4743 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.230645 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.230756 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.230888 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.230917 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.231105 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.232161 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.232309 4743 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-dns"/"kube-root-ca.crt" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.232521 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.235123 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.235491 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.235662 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.235677 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.235739 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.235771 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:07Z","lastTransitionTime":"2025-10-11T00:52:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.243264 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"1bca7230a67c579c0fc54921d292f52e18ce2c25f79cd4a351a9f6b576df3553"} Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.243331 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"ab9cd20bc5b28791e21984a0edf4fbf50acc05d6160780d0696698715242b5d0"} Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.243345 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"946f86b833c7ca099bb9859660f6e25e9841ac42afe8e8a9083a68263026bc29"} Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.244758 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"6e8dfbbc782885d1479cb460bb5dc40e0b7432d5cbb99ce95437a44285326a45"} Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.248135 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"d4cd23d7ad070c4d2f029567ac2186d797b082b8da8ff2c9811982a1c5c7b0ef"} Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.248275 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"adfcf56ab7ea6808a46c836b426a202958a03636c274317429754ed20ea214ca"} Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.253224 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.255488 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b297361a1a2f13aa53d7a5a58948dbe86e8e00f888e6a69cc884154828c29a41"} Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.255891 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.276833 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a657286e-3a6d-4265-ae33-a7d2ce8b64ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1558bde7a3b8b01c9ce7f6907ecc43abc6e27587236be7b41487968effa33bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2616059cd6d57ff41cdaa2e1b35e0e8b474398c434022ba6b9e926e19e64c0f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7f0392332521983f1ea8939df48e05618035ca97895122067daef61ee1fdb7af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://530c39a66f06ca33cba6a3ab75cb68148425f5a732b503d054b6c74a9d48fabf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://530c39a66f06ca33cba6a3ab75cb68148425f5a732b503d054b6c74a9d48fabf\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-11T00:52:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1011 00:51:59.674497 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1011 00:51:59.678201 1 dynamic_serving_content.go:116] 
\\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3177302691/tls.crt::/tmp/serving-cert-3177302691/tls.key\\\\\\\"\\\\nI1011 00:52:05.851548 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1011 00:52:05.855373 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1011 00:52:05.855414 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1011 00:52:05.855455 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1011 00:52:05.855472 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1011 00:52:05.866907 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1011 00:52:05.866935 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 00:52:05.866939 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 00:52:05.866944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1011 00:52:05.866947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1011 00:52:05.866951 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1011 00:52:05.866954 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1011 00:52:05.866970 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1011 00:52:05.870518 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1c0227825fa743bb310a1dfb08333cea5ee16a2a2f30b77c97e2f28a2319542\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d411bef56dd0ccdad159fc018c15ad4f0df6adee56150b1143e2dcf442a0fe18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d411bef56dd0ccdad159fc018c15ad4f0df6adee56150b1143e2dcf442a0fe18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:07Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.277223 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-log-socket\") pod \"ovnkube-node-48ljj\" (UID: \"9ed16b35-862f-47f2-9e32-63c98f868fb8\") " pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.277335 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-48ljj\" (UID: \"9ed16b35-862f-47f2-9e32-63c98f868fb8\") " pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.277446 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-host-slash\") pod \"ovnkube-node-48ljj\" (UID: \"9ed16b35-862f-47f2-9e32-63c98f868fb8\") " pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.277544 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/e8c603f4-717c-4554-992a-8338b3bef24d-host-var-lib-cni-bin\") pod \"multus-9jfxn\" (UID: \"e8c603f4-717c-4554-992a-8338b3bef24d\") " pod="openshift-multus/multus-9jfxn" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.277644 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d677d122-c8be-4938-8d2c-bde4a088a63a-hosts-file\") pod \"node-resolver-vlxgw\" (UID: \"d677d122-c8be-4938-8d2c-bde4a088a63a\") " pod="openshift-dns/node-resolver-vlxgw" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.277731 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-systemd-units\") pod \"ovnkube-node-48ljj\" (UID: \"9ed16b35-862f-47f2-9e32-63c98f868fb8\") " pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.277812 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-etc-openvswitch\") pod \"ovnkube-node-48ljj\" (UID: \"9ed16b35-862f-47f2-9e32-63c98f868fb8\") " pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.277926 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e8c603f4-717c-4554-992a-8338b3bef24d-system-cni-dir\") pod \"multus-9jfxn\" (UID: \"e8c603f4-717c-4554-992a-8338b3bef24d\") " pod="openshift-multus/multus-9jfxn" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.278025 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: 
\"kubernetes.io/host-path/add92263-e252-446b-95de-092585b4357f-rootfs\") pod \"machine-config-daemon-cvm72\" (UID: \"add92263-e252-446b-95de-092585b4357f\") " pod="openshift-machine-config-operator/machine-config-daemon-cvm72" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.278118 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jkrc\" (UniqueName: \"kubernetes.io/projected/06a7b971-8779-491c-8d3f-e7d5b4d60968-kube-api-access-5jkrc\") pod \"multus-additional-cni-plugins-6wcnk\" (UID: \"06a7b971-8779-491c-8d3f-e7d5b4d60968\") " pod="openshift-multus/multus-additional-cni-plugins-6wcnk" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.278206 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-host-run-netns\") pod \"ovnkube-node-48ljj\" (UID: \"9ed16b35-862f-47f2-9e32-63c98f868fb8\") " pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.278293 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-run-systemd\") pod \"ovnkube-node-48ljj\" (UID: \"9ed16b35-862f-47f2-9e32-63c98f868fb8\") " pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.278389 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9ed16b35-862f-47f2-9e32-63c98f868fb8-ovnkube-config\") pod \"ovnkube-node-48ljj\" (UID: \"9ed16b35-862f-47f2-9e32-63c98f868fb8\") " pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.278483 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9ed16b35-862f-47f2-9e32-63c98f868fb8-ovnkube-script-lib\") pod \"ovnkube-node-48ljj\" (UID: \"9ed16b35-862f-47f2-9e32-63c98f868fb8\") " pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.278572 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e8c603f4-717c-4554-992a-8338b3bef24d-cnibin\") pod \"multus-9jfxn\" (UID: \"e8c603f4-717c-4554-992a-8338b3bef24d\") " pod="openshift-multus/multus-9jfxn" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.278660 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e8c603f4-717c-4554-992a-8338b3bef24d-multus-conf-dir\") pod \"multus-9jfxn\" (UID: \"e8c603f4-717c-4554-992a-8338b3bef24d\") " pod="openshift-multus/multus-9jfxn" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.278749 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d42sg\" (UniqueName: \"kubernetes.io/projected/add92263-e252-446b-95de-092585b4357f-kube-api-access-d42sg\") pod \"machine-config-daemon-cvm72\" (UID: \"add92263-e252-446b-95de-092585b4357f\") " pod="openshift-machine-config-operator/machine-config-daemon-cvm72" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.278873 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e8c603f4-717c-4554-992a-8338b3bef24d-cni-binary-copy\") pod \"multus-9jfxn\" (UID: \"e8c603f4-717c-4554-992a-8338b3bef24d\") " pod="openshift-multus/multus-9jfxn" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.278958 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e8c603f4-717c-4554-992a-8338b3bef24d-hostroot\") pod \"multus-9jfxn\" (UID: \"e8c603f4-717c-4554-992a-8338b3bef24d\") " pod="openshift-multus/multus-9jfxn" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.279051 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e8c603f4-717c-4554-992a-8338b3bef24d-host-run-multus-certs\") pod \"multus-9jfxn\" (UID: \"e8c603f4-717c-4554-992a-8338b3bef24d\") " pod="openshift-multus/multus-9jfxn" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.279133 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95k6c\" (UniqueName: \"kubernetes.io/projected/d677d122-c8be-4938-8d2c-bde4a088a63a-kube-api-access-95k6c\") pod \"node-resolver-vlxgw\" (UID: \"d677d122-c8be-4938-8d2c-bde4a088a63a\") " pod="openshift-dns/node-resolver-vlxgw" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.279218 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/add92263-e252-446b-95de-092585b4357f-proxy-tls\") pod \"machine-config-daemon-cvm72\" (UID: \"add92263-e252-446b-95de-092585b4357f\") " pod="openshift-machine-config-operator/machine-config-daemon-cvm72" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.279309 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-host-run-ovn-kubernetes\") pod \"ovnkube-node-48ljj\" (UID: \"9ed16b35-862f-47f2-9e32-63c98f868fb8\") " pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.279395 4743 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-host-cni-netd\") pod \"ovnkube-node-48ljj\" (UID: \"9ed16b35-862f-47f2-9e32-63c98f868fb8\") " pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.279488 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e8c603f4-717c-4554-992a-8338b3bef24d-multus-socket-dir-parent\") pod \"multus-9jfxn\" (UID: \"e8c603f4-717c-4554-992a-8338b3bef24d\") " pod="openshift-multus/multus-9jfxn" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.280148 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/06a7b971-8779-491c-8d3f-e7d5b4d60968-os-release\") pod \"multus-additional-cni-plugins-6wcnk\" (UID: \"06a7b971-8779-491c-8d3f-e7d5b4d60968\") " pod="openshift-multus/multus-additional-cni-plugins-6wcnk" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.280256 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/06a7b971-8779-491c-8d3f-e7d5b4d60968-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6wcnk\" (UID: \"06a7b971-8779-491c-8d3f-e7d5b4d60968\") " pod="openshift-multus/multus-additional-cni-plugins-6wcnk" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.280345 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-node-log\") pod \"ovnkube-node-48ljj\" (UID: \"9ed16b35-862f-47f2-9e32-63c98f868fb8\") " pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" Oct 11 00:52:07 crc 
kubenswrapper[4743]: I1011 00:52:07.280439 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e8c603f4-717c-4554-992a-8338b3bef24d-etc-kubernetes\") pod \"multus-9jfxn\" (UID: \"e8c603f4-717c-4554-992a-8338b3bef24d\") " pod="openshift-multus/multus-9jfxn" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.280530 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-run-ovn\") pod \"ovnkube-node-48ljj\" (UID: \"9ed16b35-862f-47f2-9e32-63c98f868fb8\") " pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.280639 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e8c603f4-717c-4554-992a-8338b3bef24d-host-run-netns\") pod \"multus-9jfxn\" (UID: \"e8c603f4-717c-4554-992a-8338b3bef24d\") " pod="openshift-multus/multus-9jfxn" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.280739 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/06a7b971-8779-491c-8d3f-e7d5b4d60968-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6wcnk\" (UID: \"06a7b971-8779-491c-8d3f-e7d5b4d60968\") " pod="openshift-multus/multus-additional-cni-plugins-6wcnk" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.280830 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xw92h\" (UniqueName: \"kubernetes.io/projected/9ed16b35-862f-47f2-9e32-63c98f868fb8-kube-api-access-xw92h\") pod \"ovnkube-node-48ljj\" (UID: \"9ed16b35-862f-47f2-9e32-63c98f868fb8\") " pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" Oct 11 
00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.280930 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e8c603f4-717c-4554-992a-8338b3bef24d-host-var-lib-kubelet\") pod \"multus-9jfxn\" (UID: \"e8c603f4-717c-4554-992a-8338b3bef24d\") " pod="openshift-multus/multus-9jfxn" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.281024 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/06a7b971-8779-491c-8d3f-e7d5b4d60968-cnibin\") pod \"multus-additional-cni-plugins-6wcnk\" (UID: \"06a7b971-8779-491c-8d3f-e7d5b4d60968\") " pod="openshift-multus/multus-additional-cni-plugins-6wcnk" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.281113 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/06a7b971-8779-491c-8d3f-e7d5b4d60968-cni-binary-copy\") pod \"multus-additional-cni-plugins-6wcnk\" (UID: \"06a7b971-8779-491c-8d3f-e7d5b4d60968\") " pod="openshift-multus/multus-additional-cni-plugins-6wcnk" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.281227 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-host-cni-bin\") pod \"ovnkube-node-48ljj\" (UID: \"9ed16b35-862f-47f2-9e32-63c98f868fb8\") " pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.281311 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e8c603f4-717c-4554-992a-8338b3bef24d-os-release\") pod \"multus-9jfxn\" (UID: \"e8c603f4-717c-4554-992a-8338b3bef24d\") " 
pod="openshift-multus/multus-9jfxn" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.281399 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5b2pr\" (UniqueName: \"kubernetes.io/projected/e8c603f4-717c-4554-992a-8338b3bef24d-kube-api-access-5b2pr\") pod \"multus-9jfxn\" (UID: \"e8c603f4-717c-4554-992a-8338b3bef24d\") " pod="openshift-multus/multus-9jfxn" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.281497 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e8c603f4-717c-4554-992a-8338b3bef24d-host-run-k8s-cni-cncf-io\") pod \"multus-9jfxn\" (UID: \"e8c603f4-717c-4554-992a-8338b3bef24d\") " pod="openshift-multus/multus-9jfxn" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.281587 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e8c603f4-717c-4554-992a-8338b3bef24d-multus-cni-dir\") pod \"multus-9jfxn\" (UID: \"e8c603f4-717c-4554-992a-8338b3bef24d\") " pod="openshift-multus/multus-9jfxn" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.281680 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e8c603f4-717c-4554-992a-8338b3bef24d-multus-daemon-config\") pod \"multus-9jfxn\" (UID: \"e8c603f4-717c-4554-992a-8338b3bef24d\") " pod="openshift-multus/multus-9jfxn" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.281766 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/add92263-e252-446b-95de-092585b4357f-mcd-auth-proxy-config\") pod \"machine-config-daemon-cvm72\" (UID: \"add92263-e252-446b-95de-092585b4357f\") " 
pod="openshift-machine-config-operator/machine-config-daemon-cvm72" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.281881 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/06a7b971-8779-491c-8d3f-e7d5b4d60968-system-cni-dir\") pod \"multus-additional-cni-plugins-6wcnk\" (UID: \"06a7b971-8779-491c-8d3f-e7d5b4d60968\") " pod="openshift-multus/multus-additional-cni-plugins-6wcnk" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.281996 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-host-kubelet\") pod \"ovnkube-node-48ljj\" (UID: \"9ed16b35-862f-47f2-9e32-63c98f868fb8\") " pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.282093 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-var-lib-openvswitch\") pod \"ovnkube-node-48ljj\" (UID: \"9ed16b35-862f-47f2-9e32-63c98f868fb8\") " pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.282184 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-run-openvswitch\") pod \"ovnkube-node-48ljj\" (UID: \"9ed16b35-862f-47f2-9e32-63c98f868fb8\") " pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.282255 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9ed16b35-862f-47f2-9e32-63c98f868fb8-env-overrides\") pod 
\"ovnkube-node-48ljj\" (UID: \"9ed16b35-862f-47f2-9e32-63c98f868fb8\") " pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.282352 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9ed16b35-862f-47f2-9e32-63c98f868fb8-ovn-node-metrics-cert\") pod \"ovnkube-node-48ljj\" (UID: \"9ed16b35-862f-47f2-9e32-63c98f868fb8\") " pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.282419 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e8c603f4-717c-4554-992a-8338b3bef24d-host-var-lib-cni-multus\") pod \"multus-9jfxn\" (UID: \"e8c603f4-717c-4554-992a-8338b3bef24d\") " pod="openshift-multus/multus-9jfxn" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.289382 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:07Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.308313 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:07Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.321622 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:07Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.335633 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"add92263-e252-446b-95de-092585b4357f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d42sg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d42sg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cvm72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:07Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.338174 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.338220 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.338230 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.338244 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.338253 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:07Z","lastTransitionTime":"2025-10-11T00:52:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.351170 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:07Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.364033 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:07Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.375898 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:07Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.383798 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-run-ovn\") pod \"ovnkube-node-48ljj\" (UID: \"9ed16b35-862f-47f2-9e32-63c98f868fb8\") " pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.383829 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e8c603f4-717c-4554-992a-8338b3bef24d-host-run-netns\") pod \"multus-9jfxn\" (UID: \"e8c603f4-717c-4554-992a-8338b3bef24d\") " pod="openshift-multus/multus-9jfxn" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.383847 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/06a7b971-8779-491c-8d3f-e7d5b4d60968-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6wcnk\" (UID: \"06a7b971-8779-491c-8d3f-e7d5b4d60968\") " pod="openshift-multus/multus-additional-cni-plugins-6wcnk" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.383880 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xw92h\" (UniqueName: \"kubernetes.io/projected/9ed16b35-862f-47f2-9e32-63c98f868fb8-kube-api-access-xw92h\") pod \"ovnkube-node-48ljj\" (UID: \"9ed16b35-862f-47f2-9e32-63c98f868fb8\") " pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.383897 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e8c603f4-717c-4554-992a-8338b3bef24d-host-var-lib-kubelet\") pod \"multus-9jfxn\" (UID: \"e8c603f4-717c-4554-992a-8338b3bef24d\") " pod="openshift-multus/multus-9jfxn" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.383912 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/06a7b971-8779-491c-8d3f-e7d5b4d60968-cnibin\") pod \"multus-additional-cni-plugins-6wcnk\" (UID: \"06a7b971-8779-491c-8d3f-e7d5b4d60968\") " pod="openshift-multus/multus-additional-cni-plugins-6wcnk" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.383927 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/06a7b971-8779-491c-8d3f-e7d5b4d60968-cni-binary-copy\") pod \"multus-additional-cni-plugins-6wcnk\" (UID: \"06a7b971-8779-491c-8d3f-e7d5b4d60968\") " pod="openshift-multus/multus-additional-cni-plugins-6wcnk" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.383959 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-host-cni-bin\") pod \"ovnkube-node-48ljj\" (UID: \"9ed16b35-862f-47f2-9e32-63c98f868fb8\") " pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.383973 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e8c603f4-717c-4554-992a-8338b3bef24d-os-release\") pod \"multus-9jfxn\" (UID: \"e8c603f4-717c-4554-992a-8338b3bef24d\") " pod="openshift-multus/multus-9jfxn" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.383987 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5b2pr\" (UniqueName: \"kubernetes.io/projected/e8c603f4-717c-4554-992a-8338b3bef24d-kube-api-access-5b2pr\") pod \"multus-9jfxn\" (UID: \"e8c603f4-717c-4554-992a-8338b3bef24d\") " pod="openshift-multus/multus-9jfxn" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.384014 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e8c603f4-717c-4554-992a-8338b3bef24d-host-run-k8s-cni-cncf-io\") pod \"multus-9jfxn\" (UID: \"e8c603f4-717c-4554-992a-8338b3bef24d\") " pod="openshift-multus/multus-9jfxn" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.384035 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e8c603f4-717c-4554-992a-8338b3bef24d-multus-cni-dir\") pod \"multus-9jfxn\" (UID: \"e8c603f4-717c-4554-992a-8338b3bef24d\") " pod="openshift-multus/multus-9jfxn" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.384049 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e8c603f4-717c-4554-992a-8338b3bef24d-multus-daemon-config\") pod 
\"multus-9jfxn\" (UID: \"e8c603f4-717c-4554-992a-8338b3bef24d\") " pod="openshift-multus/multus-9jfxn" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.384065 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/add92263-e252-446b-95de-092585b4357f-mcd-auth-proxy-config\") pod \"machine-config-daemon-cvm72\" (UID: \"add92263-e252-446b-95de-092585b4357f\") " pod="openshift-machine-config-operator/machine-config-daemon-cvm72" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.384081 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/06a7b971-8779-491c-8d3f-e7d5b4d60968-system-cni-dir\") pod \"multus-additional-cni-plugins-6wcnk\" (UID: \"06a7b971-8779-491c-8d3f-e7d5b4d60968\") " pod="openshift-multus/multus-additional-cni-plugins-6wcnk" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.384096 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-host-kubelet\") pod \"ovnkube-node-48ljj\" (UID: \"9ed16b35-862f-47f2-9e32-63c98f868fb8\") " pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.384110 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-var-lib-openvswitch\") pod \"ovnkube-node-48ljj\" (UID: \"9ed16b35-862f-47f2-9e32-63c98f868fb8\") " pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.384127 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-run-openvswitch\") pod \"ovnkube-node-48ljj\" 
(UID: \"9ed16b35-862f-47f2-9e32-63c98f868fb8\") " pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.384142 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9ed16b35-862f-47f2-9e32-63c98f868fb8-env-overrides\") pod \"ovnkube-node-48ljj\" (UID: \"9ed16b35-862f-47f2-9e32-63c98f868fb8\") " pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.384159 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9ed16b35-862f-47f2-9e32-63c98f868fb8-ovn-node-metrics-cert\") pod \"ovnkube-node-48ljj\" (UID: \"9ed16b35-862f-47f2-9e32-63c98f868fb8\") " pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.384175 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e8c603f4-717c-4554-992a-8338b3bef24d-host-var-lib-cni-multus\") pod \"multus-9jfxn\" (UID: \"e8c603f4-717c-4554-992a-8338b3bef24d\") " pod="openshift-multus/multus-9jfxn" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.384190 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-log-socket\") pod \"ovnkube-node-48ljj\" (UID: \"9ed16b35-862f-47f2-9e32-63c98f868fb8\") " pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.384206 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-48ljj\" (UID: 
\"9ed16b35-862f-47f2-9e32-63c98f868fb8\") " pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.384230 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-host-slash\") pod \"ovnkube-node-48ljj\" (UID: \"9ed16b35-862f-47f2-9e32-63c98f868fb8\") " pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.384244 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e8c603f4-717c-4554-992a-8338b3bef24d-host-var-lib-cni-bin\") pod \"multus-9jfxn\" (UID: \"e8c603f4-717c-4554-992a-8338b3bef24d\") " pod="openshift-multus/multus-9jfxn" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.384258 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jkrc\" (UniqueName: \"kubernetes.io/projected/06a7b971-8779-491c-8d3f-e7d5b4d60968-kube-api-access-5jkrc\") pod \"multus-additional-cni-plugins-6wcnk\" (UID: \"06a7b971-8779-491c-8d3f-e7d5b4d60968\") " pod="openshift-multus/multus-additional-cni-plugins-6wcnk" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.384280 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d677d122-c8be-4938-8d2c-bde4a088a63a-hosts-file\") pod \"node-resolver-vlxgw\" (UID: \"d677d122-c8be-4938-8d2c-bde4a088a63a\") " pod="openshift-dns/node-resolver-vlxgw" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.384294 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-systemd-units\") pod \"ovnkube-node-48ljj\" (UID: \"9ed16b35-862f-47f2-9e32-63c98f868fb8\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.384310 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-etc-openvswitch\") pod \"ovnkube-node-48ljj\" (UID: \"9ed16b35-862f-47f2-9e32-63c98f868fb8\") " pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.384328 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e8c603f4-717c-4554-992a-8338b3bef24d-system-cni-dir\") pod \"multus-9jfxn\" (UID: \"e8c603f4-717c-4554-992a-8338b3bef24d\") " pod="openshift-multus/multus-9jfxn" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.384344 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/add92263-e252-446b-95de-092585b4357f-rootfs\") pod \"machine-config-daemon-cvm72\" (UID: \"add92263-e252-446b-95de-092585b4357f\") " pod="openshift-machine-config-operator/machine-config-daemon-cvm72" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.384363 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e8c603f4-717c-4554-992a-8338b3bef24d-multus-conf-dir\") pod \"multus-9jfxn\" (UID: \"e8c603f4-717c-4554-992a-8338b3bef24d\") " pod="openshift-multus/multus-9jfxn" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.384381 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d42sg\" (UniqueName: \"kubernetes.io/projected/add92263-e252-446b-95de-092585b4357f-kube-api-access-d42sg\") pod \"machine-config-daemon-cvm72\" (UID: \"add92263-e252-446b-95de-092585b4357f\") " pod="openshift-machine-config-operator/machine-config-daemon-cvm72" Oct 11 
00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.384400 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-host-run-netns\") pod \"ovnkube-node-48ljj\" (UID: \"9ed16b35-862f-47f2-9e32-63c98f868fb8\") " pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.384414 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-run-systemd\") pod \"ovnkube-node-48ljj\" (UID: \"9ed16b35-862f-47f2-9e32-63c98f868fb8\") " pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.384414 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/06a7b971-8779-491c-8d3f-e7d5b4d60968-system-cni-dir\") pod \"multus-additional-cni-plugins-6wcnk\" (UID: \"06a7b971-8779-491c-8d3f-e7d5b4d60968\") " pod="openshift-multus/multus-additional-cni-plugins-6wcnk" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.384431 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9ed16b35-862f-47f2-9e32-63c98f868fb8-ovnkube-config\") pod \"ovnkube-node-48ljj\" (UID: \"9ed16b35-862f-47f2-9e32-63c98f868fb8\") " pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.384474 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9ed16b35-862f-47f2-9e32-63c98f868fb8-ovnkube-script-lib\") pod \"ovnkube-node-48ljj\" (UID: \"9ed16b35-862f-47f2-9e32-63c98f868fb8\") " pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.384494 
4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e8c603f4-717c-4554-992a-8338b3bef24d-cnibin\") pod \"multus-9jfxn\" (UID: \"e8c603f4-717c-4554-992a-8338b3bef24d\") " pod="openshift-multus/multus-9jfxn" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.384538 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e8c603f4-717c-4554-992a-8338b3bef24d-cni-binary-copy\") pod \"multus-9jfxn\" (UID: \"e8c603f4-717c-4554-992a-8338b3bef24d\") " pod="openshift-multus/multus-9jfxn" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.384554 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e8c603f4-717c-4554-992a-8338b3bef24d-hostroot\") pod \"multus-9jfxn\" (UID: \"e8c603f4-717c-4554-992a-8338b3bef24d\") " pod="openshift-multus/multus-9jfxn" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.384571 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e8c603f4-717c-4554-992a-8338b3bef24d-host-run-multus-certs\") pod \"multus-9jfxn\" (UID: \"e8c603f4-717c-4554-992a-8338b3bef24d\") " pod="openshift-multus/multus-9jfxn" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.384590 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95k6c\" (UniqueName: \"kubernetes.io/projected/d677d122-c8be-4938-8d2c-bde4a088a63a-kube-api-access-95k6c\") pod \"node-resolver-vlxgw\" (UID: \"d677d122-c8be-4938-8d2c-bde4a088a63a\") " pod="openshift-dns/node-resolver-vlxgw" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.384606 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/add92263-e252-446b-95de-092585b4357f-proxy-tls\") pod \"machine-config-daemon-cvm72\" (UID: \"add92263-e252-446b-95de-092585b4357f\") " pod="openshift-machine-config-operator/machine-config-daemon-cvm72" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.384623 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-host-run-ovn-kubernetes\") pod \"ovnkube-node-48ljj\" (UID: \"9ed16b35-862f-47f2-9e32-63c98f868fb8\") " pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.384640 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-host-cni-netd\") pod \"ovnkube-node-48ljj\" (UID: \"9ed16b35-862f-47f2-9e32-63c98f868fb8\") " pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.384659 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e8c603f4-717c-4554-992a-8338b3bef24d-multus-socket-dir-parent\") pod \"multus-9jfxn\" (UID: \"e8c603f4-717c-4554-992a-8338b3bef24d\") " pod="openshift-multus/multus-9jfxn" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.384706 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/06a7b971-8779-491c-8d3f-e7d5b4d60968-os-release\") pod \"multus-additional-cni-plugins-6wcnk\" (UID: \"06a7b971-8779-491c-8d3f-e7d5b4d60968\") " pod="openshift-multus/multus-additional-cni-plugins-6wcnk" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.384725 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/06a7b971-8779-491c-8d3f-e7d5b4d60968-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6wcnk\" (UID: \"06a7b971-8779-491c-8d3f-e7d5b4d60968\") " pod="openshift-multus/multus-additional-cni-plugins-6wcnk" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.384743 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-node-log\") pod \"ovnkube-node-48ljj\" (UID: \"9ed16b35-862f-47f2-9e32-63c98f868fb8\") " pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.384763 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e8c603f4-717c-4554-992a-8338b3bef24d-etc-kubernetes\") pod \"multus-9jfxn\" (UID: \"e8c603f4-717c-4554-992a-8338b3bef24d\") " pod="openshift-multus/multus-9jfxn" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.384834 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e8c603f4-717c-4554-992a-8338b3bef24d-etc-kubernetes\") pod \"multus-9jfxn\" (UID: \"e8c603f4-717c-4554-992a-8338b3bef24d\") " pod="openshift-multus/multus-9jfxn" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.384889 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-run-ovn\") pod \"ovnkube-node-48ljj\" (UID: \"9ed16b35-862f-47f2-9e32-63c98f868fb8\") " pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.384920 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e8c603f4-717c-4554-992a-8338b3bef24d-host-run-netns\") pod \"multus-9jfxn\" (UID: 
\"e8c603f4-717c-4554-992a-8338b3bef24d\") " pod="openshift-multus/multus-9jfxn" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.385049 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9ed16b35-862f-47f2-9e32-63c98f868fb8-ovnkube-config\") pod \"ovnkube-node-48ljj\" (UID: \"9ed16b35-862f-47f2-9e32-63c98f868fb8\") " pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.385094 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-host-kubelet\") pod \"ovnkube-node-48ljj\" (UID: \"9ed16b35-862f-47f2-9e32-63c98f868fb8\") " pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.385118 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-var-lib-openvswitch\") pod \"ovnkube-node-48ljj\" (UID: \"9ed16b35-862f-47f2-9e32-63c98f868fb8\") " pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.385139 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-run-openvswitch\") pod \"ovnkube-node-48ljj\" (UID: \"9ed16b35-862f-47f2-9e32-63c98f868fb8\") " pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.385268 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e8c603f4-717c-4554-992a-8338b3bef24d-cnibin\") pod \"multus-9jfxn\" (UID: \"e8c603f4-717c-4554-992a-8338b3bef24d\") " pod="openshift-multus/multus-9jfxn" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.385506 4743 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9ed16b35-862f-47f2-9e32-63c98f868fb8-env-overrides\") pod \"ovnkube-node-48ljj\" (UID: \"9ed16b35-862f-47f2-9e32-63c98f868fb8\") " pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.385550 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-systemd-units\") pod \"ovnkube-node-48ljj\" (UID: \"9ed16b35-862f-47f2-9e32-63c98f868fb8\") " pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.385564 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e8c603f4-717c-4554-992a-8338b3bef24d-hostroot\") pod \"multus-9jfxn\" (UID: \"e8c603f4-717c-4554-992a-8338b3bef24d\") " pod="openshift-multus/multus-9jfxn" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.385574 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-etc-openvswitch\") pod \"ovnkube-node-48ljj\" (UID: \"9ed16b35-862f-47f2-9e32-63c98f868fb8\") " pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.385590 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e8c603f4-717c-4554-992a-8338b3bef24d-host-run-multus-certs\") pod \"multus-9jfxn\" (UID: \"e8c603f4-717c-4554-992a-8338b3bef24d\") " pod="openshift-multus/multus-9jfxn" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.385613 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/e8c603f4-717c-4554-992a-8338b3bef24d-system-cni-dir\") pod \"multus-9jfxn\" (UID: \"e8c603f4-717c-4554-992a-8338b3bef24d\") " pod="openshift-multus/multus-9jfxn" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.385619 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9ed16b35-862f-47f2-9e32-63c98f868fb8-ovnkube-script-lib\") pod \"ovnkube-node-48ljj\" (UID: \"9ed16b35-862f-47f2-9e32-63c98f868fb8\") " pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.385642 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/add92263-e252-446b-95de-092585b4357f-rootfs\") pod \"machine-config-daemon-cvm72\" (UID: \"add92263-e252-446b-95de-092585b4357f\") " pod="openshift-machine-config-operator/machine-config-daemon-cvm72" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.385667 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e8c603f4-717c-4554-992a-8338b3bef24d-multus-conf-dir\") pod \"multus-9jfxn\" (UID: \"e8c603f4-717c-4554-992a-8338b3bef24d\") " pod="openshift-multus/multus-9jfxn" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.385689 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e8c603f4-717c-4554-992a-8338b3bef24d-host-var-lib-kubelet\") pod \"multus-9jfxn\" (UID: \"e8c603f4-717c-4554-992a-8338b3bef24d\") " pod="openshift-multus/multus-9jfxn" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.385718 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/06a7b971-8779-491c-8d3f-e7d5b4d60968-cnibin\") pod \"multus-additional-cni-plugins-6wcnk\" (UID: 
\"06a7b971-8779-491c-8d3f-e7d5b4d60968\") " pod="openshift-multus/multus-additional-cni-plugins-6wcnk" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.385769 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-host-run-ovn-kubernetes\") pod \"ovnkube-node-48ljj\" (UID: \"9ed16b35-862f-47f2-9e32-63c98f868fb8\") " pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.385797 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-host-cni-netd\") pod \"ovnkube-node-48ljj\" (UID: \"9ed16b35-862f-47f2-9e32-63c98f868fb8\") " pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.385829 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e8c603f4-717c-4554-992a-8338b3bef24d-multus-socket-dir-parent\") pod \"multus-9jfxn\" (UID: \"e8c603f4-717c-4554-992a-8338b3bef24d\") " pod="openshift-multus/multus-9jfxn" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.385868 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-host-run-netns\") pod \"ovnkube-node-48ljj\" (UID: \"9ed16b35-862f-47f2-9e32-63c98f868fb8\") " pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.385885 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/06a7b971-8779-491c-8d3f-e7d5b4d60968-os-release\") pod \"multus-additional-cni-plugins-6wcnk\" (UID: \"06a7b971-8779-491c-8d3f-e7d5b4d60968\") " 
pod="openshift-multus/multus-additional-cni-plugins-6wcnk" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.385898 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-run-systemd\") pod \"ovnkube-node-48ljj\" (UID: \"9ed16b35-862f-47f2-9e32-63c98f868fb8\") " pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.385923 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e8c603f4-717c-4554-992a-8338b3bef24d-host-run-k8s-cni-cncf-io\") pod \"multus-9jfxn\" (UID: \"e8c603f4-717c-4554-992a-8338b3bef24d\") " pod="openshift-multus/multus-9jfxn" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.385941 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-host-cni-bin\") pod \"ovnkube-node-48ljj\" (UID: \"9ed16b35-862f-47f2-9e32-63c98f868fb8\") " pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.385980 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e8c603f4-717c-4554-992a-8338b3bef24d-os-release\") pod \"multus-9jfxn\" (UID: \"e8c603f4-717c-4554-992a-8338b3bef24d\") " pod="openshift-multus/multus-9jfxn" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.385998 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-node-log\") pod \"ovnkube-node-48ljj\" (UID: \"9ed16b35-862f-47f2-9e32-63c98f868fb8\") " pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.386198 4743 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-host-slash\") pod \"ovnkube-node-48ljj\" (UID: \"9ed16b35-862f-47f2-9e32-63c98f868fb8\") " pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.386225 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e8c603f4-717c-4554-992a-8338b3bef24d-multus-cni-dir\") pod \"multus-9jfxn\" (UID: \"e8c603f4-717c-4554-992a-8338b3bef24d\") " pod="openshift-multus/multus-9jfxn" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.386281 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e8c603f4-717c-4554-992a-8338b3bef24d-host-var-lib-cni-bin\") pod \"multus-9jfxn\" (UID: \"e8c603f4-717c-4554-992a-8338b3bef24d\") " pod="openshift-multus/multus-9jfxn" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.386359 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e8c603f4-717c-4554-992a-8338b3bef24d-host-var-lib-cni-multus\") pod \"multus-9jfxn\" (UID: \"e8c603f4-717c-4554-992a-8338b3bef24d\") " pod="openshift-multus/multus-9jfxn" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.386384 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-log-socket\") pod \"ovnkube-node-48ljj\" (UID: \"9ed16b35-862f-47f2-9e32-63c98f868fb8\") " pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.386482 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d677d122-c8be-4938-8d2c-bde4a088a63a-hosts-file\") pod 
\"node-resolver-vlxgw\" (UID: \"d677d122-c8be-4938-8d2c-bde4a088a63a\") " pod="openshift-dns/node-resolver-vlxgw" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.386512 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e8c603f4-717c-4554-992a-8338b3bef24d-multus-daemon-config\") pod \"multus-9jfxn\" (UID: \"e8c603f4-717c-4554-992a-8338b3bef24d\") " pod="openshift-multus/multus-9jfxn" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.386584 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/06a7b971-8779-491c-8d3f-e7d5b4d60968-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6wcnk\" (UID: \"06a7b971-8779-491c-8d3f-e7d5b4d60968\") " pod="openshift-multus/multus-additional-cni-plugins-6wcnk" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.387258 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-48ljj\" (UID: \"9ed16b35-862f-47f2-9e32-63c98f868fb8\") " pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.393706 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9ed16b35-862f-47f2-9e32-63c98f868fb8-ovn-node-metrics-cert\") pod \"ovnkube-node-48ljj\" (UID: \"9ed16b35-862f-47f2-9e32-63c98f868fb8\") " pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.396702 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a657286e-3a6d-4265-ae33-a7d2ce8b64ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1558bde7a3b8b01c9ce7f6907ecc43abc6e27587236be7b41487968effa33bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2616059cd6d57ff41cdaa2e1b35e0e8b474398c434022ba6b9e926e19e64c0f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0392332521983f1ea8939df48e05618035ca97895122067daef61ee1fdb7af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b297361a1a2f13aa53d7a5a58948dbe86e8e00f888e6a69cc884154828c29a41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://530c39a66f06ca33cba6a3ab75cb68148425f5a732b503d054b6c74a9d48fabf\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-11T00:52:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1011 00:51:59.674497 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1011 00:51:59.678201 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3177302691/tls.crt::/tmp/serving-cert-3177302691/tls.key\\\\\\\"\\\\nI1011 00:52:05.851548 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1011 00:52:05.855373 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1011 00:52:05.855414 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1011 00:52:05.855455 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1011 00:52:05.855472 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1011 00:52:05.866907 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1011 00:52:05.866935 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 00:52:05.866939 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 00:52:05.866944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1011 00:52:05.866947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1011 00:52:05.866951 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1011 00:52:05.866954 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1011 00:52:05.866970 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1011 00:52:05.870518 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1c0227825fa743bb310a1dfb08333cea5ee16a2a2f30b77c97e2f28a2319542\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d411bef56dd0ccdad159fc018c15ad4f0df6adee56150b1143e2dcf442a0fe18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d411bef56dd0ccdad159fc018c15ad4f0df6adee56150b1143e2dcf442a0fe18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:07Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.402842 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xw92h\" (UniqueName: \"kubernetes.io/projected/9ed16b35-862f-47f2-9e32-63c98f868fb8-kube-api-access-xw92h\") pod \"ovnkube-node-48ljj\" (UID: \"9ed16b35-862f-47f2-9e32-63c98f868fb8\") " pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.405659 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95k6c\" (UniqueName: \"kubernetes.io/projected/d677d122-c8be-4938-8d2c-bde4a088a63a-kube-api-access-95k6c\") pod \"node-resolver-vlxgw\" (UID: \"d677d122-c8be-4938-8d2c-bde4a088a63a\") " pod="openshift-dns/node-resolver-vlxgw" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.407107 4743 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-dns/node-resolver-vlxgw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d677d122-c8be-4938-8d2c-bde4a088a63a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95k6c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vlxgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:07Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.419577 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cd23d7ad070c4d2f029567ac2186d797b082b8da8ff2c9811982a1c5c7b0ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:07Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.430572 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bca7230a67c579c0fc54921d292f52e18ce2c25f79cd4a351a9f6b576df3553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://ab9cd20bc5b28791e21984a0edf4fbf50acc05d6160780d0696698715242b5d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:07Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.441296 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.441351 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.441364 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.441386 4743 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.441400 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:07Z","lastTransitionTime":"2025-10-11T00:52:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.443026 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:07Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.458429 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9jfxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8c603f4-717c-4554-992a-8338b3bef24d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5b2pr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9jfxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:07Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.468957 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"add92263-e252-446b-95de-092585b4357f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d42sg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d42sg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\
\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cvm72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:07Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.483785 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6wcnk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06a7b971-8779-491c-8d3f-e7d5b4d60968\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6wcnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:07Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.506956 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ed16b35-862f-47f2-9e32-63c98f868fb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-48ljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:07Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.517657 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:07Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.532650 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:07Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.543536 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.543583 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.543597 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.543619 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.543636 4743 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:07Z","lastTransitionTime":"2025-10-11T00:52:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.547824 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:07Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.557361 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.569157 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-vlxgw" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.586953 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 00:52:07 crc kubenswrapper[4743]: E1011 00:52:07.587327 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 00:52:09.587302683 +0000 UTC m=+24.240283090 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.647051 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.647090 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.647099 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.647148 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:07 
crc kubenswrapper[4743]: I1011 00:52:07.647158 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:07Z","lastTransitionTime":"2025-10-11T00:52:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:07 crc kubenswrapper[4743]: W1011 00:52:07.666421 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd677d122_c8be_4938_8d2c_bde4a088a63a.slice/crio-084b129ed147ee8fbdbd34861bfd8b4b48999c30f24c6cdd6f4c546826979833 WatchSource:0}: Error finding container 084b129ed147ee8fbdbd34861bfd8b4b48999c30f24c6cdd6f4c546826979833: Status 404 returned error can't find the container with id 084b129ed147ee8fbdbd34861bfd8b4b48999c30f24c6cdd6f4c546826979833 Oct 11 00:52:07 crc kubenswrapper[4743]: W1011 00:52:07.667396 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ed16b35_862f_47f2_9e32_63c98f868fb8.slice/crio-2f5badd1be3c857cf217ca9129b12849439ab560d2d8295c9af0ad9dfafc557f WatchSource:0}: Error finding container 2f5badd1be3c857cf217ca9129b12849439ab560d2d8295c9af0ad9dfafc557f: Status 404 returned error can't find the container with id 2f5badd1be3c857cf217ca9129b12849439ab560d2d8295c9af0ad9dfafc557f Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.687780 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 
00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.687892 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.687919 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.687955 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 00:52:07 crc kubenswrapper[4743]: E1011 00:52:07.688061 4743 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 11 00:52:07 crc kubenswrapper[4743]: E1011 00:52:07.688097 4743 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 11 00:52:07 crc kubenswrapper[4743]: E1011 00:52:07.688121 4743 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-11 00:52:09.688101104 +0000 UTC m=+24.341081501 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 11 00:52:07 crc kubenswrapper[4743]: E1011 00:52:07.688259 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-11 00:52:09.688230608 +0000 UTC m=+24.341211015 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 11 00:52:07 crc kubenswrapper[4743]: E1011 00:52:07.688355 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 11 00:52:07 crc kubenswrapper[4743]: E1011 00:52:07.688386 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 11 00:52:07 crc kubenswrapper[4743]: E1011 00:52:07.688402 4743 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 11 00:52:07 crc kubenswrapper[4743]: E1011 00:52:07.688443 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-11 00:52:09.688433693 +0000 UTC m=+24.341414100 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 11 00:52:07 crc kubenswrapper[4743]: E1011 00:52:07.688503 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 11 00:52:07 crc kubenswrapper[4743]: E1011 00:52:07.688516 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 11 00:52:07 crc kubenswrapper[4743]: E1011 00:52:07.688527 4743 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 11 00:52:07 crc kubenswrapper[4743]: E1011 00:52:07.688569 4743 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-11 00:52:09.688559576 +0000 UTC m=+24.341539983 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.749654 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.749999 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.750016 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.750038 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.750053 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:07Z","lastTransitionTime":"2025-10-11T00:52:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.854306 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.854343 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.854352 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.854371 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.854380 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:07Z","lastTransitionTime":"2025-10-11T00:52:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.956748 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.956788 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.956799 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.956814 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:07 crc kubenswrapper[4743]: I1011 00:52:07.956823 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:07Z","lastTransitionTime":"2025-10-11T00:52:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.059210 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.059244 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.059253 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.059269 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.059279 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:08Z","lastTransitionTime":"2025-10-11T00:52:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.087629 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.090906 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 00:52:08 crc kubenswrapper[4743]: E1011 00:52:08.091065 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.091503 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 00:52:08 crc kubenswrapper[4743]: E1011 00:52:08.091580 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.092173 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 00:52:08 crc kubenswrapper[4743]: E1011 00:52:08.092245 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.094569 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.156462 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.162329 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.162398 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.162420 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.162458 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.162486 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:08Z","lastTransitionTime":"2025-10-11T00:52:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.171137 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d42sg\" (UniqueName: \"kubernetes.io/projected/add92263-e252-446b-95de-092585b4357f-kube-api-access-d42sg\") pod \"machine-config-daemon-cvm72\" (UID: \"add92263-e252-446b-95de-092585b4357f\") " pod="openshift-machine-config-operator/machine-config-daemon-cvm72" Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.245650 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.258267 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/add92263-e252-446b-95de-092585b4357f-proxy-tls\") pod \"machine-config-daemon-cvm72\" (UID: \"add92263-e252-446b-95de-092585b4357f\") " pod="openshift-machine-config-operator/machine-config-daemon-cvm72" Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.262550 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-vlxgw" event={"ID":"d677d122-c8be-4938-8d2c-bde4a088a63a","Type":"ContainerStarted","Data":"9dbaee9cca1154a094e092eaaf1dea2aeb6a61fbf9323611275485f75ea911e9"} Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.262608 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-vlxgw" event={"ID":"d677d122-c8be-4938-8d2c-bde4a088a63a","Type":"ContainerStarted","Data":"084b129ed147ee8fbdbd34861bfd8b4b48999c30f24c6cdd6f4c546826979833"} Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.263938 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.263974 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 
00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.263986 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.264003 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.264015 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:08Z","lastTransitionTime":"2025-10-11T00:52:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.264785 4743 generic.go:334] "Generic (PLEG): container finished" podID="9ed16b35-862f-47f2-9e32-63c98f868fb8" containerID="f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736" exitCode=0 Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.264904 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" event={"ID":"9ed16b35-862f-47f2-9e32-63c98f868fb8","Type":"ContainerDied","Data":"f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736"} Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.264955 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" event={"ID":"9ed16b35-862f-47f2-9e32-63c98f868fb8","Type":"ContainerStarted","Data":"2f5badd1be3c857cf217ca9129b12849439ab560d2d8295c9af0ad9dfafc557f"} Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.279751 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cd23d7ad070c4d2f029567ac2186d797b082b8da8ff2c9811982a1c5c7b0ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:08Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.291799 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bca7230a67c579c0fc54921d292f52e18ce2c25f79cd4a351a9f6b576df3553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://ab9cd20bc5b28791e21984a0edf4fbf50acc05d6160780d0696698715242b5d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:08Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.306506 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:08Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.306821 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.318709 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9jfxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8c603f4-717c-4554-992a-8338b3bef24d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5b2pr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9jfxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:08Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.330599 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:08Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.342464 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:08Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.343436 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.347638 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/add92263-e252-446b-95de-092585b4357f-mcd-auth-proxy-config\") pod \"machine-config-daemon-cvm72\" (UID: \"add92263-e252-446b-95de-092585b4357f\") " pod="openshift-machine-config-operator/machine-config-daemon-cvm72" Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.356644 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:08Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.365840 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.365902 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.365914 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.365930 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.365942 4743 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:08Z","lastTransitionTime":"2025-10-11T00:52:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.371085 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"add92263-e252-446b-95de-092585b4357f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d42sg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d42sg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cvm72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:08Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:08 crc kubenswrapper[4743]: E1011 00:52:08.386040 4743 configmap.go:193] Couldn't get configMap openshift-multus/cni-copy-resources: failed to sync configmap cache: timed out waiting for the condition Oct 11 00:52:08 crc kubenswrapper[4743]: E1011 00:52:08.386096 4743 configmap.go:193] Couldn't get configMap openshift-multus/cni-copy-resources: failed to sync configmap cache: timed out waiting for the condition Oct 11 00:52:08 crc kubenswrapper[4743]: E1011 00:52:08.386139 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/06a7b971-8779-491c-8d3f-e7d5b4d60968-cni-binary-copy podName:06a7b971-8779-491c-8d3f-e7d5b4d60968 nodeName:}" failed. No retries permitted until 2025-10-11 00:52:08.886118151 +0000 UTC m=+23.539098558 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cni-binary-copy" (UniqueName: "kubernetes.io/configmap/06a7b971-8779-491c-8d3f-e7d5b4d60968-cni-binary-copy") pod "multus-additional-cni-plugins-6wcnk" (UID: "06a7b971-8779-491c-8d3f-e7d5b4d60968") : failed to sync configmap cache: timed out waiting for the condition Oct 11 00:52:08 crc kubenswrapper[4743]: E1011 00:52:08.386202 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e8c603f4-717c-4554-992a-8338b3bef24d-cni-binary-copy podName:e8c603f4-717c-4554-992a-8338b3bef24d nodeName:}" failed. No retries permitted until 2025-10-11 00:52:08.886172282 +0000 UTC m=+23.539152799 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cni-binary-copy" (UniqueName: "kubernetes.io/configmap/e8c603f4-717c-4554-992a-8338b3bef24d-cni-binary-copy") pod "multus-9jfxn" (UID: "e8c603f4-717c-4554-992a-8338b3bef24d") : failed to sync configmap cache: timed out waiting for the condition Oct 11 00:52:08 crc kubenswrapper[4743]: E1011 00:52:08.386104 4743 configmap.go:193] Couldn't get configMap openshift-multus/default-cni-sysctl-allowlist: failed to sync configmap cache: timed out waiting for the condition Oct 11 00:52:08 crc kubenswrapper[4743]: E1011 00:52:08.386258 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/06a7b971-8779-491c-8d3f-e7d5b4d60968-cni-sysctl-allowlist podName:06a7b971-8779-491c-8d3f-e7d5b4d60968 nodeName:}" failed. No retries permitted until 2025-10-11 00:52:08.886246494 +0000 UTC m=+23.539226981 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cni-sysctl-allowlist" (UniqueName: "kubernetes.io/configmap/06a7b971-8779-491c-8d3f-e7d5b4d60968-cni-sysctl-allowlist") pod "multus-additional-cni-plugins-6wcnk" (UID: "06a7b971-8779-491c-8d3f-e7d5b4d60968") : failed to sync configmap cache: timed out waiting for the condition Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.388622 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6wcnk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06a7b971-8779-491c-8d3f-e7d5b4d60968\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6wcnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:08Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:08 crc kubenswrapper[4743]: E1011 00:52:08.405729 4743 projected.go:288] Couldn't get configMap openshift-multus/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Oct 11 00:52:08 crc kubenswrapper[4743]: E1011 00:52:08.405781 4743 projected.go:194] Error preparing data for projected volume kube-api-access-5jkrc for pod openshift-multus/multus-additional-cni-plugins-6wcnk: failed to sync configmap cache: timed out waiting for the condition Oct 11 00:52:08 crc kubenswrapper[4743]: E1011 00:52:08.405850 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/06a7b971-8779-491c-8d3f-e7d5b4d60968-kube-api-access-5jkrc podName:06a7b971-8779-491c-8d3f-e7d5b4d60968 nodeName:}" failed. No retries permitted until 2025-10-11 00:52:08.905828464 +0000 UTC m=+23.558808871 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-5jkrc" (UniqueName: "kubernetes.io/projected/06a7b971-8779-491c-8d3f-e7d5b4d60968-kube-api-access-5jkrc") pod "multus-additional-cni-plugins-6wcnk" (UID: "06a7b971-8779-491c-8d3f-e7d5b4d60968") : failed to sync configmap cache: timed out waiting for the condition Oct 11 00:52:08 crc kubenswrapper[4743]: E1011 00:52:08.406822 4743 projected.go:288] Couldn't get configMap openshift-multus/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Oct 11 00:52:08 crc kubenswrapper[4743]: E1011 00:52:08.407022 4743 projected.go:194] Error preparing data for projected volume kube-api-access-5b2pr for pod openshift-multus/multus-9jfxn: failed to sync configmap cache: timed out waiting for the condition Oct 11 00:52:08 crc kubenswrapper[4743]: E1011 00:52:08.407235 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e8c603f4-717c-4554-992a-8338b3bef24d-kube-api-access-5b2pr podName:e8c603f4-717c-4554-992a-8338b3bef24d nodeName:}" failed. No retries permitted until 2025-10-11 00:52:08.90720893 +0000 UTC m=+23.560189347 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-5b2pr" (UniqueName: "kubernetes.io/projected/e8c603f4-717c-4554-992a-8338b3bef24d-kube-api-access-5b2pr") pod "multus-9jfxn" (UID: "e8c603f4-717c-4554-992a-8338b3bef24d") : failed to sync configmap cache: timed out waiting for the condition Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.414568 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ed16b35-862f-47f2-9e32-63c98f868fb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-48ljj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:08Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.435456 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a657286e-3a6d-4265-ae33-a7d2ce8b64ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1558bde7a3b8b01c9ce7f6907ecc43abc6e27587236be7b41487968effa33bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2616059cd6d57ff41cdaa2e1b35e0e8b474398c434022ba6b9e926e19e64c0f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7f0392332521983f1ea8939df48e05618035ca97895122067daef61ee1fdb7af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b297361a1a2f13aa53d7a5a58948dbe86e8e00f888e6a69cc884154828c29a41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://530c39a66f06ca33cba6a3ab75cb68148425f5a732b503d054b6c74a9d48fabf\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-11T00:52:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1011 00:51:59.674497 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1011 00:51:59.678201 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3177302691/tls.crt::/tmp/serving-cert-3177302691/tls.key\\\\\\\"\\\\nI1011 00:52:05.851548 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1011 00:52:05.855373 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1011 00:52:05.855414 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1011 00:52:05.855455 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1011 00:52:05.855472 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1011 00:52:05.866907 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1011 00:52:05.866935 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 00:52:05.866939 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 00:52:05.866944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1011 00:52:05.866947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1011 00:52:05.866951 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1011 00:52:05.866954 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1011 00:52:05.866970 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1011 00:52:05.870518 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1c0227825fa743bb310a1dfb08333cea5ee16a2a2f30b77c97e2f28a2319542\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d411bef56dd0ccdad159fc018c15ad4f0df6adee56150b1143e2dcf442a0fe18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d411bef56dd0ccdad159fc018c15ad4f0df6adee56150b1143e2dcf442a0fe18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:08Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.442153 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.448985 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vlxgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d677d122-c8be-4938-8d2c-bde4a088a63a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dbaee9cca1154a094e092eaaf1dea2aeb6a61fbf9323611275485f75ea911e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95k6c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vlxgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:08Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:08 crc kubenswrapper[4743]: W1011 00:52:08.455778 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podadd92263_e252_446b_95de_092585b4357f.slice/crio-94864934381bd571cce087789c932a8272a513f31cfc3afa780742bd42724deb WatchSource:0}: Error finding container 94864934381bd571cce087789c932a8272a513f31cfc3afa780742bd42724deb: Status 404 returned error can't find the container with id 94864934381bd571cce087789c932a8272a513f31cfc3afa780742bd42724deb Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.467491 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.467542 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.467557 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.467579 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.467595 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:08Z","lastTransitionTime":"2025-10-11T00:52:08Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.469428 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:08Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.476439 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.485159 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:08Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.498374 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:08Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.510073 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"add92263-e252-446b-95de-092585b4357f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d42sg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d42sg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cvm72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:08Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.523806 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6wcnk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06a7b971-8779-491c-8d3f-e7d5b4d60968\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6wcnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:08Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.525347 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.544601 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ed16b35-862f-47f2-9e32-63c98f868fb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-48ljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:08Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.557074 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a657286e-3a6d-4265-ae33-a7d2ce8b64ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1558bde7a3b8b01c9ce7f6907ecc43abc6e27587236be7b41487968effa33bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2616059cd6d57ff41cdaa2e1b35e0e8b474398c434022ba6b9e926e19e64c0f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0392332521983f1ea8939df48e05618035ca97895122067daef61ee1fdb7af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b297361a1a2f13aa53d7a5a58948dbe86e8e00f888e6a69cc884154828c29a41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://530c39a66f06ca33cba6a3ab75cb68148425f5a732b503d054b6c74a9d48fabf\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-11T00:52:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1011 00:51:59.674497 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1011 00:51:59.678201 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3177302691/tls.crt::/tmp/serving-cert-3177302691/tls.key\\\\\\\"\\\\nI1011 00:52:05.851548 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1011 00:52:05.855373 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1011 00:52:05.855414 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1011 00:52:05.855455 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1011 00:52:05.855472 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1011 00:52:05.866907 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1011 00:52:05.866935 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 00:52:05.866939 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 00:52:05.866944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1011 00:52:05.866947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1011 00:52:05.866951 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1011 00:52:05.866954 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1011 00:52:05.866970 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1011 00:52:05.870518 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1c0227825fa743bb310a1dfb08333cea5ee16a2a2f30b77c97e2f28a2319542\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d411bef56dd0ccdad159fc018c15ad4f0df6adee56150b1143e2dcf442a0fe18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d411bef56dd0ccdad159fc018c15ad4f0df6adee56150b1143e2dcf442a0fe18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:08Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.570779 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vlxgw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d677d122-c8be-4938-8d2c-bde4a088a63a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dbaee9cca1154a094e092eaaf1dea2aeb6a61fbf9323611275485f75ea911e9\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95k6c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vlxgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:08Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.572815 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.572893 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.572907 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.572930 4743 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeNotReady" Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.572943 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:08Z","lastTransitionTime":"2025-10-11T00:52:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.591334 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cd23d7ad070c4d2f029567ac2186d797b082b8da8ff2c9811982a1c5c7b0ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:08Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.608087 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bca7230a67c579c0fc54921d292f52e18ce2c25f79cd4a351a9f6b576df3553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab9cd20bc5b28791e21984a0edf4fbf50acc05d6160780d0696698715242b5d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:08Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.621885 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:08Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.635173 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9jfxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8c603f4-717c-4554-992a-8338b3bef24d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5b2pr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9jfxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:08Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.653265 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.675707 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.675747 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.675759 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.675778 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.675790 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:08Z","lastTransitionTime":"2025-10-11T00:52:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.713918 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.778375 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.778412 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.778423 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.778441 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.778450 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:08Z","lastTransitionTime":"2025-10-11T00:52:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.881383 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.881458 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.881491 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.881533 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.881551 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:08Z","lastTransitionTime":"2025-10-11T00:52:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.901297 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e8c603f4-717c-4554-992a-8338b3bef24d-cni-binary-copy\") pod \"multus-9jfxn\" (UID: \"e8c603f4-717c-4554-992a-8338b3bef24d\") " pod="openshift-multus/multus-9jfxn" Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.901360 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/06a7b971-8779-491c-8d3f-e7d5b4d60968-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6wcnk\" (UID: \"06a7b971-8779-491c-8d3f-e7d5b4d60968\") " pod="openshift-multus/multus-additional-cni-plugins-6wcnk" Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.901385 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/06a7b971-8779-491c-8d3f-e7d5b4d60968-cni-binary-copy\") pod \"multus-additional-cni-plugins-6wcnk\" (UID: \"06a7b971-8779-491c-8d3f-e7d5b4d60968\") " pod="openshift-multus/multus-additional-cni-plugins-6wcnk" Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.902184 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/06a7b971-8779-491c-8d3f-e7d5b4d60968-cni-binary-copy\") pod \"multus-additional-cni-plugins-6wcnk\" (UID: \"06a7b971-8779-491c-8d3f-e7d5b4d60968\") " pod="openshift-multus/multus-additional-cni-plugins-6wcnk" Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.902184 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e8c603f4-717c-4554-992a-8338b3bef24d-cni-binary-copy\") pod \"multus-9jfxn\" (UID: \"e8c603f4-717c-4554-992a-8338b3bef24d\") " pod="openshift-multus/multus-9jfxn" 
Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.902236 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/06a7b971-8779-491c-8d3f-e7d5b4d60968-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6wcnk\" (UID: \"06a7b971-8779-491c-8d3f-e7d5b4d60968\") " pod="openshift-multus/multus-additional-cni-plugins-6wcnk" Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.984651 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.984713 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.984726 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.984747 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:08 crc kubenswrapper[4743]: I1011 00:52:08.984760 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:08Z","lastTransitionTime":"2025-10-11T00:52:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.002597 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5b2pr\" (UniqueName: \"kubernetes.io/projected/e8c603f4-717c-4554-992a-8338b3bef24d-kube-api-access-5b2pr\") pod \"multus-9jfxn\" (UID: \"e8c603f4-717c-4554-992a-8338b3bef24d\") " pod="openshift-multus/multus-9jfxn" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.002666 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jkrc\" (UniqueName: \"kubernetes.io/projected/06a7b971-8779-491c-8d3f-e7d5b4d60968-kube-api-access-5jkrc\") pod \"multus-additional-cni-plugins-6wcnk\" (UID: \"06a7b971-8779-491c-8d3f-e7d5b4d60968\") " pod="openshift-multus/multus-additional-cni-plugins-6wcnk" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.007180 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5b2pr\" (UniqueName: \"kubernetes.io/projected/e8c603f4-717c-4554-992a-8338b3bef24d-kube-api-access-5b2pr\") pod \"multus-9jfxn\" (UID: \"e8c603f4-717c-4554-992a-8338b3bef24d\") " pod="openshift-multus/multus-9jfxn" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.007795 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jkrc\" (UniqueName: \"kubernetes.io/projected/06a7b971-8779-491c-8d3f-e7d5b4d60968-kube-api-access-5jkrc\") pod \"multus-additional-cni-plugins-6wcnk\" (UID: \"06a7b971-8779-491c-8d3f-e7d5b4d60968\") " pod="openshift-multus/multus-additional-cni-plugins-6wcnk" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.055796 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-6wcnk" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.067742 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-9jfxn" Oct 11 00:52:09 crc kubenswrapper[4743]: W1011 00:52:09.079620 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8c603f4_717c_4554_992a_8338b3bef24d.slice/crio-cf9cf71378af78305602d9b6d1829aba152688d31f059b82d6d4fde9e64489a7 WatchSource:0}: Error finding container cf9cf71378af78305602d9b6d1829aba152688d31f059b82d6d4fde9e64489a7: Status 404 returned error can't find the container with id cf9cf71378af78305602d9b6d1829aba152688d31f059b82d6d4fde9e64489a7 Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.086721 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.086752 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.086761 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.086776 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.086784 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:09Z","lastTransitionTime":"2025-10-11T00:52:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.189303 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.189697 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.189706 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.189720 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.189730 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:09Z","lastTransitionTime":"2025-10-11T00:52:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.269641 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9jfxn" event={"ID":"e8c603f4-717c-4554-992a-8338b3bef24d","Type":"ContainerStarted","Data":"853ca9eee244aad8c7006298d6e541ea07022917bee14083b2ded233009257ad"} Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.269713 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9jfxn" event={"ID":"e8c603f4-717c-4554-992a-8338b3bef24d","Type":"ContainerStarted","Data":"cf9cf71378af78305602d9b6d1829aba152688d31f059b82d6d4fde9e64489a7"} Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.273271 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6wcnk" event={"ID":"06a7b971-8779-491c-8d3f-e7d5b4d60968","Type":"ContainerStarted","Data":"8adfcc8627f1a6ce588d50eabcc56c6814c3cff75d50144def98466d2dabb69a"} Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.273352 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6wcnk" event={"ID":"06a7b971-8779-491c-8d3f-e7d5b4d60968","Type":"ContainerStarted","Data":"c1c1326127dcb61df36b27c3999807bf45368629a15cee6db8a3fe597d1bac27"} Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.279779 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" event={"ID":"9ed16b35-862f-47f2-9e32-63c98f868fb8","Type":"ContainerStarted","Data":"487448bc7b1826df7dbc3aeac42ad10d6aed3390c27c95dfda1b81fe09e27f35"} Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.279842 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" event={"ID":"9ed16b35-862f-47f2-9e32-63c98f868fb8","Type":"ContainerStarted","Data":"acfe8e798b39769fe2643c6f761b388dc220ccfc707f0023dd680196514b8885"} Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 
00:52:09.279868 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" event={"ID":"9ed16b35-862f-47f2-9e32-63c98f868fb8","Type":"ContainerStarted","Data":"4127195cac8b12296c2aaefc500beb5f746bee518f3885c094e7bca83ade5e6b"} Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.279877 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" event={"ID":"9ed16b35-862f-47f2-9e32-63c98f868fb8","Type":"ContainerStarted","Data":"d54b05ec17db8c8c6c3ff8e4d703de8f1f1f0d7e9b5b0133afdf30e3c2b8acee"} Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.279886 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" event={"ID":"9ed16b35-862f-47f2-9e32-63c98f868fb8","Type":"ContainerStarted","Data":"17db7cfee85ea1051eba348aa0ec7bc8e1354c49e3e4740745a51d3db0ffcf63"} Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.279898 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" event={"ID":"9ed16b35-862f-47f2-9e32-63c98f868fb8","Type":"ContainerStarted","Data":"d385e354cb831573c89886ade9127287f0974b7a7c244ead9e5547e0211fd9fe"} Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.282237 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"dd049f7c9c78e28d7cfdeb7087cf419e5db915fecd46a83ae0bdc8317fff9728"} Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.284709 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" event={"ID":"add92263-e252-446b-95de-092585b4357f","Type":"ContainerStarted","Data":"be9668b466043d0e86521a2ea925416a4987cb051ca0ab8764fd0fd3095991f7"} Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.284773 4743 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" event={"ID":"add92263-e252-446b-95de-092585b4357f","Type":"ContainerStarted","Data":"16b18bcf80747537e2a7a31adc90eec7fdba85d8166f8cfc53709a1dba33de8c"} Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.284788 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" event={"ID":"add92263-e252-446b-95de-092585b4357f","Type":"ContainerStarted","Data":"94864934381bd571cce087789c932a8272a513f31cfc3afa780742bd42724deb"} Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.287754 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could 
not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:09Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.293180 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.293225 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.293236 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.293257 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.293267 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:09Z","lastTransitionTime":"2025-10-11T00:52:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.301413 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:09Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.313423 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:09Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.327285 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"add92263-e252-446b-95de-092585b4357f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d42sg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d42sg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cvm72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:09Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.342604 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6wcnk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06a7b971-8779-491c-8d3f-e7d5b4d60968\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6wcnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:09Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.361207 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ed16b35-862f-47f2-9e32-63c98f868fb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919
d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCou
nt\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-con
troller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/va
r/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:07Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-48ljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:09Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.376020 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a657286e-3a6d-4265-ae33-a7d2ce8b64ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1558bde7a3b8b01c9ce7f6907ecc43abc6e27587236be7b41487968effa33bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2616059cd6d57ff41cdaa2e1b35e0e8b474398c434022ba6b9e926e19e64c0f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7f0392332521983f1ea8939df48e05618035ca97895122067daef61ee1fdb7af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b297361a1a2f13aa53d7a5a58948dbe86e8e00f888e6a69cc884154828c29a41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://530c39a66f06ca33cba6a3ab75cb68148425f5a732b503d054b6c74a9d48fabf\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-11T00:52:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1011 00:51:59.674497 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1011 00:51:59.678201 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3177302691/tls.crt::/tmp/serving-cert-3177302691/tls.key\\\\\\\"\\\\nI1011 00:52:05.851548 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1011 00:52:05.855373 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1011 00:52:05.855414 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1011 00:52:05.855455 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1011 00:52:05.855472 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1011 00:52:05.866907 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1011 00:52:05.866935 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 00:52:05.866939 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 00:52:05.866944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1011 00:52:05.866947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1011 00:52:05.866951 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1011 00:52:05.866954 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1011 00:52:05.866970 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1011 00:52:05.870518 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1c0227825fa743bb310a1dfb08333cea5ee16a2a2f30b77c97e2f28a2319542\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d411bef56dd0ccdad159fc018c15ad4f0df6adee56150b1143e2dcf442a0fe18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d411bef56dd0ccdad159fc018c15ad4f0df6adee56150b1143e2dcf442a0fe18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:09Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.387696 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vlxgw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d677d122-c8be-4938-8d2c-bde4a088a63a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dbaee9cca1154a094e092eaaf1dea2aeb6a61fbf9323611275485f75ea911e9\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95k6c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vlxgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:09Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.396324 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.396365 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.396380 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.396401 4743 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeNotReady" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.396413 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:09Z","lastTransitionTime":"2025-10-11T00:52:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.400426 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cd23d7ad070c4d2f029567ac2186d797b082b8da8ff2c9811982a1c5c7b0ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:09Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.413500 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bca7230a67c579c0fc54921d292f52e18ce2c25f79cd4a351a9f6b576df3553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab9cd20bc5b28791e21984a0edf4fbf50acc05d6160780d0696698715242b5d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:09Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.433256 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:09Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.445482 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9jfxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8c603f4-717c-4554-992a-8338b3bef24d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853ca9eee244aad8c7006298d6e541ea07022917bee14083b2ded233009257ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5b2pr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9jfxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:09Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.458343 4743 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:09Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.478200 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9jfxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8c603f4-717c-4554-992a-8338b3bef24d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853ca9eee244aad8c7006298d6e541ea07022917bee14083b2ded233009257ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5b2pr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9jfxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:09Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.488963 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"add92263-e252-446b-95de-092585b4357f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be9668b466043d0e86521a2ea925416a4987cb051ca0ab8764fd0fd3095991f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d42sg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16b18bcf8074
7537e2a7a31adc90eec7fdba85d8166f8cfc53709a1dba33de8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d42sg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cvm72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:09Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.499258 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.499336 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.499353 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 
11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.499376 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.499391 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:09Z","lastTransitionTime":"2025-10-11T00:52:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.508030 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6wcnk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06a7b971-8779-491c-8d3f-e7d5b4d60968\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6wcnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:09Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.531745 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ed16b35-862f-47f2-9e32-63c98f868fb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics 
northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitc
h\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\
\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-48ljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:09Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.547437 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:09Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.562179 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd049f7c9c78e28d7cfdeb7087cf419e5db915fecd46a83ae0bdc8317fff9728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-11T00:52:09Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.577849 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:09Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.593679 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a657286e-3a6d-4265-ae33-a7d2ce8b64ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1558bde7a3b8b01c9ce7f6907ecc43abc6e27587236be7b41487968effa33bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2616059cd6d57ff41cdaa2e1b35e0e8b474398c434022ba6b9e926e19e64c0f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0392332521983f1ea8939df48e05618035ca97895122067daef61ee1fdb7af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b297361a1a2f13aa53d7a5a58948dbe86e8e00f888e6a69cc884154828c29a41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://530c39a66f06ca33cba6a3ab75cb68148425f5a732b503d054b6c74a9d48fabf\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-11T00:52:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1011 00:51:59.674497 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1011 00:51:59.678201 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3177302691/tls.crt::/tmp/serving-cert-3177302691/tls.key\\\\\\\"\\\\nI1011 00:52:05.851548 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1011 00:52:05.855373 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1011 00:52:05.855414 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1011 00:52:05.855455 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1011 00:52:05.855472 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1011 00:52:05.866907 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1011 00:52:05.866935 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 00:52:05.866939 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 00:52:05.866944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1011 00:52:05.866947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1011 00:52:05.866951 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1011 00:52:05.866954 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1011 00:52:05.866970 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1011 00:52:05.870518 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1c0227825fa743bb310a1dfb08333cea5ee16a2a2f30b77c97e2f28a2319542\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d411bef56dd0ccdad159fc018c15ad4f0df6adee56150b1143e2dcf442a0fe18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d411bef56dd0ccdad159fc018c15ad4f0df6adee56150b1143e2dcf442a0fe18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:09Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.603887 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.603929 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.603941 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.603963 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.603976 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:09Z","lastTransitionTime":"2025-10-11T00:52:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.608844 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 00:52:09 crc kubenswrapper[4743]: E1011 00:52:09.609107 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 00:52:13.609088053 +0000 UTC m=+28.262068460 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.618023 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vlxgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d677d122-c8be-4938-8d2c-bde4a088a63a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dbaee9cca1154a094e092eaaf1dea2aeb6a61fbf9323611275485f75ea911e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95k6c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vlxgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:09Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.630229 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cd23d7ad070c4d2f029567ac2186d797b082b8da8ff2c9811982a1c5c7b0ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:09Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.644175 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bca7230a67c579c0fc54921d292f52e18ce2c25f79cd4a351a9f6b576df3553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab9cd20bc5b28791e21984a0edf4fbf50acc05d6160780d0696698715242b5d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:09Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.654257 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.667979 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.669646 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a657286e-3a6d-4265-ae33-a7d2ce8b64ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1558bde7a3b8b01c9ce7f6907ecc43abc6e27587236be7b41487968effa33bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2616059cd6d57ff41cdaa2e1b35e0e8b474398c434022ba6b9e926e19e64c0f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0392332521983f1ea8939df48e05618035ca97895122067daef61ee1fdb7af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b297361a1a2f13aa53d7a5a58948dbe86e8e00f888e6a69cc884154828c29a41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://530c39a66f06ca33cba6a3ab75cb68148425f5a732b503d054b6c74a9d48fabf\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-11T00:52:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1011 00:51:59.674497 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1011 00:51:59.678201 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3177302691/tls.crt::/tmp/serving-cert-3177302691/tls.key\\\\\\\"\\\\nI1011 00:52:05.851548 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1011 00:52:05.855373 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1011 00:52:05.855414 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1011 00:52:05.855455 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1011 00:52:05.855472 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1011 00:52:05.866907 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1011 00:52:05.866935 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 00:52:05.866939 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 00:52:05.866944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1011 00:52:05.866947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1011 00:52:05.866951 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1011 00:52:05.866954 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1011 00:52:05.866970 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1011 00:52:05.870518 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1c0227825fa743bb310a1dfb08333cea5ee16a2a2f30b77c97e2f28a2319542\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d411bef56dd0ccdad159fc018c15ad4f0df6adee56150b1143e2dcf442a0fe18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d411bef56dd0ccdad159fc018c15ad4f0df6adee56150b1143e2dcf442a0fe18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:09Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.670508 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.680982 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vlxgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d677d122-c8be-4938-8d2c-bde4a088a63a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dbaee9cca1154a094e092eaaf1dea2aeb6a61fbf9323611275485f75ea911e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95k6c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vlxgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:09Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.695233 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cd23d7ad070c4d2f029567ac2186d797b082b8da8ff2c9811982a1c5c7b0ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:09Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.706723 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.706766 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.706776 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.706797 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.706809 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:09Z","lastTransitionTime":"2025-10-11T00:52:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.710186 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.710223 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.710252 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.710278 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 00:52:09 crc kubenswrapper[4743]: E1011 00:52:09.710565 4743 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object 
"openshift-network-console"/"networking-console-plugin-cert" not registered Oct 11 00:52:09 crc kubenswrapper[4743]: E1011 00:52:09.710602 4743 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 11 00:52:09 crc kubenswrapper[4743]: E1011 00:52:09.710634 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-11 00:52:13.710615854 +0000 UTC m=+28.363596251 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 11 00:52:09 crc kubenswrapper[4743]: E1011 00:52:09.710656 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 11 00:52:09 crc kubenswrapper[4743]: E1011 00:52:09.710571 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 11 00:52:09 crc kubenswrapper[4743]: E1011 00:52:09.710693 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 11 00:52:09 crc kubenswrapper[4743]: E1011 00:52:09.710705 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf 
podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-11 00:52:13.710679656 +0000 UTC m=+28.363660263 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 11 00:52:09 crc kubenswrapper[4743]: E1011 00:52:09.710713 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 11 00:52:09 crc kubenswrapper[4743]: E1011 00:52:09.710705 4743 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 11 00:52:09 crc kubenswrapper[4743]: E1011 00:52:09.710728 4743 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 11 00:52:09 crc kubenswrapper[4743]: E1011 00:52:09.710763 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-11 00:52:13.710754708 +0000 UTC m=+28.363735355 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 11 00:52:09 crc kubenswrapper[4743]: E1011 00:52:09.710787 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-11 00:52:13.710775738 +0000 UTC m=+28.363756125 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.712788 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bca7230a67c579c0fc54921d292f52e18ce2c25f79cd4a351a9f6b576df3553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab9cd20bc5b28791e21984a0edf4fbf50acc05d6160780d0696698715242b5d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:09Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.728419 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9jfxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8c603f4-717c-4554-992a-8338b3bef24d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853ca9eee244aad8c7006298d6e541ea07022917bee14083b2ded233009257ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5b2pr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9jfxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:09Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.743906 4743 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:09Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.762152 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:09Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.777386 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"add92263-e252-446b-95de-092585b4357f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be9668b466043d0e86521a2ea925416a4987cb051ca0ab8764fd0fd3095991f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d42sg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16b18bcf80747537e2a7a31adc90eec7fdba85d8
166f8cfc53709a1dba33de8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d42sg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cvm72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:09Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.794474 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6wcnk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06a7b971-8779-491c-8d3f-e7d5b4d60968\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6wcnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:09Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.809333 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.809374 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.809383 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.809402 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.809413 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:09Z","lastTransitionTime":"2025-10-11T00:52:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.820804 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ed16b35-862f-47f2-9e32-63c98f868fb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-48ljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:09Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.836096 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:09Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.847994 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd049f7c9c78e28d7cfdeb7087cf419e5db915fecd46a83ae0bdc8317fff9728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-11T00:52:09Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.860124 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:09Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.872405 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9jfxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8c603f4-717c-4554-992a-8338b3bef24d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853ca9eee244aad8c7006298d6e541ea07022917bee14083b2ded233009257ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5b2pr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9jfxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:09Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.886059 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6wcnk" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06a7b971-8779-491c-8d3f-e7d5b4d60968\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6wcnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:09Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.903610 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ed16b35-862f-47f2-9e32-63c98f868fb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919
d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCou
nt\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-con
troller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/va
r/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:07Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-48ljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:09Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.913149 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.913182 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.913194 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.913214 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.913230 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:09Z","lastTransitionTime":"2025-10-11T00:52:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.917664 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:09Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.929662 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd049f7c9c78e28d7cfdeb7087cf419e5db915fecd46a83ae0bdc8317fff9728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-11T00:52:09Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.944757 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:09Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.959210 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"add92263-e252-446b-95de-092585b4357f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be9668b466043d0e86521a2ea925416a4987cb051ca0ab8764fd0fd3095991f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d42sg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16b18bcf80747537e2a7a31adc90eec7fdba85d8
166f8cfc53709a1dba33de8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d42sg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cvm72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:09Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.970244 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vlxgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d677d122-c8be-4938-8d2c-bde4a088a63a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dbaee9cca1154a094e092eaaf1dea2aeb6a61fbf9323611275485f75ea911e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95k6c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vlxgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:09Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:09 crc kubenswrapper[4743]: I1011 00:52:09.993321 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83857db2-dda0-4b62-9336-3393a2c23f3c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a46e8adc9aa9255991b607e98eaf6fc411aeb2a6b0edf45e3e9a7935a2eae5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97b59a1efc62279df4a9766fae0e700a435812160cf3a40a915c967f7a99c9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe9d2bc75aa162a0cab825a1f13cf35f20491a0317000d5a9775977fb7f6b556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf3820ae4ab59d420d2e7f001b2953f751c6dd75f90b5fcc76fdaee8c8df722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f14caad841305b82f04cf14411e2308edbcdd8ea2261ff8401edc95900855ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://818ca392b918c66a8ca013f3ec5b938595fc78b726af7ae31524c09dbe9302f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://818ca392b918c66a8ca013f3ec5b938595fc78b726af7ae31524c09dbe9302f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44506333e8bd3a20777cf0c382b7652d6d3fa1cdc13f056412cf4899e25115e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44506333e8bd3a20777cf0c382b7652d6d3fa1cdc13f056412cf4899e25115e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e870a1289513fe9e864779c13e63c0991fdaded247b35ac75b7810ab27683ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e870a1289513fe9e864779c13e63c0991fdaded247b35ac75b7810ab27683ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:09Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.016167 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.016228 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.016247 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.016274 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 
00:52:10.016137 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a657286e-3a6d-4265-ae33-a7d2ce8b64ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1558bde7a3b8b01c9ce7f6907ecc43abc6e27587236be7b41487968effa33bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"reso
urce-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2616059cd6d57ff41cdaa2e1b35e0e8b474398c434022ba6b9e926e19e64c0f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0392332521983f1ea8939df48e05618035ca97895122067daef61ee1fdb7af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b297361a1a2f13aa53d7a5a58948dbe86e8e00f888e6a69cc884154828c29a41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e2775
3fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://530c39a66f06ca33cba6a3ab75cb68148425f5a732b503d054b6c74a9d48fabf\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-11T00:52:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1011 00:51:59.674497 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1011 00:51:59.678201 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3177302691/tls.crt::/tmp/serving-cert-3177302691/tls.key\\\\\\\"\\\\nI1011 00:52:05.851548 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1011 00:52:05.855373 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1011 00:52:05.855414 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1011 00:52:05.855455 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1011 00:52:05.855472 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1011 00:52:05.866907 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1011 00:52:05.866935 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 00:52:05.866939 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 00:52:05.866944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1011 
00:52:05.866947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1011 00:52:05.866951 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1011 00:52:05.866954 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1011 00:52:05.866970 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1011 00:52:05.870518 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1c0227825fa743bb310a1dfb08333cea5ee16a2a2f30b77c97e2f28a2319542\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d411bef56dd0ccdad159fc018c15ad4f0df6adee56150b1143e2dcf442a0fe18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06b
c35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d411bef56dd0ccdad159fc018c15ad4f0df6adee56150b1143e2dcf442a0fe18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:10Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.016299 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:10Z","lastTransitionTime":"2025-10-11T00:52:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.033773 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cd23d7ad070c4d2f029567ac2186d797b082b8da8ff2c9811982a1c5c7b0ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:10Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.046469 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bca7230a67c579c0fc54921d292f52e18ce2c25f79cd4a351a9f6b576df3553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab9cd20bc5b28791e21984a0edf4fbf50acc05d6160780d0696698715242b5d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:10Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.091158 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.091251 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.091466 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 00:52:10 crc kubenswrapper[4743]: E1011 00:52:10.091464 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 11 00:52:10 crc kubenswrapper[4743]: E1011 00:52:10.091547 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 11 00:52:10 crc kubenswrapper[4743]: E1011 00:52:10.091621 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.119058 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.119115 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.119131 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.119154 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.119168 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:10Z","lastTransitionTime":"2025-10-11T00:52:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.131673 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.136276 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.142194 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.152714 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cd23d7ad070c4d2f029567ac2186d797b082b8da8ff2c9811982a1c5c7b0ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\
"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:10Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.173377 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bca7230a67c579c0fc54921d292f52e18ce2c25f79cd4a351a9f6b576df3553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab9cd20bc5b28791e21984a0edf4fbf50acc05d6160780d0696698715242b5d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:10Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.190410 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:10Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.212299 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9jfxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8c603f4-717c-4554-992a-8338b3bef24d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853ca9eee244aad8c7006298d6e541ea07022917bee14083b2ded233009257ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5b2pr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9jfxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:10Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.222020 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:10 crc 
kubenswrapper[4743]: I1011 00:52:10.222092 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.222114 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.222146 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.222190 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:10Z","lastTransitionTime":"2025-10-11T00:52:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.225432 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:10Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.243602 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd049f7c9c78e28d7cfdeb7087cf419e5db915fecd46a83ae0bdc8317fff9728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-11T00:52:10Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.265388 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:10Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.285943 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"add92263-e252-446b-95de-092585b4357f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be9668b466043d0e86521a2ea925416a4987cb051ca0ab8764fd0fd3095991f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d42sg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16b18bcf80747537e2a7a31adc90eec7fdba85d8
166f8cfc53709a1dba33de8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d42sg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cvm72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:10Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.289665 4743 generic.go:334] "Generic (PLEG): container finished" podID="06a7b971-8779-491c-8d3f-e7d5b4d60968" containerID="8adfcc8627f1a6ce588d50eabcc56c6814c3cff75d50144def98466d2dabb69a" exitCode=0 Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.289957 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6wcnk" 
event={"ID":"06a7b971-8779-491c-8d3f-e7d5b4d60968","Type":"ContainerDied","Data":"8adfcc8627f1a6ce588d50eabcc56c6814c3cff75d50144def98466d2dabb69a"} Oct 11 00:52:10 crc kubenswrapper[4743]: E1011 00:52:10.298500 4743 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 11 00:52:10 crc kubenswrapper[4743]: E1011 00:52:10.302932 4743 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-crc\" already exists" pod="openshift-etcd/etcd-crc" Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.311111 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6wcnk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06a7b971-8779-491c-8d3f-e7d5b4d60968\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6wcnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:10Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.327022 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.327073 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.327090 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.327117 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.327133 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:10Z","lastTransitionTime":"2025-10-11T00:52:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.364095 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ed16b35-862f-47f2-9e32-63c98f868fb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-48ljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:10Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.395658 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83857db2-dda0-4b62-9336-3393a2c23f3c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a46e8adc9aa9255991b607e98eaf6fc411aeb2a6b0edf45e3e9a7935a2eae5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97b59a1efc62279df4a9766fae0e700a435812160cf3a40a915c967f7a99c9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe9d2bc75aa162a0cab825a1f13cf35f20491a0317000d5a9775977fb7f6b556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf3820ae4ab59d420d2e7f001b2953f751c6dd75f90b5fcc76fdaee8c8df722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f14caad841305b82f04cf14411e2308edbcdd8ea2261ff8401edc95900855ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://818ca392b918c66a8ca013f3ec5b938595fc78b726af7ae31524c09dbe9302f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://818ca392b918c66a8ca013f3ec5b938595fc78b726af7ae31524c09dbe9302f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44506333e8bd3a20777cf0c382b7652d6d3fa1cdc13f056412cf4899e25115e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44506333e8bd3a20777cf0c382b7652d6d3fa1cdc13f056412cf4899e25115e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e870a1289513fe9e864779c13e63c0991fdaded247b35ac75b7810ab27683ebe\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e870a1289513fe9e864779c13e63c0991fdaded247b35ac75b7810ab27683ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:10Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.413948 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a657286e-3a6d-4265-ae33-a7d2ce8b64ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1558bde7a3b8b01c9ce7f6907ecc43abc6e27587236be7b41487968effa33bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2616059cd6d57ff41cdaa2e1b35e0e8b474398c434022ba6b9e926e19e64c0f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0392332521983f1ea8939df48e05618035ca97895122067daef61ee1fdb7af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b297361a1a2f13aa53d7a5a58948dbe86e8e00f888e6a69cc884154828c29a41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://530c39a66f06ca33cba6a3ab75cb68148425f5a732b503d054b6c74a9d48fabf\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-11T00:52:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1011 00:51:59.674497 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1011 00:51:59.678201 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3177302691/tls.crt::/tmp/serving-cert-3177302691/tls.key\\\\\\\"\\\\nI1011 00:52:05.851548 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1011 00:52:05.855373 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1011 00:52:05.855414 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1011 00:52:05.855455 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1011 00:52:05.855472 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1011 00:52:05.866907 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1011 00:52:05.866935 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 00:52:05.866939 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 00:52:05.866944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1011 00:52:05.866947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1011 00:52:05.866951 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1011 00:52:05.866954 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1011 00:52:05.866970 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1011 00:52:05.870518 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1c0227825fa743bb310a1dfb08333cea5ee16a2a2f30b77c97e2f28a2319542\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d411bef56dd0ccdad159fc018c15ad4f0df6adee56150b1143e2dcf442a0fe18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d411bef56dd0ccdad159fc018c15ad4f0df6adee56150b1143e2dcf442a0fe18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:10Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.426698 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vlxgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d677d122-c8be-4938-8d2c-bde4a088a63a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dbaee9cca1154a094e092eaaf1dea2aeb6a61fbf9323611275485f75ea911e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95k6c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vlxgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:10Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.431020 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.431059 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.431070 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.431090 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.431100 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:10Z","lastTransitionTime":"2025-10-11T00:52:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.444076 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:10Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.457541 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"add92263-e252-446b-95de-092585b4357f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be9668b466043d0e86521a2ea925416a4987cb051ca0ab8764fd0fd3095991f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d42sg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16b18bcf80747537e2a7a31adc90eec7fdba85d8
166f8cfc53709a1dba33de8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d42sg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cvm72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:10Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.474658 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6wcnk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06a7b971-8779-491c-8d3f-e7d5b4d60968\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8adfcc8627f1a6ce588d50eabcc56c6814c3cff75d50144def98466d2dabb69a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8adfcc8627f1a6ce588d50eabcc56c6814c3cff75d50144def98466d2dabb69a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6wcnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:10Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.492617 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ed16b35-862f-47f2-9e32-63c98f868fb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-48ljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:10Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.505218 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:10Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.516593 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd049f7c9c78e28d7cfdeb7087cf419e5db915fecd46a83ae0bdc8317fff9728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-11T00:52:10Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.537358 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.537401 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.537418 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.537438 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.537449 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:10Z","lastTransitionTime":"2025-10-11T00:52:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.564145 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83857db2-dda0-4b62-9336-3393a2c23f3c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a46e8adc9aa9255991b607e98eaf6fc411aeb2a6b0edf45e3e9a7935a2eae5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97b59a1efc62279df4a9766fae0e700a435812160cf3a40a915c967f7a99c9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe9d2bc75aa162a0cab825a1f13cf35f20491a0317000d5a9775977fb7f6b556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf3820ae4ab59d420d2e7f001b2953f751c6dd75f90b5fcc76fdaee8c8df722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f14caad841305b82f04cf14411e2308edbcdd8ea2261ff8401edc95900855ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://818ca392b918c66a8ca013f3ec5b938595fc78b726af7ae31524c09dbe9302f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://818ca392b918c66a8ca013f3ec5b938595fc78b726af7ae31524c09dbe9302f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44506333e8bd3a20777cf0c382b7652d6d3fa1cdc13f056412cf4899e25115e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44506333e8bd3a20777cf0c382b7652d6d3fa1cdc13f056412cf4899e25115e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e870a1289513fe9e864779c13e63c0991fdaded247b35ac75b7810ab27683ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e870a1289513fe9e864779c13e63c0991fdaded247b35ac75b7810ab27683ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-11T00:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:10Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.591006 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a657286e-3a6d-4265-ae33-a7d2ce8b64ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1558bde7a3b8b01c9ce7f6907ecc43abc6e27587236be7b41487968effa33bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2616059cd6d57ff41cdaa2e1b35e0e8b474398c434022ba6b9e926e19e64c0f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7f0392332521983f1ea8939df48e05618035ca97895122067daef61ee1fdb7af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b297361a1a2f13aa53d7a5a58948dbe86e8e00f888e6a69cc884154828c29a41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://530c39a66f06ca33cba6a3ab75cb68148425f5a732b503d054b6c74a9d48fabf\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-11T00:52:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1011 00:51:59.674497 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1011 00:51:59.678201 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3177302691/tls.crt::/tmp/serving-cert-3177302691/tls.key\\\\\\\"\\\\nI1011 00:52:05.851548 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1011 00:52:05.855373 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1011 00:52:05.855414 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1011 00:52:05.855455 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1011 00:52:05.855472 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1011 00:52:05.866907 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1011 00:52:05.866935 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 00:52:05.866939 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 00:52:05.866944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1011 00:52:05.866947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1011 00:52:05.866951 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1011 00:52:05.866954 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1011 00:52:05.866970 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1011 00:52:05.870518 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1c0227825fa743bb310a1dfb08333cea5ee16a2a2f30b77c97e2f28a2319542\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d411bef56dd0ccdad159fc018c15ad4f0df6adee56150b1143e2dcf442a0fe18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d411bef56dd0ccdad159fc018c15ad4f0df6adee56150b1143e2dcf442a0fe18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:10Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.627245 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vlxgw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d677d122-c8be-4938-8d2c-bde4a088a63a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dbaee9cca1154a094e092eaaf1dea2aeb6a61fbf9323611275485f75ea911e9\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95k6c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vlxgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:10Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.640466 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.640500 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.640511 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.640528 4743 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeNotReady" Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.640541 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:10Z","lastTransitionTime":"2025-10-11T00:52:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.672465 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cd23d7ad070c4d2f029567ac2186d797b082b8da8ff2c9811982a1c5c7b0ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:10Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.712144 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bca7230a67c579c0fc54921d292f52e18ce2c25f79cd4a351a9f6b576df3553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab9cd20bc5b28791e21984a0edf4fbf50acc05d6160780d0696698715242b5d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:10Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.743773 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.743831 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.743843 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.743882 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.743895 4743 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:10Z","lastTransitionTime":"2025-10-11T00:52:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.757003 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9jfxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8c603f4-717c-4554-992a-8338b3bef24d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853ca9eee244aad8c7006298d6e541ea07022917bee14083b2ded233009257ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5b2pr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:
52:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9jfxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:10Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.795381 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc432b8f-1fc8-4b7d-a202-bf1452921fc3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ed857549a3f84b624b8bf411ac4359bad74bbf8eea28bda8ebeda3b12dd34e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tru
e,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://748faceccab50f961795c16fb7c31665af75d570094d1dd2c765bb2215b65c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc8d989925344208a60f51034c7c486de0165d5c5a9a1f966df45ac5685c89f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"
cri-o://34de48136b54818c60eb929d4b5374689378190620a6502448759f172856d6af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:10Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.833388 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:10Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.846724 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.846780 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.846793 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.846816 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.846829 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:10Z","lastTransitionTime":"2025-10-11T00:52:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.949939 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.950011 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.950032 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.950062 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:10 crc kubenswrapper[4743]: I1011 00:52:10.950080 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:10Z","lastTransitionTime":"2025-10-11T00:52:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:11 crc kubenswrapper[4743]: I1011 00:52:11.053428 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:11 crc kubenswrapper[4743]: I1011 00:52:11.053511 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:11 crc kubenswrapper[4743]: I1011 00:52:11.053531 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:11 crc kubenswrapper[4743]: I1011 00:52:11.053563 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:11 crc kubenswrapper[4743]: I1011 00:52:11.053584 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:11Z","lastTransitionTime":"2025-10-11T00:52:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:11 crc kubenswrapper[4743]: I1011 00:52:11.157312 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:11 crc kubenswrapper[4743]: I1011 00:52:11.157384 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:11 crc kubenswrapper[4743]: I1011 00:52:11.157401 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:11 crc kubenswrapper[4743]: I1011 00:52:11.157428 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:11 crc kubenswrapper[4743]: I1011 00:52:11.157447 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:11Z","lastTransitionTime":"2025-10-11T00:52:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:11 crc kubenswrapper[4743]: I1011 00:52:11.261784 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:11 crc kubenswrapper[4743]: I1011 00:52:11.261851 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:11 crc kubenswrapper[4743]: I1011 00:52:11.261900 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:11 crc kubenswrapper[4743]: I1011 00:52:11.261930 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:11 crc kubenswrapper[4743]: I1011 00:52:11.261949 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:11Z","lastTransitionTime":"2025-10-11T00:52:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:11 crc kubenswrapper[4743]: I1011 00:52:11.303307 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" event={"ID":"9ed16b35-862f-47f2-9e32-63c98f868fb8","Type":"ContainerStarted","Data":"ecec5075382354c47a6ec8f53263fe6c244c7aadfeab1dda60dd2e88cc8724b0"} Oct 11 00:52:11 crc kubenswrapper[4743]: I1011 00:52:11.307908 4743 generic.go:334] "Generic (PLEG): container finished" podID="06a7b971-8779-491c-8d3f-e7d5b4d60968" containerID="800cdd03b73991fe9c1cfbe9a392b5f08854ca2f5f9e5106022bb67307578ad8" exitCode=0 Oct 11 00:52:11 crc kubenswrapper[4743]: I1011 00:52:11.307990 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6wcnk" event={"ID":"06a7b971-8779-491c-8d3f-e7d5b4d60968","Type":"ContainerDied","Data":"800cdd03b73991fe9c1cfbe9a392b5f08854ca2f5f9e5106022bb67307578ad8"} Oct 11 00:52:11 crc kubenswrapper[4743]: I1011 00:52:11.331348 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc432b8f-1fc8-4b7d-a202-bf1452921fc3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ed857549a3f84b624b8bf411ac4359bad74bbf8eea28bda8ebeda3b12dd34e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://748faceccab50f961795c16fb7c31665af75d570094d1dd2c765bb2215b65c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc8d989925344208a60f51034c7c486de0165d5c5a9a1f966df45ac5685c89f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34de48136b54818c60eb929d4b5374689378190620a6502448759f172856d6af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:11Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:11 crc kubenswrapper[4743]: I1011 00:52:11.355267 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:11Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:11 crc kubenswrapper[4743]: I1011 00:52:11.373571 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:11 crc kubenswrapper[4743]: I1011 00:52:11.373723 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:11 crc kubenswrapper[4743]: I1011 00:52:11.373747 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:11 crc kubenswrapper[4743]: I1011 
00:52:11.373780 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:11 crc kubenswrapper[4743]: I1011 00:52:11.373811 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:11Z","lastTransitionTime":"2025-10-11T00:52:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:11 crc kubenswrapper[4743]: I1011 00:52:11.384015 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9jfxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8c603f4-717c-4554-992a-8338b3bef24d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853ca9eee244aad8c7006298d6e541ea07022917bee14083b2ded233009257ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afb
a93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5b2pr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\
":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9jfxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:11Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:11 crc kubenswrapper[4743]: I1011 00:52:11.404056 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"add92263-e252-446b-95de-092585b4357f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be9668b466043d0e86521a2ea925416a4987cb051ca0ab8764fd0fd3095991f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bde
af3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d42sg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16b18bcf80747537e2a7a31adc90eec7fdba85d8166f8cfc53709a1dba33de8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d42sg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cvm72\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:11Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:11 crc kubenswrapper[4743]: I1011 00:52:11.428091 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6wcnk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06a7b971-8779-491c-8d3f-e7d5b4d60968\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8adfcc8627f1a6ce588d50eabcc56c6814c3cff75d50144def98466d2dabb69a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8adfcc8627f1a6ce588d50eabcc56c6814c3cff75d50144def98466d2dabb69a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://800cdd03b73991fe9c1cfbe9a392b5f08854ca2f5f9e5106022bb67307578ad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://800cdd03b73991fe9c1cfbe9a392b5f08854ca2f5f9e5106022bb67307578ad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6wcnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:11Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:11 crc kubenswrapper[4743]: I1011 00:52:11.461941 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ed16b35-862f-47f2-9e32-63c98f868fb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-48ljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:11Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:11 crc kubenswrapper[4743]: I1011 00:52:11.476931 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:11 crc kubenswrapper[4743]: I1011 00:52:11.477000 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:11 crc kubenswrapper[4743]: I1011 00:52:11.477022 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:11 crc kubenswrapper[4743]: I1011 00:52:11.477051 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:11 crc kubenswrapper[4743]: I1011 00:52:11.477069 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:11Z","lastTransitionTime":"2025-10-11T00:52:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:11 crc kubenswrapper[4743]: I1011 00:52:11.479283 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:11Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:11 crc kubenswrapper[4743]: I1011 00:52:11.496280 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd049f7c9c78e28d7cfdeb7087cf419e5db915fecd46a83ae0bdc8317fff9728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-11T00:52:11Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:11 crc kubenswrapper[4743]: I1011 00:52:11.513160 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:11Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:11 crc kubenswrapper[4743]: I1011 00:52:11.530734 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a657286e-3a6d-4265-ae33-a7d2ce8b64ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1558bde7a3b8b01c9ce7f6907ecc43abc6e27587236be7b41487968effa33bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2616059cd6d57ff41cdaa2e1b35e0e8b474398c434022ba6b9e926e19e64c0f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0392332521983f1ea8939df48e05618035ca97895122067daef61ee1fdb7af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b297361a1a2f13aa53d7a5a58948dbe86e8e00f888e6a69cc884154828c29a41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://530c39a66f06ca33cba6a3ab75cb68148425f5a732b503d054b6c74a9d48fabf\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-11T00:52:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1011 00:51:59.674497 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1011 00:51:59.678201 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3177302691/tls.crt::/tmp/serving-cert-3177302691/tls.key\\\\\\\"\\\\nI1011 00:52:05.851548 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1011 00:52:05.855373 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1011 00:52:05.855414 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1011 00:52:05.855455 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1011 00:52:05.855472 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1011 00:52:05.866907 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1011 00:52:05.866935 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 00:52:05.866939 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 00:52:05.866944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1011 00:52:05.866947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1011 00:52:05.866951 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1011 00:52:05.866954 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1011 00:52:05.866970 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1011 00:52:05.870518 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1c0227825fa743bb310a1dfb08333cea5ee16a2a2f30b77c97e2f28a2319542\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d411bef56dd0ccdad159fc018c15ad4f0df6adee56150b1143e2dcf442a0fe18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d411bef56dd0ccdad159fc018c15ad4f0df6adee56150b1143e2dcf442a0fe18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:11Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:11 crc kubenswrapper[4743]: I1011 00:52:11.544261 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vlxgw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d677d122-c8be-4938-8d2c-bde4a088a63a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dbaee9cca1154a094e092eaaf1dea2aeb6a61fbf9323611275485f75ea911e9\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95k6c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vlxgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:11Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:11 crc kubenswrapper[4743]: I1011 00:52:11.567080 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"83857db2-dda0-4b62-9336-3393a2c23f3c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a46e8adc9aa9255991b607e98eaf6fc411aeb2a6b0edf45e3e9a7935a2eae5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97b59a1efc62279df4a9766fae0e700a435812160cf3a40a915c967f7a99c9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe9d2bc75aa162a0cab825a1f13cf35f20491a0317000d5a9775977fb7f6b556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf3820ae4ab59d420d2e7f001b2953f751c6dd75f90b5fcc76fdaee8c8df722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f14caad841305b82f04cf14411e2308edbcdd8ea2261ff8401edc95900855ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://818ca392b918c66a8ca013f3ec5b938595fc78b726af7ae31524c09dbe9302f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://818ca392b918c66a8ca013f3ec5b938595fc78b726af7ae31524c09dbe9302f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44506333e8bd3a20777cf0c382b7652d6d3fa1cdc13f056412cf4899e25115e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44506333e8bd3a20777cf0c382b7652d6d3fa1cdc13f056412cf4899e25115e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e870a1289513fe9e864779c13e63c0991fdaded247b35ac75b7810ab27683ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e870a1289513fe9e864779c13e63c0991fdaded247b35ac75b7810ab27683ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:11Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:11 crc kubenswrapper[4743]: I1011 00:52:11.581659 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:11 crc kubenswrapper[4743]: I1011 00:52:11.581721 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:11 crc kubenswrapper[4743]: I1011 00:52:11.581734 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:11 crc kubenswrapper[4743]: I1011 00:52:11.581753 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:11 crc kubenswrapper[4743]: I1011 00:52:11.581765 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:11Z","lastTransitionTime":"2025-10-11T00:52:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:11 crc kubenswrapper[4743]: I1011 00:52:11.584308 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cd23d7ad070c4d2f029567ac2186d797b082b8da8ff2c9811982a1c5c7b0ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:11Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:11 crc kubenswrapper[4743]: I1011 00:52:11.630329 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bca7230a67c579c0fc54921d292f52e18ce2c25f79cd4a351a9f6b576df3553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab9cd20bc5b28791e21984a0edf4fbf50acc05d6160780d0696698715242b5d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:11Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:11 crc kubenswrapper[4743]: I1011 00:52:11.685088 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:11 crc kubenswrapper[4743]: I1011 00:52:11.685134 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 11 00:52:11 crc kubenswrapper[4743]: I1011 00:52:11.685143 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:11 crc kubenswrapper[4743]: I1011 00:52:11.685158 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:11 crc kubenswrapper[4743]: I1011 00:52:11.685169 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:11Z","lastTransitionTime":"2025-10-11T00:52:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:11 crc kubenswrapper[4743]: I1011 00:52:11.788279 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:11 crc kubenswrapper[4743]: I1011 00:52:11.788315 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:11 crc kubenswrapper[4743]: I1011 00:52:11.788325 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:11 crc kubenswrapper[4743]: I1011 00:52:11.788342 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:11 crc kubenswrapper[4743]: I1011 00:52:11.788352 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:11Z","lastTransitionTime":"2025-10-11T00:52:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:11 crc kubenswrapper[4743]: I1011 00:52:11.892264 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:11 crc kubenswrapper[4743]: I1011 00:52:11.892324 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:11 crc kubenswrapper[4743]: I1011 00:52:11.892334 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:11 crc kubenswrapper[4743]: I1011 00:52:11.892357 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:11 crc kubenswrapper[4743]: I1011 00:52:11.892370 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:11Z","lastTransitionTime":"2025-10-11T00:52:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:11 crc kubenswrapper[4743]: I1011 00:52:11.995597 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:11 crc kubenswrapper[4743]: I1011 00:52:11.995663 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:11 crc kubenswrapper[4743]: I1011 00:52:11.995682 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:11 crc kubenswrapper[4743]: I1011 00:52:11.995709 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:11 crc kubenswrapper[4743]: I1011 00:52:11.995729 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:11Z","lastTransitionTime":"2025-10-11T00:52:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:12 crc kubenswrapper[4743]: I1011 00:52:12.091695 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 00:52:12 crc kubenswrapper[4743]: I1011 00:52:12.091779 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 00:52:12 crc kubenswrapper[4743]: E1011 00:52:12.091869 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 11 00:52:12 crc kubenswrapper[4743]: I1011 00:52:12.091779 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 00:52:12 crc kubenswrapper[4743]: E1011 00:52:12.092095 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 11 00:52:12 crc kubenswrapper[4743]: E1011 00:52:12.092385 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 11 00:52:12 crc kubenswrapper[4743]: I1011 00:52:12.098978 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:12 crc kubenswrapper[4743]: I1011 00:52:12.099043 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:12 crc kubenswrapper[4743]: I1011 00:52:12.099066 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:12 crc kubenswrapper[4743]: I1011 00:52:12.099104 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:12 crc kubenswrapper[4743]: I1011 00:52:12.099130 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:12Z","lastTransitionTime":"2025-10-11T00:52:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:12 crc kubenswrapper[4743]: I1011 00:52:12.202958 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:12 crc kubenswrapper[4743]: I1011 00:52:12.203006 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:12 crc kubenswrapper[4743]: I1011 00:52:12.203023 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:12 crc kubenswrapper[4743]: I1011 00:52:12.203045 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:12 crc kubenswrapper[4743]: I1011 00:52:12.203063 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:12Z","lastTransitionTime":"2025-10-11T00:52:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:12 crc kubenswrapper[4743]: I1011 00:52:12.306083 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:12 crc kubenswrapper[4743]: I1011 00:52:12.306153 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:12 crc kubenswrapper[4743]: I1011 00:52:12.306167 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:12 crc kubenswrapper[4743]: I1011 00:52:12.306188 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:12 crc kubenswrapper[4743]: I1011 00:52:12.306201 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:12Z","lastTransitionTime":"2025-10-11T00:52:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:12 crc kubenswrapper[4743]: I1011 00:52:12.315593 4743 generic.go:334] "Generic (PLEG): container finished" podID="06a7b971-8779-491c-8d3f-e7d5b4d60968" containerID="a49661bf59f8e1d64e5980f0ed00dcc50c4c2ab52cc2232ade890883f0d0158d" exitCode=0 Oct 11 00:52:12 crc kubenswrapper[4743]: I1011 00:52:12.315691 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6wcnk" event={"ID":"06a7b971-8779-491c-8d3f-e7d5b4d60968","Type":"ContainerDied","Data":"a49661bf59f8e1d64e5980f0ed00dcc50c4c2ab52cc2232ade890883f0d0158d"} Oct 11 00:52:12 crc kubenswrapper[4743]: I1011 00:52:12.351167 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a657286e-3a6d-4265-ae33-a7d2ce8b64ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1558bde7a3b8b01c9ce7f6907ecc43abc6e27587236be7b41487968effa33bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2616059cd6d57ff41cdaa2e1b35e0e8b474398c434022ba6b9e926e19e64c0f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7f0392332521983f1ea8939df48e05618035ca97895122067daef61ee1fdb7af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b297361a1a2f13aa53d7a5a58948dbe86e8e00f888e6a69cc884154828c29a41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://530c39a66f06ca33cba6a3ab75cb68148425f5a732b503d054b6c74a9d48fabf\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-11T00:52:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1011 00:51:59.674497 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1011 00:51:59.678201 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3177302691/tls.crt::/tmp/serving-cert-3177302691/tls.key\\\\\\\"\\\\nI1011 00:52:05.851548 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1011 00:52:05.855373 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1011 00:52:05.855414 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1011 00:52:05.855455 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1011 00:52:05.855472 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1011 00:52:05.866907 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1011 00:52:05.866935 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 00:52:05.866939 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 00:52:05.866944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1011 00:52:05.866947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1011 00:52:05.866951 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1011 00:52:05.866954 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1011 00:52:05.866970 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1011 00:52:05.870518 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1c0227825fa743bb310a1dfb08333cea5ee16a2a2f30b77c97e2f28a2319542\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d411bef56dd0ccdad159fc018c15ad4f0df6adee56150b1143e2dcf442a0fe18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d411bef56dd0ccdad159fc018c15ad4f0df6adee56150b1143e2dcf442a0fe18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:12Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:12 crc kubenswrapper[4743]: I1011 00:52:12.368785 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vlxgw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d677d122-c8be-4938-8d2c-bde4a088a63a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dbaee9cca1154a094e092eaaf1dea2aeb6a61fbf9323611275485f75ea911e9\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95k6c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vlxgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:12Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:12 crc kubenswrapper[4743]: I1011 00:52:12.394580 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"83857db2-dda0-4b62-9336-3393a2c23f3c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a46e8adc9aa9255991b607e98eaf6fc411aeb2a6b0edf45e3e9a7935a2eae5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97b59a1efc62279df4a9766fae0e700a435812160cf3a40a915c967f7a99c9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe9d2bc75aa162a0cab825a1f13cf35f20491a0317000d5a9775977fb7f6b556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf3820ae4ab59d420d2e7f001b2953f751c6dd75f90b5fcc76fdaee8c8df722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f14caad841305b82f04cf14411e2308edbcdd8ea2261ff8401edc95900855ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://818ca392b918c66a8ca013f3ec5b938595fc78b726af7ae31524c09dbe9302f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://818ca392b918c66a8ca013f3ec5b938595fc78b726af7ae31524c09dbe9302f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44506333e8bd3a20777cf0c382b7652d6d3fa1cdc13f056412cf4899e25115e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44506333e8bd3a20777cf0c382b7652d6d3fa1cdc13f056412cf4899e25115e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e870a1289513fe9e864779c13e63c0991fdaded247b35ac75b7810ab27683ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e870a1289513fe9e864779c13e63c0991fdaded247b35ac75b7810ab27683ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:12Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:12 crc kubenswrapper[4743]: I1011 00:52:12.409210 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:12 crc kubenswrapper[4743]: I1011 00:52:12.409289 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:12 crc kubenswrapper[4743]: I1011 00:52:12.409308 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:12 crc kubenswrapper[4743]: I1011 00:52:12.409338 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:12 crc kubenswrapper[4743]: I1011 00:52:12.409359 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:12Z","lastTransitionTime":"2025-10-11T00:52:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:12 crc kubenswrapper[4743]: I1011 00:52:12.416507 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cd23d7ad070c4d2f029567ac2186d797b082b8da8ff2c9811982a1c5c7b0ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:12Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:12 crc kubenswrapper[4743]: I1011 00:52:12.437984 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bca7230a67c579c0fc54921d292f52e18ce2c25f79cd4a351a9f6b576df3553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab9cd20bc5b28791e21984a0edf4fbf50acc05d6160780d0696698715242b5d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:12Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:12 crc kubenswrapper[4743]: I1011 00:52:12.457274 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc432b8f-1fc8-4b7d-a202-bf1452921fc3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ed857549a3f84b624b8bf411ac4359bad74bbf8eea28bda8ebeda3b12dd34e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://748faceccab50f961795c16fb7c31665af75d570094d1dd2c765bb2215b65c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc8d989925344208a60f51034c7c486de0165d5c5a9a1f966df45ac5685c89f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34de48136b54818c60eb929d4b5374689378190620a6502448759f172856d6af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:12Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:12 crc kubenswrapper[4743]: I1011 00:52:12.476466 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:12Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:12 crc kubenswrapper[4743]: I1011 00:52:12.498486 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9jfxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8c603f4-717c-4554-992a-8338b3bef24d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853ca9eee244aad8c7006298d6e541ea07022917bee14083b2ded233009257ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5b2pr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9jfxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:12Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:12 crc kubenswrapper[4743]: I1011 00:52:12.511541 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"add92263-e252-446b-95de-092585b4357f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be9668b466043d0e86521a2ea925416a4987cb051ca0ab8764fd0fd3095991f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d42sg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16b18bcf8074
7537e2a7a31adc90eec7fdba85d8166f8cfc53709a1dba33de8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d42sg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cvm72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:12Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:12 crc kubenswrapper[4743]: I1011 00:52:12.513766 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:12 crc kubenswrapper[4743]: I1011 00:52:12.513798 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:12 crc kubenswrapper[4743]: I1011 00:52:12.513810 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 
11 00:52:12 crc kubenswrapper[4743]: I1011 00:52:12.513834 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:12 crc kubenswrapper[4743]: I1011 00:52:12.513846 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:12Z","lastTransitionTime":"2025-10-11T00:52:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:12 crc kubenswrapper[4743]: I1011 00:52:12.537988 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6wcnk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06a7b971-8779-491c-8d3f-e7d5b4d60968\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8adfcc8627f1a6ce588d50eabcc56c6814c3cff75d50144def98466d2dabb69a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8adfcc8627f1a6ce588d50eabcc56c6814c3cff75d50144def98466d2dabb69a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://800cdd03b73991fe9c1cfbe9a392b5f08854ca2f5f9e5106022bb67307578ad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://800cdd03b73991fe9c1cfbe9a392b5f08854ca2f5f9e5106022bb67307578ad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49661bf59f8e1d64e5980f0ed00dcc50c4c2
ab52cc2232ade890883f0d0158d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49661bf59f8e1d64e5980f0ed00dcc50c4c2ab52cc2232ade890883f0d0158d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6wcnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:12Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:12 crc kubenswrapper[4743]: I1011 00:52:12.561988 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ed16b35-862f-47f2-9e32-63c98f868fb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-48ljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:12Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:12 crc kubenswrapper[4743]: I1011 00:52:12.584624 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:12Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:12 crc kubenswrapper[4743]: I1011 00:52:12.600728 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd049f7c9c78e28d7cfdeb7087cf419e5db915fecd46a83ae0bdc8317fff9728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-11T00:52:12Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:12 crc kubenswrapper[4743]: I1011 00:52:12.615039 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:12Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:12 crc kubenswrapper[4743]: I1011 00:52:12.616346 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:12 crc kubenswrapper[4743]: I1011 00:52:12.616378 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:12 crc kubenswrapper[4743]: I1011 00:52:12.616389 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:12 crc kubenswrapper[4743]: I1011 00:52:12.616409 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:12 crc kubenswrapper[4743]: I1011 00:52:12.616424 4743 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:12Z","lastTransitionTime":"2025-10-11T00:52:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:12 crc kubenswrapper[4743]: I1011 00:52:12.718907 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:12 crc kubenswrapper[4743]: I1011 00:52:12.718945 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:12 crc kubenswrapper[4743]: I1011 00:52:12.718959 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:12 crc kubenswrapper[4743]: I1011 00:52:12.718977 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:12 crc kubenswrapper[4743]: I1011 00:52:12.718989 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:12Z","lastTransitionTime":"2025-10-11T00:52:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:12 crc kubenswrapper[4743]: I1011 00:52:12.822313 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:12 crc kubenswrapper[4743]: I1011 00:52:12.822385 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:12 crc kubenswrapper[4743]: I1011 00:52:12.822404 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:12 crc kubenswrapper[4743]: I1011 00:52:12.822435 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:12 crc kubenswrapper[4743]: I1011 00:52:12.822456 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:12Z","lastTransitionTime":"2025-10-11T00:52:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:12 crc kubenswrapper[4743]: I1011 00:52:12.925905 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:12 crc kubenswrapper[4743]: I1011 00:52:12.925973 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:12 crc kubenswrapper[4743]: I1011 00:52:12.925992 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:12 crc kubenswrapper[4743]: I1011 00:52:12.926023 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:12 crc kubenswrapper[4743]: I1011 00:52:12.926045 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:12Z","lastTransitionTime":"2025-10-11T00:52:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:13 crc kubenswrapper[4743]: I1011 00:52:13.029405 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:13 crc kubenswrapper[4743]: I1011 00:52:13.029466 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:13 crc kubenswrapper[4743]: I1011 00:52:13.029484 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:13 crc kubenswrapper[4743]: I1011 00:52:13.029513 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:13 crc kubenswrapper[4743]: I1011 00:52:13.029531 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:13Z","lastTransitionTime":"2025-10-11T00:52:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:13 crc kubenswrapper[4743]: I1011 00:52:13.133517 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:13 crc kubenswrapper[4743]: I1011 00:52:13.133599 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:13 crc kubenswrapper[4743]: I1011 00:52:13.133623 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:13 crc kubenswrapper[4743]: I1011 00:52:13.133658 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:13 crc kubenswrapper[4743]: I1011 00:52:13.133682 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:13Z","lastTransitionTime":"2025-10-11T00:52:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:13 crc kubenswrapper[4743]: I1011 00:52:13.239169 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:13 crc kubenswrapper[4743]: I1011 00:52:13.239749 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:13 crc kubenswrapper[4743]: I1011 00:52:13.239813 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:13 crc kubenswrapper[4743]: I1011 00:52:13.239887 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:13 crc kubenswrapper[4743]: I1011 00:52:13.239917 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:13Z","lastTransitionTime":"2025-10-11T00:52:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:13 crc kubenswrapper[4743]: I1011 00:52:13.323826 4743 generic.go:334] "Generic (PLEG): container finished" podID="06a7b971-8779-491c-8d3f-e7d5b4d60968" containerID="27d3d6d9f65eb7b710148bc4e077c74a15207b637727163afe631acfe3ec90f4" exitCode=0 Oct 11 00:52:13 crc kubenswrapper[4743]: I1011 00:52:13.323905 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6wcnk" event={"ID":"06a7b971-8779-491c-8d3f-e7d5b4d60968","Type":"ContainerDied","Data":"27d3d6d9f65eb7b710148bc4e077c74a15207b637727163afe631acfe3ec90f4"} Oct 11 00:52:13 crc kubenswrapper[4743]: I1011 00:52:13.341316 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cd23d7ad070c4d2f029567ac2186d797b082b8da8ff2c9811982a1c5c7b0ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCoun
t\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:13Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:13 crc kubenswrapper[4743]: I1011 00:52:13.343478 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:13 crc kubenswrapper[4743]: I1011 00:52:13.343655 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:13 crc kubenswrapper[4743]: I1011 00:52:13.343801 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:13 crc kubenswrapper[4743]: I1011 00:52:13.344002 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:13 crc kubenswrapper[4743]: I1011 00:52:13.344119 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:13Z","lastTransitionTime":"2025-10-11T00:52:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:13 crc kubenswrapper[4743]: I1011 00:52:13.366147 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bca7230a67c579c0fc54921d292f52e18ce2c25f79cd4a351a9f6b576df3553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\
\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab9cd20bc5b28791e21984a0edf4fbf50acc05d6160780d0696698715242b5d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:13Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:13 crc kubenswrapper[4743]: I1011 00:52:13.385157 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc432b8f-1fc8-4b7d-a202-bf1452921fc3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ed857549a3f84b624b8bf411ac4359bad74bbf8eea28bda8ebeda3b12dd34e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://748faceccab50f961795c16fb7c31665af75d570094d1dd2c765bb2215b65c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc8d989925344208a60f51034c7c486de0165d5c5a9a1f966df45ac5685c89f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34de48136b54818c60eb929d4b5374689378190620a6502448759f172856d6af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:13Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:13 crc kubenswrapper[4743]: I1011 00:52:13.408034 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:13Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:13 crc kubenswrapper[4743]: I1011 00:52:13.429501 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9jfxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8c603f4-717c-4554-992a-8338b3bef24d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853ca9eee244aad8c7006298d6e541ea07022917bee14083b2ded233009257ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5b2pr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9jfxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:13Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:13 crc kubenswrapper[4743]: I1011 00:52:13.446223 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:13Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:13 crc kubenswrapper[4743]: I1011 00:52:13.447940 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:13 crc kubenswrapper[4743]: I1011 00:52:13.448040 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:13 crc kubenswrapper[4743]: I1011 00:52:13.448061 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:13 crc kubenswrapper[4743]: I1011 00:52:13.448135 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:13 crc kubenswrapper[4743]: I1011 00:52:13.448157 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:13Z","lastTransitionTime":"2025-10-11T00:52:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:13 crc kubenswrapper[4743]: I1011 00:52:13.466233 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd049f7c9c78e28d7cfdeb7087cf419e5db915fecd46a83ae0bdc8317fff9728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:13Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:13 crc kubenswrapper[4743]: I1011 00:52:13.491422 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:13Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:13 crc kubenswrapper[4743]: I1011 00:52:13.509113 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"add92263-e252-446b-95de-092585b4357f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be9668b466043d0e86521a2ea925416a4987cb051ca0ab8764fd0fd3095991f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d42sg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16b18bcf80747537e2a7a31adc90eec7fdba85d8
166f8cfc53709a1dba33de8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d42sg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cvm72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:13Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:13 crc kubenswrapper[4743]: I1011 00:52:13.528073 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6wcnk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06a7b971-8779-491c-8d3f-e7d5b4d60968\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8adfcc8627f1a6ce588d50eabcc56c6814c3cff75d50144def98466d2dabb69a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8adfcc8627f1a6ce588d50eabcc56c6814c3cff75d50144def98466d2dabb69a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://800cdd03b73991fe9c1cfbe9a392b5f08854ca2f5f9e5106022bb67307578ad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://800cdd03b73991fe9c1cfbe9a392b5f08854ca2f5f9e5106022bb67307578ad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49661bf59f8e1d64e5980f0ed00dcc50c4c2ab52cc2232ade890883f0d0158d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49661bf59f8e1d64e5980f0ed00dcc50c4c2ab52cc2232ade890883f0d0158d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27d3d6d9f65eb7b710148bc4e077c74a15207b637727163afe631acfe3ec90f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27d3d6d9f65eb7b710148bc4e077c74a15207b637727163afe631acfe3ec90f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-6wcnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:13Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:13 crc kubenswrapper[4743]: I1011 00:52:13.551046 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:13 crc kubenswrapper[4743]: I1011 00:52:13.551091 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:13 crc kubenswrapper[4743]: I1011 00:52:13.551106 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:13 crc kubenswrapper[4743]: I1011 00:52:13.551127 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:13 crc kubenswrapper[4743]: I1011 00:52:13.551144 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:13Z","lastTransitionTime":"2025-10-11T00:52:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:13 crc kubenswrapper[4743]: I1011 00:52:13.553642 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ed16b35-862f-47f2-9e32-63c98f868fb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-48ljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:13Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:13 crc kubenswrapper[4743]: I1011 00:52:13.579377 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83857db2-dda0-4b62-9336-3393a2c23f3c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a46e8adc9aa9255991b607e98eaf6fc411aeb2a6b0edf45e3e9a7935a2eae5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97b59a1efc62279df4a9766fae0e700a435812160cf3a40a915c967f7a99c9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe9d2bc75aa162a0cab825a1f13cf35f20491a0317000d5a9775977fb7f6b556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf3820ae4ab59d420d2e7f001b2953f751c6dd75f90b5fcc76fdaee8c8df722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f14caad841305b82f04cf14411e2308edbcdd8ea2261ff8401edc95900855ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://818ca392b918c66a8ca013f3ec5b938595fc78b726af7ae31524c09dbe9302f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://818ca392b918c66a8ca013f3ec5b938595fc78b726af7ae31524c09dbe9302f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44506333e8bd3a20777cf0c382b7652d6d3fa1cdc13f056412cf4899e25115e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44506333e8bd3a20777cf0c382b7652d6d3fa1cdc13f056412cf4899e25115e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e870a1289513fe9e864779c13e63c0991fdaded247b35ac75b7810ab27683ebe\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e870a1289513fe9e864779c13e63c0991fdaded247b35ac75b7810ab27683ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:13Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:13 crc kubenswrapper[4743]: I1011 00:52:13.601709 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a657286e-3a6d-4265-ae33-a7d2ce8b64ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1558bde7a3b8b01c9ce7f6907ecc43abc6e27587236be7b41487968effa33bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2616059cd6d57ff41cdaa2e1b35e0e8b474398c434022ba6b9e926e19e64c0f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0392332521983f1ea8939df48e05618035ca97895122067daef61ee1fdb7af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b297361a1a2f13aa53d7a5a58948dbe86e8e00f888e6a69cc884154828c29a41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://530c39a66f06ca33cba6a3ab75cb68148425f5a732b503d054b6c74a9d48fabf\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-11T00:52:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1011 00:51:59.674497 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1011 00:51:59.678201 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3177302691/tls.crt::/tmp/serving-cert-3177302691/tls.key\\\\\\\"\\\\nI1011 00:52:05.851548 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1011 00:52:05.855373 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1011 00:52:05.855414 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1011 00:52:05.855455 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1011 00:52:05.855472 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1011 00:52:05.866907 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1011 00:52:05.866935 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 00:52:05.866939 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 00:52:05.866944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1011 00:52:05.866947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1011 00:52:05.866951 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1011 00:52:05.866954 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1011 00:52:05.866970 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1011 00:52:05.870518 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1c0227825fa743bb310a1dfb08333cea5ee16a2a2f30b77c97e2f28a2319542\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d411bef56dd0ccdad159fc018c15ad4f0df6adee56150b1143e2dcf442a0fe18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d411bef56dd0ccdad159fc018c15ad4f0df6adee56150b1143e2dcf442a0fe18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:13Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:13 crc kubenswrapper[4743]: I1011 00:52:13.617631 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vlxgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d677d122-c8be-4938-8d2c-bde4a088a63a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dbaee9cca1154a094e092eaaf1dea2aeb6a61fbf9323611275485f75ea911e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95k6c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vlxgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:13Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:13 crc kubenswrapper[4743]: I1011 00:52:13.654580 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 00:52:13 crc kubenswrapper[4743]: I1011 00:52:13.654829 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:13 crc kubenswrapper[4743]: E1011 00:52:13.654851 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 00:52:21.65480957 +0000 UTC m=+36.307790147 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 00:52:13 crc kubenswrapper[4743]: I1011 00:52:13.654929 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:13 crc kubenswrapper[4743]: I1011 00:52:13.655044 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:13 crc kubenswrapper[4743]: I1011 00:52:13.655081 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:13 crc kubenswrapper[4743]: I1011 00:52:13.655096 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:13Z","lastTransitionTime":"2025-10-11T00:52:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:13 crc kubenswrapper[4743]: I1011 00:52:13.756419 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 00:52:13 crc kubenswrapper[4743]: I1011 00:52:13.756501 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 00:52:13 crc kubenswrapper[4743]: I1011 00:52:13.756553 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 00:52:13 crc kubenswrapper[4743]: I1011 00:52:13.756596 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 00:52:13 crc kubenswrapper[4743]: E1011 00:52:13.756603 4743 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object 
"openshift-network-console"/"networking-console-plugin-cert" not registered Oct 11 00:52:13 crc kubenswrapper[4743]: E1011 00:52:13.756697 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-11 00:52:21.75666877 +0000 UTC m=+36.409649177 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 11 00:52:13 crc kubenswrapper[4743]: E1011 00:52:13.756748 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 11 00:52:13 crc kubenswrapper[4743]: E1011 00:52:13.756793 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 11 00:52:13 crc kubenswrapper[4743]: E1011 00:52:13.756825 4743 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 11 00:52:13 crc kubenswrapper[4743]: E1011 00:52:13.756977 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-11 00:52:21.756947327 +0000 UTC m=+36.409927774 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 11 00:52:13 crc kubenswrapper[4743]: E1011 00:52:13.756840 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 11 00:52:13 crc kubenswrapper[4743]: E1011 00:52:13.757026 4743 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 11 00:52:13 crc kubenswrapper[4743]: E1011 00:52:13.757101 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-11 00:52:21.757087621 +0000 UTC m=+36.410068038 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 11 00:52:13 crc kubenswrapper[4743]: E1011 00:52:13.756803 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 11 00:52:13 crc kubenswrapper[4743]: E1011 00:52:13.757130 4743 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 11 00:52:13 crc kubenswrapper[4743]: E1011 00:52:13.757194 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-11 00:52:21.757184483 +0000 UTC m=+36.410164890 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 11 00:52:13 crc kubenswrapper[4743]: I1011 00:52:13.758380 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:13 crc kubenswrapper[4743]: I1011 00:52:13.758461 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:13 crc kubenswrapper[4743]: I1011 00:52:13.758490 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:13 crc kubenswrapper[4743]: I1011 00:52:13.758522 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:13 crc kubenswrapper[4743]: I1011 00:52:13.758547 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:13Z","lastTransitionTime":"2025-10-11T00:52:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:13 crc kubenswrapper[4743]: I1011 00:52:13.861401 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:13 crc kubenswrapper[4743]: I1011 00:52:13.861450 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:13 crc kubenswrapper[4743]: I1011 00:52:13.861461 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:13 crc kubenswrapper[4743]: I1011 00:52:13.861480 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:13 crc kubenswrapper[4743]: I1011 00:52:13.861495 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:13Z","lastTransitionTime":"2025-10-11T00:52:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:13 crc kubenswrapper[4743]: I1011 00:52:13.964725 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:13 crc kubenswrapper[4743]: I1011 00:52:13.964807 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:13 crc kubenswrapper[4743]: I1011 00:52:13.964830 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:13 crc kubenswrapper[4743]: I1011 00:52:13.964901 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:13 crc kubenswrapper[4743]: I1011 00:52:13.964932 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:13Z","lastTransitionTime":"2025-10-11T00:52:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.067452 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.067500 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.067516 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.067538 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.067550 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:14Z","lastTransitionTime":"2025-10-11T00:52:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.091223 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.091282 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.091317 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 00:52:14 crc kubenswrapper[4743]: E1011 00:52:14.091425 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 11 00:52:14 crc kubenswrapper[4743]: E1011 00:52:14.092080 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 11 00:52:14 crc kubenswrapper[4743]: E1011 00:52:14.092237 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.098420 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-t9nsf"] Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.098931 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-t9nsf" Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.102476 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.102579 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.102601 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.102723 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.137027 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83857db2-dda0-4b62-9336-3393a2c23f3c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a46e8adc9aa9255991b607e98eaf6fc411aeb2a6b0edf45e3e9a7935a2eae5b\\\",\\\"image\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97b59a1efc62279df4a9766fae0e700a435812160cf3a40a915c967f7a99c9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe9d2bc75aa162a0cab825a1f13cf35f20491a0317000d5a9775977fb7f6b556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf3820ae4ab59d420d2e7f001b2953f751c6dd75f90b5fcc76fdaee8c8df722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f14caad841305b82f04cf14411e2308edbcdd8ea2261ff8401edc95900855ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\
\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://818ca392b918c66a8ca013f3ec5b938595fc78b726af7ae31524c09dbe9302f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://818ca392b918c66a8ca013f3ec5b938595fc78b726af7ae31524c09dbe9302f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44506333e8bd3a20777cf0c382b7652d6d3fa1cdc13f056412cf4899e25115e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44506333e8bd3a20777cf0c382b7652d6d3fa1cdc13f056412cf4899e25115e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}}}
,{\\\"containerID\\\":\\\"cri-o://e870a1289513fe9e864779c13e63c0991fdaded247b35ac75b7810ab27683ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e870a1289513fe9e864779c13e63c0991fdaded247b35ac75b7810ab27683ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:14Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.162110 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a657286e-3a6d-4265-ae33-a7d2ce8b64ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1558bde7a3b8b01c9ce7f6907ecc43abc6e27587236be7b41487968effa33bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2616059cd6d57ff41cdaa2e1b35e0e8b474398c434022ba6b9e926e19e64c0f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0392332521983f1ea8939df48e05618035ca97895122067daef61ee1fdb7af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b297361a1a2f13aa53d7a5a58948dbe86e8e00f888e6a69cc884154828c29a41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://530c39a66f06ca33cba6a3ab75cb68148425f5a732b503d054b6c74a9d48fabf\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-11T00:52:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1011 00:51:59.674497 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1011 00:51:59.678201 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3177302691/tls.crt::/tmp/serving-cert-3177302691/tls.key\\\\\\\"\\\\nI1011 00:52:05.851548 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1011 00:52:05.855373 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1011 00:52:05.855414 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1011 00:52:05.855455 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1011 00:52:05.855472 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1011 00:52:05.866907 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1011 00:52:05.866935 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 00:52:05.866939 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 00:52:05.866944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1011 00:52:05.866947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1011 00:52:05.866951 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1011 00:52:05.866954 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1011 00:52:05.866970 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1011 00:52:05.870518 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1c0227825fa743bb310a1dfb08333cea5ee16a2a2f30b77c97e2f28a2319542\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d411bef56dd0ccdad159fc018c15ad4f0df6adee56150b1143e2dcf442a0fe18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d411bef56dd0ccdad159fc018c15ad4f0df6adee56150b1143e2dcf442a0fe18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:14Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.171082 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.171473 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.171706 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.171925 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.172092 4743 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:14Z","lastTransitionTime":"2025-10-11T00:52:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.181405 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vlxgw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d677d122-c8be-4938-8d2c-bde4a088a63a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dbaee9cca1154a094e092eaaf1dea2aeb6a61fbf9323611275485f75ea911e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95k6c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vlxgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:14Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.202559 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-t9nsf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f07b1e57-3c09-4e75-866d-a4292db4e151\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c97lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:14Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-t9nsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:14Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.225985 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bca7230a67c579c0fc54921d292f52e18ce2c25f79cd4a351a9f6b576df3553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab9cd20bc5b28791e21984a0edf4fbf50acc05d6160780d0696698715242b5d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:14Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.257390 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cd23d7ad070c4d2f029567ac2186d797b082b8da8ff2c9811982a1c5c7b0ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:14Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.262651 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f07b1e57-3c09-4e75-866d-a4292db4e151-host\") pod \"node-ca-t9nsf\" (UID: \"f07b1e57-3c09-4e75-866d-a4292db4e151\") " pod="openshift-image-registry/node-ca-t9nsf" Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.262697 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c97lv\" (UniqueName: \"kubernetes.io/projected/f07b1e57-3c09-4e75-866d-a4292db4e151-kube-api-access-c97lv\") pod \"node-ca-t9nsf\" (UID: \"f07b1e57-3c09-4e75-866d-a4292db4e151\") " pod="openshift-image-registry/node-ca-t9nsf" Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.262736 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f07b1e57-3c09-4e75-866d-a4292db4e151-serviceca\") pod \"node-ca-t9nsf\" (UID: \"f07b1e57-3c09-4e75-866d-a4292db4e151\") " pod="openshift-image-registry/node-ca-t9nsf" Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.278189 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:14Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.280767 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.280812 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.280825 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.280846 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.280886 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:14Z","lastTransitionTime":"2025-10-11T00:52:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.297124 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9jfxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8c603f4-717c-4554-992a-8338b3bef24d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853ca9eee244aad8c7006298d6e541ea07022917bee14083b2ded233009257ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5b2pr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9jfxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:14Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.313920 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc432b8f-1fc8-4b7d-a202-bf1452921fc3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ed857549a3f84b624b8bf411ac4359bad74bbf8eea28bda8ebeda3b12dd34e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static
-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://748faceccab50f961795c16fb7c31665af75d570094d1dd2c765bb2215b65c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc8d989925344208a60f51034c7c486de0165d5c5a9a1f966df45ac5685c89f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34de48136b54818c60eb929d4b5374689378190620a6502448759f172856d6af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imag
eID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:14Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.329457 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd049f7c9c78e28d7cfdeb7087cf419e5db915fecd46a83ae0bdc8317fff9728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-11T00:52:14Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.335840 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" event={"ID":"9ed16b35-862f-47f2-9e32-63c98f868fb8","Type":"ContainerStarted","Data":"9e5db0acfb2627ea36e6ba19e1dc664129ca14d1320f74a4d5b4fc28cc6967cb"} Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.337237 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.337298 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.341039 4743 generic.go:334] "Generic (PLEG): container finished" podID="06a7b971-8779-491c-8d3f-e7d5b4d60968" containerID="00edafd52ba2a97f21af491b5233bc2ad945e6b77106eb94a947b777a0129cfb" exitCode=0 Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.341142 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6wcnk" event={"ID":"06a7b971-8779-491c-8d3f-e7d5b4d60968","Type":"ContainerDied","Data":"00edafd52ba2a97f21af491b5233bc2ad945e6b77106eb94a947b777a0129cfb"} Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.351002 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:14Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.364573 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f07b1e57-3c09-4e75-866d-a4292db4e151-host\") pod \"node-ca-t9nsf\" (UID: \"f07b1e57-3c09-4e75-866d-a4292db4e151\") " pod="openshift-image-registry/node-ca-t9nsf" Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.364757 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f07b1e57-3c09-4e75-866d-a4292db4e151-host\") pod \"node-ca-t9nsf\" (UID: \"f07b1e57-3c09-4e75-866d-a4292db4e151\") " pod="openshift-image-registry/node-ca-t9nsf" Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.364765 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c97lv\" (UniqueName: 
\"kubernetes.io/projected/f07b1e57-3c09-4e75-866d-a4292db4e151-kube-api-access-c97lv\") pod \"node-ca-t9nsf\" (UID: \"f07b1e57-3c09-4e75-866d-a4292db4e151\") " pod="openshift-image-registry/node-ca-t9nsf" Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.365014 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f07b1e57-3c09-4e75-866d-a4292db4e151-serviceca\") pod \"node-ca-t9nsf\" (UID: \"f07b1e57-3c09-4e75-866d-a4292db4e151\") " pod="openshift-image-registry/node-ca-t9nsf" Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.367842 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f07b1e57-3c09-4e75-866d-a4292db4e151-serviceca\") pod \"node-ca-t9nsf\" (UID: \"f07b1e57-3c09-4e75-866d-a4292db4e151\") " pod="openshift-image-registry/node-ca-t9nsf" Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.367752 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"add92263-e252-446b-95de-092585b4357f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be9668b466043d0e86521a2ea925416a4987cb051ca0ab8764fd0fd3095991f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d42sg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16b18bcf80747537e2a7a31adc90eec7fdba85d8
166f8cfc53709a1dba33de8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d42sg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cvm72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:14Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.383766 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.384544 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.384588 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:14 
crc kubenswrapper[4743]: I1011 00:52:14.384600 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.384620 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.384635 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:14Z","lastTransitionTime":"2025-10-11T00:52:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.386889 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.388972 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6wcnk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06a7b971-8779-491c-8d3f-e7d5b4d60968\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8adfcc8627f1a6ce588d50eabcc56c6814c3cff75d50144def98466d2dabb69a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://8adfcc8627f1a6ce588d50eabcc56c6814c3cff75d50144def98466d2dabb69a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://800cdd03b73991fe9c1cfbe9a392b5f08854ca2f5f9e5106022bb67307578ad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://800cdd03b73991fe9c1cfbe9a392b5f08854ca2f5f9e5106022bb67307578ad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49661bf59f8e1d64e5980f0ed00dcc50c4c2ab52cc2232ade890883f0d0158d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49661bf59f8e1d64e5980f0ed00dcc50c4c2ab52cc2232ade890883f0d0158d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27d3d6d9f65eb7b710148bc4e077c74a15207b637727163afe631acfe3ec90f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27d3d6d9f65eb7b710148bc4e077c74a15207b637727163afe631acfe3ec90f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6wcnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:14Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.401909 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c97lv\" (UniqueName: \"kubernetes.io/projected/f07b1e57-3c09-4e75-866d-a4292db4e151-kube-api-access-c97lv\") pod \"node-ca-t9nsf\" (UID: \"f07b1e57-3c09-4e75-866d-a4292db4e151\") " pod="openshift-image-registry/node-ca-t9nsf" Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.417367 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ed16b35-862f-47f2-9e32-63c98f868fb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-48ljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:14Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.427185 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-t9nsf" Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.435583 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:14Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:14 crc kubenswrapper[4743]: W1011 00:52:14.440938 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf07b1e57_3c09_4e75_866d_a4292db4e151.slice/crio-ea5d109c3a0af4425ae72d731a3036c105fbc0c5c1ea567e9106600025fc62a2 WatchSource:0}: Error finding container ea5d109c3a0af4425ae72d731a3036c105fbc0c5c1ea567e9106600025fc62a2: Status 404 returned error can't find the container with id ea5d109c3a0af4425ae72d731a3036c105fbc0c5c1ea567e9106600025fc62a2 Oct 11 
00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.454358 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9jfxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8c603f4-717c-4554-992a-8338b3bef24d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853ca9eee244aad8c7006298d6e541ea07022917bee14083b2ded233009257ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"
mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5b2pr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9jfxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:14Z is after 2025-08-24T17:21:41Z" Oct 11 
00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.478064 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc432b8f-1fc8-4b7d-a202-bf1452921fc3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ed857549a3f84b624b8bf411ac4359bad74bbf8eea28bda8ebeda3b12dd34e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://748faceccab50f961795c16fb7c31665af75d570094d1dd2c765bb2
215b65c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc8d989925344208a60f51034c7c486de0165d5c5a9a1f966df45ac5685c89f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34de48136b54818c60eb929d4b5374689378190620a6502448759f172856d6af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287f
aaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:14Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.487689 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.487736 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.487754 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.487779 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.487797 4743 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:14Z","lastTransitionTime":"2025-10-11T00:52:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.494580 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:14Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.518720 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:14Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.535848 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"add92263-e252-446b-95de-092585b4357f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be9668b466043d0e86521a2ea925416a4987cb051ca0ab8764fd0fd3095991f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d42sg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16b18bcf80747537e2a7a31adc90eec7fdba85d8
166f8cfc53709a1dba33de8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d42sg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cvm72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:14Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.558187 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6wcnk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06a7b971-8779-491c-8d3f-e7d5b4d60968\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8adfcc8627f1a6ce588d50eabcc56c6814c3cff75d50144def98466d2dabb69a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8adfcc8627f1a6ce588d50eabcc56c6814c3cff75d50144def98466d2dabb69a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://800cdd03b73991fe9c1cfbe9a392b5f08854ca2f5f9e5106022bb67307578ad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://800cdd03b73991fe9c1cfbe9a392b5f08854ca2f5f9e5106022bb67307578ad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49661bf59f8e1d64e5980f0ed00dcc50c4c2ab52cc2232ade890883f0d0158d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49661bf59f8e1d64e5980f0ed00dcc50c4c2ab52cc2232ade890883f0d0158d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27d3d6d9f65eb7b710148bc4e077c74a15207b637727163afe631acfe3ec90f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27d3d6d9f65eb7b710148bc4e077c74a15207b637727163afe631acfe3ec90f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00edafd52ba2a97f21af491b5233bc2ad945e6b77106eb94a947b777a0129cfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00edafd52ba2a97f21af491b5233bc2ad945e6b77106eb94a947b777a0129cfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6wcnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:14Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.582648 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ed16b35-862f-47f2-9e32-63c98f868fb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d54b05ec17db8c8c6c3ff8e4d703de8f1f1f0d7e9b5b0133afdf30e3c2b8acee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4127195cac8b12296c2aaefc500beb5f746bee518f3885c094e7bca83ade5e6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://487448bc7b1826df7dbc3aeac42ad10d6aed3390c27c95dfda1b81fe09e27f35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acfe8e798b39769fe2643c6f761b388dc220ccfc707f0023dd680196514b8885\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17db7cfee85ea1051eba348aa0ec7bc8e1354c49e3e4740745a51d3db0ffcf63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d385e354cb831573c89886ade9127287f0974b7a7c244ead9e5547e0211fd9fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e5db0acfb2627ea36e6ba19e1dc664129ca14d1320f74a4d5b4fc28cc6967cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mou
ntPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecec5075382354c47a6ec8f53263fe6c244c7aadfeab1dda60dd2e88cc8724b0\\\",\
\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-48ljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:14Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.591260 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.591325 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.591343 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.591372 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.591392 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:14Z","lastTransitionTime":"2025-10-11T00:52:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.598412 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:14Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.619247 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd049f7c9c78e28d7cfdeb7087cf419e5db915fecd46a83ae0bdc8317fff9728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-11T00:52:14Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.646442 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83857db2-dda0-4b62-9336-3393a2c23f3c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a46e8adc9aa9255991b607e98eaf6fc411aeb2a6b0edf45e3e9a7935a2eae5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97b59a1efc62279df4a9766fae0e700a435812160cf3a40a915c967f7a99c9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe9d2bc75aa162a0cab825a1f13cf35f20491a0317000d5a9775977fb7f6b556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf3820ae4ab59d420d2e7f001b2953f751c6dd75f90b5fcc76fdaee8c8df722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f14caad841305b82f04cf14411e2308edbcdd8ea2261ff8401edc95900855ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://818ca392b918c66a8ca013f3ec5b938595fc78b726af7ae31524c09dbe9302f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://818ca392b918c66a8ca013f3ec5b938595fc78b726af7ae31524c09dbe9302f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44506333e8bd3a20777cf0c382b7652d6d3fa1cdc13f056412cf4899e25115e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44506333e8bd3a20777cf0c382b7652d6d3fa1cdc13f056412cf4899e25115e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e870a1289513fe9e864779c13e63c0991fdaded247b35ac75b7810ab27683ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e870a1289513fe9e864779c13e63c0991fdaded247b35ac75b7810ab27683ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:49Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:14Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.662822 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a657286e-3a6d-4265-ae33-a7d2ce8b64ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1558bde7a3b8b01c9ce7f6907ecc43abc6e27587236be7b41487968effa33bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2616059cd6d57ff41cdaa2e1b35e0e8b474398c434022ba6b9e926e19e64c0f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7f0392332521983f1ea8939df48e05618035ca97895122067daef61ee1fdb7af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b297361a1a2f13aa53d7a5a58948dbe86e8e00f888e6a69cc884154828c29a41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://530c39a66f06ca33cba6a3ab75cb68148425f5a732b503d054b6c74a9d48fabf\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-11T00:52:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1011 00:51:59.674497 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1011 00:51:59.678201 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3177302691/tls.crt::/tmp/serving-cert-3177302691/tls.key\\\\\\\"\\\\nI1011 00:52:05.851548 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1011 00:52:05.855373 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1011 00:52:05.855414 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1011 00:52:05.855455 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1011 00:52:05.855472 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1011 00:52:05.866907 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1011 00:52:05.866935 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 00:52:05.866939 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 00:52:05.866944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1011 00:52:05.866947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1011 00:52:05.866951 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1011 00:52:05.866954 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1011 00:52:05.866970 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1011 00:52:05.870518 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1c0227825fa743bb310a1dfb08333cea5ee16a2a2f30b77c97e2f28a2319542\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d411bef56dd0ccdad159fc018c15ad4f0df6adee56150b1143e2dcf442a0fe18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d411bef56dd0ccdad159fc018c15ad4f0df6adee56150b1143e2dcf442a0fe18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:14Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.674917 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vlxgw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d677d122-c8be-4938-8d2c-bde4a088a63a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dbaee9cca1154a094e092eaaf1dea2aeb6a61fbf9323611275485f75ea911e9\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95k6c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vlxgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:14Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.685955 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-t9nsf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f07b1e57-3c09-4e75-866d-a4292db4e151\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c97lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:14Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-t9nsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:14Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.693551 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.693691 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.693779 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 
00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.693894 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.693988 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:14Z","lastTransitionTime":"2025-10-11T00:52:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.698543 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cd23d7ad070c4d2f029567ac2186d797b082b8da8ff2c9811982a1c5c7b0ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operat
or\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:14Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.712114 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bca7230a67c579c0fc54921d292f52e18ce2c25f79cd4a351a9f6b576df3553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab9cd20bc5b28791e21984a0edf4fbf50acc05d6160780d0696698715242b5d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:14Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.797191 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.797265 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.797287 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.797318 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.797339 4743 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:14Z","lastTransitionTime":"2025-10-11T00:52:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.900467 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.900918 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.901037 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.901141 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:14 crc kubenswrapper[4743]: I1011 00:52:14.901205 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:14Z","lastTransitionTime":"2025-10-11T00:52:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:15 crc kubenswrapper[4743]: I1011 00:52:15.005542 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:15 crc kubenswrapper[4743]: I1011 00:52:15.005960 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:15 crc kubenswrapper[4743]: I1011 00:52:15.005980 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:15 crc kubenswrapper[4743]: I1011 00:52:15.006008 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:15 crc kubenswrapper[4743]: I1011 00:52:15.006027 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:15Z","lastTransitionTime":"2025-10-11T00:52:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:15 crc kubenswrapper[4743]: I1011 00:52:15.109403 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:15 crc kubenswrapper[4743]: I1011 00:52:15.110019 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:15 crc kubenswrapper[4743]: I1011 00:52:15.110175 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:15 crc kubenswrapper[4743]: I1011 00:52:15.110312 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:15 crc kubenswrapper[4743]: I1011 00:52:15.110445 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:15Z","lastTransitionTime":"2025-10-11T00:52:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:15 crc kubenswrapper[4743]: I1011 00:52:15.214532 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:15 crc kubenswrapper[4743]: I1011 00:52:15.214596 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:15 crc kubenswrapper[4743]: I1011 00:52:15.214613 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:15 crc kubenswrapper[4743]: I1011 00:52:15.214639 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:15 crc kubenswrapper[4743]: I1011 00:52:15.214656 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:15Z","lastTransitionTime":"2025-10-11T00:52:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:15 crc kubenswrapper[4743]: I1011 00:52:15.317810 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:15 crc kubenswrapper[4743]: I1011 00:52:15.317948 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:15 crc kubenswrapper[4743]: I1011 00:52:15.317968 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:15 crc kubenswrapper[4743]: I1011 00:52:15.317998 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:15 crc kubenswrapper[4743]: I1011 00:52:15.318018 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:15Z","lastTransitionTime":"2025-10-11T00:52:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:15 crc kubenswrapper[4743]: I1011 00:52:15.351034 4743 generic.go:334] "Generic (PLEG): container finished" podID="06a7b971-8779-491c-8d3f-e7d5b4d60968" containerID="276d1e264f541d55720a57945800ec969959c0828071dcda4fb1c0f13b17118b" exitCode=0 Oct 11 00:52:15 crc kubenswrapper[4743]: I1011 00:52:15.351109 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6wcnk" event={"ID":"06a7b971-8779-491c-8d3f-e7d5b4d60968","Type":"ContainerDied","Data":"276d1e264f541d55720a57945800ec969959c0828071dcda4fb1c0f13b17118b"} Oct 11 00:52:15 crc kubenswrapper[4743]: I1011 00:52:15.355098 4743 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 11 00:52:15 crc kubenswrapper[4743]: I1011 00:52:15.356326 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-t9nsf" event={"ID":"f07b1e57-3c09-4e75-866d-a4292db4e151","Type":"ContainerStarted","Data":"fc297cc894f36c7afa9ce15a8e9c71b2d7bf66042c55d0c7686f20b6a3ece7be"} Oct 11 00:52:15 crc kubenswrapper[4743]: I1011 00:52:15.356385 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-t9nsf" event={"ID":"f07b1e57-3c09-4e75-866d-a4292db4e151","Type":"ContainerStarted","Data":"ea5d109c3a0af4425ae72d731a3036c105fbc0c5c1ea567e9106600025fc62a2"} Oct 11 00:52:15 crc kubenswrapper[4743]: I1011 00:52:15.391756 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ed16b35-862f-47f2-9e32-63c98f868fb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d54b05ec17db8c8c6c3ff8e4d703de8f1f1f0d7e9b5b0133afdf30e3c2b8acee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4127195cac8b12296c2aaefc500beb5f746bee518f3885c094e7bca83ade5e6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://487448bc7b1826df7dbc3aeac42ad10d6aed3390c27c95dfda1b81fe09e27f35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acfe8e798b39769fe2643c6f761b388dc220ccfc707f0023dd680196514b8885\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17db7cfee85ea1051eba348aa0ec7bc8e1354c49e3e4740745a51d3db0ffcf63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d385e354cb831573c89886ade9127287f0974b7a7c244ead9e5547e0211fd9fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e5db0acfb2627ea36e6ba19e1dc664129ca14d1320f74a4d5b4fc28cc6967cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecec5075382354c47a6ec8f53263fe6c244c7aadfeab1dda60dd2e88cc8724b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-48ljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:15Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:15 crc kubenswrapper[4743]: I1011 00:52:15.418145 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:15Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:15 crc kubenswrapper[4743]: I1011 00:52:15.422597 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:15 crc kubenswrapper[4743]: I1011 00:52:15.423424 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:15 crc kubenswrapper[4743]: I1011 00:52:15.423454 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:15 crc kubenswrapper[4743]: I1011 00:52:15.423501 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:15 crc kubenswrapper[4743]: I1011 00:52:15.423521 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:15Z","lastTransitionTime":"2025-10-11T00:52:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:15 crc kubenswrapper[4743]: I1011 00:52:15.441194 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd049f7c9c78e28d7cfdeb7087cf419e5db915fecd46a83ae0bdc8317fff9728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:15Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:15 crc kubenswrapper[4743]: I1011 00:52:15.459008 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:15Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:15 crc kubenswrapper[4743]: I1011 00:52:15.476311 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"add92263-e252-446b-95de-092585b4357f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be9668b466043d0e86521a2ea925416a4987cb051ca0ab8764fd0fd3095991f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d42sg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16b18bcf80747537e2a7a31adc90eec7fdba85d8
166f8cfc53709a1dba33de8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d42sg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cvm72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:15Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:15 crc kubenswrapper[4743]: I1011 00:52:15.498145 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6wcnk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06a7b971-8779-491c-8d3f-e7d5b4d60968\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8adfcc8627f1a6ce588d50eabcc56c6814c3cff75d50144def98466d2dabb69a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8adfcc8627f1a6ce588d50eabcc56c6814c3cff75d50144def98466d2dabb69a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://800cdd03b73991fe9c1cfbe9a392b5f08854ca2f5f9e5106022bb67307578ad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://800cdd03b73991fe9c1cfbe9a392b5f08854ca2f5f9e5106022bb67307578ad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49661bf59f8e1d64e5980f0ed00dcc50c4c2ab52cc2232ade890883f0d0158d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49661bf59f8e1d64e5980f0ed00dcc50c4c2ab52cc2232ade890883f0d0158d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27d3d6d9f65eb7b710148bc4e077c74a15207b637727163afe631acfe3ec90f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27d3d6d9f65eb7b710148bc4e077c74a15207b637727163afe631acfe3ec90f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00edafd52ba2a97f21af491b5233bc2ad945e6b77106eb94a947b777a0129cfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00edafd52ba2a97f21af491b5233bc2ad945e6b77106eb94a947b777a0129cfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://276d1e264f541d55720a57945800ec969959c0828071dcda4fb1c0f13b17118b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://276d1e264f541d55720a57945800ec969959c0828071dcda4fb1c0f13b17118b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6wcnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:15Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:15 crc kubenswrapper[4743]: I1011 00:52:15.513734 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-t9nsf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f07b1e57-3c09-4e75-866d-a4292db4e151\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c97lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:14Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-t9nsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:15Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:15 crc kubenswrapper[4743]: I1011 00:52:15.526999 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:15 crc kubenswrapper[4743]: I1011 00:52:15.527056 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:15 crc kubenswrapper[4743]: I1011 00:52:15.527071 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 
00:52:15 crc kubenswrapper[4743]: I1011 00:52:15.527094 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:15 crc kubenswrapper[4743]: I1011 00:52:15.527136 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:15Z","lastTransitionTime":"2025-10-11T00:52:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:15 crc kubenswrapper[4743]: I1011 00:52:15.547468 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83857db2-dda0-4b62-9336-3393a2c23f3c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a46e8adc9aa9255991b607e98eaf6fc411aeb2a6b0edf45e3e9a7935a2eae5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97b59a1efc62279df4a9766fae0e700a435812160cf3a40a915c967f7a99c9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe9d2bc75aa162a0cab825a1f13cf35f20491a0317000d5a9775977fb7f6b556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf3820ae4ab59d420d2e7f001b2953f751c6dd75f90b5fcc76fdaee8c8df722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f14caad841305b82f04cf14411e2308edbcdd8ea2261ff8401edc95900855ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/
var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://818ca392b918c66a8ca013f3ec5b938595fc78b726af7ae31524c09dbe9302f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://818ca392b918c66a8ca013f3ec5b938595fc78b726af7ae31524c09dbe9302f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44506333e8bd3a20777cf0c382b7652d6d3fa1cdc13f056412cf4899e25115e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44506333e8bd3a20777cf0c382b7652d6d3fa1cdc13f056412cf4899e25115e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e870a1289513fe9e864779c13e63c0991fdaded247b35ac75b7810ab27683ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-de
v/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e870a1289513fe9e864779c13e63c0991fdaded247b35ac75b7810ab27683ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:15Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:15 crc kubenswrapper[4743]: I1011 00:52:15.567990 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a657286e-3a6d-4265-ae33-a7d2ce8b64ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1558bde7a3b8b01c9ce7f6907ecc43abc6e27587236be7b41487968effa33bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2616059cd6d57ff41cdaa2e1b35e0e8b474398c434022ba6b9e926e19e64c0f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0392332521983f1ea8939df48e05618035ca97895122067daef61ee1fdb7af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b297361a1a2f13aa53d7a5a58948dbe86e8e00f888e6a69cc884154828c29a41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://530c39a66f06ca33cba6a3ab75cb68148425f5a732b503d054b6c74a9d48fabf\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-11T00:52:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1011 00:51:59.674497 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1011 00:51:59.678201 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3177302691/tls.crt::/tmp/serving-cert-3177302691/tls.key\\\\\\\"\\\\nI1011 00:52:05.851548 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1011 00:52:05.855373 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1011 00:52:05.855414 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1011 00:52:05.855455 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1011 00:52:05.855472 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1011 00:52:05.866907 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1011 00:52:05.866935 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 00:52:05.866939 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 00:52:05.866944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1011 00:52:05.866947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1011 00:52:05.866951 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1011 00:52:05.866954 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1011 00:52:05.866970 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1011 00:52:05.870518 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1c0227825fa743bb310a1dfb08333cea5ee16a2a2f30b77c97e2f28a2319542\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d411bef56dd0ccdad159fc018c15ad4f0df6adee56150b1143e2dcf442a0fe18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d411bef56dd0ccdad159fc018c15ad4f0df6adee56150b1143e2dcf442a0fe18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:15Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:15 crc kubenswrapper[4743]: I1011 00:52:15.584629 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vlxgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d677d122-c8be-4938-8d2c-bde4a088a63a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dbaee9cca1154a094e092eaaf1dea2aeb6a61fbf9323611275485f75ea911e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95k6c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vlxgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:15Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:15 crc kubenswrapper[4743]: I1011 00:52:15.609659 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cd23d7ad070c4d2f029567ac2186d797b082b8da8ff2c9811982a1c5c7b0ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:15Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:15 crc kubenswrapper[4743]: I1011 00:52:15.631271 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:15 crc kubenswrapper[4743]: I1011 00:52:15.631326 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:15 crc kubenswrapper[4743]: I1011 00:52:15.631340 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:15 crc kubenswrapper[4743]: I1011 00:52:15.631364 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:15 crc kubenswrapper[4743]: I1011 00:52:15.631381 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:15Z","lastTransitionTime":"2025-10-11T00:52:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:15 crc kubenswrapper[4743]: I1011 00:52:15.632558 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bca7230a67c579c0fc54921d292f52e18ce2c25f79cd4a351a9f6b576df3553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab9cd20bc5b28791e21984a0edf4fbf50acc05d6160780d0696698715242b5d0\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:15Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:15 crc kubenswrapper[4743]: I1011 00:52:15.656975 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc432b8f-1fc8-4b7d-a202-bf1452921fc3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ed857549a3f84b624b8bf411ac4359bad74bbf8eea28bda8ebeda3b12dd34e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://748faceccab50f961795c16fb7c31665af75d570094d1dd2c765bb2215b65c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc8d989925344208a60f51034c7c486de0165d5c5a9a1f966df45ac5685c89f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34de48136b54818c60eb929d4b5374689378190620a6502448759f172856d6af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:15Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:15 crc kubenswrapper[4743]: I1011 00:52:15.671452 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:15Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:15 crc kubenswrapper[4743]: I1011 00:52:15.686340 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9jfxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8c603f4-717c-4554-992a-8338b3bef24d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853ca9eee244aad8c7006298d6e541ea07022917bee14083b2ded233009257ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5b2pr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9jfxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:15Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:15 crc kubenswrapper[4743]: I1011 00:52:15.704389 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9jfxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8c603f4-717c-4554-992a-8338b3bef24d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853ca9eee244aad8c7006298d6e541ea07022917bee14083b2ded233009257ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5b2pr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9jfxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:15Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:15 crc kubenswrapper[4743]: I1011 00:52:15.717986 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc432b8f-1fc8-4b7d-a202-bf1452921fc3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ed857549a3f84b624b8bf411ac4359bad74bbf8eea28bda8ebeda3b12dd34e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://748faceccab50f961795c16fb7c31665af75d570094d1dd2c765bb2215b65c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc8d989925344208a60f51034c7c486de0165d5c5a9a1f966df45ac5685c89f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34de48136b54818c60eb929d4b5374689378190620a6502448759f172856d6af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:15Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:15 crc kubenswrapper[4743]: I1011 00:52:15.730394 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:15Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:15 crc kubenswrapper[4743]: I1011 00:52:15.734554 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:15 crc kubenswrapper[4743]: I1011 00:52:15.734619 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:15 crc kubenswrapper[4743]: I1011 00:52:15.734638 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:15 crc kubenswrapper[4743]: I1011 
00:52:15.734667 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:15 crc kubenswrapper[4743]: I1011 00:52:15.734685 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:15Z","lastTransitionTime":"2025-10-11T00:52:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:15 crc kubenswrapper[4743]: I1011 00:52:15.748603 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:15Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:15 crc kubenswrapper[4743]: I1011 00:52:15.765913 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"add92263-e252-446b-95de-092585b4357f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be9668b466043d0e86521a2ea925416a4987cb051ca0ab8764fd0fd3095991f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d42sg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16b18bcf80747537e2a7a31adc90eec7fdba85d8
166f8cfc53709a1dba33de8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d42sg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cvm72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:15Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:15 crc kubenswrapper[4743]: I1011 00:52:15.798095 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6wcnk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06a7b971-8779-491c-8d3f-e7d5b4d60968\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8adfcc8627f1a6ce588d50eabcc56c6814c3cff75d50144def98466d2dabb69a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8adfcc8627f1a6ce588d50eabcc56c6814c3cff75d50144def98466d2dabb69a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://800cdd03b73991fe9c1cfbe9a392b5f08854ca2f5f9e5106022bb67307578ad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://800cdd03b73991fe9c1cfbe9a392b5f08854ca2f5f9e5106022bb67307578ad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49661bf59f8e1d64e5980f0ed00dcc50c4c2ab52cc2232ade890883f0d0158d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49661bf59f8e1d64e5980f0ed00dcc50c4c2ab52cc2232ade890883f0d0158d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27d3d6d9f65eb7b710148bc4e077c74a15207b637727163afe631acfe3ec90f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27d3d6d9f65eb7b710148bc4e077c74a15207b637727163afe631acfe3ec90f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00edafd52ba2a97f21af491b5233bc2ad945e6b77106eb94a947b777a0129cfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00edafd52ba2a97f21af491b5233bc2ad945e6b77106eb94a947b777a0129cfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://276d1e264f541d55720a57945800ec969959c0828071dcda4fb1c0f13b17118b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://276d1e264f541d55720a57945800ec969959c0828071dcda4fb1c0f13b17118b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6wcnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:15Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:15 crc kubenswrapper[4743]: I1011 00:52:15.823777 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ed16b35-862f-47f2-9e32-63c98f868fb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d54b05ec17db8c8c6c3ff8e4d703de8f1f1f0d7e9b5b0133afdf30e3c2b8acee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4127195cac8b12296c2aaefc500beb5f746bee518f3885c094e7bca83ade5e6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://487448bc7b1826df7dbc3aeac42ad10d6aed3390c27c95dfda1b81fe09e27f35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acfe8e798b39769fe2643c6f761b388dc220ccfc707f0023dd680196514b8885\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17db7cfee85ea1051eba348aa0ec7bc8e1354c49e3e4740745a51d3db0ffcf63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d385e354cb831573c89886ade9127287f0974b7a7c244ead9e5547e0211fd9fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e5db0acfb2627ea36e6ba19e1dc664129ca14d1320f74a4d5b4fc28cc6967cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecec5075382354c47a6ec8f53263fe6c244c7aadfeab1dda60dd2e88cc8724b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-48ljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:15Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:15 crc kubenswrapper[4743]: I1011 00:52:15.836983 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:15 crc kubenswrapper[4743]: I1011 00:52:15.837037 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:15 crc kubenswrapper[4743]: I1011 00:52:15.837051 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:15 crc kubenswrapper[4743]: I1011 00:52:15.837070 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:15 crc kubenswrapper[4743]: I1011 00:52:15.837084 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:15Z","lastTransitionTime":"2025-10-11T00:52:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:15 crc kubenswrapper[4743]: I1011 00:52:15.846716 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:15Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:15 crc kubenswrapper[4743]: I1011 00:52:15.860492 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd049f7c9c78e28d7cfdeb7087cf419e5db915fecd46a83ae0bdc8317fff9728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-11T00:52:15Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:15 crc kubenswrapper[4743]: I1011 00:52:15.889499 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83857db2-dda0-4b62-9336-3393a2c23f3c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a46e8adc9aa9255991b607e98eaf6fc411aeb2a6b0edf45e3e9a7935a2eae5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97b59a1efc62279df4a9766fae0e700a435812160cf3a40a915c967f7a99c9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe9d2bc75aa162a0cab825a1f13cf35f20491a0317000d5a9775977fb7f6b556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf3820ae4ab59d420d2e7f001b2953f751c6dd75f90b5fcc76fdaee8c8df722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f14caad841305b82f04cf14411e2308edbcdd8ea2261ff8401edc95900855ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://818ca392b918c66a8ca013f3ec5b938595fc78b726af7ae31524c09dbe9302f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://818ca392b918c66a8ca013f3ec5b938595fc78b726af7ae31524c09dbe9302f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44506333e8bd3a20777cf0c382b7652d6d3fa1cdc13f056412cf4899e25115e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44506333e8bd3a20777cf0c382b7652d6d3fa1cdc13f056412cf4899e25115e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e870a1289513fe9e864779c13e63c0991fdaded247b35ac75b7810ab27683ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e870a1289513fe9e864779c13e63c0991fdaded247b35ac75b7810ab27683ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:49Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:15Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:15 crc kubenswrapper[4743]: I1011 00:52:15.906425 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a657286e-3a6d-4265-ae33-a7d2ce8b64ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1558bde7a3b8b01c9ce7f6907ecc43abc6e27587236be7b41487968effa33bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2616059cd6d57ff41cdaa2e1b35e0e8b474398c434022ba6b9e926e19e64c0f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7f0392332521983f1ea8939df48e05618035ca97895122067daef61ee1fdb7af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b297361a1a2f13aa53d7a5a58948dbe86e8e00f888e6a69cc884154828c29a41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://530c39a66f06ca33cba6a3ab75cb68148425f5a732b503d054b6c74a9d48fabf\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-11T00:52:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1011 00:51:59.674497 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1011 00:51:59.678201 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3177302691/tls.crt::/tmp/serving-cert-3177302691/tls.key\\\\\\\"\\\\nI1011 00:52:05.851548 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1011 00:52:05.855373 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1011 00:52:05.855414 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1011 00:52:05.855455 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1011 00:52:05.855472 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1011 00:52:05.866907 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1011 00:52:05.866935 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 00:52:05.866939 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 00:52:05.866944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1011 00:52:05.866947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1011 00:52:05.866951 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1011 00:52:05.866954 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1011 00:52:05.866970 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1011 00:52:05.870518 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1c0227825fa743bb310a1dfb08333cea5ee16a2a2f30b77c97e2f28a2319542\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d411bef56dd0ccdad159fc018c15ad4f0df6adee56150b1143e2dcf442a0fe18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d411bef56dd0ccdad159fc018c15ad4f0df6adee56150b1143e2dcf442a0fe18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:15Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:15 crc kubenswrapper[4743]: I1011 00:52:15.933847 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vlxgw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d677d122-c8be-4938-8d2c-bde4a088a63a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dbaee9cca1154a094e092eaaf1dea2aeb6a61fbf9323611275485f75ea911e9\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95k6c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vlxgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:15Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:15 crc kubenswrapper[4743]: I1011 00:52:15.939671 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:15 crc kubenswrapper[4743]: I1011 00:52:15.939722 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:15 crc kubenswrapper[4743]: I1011 00:52:15.939738 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:15 crc kubenswrapper[4743]: I1011 00:52:15.939763 4743 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeNotReady" Oct 11 00:52:15 crc kubenswrapper[4743]: I1011 00:52:15.939781 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:15Z","lastTransitionTime":"2025-10-11T00:52:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:15 crc kubenswrapper[4743]: I1011 00:52:15.950592 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-t9nsf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f07b1e57-3c09-4e75-866d-a4292db4e151\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc297cc894f36c7afa9ce15a8e9c71b2d7bf66042c55d0c7686f20b6a3ece7be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c97lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:14Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-t9nsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:15Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:15 crc kubenswrapper[4743]: I1011 00:52:15.983891 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cd23d7ad070c4d2f029567ac2186d797b082b8da8ff2c9811982a1c5c7b0ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:15Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.012294 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bca7230a67c579c0fc54921d292f52e18ce2c25f79cd4a351a9f6b576df3553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://ab9cd20bc5b28791e21984a0edf4fbf50acc05d6160780d0696698715242b5d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:16Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.042546 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.042605 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.042621 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.042642 4743 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.042656 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:16Z","lastTransitionTime":"2025-10-11T00:52:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.091259 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.091259 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 00:52:16 crc kubenswrapper[4743]: E1011 00:52:16.091398 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 11 00:52:16 crc kubenswrapper[4743]: E1011 00:52:16.091579 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.091267 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 00:52:16 crc kubenswrapper[4743]: E1011 00:52:16.091722 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.122995 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83857db2-dda0-4b62-9336-3393a2c23f3c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a46e8adc9aa9255991b607e98eaf6fc411aeb2a6b0edf45e3e9a7935a2eae5b\\\",\\\"image\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97b59a1efc62279df4a9766fae0e700a435812160cf3a40a915c967f7a99c9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe9d2bc75aa162a0cab825a1f13cf35f20491a0317000d5a9775977fb7f6b556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf3820ae4ab59d420d2e7f001b2953f751c6dd75f90b5fcc76fdaee8c8df722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f14caad841305b82f04cf14411e2308edbcdd8ea2261ff8401edc95900855ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resource
s\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://818ca392b918c66a8ca013f3ec5b938595fc78b726af7ae31524c09dbe9302f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://818ca392b918c66a8ca013f3ec5b938595fc78b726af7ae31524c09dbe9302f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44506333e8bd3a20777cf0c382b7652d6d3fa1cdc13f056412cf4899e25115e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44506333e8bd3a20777cf0c382b7652d6d3fa1cdc13f056412cf4899e25115e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z
\\\"}}},{\\\"containerID\\\":\\\"cri-o://e870a1289513fe9e864779c13e63c0991fdaded247b35ac75b7810ab27683ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e870a1289513fe9e864779c13e63c0991fdaded247b35ac75b7810ab27683ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:16Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.136545 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a657286e-3a6d-4265-ae33-a7d2ce8b64ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1558bde7a3b8b01c9ce7f6907ecc43abc6e27587236be7b41487968effa33bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2616059cd6d57ff41cdaa2e1b35e0e8b474398c434022ba6b9e926e19e64c0f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0392332521983f1ea8939df48e05618035ca97895122067daef61ee1fdb7af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b297361a1a2f13aa53d7a5a58948dbe86e8e00f888e6a69cc884154828c29a41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://530c39a66f06ca33cba6a3ab75cb68148425f5a732b503d054b6c74a9d48fabf\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-11T00:52:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1011 00:51:59.674497 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1011 00:51:59.678201 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3177302691/tls.crt::/tmp/serving-cert-3177302691/tls.key\\\\\\\"\\\\nI1011 00:52:05.851548 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1011 00:52:05.855373 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1011 00:52:05.855414 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1011 00:52:05.855455 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1011 00:52:05.855472 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1011 00:52:05.866907 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1011 00:52:05.866935 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 00:52:05.866939 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 00:52:05.866944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1011 00:52:05.866947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1011 00:52:05.866951 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1011 00:52:05.866954 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1011 00:52:05.866970 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1011 00:52:05.870518 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1c0227825fa743bb310a1dfb08333cea5ee16a2a2f30b77c97e2f28a2319542\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d411bef56dd0ccdad159fc018c15ad4f0df6adee56150b1143e2dcf442a0fe18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d411bef56dd0ccdad159fc018c15ad4f0df6adee56150b1143e2dcf442a0fe18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:16Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.145123 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.145275 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.145355 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.145419 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.145474 4743 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:16Z","lastTransitionTime":"2025-10-11T00:52:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.153312 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vlxgw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d677d122-c8be-4938-8d2c-bde4a088a63a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dbaee9cca1154a094e092eaaf1dea2aeb6a61fbf9323611275485f75ea911e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95k6c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vlxgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:16Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.163578 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-t9nsf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f07b1e57-3c09-4e75-866d-a4292db4e151\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc297cc894f36c7afa9ce15a8e9c71b2d7bf66042c55d0c7686f20b6a3ece7be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c97lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:14Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-t9nsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:16Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.176170 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cd23d7ad070c4d2f029567ac2186d797b082b8da8ff2c9811982a1c5c7b0ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:16Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.189053 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bca7230a67c579c0fc54921d292f52e18ce2c25f79cd4a351a9f6b576df3553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab9cd20bc5b28791e21984a0edf4fbf50acc05d6160780d0696698715242b5d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:16Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.199968 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9jfxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8c603f4-717c-4554-992a-8338b3bef24d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853ca9eee244aad8c7006298d6e541ea07022917bee14083b2ded233009257ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5b2pr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9jfxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:16Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.220297 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc432b8f-1fc8-4b7d-a202-bf1452921fc3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ed857549a3f84b624b8bf411ac4359bad74bbf8eea28bda8ebeda3b12dd34e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://748faceccab50f961795c16fb7c31665af75d570094d1dd2c765bb2215b65c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc8d989925344208a60f51034c7c486de0165d5c5a9a1f966df45ac5685c89f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34de48136b54818c60eb929d4b5374689378190620a6502448759f172856d6af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:16Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.233691 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:16Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.248507 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.248570 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.248590 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 
00:52:16.248619 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.248642 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:16Z","lastTransitionTime":"2025-10-11T00:52:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.252846 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:16Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.268671 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"add92263-e252-446b-95de-092585b4357f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be9668b466043d0e86521a2ea925416a4987cb051ca0ab8764fd0fd3095991f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d42sg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16b18bcf80747537e2a7a31adc90eec7fdba85d8
166f8cfc53709a1dba33de8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d42sg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cvm72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:16Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.286421 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6wcnk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06a7b971-8779-491c-8d3f-e7d5b4d60968\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8adfcc8627f1a6ce588d50eabcc56c6814c3cff75d50144def98466d2dabb69a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8adfcc8627f1a6ce588d50eabcc56c6814c3cff75d50144def98466d2dabb69a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://800cdd03b73991fe9c1cfbe9a392b5f08854ca2f5f9e5106022bb67307578ad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://800cdd03b73991fe9c1cfbe9a392b5f08854ca2f5f9e5106022bb67307578ad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49661bf59f8e1d64e5980f0ed00dcc50c4c2ab52cc2232ade890883f0d0158d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49661bf59f8e1d64e5980f0ed00dcc50c4c2ab52cc2232ade890883f0d0158d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27d3d6d9f65eb7b710148bc4e077c74a15207b637727163afe631acfe3ec90f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27d3d6d9f65eb7b710148bc4e077c74a15207b637727163afe631acfe3ec90f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00edafd52ba2a97f21af491b5233bc2ad945e6b77106eb94a947b777a0129cfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00edafd52ba2a97f21af491b5233bc2ad945e6b77106eb94a947b777a0129cfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://276d1e264f541d55720a57945800ec969959c0828071dcda4fb1c0f13b17118b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://276d1e264f541d55720a57945800ec969959c0828071dcda4fb1c0f13b17118b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6wcnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:16Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.311478 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ed16b35-862f-47f2-9e32-63c98f868fb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d54b05ec17db8c8c6c3ff8e4d703de8f1f1f0d7e9b5b0133afdf30e3c2b8acee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4127195cac8b12296c2aaefc500beb5f746bee518f3885c094e7bca83ade5e6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://487448bc7b1826df7dbc3aeac42ad10d6aed3390c27c95dfda1b81fe09e27f35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acfe8e798b39769fe2643c6f761b388dc220ccfc707f0023dd680196514b8885\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17db7cfee85ea1051eba348aa0ec7bc8e1354c49e3e4740745a51d3db0ffcf63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d385e354cb831573c89886ade9127287f0974b7a7c244ead9e5547e0211fd9fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e5db0acfb2627ea36e6ba19e1dc664129ca14d1320f74a4d5b4fc28cc6967cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecec5075382354c47a6ec8f53263fe6c244c7aadfeab1dda60dd2e88cc8724b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-48ljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:16Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.331968 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:16Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.352030 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd049f7c9c78e28d7cfdeb7087cf419e5db915fecd46a83ae0bdc8317fff9728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-11T00:52:16Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.352524 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.352577 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.352630 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.352657 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.352675 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:16Z","lastTransitionTime":"2025-10-11T00:52:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.355953 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.356013 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.356027 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.356049 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.356066 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:16Z","lastTransitionTime":"2025-10-11T00:52:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.364061 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6wcnk" event={"ID":"06a7b971-8779-491c-8d3f-e7d5b4d60968","Type":"ContainerStarted","Data":"38b5a40ddd5390cc392fcf2240304a72163dd797511640ada5ad39a59c0f4283"} Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.364172 4743 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 11 00:52:16 crc kubenswrapper[4743]: E1011 00:52:16.373173 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4a117022-09d2-46e0-826b-22308ec25890\\\",\\\"systemUUID\\\":\\\"407eb137-47d1-41e8-9c72-65f09e76d21a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:16Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.378038 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.378080 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.378089 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.378107 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.378122 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:16Z","lastTransitionTime":"2025-10-11T00:52:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.383284 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9jfxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8c603f4-717c-4554-992a-8338b3bef24d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853ca9eee244aad8c7006298d6e541ea07022917bee14083b2ded233009257ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5b2pr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9jfxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:16Z 
is after 2025-08-24T17:21:41Z" Oct 11 00:52:16 crc kubenswrapper[4743]: E1011 00:52:16.395038 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4a117022-09d2-46e0-826b-22308ec25890\\\",\\\"systemUUID\\\":\\\"407eb137-47d1-41e8-9c72-65f09e76d21a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:16Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.403208 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc432b8f-1fc8-4b7d-a202-bf1452921fc3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ed857549a3f84b624b8bf411ac4359bad74bbf8eea28bda8ebeda3b12dd34e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://748faceccab50f961795c16fb7c31665af75d570094d1dd2c765bb2215b65c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc8d989925344208a60f51034c7c486de0165d5c5a9a1f966df45ac5685c89f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34de48136b54818c60eb929d4b5374689378190620a
6502448759f172856d6af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:16Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.403762 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.403822 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.403835 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:16 crc 
kubenswrapper[4743]: I1011 00:52:16.403877 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.403892 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:16Z","lastTransitionTime":"2025-10-11T00:52:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.419078 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:16Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:16 crc kubenswrapper[4743]: E1011 00:52:16.424225 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4a117022-09d2-46e0-826b-22308ec25890\\\",\\\"systemUUID\\\":\\\"407eb137-47d1-41e8-9c72-65f09e76d21a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:16Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.427891 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.428048 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.428140 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.428228 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.428308 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:16Z","lastTransitionTime":"2025-10-11T00:52:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.437326 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:16Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:16 crc kubenswrapper[4743]: E1011 00:52:16.441300 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4a117022-09d2-46e0-826b-22308ec25890\\\",\\\"systemUUID\\\":\\\"407eb137-47d1-41e8-9c72-65f09e76d21a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:16Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.445544 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.445585 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.445595 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.445614 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.445626 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:16Z","lastTransitionTime":"2025-10-11T00:52:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.451163 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"add92263-e252-446b-95de-092585b4357f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be9668b466043d0e86521a2ea925416a4987cb051ca0ab8764fd0fd3095991f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d42sg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16b18bcf80747537e2a7a31adc90eec7fdba85d8166f8cfc53709a1dba33de8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d42sg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cvm72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:16Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:16 crc kubenswrapper[4743]: E1011 00:52:16.457531 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4a117022-09d2-46e0-826b-22308ec25890\\\",\\\"systemUUID\\\":\\\"407eb137-47d1-41e8-9c72-65f09e76d21a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:16Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:16 crc kubenswrapper[4743]: E1011 00:52:16.457651 4743 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.459550 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.459578 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.459588 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.459608 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.459618 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:16Z","lastTransitionTime":"2025-10-11T00:52:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.470469 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6wcnk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06a7b971-8779-491c-8d3f-e7d5b4d60968\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38b5a40ddd5390cc392fcf2240304a72163dd797511640ada5ad39a59c0f4283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8adfcc8627f1a6ce588d50eabcc56c6814c3cff75d50144def98466d2dabb69a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8adfcc8627f1a6ce588d50eabcc56c6814c3cff75d50144def98466d2dabb69a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://800cdd03b73991fe9c1cfbe9a392b5f08854ca2f5f9e5106022bb67307578ad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://800cdd03b73991fe9c1cfbe9a392b5f08854ca2f5f9e5106022bb67307578ad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49661bf59f8e1d64e5980f0ed00dcc50c4c2ab52cc2232ade890883f0d0158d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49661bf59f8e1d64e5980f0ed00dcc50c4c2ab52cc2232ade890883f0d0158d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27d3d6d9f65eb7b710148bc4e077c74a15207b637727163afe631acfe3ec90f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27d3d6d9f65eb7b710148bc4e077c74a15207b637727163afe631acfe3ec90f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00edafd52ba2a97f21af491b5233bc2ad945e6b77106eb94a947b777a0129cfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00edafd52ba2a97f21af491b5233bc2ad945e6b77106eb94a947b777a0129cfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://276d1e264f541d55720a57945800ec969959c0828071dcda4fb1c0f13b17118b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://276d1e264f541d55720a57945800ec969959c0828071dcda4fb1c0f13b17118b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6wcnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:16Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.495379 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ed16b35-862f-47f2-9e32-63c98f868fb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d54b05ec17db8c8c6c3ff8e4d703de8f1f1f0d7e9b5b0133afdf30e3c2b8acee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4127195cac8b12296c2aaefc500beb5f746bee518f3885c094e7bca83ade5e6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://487448bc7b1826df7dbc3aeac42ad10d6aed3390c27c95dfda1b81fe09e27f35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acfe8e798b39769fe2643c6f761b388dc220ccfc707f0023dd680196514b8885\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17db7cfee85ea1051eba348aa0ec7bc8e1354c49e3e4740745a51d3db0ffcf63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d385e354cb831573c89886ade9127287f0974b7a7c244ead9e5547e0211fd9fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e5db0acfb2627ea36e6ba19e1dc664129ca14d1320f74a4d5b4fc28cc6967cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecec5075382354c47a6ec8f53263fe6c244c7aadfeab1dda60dd2e88cc8724b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-48ljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:16Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.509570 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:16Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.525792 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd049f7c9c78e28d7cfdeb7087cf419e5db915fecd46a83ae0bdc8317fff9728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-11T00:52:16Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.546701 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83857db2-dda0-4b62-9336-3393a2c23f3c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a46e8adc9aa9255991b607e98eaf6fc411aeb2a6b0edf45e3e9a7935a2eae5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97b59a1efc62279df4a9766fae0e700a435812160cf3a40a915c967f7a99c9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe9d2bc75aa162a0cab825a1f13cf35f20491a0317000d5a9775977fb7f6b556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf3820ae4ab59d420d2e7f001b2953f751c6dd75f90b5fcc76fdaee8c8df722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f14caad841305b82f04cf14411e2308edbcdd8ea2261ff8401edc95900855ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://818ca392b918c66a8ca013f3ec5b938595fc78b726af7ae31524c09dbe9302f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://818ca392b918c66a8ca013f3ec5b938595fc78b726af7ae31524c09dbe9302f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44506333e8bd3a20777cf0c382b7652d6d3fa1cdc13f056412cf4899e25115e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44506333e8bd3a20777cf0c382b7652d6d3fa1cdc13f056412cf4899e25115e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e870a1289513fe9e864779c13e63c0991fdaded247b35ac75b7810ab27683ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e870a1289513fe9e864779c13e63c0991fdaded247b35ac75b7810ab27683ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:49Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:16Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.562080 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.562211 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.562237 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.562269 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.562293 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:16Z","lastTransitionTime":"2025-10-11T00:52:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no 
CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.566662 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a657286e-3a6d-4265-ae33-a7d2ce8b64ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1558bde7a3b8b01c9ce7f6907ecc43abc6e27587236be7b41487968effa33bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2616059cd6d57ff41cdaa2e1b35e0e8b474398c434022ba6b9e926e19e64c0f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7f0392332521983f1ea8939df48e05618035ca97895122067daef61ee1fdb7af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b297361a1a2f13aa53d7a5a58948dbe86e8e00f888e6a69cc884154828c29a41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://530c39a66f06ca33cba6a3ab75cb68148425f5a732b503d054b6c74a9d48fabf\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-11T00:52:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1011 00:51:59.674497 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1011 00:51:59.678201 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3177302691/tls.crt::/tmp/serving-cert-3177302691/tls.key\\\\\\\"\\\\nI1011 00:52:05.851548 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1011 00:52:05.855373 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1011 00:52:05.855414 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1011 00:52:05.855455 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1011 00:52:05.855472 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1011 00:52:05.866907 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1011 00:52:05.866935 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 00:52:05.866939 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 00:52:05.866944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1011 00:52:05.866947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1011 00:52:05.866951 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1011 00:52:05.866954 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1011 00:52:05.866970 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1011 00:52:05.870518 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1c0227825fa743bb310a1dfb08333cea5ee16a2a2f30b77c97e2f28a2319542\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d411bef56dd0ccdad159fc018c15ad4f0df6adee56150b1143e2dcf442a0fe18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d411bef56dd0ccdad159fc018c15ad4f0df6adee56150b1143e2dcf442a0fe18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:16Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.578837 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vlxgw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d677d122-c8be-4938-8d2c-bde4a088a63a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dbaee9cca1154a094e092eaaf1dea2aeb6a61fbf9323611275485f75ea911e9\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95k6c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vlxgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:16Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.592697 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-t9nsf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f07b1e57-3c09-4e75-866d-a4292db4e151\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc297cc894f36c7afa9ce15a8e9c71b2d7bf66042c55d0c7686f20b6a3ece7be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c97lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:14Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-t9nsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:16Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.614380 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cd23d7ad070c4d2f029567ac2186d797b082b8da8ff2c9811982a1c5c7b0ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:16Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.633367 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bca7230a67c579c0fc54921d292f52e18ce2c25f79cd4a351a9f6b576df3553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab9cd20bc5b28791e21984a0edf4fbf50acc05d6160780d0696698715242b5d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:16Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.666456 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.666517 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.666538 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.666570 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.666592 4743 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:16Z","lastTransitionTime":"2025-10-11T00:52:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.770371 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.770425 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.770436 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.770455 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.770466 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:16Z","lastTransitionTime":"2025-10-11T00:52:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.874428 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.874504 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.874528 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.874567 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.874595 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:16Z","lastTransitionTime":"2025-10-11T00:52:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.978033 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.978094 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.978112 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.978136 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:16 crc kubenswrapper[4743]: I1011 00:52:16.978148 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:16Z","lastTransitionTime":"2025-10-11T00:52:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:17 crc kubenswrapper[4743]: I1011 00:52:17.081903 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:17 crc kubenswrapper[4743]: I1011 00:52:17.082003 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:17 crc kubenswrapper[4743]: I1011 00:52:17.082025 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:17 crc kubenswrapper[4743]: I1011 00:52:17.082102 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:17 crc kubenswrapper[4743]: I1011 00:52:17.082128 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:17Z","lastTransitionTime":"2025-10-11T00:52:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:17 crc kubenswrapper[4743]: I1011 00:52:17.186381 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:17 crc kubenswrapper[4743]: I1011 00:52:17.186448 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:17 crc kubenswrapper[4743]: I1011 00:52:17.186465 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:17 crc kubenswrapper[4743]: I1011 00:52:17.186493 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:17 crc kubenswrapper[4743]: I1011 00:52:17.186513 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:17Z","lastTransitionTime":"2025-10-11T00:52:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:17 crc kubenswrapper[4743]: I1011 00:52:17.289739 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:17 crc kubenswrapper[4743]: I1011 00:52:17.289807 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:17 crc kubenswrapper[4743]: I1011 00:52:17.289830 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:17 crc kubenswrapper[4743]: I1011 00:52:17.289939 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:17 crc kubenswrapper[4743]: I1011 00:52:17.289979 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:17Z","lastTransitionTime":"2025-10-11T00:52:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:17 crc kubenswrapper[4743]: I1011 00:52:17.371904 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-48ljj_9ed16b35-862f-47f2-9e32-63c98f868fb8/ovnkube-controller/0.log" Oct 11 00:52:17 crc kubenswrapper[4743]: I1011 00:52:17.375183 4743 generic.go:334] "Generic (PLEG): container finished" podID="9ed16b35-862f-47f2-9e32-63c98f868fb8" containerID="9e5db0acfb2627ea36e6ba19e1dc664129ca14d1320f74a4d5b4fc28cc6967cb" exitCode=1 Oct 11 00:52:17 crc kubenswrapper[4743]: I1011 00:52:17.375307 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" event={"ID":"9ed16b35-862f-47f2-9e32-63c98f868fb8","Type":"ContainerDied","Data":"9e5db0acfb2627ea36e6ba19e1dc664129ca14d1320f74a4d5b4fc28cc6967cb"} Oct 11 00:52:17 crc kubenswrapper[4743]: I1011 00:52:17.376742 4743 scope.go:117] "RemoveContainer" containerID="9e5db0acfb2627ea36e6ba19e1dc664129ca14d1320f74a4d5b4fc28cc6967cb" Oct 11 00:52:17 crc kubenswrapper[4743]: I1011 00:52:17.393344 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:17 crc kubenswrapper[4743]: I1011 00:52:17.393449 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:17 crc kubenswrapper[4743]: I1011 00:52:17.393471 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:17 crc kubenswrapper[4743]: I1011 00:52:17.393535 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:17 crc kubenswrapper[4743]: I1011 00:52:17.393555 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:17Z","lastTransitionTime":"2025-10-11T00:52:17Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:17 crc kubenswrapper[4743]: I1011 00:52:17.409304 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cd23d7ad070c4d2f029567ac2186d797b082b8da8ff2c9811982a1c5c7b0ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kuber
netes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:17Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:17 crc kubenswrapper[4743]: I1011 00:52:17.433758 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bca7230a67c579c0fc54921d292f52e18ce2c25f79cd4a351a9f6b576df3553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10
-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab9cd20bc5b28791e21984a0edf4fbf50acc05d6160780d0696698715242b5d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:17Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:17 crc kubenswrapper[4743]: I1011 00:52:17.458163 4743 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc432b8f-1fc8-4b7d-a202-bf1452921fc3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ed857549a3f84b624b8bf411ac4359bad74bbf8eea28bda8ebeda3b12dd34e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://748faceccab50f961795c16fb7c31665af75d570094d1dd2c765bb2215b65c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c
4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc8d989925344208a60f51034c7c486de0165d5c5a9a1f966df45ac5685c89f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34de48136b54818c60eb929d4b5374689378190620a6502448759f172856d6af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:17Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:17 crc kubenswrapper[4743]: I1011 00:52:17.482657 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:17Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:17 crc kubenswrapper[4743]: I1011 00:52:17.498255 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:17 crc kubenswrapper[4743]: I1011 00:52:17.498322 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:17 crc kubenswrapper[4743]: I1011 00:52:17.498347 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:17 crc kubenswrapper[4743]: I1011 
00:52:17.498386 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:17 crc kubenswrapper[4743]: I1011 00:52:17.498414 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:17Z","lastTransitionTime":"2025-10-11T00:52:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:17 crc kubenswrapper[4743]: I1011 00:52:17.510714 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9jfxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8c603f4-717c-4554-992a-8338b3bef24d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853ca9eee244aad8c7006298d6e541ea07022917bee14083b2ded233009257ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afb
a93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5b2pr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\
":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9jfxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:17Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:17 crc kubenswrapper[4743]: I1011 00:52:17.532953 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:17Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:17 crc kubenswrapper[4743]: I1011 00:52:17.557049 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd049f7c9c78e28d7cfdeb7087cf419e5db915fecd46a83ae0bdc8317fff9728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-11T00:52:17Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:17 crc kubenswrapper[4743]: I1011 00:52:17.583323 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:17Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:17 crc kubenswrapper[4743]: I1011 00:52:17.603927 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:17 crc kubenswrapper[4743]: I1011 00:52:17.603973 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:17 crc kubenswrapper[4743]: I1011 00:52:17.603991 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:17 crc kubenswrapper[4743]: I1011 00:52:17.604018 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:17 crc kubenswrapper[4743]: I1011 00:52:17.604037 4743 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:17Z","lastTransitionTime":"2025-10-11T00:52:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:17 crc kubenswrapper[4743]: I1011 00:52:17.604473 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"add92263-e252-446b-95de-092585b4357f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be9668b466043d0e86521a2ea925416a4987cb051ca0ab8764fd0fd3095991f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d42sg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16b18bcf80747537e2a7a31adc90eec7fdba85d8166f8cfc53709a1dba33de8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d42sg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cvm72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-11T00:52:17Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:17 crc kubenswrapper[4743]: I1011 00:52:17.635366 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6wcnk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06a7b971-8779-491c-8d3f-e7d5b4d60968\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38b5a40ddd5390cc392fcf2240304a72163dd797511640ada5ad39a59c0f4283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8adfcc8627f1a6ce588d50eabcc56c6814c3cff75d50144def98466d2dabb69a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8adfcc8627f1a6ce588d50eabcc56c6814c3cff75d50144def98466d2dabb69a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://800cdd03b73991fe9c1cfbe9a392b5f08854ca2f5f9e5106022bb67307578ad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"
terminated\\\":{\\\"containerID\\\":\\\"cri-o://800cdd03b73991fe9c1cfbe9a392b5f08854ca2f5f9e5106022bb67307578ad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49661bf59f8e1d64e5980f0ed00dcc50c4c2ab52cc2232ade890883f0d0158d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49661bf59f8e1d64e5980f0ed00dcc50c4c2ab52cc2232ade890883f0d0158d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27d3d6d9f65eb7b710148bc4e077c74a15207b637727163afe631acfe3ec90f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27d3d6d9f65eb7b710148bc4e077c74a15207b637727163afe631acfe3ec90f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00edafd52ba2a97f21af491b5233bc2ad945e6b77106eb94a947b777a0129cfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,
\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00edafd52ba2a97f21af491b5233bc2ad945e6b77106eb94a947b777a0129cfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://276d1e264f541d55720a57945800ec969959c0828071dcda4fb1c0f13b17118b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://276d1e264f541d55720a57945800ec969959c0828071dcda4fb1c0f13b17118b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6wcnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:17Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:17 crc kubenswrapper[4743]: I1011 00:52:17.664604 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ed16b35-862f-47f2-9e32-63c98f868fb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d54b05ec17db8c8c6c3ff8e4d703de8f1f1f0d7e9b5b0133afdf30e3c2b8acee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4127195cac8b12296c2aaefc500beb5f746bee518f3885c094e7bca83ade5e6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://487448bc7b1826df7dbc3aeac42ad10d6aed3390c27c95dfda1b81fe09e27f35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acfe8e798b39769fe2643c6f761b388dc220ccfc707f0023dd680196514b8885\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17db7cfee85ea1051eba348aa0ec7bc8e1354c49e3e4740745a51d3db0ffcf63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d385e354cb831573c89886ade9127287f0974b7a7c244ead9e5547e0211fd9fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e5db0acfb2627ea36e6ba19e1dc664129ca14d1320f74a4d5b4fc28cc6967cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e5db0acfb2627ea36e6ba19e1dc664129ca14d1320f74a4d5b4fc28cc6967cb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"message\\\":\\\"00:52:16.427132 5939 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1011 00:52:16.434070 5939 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1011 00:52:16.434133 5939 handler.go:190] Sending 
*v1.Node event handler 2 for removal\\\\nI1011 00:52:16.434198 5939 handler.go:208] Removed *v1.Node event handler 7\\\\nI1011 00:52:16.434291 5939 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1011 00:52:16.434302 5939 handler.go:208] Removed *v1.Node event handler 2\\\\nI1011 00:52:16.434385 5939 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1011 00:52:16.435058 5939 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1011 00:52:16.435094 5939 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1011 00:52:16.435134 5939 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1011 00:52:16.435168 5939 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1011 00:52:16.435182 5939 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1011 00:52:16.435199 5939 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1011 00:52:16.435207 5939 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1011 00:52:16.435207 5939 factory.go:656] Stopping watch factory\\\\nI1011 00:52:16.435219 5939 handler.go:208] Removed *v1.EgressIP 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecec5075382354c47a6ec8f53263fe6c244c7aadfeab1dda60dd2e88cc8724b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-48ljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:17Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:17 crc kubenswrapper[4743]: I1011 00:52:17.697311 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"83857db2-dda0-4b62-9336-3393a2c23f3c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a46e8adc9aa9255991b607e98eaf6fc411aeb2a6b0edf45e3e9a7935a2eae5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97b59a1efc62279df4a9766fae0e700a435812160cf3a40a915c967f7a99c9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe9d2bc75aa162a0cab825a1f13cf35f20491a0317000d5a9775977fb7f6b556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf3820ae4ab59d420d2e7f001b2953f751c6dd75f90b5fcc76fdaee8c8df722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f14caad841305b82f04cf14411e2308edbcdd8ea2261ff8401edc95900855ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://818ca392b918c66a8ca013f3ec5b938595fc78b726af7ae31524c09dbe9302f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://818ca392b918c66a8ca013f3ec5b938595fc78b726af7ae31524c09dbe9302f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44506333e8bd3a20777cf0c382b7652d6d3fa1cdc13f056412cf4899e25115e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44506333e8bd3a20777cf0c382b7652d6d3fa1cdc13f056412cf4899e25115e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e870a1289513fe9e864779c13e63c0991fdaded247b35ac75b7810ab27683ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e870a1289513fe9e864779c13e63c0991fdaded247b35ac75b7810ab27683ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:17Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:17 crc kubenswrapper[4743]: I1011 00:52:17.706992 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:17 crc kubenswrapper[4743]: I1011 00:52:17.707020 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:17 crc kubenswrapper[4743]: I1011 00:52:17.707034 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:17 crc kubenswrapper[4743]: I1011 00:52:17.707055 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:17 crc kubenswrapper[4743]: I1011 00:52:17.707068 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:17Z","lastTransitionTime":"2025-10-11T00:52:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:17 crc kubenswrapper[4743]: I1011 00:52:17.720616 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a657286e-3a6d-4265-ae33-a7d2ce8b64ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1558bde7a3b8b01c9ce7f6907ecc43abc6e27587236be7b41487968effa33bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2616059cd6d57ff41cdaa2e1b35e0e8b474398c434022ba6b9e926e19e64c0f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7f0392332521983f1ea8939df48e05618035ca97895122067daef61ee1fdb7af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b297361a1a2f13aa53d7a5a58948dbe86e8e00f888e6a69cc884154828c29a41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://530c39a66f06ca33cba6a3ab75cb68148425f5a732b503d054b6c74a9d48fabf\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-11T00:52:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1011 00:51:59.674497 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1011 00:51:59.678201 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3177302691/tls.crt::/tmp/serving-cert-3177302691/tls.key\\\\\\\"\\\\nI1011 00:52:05.851548 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1011 00:52:05.855373 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1011 00:52:05.855414 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1011 00:52:05.855455 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1011 00:52:05.855472 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1011 00:52:05.866907 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1011 00:52:05.866935 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 00:52:05.866939 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 00:52:05.866944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1011 00:52:05.866947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1011 00:52:05.866951 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1011 00:52:05.866954 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1011 00:52:05.866970 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1011 00:52:05.870518 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1c0227825fa743bb310a1dfb08333cea5ee16a2a2f30b77c97e2f28a2319542\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d411bef56dd0ccdad159fc018c15ad4f0df6adee56150b1143e2dcf442a0fe18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d411bef56dd0ccdad159fc018c15ad4f0df6adee56150b1143e2dcf442a0fe18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:17Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:17 crc kubenswrapper[4743]: I1011 00:52:17.735238 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vlxgw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d677d122-c8be-4938-8d2c-bde4a088a63a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dbaee9cca1154a094e092eaaf1dea2aeb6a61fbf9323611275485f75ea911e9\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95k6c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vlxgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:17Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:17 crc kubenswrapper[4743]: I1011 00:52:17.749046 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-t9nsf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f07b1e57-3c09-4e75-866d-a4292db4e151\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc297cc894f36c7afa9ce15a8e9c71b2d7bf66042c55d0c7686f20b6a3ece7be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c97lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:14Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-t9nsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:17Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:17 crc kubenswrapper[4743]: I1011 00:52:17.810295 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:17 crc kubenswrapper[4743]: I1011 00:52:17.810331 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:17 crc kubenswrapper[4743]: I1011 00:52:17.810344 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:17 crc kubenswrapper[4743]: I1011 00:52:17.810362 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:17 crc kubenswrapper[4743]: I1011 00:52:17.810375 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:17Z","lastTransitionTime":"2025-10-11T00:52:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:17 crc kubenswrapper[4743]: I1011 00:52:17.913226 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:17 crc kubenswrapper[4743]: I1011 00:52:17.913289 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:17 crc kubenswrapper[4743]: I1011 00:52:17.913311 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:17 crc kubenswrapper[4743]: I1011 00:52:17.913341 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:17 crc kubenswrapper[4743]: I1011 00:52:17.913363 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:17Z","lastTransitionTime":"2025-10-11T00:52:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:18 crc kubenswrapper[4743]: I1011 00:52:18.015400 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:18 crc kubenswrapper[4743]: I1011 00:52:18.015434 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:18 crc kubenswrapper[4743]: I1011 00:52:18.015442 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:18 crc kubenswrapper[4743]: I1011 00:52:18.015455 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:18 crc kubenswrapper[4743]: I1011 00:52:18.015464 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:18Z","lastTransitionTime":"2025-10-11T00:52:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:18 crc kubenswrapper[4743]: I1011 00:52:18.091453 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 00:52:18 crc kubenswrapper[4743]: I1011 00:52:18.091602 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 00:52:18 crc kubenswrapper[4743]: I1011 00:52:18.091848 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 00:52:18 crc kubenswrapper[4743]: E1011 00:52:18.091888 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 11 00:52:18 crc kubenswrapper[4743]: E1011 00:52:18.092023 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 11 00:52:18 crc kubenswrapper[4743]: E1011 00:52:18.092089 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 11 00:52:18 crc kubenswrapper[4743]: I1011 00:52:18.117370 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:18 crc kubenswrapper[4743]: I1011 00:52:18.117398 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:18 crc kubenswrapper[4743]: I1011 00:52:18.117407 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:18 crc kubenswrapper[4743]: I1011 00:52:18.117427 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:18 crc kubenswrapper[4743]: I1011 00:52:18.117437 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:18Z","lastTransitionTime":"2025-10-11T00:52:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:18 crc kubenswrapper[4743]: I1011 00:52:18.220605 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:18 crc kubenswrapper[4743]: I1011 00:52:18.220661 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:18 crc kubenswrapper[4743]: I1011 00:52:18.220673 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:18 crc kubenswrapper[4743]: I1011 00:52:18.220693 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:18 crc kubenswrapper[4743]: I1011 00:52:18.220705 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:18Z","lastTransitionTime":"2025-10-11T00:52:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:18 crc kubenswrapper[4743]: I1011 00:52:18.323747 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:18 crc kubenswrapper[4743]: I1011 00:52:18.323801 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:18 crc kubenswrapper[4743]: I1011 00:52:18.323813 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:18 crc kubenswrapper[4743]: I1011 00:52:18.323836 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:18 crc kubenswrapper[4743]: I1011 00:52:18.323850 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:18Z","lastTransitionTime":"2025-10-11T00:52:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:18 crc kubenswrapper[4743]: I1011 00:52:18.382762 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-48ljj_9ed16b35-862f-47f2-9e32-63c98f868fb8/ovnkube-controller/0.log" Oct 11 00:52:18 crc kubenswrapper[4743]: I1011 00:52:18.387180 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" event={"ID":"9ed16b35-862f-47f2-9e32-63c98f868fb8","Type":"ContainerStarted","Data":"d87f5b08faef77769b5965131a6bbc8c2d86c5b51c4561962b902a3056dc9a84"} Oct 11 00:52:18 crc kubenswrapper[4743]: I1011 00:52:18.387424 4743 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 11 00:52:18 crc kubenswrapper[4743]: I1011 00:52:18.410999 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:18Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:18 crc kubenswrapper[4743]: I1011 00:52:18.427696 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:18 crc kubenswrapper[4743]: I1011 00:52:18.427735 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:18 crc kubenswrapper[4743]: I1011 00:52:18.427747 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:18 crc kubenswrapper[4743]: I1011 
00:52:18.427764 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:18 crc kubenswrapper[4743]: I1011 00:52:18.427777 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:18Z","lastTransitionTime":"2025-10-11T00:52:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:18 crc kubenswrapper[4743]: I1011 00:52:18.434791 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9jfxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8c603f4-717c-4554-992a-8338b3bef24d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853ca9eee244aad8c7006298d6e541ea07022917bee14083b2ded233009257ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afb
a93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5b2pr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\
":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9jfxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:18Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:18 crc kubenswrapper[4743]: I1011 00:52:18.453783 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc432b8f-1fc8-4b7d-a202-bf1452921fc3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ed857549a3f84b624b8bf411ac4359bad74bbf8eea28bda8ebeda3b12dd34e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift
-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://748faceccab50f961795c16fb7c31665af75d570094d1dd2c765bb2215b65c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc8d989925344208a60f51034c7c486de0165d5c5a9a1f966df45ac5685c89f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"
volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34de48136b54818c60eb929d4b5374689378190620a6502448759f172856d6af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:18Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:18 crc kubenswrapper[4743]: I1011 00:52:18.472442 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd049f7c9c78e28d7cfdeb7087cf419e5db915fecd46a83ae0bdc8317fff9728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-11T00:52:18Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:18 crc kubenswrapper[4743]: I1011 00:52:18.493965 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:18Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:18 crc kubenswrapper[4743]: I1011 00:52:18.511218 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"add92263-e252-446b-95de-092585b4357f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be9668b466043d0e86521a2ea925416a4987cb051ca0ab8764fd0fd3095991f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d42sg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16b18bcf80747537e2a7a31adc90eec7fdba85d8
166f8cfc53709a1dba33de8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d42sg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cvm72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:18Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:18 crc kubenswrapper[4743]: I1011 00:52:18.532460 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:18 crc kubenswrapper[4743]: I1011 00:52:18.532499 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:18 crc kubenswrapper[4743]: I1011 00:52:18.532509 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:18 crc 
kubenswrapper[4743]: I1011 00:52:18.532552 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:18 crc kubenswrapper[4743]: I1011 00:52:18.532566 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:18Z","lastTransitionTime":"2025-10-11T00:52:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:18 crc kubenswrapper[4743]: I1011 00:52:18.533054 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6wcnk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06a7b971-8779-491c-8d3f-e7d5b4d60968\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38b5a40ddd5390cc392fcf2240304a72163dd797511640ada5ad39a59c0f4283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
eaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8adfcc8627f1a6ce588d50eabcc56c6814c3cff75d50144def98466d2dabb69a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8adfcc8627f1a6ce588d50eabcc56c6814c3cff75d50144def98466d2dabb69a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://800cdd03b73991fe9c1cfbe9a392b5f08854ca2f5f9e5106022bb67307578ad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://800cdd03b73991fe9c1cfbe9a392b5f08854ca2f5f9e5106022bb67307578ad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49661bf59f8e1d64e5980f0ed00dcc50c4c2ab52cc2232ade890883f0d0158d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49661bf59f8e1d64e5980f0ed00dcc50c4c2ab52cc2232ade890883f0d0158d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27d3d6d9f65eb7b710148bc4e077c74a15207b637727163afe631acfe3ec90f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27d3d6d9f65eb7b710148bc4e077c74a15207b637727163afe631acfe3ec90f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00edafd52ba2a97f21af491b5233bc2ad945e6b77106eb94a947b777a0129cfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00edafd52ba2a97f21af491b5233bc2ad945e6b77106eb94a947b777a0129cfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://276d1e264f541d55720a57945800ec969959c0828071dcda4fb1c0f13b17118b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27
6d1e264f541d55720a57945800ec969959c0828071dcda4fb1c0f13b17118b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6wcnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:18Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:18 crc kubenswrapper[4743]: I1011 00:52:18.571006 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ed16b35-862f-47f2-9e32-63c98f868fb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d54b05ec17db8c8c6c3ff8e4d703de8f1f1f0d7e9b5b0133afdf30e3c2b8acee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4127195cac8b12296c2aaefc500beb5f746bee518f3885c094e7bca83ade5e6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://487448bc7b1826df7dbc3aeac42ad10d6aed3390c27c95dfda1b81fe09e27f35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acfe8e798b39769fe2643c6f761b388dc220ccfc707f0023dd680196514b8885\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17db7cfee85ea1051eba348aa0ec7bc8e1354c49e3e4740745a51d3db0ffcf63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d385e354cb831573c89886ade9127287f0974b7a7c244ead9e5547e0211fd9fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d87f5b08faef77769b5965131a6bbc8c2d86c5b51c4561962b902a3056dc9a84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e5db0acfb2627ea36e6ba19e1dc664129ca14d1320f74a4d5b4fc28cc6967cb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"message\\\":\\\"00:52:16.427132 5939 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1011 00:52:16.434070 5939 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1011 00:52:16.434133 5939 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1011 00:52:16.434198 5939 handler.go:208] Removed *v1.Node event handler 7\\\\nI1011 
00:52:16.434291 5939 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1011 00:52:16.434302 5939 handler.go:208] Removed *v1.Node event handler 2\\\\nI1011 00:52:16.434385 5939 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1011 00:52:16.435058 5939 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1011 00:52:16.435094 5939 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1011 00:52:16.435134 5939 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1011 00:52:16.435168 5939 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1011 00:52:16.435182 5939 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1011 00:52:16.435199 5939 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1011 00:52:16.435207 5939 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1011 00:52:16.435207 5939 factory.go:656] Stopping watch factory\\\\nI1011 00:52:16.435219 5939 handler.go:208] Removed *v1.EgressIP 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecec5075382354c47a6ec8f53263fe6c244c7aadfeab1dda60dd2e88cc8724b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-48ljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:18Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:18 crc kubenswrapper[4743]: I1011 00:52:18.592527 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:18Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:18 crc kubenswrapper[4743]: I1011 00:52:18.626679 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"83857db2-dda0-4b62-9336-3393a2c23f3c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a46e8adc9aa9255991b607e98eaf6fc411aeb2a6b0edf45e3e9a7935a2eae5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97b59a1efc62279df4a9766fae0e700a435812160cf3a40a915c967f7a99c9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe9d2bc75aa162a0cab825a1f13cf35f20491a0317000d5a9775977fb7f6b556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf3820ae4ab59d420d2e7f001b2953f751c6dd75f90b5fcc76fdaee8c8df722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f14caad841305b82f04cf14411e2308edbcdd8ea2261ff8401edc95900855ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://818ca392b918c66a8ca013f3ec5b938595fc78b726af7ae31524c09dbe9302f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://818ca392b918c66a8ca013f3ec5b938595fc78b726af7ae31524c09dbe9302f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44506333e8bd3a20777cf0c382b7652d6d3fa1cdc13f056412cf4899e25115e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44506333e8bd3a20777cf0c382b7652d6d3fa1cdc13f056412cf4899e25115e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e870a1289513fe9e864779c13e63c0991fdaded247b35ac75b7810ab27683ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e870a1289513fe9e864779c13e63c0991fdaded247b35ac75b7810ab27683ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:18Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:18 crc kubenswrapper[4743]: I1011 00:52:18.636043 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:18 crc kubenswrapper[4743]: I1011 00:52:18.636114 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:18 crc kubenswrapper[4743]: I1011 00:52:18.636134 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:18 crc kubenswrapper[4743]: I1011 00:52:18.636163 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:18 crc kubenswrapper[4743]: I1011 00:52:18.636186 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:18Z","lastTransitionTime":"2025-10-11T00:52:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:18 crc kubenswrapper[4743]: I1011 00:52:18.654404 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a657286e-3a6d-4265-ae33-a7d2ce8b64ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1558bde7a3b8b01c9ce7f6907ecc43abc6e27587236be7b41487968effa33bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2616059cd6d57ff41cdaa2e1b35e0e8b474398c434022ba6b9e926e19e64c0f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7f0392332521983f1ea8939df48e05618035ca97895122067daef61ee1fdb7af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b297361a1a2f13aa53d7a5a58948dbe86e8e00f888e6a69cc884154828c29a41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://530c39a66f06ca33cba6a3ab75cb68148425f5a732b503d054b6c74a9d48fabf\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-11T00:52:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1011 00:51:59.674497 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1011 00:51:59.678201 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3177302691/tls.crt::/tmp/serving-cert-3177302691/tls.key\\\\\\\"\\\\nI1011 00:52:05.851548 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1011 00:52:05.855373 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1011 00:52:05.855414 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1011 00:52:05.855455 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1011 00:52:05.855472 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1011 00:52:05.866907 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1011 00:52:05.866935 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 00:52:05.866939 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 00:52:05.866944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1011 00:52:05.866947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1011 00:52:05.866951 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1011 00:52:05.866954 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1011 00:52:05.866970 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1011 00:52:05.870518 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1c0227825fa743bb310a1dfb08333cea5ee16a2a2f30b77c97e2f28a2319542\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d411bef56dd0ccdad159fc018c15ad4f0df6adee56150b1143e2dcf442a0fe18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d411bef56dd0ccdad159fc018c15ad4f0df6adee56150b1143e2dcf442a0fe18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:18Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:18 crc kubenswrapper[4743]: I1011 00:52:18.671651 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vlxgw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d677d122-c8be-4938-8d2c-bde4a088a63a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dbaee9cca1154a094e092eaaf1dea2aeb6a61fbf9323611275485f75ea911e9\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95k6c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vlxgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:18Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:18 crc kubenswrapper[4743]: I1011 00:52:18.688077 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-t9nsf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f07b1e57-3c09-4e75-866d-a4292db4e151\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc297cc894f36c7afa9ce15a8e9c71b2d7bf66042c55d0c7686f20b6a3ece7be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c97lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:14Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-t9nsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:18Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:18 crc kubenswrapper[4743]: I1011 00:52:18.709813 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bca7230a67c579c0fc54921d292f52e18ce2c25f79cd4a351a9f6b576df3553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11
T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab9cd20bc5b28791e21984a0edf4fbf50acc05d6160780d0696698715242b5d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:18Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:18 crc kubenswrapper[4743]: I1011 00:52:18.733138 4743 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cd23d7ad070c4d2f029567ac2186d797b082b8da8ff2c9811982a1c5c7b0ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:18Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:18 crc kubenswrapper[4743]: I1011 00:52:18.739981 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:18 crc kubenswrapper[4743]: I1011 00:52:18.740051 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:18 crc kubenswrapper[4743]: I1011 00:52:18.740070 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:18 crc kubenswrapper[4743]: I1011 00:52:18.740098 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:18 crc kubenswrapper[4743]: I1011 00:52:18.740115 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:18Z","lastTransitionTime":"2025-10-11T00:52:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:18 crc kubenswrapper[4743]: I1011 00:52:18.843300 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:18 crc kubenswrapper[4743]: I1011 00:52:18.843358 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:18 crc kubenswrapper[4743]: I1011 00:52:18.843376 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:18 crc kubenswrapper[4743]: I1011 00:52:18.843402 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:18 crc kubenswrapper[4743]: I1011 00:52:18.843420 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:18Z","lastTransitionTime":"2025-10-11T00:52:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:18 crc kubenswrapper[4743]: I1011 00:52:18.947471 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:18 crc kubenswrapper[4743]: I1011 00:52:18.947531 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:18 crc kubenswrapper[4743]: I1011 00:52:18.947552 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:18 crc kubenswrapper[4743]: I1011 00:52:18.947581 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:18 crc kubenswrapper[4743]: I1011 00:52:18.947602 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:18Z","lastTransitionTime":"2025-10-11T00:52:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:19 crc kubenswrapper[4743]: I1011 00:52:19.050450 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:19 crc kubenswrapper[4743]: I1011 00:52:19.050514 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:19 crc kubenswrapper[4743]: I1011 00:52:19.050534 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:19 crc kubenswrapper[4743]: I1011 00:52:19.050558 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:19 crc kubenswrapper[4743]: I1011 00:52:19.050576 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:19Z","lastTransitionTime":"2025-10-11T00:52:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:19 crc kubenswrapper[4743]: I1011 00:52:19.154612 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:19 crc kubenswrapper[4743]: I1011 00:52:19.154688 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:19 crc kubenswrapper[4743]: I1011 00:52:19.154712 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:19 crc kubenswrapper[4743]: I1011 00:52:19.154744 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:19 crc kubenswrapper[4743]: I1011 00:52:19.154770 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:19Z","lastTransitionTime":"2025-10-11T00:52:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:19 crc kubenswrapper[4743]: I1011 00:52:19.257760 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:19 crc kubenswrapper[4743]: I1011 00:52:19.257830 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:19 crc kubenswrapper[4743]: I1011 00:52:19.257847 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:19 crc kubenswrapper[4743]: I1011 00:52:19.257899 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:19 crc kubenswrapper[4743]: I1011 00:52:19.257920 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:19Z","lastTransitionTime":"2025-10-11T00:52:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:19 crc kubenswrapper[4743]: I1011 00:52:19.361048 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:19 crc kubenswrapper[4743]: I1011 00:52:19.361099 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:19 crc kubenswrapper[4743]: I1011 00:52:19.361119 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:19 crc kubenswrapper[4743]: I1011 00:52:19.361143 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:19 crc kubenswrapper[4743]: I1011 00:52:19.361160 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:19Z","lastTransitionTime":"2025-10-11T00:52:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:19 crc kubenswrapper[4743]: I1011 00:52:19.393095 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-48ljj_9ed16b35-862f-47f2-9e32-63c98f868fb8/ovnkube-controller/1.log" Oct 11 00:52:19 crc kubenswrapper[4743]: I1011 00:52:19.393981 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-48ljj_9ed16b35-862f-47f2-9e32-63c98f868fb8/ovnkube-controller/0.log" Oct 11 00:52:19 crc kubenswrapper[4743]: I1011 00:52:19.398950 4743 generic.go:334] "Generic (PLEG): container finished" podID="9ed16b35-862f-47f2-9e32-63c98f868fb8" containerID="d87f5b08faef77769b5965131a6bbc8c2d86c5b51c4561962b902a3056dc9a84" exitCode=1 Oct 11 00:52:19 crc kubenswrapper[4743]: I1011 00:52:19.399066 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" event={"ID":"9ed16b35-862f-47f2-9e32-63c98f868fb8","Type":"ContainerDied","Data":"d87f5b08faef77769b5965131a6bbc8c2d86c5b51c4561962b902a3056dc9a84"} Oct 11 00:52:19 crc kubenswrapper[4743]: I1011 00:52:19.399155 4743 scope.go:117] "RemoveContainer" containerID="9e5db0acfb2627ea36e6ba19e1dc664129ca14d1320f74a4d5b4fc28cc6967cb" Oct 11 00:52:19 crc kubenswrapper[4743]: I1011 00:52:19.400429 4743 scope.go:117] "RemoveContainer" containerID="d87f5b08faef77769b5965131a6bbc8c2d86c5b51c4561962b902a3056dc9a84" Oct 11 00:52:19 crc kubenswrapper[4743]: E1011 00:52:19.400710 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-48ljj_openshift-ovn-kubernetes(9ed16b35-862f-47f2-9e32-63c98f868fb8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" podUID="9ed16b35-862f-47f2-9e32-63c98f868fb8" Oct 11 00:52:19 crc kubenswrapper[4743]: I1011 00:52:19.428179 4743 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a657286e-3a6d-4265-ae33-a7d2ce8b64ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1558bde7a3b8b01c9ce7f6907ecc43abc6e27587236be7b41487968effa33bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-po
d-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2616059cd6d57ff41cdaa2e1b35e0e8b474398c434022ba6b9e926e19e64c0f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0392332521983f1ea8939df48e05618035ca97895122067daef61ee1fdb7af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b297361a1a2f13aa53d7a5a58948dbe86e8e00f888e6a69cc884154828c29a41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"ima
geID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://530c39a66f06ca33cba6a3ab75cb68148425f5a732b503d054b6c74a9d48fabf\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-11T00:52:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1011 00:51:59.674497 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1011 00:51:59.678201 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3177302691/tls.crt::/tmp/serving-cert-3177302691/tls.key\\\\\\\"\\\\nI1011 00:52:05.851548 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1011 00:52:05.855373 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1011 00:52:05.855414 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1011 00:52:05.855455 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1011 00:52:05.855472 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1011 00:52:05.866907 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1011 00:52:05.866935 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 00:52:05.866939 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 00:52:05.866944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1011 00:52:05.866947 1 secure_serving.go:69] Use of insecure 
cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1011 00:52:05.866951 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1011 00:52:05.866954 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1011 00:52:05.866970 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1011 00:52:05.870518 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1c0227825fa743bb310a1dfb08333cea5ee16a2a2f30b77c97e2f28a2319542\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d411bef56dd0ccdad159fc018c15ad4f0df6adee56150b1143e2dcf442a0fe18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d411bef56dd0ccdad159fc018c15ad4f0df6adee56150b1143e2dcf442a0fe18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:19Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:19 crc kubenswrapper[4743]: I1011 00:52:19.447050 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vlxgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d677d122-c8be-4938-8d2c-bde4a088a63a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dbaee9cca1154a094e092eaaf1dea2aeb6a61fbf9323611275485f75ea911e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95k6c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vlxgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:19Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:19 crc kubenswrapper[4743]: I1011 00:52:19.464543 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:19 crc kubenswrapper[4743]: I1011 00:52:19.464853 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:19 crc kubenswrapper[4743]: I1011 00:52:19.465145 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:19 crc kubenswrapper[4743]: I1011 00:52:19.464908 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-t9nsf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f07b1e57-3c09-4e75-866d-a4292db4e151\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc297cc894f36c7afa9ce15a8e9c71b2d7bf66042c55d0c7686f20b6a3ece7be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c97lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:14Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-t9nsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:19Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:19 crc kubenswrapper[4743]: I1011 00:52:19.465381 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:19 crc kubenswrapper[4743]: I1011 00:52:19.465540 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:19Z","lastTransitionTime":"2025-10-11T00:52:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:19 crc kubenswrapper[4743]: I1011 00:52:19.501850 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83857db2-dda0-4b62-9336-3393a2c23f3c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a46e8adc9aa9255991b607e98eaf6fc411aeb2a6b0edf45e3e9a7935a2eae5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97b59a1efc62279df4a9766fae0e700a435812160cf3a40a915c967f7a99c9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe9d2bc75aa162a0cab825a1f13cf35f20491a0317000d5a9775977fb7f6b556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf3820ae4ab59d420d2e7f001b2953f751c6dd75f90b5fcc76fdaee8c8df722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f14caad841305b82f04cf14411e2308edbcdd8ea2261ff8401edc95900855ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://818ca392b918c66a8ca013f3ec5b938595fc78b726af7ae31524c09dbe9302f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://818ca392b918c66a8ca013f3ec5b938595fc78b726af7ae31524c09dbe9302f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44506333e8bd3a20777cf0c382b7652d6d3fa1cdc13f056412cf4899e25115e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44506333e8bd3a20777cf0c382b7652d6d3fa1cdc13f056412cf4899e25115e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e870a1289513fe9e864779c13e63c0991fdaded247b35ac75b7810ab27683ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e870a1289513fe9e864779c13e63c0991fdaded247b35ac75b7810ab27683ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-11T00:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:19Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:19 crc kubenswrapper[4743]: I1011 00:52:19.526251 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cd23d7ad070c4d2f029567ac2186d797b082b8da8ff2c9811982a1c5c7b0ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:19Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:19 crc kubenswrapper[4743]: I1011 00:52:19.554522 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bca7230a67c579c0fc54921d292f52e18ce2c25f79cd4a351a9f6b576df3553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://ab9cd20bc5b28791e21984a0edf4fbf50acc05d6160780d0696698715242b5d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:19Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:19 crc kubenswrapper[4743]: I1011 00:52:19.570060 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:19 crc kubenswrapper[4743]: I1011 00:52:19.570139 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:19 crc kubenswrapper[4743]: I1011 00:52:19.570160 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:19 crc kubenswrapper[4743]: I1011 00:52:19.570204 4743 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:19 crc kubenswrapper[4743]: I1011 00:52:19.570228 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:19Z","lastTransitionTime":"2025-10-11T00:52:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:19 crc kubenswrapper[4743]: I1011 00:52:19.577927 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc432b8f-1fc8-4b7d-a202-bf1452921fc3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ed857549a3f84b624b8bf411ac4359bad74bbf8eea28bda8ebeda3b12dd34e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://748faceccab50f961795c16fb7c31665af75d570094d1dd2c765bb2215b65c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc8d989925344208a60f51034c7c486de0165d5c5a9a1f966df45ac5685c89f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\
\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34de48136b54818c60eb929d4b5374689378190620a6502448759f172856d6af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:19Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:19 crc kubenswrapper[4743]: I1011 00:52:19.600954 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:19Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:19 crc kubenswrapper[4743]: I1011 00:52:19.625426 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9jfxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8c603f4-717c-4554-992a-8338b3bef24d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853ca9eee244aad8c7006298d6e541ea07022917bee14083b2ded233009257ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5b2pr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9jfxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:19Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:19 crc kubenswrapper[4743]: I1011 00:52:19.648708 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"add92263-e252-446b-95de-092585b4357f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be9668b466043d0e86521a2ea925416a4987cb051ca0ab8764fd0fd3095991f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d42sg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16b18bcf8074
7537e2a7a31adc90eec7fdba85d8166f8cfc53709a1dba33de8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d42sg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cvm72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:19Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:19 crc kubenswrapper[4743]: I1011 00:52:19.673409 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:19 crc kubenswrapper[4743]: I1011 00:52:19.673494 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:19 crc kubenswrapper[4743]: I1011 00:52:19.673519 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 
11 00:52:19 crc kubenswrapper[4743]: I1011 00:52:19.673556 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:19 crc kubenswrapper[4743]: I1011 00:52:19.673582 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:19Z","lastTransitionTime":"2025-10-11T00:52:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:19 crc kubenswrapper[4743]: I1011 00:52:19.676426 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6wcnk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06a7b971-8779-491c-8d3f-e7d5b4d60968\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38b5a40ddd5390cc392fcf2240304a72163dd797511640ada5ad39a59c0f4283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8adfcc8627f1a6ce588d50eabcc56c6814c3cff75d50144def98466d2dabb69a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8adfcc8627f1a6ce588d50eabcc56c6814c3cff75d50144def98466d2dabb69a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-ac
cess-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://800cdd03b73991fe9c1cfbe9a392b5f08854ca2f5f9e5106022bb67307578ad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://800cdd03b73991fe9c1cfbe9a392b5f08854ca2f5f9e5106022bb67307578ad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49661bf59f8e1d64e5980f0ed00dcc50c4c2ab52cc2232ade890883f0d0158d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49661bf59f8e1d64e5980f0ed00dcc50c4c2ab52cc2232ade890883f0d0158d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27d3d6d9f65eb7b710148bc4e077c74a15207b637727163afe631acfe3ec90f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27d3d6d9f65eb7b710148bc4e077c74a15207b637727163afe631acfe3ec90f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernete
s.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00edafd52ba2a97f21af491b5233bc2ad945e6b77106eb94a947b777a0129cfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00edafd52ba2a97f21af491b5233bc2ad945e6b77106eb94a947b777a0129cfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://276d1e264f541d55720a57945800ec969959c0828071dcda4fb1c0f13b17118b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\
":\\\"cri-o://276d1e264f541d55720a57945800ec969959c0828071dcda4fb1c0f13b17118b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6wcnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:19Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:19 crc kubenswrapper[4743]: I1011 00:52:19.708962 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ed16b35-862f-47f2-9e32-63c98f868fb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d54b05ec17db8c8c6c3ff8e4d703de8f1f1f0d7e9b5b0133afdf30e3c2b8acee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4127195cac8b12296c2aaefc500beb5f746bee518f3885c094e7bca83ade5e6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://487448bc7b1826df7dbc3aeac42ad10d6aed3390c27c95dfda1b81fe09e27f35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acfe8e798b39769fe2643c6f761b388dc220ccfc707f0023dd680196514b8885\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17db7cfee85ea1051eba348aa0ec7bc8e1354c49e3e4740745a51d3db0ffcf63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d385e354cb831573c89886ade9127287f0974b7a7c244ead9e5547e0211fd9fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d87f5b08faef77769b5965131a6bbc8c2d86c5b51c4561962b902a3056dc9a84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e5db0acfb2627ea36e6ba19e1dc664129ca14d1320f74a4d5b4fc28cc6967cb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"message\\\":\\\"00:52:16.427132 5939 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1011 00:52:16.434070 5939 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1011 00:52:16.434133 5939 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1011 00:52:16.434198 5939 handler.go:208] Removed *v1.Node event handler 7\\\\nI1011 
00:52:16.434291 5939 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1011 00:52:16.434302 5939 handler.go:208] Removed *v1.Node event handler 2\\\\nI1011 00:52:16.434385 5939 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1011 00:52:16.435058 5939 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1011 00:52:16.435094 5939 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1011 00:52:16.435134 5939 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1011 00:52:16.435168 5939 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1011 00:52:16.435182 5939 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1011 00:52:16.435199 5939 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1011 00:52:16.435207 5939 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1011 00:52:16.435207 5939 factory.go:656] Stopping watch factory\\\\nI1011 00:52:16.435219 5939 handler.go:208] Removed *v1.EgressIP ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d87f5b08faef77769b5965131a6bbc8c2d86c5b51c4561962b902a3056dc9a84\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-11T00:52:18Z\\\",\\\"message\\\":\\\"Recording success event on pod openshift-multus/multus-additional-cni-plugins-6wcnk\\\\nI1011 00:52:18.608350 6153 lb_config.go:1031] Cluster endpoints for openshift-machine-config-operator/machine-config-operator for network=default are: map[]\\\\nI1011 00:52:18.608301 6153 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/machine-api-operator\\\\\\\"}\\\\nI1011 00:52:18.608367 6153 obj_retry.go:303] Retry object setup: 
*v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI1011 00:52:18.608369 6153 services_controller.go:360] Finished syncing service machine-api-operator on namespace openshift-machine-api for network=default : 1.580511ms\\\\nF1011 00:52:18.608026 6153 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.opens\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"hos
t-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecec5075382354c47a6ec8f53263fe6c244c7aadfeab1dda60dd2e88cc8724b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\
\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-48ljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:19Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:19 crc kubenswrapper[4743]: I1011 00:52:19.730128 4743 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:19Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:19 crc kubenswrapper[4743]: I1011 00:52:19.747416 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd049f7c9c78e28d7cfdeb7087cf419e5db915fecd46a83ae0bdc8317fff9728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-11T00:52:19Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:19 crc kubenswrapper[4743]: I1011 00:52:19.765393 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:19Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:19 crc kubenswrapper[4743]: I1011 00:52:19.777237 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:19 crc kubenswrapper[4743]: I1011 00:52:19.777323 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:19 crc kubenswrapper[4743]: I1011 00:52:19.777348 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:19 crc kubenswrapper[4743]: I1011 00:52:19.777382 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:19 crc kubenswrapper[4743]: I1011 00:52:19.777409 4743 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:19Z","lastTransitionTime":"2025-10-11T00:52:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:19 crc kubenswrapper[4743]: I1011 00:52:19.881109 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:19 crc kubenswrapper[4743]: I1011 00:52:19.881192 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:19 crc kubenswrapper[4743]: I1011 00:52:19.881212 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:19 crc kubenswrapper[4743]: I1011 00:52:19.881244 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:19 crc kubenswrapper[4743]: I1011 00:52:19.881264 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:19Z","lastTransitionTime":"2025-10-11T00:52:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:19 crc kubenswrapper[4743]: I1011 00:52:19.985309 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:19 crc kubenswrapper[4743]: I1011 00:52:19.985379 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:19 crc kubenswrapper[4743]: I1011 00:52:19.985399 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:19 crc kubenswrapper[4743]: I1011 00:52:19.985432 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:19 crc kubenswrapper[4743]: I1011 00:52:19.985453 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:19Z","lastTransitionTime":"2025-10-11T00:52:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:20 crc kubenswrapper[4743]: I1011 00:52:20.089098 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:20 crc kubenswrapper[4743]: I1011 00:52:20.089175 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:20 crc kubenswrapper[4743]: I1011 00:52:20.089194 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:20 crc kubenswrapper[4743]: I1011 00:52:20.089221 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:20 crc kubenswrapper[4743]: I1011 00:52:20.089240 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:20Z","lastTransitionTime":"2025-10-11T00:52:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:20 crc kubenswrapper[4743]: I1011 00:52:20.091826 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 00:52:20 crc kubenswrapper[4743]: I1011 00:52:20.092235 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 00:52:20 crc kubenswrapper[4743]: E1011 00:52:20.092342 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 11 00:52:20 crc kubenswrapper[4743]: E1011 00:52:20.092426 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 11 00:52:20 crc kubenswrapper[4743]: I1011 00:52:20.092243 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 00:52:20 crc kubenswrapper[4743]: E1011 00:52:20.092557 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 11 00:52:20 crc kubenswrapper[4743]: I1011 00:52:20.192973 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:20 crc kubenswrapper[4743]: I1011 00:52:20.193032 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:20 crc kubenswrapper[4743]: I1011 00:52:20.193050 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:20 crc kubenswrapper[4743]: I1011 00:52:20.193076 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:20 crc kubenswrapper[4743]: I1011 00:52:20.193096 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:20Z","lastTransitionTime":"2025-10-11T00:52:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:20 crc kubenswrapper[4743]: I1011 00:52:20.296106 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:20 crc kubenswrapper[4743]: I1011 00:52:20.296156 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:20 crc kubenswrapper[4743]: I1011 00:52:20.296175 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:20 crc kubenswrapper[4743]: I1011 00:52:20.296198 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:20 crc kubenswrapper[4743]: I1011 00:52:20.296217 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:20Z","lastTransitionTime":"2025-10-11T00:52:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:20 crc kubenswrapper[4743]: I1011 00:52:20.400579 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:20 crc kubenswrapper[4743]: I1011 00:52:20.400649 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:20 crc kubenswrapper[4743]: I1011 00:52:20.400670 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:20 crc kubenswrapper[4743]: I1011 00:52:20.400702 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:20 crc kubenswrapper[4743]: I1011 00:52:20.400721 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:20Z","lastTransitionTime":"2025-10-11T00:52:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:20 crc kubenswrapper[4743]: I1011 00:52:20.407241 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-48ljj_9ed16b35-862f-47f2-9e32-63c98f868fb8/ovnkube-controller/1.log" Oct 11 00:52:20 crc kubenswrapper[4743]: I1011 00:52:20.504734 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:20 crc kubenswrapper[4743]: I1011 00:52:20.505275 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:20 crc kubenswrapper[4743]: I1011 00:52:20.505355 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:20 crc kubenswrapper[4743]: I1011 00:52:20.505436 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:20 crc kubenswrapper[4743]: I1011 00:52:20.505556 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:20Z","lastTransitionTime":"2025-10-11T00:52:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:20 crc kubenswrapper[4743]: I1011 00:52:20.609576 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:20 crc kubenswrapper[4743]: I1011 00:52:20.609955 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:20 crc kubenswrapper[4743]: I1011 00:52:20.610020 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:20 crc kubenswrapper[4743]: I1011 00:52:20.610088 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:20 crc kubenswrapper[4743]: I1011 00:52:20.610145 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:20Z","lastTransitionTime":"2025-10-11T00:52:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:20 crc kubenswrapper[4743]: I1011 00:52:20.714019 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:20 crc kubenswrapper[4743]: I1011 00:52:20.714120 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:20 crc kubenswrapper[4743]: I1011 00:52:20.714150 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:20 crc kubenswrapper[4743]: I1011 00:52:20.714180 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:20 crc kubenswrapper[4743]: I1011 00:52:20.714197 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:20Z","lastTransitionTime":"2025-10-11T00:52:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:20 crc kubenswrapper[4743]: I1011 00:52:20.817554 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:20 crc kubenswrapper[4743]: I1011 00:52:20.817638 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:20 crc kubenswrapper[4743]: I1011 00:52:20.817656 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:20 crc kubenswrapper[4743]: I1011 00:52:20.817687 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:20 crc kubenswrapper[4743]: I1011 00:52:20.817708 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:20Z","lastTransitionTime":"2025-10-11T00:52:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:20 crc kubenswrapper[4743]: I1011 00:52:20.921136 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:20 crc kubenswrapper[4743]: I1011 00:52:20.921516 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:20 crc kubenswrapper[4743]: I1011 00:52:20.921638 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:20 crc kubenswrapper[4743]: I1011 00:52:20.921712 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:20 crc kubenswrapper[4743]: I1011 00:52:20.921770 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:20Z","lastTransitionTime":"2025-10-11T00:52:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.025514 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.025997 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.026150 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.026358 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.026483 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:21Z","lastTransitionTime":"2025-10-11T00:52:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.131099 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.131217 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.131247 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.131278 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.131306 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:21Z","lastTransitionTime":"2025-10-11T00:52:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.132782 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ptjnt"] Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.142428 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ptjnt" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.146688 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.148140 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.168942 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cd23d7ad070c4d2f029567ac2186d797b082b8da8ff2c9811982a1c5c7b0ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\"
:[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:21Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.190207 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bca7230a67c579c0fc54921d292f52e18ce2c25f79cd4a351a9f6b576df3553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab9cd20bc5b28791e21984a0edf4fbf50acc05d6160780d0696698715242b5d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:21Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.212074 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ptjnt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cb3d634-b381-4ee0-a819-ce6f87fa8afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4dnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4dnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ptjnt\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:21Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.235047 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.235115 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.235137 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.235166 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.235188 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:21Z","lastTransitionTime":"2025-10-11T00:52:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.236118 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc432b8f-1fc8-4b7d-a202-bf1452921fc3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ed857549a3f84b624b8bf411ac4359bad74bbf8eea28bda8ebeda3b12dd34e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://748faceccab
50f961795c16fb7c31665af75d570094d1dd2c765bb2215b65c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc8d989925344208a60f51034c7c486de0165d5c5a9a1f966df45ac5685c89f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34de48136b54818c60eb929d4b5374689378190620a6502448759f172856d6af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:21Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.252521 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5cb3d634-b381-4ee0-a819-ce6f87fa8afb-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-ptjnt\" (UID: \"5cb3d634-b381-4ee0-a819-ce6f87fa8afb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ptjnt" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.252629 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5cb3d634-b381-4ee0-a819-ce6f87fa8afb-env-overrides\") pod \"ovnkube-control-plane-749d76644c-ptjnt\" (UID: \"5cb3d634-b381-4ee0-a819-ce6f87fa8afb\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ptjnt" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.252740 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5cb3d634-b381-4ee0-a819-ce6f87fa8afb-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-ptjnt\" (UID: \"5cb3d634-b381-4ee0-a819-ce6f87fa8afb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ptjnt" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.252966 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4dnv\" (UniqueName: \"kubernetes.io/projected/5cb3d634-b381-4ee0-a819-ce6f87fa8afb-kube-api-access-m4dnv\") pod \"ovnkube-control-plane-749d76644c-ptjnt\" (UID: \"5cb3d634-b381-4ee0-a819-ce6f87fa8afb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ptjnt" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.256988 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:21Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.282334 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9jfxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8c603f4-717c-4554-992a-8338b3bef24d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853ca9eee244aad8c7006298d6e541ea07022917bee14083b2ded233009257ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5b2pr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9jfxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:21Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.305130 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:21Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.326699 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd049f7c9c78e28d7cfdeb7087cf419e5db915fecd46a83ae0bdc8317fff9728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-11T00:52:21Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.339062 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.339139 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.339160 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.339192 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.339216 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:21Z","lastTransitionTime":"2025-10-11T00:52:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.350271 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:21Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.354004 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5cb3d634-b381-4ee0-a819-ce6f87fa8afb-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-ptjnt\" (UID: \"5cb3d634-b381-4ee0-a819-ce6f87fa8afb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ptjnt" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.354241 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5cb3d634-b381-4ee0-a819-ce6f87fa8afb-env-overrides\") pod \"ovnkube-control-plane-749d76644c-ptjnt\" (UID: \"5cb3d634-b381-4ee0-a819-ce6f87fa8afb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ptjnt" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.354414 4743 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5cb3d634-b381-4ee0-a819-ce6f87fa8afb-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-ptjnt\" (UID: \"5cb3d634-b381-4ee0-a819-ce6f87fa8afb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ptjnt" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.354567 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4dnv\" (UniqueName: \"kubernetes.io/projected/5cb3d634-b381-4ee0-a819-ce6f87fa8afb-kube-api-access-m4dnv\") pod \"ovnkube-control-plane-749d76644c-ptjnt\" (UID: \"5cb3d634-b381-4ee0-a819-ce6f87fa8afb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ptjnt" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.355296 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5cb3d634-b381-4ee0-a819-ce6f87fa8afb-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-ptjnt\" (UID: \"5cb3d634-b381-4ee0-a819-ce6f87fa8afb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ptjnt" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.355409 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5cb3d634-b381-4ee0-a819-ce6f87fa8afb-env-overrides\") pod \"ovnkube-control-plane-749d76644c-ptjnt\" (UID: \"5cb3d634-b381-4ee0-a819-ce6f87fa8afb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ptjnt" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.365570 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5cb3d634-b381-4ee0-a819-ce6f87fa8afb-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-ptjnt\" (UID: 
\"5cb3d634-b381-4ee0-a819-ce6f87fa8afb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ptjnt" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.371644 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"add92263-e252-446b-95de-092585b4357f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be9668b466043d0e86521a2ea925416a4987cb051ca0ab8764fd0fd3095991f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath
\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d42sg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16b18bcf80747537e2a7a31adc90eec7fdba85d8166f8cfc53709a1dba33de8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d42sg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cvm72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:21Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.384809 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4dnv\" (UniqueName: 
\"kubernetes.io/projected/5cb3d634-b381-4ee0-a819-ce6f87fa8afb-kube-api-access-m4dnv\") pod \"ovnkube-control-plane-749d76644c-ptjnt\" (UID: \"5cb3d634-b381-4ee0-a819-ce6f87fa8afb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ptjnt" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.402145 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6wcnk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06a7b971-8779-491c-8d3f-e7d5b4d60968\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38b5a40ddd5390cc392fcf2240304a72163dd797511640ada5ad39a59c0f4283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"starte
dAt\\\":\\\"2025-10-11T00:52:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8adfcc8627f1a6ce588d50eabcc56c6814c3cff75d50144def98466d2dabb69a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8adfcc8627f1a6ce588d50eabcc56c6814c3cff75d50144def98466d2dabb69a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://800cdd03b73991fe9c1cfbe9a392b5f08854ca2f5f9e5106022bb67307578ad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfb
b085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://800cdd03b73991fe9c1cfbe9a392b5f08854ca2f5f9e5106022bb67307578ad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49661bf59f8e1d64e5980f0ed00dcc50c4c2ab52cc2232ade890883f0d0158d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49661bf59f8e1d64e5980f0ed00dcc50c4c2ab52cc2232ade890883f0d0158d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":
\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27d3d6d9f65eb7b710148bc4e077c74a15207b637727163afe631acfe3ec90f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27d3d6d9f65eb7b710148bc4e077c74a15207b637727163afe631acfe3ec90f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00edafd52ba2a97f21af491b5233bc2ad945e6b77106eb94a947b777a0129cfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00edafd52ba2a97f21af491b5233bc2ad945e6b77106eb94a947b777a0129cfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://276d1e264f541d55720a57945800ec969959c0828071dcda4fb1c0f13b17118b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://276d1e264f541d55720a57945800ec969959c0828071dcda4fb1c0f13b17118b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\
\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6wcnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:21Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.436041 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ed16b35-862f-47f2-9e32-63c98f868fb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d54b05ec17db8c8c6c3ff8e4d703de8f1f1f0d7e9b5b0133afdf30e3c2b8acee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4127195cac8b12296c2aaefc500beb5f746bee518f3885c094e7bca83ade5e6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://487448bc7b1826df7dbc3aeac42ad10d6aed3390c27c95dfda1b81fe09e27f35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acfe8e798b39769fe2643c6f761b388dc220ccfc707f0023dd680196514b8885\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17db7cfee85ea1051eba348aa0ec7bc8e1354c49e3e4740745a51d3db0ffcf63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d385e354cb831573c89886ade9127287f0974b7a7c244ead9e5547e0211fd9fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d87f5b08faef77769b5965131a6bbc8c2d86c5b51c4561962b902a3056dc9a84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e5db0acfb2627ea36e6ba19e1dc664129ca14d1320f74a4d5b4fc28cc6967cb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"message\\\":\\\"00:52:16.427132 5939 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1011 00:52:16.434070 5939 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1011 00:52:16.434133 5939 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1011 00:52:16.434198 5939 handler.go:208] Removed *v1.Node event handler 7\\\\nI1011 
00:52:16.434291 5939 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1011 00:52:16.434302 5939 handler.go:208] Removed *v1.Node event handler 2\\\\nI1011 00:52:16.434385 5939 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1011 00:52:16.435058 5939 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1011 00:52:16.435094 5939 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1011 00:52:16.435134 5939 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1011 00:52:16.435168 5939 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1011 00:52:16.435182 5939 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1011 00:52:16.435199 5939 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1011 00:52:16.435207 5939 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1011 00:52:16.435207 5939 factory.go:656] Stopping watch factory\\\\nI1011 00:52:16.435219 5939 handler.go:208] Removed *v1.EgressIP ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d87f5b08faef77769b5965131a6bbc8c2d86c5b51c4561962b902a3056dc9a84\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-11T00:52:18Z\\\",\\\"message\\\":\\\"Recording success event on pod openshift-multus/multus-additional-cni-plugins-6wcnk\\\\nI1011 00:52:18.608350 6153 lb_config.go:1031] Cluster endpoints for openshift-machine-config-operator/machine-config-operator for network=default are: map[]\\\\nI1011 00:52:18.608301 6153 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/machine-api-operator\\\\\\\"}\\\\nI1011 00:52:18.608367 6153 obj_retry.go:303] Retry object setup: 
*v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI1011 00:52:18.608369 6153 services_controller.go:360] Finished syncing service machine-api-operator on namespace openshift-machine-api for network=default : 1.580511ms\\\\nF1011 00:52:18.608026 6153 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.opens\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"hos
t-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecec5075382354c47a6ec8f53263fe6c244c7aadfeab1dda60dd2e88cc8724b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\
\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-48ljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:21Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.442643 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.442708 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.442728 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.442756 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.442776 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:21Z","lastTransitionTime":"2025-10-11T00:52:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.471279 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ptjnt" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.473387 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83857db2-dda0-4b62-9336-3393a2c23f3c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a46e8adc9aa9255991b607e98eaf6fc411aeb2a6b0edf45e3e9a7935a2eae5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-po
d-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97b59a1efc62279df4a9766fae0e700a435812160cf3a40a915c967f7a99c9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe9d2bc75aa162a0cab825a1f13cf35f20491a0317000d5a9775977fb7f6b556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf3820ae4ab59d420d2e7f001b2953f751c6dd75f90b5fcc76fdaee8c8df722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\
\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f14caad841305b82f04cf14411e2308edbcdd8ea2261ff8401edc95900855ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://818ca392b918c66a8ca013f3ec5b938595fc78b726af7ae31524c09dbe9302f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"
lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://818ca392b918c66a8ca013f3ec5b938595fc78b726af7ae31524c09dbe9302f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44506333e8bd3a20777cf0c382b7652d6d3fa1cdc13f056412cf4899e25115e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44506333e8bd3a20777cf0c382b7652d6d3fa1cdc13f056412cf4899e25115e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e870a1289513fe9e864779c13e63c0991fdaded247b35ac75b7810ab27683ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e870a1289513fe9e864779c13e63c0991fdaded247b35ac75b7810ab27683ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:49Z\\\
",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:21Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:21 crc kubenswrapper[4743]: W1011 00:52:21.499999 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5cb3d634_b381_4ee0_a819_ce6f87fa8afb.slice/crio-7c8eefc63ea15301892d39665f358ac43ee4219978cde3655b045121be689088 WatchSource:0}: Error finding container 7c8eefc63ea15301892d39665f358ac43ee4219978cde3655b045121be689088: Status 404 returned error can't find the container with id 7c8eefc63ea15301892d39665f358ac43ee4219978cde3655b045121be689088 Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.505587 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a657286e-3a6d-4265-ae33-a7d2ce8b64ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1558bde7a3b8b01c9ce7f6907ecc43abc6e27587236be7b41487968effa33bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2616059cd6d57ff41cdaa2e1b35e0e8b474398c434022ba6b9e926e19e64c0f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0392332521983f1ea8939df48e05618035ca97895122067daef61ee1fdb7af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b297361a1a2f13aa53d7a5a58948dbe86e8e00f888e6a69cc884154828c29a41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://530c39a66f06ca33cba6a3ab75cb68148425f5a732b503d054b6c74a9d48fabf\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-11T00:52:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1011 00:51:59.674497 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1011 00:51:59.678201 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3177302691/tls.crt::/tmp/serving-cert-3177302691/tls.key\\\\\\\"\\\\nI1011 00:52:05.851548 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1011 00:52:05.855373 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1011 00:52:05.855414 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1011 00:52:05.855455 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1011 00:52:05.855472 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1011 00:52:05.866907 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1011 00:52:05.866935 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 00:52:05.866939 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 00:52:05.866944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1011 00:52:05.866947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1011 00:52:05.866951 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1011 00:52:05.866954 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1011 00:52:05.866970 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1011 00:52:05.870518 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1c0227825fa743bb310a1dfb08333cea5ee16a2a2f30b77c97e2f28a2319542\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d411bef56dd0ccdad159fc018c15ad4f0df6adee56150b1143e2dcf442a0fe18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d411bef56dd0ccdad159fc018c15ad4f0df6adee56150b1143e2dcf442a0fe18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:21Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.525560 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vlxgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d677d122-c8be-4938-8d2c-bde4a088a63a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dbaee9cca1154a094e092eaaf1dea2aeb6a61fbf9323611275485f75ea911e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95k6c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vlxgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:21Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.541520 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.542350 4743 scope.go:117] "RemoveContainer" containerID="d87f5b08faef77769b5965131a6bbc8c2d86c5b51c4561962b902a3056dc9a84" Oct 11 00:52:21 crc kubenswrapper[4743]: E1011 00:52:21.542520 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-48ljj_openshift-ovn-kubernetes(9ed16b35-862f-47f2-9e32-63c98f868fb8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" podUID="9ed16b35-862f-47f2-9e32-63c98f868fb8" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.542560 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-t9nsf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f07b1e57-3c09-4e75-866d-a4292db4e151\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc297cc894f36c7afa9ce15a8e9c71b2d7bf66042c55d0c7686f20b6a3ece7be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c97lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:14Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-t9nsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:21Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.545909 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.545938 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.545950 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.545964 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.545977 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:21Z","lastTransitionTime":"2025-10-11T00:52:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.562060 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc432b8f-1fc8-4b7d-a202-bf1452921fc3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ed857549a3f84b624b8bf411ac4359bad74bbf8eea28bda8ebeda3b12dd34e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://748faceccab
50f961795c16fb7c31665af75d570094d1dd2c765bb2215b65c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc8d989925344208a60f51034c7c486de0165d5c5a9a1f966df45ac5685c89f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34de48136b54818c60eb929d4b5374689378190620a6502448759f172856d6af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:21Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.581576 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:21Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.597367 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9jfxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8c603f4-717c-4554-992a-8338b3bef24d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853ca9eee244aad8c7006298d6e541ea07022917bee14083b2ded233009257ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5b2pr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9jfxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:21Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.628486 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ed16b35-862f-47f2-9e32-63c98f868fb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d54b05ec17db8c8c6c3ff8e4d703de8f1f1f0d7e9b5b0133afdf30e3c2b8acee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4127195cac8b12296c2aaefc500beb5f746bee518f3885c094e7bca83ade5e6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://487448bc7b1826df7dbc3aeac42ad10d6aed3390c27c95dfda1b81fe09e27f35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acfe8e798b39769fe2643c6f761b388dc220ccfc707f0023dd680196514b8885\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17db7cfee85ea1051eba348aa0ec7bc8e1354c49e3e4740745a51d3db0ffcf63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d385e354cb831573c89886ade9127287f0974b7a7c244ead9e5547e0211fd9fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d87f5b08faef77769b5965131a6bbc8c2d86c5b51c4561962b902a3056dc9a84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d87f5b08faef77769b5965131a6bbc8c2d86c5b51c4561962b902a3056dc9a84\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-11T00:52:18Z\\\",\\\"message\\\":\\\"Recording success event on pod openshift-multus/multus-additional-cni-plugins-6wcnk\\\\nI1011 00:52:18.608350 6153 lb_config.go:1031] Cluster endpoints for openshift-machine-config-operator/machine-config-operator for network=default are: map[]\\\\nI1011 00:52:18.608301 6153 loadbalancer.go:304] Deleted 0 stale LBs for 
map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/machine-api-operator\\\\\\\"}\\\\nI1011 00:52:18.608367 6153 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI1011 00:52:18.608369 6153 services_controller.go:360] Finished syncing service machine-api-operator on namespace openshift-machine-api for network=default : 1.580511ms\\\\nF1011 00:52:18.608026 6153 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.opens\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-48ljj_openshift-ovn-kubernetes(9ed16b35-862f-47f2-9e32-63c98f868fb8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecec5075382354c47a6ec8f53263fe6c244c7aadfeab1dda60dd2e88cc8724b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f300eaf5d1c09f3637
f07c6d71c96b49e955d8319841834440872f13c061e736\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-48ljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:21Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.647768 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:21Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.650128 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.650178 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.650198 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:21 crc 
kubenswrapper[4743]: I1011 00:52:21.650223 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.650241 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:21Z","lastTransitionTime":"2025-10-11T00:52:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.657234 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 00:52:21 crc kubenswrapper[4743]: E1011 00:52:21.657900 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 00:52:37.657837764 +0000 UTC m=+52.310818171 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.664993 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd049f7c9c78e28d7cfdeb7087cf419e5db915fecd46a83ae0bdc8317fff9728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"
name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:21Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.682443 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:21Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.714687 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"add92263-e252-446b-95de-092585b4357f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be9668b466043d0e86521a2ea925416a4987cb051ca0ab8764fd0fd3095991f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d42sg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16b18bcf80747537e2a7a31adc90eec7fdba85d8
166f8cfc53709a1dba33de8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d42sg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cvm72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:21Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.739928 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6wcnk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06a7b971-8779-491c-8d3f-e7d5b4d60968\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38b5a40ddd5390cc392fcf2240304a72163dd797511640ada5ad39a59c0f4283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8adfcc8627f1a6ce588d50eabcc56c6814c3cff75d50144def98466d2dabb69a\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8adfcc8627f1a6ce588d50eabcc56c6814c3cff75d50144def98466d2dabb69a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://800cdd03b73991fe9c1cfbe9a392b5f08854ca2f5f9e5106022bb67307578ad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://800cdd03b73991fe9c1cfbe9a392b5f08854ca2f5f9e5106022bb67307578ad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:10Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49661bf59f8e1d64e5980f0ed00dcc50c4c2ab52cc2232ade890883f0d0158d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49661bf59f8e1d64e5980f0ed00dcc50c4c2ab52cc2232ade890883f0d0158d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27d3d
6d9f65eb7b710148bc4e077c74a15207b637727163afe631acfe3ec90f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27d3d6d9f65eb7b710148bc4e077c74a15207b637727163afe631acfe3ec90f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00edafd52ba2a97f21af491b5233bc2ad945e6b77106eb94a947b777a0129cfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00edafd52ba2a97f21af491b5233bc2ad945e6b77106eb94a947b777a0129cfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://276d1e264f541d55720a57945800ec969959c0828071dcda4fb1c0f13b17118b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://276d1e264f541d55720a57945800ec969959c0828071dcda4fb1c0f13b17118b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6wcnk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:21Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.754937 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.754981 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.754993 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.755377 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.755538 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:21Z","lastTransitionTime":"2025-10-11T00:52:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.762264 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.762347 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.762399 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.762450 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 00:52:21 crc kubenswrapper[4743]: E1011 00:52:21.762646 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" 
not registered Oct 11 00:52:21 crc kubenswrapper[4743]: E1011 00:52:21.762673 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 11 00:52:21 crc kubenswrapper[4743]: E1011 00:52:21.762690 4743 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 11 00:52:21 crc kubenswrapper[4743]: E1011 00:52:21.762768 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-11 00:52:37.762743633 +0000 UTC m=+52.415724030 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 11 00:52:21 crc kubenswrapper[4743]: E1011 00:52:21.763365 4743 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 11 00:52:21 crc kubenswrapper[4743]: E1011 00:52:21.763424 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-10-11 00:52:37.76340955 +0000 UTC m=+52.416389947 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 11 00:52:21 crc kubenswrapper[4743]: E1011 00:52:21.763521 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 11 00:52:21 crc kubenswrapper[4743]: E1011 00:52:21.763542 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 11 00:52:21 crc kubenswrapper[4743]: E1011 00:52:21.763556 4743 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 11 00:52:21 crc kubenswrapper[4743]: E1011 00:52:21.763598 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-11 00:52:37.763586905 +0000 UTC m=+52.416567302 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 11 00:52:21 crc kubenswrapper[4743]: E1011 00:52:21.763662 4743 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 11 00:52:21 crc kubenswrapper[4743]: E1011 00:52:21.763703 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-11 00:52:37.763686207 +0000 UTC m=+52.416666604 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.782526 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.783061 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-t9nsf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f07b1e57-3c09-4e75-866d-a4292db4e151\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc297cc894f36c7afa9ce15a8e9c71b2d7bf66042c55d0c7686f20b6a3ece7be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8
b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c97lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:14Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-t9nsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:21Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.807283 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"83857db2-dda0-4b62-9336-3393a2c23f3c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a46e8adc9aa9255991b607e98eaf6fc411aeb2a6b0edf45e3e9a7935a2eae5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97b59a1efc62279df4a9766fae0e700a435812160cf3a40a915c967f7a99c9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe9d2bc75aa162a0cab825a1f13cf35f20491a0317000d5a9775977fb7f6b556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf3820ae4ab59d420d2e7f001b2953f751c6dd75f90b5fcc76fdaee8c8df722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f14caad841305b82f04cf14411e2308edbcdd8ea2261ff8401edc95900855ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://818ca392b918c66a8ca013f3ec5b938595fc78b726af7ae31524c09dbe9302f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://818ca392b918c66a8ca013f3ec5b938595fc78b726af7ae31524c09dbe9302f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44506333e8bd3a20777cf0c382b7652d6d3fa1cdc13f056412cf4899e25115e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44506333e8bd3a20777cf0c382b7652d6d3fa1cdc13f056412cf4899e25115e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e870a1289513fe9e864779c13e63c0991fdaded247b35ac75b7810ab27683ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e870a1289513fe9e864779c13e63c0991fdaded247b35ac75b7810ab27683ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:21Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.834093 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a657286e-3a6d-4265-ae33-a7d2ce8b64ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1558bde7a3b8b01c9ce7f6907ecc43abc6e27587236be7b41487968effa33bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2616059cd6d57ff41cdaa2e1b35e0e8b474398c434022ba6b9e926e19e64c0f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7f0392332521983f1ea8939df48e05618035ca97895122067daef61ee1fdb7af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b297361a1a2f13aa53d7a5a58948dbe86e8e00f888e6a69cc884154828c29a41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://530c39a66f06ca33cba6a3ab75cb68148425f5a732b503d054b6c74a9d48fabf\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-11T00:52:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1011 00:51:59.674497 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1011 00:51:59.678201 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3177302691/tls.crt::/tmp/serving-cert-3177302691/tls.key\\\\\\\"\\\\nI1011 00:52:05.851548 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1011 00:52:05.855373 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1011 00:52:05.855414 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1011 00:52:05.855455 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1011 00:52:05.855472 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1011 00:52:05.866907 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1011 00:52:05.866935 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 00:52:05.866939 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 00:52:05.866944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1011 00:52:05.866947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1011 00:52:05.866951 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1011 00:52:05.866954 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1011 00:52:05.866970 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1011 00:52:05.870518 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1c0227825fa743bb310a1dfb08333cea5ee16a2a2f30b77c97e2f28a2319542\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d411bef56dd0ccdad159fc018c15ad4f0df6adee56150b1143e2dcf442a0fe18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d411bef56dd0ccdad159fc018c15ad4f0df6adee56150b1143e2dcf442a0fe18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:21Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.852321 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vlxgw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d677d122-c8be-4938-8d2c-bde4a088a63a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dbaee9cca1154a094e092eaaf1dea2aeb6a61fbf9323611275485f75ea911e9\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95k6c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vlxgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:21Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.859639 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.859676 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.859686 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.859704 4743 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeNotReady" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.859717 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:21Z","lastTransitionTime":"2025-10-11T00:52:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.871487 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cd23d7ad070c4d2f029567ac2186d797b082b8da8ff2c9811982a1c5c7b0ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:21Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.885569 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bca7230a67c579c0fc54921d292f52e18ce2c25f79cd4a351a9f6b576df3553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab9cd20bc5b28791e21984a0edf4fbf50acc05d6160780d0696698715242b5d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:21Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.902838 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ptjnt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cb3d634-b381-4ee0-a819-ce6f87fa8afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4dnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4dnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ptjnt\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:21Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.915947 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:21Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.932698 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd049f7c9c78e28d7cfdeb7087cf419e5db915fecd46a83ae0bdc8317fff9728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-11T00:52:21Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.947446 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:21Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.960000 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"add92263-e252-446b-95de-092585b4357f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be9668b466043d0e86521a2ea925416a4987cb051ca0ab8764fd0fd3095991f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d42sg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16b18bcf80747537e2a7a31adc90eec7fdba85d8
166f8cfc53709a1dba33de8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d42sg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cvm72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:21Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.962188 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.962237 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.962255 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:21 crc 
kubenswrapper[4743]: I1011 00:52:21.962280 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.962296 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:21Z","lastTransitionTime":"2025-10-11T00:52:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.977052 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6wcnk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06a7b971-8779-491c-8d3f-e7d5b4d60968\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38b5a40ddd5390cc392fcf2240304a72163dd797511640ada5ad39a59c0f4283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
eaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8adfcc8627f1a6ce588d50eabcc56c6814c3cff75d50144def98466d2dabb69a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8adfcc8627f1a6ce588d50eabcc56c6814c3cff75d50144def98466d2dabb69a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://800cdd03b73991fe9c1cfbe9a392b5f08854ca2f5f9e5106022bb67307578ad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://800cdd03b73991fe9c1cfbe9a392b5f08854ca2f5f9e5106022bb67307578ad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49661bf59f8e1d64e5980f0ed00dcc50c4c2ab52cc2232ade890883f0d0158d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49661bf59f8e1d64e5980f0ed00dcc50c4c2ab52cc2232ade890883f0d0158d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27d3d6d9f65eb7b710148bc4e077c74a15207b637727163afe631acfe3ec90f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27d3d6d9f65eb7b710148bc4e077c74a15207b637727163afe631acfe3ec90f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00edafd52ba2a97f21af491b5233bc2ad945e6b77106eb94a947b777a0129cfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00edafd52ba2a97f21af491b5233bc2ad945e6b77106eb94a947b777a0129cfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://276d1e264f541d55720a57945800ec969959c0828071dcda4fb1c0f13b17118b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27
6d1e264f541d55720a57945800ec969959c0828071dcda4fb1c0f13b17118b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6wcnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:21Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:21 crc kubenswrapper[4743]: I1011 00:52:21.999205 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ed16b35-862f-47f2-9e32-63c98f868fb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d54b05ec17db8c8c6c3ff8e4d703de8f1f1f0d7e9b5b0133afdf30e3c2b8acee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4127195cac8b12296c2aaefc500beb5f746bee518f3885c094e7bca83ade5e6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://487448bc7b1826df7dbc3aeac42ad10d6aed3390c27c95dfda1b81fe09e27f35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acfe8e798b39769fe2643c6f761b388dc220ccfc707f0023dd680196514b8885\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17db7cfee85ea1051eba348aa0ec7bc8e1354c49e3e4740745a51d3db0ffcf63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d385e354cb831573c89886ade9127287f0974b7a7c244ead9e5547e0211fd9fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d87f5b08faef77769b5965131a6bbc8c2d86c5b51c4561962b902a3056dc9a84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d87f5b08faef77769b5965131a6bbc8c2d86c5b51c4561962b902a3056dc9a84\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-11T00:52:18Z\\\",\\\"message\\\":\\\"Recording success event on pod openshift-multus/multus-additional-cni-plugins-6wcnk\\\\nI1011 00:52:18.608350 6153 lb_config.go:1031] Cluster endpoints for openshift-machine-config-operator/machine-config-operator for network=default are: map[]\\\\nI1011 00:52:18.608301 6153 loadbalancer.go:304] Deleted 0 stale LBs for 
map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/machine-api-operator\\\\\\\"}\\\\nI1011 00:52:18.608367 6153 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI1011 00:52:18.608369 6153 services_controller.go:360] Finished syncing service machine-api-operator on namespace openshift-machine-api for network=default : 1.580511ms\\\\nF1011 00:52:18.608026 6153 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.opens\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-48ljj_openshift-ovn-kubernetes(9ed16b35-862f-47f2-9e32-63c98f868fb8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecec5075382354c47a6ec8f53263fe6c244c7aadfeab1dda60dd2e88cc8724b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f300eaf5d1c09f3637
f07c6d71c96b49e955d8319841834440872f13c061e736\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-48ljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:21Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.027497 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"83857db2-dda0-4b62-9336-3393a2c23f3c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a46e8adc9aa9255991b607e98eaf6fc411aeb2a6b0edf45e3e9a7935a2eae5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97b59a1efc62279df4a9766fae0e700a435812160cf3a40a915c967f7a99c9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe9d2bc75aa162a0cab825a1f13cf35f20491a0317000d5a9775977fb7f6b556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf3820ae4ab59d420d2e7f001b2953f751c6dd75f90b5fcc76fdaee8c8df722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f14caad841305b82f04cf14411e2308edbcdd8ea2261ff8401edc95900855ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://818ca392b918c66a8ca013f3ec5b938595fc78b726af7ae31524c09dbe9302f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://818ca392b918c66a8ca013f3ec5b938595fc78b726af7ae31524c09dbe9302f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44506333e8bd3a20777cf0c382b7652d6d3fa1cdc13f056412cf4899e25115e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44506333e8bd3a20777cf0c382b7652d6d3fa1cdc13f056412cf4899e25115e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e870a1289513fe9e864779c13e63c0991fdaded247b35ac75b7810ab27683ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e870a1289513fe9e864779c13e63c0991fdaded247b35ac75b7810ab27683ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:22Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.045425 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a657286e-3a6d-4265-ae33-a7d2ce8b64ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1558bde7a3b8b01c9ce7f6907ecc43abc6e27587236be7b41487968effa33bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2616059cd6d57ff41cdaa2e1b35e0e8b474398c434022ba6b9e926e19e64c0f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0392332521983f1ea8939df48e05618035ca97895122067daef61ee1fdb7af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b297361a1a2f13aa53d7a5a58948dbe86e8e00f888e6a69cc884154828c29a41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://530c39a66f06ca33cba6a3ab75cb68148425f5a732b503d054b6c74a9d48fabf\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-11T00:52:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1011 00:51:59.674497 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1011 00:51:59.678201 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3177302691/tls.crt::/tmp/serving-cert-3177302691/tls.key\\\\\\\"\\\\nI1011 00:52:05.851548 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1011 00:52:05.855373 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1011 00:52:05.855414 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1011 00:52:05.855455 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1011 00:52:05.855472 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1011 00:52:05.866907 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1011 00:52:05.866935 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 00:52:05.866939 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 00:52:05.866944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1011 00:52:05.866947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1011 00:52:05.866951 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1011 00:52:05.866954 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1011 00:52:05.866970 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1011 00:52:05.870518 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1c0227825fa743bb310a1dfb08333cea5ee16a2a2f30b77c97e2f28a2319542\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d411bef56dd0ccdad159fc018c15ad4f0df6adee56150b1143e2dcf442a0fe18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d411bef56dd0ccdad159fc018c15ad4f0df6adee56150b1143e2dcf442a0fe18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:22Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.060263 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vlxgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d677d122-c8be-4938-8d2c-bde4a088a63a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dbaee9cca1154a094e092eaaf1dea2aeb6a61fbf9323611275485f75ea911e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95k6c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vlxgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:22Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.065507 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.065561 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.065575 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.065597 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.065611 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:22Z","lastTransitionTime":"2025-10-11T00:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.075588 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-t9nsf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f07b1e57-3c09-4e75-866d-a4292db4e151\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc297cc894f36c7afa9ce15a8e9c71b2d7bf66042c55d0c7686f20b6a3ece7be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c97lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:14Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-t9nsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:22Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.091698 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.091784 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.091818 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 00:52:22 crc kubenswrapper[4743]: E1011 00:52:22.091873 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 11 00:52:22 crc kubenswrapper[4743]: E1011 00:52:22.092004 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 11 00:52:22 crc kubenswrapper[4743]: E1011 00:52:22.093712 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.093906 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cd23d7ad070c4d2f029567ac2186d797b082b8da8ff2c9811982a1c5c7b0ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:22Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.108619 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bca7230a67c579c0fc54921d292f52e18ce2c25f79cd4a351a9f6b576df3553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://ab9cd20bc5b28791e21984a0edf4fbf50acc05d6160780d0696698715242b5d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:22Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.122497 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ptjnt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cb3d634-b381-4ee0-a819-ce6f87fa8afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4dnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4dnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ptjnt\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:22Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.139730 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc432b8f-1fc8-4b7d-a202-bf1452921fc3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ed857549a3f84b624b8bf411ac4359bad74bbf8eea28bda8ebeda3b12dd34e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://748faceccab50f961795c16fb7c31665af75d570094d1dd2c765bb2215b65c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc8d989925344208a60f51034c7c486de0165d5c5a9a1f966df45ac5685c89f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34de48136b54818c60eb929d4b5374689378190620a6502448759f172856d6af\\\",\\\"image\\\":\\\"quay.io
/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:22Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.157910 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:22Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.168119 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.168162 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.168177 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.168196 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.168211 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:22Z","lastTransitionTime":"2025-10-11T00:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.177649 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9jfxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8c603f4-717c-4554-992a-8338b3bef24d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853ca9eee244aad8c7006298d6e541ea07022917bee14083b2ded233009257ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5b2pr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9jfxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:22Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.264082 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-cb5z5"] Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.265006 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cb5z5" Oct 11 00:52:22 crc kubenswrapper[4743]: E1011 00:52:22.265161 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cb5z5" podUID="b02b8636-a5c4-447d-b1cf-401b3dcfa02b" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.271709 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.271766 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.271780 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.271801 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.271814 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:22Z","lastTransitionTime":"2025-10-11T00:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.291768 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83857db2-dda0-4b62-9336-3393a2c23f3c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a46e8adc9aa9255991b607e98eaf6fc411aeb2a6b0edf45e3e9a7935a2eae5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountP
ath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97b59a1efc62279df4a9766fae0e700a435812160cf3a40a915c967f7a99c9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe9d2bc75aa162a0cab825a1f13cf35f20491a0317000d5a9775977fb7f6b556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf3820ae4ab59d420d2e7f001b2953f751c6dd75f90b5fcc76fdaee8c8df722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f
42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f14caad841305b82f04cf14411e2308edbcdd8ea2261ff8401edc95900855ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://818ca392b918c66a8ca013f3ec5b938595fc78b726af7ae31524c09dbe9302f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90
092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://818ca392b918c66a8ca013f3ec5b938595fc78b726af7ae31524c09dbe9302f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44506333e8bd3a20777cf0c382b7652d6d3fa1cdc13f056412cf4899e25115e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44506333e8bd3a20777cf0c382b7652d6d3fa1cdc13f056412cf4899e25115e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e870a1289513fe9e864779c13e63c0991fdaded247b35ac75b7810ab27683ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e870a1289513fe9e864779c13e63c0991fdaded247b35ac75b7810ab27683ebe\\\",\\\"exitCode\\\":0,\\\"finis
hedAt\\\":\\\"2025-10-11T00:51:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:22Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.310613 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a657286e-3a6d-4265-ae33-a7d2ce8b64ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1558bde7a3b8b01c9ce7f6907ecc43abc6e27587236be7b41487968effa33bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2616059cd6d57ff41cdaa2e1b35e0e8b474398c434022ba6b9e926e19e64c0f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0392332521983f1ea8939df48e05618035ca97895122067daef61ee1fdb7af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b297361a1a2f13aa53d7a5a58948dbe86e8e00f888e6a69cc884154828c29a41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://530c39a66f06ca33cba6a3ab75cb68148425f5a732b503d054b6c74a9d48fabf\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-11T00:52:05Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1011 00:51:59.674497 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1011 00:51:59.678201 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3177302691/tls.crt::/tmp/serving-cert-3177302691/tls.key\\\\\\\"\\\\nI1011 00:52:05.851548 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1011 00:52:05.855373 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1011 00:52:05.855414 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1011 00:52:05.855455 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1011 00:52:05.855472 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1011 00:52:05.866907 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1011 00:52:05.866935 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 00:52:05.866939 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 00:52:05.866944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1011 00:52:05.866947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1011 00:52:05.866951 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1011 00:52:05.866954 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1011 00:52:05.866970 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1011 00:52:05.870518 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1c0227825fa743bb310a1dfb08333cea5ee16a2a2f30b77c97e2f28a2319542\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d411bef56dd0ccdad159fc018c15ad4f0df6adee56150b1143e2dcf442a0fe18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d411bef56dd0ccdad159fc018c15ad4f0
df6adee56150b1143e2dcf442a0fe18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:22Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.320811 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vlxgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d677d122-c8be-4938-8d2c-bde4a088a63a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dbaee9cca1154a094e092eaaf1dea2aeb6a61fbf9323611275485f75ea911e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95k6c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vlxgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:22Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.330740 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-t9nsf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f07b1e57-3c09-4e75-866d-a4292db4e151\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc297cc894f36c7afa9ce15a8e9c71b2d7bf66042c55d0c7686f20b6a3ece7be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c97lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:14Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-t9nsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:22Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.352708 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cd23d7ad070c4d2f029567ac2186d797b082b8da8ff2c9811982a1c5c7b0ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:22Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.368170 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bca7230a67c579c0fc54921d292f52e18ce2c25f79cd4a351a9f6b576df3553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://ab9cd20bc5b28791e21984a0edf4fbf50acc05d6160780d0696698715242b5d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:22Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.368923 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b02b8636-a5c4-447d-b1cf-401b3dcfa02b-metrics-certs\") pod \"network-metrics-daemon-cb5z5\" (UID: \"b02b8636-a5c4-447d-b1cf-401b3dcfa02b\") " pod="openshift-multus/network-metrics-daemon-cb5z5" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.369019 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-d2zdb\" (UniqueName: \"kubernetes.io/projected/b02b8636-a5c4-447d-b1cf-401b3dcfa02b-kube-api-access-d2zdb\") pod \"network-metrics-daemon-cb5z5\" (UID: \"b02b8636-a5c4-447d-b1cf-401b3dcfa02b\") " pod="openshift-multus/network-metrics-daemon-cb5z5" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.374660 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.374723 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.374736 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.374757 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.374770 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:22Z","lastTransitionTime":"2025-10-11T00:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.385228 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ptjnt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cb3d634-b381-4ee0-a819-ce6f87fa8afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4dnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4dnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ptjnt\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:22Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.406058 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cb5z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b02b8636-a5c4-447d-b1cf-401b3dcfa02b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2zdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2zdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:22Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cb5z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:22Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:22 crc 
kubenswrapper[4743]: I1011 00:52:22.427157 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ptjnt" event={"ID":"5cb3d634-b381-4ee0-a819-ce6f87fa8afb","Type":"ContainerStarted","Data":"b62556627f75a8ed41c68a7fb1982461fd9fb35965014b72ec15492e763e42b5"} Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.427265 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ptjnt" event={"ID":"5cb3d634-b381-4ee0-a819-ce6f87fa8afb","Type":"ContainerStarted","Data":"7c47ab12f326ce6568bfd06d8beebdeffc07d464b747a288a14793db096ccc6c"} Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.427288 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ptjnt" event={"ID":"5cb3d634-b381-4ee0-a819-ce6f87fa8afb","Type":"ContainerStarted","Data":"7c8eefc63ea15301892d39665f358ac43ee4219978cde3655b045121be689088"} Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.428213 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc432b8f-1fc8-4b7d-a202-bf1452921fc3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ed857549a3f84b624b8bf411ac4359bad74bbf8eea28bda8ebeda3b12dd34e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://748faceccab50f961795c16fb7c31665af75d570094d1dd2c765bb2215b65c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc8d989925344208a60f51034c7c486de0165d5c5a9a1f966df45ac5685c89f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34de48136b54818c60eb929d4b5374689378190620a6502448759f172856d6af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:22Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.446907 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:22Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.466339 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9jfxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8c603f4-717c-4554-992a-8338b3bef24d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853ca9eee244aad8c7006298d6e541ea07022917bee14083b2ded233009257ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5b2pr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9jfxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:22Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.470697 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2zdb\" (UniqueName: 
\"kubernetes.io/projected/b02b8636-a5c4-447d-b1cf-401b3dcfa02b-kube-api-access-d2zdb\") pod \"network-metrics-daemon-cb5z5\" (UID: \"b02b8636-a5c4-447d-b1cf-401b3dcfa02b\") " pod="openshift-multus/network-metrics-daemon-cb5z5" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.470836 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b02b8636-a5c4-447d-b1cf-401b3dcfa02b-metrics-certs\") pod \"network-metrics-daemon-cb5z5\" (UID: \"b02b8636-a5c4-447d-b1cf-401b3dcfa02b\") " pod="openshift-multus/network-metrics-daemon-cb5z5" Oct 11 00:52:22 crc kubenswrapper[4743]: E1011 00:52:22.471095 4743 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 11 00:52:22 crc kubenswrapper[4743]: E1011 00:52:22.471224 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b02b8636-a5c4-447d-b1cf-401b3dcfa02b-metrics-certs podName:b02b8636-a5c4-447d-b1cf-401b3dcfa02b nodeName:}" failed. No retries permitted until 2025-10-11 00:52:22.97119186 +0000 UTC m=+37.624172287 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b02b8636-a5c4-447d-b1cf-401b3dcfa02b-metrics-certs") pod "network-metrics-daemon-cb5z5" (UID: "b02b8636-a5c4-447d-b1cf-401b3dcfa02b") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.481545 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.481591 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:22Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.481716 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.481967 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.482453 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.482584 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:22Z","lastTransitionTime":"2025-10-11T00:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.491919 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2zdb\" (UniqueName: \"kubernetes.io/projected/b02b8636-a5c4-447d-b1cf-401b3dcfa02b-kube-api-access-d2zdb\") pod \"network-metrics-daemon-cb5z5\" (UID: \"b02b8636-a5c4-447d-b1cf-401b3dcfa02b\") " pod="openshift-multus/network-metrics-daemon-cb5z5" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.502204 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd049f7c9c78e28d7cfdeb7087cf419e5db915fecd46a83ae0bdc8317fff9728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\
\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:22Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.517681 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:22Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.532073 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"add92263-e252-446b-95de-092585b4357f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be9668b466043d0e86521a2ea925416a4987cb051ca0ab8764fd0fd3095991f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d42sg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16b18bcf80747537e2a7a31adc90eec7fdba85d8
166f8cfc53709a1dba33de8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d42sg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cvm72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:22Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.555119 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6wcnk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06a7b971-8779-491c-8d3f-e7d5b4d60968\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38b5a40ddd5390cc392fcf2240304a72163dd797511640ada5ad39a59c0f4283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8adfcc8627f1a6ce588d50eabcc56c6814c3cff75d50144def98466d2dabb69a\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8adfcc8627f1a6ce588d50eabcc56c6814c3cff75d50144def98466d2dabb69a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://800cdd03b73991fe9c1cfbe9a392b5f08854ca2f5f9e5106022bb67307578ad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://800cdd03b73991fe9c1cfbe9a392b5f08854ca2f5f9e5106022bb67307578ad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:10Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49661bf59f8e1d64e5980f0ed00dcc50c4c2ab52cc2232ade890883f0d0158d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49661bf59f8e1d64e5980f0ed00dcc50c4c2ab52cc2232ade890883f0d0158d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27d3d
6d9f65eb7b710148bc4e077c74a15207b637727163afe631acfe3ec90f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27d3d6d9f65eb7b710148bc4e077c74a15207b637727163afe631acfe3ec90f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00edafd52ba2a97f21af491b5233bc2ad945e6b77106eb94a947b777a0129cfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00edafd52ba2a97f21af491b5233bc2ad945e6b77106eb94a947b777a0129cfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://276d1e264f541d55720a57945800ec969959c0828071dcda4fb1c0f13b17118b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://276d1e264f541d55720a57945800ec969959c0828071dcda4fb1c0f13b17118b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6wcnk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:22Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.584643 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.584753 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.584771 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.584795 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.584813 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:22Z","lastTransitionTime":"2025-10-11T00:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.586163 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ed16b35-862f-47f2-9e32-63c98f868fb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d54b05ec17db8c8c6c3ff8e4d703de8f1f1f0d7e9b5b0133afdf30e3c2b8acee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4127195cac8b12296c2aaefc500beb5f746bee518f3885c094e7bca83ade5e6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://487448bc7b1826df7dbc3aeac42ad10d6aed3390c27c95dfda1b81fe09e27f35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acfe8e798b39769fe2643c6f761b388dc220ccfc707f0023dd680196514b8885\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17db7cfee85ea1051eba348aa0ec7bc8e1354c49e3e4740745a51d3db0ffcf63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d385e354cb831573c89886ade9127287f0974b7a7c244ead9e5547e0211fd9fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d87f5b08faef77769b5965131a6bbc8c2d86c5b51c4561962b902a3056dc9a84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d87f5b08faef77769b5965131a6bbc8c2d86c5b51c4561962b902a3056dc9a84\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-11T00:52:18Z\\\",\\\"message\\\":\\\"Recording success event on pod openshift-multus/multus-additional-cni-plugins-6wcnk\\\\nI1011 00:52:18.608350 6153 lb_config.go:1031] Cluster endpoints for openshift-machine-config-operator/machine-config-operator for network=default are: map[]\\\\nI1011 00:52:18.608301 6153 loadbalancer.go:304] Deleted 0 stale LBs for 
map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/machine-api-operator\\\\\\\"}\\\\nI1011 00:52:18.608367 6153 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI1011 00:52:18.608369 6153 services_controller.go:360] Finished syncing service machine-api-operator on namespace openshift-machine-api for network=default : 1.580511ms\\\\nF1011 00:52:18.608026 6153 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.opens\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-48ljj_openshift-ovn-kubernetes(9ed16b35-862f-47f2-9e32-63c98f868fb8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecec5075382354c47a6ec8f53263fe6c244c7aadfeab1dda60dd2e88cc8724b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f300eaf5d1c09f3637
f07c6d71c96b49e955d8319841834440872f13c061e736\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-48ljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:22Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.607336 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bca7230a67c579c0fc54921d292f52e18ce2c25f79cd4a351a9f6b576df3553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab9cd20bc5b28791e21984a0edf4fbf50acc05d6160780d0696698715242b5d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:22Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.628047 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ptjnt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cb3d634-b381-4ee0-a819-ce6f87fa8afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c47ab12f326ce6568bfd06d8beebdeffc07d464b747a288a14793db096ccc6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4dnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b62556627f75a8ed41c68a7fb1982461fd9fb
35965014b72ec15492e763e42b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4dnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ptjnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:22Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.646416 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cb5z5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b02b8636-a5c4-447d-b1cf-401b3dcfa02b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2zdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2zdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:22Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cb5z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:22Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:22 crc 
kubenswrapper[4743]: I1011 00:52:22.668813 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cd23d7ad070c4d2f029567ac2186d797b082b8da8ff2c9811982a1c5c7b0ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:22Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.688818 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.688899 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.688917 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.688944 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.688964 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:22Z","lastTransitionTime":"2025-10-11T00:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.690472 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:22Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.716053 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9jfxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8c603f4-717c-4554-992a-8338b3bef24d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853ca9eee244aad8c7006298d6e541ea07022917bee14083b2ded233009257ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5b2pr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9jfxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:22Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.735903 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc432b8f-1fc8-4b7d-a202-bf1452921fc3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ed857549a3f84b624b8bf411ac4359bad74bbf8eea28bda8ebeda3b12dd34e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://748faceccab50f961795c16fb7c31665af75d570094d1dd2c765bb2215b65c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc8d989925344208a60f51034c7c486de0165d5c5a9a1f966df45ac5685c89f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34de48136b54818c60eb929d4b5374689378190620a6502448759f172856d6af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:22Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.755552 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd049f7c9c78e28d7cfdeb7087cf419e5db915fecd46a83ae0bdc8317fff9728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-11T00:52:22Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.777315 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:22Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.792273 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.792330 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.792351 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.792383 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.792403 4743 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:22Z","lastTransitionTime":"2025-10-11T00:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.796365 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"add92263-e252-446b-95de-092585b4357f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be9668b466043d0e86521a2ea925416a4987cb051ca0ab8764fd0fd3095991f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d42sg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16b18bcf80747537e2a7a31adc90eec7fdba85d8166f8cfc53709a1dba33de8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d42sg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cvm72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-11T00:52:22Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.825553 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6wcnk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06a7b971-8779-491c-8d3f-e7d5b4d60968\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38b5a40ddd5390cc392fcf2240304a72163dd797511640ada5ad39a59c0f4283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8adfcc8627f1a6ce588d50eabcc56c6814c3cff75d50144def98466d2dabb69a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8adfcc8627f1a6ce588d50eabcc56c6814c3cff75d50144def98466d2dabb69a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://800cdd03b73991fe9c1cfbe9a392b5f08854ca2f5f9e5106022bb67307578ad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"
terminated\\\":{\\\"containerID\\\":\\\"cri-o://800cdd03b73991fe9c1cfbe9a392b5f08854ca2f5f9e5106022bb67307578ad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49661bf59f8e1d64e5980f0ed00dcc50c4c2ab52cc2232ade890883f0d0158d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49661bf59f8e1d64e5980f0ed00dcc50c4c2ab52cc2232ade890883f0d0158d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27d3d6d9f65eb7b710148bc4e077c74a15207b637727163afe631acfe3ec90f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27d3d6d9f65eb7b710148bc4e077c74a15207b637727163afe631acfe3ec90f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00edafd52ba2a97f21af491b5233bc2ad945e6b77106eb94a947b777a0129cfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,
\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00edafd52ba2a97f21af491b5233bc2ad945e6b77106eb94a947b777a0129cfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://276d1e264f541d55720a57945800ec969959c0828071dcda4fb1c0f13b17118b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://276d1e264f541d55720a57945800ec969959c0828071dcda4fb1c0f13b17118b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6wcnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:22Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.858816 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ed16b35-862f-47f2-9e32-63c98f868fb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d54b05ec17db8c8c6c3ff8e4d703de8f1f1f0d7e9b5b0133afdf30e3c2b8acee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4127195cac8b12296c2aaefc500beb5f746bee518f3885c094e7bca83ade5e6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://487448bc7b1826df7dbc3aeac42ad10d6aed3390c27c95dfda1b81fe09e27f35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acfe8e798b39769fe2643c6f761b388dc220ccfc707f0023dd680196514b8885\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17db7cfee85ea1051eba348aa0ec7bc8e1354c49e3e4740745a51d3db0ffcf63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d385e354cb831573c89886ade9127287f0974b7a7c244ead9e5547e0211fd9fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d87f5b08faef77769b5965131a6bbc8c2d86c5b51c4561962b902a3056dc9a84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d87f5b08faef77769b5965131a6bbc8c2d86c5b51c4561962b902a3056dc9a84\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-11T00:52:18Z\\\",\\\"message\\\":\\\"Recording success event on pod openshift-multus/multus-additional-cni-plugins-6wcnk\\\\nI1011 00:52:18.608350 6153 lb_config.go:1031] Cluster endpoints for openshift-machine-config-operator/machine-config-operator for network=default are: map[]\\\\nI1011 00:52:18.608301 6153 loadbalancer.go:304] Deleted 0 stale LBs for 
map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/machine-api-operator\\\\\\\"}\\\\nI1011 00:52:18.608367 6153 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI1011 00:52:18.608369 6153 services_controller.go:360] Finished syncing service machine-api-operator on namespace openshift-machine-api for network=default : 1.580511ms\\\\nF1011 00:52:18.608026 6153 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.opens\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-48ljj_openshift-ovn-kubernetes(9ed16b35-862f-47f2-9e32-63c98f868fb8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecec5075382354c47a6ec8f53263fe6c244c7aadfeab1dda60dd2e88cc8724b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f300eaf5d1c09f3637
f07c6d71c96b49e955d8319841834440872f13c061e736\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-48ljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:22Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.876758 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:22Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.896000 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.896049 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.896061 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:22 crc 
kubenswrapper[4743]: I1011 00:52:22.896080 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.896092 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:22Z","lastTransitionTime":"2025-10-11T00:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.901683 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83857db2-dda0-4b62-9336-3393a2c23f3c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a46e8adc9aa9255991b607e98eaf6fc411aeb2a6b0edf45e3e9a7935a2eae5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97b59a1efc62279df4a9766fae0e700a435812160cf3a40a915c967f7a99c9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe9d2bc75aa162a0cab825a1f13cf35f20491a0317000d5a9775977fb7f6b556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf3820ae4ab59d420d2e7f001b2953f751c6dd75f90b5fcc76fdaee8c8df722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f14caad841305b82f04cf14411e2308edbcdd8ea2261ff8401edc95900855ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/
\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://818ca392b918c66a8ca013f3ec5b938595fc78b726af7ae31524c09dbe9302f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://818ca392b918c66a8ca013f3ec5b938595fc78b726af7ae31524c09dbe9302f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44506333e8bd3a20777cf0c382b7652d6d3fa1cdc13f056412cf4899e25115e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44506333e8bd3a20777cf0c382b7652d6d3fa1cdc13f056412cf4899e25115e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e870a1289513fe9e864779c13e63c0991fdaded247b35ac75b7810ab27683ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e870a1289513fe9e864779c13e63c0991fdaded247b35ac75b7810ab27683ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:22Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.917759 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a657286e-3a6d-4265-ae33-a7d2ce8b64ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1558bde7a3b8b01c9ce7f6907ecc43abc6e27587236be7b41487968effa33bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2616059cd6d57ff41cdaa2e1b35e0e8b474398c434022ba6b9e926e19e64c0f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0392332521983f1ea8939df48e05618035ca97895122067daef61ee1fdb7af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b297361a1a2f13aa53d7a5a58948dbe86e8e00f888e6a69cc884154828c29a41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://530c39a66f06ca33cba6a3ab75cb68148425f5a732b503d054b6c74a9d48fabf\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-11T00:52:05Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1011 00:51:59.674497 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1011 00:51:59.678201 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3177302691/tls.crt::/tmp/serving-cert-3177302691/tls.key\\\\\\\"\\\\nI1011 00:52:05.851548 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1011 00:52:05.855373 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1011 00:52:05.855414 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1011 00:52:05.855455 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1011 00:52:05.855472 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1011 00:52:05.866907 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1011 00:52:05.866935 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 00:52:05.866939 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 00:52:05.866944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1011 00:52:05.866947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1011 00:52:05.866951 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1011 00:52:05.866954 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1011 00:52:05.866970 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1011 00:52:05.870518 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1c0227825fa743bb310a1dfb08333cea5ee16a2a2f30b77c97e2f28a2319542\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d411bef56dd0ccdad159fc018c15ad4f0df6adee56150b1143e2dcf442a0fe18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d411bef56dd0ccdad159fc018c15ad4f0
df6adee56150b1143e2dcf442a0fe18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:22Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.932932 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vlxgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d677d122-c8be-4938-8d2c-bde4a088a63a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dbaee9cca1154a094e092eaaf1dea2aeb6a61fbf9323611275485f75ea911e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95k6c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vlxgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:22Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.950802 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-t9nsf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f07b1e57-3c09-4e75-866d-a4292db4e151\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc297cc894f36c7afa9ce15a8e9c71b2d7bf66042c55d0c7686f20b6a3ece7be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c97lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:14Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-t9nsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:22Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.983282 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b02b8636-a5c4-447d-b1cf-401b3dcfa02b-metrics-certs\") pod \"network-metrics-daemon-cb5z5\" (UID: \"b02b8636-a5c4-447d-b1cf-401b3dcfa02b\") " pod="openshift-multus/network-metrics-daemon-cb5z5" Oct 11 00:52:22 crc kubenswrapper[4743]: E1011 00:52:22.983519 4743 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 11 00:52:22 crc kubenswrapper[4743]: E1011 00:52:22.983920 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b02b8636-a5c4-447d-b1cf-401b3dcfa02b-metrics-certs 
podName:b02b8636-a5c4-447d-b1cf-401b3dcfa02b nodeName:}" failed. No retries permitted until 2025-10-11 00:52:23.983842905 +0000 UTC m=+38.636823332 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b02b8636-a5c4-447d-b1cf-401b3dcfa02b-metrics-certs") pod "network-metrics-daemon-cb5z5" (UID: "b02b8636-a5c4-447d-b1cf-401b3dcfa02b") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.999605 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.999661 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.999675 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.999700 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:22 crc kubenswrapper[4743]: I1011 00:52:22.999720 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:22Z","lastTransitionTime":"2025-10-11T00:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:23 crc kubenswrapper[4743]: I1011 00:52:23.103099 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:23 crc kubenswrapper[4743]: I1011 00:52:23.103168 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:23 crc kubenswrapper[4743]: I1011 00:52:23.103187 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:23 crc kubenswrapper[4743]: I1011 00:52:23.103213 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:23 crc kubenswrapper[4743]: I1011 00:52:23.103233 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:23Z","lastTransitionTime":"2025-10-11T00:52:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:23 crc kubenswrapper[4743]: I1011 00:52:23.207008 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:23 crc kubenswrapper[4743]: I1011 00:52:23.207082 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:23 crc kubenswrapper[4743]: I1011 00:52:23.207104 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:23 crc kubenswrapper[4743]: I1011 00:52:23.207217 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:23 crc kubenswrapper[4743]: I1011 00:52:23.207240 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:23Z","lastTransitionTime":"2025-10-11T00:52:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:23 crc kubenswrapper[4743]: I1011 00:52:23.310639 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:23 crc kubenswrapper[4743]: I1011 00:52:23.310708 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:23 crc kubenswrapper[4743]: I1011 00:52:23.310721 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:23 crc kubenswrapper[4743]: I1011 00:52:23.310742 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:23 crc kubenswrapper[4743]: I1011 00:52:23.310755 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:23Z","lastTransitionTime":"2025-10-11T00:52:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:23 crc kubenswrapper[4743]: I1011 00:52:23.413372 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:23 crc kubenswrapper[4743]: I1011 00:52:23.414269 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:23 crc kubenswrapper[4743]: I1011 00:52:23.414409 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:23 crc kubenswrapper[4743]: I1011 00:52:23.414536 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:23 crc kubenswrapper[4743]: I1011 00:52:23.414655 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:23Z","lastTransitionTime":"2025-10-11T00:52:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:23 crc kubenswrapper[4743]: I1011 00:52:23.518289 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:23 crc kubenswrapper[4743]: I1011 00:52:23.518336 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:23 crc kubenswrapper[4743]: I1011 00:52:23.518349 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:23 crc kubenswrapper[4743]: I1011 00:52:23.518371 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:23 crc kubenswrapper[4743]: I1011 00:52:23.518383 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:23Z","lastTransitionTime":"2025-10-11T00:52:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:23 crc kubenswrapper[4743]: I1011 00:52:23.621564 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:23 crc kubenswrapper[4743]: I1011 00:52:23.621636 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:23 crc kubenswrapper[4743]: I1011 00:52:23.621657 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:23 crc kubenswrapper[4743]: I1011 00:52:23.621686 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:23 crc kubenswrapper[4743]: I1011 00:52:23.621707 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:23Z","lastTransitionTime":"2025-10-11T00:52:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:23 crc kubenswrapper[4743]: I1011 00:52:23.725053 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:23 crc kubenswrapper[4743]: I1011 00:52:23.725150 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:23 crc kubenswrapper[4743]: I1011 00:52:23.725168 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:23 crc kubenswrapper[4743]: I1011 00:52:23.725227 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:23 crc kubenswrapper[4743]: I1011 00:52:23.725244 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:23Z","lastTransitionTime":"2025-10-11T00:52:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:23 crc kubenswrapper[4743]: I1011 00:52:23.827934 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:23 crc kubenswrapper[4743]: I1011 00:52:23.828006 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:23 crc kubenswrapper[4743]: I1011 00:52:23.828035 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:23 crc kubenswrapper[4743]: I1011 00:52:23.828073 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:23 crc kubenswrapper[4743]: I1011 00:52:23.828096 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:23Z","lastTransitionTime":"2025-10-11T00:52:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:23 crc kubenswrapper[4743]: I1011 00:52:23.930733 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:23 crc kubenswrapper[4743]: I1011 00:52:23.931157 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:23 crc kubenswrapper[4743]: I1011 00:52:23.931356 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:23 crc kubenswrapper[4743]: I1011 00:52:23.931513 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:23 crc kubenswrapper[4743]: I1011 00:52:23.931637 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:23Z","lastTransitionTime":"2025-10-11T00:52:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:23 crc kubenswrapper[4743]: I1011 00:52:23.994744 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b02b8636-a5c4-447d-b1cf-401b3dcfa02b-metrics-certs\") pod \"network-metrics-daemon-cb5z5\" (UID: \"b02b8636-a5c4-447d-b1cf-401b3dcfa02b\") " pod="openshift-multus/network-metrics-daemon-cb5z5" Oct 11 00:52:23 crc kubenswrapper[4743]: E1011 00:52:23.995039 4743 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 11 00:52:23 crc kubenswrapper[4743]: E1011 00:52:23.995436 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b02b8636-a5c4-447d-b1cf-401b3dcfa02b-metrics-certs podName:b02b8636-a5c4-447d-b1cf-401b3dcfa02b nodeName:}" failed. No retries permitted until 2025-10-11 00:52:25.995397798 +0000 UTC m=+40.648378235 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b02b8636-a5c4-447d-b1cf-401b3dcfa02b-metrics-certs") pod "network-metrics-daemon-cb5z5" (UID: "b02b8636-a5c4-447d-b1cf-401b3dcfa02b") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 11 00:52:24 crc kubenswrapper[4743]: I1011 00:52:24.034682 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:24 crc kubenswrapper[4743]: I1011 00:52:24.034740 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:24 crc kubenswrapper[4743]: I1011 00:52:24.034760 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:24 crc kubenswrapper[4743]: I1011 00:52:24.034791 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:24 crc kubenswrapper[4743]: I1011 00:52:24.034811 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:24Z","lastTransitionTime":"2025-10-11T00:52:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:24 crc kubenswrapper[4743]: I1011 00:52:24.091199 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cb5z5" Oct 11 00:52:24 crc kubenswrapper[4743]: I1011 00:52:24.091219 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 00:52:24 crc kubenswrapper[4743]: E1011 00:52:24.091445 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cb5z5" podUID="b02b8636-a5c4-447d-b1cf-401b3dcfa02b" Oct 11 00:52:24 crc kubenswrapper[4743]: I1011 00:52:24.091533 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 00:52:24 crc kubenswrapper[4743]: I1011 00:52:24.091234 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 00:52:24 crc kubenswrapper[4743]: E1011 00:52:24.091657 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 11 00:52:24 crc kubenswrapper[4743]: E1011 00:52:24.091798 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 11 00:52:24 crc kubenswrapper[4743]: E1011 00:52:24.091976 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 11 00:52:24 crc kubenswrapper[4743]: I1011 00:52:24.138382 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:24 crc kubenswrapper[4743]: I1011 00:52:24.138436 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:24 crc kubenswrapper[4743]: I1011 00:52:24.138456 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:24 crc kubenswrapper[4743]: I1011 00:52:24.138480 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:24 crc kubenswrapper[4743]: I1011 00:52:24.138504 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:24Z","lastTransitionTime":"2025-10-11T00:52:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:24 crc kubenswrapper[4743]: I1011 00:52:24.241430 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:24 crc kubenswrapper[4743]: I1011 00:52:24.241505 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:24 crc kubenswrapper[4743]: I1011 00:52:24.241530 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:24 crc kubenswrapper[4743]: I1011 00:52:24.241587 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:24 crc kubenswrapper[4743]: I1011 00:52:24.241627 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:24Z","lastTransitionTime":"2025-10-11T00:52:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:24 crc kubenswrapper[4743]: I1011 00:52:24.346173 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:24 crc kubenswrapper[4743]: I1011 00:52:24.346233 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:24 crc kubenswrapper[4743]: I1011 00:52:24.346251 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:24 crc kubenswrapper[4743]: I1011 00:52:24.346279 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:24 crc kubenswrapper[4743]: I1011 00:52:24.346300 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:24Z","lastTransitionTime":"2025-10-11T00:52:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:24 crc kubenswrapper[4743]: I1011 00:52:24.448486 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:24 crc kubenswrapper[4743]: I1011 00:52:24.449465 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:24 crc kubenswrapper[4743]: I1011 00:52:24.449770 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:24 crc kubenswrapper[4743]: I1011 00:52:24.450103 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:24 crc kubenswrapper[4743]: I1011 00:52:24.450282 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:24Z","lastTransitionTime":"2025-10-11T00:52:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:24 crc kubenswrapper[4743]: I1011 00:52:24.554516 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:24 crc kubenswrapper[4743]: I1011 00:52:24.554578 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:24 crc kubenswrapper[4743]: I1011 00:52:24.554597 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:24 crc kubenswrapper[4743]: I1011 00:52:24.554622 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:24 crc kubenswrapper[4743]: I1011 00:52:24.554642 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:24Z","lastTransitionTime":"2025-10-11T00:52:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:24 crc kubenswrapper[4743]: I1011 00:52:24.658173 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:24 crc kubenswrapper[4743]: I1011 00:52:24.658230 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:24 crc kubenswrapper[4743]: I1011 00:52:24.658248 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:24 crc kubenswrapper[4743]: I1011 00:52:24.658274 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:24 crc kubenswrapper[4743]: I1011 00:52:24.658291 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:24Z","lastTransitionTime":"2025-10-11T00:52:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:24 crc kubenswrapper[4743]: I1011 00:52:24.761710 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:24 crc kubenswrapper[4743]: I1011 00:52:24.761764 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:24 crc kubenswrapper[4743]: I1011 00:52:24.761783 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:24 crc kubenswrapper[4743]: I1011 00:52:24.761809 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:24 crc kubenswrapper[4743]: I1011 00:52:24.761901 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:24Z","lastTransitionTime":"2025-10-11T00:52:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:24 crc kubenswrapper[4743]: I1011 00:52:24.864611 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:24 crc kubenswrapper[4743]: I1011 00:52:24.864665 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:24 crc kubenswrapper[4743]: I1011 00:52:24.864683 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:24 crc kubenswrapper[4743]: I1011 00:52:24.864705 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:24 crc kubenswrapper[4743]: I1011 00:52:24.864722 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:24Z","lastTransitionTime":"2025-10-11T00:52:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:24 crc kubenswrapper[4743]: I1011 00:52:24.968051 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:24 crc kubenswrapper[4743]: I1011 00:52:24.968114 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:24 crc kubenswrapper[4743]: I1011 00:52:24.968133 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:24 crc kubenswrapper[4743]: I1011 00:52:24.968162 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:24 crc kubenswrapper[4743]: I1011 00:52:24.968184 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:24Z","lastTransitionTime":"2025-10-11T00:52:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:25 crc kubenswrapper[4743]: I1011 00:52:25.071462 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:25 crc kubenswrapper[4743]: I1011 00:52:25.071542 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:25 crc kubenswrapper[4743]: I1011 00:52:25.071562 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:25 crc kubenswrapper[4743]: I1011 00:52:25.071593 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:25 crc kubenswrapper[4743]: I1011 00:52:25.071612 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:25Z","lastTransitionTime":"2025-10-11T00:52:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:25 crc kubenswrapper[4743]: I1011 00:52:25.175143 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:25 crc kubenswrapper[4743]: I1011 00:52:25.175217 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:25 crc kubenswrapper[4743]: I1011 00:52:25.175243 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:25 crc kubenswrapper[4743]: I1011 00:52:25.175278 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:25 crc kubenswrapper[4743]: I1011 00:52:25.175302 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:25Z","lastTransitionTime":"2025-10-11T00:52:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:25 crc kubenswrapper[4743]: I1011 00:52:25.278921 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:25 crc kubenswrapper[4743]: I1011 00:52:25.278984 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:25 crc kubenswrapper[4743]: I1011 00:52:25.279002 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:25 crc kubenswrapper[4743]: I1011 00:52:25.279027 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:25 crc kubenswrapper[4743]: I1011 00:52:25.279045 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:25Z","lastTransitionTime":"2025-10-11T00:52:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:25 crc kubenswrapper[4743]: I1011 00:52:25.382190 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:25 crc kubenswrapper[4743]: I1011 00:52:25.382373 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:25 crc kubenswrapper[4743]: I1011 00:52:25.382456 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:25 crc kubenswrapper[4743]: I1011 00:52:25.382555 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:25 crc kubenswrapper[4743]: I1011 00:52:25.382589 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:25Z","lastTransitionTime":"2025-10-11T00:52:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:25 crc kubenswrapper[4743]: I1011 00:52:25.486396 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:25 crc kubenswrapper[4743]: I1011 00:52:25.486460 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:25 crc kubenswrapper[4743]: I1011 00:52:25.486487 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:25 crc kubenswrapper[4743]: I1011 00:52:25.486520 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:25 crc kubenswrapper[4743]: I1011 00:52:25.486544 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:25Z","lastTransitionTime":"2025-10-11T00:52:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:25 crc kubenswrapper[4743]: I1011 00:52:25.590643 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:25 crc kubenswrapper[4743]: I1011 00:52:25.590714 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:25 crc kubenswrapper[4743]: I1011 00:52:25.590733 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:25 crc kubenswrapper[4743]: I1011 00:52:25.590762 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:25 crc kubenswrapper[4743]: I1011 00:52:25.590783 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:25Z","lastTransitionTime":"2025-10-11T00:52:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:25 crc kubenswrapper[4743]: I1011 00:52:25.694672 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:25 crc kubenswrapper[4743]: I1011 00:52:25.694769 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:25 crc kubenswrapper[4743]: I1011 00:52:25.694796 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:25 crc kubenswrapper[4743]: I1011 00:52:25.694832 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:25 crc kubenswrapper[4743]: I1011 00:52:25.695074 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:25Z","lastTransitionTime":"2025-10-11T00:52:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:25 crc kubenswrapper[4743]: I1011 00:52:25.797969 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:25 crc kubenswrapper[4743]: I1011 00:52:25.798043 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:25 crc kubenswrapper[4743]: I1011 00:52:25.798064 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:25 crc kubenswrapper[4743]: I1011 00:52:25.798093 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:25 crc kubenswrapper[4743]: I1011 00:52:25.798116 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:25Z","lastTransitionTime":"2025-10-11T00:52:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:25 crc kubenswrapper[4743]: I1011 00:52:25.901759 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:25 crc kubenswrapper[4743]: I1011 00:52:25.902126 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:25 crc kubenswrapper[4743]: I1011 00:52:25.902268 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:25 crc kubenswrapper[4743]: I1011 00:52:25.902472 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:25 crc kubenswrapper[4743]: I1011 00:52:25.902620 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:25Z","lastTransitionTime":"2025-10-11T00:52:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.005985 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.006055 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.006074 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.006101 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.006123 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:26Z","lastTransitionTime":"2025-10-11T00:52:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.020981 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b02b8636-a5c4-447d-b1cf-401b3dcfa02b-metrics-certs\") pod \"network-metrics-daemon-cb5z5\" (UID: \"b02b8636-a5c4-447d-b1cf-401b3dcfa02b\") " pod="openshift-multus/network-metrics-daemon-cb5z5" Oct 11 00:52:26 crc kubenswrapper[4743]: E1011 00:52:26.021343 4743 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 11 00:52:26 crc kubenswrapper[4743]: E1011 00:52:26.021573 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b02b8636-a5c4-447d-b1cf-401b3dcfa02b-metrics-certs podName:b02b8636-a5c4-447d-b1cf-401b3dcfa02b nodeName:}" failed. No retries permitted until 2025-10-11 00:52:30.021542122 +0000 UTC m=+44.674522559 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b02b8636-a5c4-447d-b1cf-401b3dcfa02b-metrics-certs") pod "network-metrics-daemon-cb5z5" (UID: "b02b8636-a5c4-447d-b1cf-401b3dcfa02b") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.091484 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.091558 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cb5z5" Oct 11 00:52:26 crc kubenswrapper[4743]: E1011 00:52:26.091894 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 11 00:52:26 crc kubenswrapper[4743]: E1011 00:52:26.091995 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cb5z5" podUID="b02b8636-a5c4-447d-b1cf-401b3dcfa02b" Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.092192 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.092323 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 00:52:26 crc kubenswrapper[4743]: E1011 00:52:26.092473 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 11 00:52:26 crc kubenswrapper[4743]: E1011 00:52:26.092716 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.109818 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.109910 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.109928 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.109954 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.109973 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:26Z","lastTransitionTime":"2025-10-11T00:52:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.125730 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83857db2-dda0-4b62-9336-3393a2c23f3c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a46e8adc9aa9255991b607e98eaf6fc411aeb2a6b0edf45e3e9a7935a2eae5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97b59a1efc62279df4a9766fae0e700a435812160cf3a40a915c967f7a99c9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe9d2bc75aa162a0cab825a1f13cf35f20491a0317000d5a9775977fb7f6b556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf3820ae4ab59d420d2e7f001b2953f751c6dd75f90b5fcc76fdaee8c8df722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f14caad841305b82f04cf14411e2308edbcdd8ea2261ff8401edc95900855ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://818ca392b918c66a8ca013f3ec5b938595fc78b726af7ae31524c09dbe9302f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://818ca392b918c66a8ca013f3ec5b938595fc78b726af7ae31524c09dbe9302f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44506333e8bd3a20777cf0c382b7652d6d3fa1cdc13f056412cf4899e25115e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44506333e8bd3a20777cf0c382b7652d6d3fa1cdc13f056412cf4899e25115e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e870a1289513fe9e864779c13e63c0991fdaded247b35ac75b7810ab27683ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e870a1289513fe9e864779c13e63c0991fdaded247b35ac75b7810ab27683ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-11T00:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:26Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.149821 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a657286e-3a6d-4265-ae33-a7d2ce8b64ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1558bde7a3b8b01c9ce7f6907ecc43abc6e27587236be7b41487968effa33bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2616059cd6d57ff41cdaa2e1b35e0e8b474398c434022ba6b9e926e19e64c0f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0392332521983f1ea8939df48e05618035ca97895122067daef61ee1fdb7af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b297361a1a2f13aa53d7a5a58948dbe86e8e00f888e6a69cc884154828c29a41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://530c39a66f06ca33cba6a3ab75cb68148425f5a732b503d054b6c74a9d48fabf\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-11T00:52:05Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1011 00:51:59.674497 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1011 00:51:59.678201 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3177302691/tls.crt::/tmp/serving-cert-3177302691/tls.key\\\\\\\"\\\\nI1011 00:52:05.851548 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1011 00:52:05.855373 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1011 00:52:05.855414 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1011 00:52:05.855455 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1011 00:52:05.855472 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1011 00:52:05.866907 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1011 00:52:05.866935 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 00:52:05.866939 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 00:52:05.866944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1011 00:52:05.866947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1011 00:52:05.866951 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1011 00:52:05.866954 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1011 00:52:05.866970 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1011 00:52:05.870518 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1c0227825fa743bb310a1dfb08333cea5ee16a2a2f30b77c97e2f28a2319542\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d411bef56dd0ccdad159fc018c15ad4f0df6adee56150b1143e2dcf442a0fe18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d411bef56dd0ccdad159fc018c15ad4f0
df6adee56150b1143e2dcf442a0fe18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:26Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.168270 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vlxgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d677d122-c8be-4938-8d2c-bde4a088a63a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dbaee9cca1154a094e092eaaf1dea2aeb6a61fbf9323611275485f75ea911e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95k6c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vlxgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:26Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.186423 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-t9nsf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f07b1e57-3c09-4e75-866d-a4292db4e151\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc297cc894f36c7afa9ce15a8e9c71b2d7bf66042c55d0c7686f20b6a3ece7be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c97lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:14Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-t9nsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:26Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.204468 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cd23d7ad070c4d2f029567ac2186d797b082b8da8ff2c9811982a1c5c7b0ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:26Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.211601 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.211677 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.211701 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.211735 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.211755 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:26Z","lastTransitionTime":"2025-10-11T00:52:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.227267 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bca7230a67c579c0fc54921d292f52e18ce2c25f79cd4a351a9f6b576df3553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab9cd20bc5b28791e21984a0edf4fbf50acc05d6160780d0696698715242b5d0\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:26Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.246309 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ptjnt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cb3d634-b381-4ee0-a819-ce6f87fa8afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c47ab12f326ce6568bfd06d8beebdeffc07d464b747a288a14793db096ccc6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4dnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b62556627f75a8ed41c68a7fb1982461fd9fb
35965014b72ec15492e763e42b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4dnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ptjnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:26Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.262703 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cb5z5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b02b8636-a5c4-447d-b1cf-401b3dcfa02b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2zdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2zdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:22Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cb5z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:26Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:26 crc 
kubenswrapper[4743]: I1011 00:52:26.277684 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc432b8f-1fc8-4b7d-a202-bf1452921fc3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ed857549a3f84b624b8bf411ac4359bad74bbf8eea28bda8ebeda3b12dd34e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://748faceccab50f961795c16fb7c31665af75d570094d1dd2c765bb2215b65c31\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc8d989925344208a60f51034c7c486de0165d5c5a9a1f966df45ac5685c89f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34de48136b54818c60eb929d4b5374689378190620a6502448759f172856d6af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:26Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.297899 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:26Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.314491 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.314547 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:26 crc 
kubenswrapper[4743]: I1011 00:52:26.314566 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.314595 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.314613 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:26Z","lastTransitionTime":"2025-10-11T00:52:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.317986 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9jfxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8c603f4-717c-4554-992a-8338b3bef24d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853ca9eee244aad8c7006298d6e541ea07022917bee14083b2ded233009257ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5b2pr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9jfxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:26Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.337517 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:26Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.357582 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd049f7c9c78e28d7cfdeb7087cf419e5db915fecd46a83ae0bdc8317fff9728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-11T00:52:26Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.378453 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:26Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.395943 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"add92263-e252-446b-95de-092585b4357f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be9668b466043d0e86521a2ea925416a4987cb051ca0ab8764fd0fd3095991f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d42sg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16b18bcf80747537e2a7a31adc90eec7fdba85d8
166f8cfc53709a1dba33de8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d42sg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cvm72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:26Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.417773 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.417891 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.417919 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:26 crc 
kubenswrapper[4743]: I1011 00:52:26.417955 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.417984 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:26Z","lastTransitionTime":"2025-10-11T00:52:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.424453 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6wcnk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06a7b971-8779-491c-8d3f-e7d5b4d60968\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38b5a40ddd5390cc392fcf2240304a72163dd797511640ada5ad39a59c0f4283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
eaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8adfcc8627f1a6ce588d50eabcc56c6814c3cff75d50144def98466d2dabb69a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8adfcc8627f1a6ce588d50eabcc56c6814c3cff75d50144def98466d2dabb69a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://800cdd03b73991fe9c1cfbe9a392b5f08854ca2f5f9e5106022bb67307578ad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://800cdd03b73991fe9c1cfbe9a392b5f08854ca2f5f9e5106022bb67307578ad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49661bf59f8e1d64e5980f0ed00dcc50c4c2ab52cc2232ade890883f0d0158d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49661bf59f8e1d64e5980f0ed00dcc50c4c2ab52cc2232ade890883f0d0158d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27d3d6d9f65eb7b710148bc4e077c74a15207b637727163afe631acfe3ec90f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27d3d6d9f65eb7b710148bc4e077c74a15207b637727163afe631acfe3ec90f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00edafd52ba2a97f21af491b5233bc2ad945e6b77106eb94a947b777a0129cfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00edafd52ba2a97f21af491b5233bc2ad945e6b77106eb94a947b777a0129cfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://276d1e264f541d55720a57945800ec969959c0828071dcda4fb1c0f13b17118b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27
6d1e264f541d55720a57945800ec969959c0828071dcda4fb1c0f13b17118b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6wcnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:26Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.466470 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ed16b35-862f-47f2-9e32-63c98f868fb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d54b05ec17db8c8c6c3ff8e4d703de8f1f1f0d7e9b5b0133afdf30e3c2b8acee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4127195cac8b12296c2aaefc500beb5f746bee518f3885c094e7bca83ade5e6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://487448bc7b1826df7dbc3aeac42ad10d6aed3390c27c95dfda1b81fe09e27f35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acfe8e798b39769fe2643c6f761b388dc220ccfc707f0023dd680196514b8885\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17db7cfee85ea1051eba348aa0ec7bc8e1354c49e3e4740745a51d3db0ffcf63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d385e354cb831573c89886ade9127287f0974b7a7c244ead9e5547e0211fd9fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d87f5b08faef77769b5965131a6bbc8c2d86c5b51c4561962b902a3056dc9a84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d87f5b08faef77769b5965131a6bbc8c2d86c5b51c4561962b902a3056dc9a84\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-11T00:52:18Z\\\",\\\"message\\\":\\\"Recording success event on pod openshift-multus/multus-additional-cni-plugins-6wcnk\\\\nI1011 00:52:18.608350 6153 lb_config.go:1031] Cluster endpoints for openshift-machine-config-operator/machine-config-operator for network=default are: map[]\\\\nI1011 00:52:18.608301 6153 loadbalancer.go:304] Deleted 0 stale LBs for 
map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/machine-api-operator\\\\\\\"}\\\\nI1011 00:52:18.608367 6153 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI1011 00:52:18.608369 6153 services_controller.go:360] Finished syncing service machine-api-operator on namespace openshift-machine-api for network=default : 1.580511ms\\\\nF1011 00:52:18.608026 6153 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.opens\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-48ljj_openshift-ovn-kubernetes(9ed16b35-862f-47f2-9e32-63c98f868fb8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecec5075382354c47a6ec8f53263fe6c244c7aadfeab1dda60dd2e88cc8724b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f300eaf5d1c09f3637
f07c6d71c96b49e955d8319841834440872f13c061e736\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-48ljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:26Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.521246 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.521299 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.521319 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.521346 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.521366 4743 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:26Z","lastTransitionTime":"2025-10-11T00:52:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.624717 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.624784 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.624802 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.624831 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.624852 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:26Z","lastTransitionTime":"2025-10-11T00:52:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.728711 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.728810 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.728838 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.728916 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.728942 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:26Z","lastTransitionTime":"2025-10-11T00:52:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.831664 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.831715 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.831727 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.831745 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.831760 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:26Z","lastTransitionTime":"2025-10-11T00:52:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.844570 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.844639 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.844661 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.844692 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.844714 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:26Z","lastTransitionTime":"2025-10-11T00:52:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:26 crc kubenswrapper[4743]: E1011 00:52:26.860885 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4a117022-09d2-46e0-826b-22308ec25890\\\",\\\"systemUUID\\\":\\\"407eb137-47d1-41e8-9c72-65f09e76d21a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:26Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.866547 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.866622 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.866638 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.866661 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.866698 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:26Z","lastTransitionTime":"2025-10-11T00:52:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:26 crc kubenswrapper[4743]: E1011 00:52:26.888993 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4a117022-09d2-46e0-826b-22308ec25890\\\",\\\"systemUUID\\\":\\\"407eb137-47d1-41e8-9c72-65f09e76d21a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:26Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.894351 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.894446 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.894482 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.894530 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.894554 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:26Z","lastTransitionTime":"2025-10-11T00:52:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:26 crc kubenswrapper[4743]: E1011 00:52:26.919232 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4a117022-09d2-46e0-826b-22308ec25890\\\",\\\"systemUUID\\\":\\\"407eb137-47d1-41e8-9c72-65f09e76d21a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:26Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.924032 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.924162 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.924183 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.924211 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.924232 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:26Z","lastTransitionTime":"2025-10-11T00:52:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:26 crc kubenswrapper[4743]: E1011 00:52:26.938732 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4a117022-09d2-46e0-826b-22308ec25890\\\",\\\"systemUUID\\\":\\\"407eb137-47d1-41e8-9c72-65f09e76d21a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:26Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.942930 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.943052 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.943105 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.943132 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.943183 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:26Z","lastTransitionTime":"2025-10-11T00:52:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:26 crc kubenswrapper[4743]: E1011 00:52:26.956269 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4a117022-09d2-46e0-826b-22308ec25890\\\",\\\"systemUUID\\\":\\\"407eb137-47d1-41e8-9c72-65f09e76d21a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:26Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:26 crc kubenswrapper[4743]: E1011 00:52:26.956472 4743 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.958997 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.959084 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.959135 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.959158 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:26 crc kubenswrapper[4743]: I1011 00:52:26.959174 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:26Z","lastTransitionTime":"2025-10-11T00:52:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:27 crc kubenswrapper[4743]: I1011 00:52:27.062708 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:27 crc kubenswrapper[4743]: I1011 00:52:27.062742 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:27 crc kubenswrapper[4743]: I1011 00:52:27.062756 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:27 crc kubenswrapper[4743]: I1011 00:52:27.062772 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:27 crc kubenswrapper[4743]: I1011 00:52:27.062785 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:27Z","lastTransitionTime":"2025-10-11T00:52:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:27 crc kubenswrapper[4743]: I1011 00:52:27.166581 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:27 crc kubenswrapper[4743]: I1011 00:52:27.166658 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:27 crc kubenswrapper[4743]: I1011 00:52:27.166678 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:27 crc kubenswrapper[4743]: I1011 00:52:27.166710 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:27 crc kubenswrapper[4743]: I1011 00:52:27.166731 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:27Z","lastTransitionTime":"2025-10-11T00:52:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:27 crc kubenswrapper[4743]: I1011 00:52:27.269714 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:27 crc kubenswrapper[4743]: I1011 00:52:27.269786 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:27 crc kubenswrapper[4743]: I1011 00:52:27.269807 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:27 crc kubenswrapper[4743]: I1011 00:52:27.269838 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:27 crc kubenswrapper[4743]: I1011 00:52:27.269887 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:27Z","lastTransitionTime":"2025-10-11T00:52:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:27 crc kubenswrapper[4743]: I1011 00:52:27.373173 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:27 crc kubenswrapper[4743]: I1011 00:52:27.373240 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:27 crc kubenswrapper[4743]: I1011 00:52:27.373258 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:27 crc kubenswrapper[4743]: I1011 00:52:27.373286 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:27 crc kubenswrapper[4743]: I1011 00:52:27.373310 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:27Z","lastTransitionTime":"2025-10-11T00:52:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:27 crc kubenswrapper[4743]: I1011 00:52:27.477584 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:27 crc kubenswrapper[4743]: I1011 00:52:27.477654 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:27 crc kubenswrapper[4743]: I1011 00:52:27.477673 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:27 crc kubenswrapper[4743]: I1011 00:52:27.477702 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:27 crc kubenswrapper[4743]: I1011 00:52:27.477723 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:27Z","lastTransitionTime":"2025-10-11T00:52:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:27 crc kubenswrapper[4743]: I1011 00:52:27.583106 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:27 crc kubenswrapper[4743]: I1011 00:52:27.583161 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:27 crc kubenswrapper[4743]: I1011 00:52:27.583177 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:27 crc kubenswrapper[4743]: I1011 00:52:27.583204 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:27 crc kubenswrapper[4743]: I1011 00:52:27.583221 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:27Z","lastTransitionTime":"2025-10-11T00:52:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:27 crc kubenswrapper[4743]: I1011 00:52:27.686020 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:27 crc kubenswrapper[4743]: I1011 00:52:27.686094 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:27 crc kubenswrapper[4743]: I1011 00:52:27.686116 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:27 crc kubenswrapper[4743]: I1011 00:52:27.686147 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:27 crc kubenswrapper[4743]: I1011 00:52:27.686165 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:27Z","lastTransitionTime":"2025-10-11T00:52:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:27 crc kubenswrapper[4743]: I1011 00:52:27.790154 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:27 crc kubenswrapper[4743]: I1011 00:52:27.790252 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:27 crc kubenswrapper[4743]: I1011 00:52:27.790281 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:27 crc kubenswrapper[4743]: I1011 00:52:27.790313 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:27 crc kubenswrapper[4743]: I1011 00:52:27.790333 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:27Z","lastTransitionTime":"2025-10-11T00:52:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:27 crc kubenswrapper[4743]: I1011 00:52:27.893520 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:27 crc kubenswrapper[4743]: I1011 00:52:27.893594 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:27 crc kubenswrapper[4743]: I1011 00:52:27.893619 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:27 crc kubenswrapper[4743]: I1011 00:52:27.893651 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:27 crc kubenswrapper[4743]: I1011 00:52:27.893672 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:27Z","lastTransitionTime":"2025-10-11T00:52:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:27 crc kubenswrapper[4743]: I1011 00:52:27.997619 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:27 crc kubenswrapper[4743]: I1011 00:52:27.997689 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:27 crc kubenswrapper[4743]: I1011 00:52:27.997709 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:27 crc kubenswrapper[4743]: I1011 00:52:27.997737 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:27 crc kubenswrapper[4743]: I1011 00:52:27.997759 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:27Z","lastTransitionTime":"2025-10-11T00:52:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:28 crc kubenswrapper[4743]: I1011 00:52:28.091749 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 00:52:28 crc kubenswrapper[4743]: I1011 00:52:28.091841 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 00:52:28 crc kubenswrapper[4743]: I1011 00:52:28.091938 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 00:52:28 crc kubenswrapper[4743]: I1011 00:52:28.092200 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cb5z5" Oct 11 00:52:28 crc kubenswrapper[4743]: E1011 00:52:28.092169 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 11 00:52:28 crc kubenswrapper[4743]: E1011 00:52:28.092331 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 11 00:52:28 crc kubenswrapper[4743]: E1011 00:52:28.092515 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cb5z5" podUID="b02b8636-a5c4-447d-b1cf-401b3dcfa02b" Oct 11 00:52:28 crc kubenswrapper[4743]: E1011 00:52:28.092731 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 11 00:52:28 crc kubenswrapper[4743]: I1011 00:52:28.100613 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:28 crc kubenswrapper[4743]: I1011 00:52:28.100689 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:28 crc kubenswrapper[4743]: I1011 00:52:28.100708 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:28 crc kubenswrapper[4743]: I1011 00:52:28.100738 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:28 crc kubenswrapper[4743]: I1011 00:52:28.100760 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:28Z","lastTransitionTime":"2025-10-11T00:52:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:28 crc kubenswrapper[4743]: I1011 00:52:28.204707 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:28 crc kubenswrapper[4743]: I1011 00:52:28.204776 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:28 crc kubenswrapper[4743]: I1011 00:52:28.204795 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:28 crc kubenswrapper[4743]: I1011 00:52:28.204823 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:28 crc kubenswrapper[4743]: I1011 00:52:28.204843 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:28Z","lastTransitionTime":"2025-10-11T00:52:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:28 crc kubenswrapper[4743]: I1011 00:52:28.309003 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:28 crc kubenswrapper[4743]: I1011 00:52:28.309078 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:28 crc kubenswrapper[4743]: I1011 00:52:28.309102 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:28 crc kubenswrapper[4743]: I1011 00:52:28.309137 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:28 crc kubenswrapper[4743]: I1011 00:52:28.309161 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:28Z","lastTransitionTime":"2025-10-11T00:52:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:28 crc kubenswrapper[4743]: I1011 00:52:28.412804 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:28 crc kubenswrapper[4743]: I1011 00:52:28.412928 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:28 crc kubenswrapper[4743]: I1011 00:52:28.413018 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:28 crc kubenswrapper[4743]: I1011 00:52:28.413056 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:28 crc kubenswrapper[4743]: I1011 00:52:28.413163 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:28Z","lastTransitionTime":"2025-10-11T00:52:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:28 crc kubenswrapper[4743]: I1011 00:52:28.517374 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:28 crc kubenswrapper[4743]: I1011 00:52:28.517478 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:28 crc kubenswrapper[4743]: I1011 00:52:28.517506 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:28 crc kubenswrapper[4743]: I1011 00:52:28.517544 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:28 crc kubenswrapper[4743]: I1011 00:52:28.517570 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:28Z","lastTransitionTime":"2025-10-11T00:52:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:28 crc kubenswrapper[4743]: I1011 00:52:28.622705 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:28 crc kubenswrapper[4743]: I1011 00:52:28.622788 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:28 crc kubenswrapper[4743]: I1011 00:52:28.622806 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:28 crc kubenswrapper[4743]: I1011 00:52:28.622839 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:28 crc kubenswrapper[4743]: I1011 00:52:28.622913 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:28Z","lastTransitionTime":"2025-10-11T00:52:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:28 crc kubenswrapper[4743]: I1011 00:52:28.726753 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:28 crc kubenswrapper[4743]: I1011 00:52:28.726818 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:28 crc kubenswrapper[4743]: I1011 00:52:28.726837 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:28 crc kubenswrapper[4743]: I1011 00:52:28.726911 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:28 crc kubenswrapper[4743]: I1011 00:52:28.726939 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:28Z","lastTransitionTime":"2025-10-11T00:52:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:28 crc kubenswrapper[4743]: I1011 00:52:28.830575 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:28 crc kubenswrapper[4743]: I1011 00:52:28.830655 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:28 crc kubenswrapper[4743]: I1011 00:52:28.830679 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:28 crc kubenswrapper[4743]: I1011 00:52:28.830710 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:28 crc kubenswrapper[4743]: I1011 00:52:28.830730 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:28Z","lastTransitionTime":"2025-10-11T00:52:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:28 crc kubenswrapper[4743]: I1011 00:52:28.934917 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:28 crc kubenswrapper[4743]: I1011 00:52:28.934975 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:28 crc kubenswrapper[4743]: I1011 00:52:28.934984 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:28 crc kubenswrapper[4743]: I1011 00:52:28.935003 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:28 crc kubenswrapper[4743]: I1011 00:52:28.935015 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:28Z","lastTransitionTime":"2025-10-11T00:52:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:29 crc kubenswrapper[4743]: I1011 00:52:29.038907 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:29 crc kubenswrapper[4743]: I1011 00:52:29.039009 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:29 crc kubenswrapper[4743]: I1011 00:52:29.039034 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:29 crc kubenswrapper[4743]: I1011 00:52:29.039072 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:29 crc kubenswrapper[4743]: I1011 00:52:29.039100 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:29Z","lastTransitionTime":"2025-10-11T00:52:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:29 crc kubenswrapper[4743]: I1011 00:52:29.142491 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:29 crc kubenswrapper[4743]: I1011 00:52:29.142570 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:29 crc kubenswrapper[4743]: I1011 00:52:29.142594 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:29 crc kubenswrapper[4743]: I1011 00:52:29.142634 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:29 crc kubenswrapper[4743]: I1011 00:52:29.142657 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:29Z","lastTransitionTime":"2025-10-11T00:52:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:29 crc kubenswrapper[4743]: I1011 00:52:29.246012 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:29 crc kubenswrapper[4743]: I1011 00:52:29.246083 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:29 crc kubenswrapper[4743]: I1011 00:52:29.246134 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:29 crc kubenswrapper[4743]: I1011 00:52:29.246161 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:29 crc kubenswrapper[4743]: I1011 00:52:29.246182 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:29Z","lastTransitionTime":"2025-10-11T00:52:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:29 crc kubenswrapper[4743]: I1011 00:52:29.349483 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:29 crc kubenswrapper[4743]: I1011 00:52:29.349550 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:29 crc kubenswrapper[4743]: I1011 00:52:29.349568 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:29 crc kubenswrapper[4743]: I1011 00:52:29.349670 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:29 crc kubenswrapper[4743]: I1011 00:52:29.349694 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:29Z","lastTransitionTime":"2025-10-11T00:52:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:29 crc kubenswrapper[4743]: I1011 00:52:29.452676 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:29 crc kubenswrapper[4743]: I1011 00:52:29.452748 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:29 crc kubenswrapper[4743]: I1011 00:52:29.452767 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:29 crc kubenswrapper[4743]: I1011 00:52:29.452795 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:29 crc kubenswrapper[4743]: I1011 00:52:29.452817 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:29Z","lastTransitionTime":"2025-10-11T00:52:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:29 crc kubenswrapper[4743]: I1011 00:52:29.555929 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:29 crc kubenswrapper[4743]: I1011 00:52:29.555996 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:29 crc kubenswrapper[4743]: I1011 00:52:29.556016 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:29 crc kubenswrapper[4743]: I1011 00:52:29.556049 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:29 crc kubenswrapper[4743]: I1011 00:52:29.556075 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:29Z","lastTransitionTime":"2025-10-11T00:52:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:29 crc kubenswrapper[4743]: I1011 00:52:29.658547 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:29 crc kubenswrapper[4743]: I1011 00:52:29.658627 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:29 crc kubenswrapper[4743]: I1011 00:52:29.658645 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:29 crc kubenswrapper[4743]: I1011 00:52:29.658720 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:29 crc kubenswrapper[4743]: I1011 00:52:29.658744 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:29Z","lastTransitionTime":"2025-10-11T00:52:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:29 crc kubenswrapper[4743]: I1011 00:52:29.761954 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:29 crc kubenswrapper[4743]: I1011 00:52:29.762028 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:29 crc kubenswrapper[4743]: I1011 00:52:29.762050 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:29 crc kubenswrapper[4743]: I1011 00:52:29.762079 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:29 crc kubenswrapper[4743]: I1011 00:52:29.762103 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:29Z","lastTransitionTime":"2025-10-11T00:52:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:29 crc kubenswrapper[4743]: I1011 00:52:29.864979 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:29 crc kubenswrapper[4743]: I1011 00:52:29.865039 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:29 crc kubenswrapper[4743]: I1011 00:52:29.865052 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:29 crc kubenswrapper[4743]: I1011 00:52:29.865072 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:29 crc kubenswrapper[4743]: I1011 00:52:29.865091 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:29Z","lastTransitionTime":"2025-10-11T00:52:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:29 crc kubenswrapper[4743]: I1011 00:52:29.968642 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:29 crc kubenswrapper[4743]: I1011 00:52:29.968683 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:29 crc kubenswrapper[4743]: I1011 00:52:29.968693 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:29 crc kubenswrapper[4743]: I1011 00:52:29.968712 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:29 crc kubenswrapper[4743]: I1011 00:52:29.968723 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:29Z","lastTransitionTime":"2025-10-11T00:52:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:30 crc kubenswrapper[4743]: I1011 00:52:30.070543 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b02b8636-a5c4-447d-b1cf-401b3dcfa02b-metrics-certs\") pod \"network-metrics-daemon-cb5z5\" (UID: \"b02b8636-a5c4-447d-b1cf-401b3dcfa02b\") " pod="openshift-multus/network-metrics-daemon-cb5z5" Oct 11 00:52:30 crc kubenswrapper[4743]: E1011 00:52:30.070969 4743 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 11 00:52:30 crc kubenswrapper[4743]: E1011 00:52:30.071143 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b02b8636-a5c4-447d-b1cf-401b3dcfa02b-metrics-certs podName:b02b8636-a5c4-447d-b1cf-401b3dcfa02b nodeName:}" failed. No retries permitted until 2025-10-11 00:52:38.071093679 +0000 UTC m=+52.724074296 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b02b8636-a5c4-447d-b1cf-401b3dcfa02b-metrics-certs") pod "network-metrics-daemon-cb5z5" (UID: "b02b8636-a5c4-447d-b1cf-401b3dcfa02b") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 11 00:52:30 crc kubenswrapper[4743]: I1011 00:52:30.071751 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:30 crc kubenswrapper[4743]: I1011 00:52:30.071809 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:30 crc kubenswrapper[4743]: I1011 00:52:30.071826 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:30 crc kubenswrapper[4743]: I1011 00:52:30.071850 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:30 crc kubenswrapper[4743]: I1011 00:52:30.071908 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:30Z","lastTransitionTime":"2025-10-11T00:52:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:30 crc kubenswrapper[4743]: I1011 00:52:30.090984 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 00:52:30 crc kubenswrapper[4743]: I1011 00:52:30.091062 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cb5z5" Oct 11 00:52:30 crc kubenswrapper[4743]: I1011 00:52:30.091001 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 00:52:30 crc kubenswrapper[4743]: E1011 00:52:30.091193 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 11 00:52:30 crc kubenswrapper[4743]: I1011 00:52:30.090989 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 00:52:30 crc kubenswrapper[4743]: E1011 00:52:30.091354 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cb5z5" podUID="b02b8636-a5c4-447d-b1cf-401b3dcfa02b" Oct 11 00:52:30 crc kubenswrapper[4743]: E1011 00:52:30.091459 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 11 00:52:30 crc kubenswrapper[4743]: E1011 00:52:30.091641 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 11 00:52:30 crc kubenswrapper[4743]: I1011 00:52:30.174760 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:30 crc kubenswrapper[4743]: I1011 00:52:30.174819 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:30 crc kubenswrapper[4743]: I1011 00:52:30.174839 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:30 crc kubenswrapper[4743]: I1011 00:52:30.174891 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:30 crc kubenswrapper[4743]: I1011 00:52:30.174911 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:30Z","lastTransitionTime":"2025-10-11T00:52:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:30 crc kubenswrapper[4743]: I1011 00:52:30.279302 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:30 crc kubenswrapper[4743]: I1011 00:52:30.279369 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:30 crc kubenswrapper[4743]: I1011 00:52:30.279387 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:30 crc kubenswrapper[4743]: I1011 00:52:30.279412 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:30 crc kubenswrapper[4743]: I1011 00:52:30.279430 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:30Z","lastTransitionTime":"2025-10-11T00:52:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:30 crc kubenswrapper[4743]: I1011 00:52:30.382538 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:30 crc kubenswrapper[4743]: I1011 00:52:30.382595 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:30 crc kubenswrapper[4743]: I1011 00:52:30.382613 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:30 crc kubenswrapper[4743]: I1011 00:52:30.382639 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:30 crc kubenswrapper[4743]: I1011 00:52:30.382658 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:30Z","lastTransitionTime":"2025-10-11T00:52:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:30 crc kubenswrapper[4743]: I1011 00:52:30.486077 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:30 crc kubenswrapper[4743]: I1011 00:52:30.486148 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:30 crc kubenswrapper[4743]: I1011 00:52:30.486169 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:30 crc kubenswrapper[4743]: I1011 00:52:30.486202 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:30 crc kubenswrapper[4743]: I1011 00:52:30.486225 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:30Z","lastTransitionTime":"2025-10-11T00:52:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:30 crc kubenswrapper[4743]: I1011 00:52:30.589348 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:30 crc kubenswrapper[4743]: I1011 00:52:30.589428 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:30 crc kubenswrapper[4743]: I1011 00:52:30.589446 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:30 crc kubenswrapper[4743]: I1011 00:52:30.589472 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:30 crc kubenswrapper[4743]: I1011 00:52:30.589491 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:30Z","lastTransitionTime":"2025-10-11T00:52:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:30 crc kubenswrapper[4743]: I1011 00:52:30.692638 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:30 crc kubenswrapper[4743]: I1011 00:52:30.692685 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:30 crc kubenswrapper[4743]: I1011 00:52:30.692698 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:30 crc kubenswrapper[4743]: I1011 00:52:30.692717 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:30 crc kubenswrapper[4743]: I1011 00:52:30.692731 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:30Z","lastTransitionTime":"2025-10-11T00:52:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:30 crc kubenswrapper[4743]: I1011 00:52:30.795787 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:30 crc kubenswrapper[4743]: I1011 00:52:30.795848 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:30 crc kubenswrapper[4743]: I1011 00:52:30.795894 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:30 crc kubenswrapper[4743]: I1011 00:52:30.795924 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:30 crc kubenswrapper[4743]: I1011 00:52:30.795942 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:30Z","lastTransitionTime":"2025-10-11T00:52:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:30 crc kubenswrapper[4743]: I1011 00:52:30.899786 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:30 crc kubenswrapper[4743]: I1011 00:52:30.899842 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:30 crc kubenswrapper[4743]: I1011 00:52:30.899895 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:30 crc kubenswrapper[4743]: I1011 00:52:30.899923 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:30 crc kubenswrapper[4743]: I1011 00:52:30.899940 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:30Z","lastTransitionTime":"2025-10-11T00:52:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:31 crc kubenswrapper[4743]: I1011 00:52:31.003321 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:31 crc kubenswrapper[4743]: I1011 00:52:31.003386 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:31 crc kubenswrapper[4743]: I1011 00:52:31.003405 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:31 crc kubenswrapper[4743]: I1011 00:52:31.003434 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:31 crc kubenswrapper[4743]: I1011 00:52:31.003453 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:31Z","lastTransitionTime":"2025-10-11T00:52:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:31 crc kubenswrapper[4743]: I1011 00:52:31.107664 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:31 crc kubenswrapper[4743]: I1011 00:52:31.107738 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:31 crc kubenswrapper[4743]: I1011 00:52:31.107755 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:31 crc kubenswrapper[4743]: I1011 00:52:31.107782 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:31 crc kubenswrapper[4743]: I1011 00:52:31.107799 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:31Z","lastTransitionTime":"2025-10-11T00:52:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:31 crc kubenswrapper[4743]: I1011 00:52:31.211709 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:31 crc kubenswrapper[4743]: I1011 00:52:31.211760 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:31 crc kubenswrapper[4743]: I1011 00:52:31.211778 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:31 crc kubenswrapper[4743]: I1011 00:52:31.211801 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:31 crc kubenswrapper[4743]: I1011 00:52:31.211818 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:31Z","lastTransitionTime":"2025-10-11T00:52:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:31 crc kubenswrapper[4743]: I1011 00:52:31.314815 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:31 crc kubenswrapper[4743]: I1011 00:52:31.314901 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:31 crc kubenswrapper[4743]: I1011 00:52:31.314945 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:31 crc kubenswrapper[4743]: I1011 00:52:31.314968 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:31 crc kubenswrapper[4743]: I1011 00:52:31.314985 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:31Z","lastTransitionTime":"2025-10-11T00:52:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:31 crc kubenswrapper[4743]: I1011 00:52:31.418691 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:31 crc kubenswrapper[4743]: I1011 00:52:31.418760 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:31 crc kubenswrapper[4743]: I1011 00:52:31.418780 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:31 crc kubenswrapper[4743]: I1011 00:52:31.418806 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:31 crc kubenswrapper[4743]: I1011 00:52:31.418824 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:31Z","lastTransitionTime":"2025-10-11T00:52:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:31 crc kubenswrapper[4743]: I1011 00:52:31.522313 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:31 crc kubenswrapper[4743]: I1011 00:52:31.522376 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:31 crc kubenswrapper[4743]: I1011 00:52:31.522394 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:31 crc kubenswrapper[4743]: I1011 00:52:31.522424 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:31 crc kubenswrapper[4743]: I1011 00:52:31.522443 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:31Z","lastTransitionTime":"2025-10-11T00:52:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:31 crc kubenswrapper[4743]: I1011 00:52:31.626341 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:31 crc kubenswrapper[4743]: I1011 00:52:31.626404 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:31 crc kubenswrapper[4743]: I1011 00:52:31.626424 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:31 crc kubenswrapper[4743]: I1011 00:52:31.626457 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:31 crc kubenswrapper[4743]: I1011 00:52:31.626477 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:31Z","lastTransitionTime":"2025-10-11T00:52:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:31 crc kubenswrapper[4743]: I1011 00:52:31.729256 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:31 crc kubenswrapper[4743]: I1011 00:52:31.729302 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:31 crc kubenswrapper[4743]: I1011 00:52:31.729313 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:31 crc kubenswrapper[4743]: I1011 00:52:31.729331 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:31 crc kubenswrapper[4743]: I1011 00:52:31.729343 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:31Z","lastTransitionTime":"2025-10-11T00:52:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:31 crc kubenswrapper[4743]: I1011 00:52:31.832485 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:31 crc kubenswrapper[4743]: I1011 00:52:31.832540 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:31 crc kubenswrapper[4743]: I1011 00:52:31.832562 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:31 crc kubenswrapper[4743]: I1011 00:52:31.832585 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:31 crc kubenswrapper[4743]: I1011 00:52:31.832603 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:31Z","lastTransitionTime":"2025-10-11T00:52:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:31 crc kubenswrapper[4743]: I1011 00:52:31.936394 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:31 crc kubenswrapper[4743]: I1011 00:52:31.936458 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:31 crc kubenswrapper[4743]: I1011 00:52:31.936475 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:31 crc kubenswrapper[4743]: I1011 00:52:31.936501 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:31 crc kubenswrapper[4743]: I1011 00:52:31.936520 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:31Z","lastTransitionTime":"2025-10-11T00:52:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:32 crc kubenswrapper[4743]: I1011 00:52:32.040090 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:32 crc kubenswrapper[4743]: I1011 00:52:32.040138 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:32 crc kubenswrapper[4743]: I1011 00:52:32.040157 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:32 crc kubenswrapper[4743]: I1011 00:52:32.040184 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:32 crc kubenswrapper[4743]: I1011 00:52:32.040210 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:32Z","lastTransitionTime":"2025-10-11T00:52:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:32 crc kubenswrapper[4743]: I1011 00:52:32.090925 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 00:52:32 crc kubenswrapper[4743]: I1011 00:52:32.091085 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 00:52:32 crc kubenswrapper[4743]: I1011 00:52:32.090919 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cb5z5" Oct 11 00:52:32 crc kubenswrapper[4743]: E1011 00:52:32.091099 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 11 00:52:32 crc kubenswrapper[4743]: E1011 00:52:32.091745 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 11 00:52:32 crc kubenswrapper[4743]: E1011 00:52:32.092004 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cb5z5" podUID="b02b8636-a5c4-447d-b1cf-401b3dcfa02b" Oct 11 00:52:32 crc kubenswrapper[4743]: I1011 00:52:32.092137 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 00:52:32 crc kubenswrapper[4743]: E1011 00:52:32.092240 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 11 00:52:32 crc kubenswrapper[4743]: I1011 00:52:32.092488 4743 scope.go:117] "RemoveContainer" containerID="d87f5b08faef77769b5965131a6bbc8c2d86c5b51c4561962b902a3056dc9a84" Oct 11 00:52:32 crc kubenswrapper[4743]: I1011 00:52:32.143618 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:32 crc kubenswrapper[4743]: I1011 00:52:32.143939 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:32 crc kubenswrapper[4743]: I1011 00:52:32.143953 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:32 crc kubenswrapper[4743]: I1011 00:52:32.143977 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:32 crc kubenswrapper[4743]: I1011 00:52:32.143992 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:32Z","lastTransitionTime":"2025-10-11T00:52:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:32 crc kubenswrapper[4743]: I1011 00:52:32.252675 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:32 crc kubenswrapper[4743]: I1011 00:52:32.252798 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:32 crc kubenswrapper[4743]: I1011 00:52:32.252954 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:32 crc kubenswrapper[4743]: I1011 00:52:32.252994 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:32 crc kubenswrapper[4743]: I1011 00:52:32.253020 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:32Z","lastTransitionTime":"2025-10-11T00:52:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:32 crc kubenswrapper[4743]: I1011 00:52:32.357892 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:32 crc kubenswrapper[4743]: I1011 00:52:32.357938 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:32 crc kubenswrapper[4743]: I1011 00:52:32.357956 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:32 crc kubenswrapper[4743]: I1011 00:52:32.357980 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:32 crc kubenswrapper[4743]: I1011 00:52:32.357997 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:32Z","lastTransitionTime":"2025-10-11T00:52:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:32 crc kubenswrapper[4743]: I1011 00:52:32.460454 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:32 crc kubenswrapper[4743]: I1011 00:52:32.460511 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:32 crc kubenswrapper[4743]: I1011 00:52:32.460528 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:32 crc kubenswrapper[4743]: I1011 00:52:32.460570 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:32 crc kubenswrapper[4743]: I1011 00:52:32.460587 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:32Z","lastTransitionTime":"2025-10-11T00:52:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:32 crc kubenswrapper[4743]: I1011 00:52:32.470163 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-48ljj_9ed16b35-862f-47f2-9e32-63c98f868fb8/ovnkube-controller/1.log" Oct 11 00:52:32 crc kubenswrapper[4743]: I1011 00:52:32.473910 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" event={"ID":"9ed16b35-862f-47f2-9e32-63c98f868fb8","Type":"ContainerStarted","Data":"f8d34980f388c932c37e8d5a9122045fe57b4dc1cba358666d3558519a785497"} Oct 11 00:52:32 crc kubenswrapper[4743]: I1011 00:52:32.474687 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" Oct 11 00:52:32 crc kubenswrapper[4743]: I1011 00:52:32.514595 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc432b8f-1fc8-4b7d-a202-bf1452921fc3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ed857549a3f84b624b8bf411ac4359bad74bbf8eea28bda8ebeda3b12dd34e\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://748faceccab50f961795c16fb7c31665af75d570094d1dd2c765bb2215b65c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc8d989925344208a60f51034c7c486de0165d5c5a9a1f966df45ac5685c89f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert
-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34de48136b54818c60eb929d4b5374689378190620a6502448759f172856d6af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:32Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:32 crc kubenswrapper[4743]: I1011 00:52:32.554050 4743 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:32Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:32 crc kubenswrapper[4743]: I1011 00:52:32.563372 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:32 crc kubenswrapper[4743]: I1011 00:52:32.563410 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:32 crc kubenswrapper[4743]: I1011 00:52:32.563419 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:32 crc kubenswrapper[4743]: I1011 00:52:32.563434 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:32 crc kubenswrapper[4743]: I1011 00:52:32.563443 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:32Z","lastTransitionTime":"2025-10-11T00:52:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:32 crc kubenswrapper[4743]: I1011 00:52:32.573421 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9jfxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8c603f4-717c-4554-992a-8338b3bef24d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853ca9eee244aad8c7006298d6e541ea07022917bee14083b2ded233009257ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5b2pr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9jfxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:32Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:32 crc kubenswrapper[4743]: I1011 00:52:32.586073 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:32Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:32 crc kubenswrapper[4743]: I1011 00:52:32.636011 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd049f7c9c78e28d7cfdeb7087cf419e5db915fecd46a83ae0bdc8317fff9728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-11T00:52:32Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:32 crc kubenswrapper[4743]: I1011 00:52:32.651552 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:32Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:32 crc kubenswrapper[4743]: I1011 00:52:32.665932 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:32 crc kubenswrapper[4743]: I1011 00:52:32.665968 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:32 crc kubenswrapper[4743]: I1011 00:52:32.665991 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:32 crc kubenswrapper[4743]: I1011 00:52:32.666008 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:32 crc kubenswrapper[4743]: I1011 00:52:32.666019 4743 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:32Z","lastTransitionTime":"2025-10-11T00:52:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:32 crc kubenswrapper[4743]: I1011 00:52:32.667435 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"add92263-e252-446b-95de-092585b4357f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be9668b466043d0e86521a2ea925416a4987cb051ca0ab8764fd0fd3095991f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d42sg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16b18bcf80747537e2a7a31adc90eec7fdba85d8166f8cfc53709a1dba33de8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d42sg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cvm72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-11T00:52:32Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:32 crc kubenswrapper[4743]: I1011 00:52:32.688502 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6wcnk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06a7b971-8779-491c-8d3f-e7d5b4d60968\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38b5a40ddd5390cc392fcf2240304a72163dd797511640ada5ad39a59c0f4283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8adfcc8627f1a6ce588d50eabcc56c6814c3cff75d50144def98466d2dabb69a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8adfcc8627f1a6ce588d50eabcc56c6814c3cff75d50144def98466d2dabb69a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://800cdd03b73991fe9c1cfbe9a392b5f08854ca2f5f9e5106022bb67307578ad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"
terminated\\\":{\\\"containerID\\\":\\\"cri-o://800cdd03b73991fe9c1cfbe9a392b5f08854ca2f5f9e5106022bb67307578ad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49661bf59f8e1d64e5980f0ed00dcc50c4c2ab52cc2232ade890883f0d0158d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49661bf59f8e1d64e5980f0ed00dcc50c4c2ab52cc2232ade890883f0d0158d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27d3d6d9f65eb7b710148bc4e077c74a15207b637727163afe631acfe3ec90f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27d3d6d9f65eb7b710148bc4e077c74a15207b637727163afe631acfe3ec90f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00edafd52ba2a97f21af491b5233bc2ad945e6b77106eb94a947b777a0129cfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,
\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00edafd52ba2a97f21af491b5233bc2ad945e6b77106eb94a947b777a0129cfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://276d1e264f541d55720a57945800ec969959c0828071dcda4fb1c0f13b17118b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://276d1e264f541d55720a57945800ec969959c0828071dcda4fb1c0f13b17118b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6wcnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:32Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:32 crc kubenswrapper[4743]: I1011 00:52:32.715305 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ed16b35-862f-47f2-9e32-63c98f868fb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d54b05ec17db8c8c6c3ff8e4d703de8f1f1f0d7e9b5b0133afdf30e3c2b8acee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4127195cac8b12296c2aaefc500beb5f746bee518f3885c094e7bca83ade5e6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://487448bc7b1826df7dbc3aeac42ad10d6aed3390c27c95dfda1b81fe09e27f35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acfe8e798b39769fe2643c6f761b388dc220ccfc707f0023dd680196514b8885\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17db7cfee85ea1051eba348aa0ec7bc8e1354c49e3e4740745a51d3db0ffcf63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d385e354cb831573c89886ade9127287f0974b7a7c244ead9e5547e0211fd9fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d34980f388c932c37e8d5a9122045fe57b4dc1cba358666d3558519a785497\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d87f5b08faef77769b5965131a6bbc8c2d86c5b51c4561962b902a3056dc9a84\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-11T00:52:18Z\\\",\\\"message\\\":\\\"Recording success event on pod openshift-multus/multus-additional-cni-plugins-6wcnk\\\\nI1011 00:52:18.608350 6153 lb_config.go:1031] Cluster endpoints for openshift-machine-config-operator/machine-config-operator for network=default are: map[]\\\\nI1011 00:52:18.608301 6153 loadbalancer.go:304] Deleted 0 stale LBs for 
map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/machine-api-operator\\\\\\\"}\\\\nI1011 00:52:18.608367 6153 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI1011 00:52:18.608369 6153 services_controller.go:360] Finished syncing service machine-api-operator on namespace openshift-machine-api for network=default : 1.580511ms\\\\nF1011 00:52:18.608026 6153 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook 
\\\\\\\"node.network-node-identity.opens\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\
\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecec5075382354c47a6ec8f53263fe6c244c7aadfeab1dda60dd2e88cc8724b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name
\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-48ljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:32Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:32 crc kubenswrapper[4743]: I1011 00:52:32.739895 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"83857db2-dda0-4b62-9336-3393a2c23f3c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a46e8adc9aa9255991b607e98eaf6fc411aeb2a6b0edf45e3e9a7935a2eae5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97b59a1efc62279df4a9766fae0e700a435812160cf3a40a915c967f7a99c9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe9d2bc75aa162a0cab825a1f13cf35f20491a0317000d5a9775977fb7f6b556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf3820ae4ab59d420d2e7f001b2953f751c6dd75f90b5fcc76fdaee8c8df722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f14caad841305b82f04cf14411e2308edbcdd8ea2261ff8401edc95900855ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://818ca392b918c66a8ca013f3ec5b938595fc78b726af7ae31524c09dbe9302f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://818ca392b918c66a8ca013f3ec5b938595fc78b726af7ae31524c09dbe9302f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44506333e8bd3a20777cf0c382b7652d6d3fa1cdc13f056412cf4899e25115e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44506333e8bd3a20777cf0c382b7652d6d3fa1cdc13f056412cf4899e25115e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e870a1289513fe9e864779c13e63c0991fdaded247b35ac75b7810ab27683ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e870a1289513fe9e864779c13e63c0991fdaded247b35ac75b7810ab27683ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:32Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:32 crc kubenswrapper[4743]: I1011 00:52:32.762603 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a657286e-3a6d-4265-ae33-a7d2ce8b64ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1558bde7a3b8b01c9ce7f6907ecc43abc6e27587236be7b41487968effa33bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2616059cd6d57ff41cdaa2e1b35e0e8b474398c434022ba6b9e926e19e64c0f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0392332521983f1ea8939df48e05618035ca97895122067daef61ee1fdb7af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b297361a1a2f13aa53d7a5a58948dbe86e8e00f888e6a69cc884154828c29a41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://530c39a66f06ca33cba6a3ab75cb68148425f5a732b503d054b6c74a9d48fabf\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-11T00:52:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1011 00:51:59.674497 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1011 00:51:59.678201 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3177302691/tls.crt::/tmp/serving-cert-3177302691/tls.key\\\\\\\"\\\\nI1011 00:52:05.851548 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1011 00:52:05.855373 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1011 00:52:05.855414 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1011 00:52:05.855455 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1011 00:52:05.855472 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1011 00:52:05.866907 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1011 00:52:05.866935 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 00:52:05.866939 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 00:52:05.866944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1011 00:52:05.866947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1011 00:52:05.866951 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1011 00:52:05.866954 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1011 00:52:05.866970 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1011 00:52:05.870518 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1c0227825fa743bb310a1dfb08333cea5ee16a2a2f30b77c97e2f28a2319542\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d411bef56dd0ccdad159fc018c15ad4f0df6adee56150b1143e2dcf442a0fe18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d411bef56dd0ccdad159fc018c15ad4f0df6adee56150b1143e2dcf442a0fe18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:32Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:32 crc kubenswrapper[4743]: I1011 00:52:32.768663 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:32 crc kubenswrapper[4743]: I1011 00:52:32.768688 4743 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:32 crc kubenswrapper[4743]: I1011 00:52:32.768696 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:32 crc kubenswrapper[4743]: I1011 00:52:32.768712 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:32 crc kubenswrapper[4743]: I1011 00:52:32.768722 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:32Z","lastTransitionTime":"2025-10-11T00:52:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:32 crc kubenswrapper[4743]: I1011 00:52:32.773197 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vlxgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d677d122-c8be-4938-8d2c-bde4a088a63a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dbaee9cca1154a094e092eaaf1dea2aeb6a61fbf9323611275485f75ea911e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95k6c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vlxgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:32Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:32 crc kubenswrapper[4743]: I1011 00:52:32.784012 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-t9nsf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f07b1e57-3c09-4e75-866d-a4292db4e151\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc297cc894f36c7afa9ce15a8e9c71b2d7bf66042c55d0c7686f20b6a3ece7be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c97lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:14Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-t9nsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:32Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:32 crc kubenswrapper[4743]: I1011 00:52:32.798510 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cd23d7ad070c4d2f029567ac2186d797b082b8da8ff2c9811982a1c5c7b0ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:32Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:32 crc kubenswrapper[4743]: I1011 00:52:32.812183 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bca7230a67c579c0fc54921d292f52e18ce2c25f79cd4a351a9f6b576df3553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://ab9cd20bc5b28791e21984a0edf4fbf50acc05d6160780d0696698715242b5d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:32Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:32 crc kubenswrapper[4743]: I1011 00:52:32.824054 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ptjnt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cb3d634-b381-4ee0-a819-ce6f87fa8afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c47ab12f326ce6568bfd06d8beebdeffc07d464b747a288a14793db096ccc6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4dnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b62556627f75a8ed41c68a7fb1982461fd9fb
35965014b72ec15492e763e42b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4dnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ptjnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:32Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:32 crc kubenswrapper[4743]: I1011 00:52:32.835404 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cb5z5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b02b8636-a5c4-447d-b1cf-401b3dcfa02b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2zdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2zdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:22Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cb5z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:32Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:32 crc 
kubenswrapper[4743]: I1011 00:52:32.871597 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:32 crc kubenswrapper[4743]: I1011 00:52:32.871650 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:32 crc kubenswrapper[4743]: I1011 00:52:32.871666 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:32 crc kubenswrapper[4743]: I1011 00:52:32.871687 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:32 crc kubenswrapper[4743]: I1011 00:52:32.871703 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:32Z","lastTransitionTime":"2025-10-11T00:52:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:32 crc kubenswrapper[4743]: I1011 00:52:32.975225 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:32 crc kubenswrapper[4743]: I1011 00:52:32.975302 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:32 crc kubenswrapper[4743]: I1011 00:52:32.975320 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:32 crc kubenswrapper[4743]: I1011 00:52:32.975346 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:32 crc kubenswrapper[4743]: I1011 00:52:32.975364 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:32Z","lastTransitionTime":"2025-10-11T00:52:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:33 crc kubenswrapper[4743]: I1011 00:52:33.077440 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:33 crc kubenswrapper[4743]: I1011 00:52:33.077695 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:33 crc kubenswrapper[4743]: I1011 00:52:33.077709 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:33 crc kubenswrapper[4743]: I1011 00:52:33.077723 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:33 crc kubenswrapper[4743]: I1011 00:52:33.077731 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:33Z","lastTransitionTime":"2025-10-11T00:52:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:33 crc kubenswrapper[4743]: I1011 00:52:33.180334 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:33 crc kubenswrapper[4743]: I1011 00:52:33.180374 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:33 crc kubenswrapper[4743]: I1011 00:52:33.180392 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:33 crc kubenswrapper[4743]: I1011 00:52:33.180418 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:33 crc kubenswrapper[4743]: I1011 00:52:33.180436 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:33Z","lastTransitionTime":"2025-10-11T00:52:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:33 crc kubenswrapper[4743]: I1011 00:52:33.283161 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:33 crc kubenswrapper[4743]: I1011 00:52:33.283204 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:33 crc kubenswrapper[4743]: I1011 00:52:33.283212 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:33 crc kubenswrapper[4743]: I1011 00:52:33.283228 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:33 crc kubenswrapper[4743]: I1011 00:52:33.283237 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:33Z","lastTransitionTime":"2025-10-11T00:52:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:33 crc kubenswrapper[4743]: I1011 00:52:33.385651 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:33 crc kubenswrapper[4743]: I1011 00:52:33.385678 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:33 crc kubenswrapper[4743]: I1011 00:52:33.385703 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:33 crc kubenswrapper[4743]: I1011 00:52:33.385717 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:33 crc kubenswrapper[4743]: I1011 00:52:33.385726 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:33Z","lastTransitionTime":"2025-10-11T00:52:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:33 crc kubenswrapper[4743]: I1011 00:52:33.481402 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-48ljj_9ed16b35-862f-47f2-9e32-63c98f868fb8/ovnkube-controller/2.log" Oct 11 00:52:33 crc kubenswrapper[4743]: I1011 00:52:33.482671 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-48ljj_9ed16b35-862f-47f2-9e32-63c98f868fb8/ovnkube-controller/1.log" Oct 11 00:52:33 crc kubenswrapper[4743]: I1011 00:52:33.488047 4743 generic.go:334] "Generic (PLEG): container finished" podID="9ed16b35-862f-47f2-9e32-63c98f868fb8" containerID="f8d34980f388c932c37e8d5a9122045fe57b4dc1cba358666d3558519a785497" exitCode=1 Oct 11 00:52:33 crc kubenswrapper[4743]: I1011 00:52:33.488151 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" event={"ID":"9ed16b35-862f-47f2-9e32-63c98f868fb8","Type":"ContainerDied","Data":"f8d34980f388c932c37e8d5a9122045fe57b4dc1cba358666d3558519a785497"} Oct 11 00:52:33 crc kubenswrapper[4743]: I1011 00:52:33.488216 4743 scope.go:117] "RemoveContainer" containerID="d87f5b08faef77769b5965131a6bbc8c2d86c5b51c4561962b902a3056dc9a84" Oct 11 00:52:33 crc kubenswrapper[4743]: I1011 00:52:33.490901 4743 scope.go:117] "RemoveContainer" containerID="f8d34980f388c932c37e8d5a9122045fe57b4dc1cba358666d3558519a785497" Oct 11 00:52:33 crc kubenswrapper[4743]: E1011 00:52:33.491348 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-48ljj_openshift-ovn-kubernetes(9ed16b35-862f-47f2-9e32-63c98f868fb8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" podUID="9ed16b35-862f-47f2-9e32-63c98f868fb8" Oct 11 00:52:33 crc kubenswrapper[4743]: I1011 00:52:33.492356 4743 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:33 crc kubenswrapper[4743]: I1011 00:52:33.492409 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:33 crc kubenswrapper[4743]: I1011 00:52:33.492432 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:33 crc kubenswrapper[4743]: I1011 00:52:33.492470 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:33 crc kubenswrapper[4743]: I1011 00:52:33.492492 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:33Z","lastTransitionTime":"2025-10-11T00:52:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:33 crc kubenswrapper[4743]: I1011 00:52:33.530342 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83857db2-dda0-4b62-9336-3393a2c23f3c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a46e8adc9aa9255991b607e98eaf6fc411aeb2a6b0edf45e3e9a7935a2eae5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97b59a1efc62279df4a9766fae0e700a435812160cf3a40a915c967f7a99c9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe9d2bc75aa162a0cab825a1f13cf35f20491a0317000d5a9775977fb7f6b556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf3820ae4ab59d420d2e7f001b2953f751c6dd75f90b5fcc76fdaee8c8df722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f14caad841305b82f04cf14411e2308edbcdd8ea2261ff8401edc95900855ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://818ca392b918c66a8ca013f3ec5b938595fc78b726af7ae31524c09dbe9302f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://818ca392b918c66a8ca013f3ec5b938595fc78b726af7ae31524c09dbe9302f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44506333e8bd3a20777cf0c382b7652d6d3fa1cdc13f056412cf4899e25115e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44506333e8bd3a20777cf0c382b7652d6d3fa1cdc13f056412cf4899e25115e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e870a1289513fe9e864779c13e63c0991fdaded247b35ac75b7810ab27683ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e870a1289513fe9e864779c13e63c0991fdaded247b35ac75b7810ab27683ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-11T00:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:33Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:33 crc kubenswrapper[4743]: I1011 00:52:33.553375 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a657286e-3a6d-4265-ae33-a7d2ce8b64ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1558bde7a3b8b01c9ce7f6907ecc43abc6e27587236be7b41487968effa33bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2616059cd6d57ff41cdaa2e1b35e0e8b474398c434022ba6b9e926e19e64c0f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0392332521983f1ea8939df48e05618035ca97895122067daef61ee1fdb7af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b297361a1a2f13aa53d7a5a58948dbe86e8e00f888e6a69cc884154828c29a41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://530c39a66f06ca33cba6a3ab75cb68148425f5a732b503d054b6c74a9d48fabf\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-11T00:52:05Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1011 00:51:59.674497 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1011 00:51:59.678201 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3177302691/tls.crt::/tmp/serving-cert-3177302691/tls.key\\\\\\\"\\\\nI1011 00:52:05.851548 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1011 00:52:05.855373 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1011 00:52:05.855414 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1011 00:52:05.855455 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1011 00:52:05.855472 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1011 00:52:05.866907 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1011 00:52:05.866935 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 00:52:05.866939 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 00:52:05.866944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1011 00:52:05.866947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1011 00:52:05.866951 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1011 00:52:05.866954 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1011 00:52:05.866970 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1011 00:52:05.870518 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1c0227825fa743bb310a1dfb08333cea5ee16a2a2f30b77c97e2f28a2319542\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d411bef56dd0ccdad159fc018c15ad4f0df6adee56150b1143e2dcf442a0fe18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d411bef56dd0ccdad159fc018c15ad4f0
df6adee56150b1143e2dcf442a0fe18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:33Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:33 crc kubenswrapper[4743]: I1011 00:52:33.571931 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vlxgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d677d122-c8be-4938-8d2c-bde4a088a63a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dbaee9cca1154a094e092eaaf1dea2aeb6a61fbf9323611275485f75ea911e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95k6c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vlxgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:33Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:33 crc kubenswrapper[4743]: I1011 00:52:33.588518 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-t9nsf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f07b1e57-3c09-4e75-866d-a4292db4e151\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc297cc894f36c7afa9ce15a8e9c71b2d7bf66042c55d0c7686f20b6a3ece7be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c97lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:14Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-t9nsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:33Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:33 crc kubenswrapper[4743]: I1011 00:52:33.594920 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:33 crc kubenswrapper[4743]: I1011 00:52:33.594987 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:33 crc kubenswrapper[4743]: I1011 00:52:33.595006 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:33 crc kubenswrapper[4743]: I1011 00:52:33.595035 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:33 crc kubenswrapper[4743]: I1011 00:52:33.595053 4743 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:33Z","lastTransitionTime":"2025-10-11T00:52:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:33 crc kubenswrapper[4743]: I1011 00:52:33.609082 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bca7230a67c579c0fc54921d292f52e18ce2c25f79cd4a351a9f6b576df3553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab9cd20bc5b28791e21984a0edf4fbf50acc05d6160780d0696698715242b5d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:33Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:33 crc kubenswrapper[4743]: I1011 00:52:33.627782 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ptjnt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cb3d634-b381-4ee0-a819-ce6f87fa8afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c47ab12f326ce6568bfd06d8beebdeffc07d464b747a288a14793db096ccc6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4dnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b62556627f75a8ed41c68a7fb1982461fd9fb
35965014b72ec15492e763e42b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4dnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ptjnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:33Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:33 crc kubenswrapper[4743]: I1011 00:52:33.644532 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cb5z5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b02b8636-a5c4-447d-b1cf-401b3dcfa02b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2zdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2zdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:22Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cb5z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:33Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:33 crc 
kubenswrapper[4743]: I1011 00:52:33.666279 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cd23d7ad070c4d2f029567ac2186d797b082b8da8ff2c9811982a1c5c7b0ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:33Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:33 crc kubenswrapper[4743]: I1011 00:52:33.687565 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:33Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:33 crc kubenswrapper[4743]: I1011 00:52:33.697584 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:33 crc kubenswrapper[4743]: I1011 00:52:33.697808 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:33 crc kubenswrapper[4743]: I1011 00:52:33.697928 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:33 crc kubenswrapper[4743]: I1011 00:52:33.698027 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:33 crc kubenswrapper[4743]: I1011 00:52:33.698118 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:33Z","lastTransitionTime":"2025-10-11T00:52:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:33 crc kubenswrapper[4743]: I1011 00:52:33.705357 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9jfxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8c603f4-717c-4554-992a-8338b3bef24d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853ca9eee244aad8c7006298d6e541ea07022917bee14083b2ded233009257ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5b2pr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9jfxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:33Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:33 crc kubenswrapper[4743]: I1011 00:52:33.722738 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc432b8f-1fc8-4b7d-a202-bf1452921fc3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ed857549a3f84b624b8bf411ac4359bad74bbf8eea28bda8ebeda3b12dd34e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static
-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://748faceccab50f961795c16fb7c31665af75d570094d1dd2c765bb2215b65c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc8d989925344208a60f51034c7c486de0165d5c5a9a1f966df45ac5685c89f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34de48136b54818c60eb929d4b5374689378190620a6502448759f172856d6af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imag
eID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:33Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:33 crc kubenswrapper[4743]: I1011 00:52:33.737429 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd049f7c9c78e28d7cfdeb7087cf419e5db915fecd46a83ae0bdc8317fff9728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-11T00:52:33Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:33 crc kubenswrapper[4743]: I1011 00:52:33.756105 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:33Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:33 crc kubenswrapper[4743]: I1011 00:52:33.771137 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"add92263-e252-446b-95de-092585b4357f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be9668b466043d0e86521a2ea925416a4987cb051ca0ab8764fd0fd3095991f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d42sg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16b18bcf80747537e2a7a31adc90eec7fdba85d8
166f8cfc53709a1dba33de8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d42sg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cvm72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:33Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:33 crc kubenswrapper[4743]: I1011 00:52:33.792750 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6wcnk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06a7b971-8779-491c-8d3f-e7d5b4d60968\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38b5a40ddd5390cc392fcf2240304a72163dd797511640ada5ad39a59c0f4283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8adfcc8627f1a6ce588d50eabcc56c6814c3cff75d50144def98466d2dabb69a\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8adfcc8627f1a6ce588d50eabcc56c6814c3cff75d50144def98466d2dabb69a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://800cdd03b73991fe9c1cfbe9a392b5f08854ca2f5f9e5106022bb67307578ad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://800cdd03b73991fe9c1cfbe9a392b5f08854ca2f5f9e5106022bb67307578ad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:10Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49661bf59f8e1d64e5980f0ed00dcc50c4c2ab52cc2232ade890883f0d0158d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49661bf59f8e1d64e5980f0ed00dcc50c4c2ab52cc2232ade890883f0d0158d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27d3d
6d9f65eb7b710148bc4e077c74a15207b637727163afe631acfe3ec90f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27d3d6d9f65eb7b710148bc4e077c74a15207b637727163afe631acfe3ec90f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00edafd52ba2a97f21af491b5233bc2ad945e6b77106eb94a947b777a0129cfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00edafd52ba2a97f21af491b5233bc2ad945e6b77106eb94a947b777a0129cfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://276d1e264f541d55720a57945800ec969959c0828071dcda4fb1c0f13b17118b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://276d1e264f541d55720a57945800ec969959c0828071dcda4fb1c0f13b17118b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6wcnk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:33Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:33 crc kubenswrapper[4743]: I1011 00:52:33.800923 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:33 crc kubenswrapper[4743]: I1011 00:52:33.800972 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:33 crc kubenswrapper[4743]: I1011 00:52:33.800984 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:33 crc kubenswrapper[4743]: I1011 00:52:33.801003 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:33 crc kubenswrapper[4743]: I1011 00:52:33.801016 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:33Z","lastTransitionTime":"2025-10-11T00:52:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:33 crc kubenswrapper[4743]: I1011 00:52:33.823187 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ed16b35-862f-47f2-9e32-63c98f868fb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d54b05ec17db8c8c6c3ff8e4d703de8f1f1f0d7e9b5b0133afdf30e3c2b8acee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4127195cac8b12296c2aaefc500beb5f746bee518f3885c094e7bca83ade5e6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://487448bc7b1826df7dbc3aeac42ad10d6aed3390c27c95dfda1b81fe09e27f35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acfe8e798b39769fe2643c6f761b388dc220ccfc707f0023dd680196514b8885\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17db7cfee85ea1051eba348aa0ec7bc8e1354c49e3e4740745a51d3db0ffcf63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d385e354cb831573c89886ade9127287f0974b7a7c244ead9e5547e0211fd9fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d34980f388c932c37e8d5a9122045fe57b4dc1cba358666d3558519a785497\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d87f5b08faef77769b5965131a6bbc8c2d86c5b51c4561962b902a3056dc9a84\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-11T00:52:18Z\\\",\\\"message\\\":\\\"Recording success event on pod openshift-multus/multus-additional-cni-plugins-6wcnk\\\\nI1011 00:52:18.608350 6153 lb_config.go:1031] Cluster endpoints for openshift-machine-config-operator/machine-config-operator for network=default are: map[]\\\\nI1011 00:52:18.608301 6153 loadbalancer.go:304] Deleted 0 stale LBs for 
map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/machine-api-operator\\\\\\\"}\\\\nI1011 00:52:18.608367 6153 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI1011 00:52:18.608369 6153 services_controller.go:360] Finished syncing service machine-api-operator on namespace openshift-machine-api for network=default : 1.580511ms\\\\nF1011 00:52:18.608026 6153 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.opens\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8d34980f388c932c37e8d5a9122045fe57b4dc1cba358666d3558519a785497\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-11T00:52:33Z\\\",\\\"message\\\":\\\"d syncing service metrics on namespace openshift-kube-apiserver-operator for network=default : 3.881381ms\\\\nI1011 00:52:33.160604 6362 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-scheduler/scheduler\\\\\\\"}\\\\nI1011 00:52:33.160615 6362 services_controller.go:360] Finished syncing service scheduler on namespace openshift-kube-scheduler 
for network=default : 2.970317ms\\\\nI1011 00:52:33.165904 6362 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1011 00:52:33.165984 6362 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1011 00:52:33.166020 6362 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1011 00:52:33.166766 6362 handler.go:208] Removed *v1.Node event handler 2\\\\nI1011 00:52:33.166850 6362 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1011 00:52:33.166973 6362 factory.go:656] Stopping watch factory\\\\nI1011 00:52:33.167000 6362 ovnkube.go:599] Stopped ovnkube\\\\nI1011 00:52:33.167036 6362 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1011 00:52:33.167068 6362 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1011 00:52:33.167200 6362 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\"
,\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecec5075382354c47a6ec8f53263fe6c244c7aadfeab1dda60dd2e88cc8724b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":
\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-48ljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:33Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:33 crc kubenswrapper[4743]: I1011 
00:52:33.839773 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:33Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:33 crc kubenswrapper[4743]: I1011 00:52:33.903341 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:33 crc kubenswrapper[4743]: I1011 00:52:33.903601 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:33 crc kubenswrapper[4743]: I1011 00:52:33.903662 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:33 crc kubenswrapper[4743]: I1011 00:52:33.903749 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:33 crc kubenswrapper[4743]: I1011 00:52:33.903818 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:33Z","lastTransitionTime":"2025-10-11T00:52:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:34 crc kubenswrapper[4743]: I1011 00:52:34.006450 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:34 crc kubenswrapper[4743]: I1011 00:52:34.006511 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:34 crc kubenswrapper[4743]: I1011 00:52:34.006527 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:34 crc kubenswrapper[4743]: I1011 00:52:34.006550 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:34 crc kubenswrapper[4743]: I1011 00:52:34.006568 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:34Z","lastTransitionTime":"2025-10-11T00:52:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:34 crc kubenswrapper[4743]: I1011 00:52:34.091313 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 00:52:34 crc kubenswrapper[4743]: I1011 00:52:34.091355 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cb5z5" Oct 11 00:52:34 crc kubenswrapper[4743]: I1011 00:52:34.091364 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 00:52:34 crc kubenswrapper[4743]: I1011 00:52:34.091469 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 00:52:34 crc kubenswrapper[4743]: E1011 00:52:34.091623 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 11 00:52:34 crc kubenswrapper[4743]: E1011 00:52:34.091757 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 11 00:52:34 crc kubenswrapper[4743]: E1011 00:52:34.091976 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cb5z5" podUID="b02b8636-a5c4-447d-b1cf-401b3dcfa02b" Oct 11 00:52:34 crc kubenswrapper[4743]: E1011 00:52:34.092157 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 11 00:52:34 crc kubenswrapper[4743]: I1011 00:52:34.114433 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:34 crc kubenswrapper[4743]: I1011 00:52:34.114484 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:34 crc kubenswrapper[4743]: I1011 00:52:34.114500 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:34 crc kubenswrapper[4743]: I1011 00:52:34.114525 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:34 crc kubenswrapper[4743]: I1011 00:52:34.114544 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:34Z","lastTransitionTime":"2025-10-11T00:52:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:34 crc kubenswrapper[4743]: I1011 00:52:34.218165 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:34 crc kubenswrapper[4743]: I1011 00:52:34.218222 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:34 crc kubenswrapper[4743]: I1011 00:52:34.218240 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:34 crc kubenswrapper[4743]: I1011 00:52:34.218265 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:34 crc kubenswrapper[4743]: I1011 00:52:34.218282 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:34Z","lastTransitionTime":"2025-10-11T00:52:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:34 crc kubenswrapper[4743]: I1011 00:52:34.321207 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:34 crc kubenswrapper[4743]: I1011 00:52:34.321278 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:34 crc kubenswrapper[4743]: I1011 00:52:34.321296 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:34 crc kubenswrapper[4743]: I1011 00:52:34.321321 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:34 crc kubenswrapper[4743]: I1011 00:52:34.321341 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:34Z","lastTransitionTime":"2025-10-11T00:52:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:34 crc kubenswrapper[4743]: I1011 00:52:34.424387 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:34 crc kubenswrapper[4743]: I1011 00:52:34.424477 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:34 crc kubenswrapper[4743]: I1011 00:52:34.424502 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:34 crc kubenswrapper[4743]: I1011 00:52:34.424536 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:34 crc kubenswrapper[4743]: I1011 00:52:34.424561 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:34Z","lastTransitionTime":"2025-10-11T00:52:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:34 crc kubenswrapper[4743]: I1011 00:52:34.494839 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-48ljj_9ed16b35-862f-47f2-9e32-63c98f868fb8/ovnkube-controller/2.log" Oct 11 00:52:34 crc kubenswrapper[4743]: I1011 00:52:34.501167 4743 scope.go:117] "RemoveContainer" containerID="f8d34980f388c932c37e8d5a9122045fe57b4dc1cba358666d3558519a785497" Oct 11 00:52:34 crc kubenswrapper[4743]: E1011 00:52:34.501440 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-48ljj_openshift-ovn-kubernetes(9ed16b35-862f-47f2-9e32-63c98f868fb8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" podUID="9ed16b35-862f-47f2-9e32-63c98f868fb8" Oct 11 00:52:34 crc kubenswrapper[4743]: I1011 00:52:34.527450 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:34 crc kubenswrapper[4743]: I1011 00:52:34.527536 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:34 crc kubenswrapper[4743]: I1011 00:52:34.527558 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:34 crc kubenswrapper[4743]: I1011 00:52:34.527589 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:34 crc kubenswrapper[4743]: I1011 00:52:34.527612 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:34Z","lastTransitionTime":"2025-10-11T00:52:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:34 crc kubenswrapper[4743]: I1011 00:52:34.535361 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ed16b35-862f-47f2-9e32-63c98f868fb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d54b05ec17db8c8c6c3ff8e4d703de8f1f1f0d7e9b5b0133afdf30e3c2b8acee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4127195cac8b12296c2aaefc500beb5f746bee518f3885c094e7bca83ade5e6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://487448bc7b1826df7dbc3aeac42ad10d6aed3390c27c95dfda1b81fe09e27f35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acfe8e798b39769fe2643c6f761b388dc220ccfc707f0023dd680196514b8885\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17db7cfee85ea1051eba348aa0ec7bc8e1354c49e3e4740745a51d3db0ffcf63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d385e354cb831573c89886ade9127287f0974b7a7c244ead9e5547e0211fd9fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d34980f388c932c37e8d5a9122045fe57b4dc1cba358666d3558519a785497\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8d34980f388c932c37e8d5a9122045fe57b4dc1cba358666d3558519a785497\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-11T00:52:33Z\\\",\\\"message\\\":\\\"d syncing service metrics on namespace openshift-kube-apiserver-operator for network=default : 3.881381ms\\\\nI1011 00:52:33.160604 6362 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-scheduler/scheduler\\\\\\\"}\\\\nI1011 00:52:33.160615 6362 services_controller.go:360] Finished syncing service scheduler on namespace openshift-kube-scheduler for network=default : 2.970317ms\\\\nI1011 00:52:33.165904 6362 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1011 00:52:33.165984 6362 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1011 00:52:33.166020 6362 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1011 00:52:33.166766 6362 handler.go:208] Removed *v1.Node event handler 2\\\\nI1011 00:52:33.166850 6362 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1011 00:52:33.166973 6362 factory.go:656] Stopping watch factory\\\\nI1011 00:52:33.167000 6362 ovnkube.go:599] Stopped ovnkube\\\\nI1011 00:52:33.167036 6362 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1011 00:52:33.167068 6362 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1011 00:52:33.167200 6362 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-48ljj_openshift-ovn-kubernetes(9ed16b35-862f-47f2-9e32-63c98f868fb8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecec5075382354c47a6ec8f53263fe6c244c7aadfeab1dda60dd2e88cc8724b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f300eaf5d1c09f3637
f07c6d71c96b49e955d8319841834440872f13c061e736\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-48ljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:34Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:34 crc kubenswrapper[4743]: I1011 00:52:34.554902 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:34Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:34 crc kubenswrapper[4743]: I1011 00:52:34.573259 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd049f7c9c78e28d7cfdeb7087cf419e5db915fecd46a83ae0bdc8317fff9728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-11T00:52:34Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:34 crc kubenswrapper[4743]: I1011 00:52:34.592121 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:34Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:34 crc kubenswrapper[4743]: I1011 00:52:34.609916 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"add92263-e252-446b-95de-092585b4357f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be9668b466043d0e86521a2ea925416a4987cb051ca0ab8764fd0fd3095991f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d42sg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16b18bcf80747537e2a7a31adc90eec7fdba85d8
166f8cfc53709a1dba33de8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d42sg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cvm72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:34Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:34 crc kubenswrapper[4743]: I1011 00:52:34.631249 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:34 crc kubenswrapper[4743]: I1011 00:52:34.631303 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:34 crc kubenswrapper[4743]: I1011 00:52:34.631315 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:34 crc 
kubenswrapper[4743]: I1011 00:52:34.631333 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:34 crc kubenswrapper[4743]: I1011 00:52:34.631350 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:34Z","lastTransitionTime":"2025-10-11T00:52:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:34 crc kubenswrapper[4743]: I1011 00:52:34.633652 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6wcnk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06a7b971-8779-491c-8d3f-e7d5b4d60968\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38b5a40ddd5390cc392fcf2240304a72163dd797511640ada5ad39a59c0f4283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
eaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8adfcc8627f1a6ce588d50eabcc56c6814c3cff75d50144def98466d2dabb69a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8adfcc8627f1a6ce588d50eabcc56c6814c3cff75d50144def98466d2dabb69a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://800cdd03b73991fe9c1cfbe9a392b5f08854ca2f5f9e5106022bb67307578ad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://800cdd03b73991fe9c1cfbe9a392b5f08854ca2f5f9e5106022bb67307578ad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49661bf59f8e1d64e5980f0ed00dcc50c4c2ab52cc2232ade890883f0d0158d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49661bf59f8e1d64e5980f0ed00dcc50c4c2ab52cc2232ade890883f0d0158d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27d3d6d9f65eb7b710148bc4e077c74a15207b637727163afe631acfe3ec90f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27d3d6d9f65eb7b710148bc4e077c74a15207b637727163afe631acfe3ec90f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00edafd52ba2a97f21af491b5233bc2ad945e6b77106eb94a947b777a0129cfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00edafd52ba2a97f21af491b5233bc2ad945e6b77106eb94a947b777a0129cfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://276d1e264f541d55720a57945800ec969959c0828071dcda4fb1c0f13b17118b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27
6d1e264f541d55720a57945800ec969959c0828071dcda4fb1c0f13b17118b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6wcnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:34Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:34 crc kubenswrapper[4743]: I1011 00:52:34.649397 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-t9nsf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f07b1e57-3c09-4e75-866d-a4292db4e151\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc297cc894f36c7afa9ce15a8e9c71b2d7bf66042c55d0c7686f20b6a3ece7be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c97lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:14Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-t9nsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:34Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:34 crc kubenswrapper[4743]: I1011 00:52:34.687128 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83857db2-dda0-4b62-9336-3393a2c23f3c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a46e8adc9aa9255991b607e98eaf6fc411aeb2a6b0edf45e3e9a7935a2eae5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97b59a1efc62279df4a9766fae0e700a435812160cf3a40a915c967f7a99c9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe9d2bc75aa162a0cab825a1f13cf35f20491a0317000d5a9775977fb7f6b556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00
:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf3820ae4ab59d420d2e7f001b2953f751c6dd75f90b5fcc76fdaee8c8df722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f14caad841305b82f04cf14411e2308edbcdd8ea2261ff8401edc95900855ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://818ca392b918c66a8ca013f3ec5b938595fc78b726af7ae31524c09dbe9302f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://818ca392b918c66a8ca013f3ec5b938595fc78b726af7ae31524c09dbe9302f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44506333e8bd3a20777cf0c382b7652d6d3fa1cdc13f056412cf4899e25115e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44506333e8bd3a20777cf0c382b7652d6d3fa1cdc13f056412cf4899e25115e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e870a1289513fe9e864779c13e63c0991fdaded247b35ac75b7810ab27683ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e870a1289513fe9e864779c13e63c0991fdaded247b35ac75b7810ab27683ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:34Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:34 crc kubenswrapper[4743]: I1011 00:52:34.708670 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a657286e-3a6d-4265-ae33-a7d2ce8b64ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1558bde7a3b8b01c9ce7f6907ecc43abc6e27587236be7b41487968effa33bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2616059cd6d57ff41cdaa2e1b35e0e8b474398c434022ba6b9e926e19e64c0f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0392332521983f1ea8939df48e05618035ca97895122067daef61ee1fdb7af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b297361a1a2f13aa53d7a5a58948dbe86e8e00f888e6a69cc884154828c29a41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://530c39a66f06ca33cba6a3ab75cb68148425f5a732b503d054b6c74a9d48fabf\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-11T00:52:05Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1011 00:51:59.674497 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1011 00:51:59.678201 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3177302691/tls.crt::/tmp/serving-cert-3177302691/tls.key\\\\\\\"\\\\nI1011 00:52:05.851548 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1011 00:52:05.855373 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1011 00:52:05.855414 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1011 00:52:05.855455 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1011 00:52:05.855472 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1011 00:52:05.866907 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1011 00:52:05.866935 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 00:52:05.866939 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 00:52:05.866944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1011 00:52:05.866947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1011 00:52:05.866951 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1011 00:52:05.866954 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1011 00:52:05.866970 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1011 00:52:05.870518 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1c0227825fa743bb310a1dfb08333cea5ee16a2a2f30b77c97e2f28a2319542\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d411bef56dd0ccdad159fc018c15ad4f0df6adee56150b1143e2dcf442a0fe18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d411bef56dd0ccdad159fc018c15ad4f0
df6adee56150b1143e2dcf442a0fe18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:34Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:34 crc kubenswrapper[4743]: I1011 00:52:34.724365 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vlxgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d677d122-c8be-4938-8d2c-bde4a088a63a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dbaee9cca1154a094e092eaaf1dea2aeb6a61fbf9323611275485f75ea911e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95k6c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vlxgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:34Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:34 crc kubenswrapper[4743]: I1011 00:52:34.734338 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:34 crc kubenswrapper[4743]: I1011 00:52:34.734395 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:34 crc kubenswrapper[4743]: I1011 00:52:34.734414 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:34 crc kubenswrapper[4743]: I1011 00:52:34.734438 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:34 crc kubenswrapper[4743]: I1011 00:52:34.734459 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:34Z","lastTransitionTime":"2025-10-11T00:52:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:34 crc kubenswrapper[4743]: I1011 00:52:34.745289 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cd23d7ad070c4d2f029567ac2186d797b082b8da8ff2c9811982a1c5c7b0ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:34Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:34 crc kubenswrapper[4743]: I1011 00:52:34.767969 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bca7230a67c579c0fc54921d292f52e18ce2c25f79cd4a351a9f6b576df3553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab9cd20bc5b28791e21984a0edf4fbf50acc05d6160780d0696698715242b5d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:34Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:34 crc kubenswrapper[4743]: I1011 00:52:34.785788 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ptjnt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cb3d634-b381-4ee0-a819-ce6f87fa8afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c47ab12f326ce6568bfd06d8beebdeffc07d464b747a288a14793db096ccc6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4dnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b62556627f75a8ed41c68a7fb1982461fd9fb
35965014b72ec15492e763e42b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4dnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ptjnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:34Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:34 crc kubenswrapper[4743]: I1011 00:52:34.802450 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cb5z5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b02b8636-a5c4-447d-b1cf-401b3dcfa02b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2zdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2zdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:22Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cb5z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:34Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:34 crc 
kubenswrapper[4743]: I1011 00:52:34.822264 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc432b8f-1fc8-4b7d-a202-bf1452921fc3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ed857549a3f84b624b8bf411ac4359bad74bbf8eea28bda8ebeda3b12dd34e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://748faceccab50f961795c16fb7c31665af75d570094d1dd2c765bb2215b65c31\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc8d989925344208a60f51034c7c486de0165d5c5a9a1f966df45ac5685c89f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34de48136b54818c60eb929d4b5374689378190620a6502448759f172856d6af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:34Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:34 crc kubenswrapper[4743]: I1011 00:52:34.837080 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:34 crc kubenswrapper[4743]: I1011 00:52:34.837133 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:34 crc kubenswrapper[4743]: I1011 00:52:34.837151 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:34 crc kubenswrapper[4743]: I1011 00:52:34.837179 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:34 crc kubenswrapper[4743]: I1011 00:52:34.837196 4743 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:34Z","lastTransitionTime":"2025-10-11T00:52:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:34 crc kubenswrapper[4743]: I1011 00:52:34.842712 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:34Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:34 crc kubenswrapper[4743]: I1011 00:52:34.863774 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9jfxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8c603f4-717c-4554-992a-8338b3bef24d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853ca9eee244aad8c7006298d6e541ea07022917bee14083b2ded233009257ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5b2pr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9jfxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:34Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:34 crc kubenswrapper[4743]: I1011 00:52:34.940301 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:34 crc 
kubenswrapper[4743]: I1011 00:52:34.940361 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:34 crc kubenswrapper[4743]: I1011 00:52:34.940381 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:34 crc kubenswrapper[4743]: I1011 00:52:34.940405 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:34 crc kubenswrapper[4743]: I1011 00:52:34.940422 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:34Z","lastTransitionTime":"2025-10-11T00:52:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:35 crc kubenswrapper[4743]: I1011 00:52:35.019243 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 11 00:52:35 crc kubenswrapper[4743]: I1011 00:52:35.031826 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 11 00:52:35 crc kubenswrapper[4743]: I1011 00:52:35.043738 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:35 crc kubenswrapper[4743]: I1011 00:52:35.043818 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:35 crc kubenswrapper[4743]: I1011 00:52:35.043844 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:35 crc kubenswrapper[4743]: I1011 00:52:35.043910 4743 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeNotReady" Oct 11 00:52:35 crc kubenswrapper[4743]: I1011 00:52:35.043933 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:35Z","lastTransitionTime":"2025-10-11T00:52:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:35 crc kubenswrapper[4743]: I1011 00:52:35.054392 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83857db2-dda0-4b62-9336-3393a2c23f3c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a46e8adc9aa9255991b607e98eaf6fc411aeb2a6b0edf45e3e9a7935a2eae5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a673147
31ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97b59a1efc62279df4a9766fae0e700a435812160cf3a40a915c967f7a99c9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe9d2bc75aa162a0cab825a1f13cf35f20491a0317000d5a9775977fb7f6b556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf3820ae4ab59d420d2e7f001b2953f751c6dd75f90b5fcc76fdaee8c8df722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f14caad841305b82f04cf14411e2308edbcdd8ea2261ff8401edc95900855ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":
[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://818ca392b918c66a8ca013f3ec5b938595fc78b726af7ae31524c09dbe9302f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://818ca392b918c66a8ca013f3ec5b938595fc78b726af7ae31524c09dbe9302f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44506333e8bd3a20777cf0c382b7652d6d3fa1cdc13f056412cf4899e25115e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44506333e8bd3a20777cf0c382b7652d6d3fa1cdc13f056412cf4899e25115e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e870a1289513fe9e864779c13e63c0991fdaded247b35ac75b7810ab27683ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e870a1289513fe9e864779c13e63c0991fdaded247b35ac75b7810ab27683ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:35Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:35 crc kubenswrapper[4743]: I1011 00:52:35.078330 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a657286e-3a6d-4265-ae33-a7d2ce8b64ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1558bde7a3b8b01c9ce7f6907ecc43abc6e27587236be7b41487968effa33bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2616059cd6d57ff41cdaa2e1b35e0e8b474398c434022ba6b9e926e19e64c0f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0392332521983f1ea8939df48e05618035ca97895122067daef61ee1fdb7af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b297361a1a2f13aa53d7a5a58948dbe86e8e00f888e6a69cc884154828c29a41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://530c39a66f06ca33cba6a3ab75cb68148425f5a732b503d054b6c74a9d48fabf\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-11T00:52:05Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1011 00:51:59.674497 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1011 00:51:59.678201 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3177302691/tls.crt::/tmp/serving-cert-3177302691/tls.key\\\\\\\"\\\\nI1011 00:52:05.851548 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1011 00:52:05.855373 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1011 00:52:05.855414 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1011 00:52:05.855455 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1011 00:52:05.855472 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1011 00:52:05.866907 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1011 00:52:05.866935 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 00:52:05.866939 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 00:52:05.866944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1011 00:52:05.866947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1011 00:52:05.866951 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1011 00:52:05.866954 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1011 00:52:05.866970 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1011 00:52:05.870518 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1c0227825fa743bb310a1dfb08333cea5ee16a2a2f30b77c97e2f28a2319542\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d411bef56dd0ccdad159fc018c15ad4f0df6adee56150b1143e2dcf442a0fe18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d411bef56dd0ccdad159fc018c15ad4f0
df6adee56150b1143e2dcf442a0fe18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:35Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:35 crc kubenswrapper[4743]: I1011 00:52:35.094842 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vlxgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d677d122-c8be-4938-8d2c-bde4a088a63a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dbaee9cca1154a094e092eaaf1dea2aeb6a61fbf9323611275485f75ea911e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95k6c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vlxgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:35Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:35 crc kubenswrapper[4743]: I1011 00:52:35.111454 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-t9nsf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f07b1e57-3c09-4e75-866d-a4292db4e151\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc297cc894f36c7afa9ce15a8e9c71b2d7bf66042c55d0c7686f20b6a3ece7be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c97lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:14Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-t9nsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:35Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:35 crc kubenswrapper[4743]: I1011 00:52:35.131047 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bca7230a67c579c0fc54921d292f52e18ce2c25f79cd4a351a9f6b576df3553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab9cd20bc5b28791e21984a0edf4fbf50acc05d6160780d0696698715242b5d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:35Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:35 crc kubenswrapper[4743]: I1011 00:52:35.181795 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:35 crc kubenswrapper[4743]: I1011 00:52:35.181846 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:35 crc kubenswrapper[4743]: I1011 00:52:35.181907 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:35 crc kubenswrapper[4743]: I1011 00:52:35.181984 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:35 crc kubenswrapper[4743]: I1011 00:52:35.182004 4743 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:35Z","lastTransitionTime":"2025-10-11T00:52:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:35 crc kubenswrapper[4743]: I1011 00:52:35.194537 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ptjnt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cb3d634-b381-4ee0-a819-ce6f87fa8afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c47ab12f326ce6568bfd06d8beebdeffc07d464b747a288a14793db096ccc6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4dnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b62556627f75a8ed41c68a7fb1982461fd9fb35965014b72ec15492e763e42b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4dnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ptjnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:35Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:35 crc kubenswrapper[4743]: I1011 00:52:35.211626 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cb5z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b02b8636-a5c4-447d-b1cf-401b3dcfa02b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2zdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2zdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:22Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cb5z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:35Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:35 crc 
kubenswrapper[4743]: I1011 00:52:35.228964 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cd23d7ad070c4d2f029567ac2186d797b082b8da8ff2c9811982a1c5c7b0ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:35Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:35 crc kubenswrapper[4743]: I1011 00:52:35.247668 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:35Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:35 crc kubenswrapper[4743]: I1011 00:52:35.268832 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9jfxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8c603f4-717c-4554-992a-8338b3bef24d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853ca9eee244aad8c7006298d6e541ea07022917bee14083b2ded233009257ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5b2pr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9jfxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:35Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:35 crc kubenswrapper[4743]: I1011 00:52:35.284980 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:35 crc 
kubenswrapper[4743]: I1011 00:52:35.285080 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:35 crc kubenswrapper[4743]: I1011 00:52:35.285103 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:35 crc kubenswrapper[4743]: I1011 00:52:35.285129 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:35 crc kubenswrapper[4743]: I1011 00:52:35.285147 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:35Z","lastTransitionTime":"2025-10-11T00:52:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:35 crc kubenswrapper[4743]: I1011 00:52:35.289693 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc432b8f-1fc8-4b7d-a202-bf1452921fc3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ed857549a3f84b624b8bf411ac4359bad74bbf8eea28bda8ebeda3b12dd34e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://748faceccab50f961795c16fb7c31665af75d570094d1dd2c765bb2215b65c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc8d989925344208a60f51034c7c486de0165d5c5a9a1f966df45ac5685c89f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34de48136b54818c60eb929d4b5374689378190620a6502448759f172856d6af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:35Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:35 crc kubenswrapper[4743]: I1011 00:52:35.307792 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd049f7c9c78e28d7cfdeb7087cf419e5db915fecd46a83ae0bdc8317fff9728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-11T00:52:35Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:35 crc kubenswrapper[4743]: I1011 00:52:35.323552 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:35Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:35 crc kubenswrapper[4743]: I1011 00:52:35.341171 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"add92263-e252-446b-95de-092585b4357f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be9668b466043d0e86521a2ea925416a4987cb051ca0ab8764fd0fd3095991f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d42sg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16b18bcf80747537e2a7a31adc90eec7fdba85d8
166f8cfc53709a1dba33de8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d42sg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cvm72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:35Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:35 crc kubenswrapper[4743]: I1011 00:52:35.366992 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6wcnk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06a7b971-8779-491c-8d3f-e7d5b4d60968\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38b5a40ddd5390cc392fcf2240304a72163dd797511640ada5ad39a59c0f4283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8adfcc8627f1a6ce588d50eabcc56c6814c3cff75d50144def98466d2dabb69a\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8adfcc8627f1a6ce588d50eabcc56c6814c3cff75d50144def98466d2dabb69a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://800cdd03b73991fe9c1cfbe9a392b5f08854ca2f5f9e5106022bb67307578ad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://800cdd03b73991fe9c1cfbe9a392b5f08854ca2f5f9e5106022bb67307578ad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:10Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49661bf59f8e1d64e5980f0ed00dcc50c4c2ab52cc2232ade890883f0d0158d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49661bf59f8e1d64e5980f0ed00dcc50c4c2ab52cc2232ade890883f0d0158d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27d3d
6d9f65eb7b710148bc4e077c74a15207b637727163afe631acfe3ec90f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27d3d6d9f65eb7b710148bc4e077c74a15207b637727163afe631acfe3ec90f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00edafd52ba2a97f21af491b5233bc2ad945e6b77106eb94a947b777a0129cfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00edafd52ba2a97f21af491b5233bc2ad945e6b77106eb94a947b777a0129cfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://276d1e264f541d55720a57945800ec969959c0828071dcda4fb1c0f13b17118b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://276d1e264f541d55720a57945800ec969959c0828071dcda4fb1c0f13b17118b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6wcnk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:35Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:35 crc kubenswrapper[4743]: I1011 00:52:35.388810 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:35 crc kubenswrapper[4743]: I1011 00:52:35.388895 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:35 crc kubenswrapper[4743]: I1011 00:52:35.388914 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:35 crc kubenswrapper[4743]: I1011 00:52:35.388939 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:35 crc kubenswrapper[4743]: I1011 00:52:35.388958 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:35Z","lastTransitionTime":"2025-10-11T00:52:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:35 crc kubenswrapper[4743]: I1011 00:52:35.399627 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ed16b35-862f-47f2-9e32-63c98f868fb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d54b05ec17db8c8c6c3ff8e4d703de8f1f1f0d7e9b5b0133afdf30e3c2b8acee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4127195cac8b12296c2aaefc500beb5f746bee518f3885c094e7bca83ade5e6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://487448bc7b1826df7dbc3aeac42ad10d6aed3390c27c95dfda1b81fe09e27f35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acfe8e798b39769fe2643c6f761b388dc220ccfc707f0023dd680196514b8885\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17db7cfee85ea1051eba348aa0ec7bc8e1354c49e3e4740745a51d3db0ffcf63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d385e354cb831573c89886ade9127287f0974b7a7c244ead9e5547e0211fd9fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d34980f388c932c37e8d5a9122045fe57b4dc1cba358666d3558519a785497\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8d34980f388c932c37e8d5a9122045fe57b4dc1cba358666d3558519a785497\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-11T00:52:33Z\\\",\\\"message\\\":\\\"d syncing service metrics on namespace openshift-kube-apiserver-operator for network=default : 3.881381ms\\\\nI1011 00:52:33.160604 6362 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-scheduler/scheduler\\\\\\\"}\\\\nI1011 00:52:33.160615 6362 services_controller.go:360] Finished syncing service scheduler on namespace openshift-kube-scheduler for network=default : 2.970317ms\\\\nI1011 00:52:33.165904 6362 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1011 00:52:33.165984 6362 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1011 00:52:33.166020 6362 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1011 00:52:33.166766 6362 handler.go:208] Removed *v1.Node event handler 2\\\\nI1011 00:52:33.166850 6362 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1011 00:52:33.166973 6362 factory.go:656] Stopping watch factory\\\\nI1011 00:52:33.167000 6362 ovnkube.go:599] Stopped ovnkube\\\\nI1011 00:52:33.167036 6362 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1011 00:52:33.167068 6362 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1011 00:52:33.167200 6362 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-48ljj_openshift-ovn-kubernetes(9ed16b35-862f-47f2-9e32-63c98f868fb8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecec5075382354c47a6ec8f53263fe6c244c7aadfeab1dda60dd2e88cc8724b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f300eaf5d1c09f3637
f07c6d71c96b49e955d8319841834440872f13c061e736\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-48ljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:35Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:35 crc kubenswrapper[4743]: I1011 00:52:35.420549 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:35Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:35 crc kubenswrapper[4743]: I1011 00:52:35.491969 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:35 crc kubenswrapper[4743]: I1011 00:52:35.492026 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:35 crc kubenswrapper[4743]: I1011 00:52:35.492043 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:35 crc 
kubenswrapper[4743]: I1011 00:52:35.492067 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:35 crc kubenswrapper[4743]: I1011 00:52:35.492086 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:35Z","lastTransitionTime":"2025-10-11T00:52:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:35 crc kubenswrapper[4743]: I1011 00:52:35.594532 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:35 crc kubenswrapper[4743]: I1011 00:52:35.594590 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:35 crc kubenswrapper[4743]: I1011 00:52:35.594608 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:35 crc kubenswrapper[4743]: I1011 00:52:35.594632 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:35 crc kubenswrapper[4743]: I1011 00:52:35.594652 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:35Z","lastTransitionTime":"2025-10-11T00:52:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:35 crc kubenswrapper[4743]: I1011 00:52:35.698104 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:35 crc kubenswrapper[4743]: I1011 00:52:35.698179 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:35 crc kubenswrapper[4743]: I1011 00:52:35.698199 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:35 crc kubenswrapper[4743]: I1011 00:52:35.698228 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:35 crc kubenswrapper[4743]: I1011 00:52:35.698245 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:35Z","lastTransitionTime":"2025-10-11T00:52:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:35 crc kubenswrapper[4743]: I1011 00:52:35.801101 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:35 crc kubenswrapper[4743]: I1011 00:52:35.801188 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:35 crc kubenswrapper[4743]: I1011 00:52:35.801208 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:35 crc kubenswrapper[4743]: I1011 00:52:35.801235 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:35 crc kubenswrapper[4743]: I1011 00:52:35.801255 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:35Z","lastTransitionTime":"2025-10-11T00:52:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:35 crc kubenswrapper[4743]: I1011 00:52:35.904121 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:35 crc kubenswrapper[4743]: I1011 00:52:35.904194 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:35 crc kubenswrapper[4743]: I1011 00:52:35.904217 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:35 crc kubenswrapper[4743]: I1011 00:52:35.904252 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:35 crc kubenswrapper[4743]: I1011 00:52:35.904273 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:35Z","lastTransitionTime":"2025-10-11T00:52:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:36 crc kubenswrapper[4743]: I1011 00:52:36.011120 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:36 crc kubenswrapper[4743]: I1011 00:52:36.011223 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:36 crc kubenswrapper[4743]: I1011 00:52:36.011242 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:36 crc kubenswrapper[4743]: I1011 00:52:36.011273 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:36 crc kubenswrapper[4743]: I1011 00:52:36.011292 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:36Z","lastTransitionTime":"2025-10-11T00:52:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:36 crc kubenswrapper[4743]: I1011 00:52:36.091385 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cb5z5" Oct 11 00:52:36 crc kubenswrapper[4743]: I1011 00:52:36.091460 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 00:52:36 crc kubenswrapper[4743]: I1011 00:52:36.091462 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 00:52:36 crc kubenswrapper[4743]: I1011 00:52:36.091385 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 00:52:36 crc kubenswrapper[4743]: E1011 00:52:36.091612 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cb5z5" podUID="b02b8636-a5c4-447d-b1cf-401b3dcfa02b" Oct 11 00:52:36 crc kubenswrapper[4743]: E1011 00:52:36.091746 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 11 00:52:36 crc kubenswrapper[4743]: E1011 00:52:36.091928 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 11 00:52:36 crc kubenswrapper[4743]: E1011 00:52:36.092032 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 11 00:52:36 crc kubenswrapper[4743]: I1011 00:52:36.113775 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:36 crc kubenswrapper[4743]: I1011 00:52:36.113833 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:36 crc kubenswrapper[4743]: I1011 00:52:36.113851 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:36 crc kubenswrapper[4743]: I1011 00:52:36.113902 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:36 crc kubenswrapper[4743]: I1011 00:52:36.113923 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:36Z","lastTransitionTime":"2025-10-11T00:52:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:36 crc kubenswrapper[4743]: I1011 00:52:36.123547 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ed16b35-862f-47f2-9e32-63c98f868fb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d54b05ec17db8c8c6c3ff8e4d703de8f1f1f0d7e9b5b0133afdf30e3c2b8acee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4127195cac8b12296c2aaefc500beb5f746bee518f3885c094e7bca83ade5e6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://487448bc7b1826df7dbc3aeac42ad10d6aed3390c27c95dfda1b81fe09e27f35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acfe8e798b39769fe2643c6f761b388dc220ccfc707f0023dd680196514b8885\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17db7cfee85ea1051eba348aa0ec7bc8e1354c49e3e4740745a51d3db0ffcf63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d385e354cb831573c89886ade9127287f0974b7a7c244ead9e5547e0211fd9fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d34980f388c932c37e8d5a9122045fe57b4dc1cba358666d3558519a785497\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8d34980f388c932c37e8d5a9122045fe57b4dc1cba358666d3558519a785497\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-11T00:52:33Z\\\",\\\"message\\\":\\\"d syncing service metrics on namespace openshift-kube-apiserver-operator for network=default : 3.881381ms\\\\nI1011 00:52:33.160604 6362 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-scheduler/scheduler\\\\\\\"}\\\\nI1011 00:52:33.160615 6362 services_controller.go:360] Finished syncing service scheduler on namespace openshift-kube-scheduler for network=default : 2.970317ms\\\\nI1011 00:52:33.165904 6362 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1011 00:52:33.165984 6362 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1011 00:52:33.166020 6362 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1011 00:52:33.166766 6362 handler.go:208] Removed *v1.Node event handler 2\\\\nI1011 00:52:33.166850 6362 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1011 00:52:33.166973 6362 factory.go:656] Stopping watch factory\\\\nI1011 00:52:33.167000 6362 ovnkube.go:599] Stopped ovnkube\\\\nI1011 00:52:33.167036 6362 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1011 00:52:33.167068 6362 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1011 00:52:33.167200 6362 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-48ljj_openshift-ovn-kubernetes(9ed16b35-862f-47f2-9e32-63c98f868fb8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecec5075382354c47a6ec8f53263fe6c244c7aadfeab1dda60dd2e88cc8724b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f300eaf5d1c09f3637
f07c6d71c96b49e955d8319841834440872f13c061e736\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-48ljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:36Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:36 crc kubenswrapper[4743]: I1011 00:52:36.144720 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:36Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:36 crc kubenswrapper[4743]: I1011 00:52:36.162773 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd049f7c9c78e28d7cfdeb7087cf419e5db915fecd46a83ae0bdc8317fff9728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-11T00:52:36Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:36 crc kubenswrapper[4743]: I1011 00:52:36.183467 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:36Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:36 crc kubenswrapper[4743]: I1011 00:52:36.202463 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"add92263-e252-446b-95de-092585b4357f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be9668b466043d0e86521a2ea925416a4987cb051ca0ab8764fd0fd3095991f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d42sg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16b18bcf80747537e2a7a31adc90eec7fdba85d8
166f8cfc53709a1dba33de8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d42sg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cvm72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:36Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:36 crc kubenswrapper[4743]: I1011 00:52:36.216473 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:36 crc kubenswrapper[4743]: I1011 00:52:36.216532 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:36 crc kubenswrapper[4743]: I1011 00:52:36.216553 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:36 crc 
kubenswrapper[4743]: I1011 00:52:36.216580 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:36 crc kubenswrapper[4743]: I1011 00:52:36.216598 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:36Z","lastTransitionTime":"2025-10-11T00:52:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:36 crc kubenswrapper[4743]: I1011 00:52:36.226846 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6wcnk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06a7b971-8779-491c-8d3f-e7d5b4d60968\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38b5a40ddd5390cc392fcf2240304a72163dd797511640ada5ad39a59c0f4283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
eaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8adfcc8627f1a6ce588d50eabcc56c6814c3cff75d50144def98466d2dabb69a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8adfcc8627f1a6ce588d50eabcc56c6814c3cff75d50144def98466d2dabb69a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://800cdd03b73991fe9c1cfbe9a392b5f08854ca2f5f9e5106022bb67307578ad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://800cdd03b73991fe9c1cfbe9a392b5f08854ca2f5f9e5106022bb67307578ad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49661bf59f8e1d64e5980f0ed00dcc50c4c2ab52cc2232ade890883f0d0158d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49661bf59f8e1d64e5980f0ed00dcc50c4c2ab52cc2232ade890883f0d0158d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27d3d6d9f65eb7b710148bc4e077c74a15207b637727163afe631acfe3ec90f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27d3d6d9f65eb7b710148bc4e077c74a15207b637727163afe631acfe3ec90f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00edafd52ba2a97f21af491b5233bc2ad945e6b77106eb94a947b777a0129cfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00edafd52ba2a97f21af491b5233bc2ad945e6b77106eb94a947b777a0129cfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://276d1e264f541d55720a57945800ec969959c0828071dcda4fb1c0f13b17118b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27
6d1e264f541d55720a57945800ec969959c0828071dcda4fb1c0f13b17118b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6wcnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:36Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:36 crc kubenswrapper[4743]: I1011 00:52:36.243842 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-t9nsf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f07b1e57-3c09-4e75-866d-a4292db4e151\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc297cc894f36c7afa9ce15a8e9c71b2d7bf66042c55d0c7686f20b6a3ece7be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c97lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:14Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-t9nsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:36Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:36 crc kubenswrapper[4743]: I1011 00:52:36.263115 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab5ac12d-37cf-48f3-bb33-b93b4096c706\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4335571969a3cb66f2fefdad63b77b4a31fc2631aba5ba427b2af4db8b6c6f0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259
7126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://895152fd89f4c35c7d379d3a93c1b5f185275cd151d27b442b8f06280f3f74a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2bc6f9e58ae313b026b5597d2569b343931bc066ac8b3751c62f39c8d849bf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3532ffcf956621ad56477cbb9ff70c2855091a6ad996d3a15fa3c9a28943cea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3532ffcf956621ad56477cbb9ff70c2855091a6ad996d3a15fa3c9a28943cea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:36Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:36 crc kubenswrapper[4743]: I1011 00:52:36.296153 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"83857db2-dda0-4b62-9336-3393a2c23f3c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a46e8adc9aa9255991b607e98eaf6fc411aeb2a6b0edf45e3e9a7935a2eae5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97b59a1efc62279df4a9766fae0e700a435812160cf3a40a915c967f7a99c9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe9d2bc75aa162a0cab825a1f13cf35f20491a0317000d5a9775977fb7f6b556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf3820ae4ab59d420d2e7f001b2953f751c6dd75f90b5fcc76fdaee8c8df722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f14caad841305b82f04cf14411e2308edbcdd8ea2261ff8401edc95900855ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://818ca392b918c66a8ca013f3ec5b938595fc78b726af7ae31524c09dbe9302f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://818ca392b918c66a8ca013f3ec5b938595fc78b726af7ae31524c09dbe9302f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44506333e8bd3a20777cf0c382b7652d6d3fa1cdc13f056412cf4899e25115e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44506333e8bd3a20777cf0c382b7652d6d3fa1cdc13f056412cf4899e25115e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e870a1289513fe9e864779c13e63c0991fdaded247b35ac75b7810ab27683ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e870a1289513fe9e864779c13e63c0991fdaded247b35ac75b7810ab27683ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:36Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:36 crc kubenswrapper[4743]: I1011 00:52:36.317831 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a657286e-3a6d-4265-ae33-a7d2ce8b64ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1558bde7a3b8b01c9ce7f6907ecc43abc6e27587236be7b41487968effa33bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2616059cd6d57ff41cdaa2e1b35e0e8b474398c434022ba6b9e926e19e64c0f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0392332521983f1ea8939df48e05618035ca97895122067daef61ee1fdb7af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b297361a1a2f13aa53d7a5a58948dbe86e8e00f888e6a69cc884154828c29a41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://530c39a66f06ca33cba6a3ab75cb68148425f5a732b503d054b6c74a9d48fabf\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-11T00:52:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1011 00:51:59.674497 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1011 00:51:59.678201 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3177302691/tls.crt::/tmp/serving-cert-3177302691/tls.key\\\\\\\"\\\\nI1011 00:52:05.851548 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1011 00:52:05.855373 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1011 00:52:05.855414 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1011 00:52:05.855455 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1011 00:52:05.855472 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1011 00:52:05.866907 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1011 00:52:05.866935 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 00:52:05.866939 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 00:52:05.866944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1011 00:52:05.866947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1011 00:52:05.866951 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1011 00:52:05.866954 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1011 00:52:05.866970 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1011 00:52:05.870518 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1c0227825fa743bb310a1dfb08333cea5ee16a2a2f30b77c97e2f28a2319542\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d411bef56dd0ccdad159fc018c15ad4f0df6adee56150b1143e2dcf442a0fe18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d411bef56dd0ccdad159fc018c15ad4f0df6adee56150b1143e2dcf442a0fe18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:36Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:36 crc kubenswrapper[4743]: I1011 00:52:36.319480 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:36 crc kubenswrapper[4743]: I1011 00:52:36.319538 4743 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:36 crc kubenswrapper[4743]: I1011 00:52:36.319564 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:36 crc kubenswrapper[4743]: I1011 00:52:36.319594 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:36 crc kubenswrapper[4743]: I1011 00:52:36.319618 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:36Z","lastTransitionTime":"2025-10-11T00:52:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:36 crc kubenswrapper[4743]: I1011 00:52:36.334827 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vlxgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d677d122-c8be-4938-8d2c-bde4a088a63a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dbaee9cca1154a094e092eaaf1dea2aeb6a61fbf9323611275485f75ea911e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95k6c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vlxgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:36Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:36 crc kubenswrapper[4743]: I1011 00:52:36.357146 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cd23d7ad070c4d2f029567ac2186d797b082b8da8ff2c9811982a1c5c7b0ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:36Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:36 crc kubenswrapper[4743]: I1011 00:52:36.378103 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bca7230a67c579c0fc54921d292f52e18ce2c25f79cd4a351a9f6b576df3553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab9cd20bc5b28791e21984a0edf4fbf50acc05d6160780d0696698715242b5d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:36Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:36 crc kubenswrapper[4743]: I1011 00:52:36.398713 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ptjnt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cb3d634-b381-4ee0-a819-ce6f87fa8afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c47ab12f326ce6568bfd06d8beebdeffc07d464b747a288a14793db096ccc6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4dnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b62556627f75a8ed41c68a7fb1982461fd9fb
35965014b72ec15492e763e42b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4dnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ptjnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:36Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:36 crc kubenswrapper[4743]: I1011 00:52:36.415912 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cb5z5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b02b8636-a5c4-447d-b1cf-401b3dcfa02b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2zdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2zdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:22Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cb5z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:36Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:36 crc 
kubenswrapper[4743]: I1011 00:52:36.422155 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:36 crc kubenswrapper[4743]: I1011 00:52:36.422225 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:36 crc kubenswrapper[4743]: I1011 00:52:36.422246 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:36 crc kubenswrapper[4743]: I1011 00:52:36.422274 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:36 crc kubenswrapper[4743]: I1011 00:52:36.422293 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:36Z","lastTransitionTime":"2025-10-11T00:52:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:36 crc kubenswrapper[4743]: I1011 00:52:36.437396 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc432b8f-1fc8-4b7d-a202-bf1452921fc3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ed857549a3f84b624b8bf411ac4359bad74bbf8eea28bda8ebeda3b12dd34e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://748faceccab
50f961795c16fb7c31665af75d570094d1dd2c765bb2215b65c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc8d989925344208a60f51034c7c486de0165d5c5a9a1f966df45ac5685c89f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34de48136b54818c60eb929d4b5374689378190620a6502448759f172856d6af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:36Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:36 crc kubenswrapper[4743]: I1011 00:52:36.456426 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:36Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:36 crc kubenswrapper[4743]: I1011 00:52:36.477730 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9jfxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8c603f4-717c-4554-992a-8338b3bef24d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853ca9eee244aad8c7006298d6e541ea07022917bee14083b2ded233009257ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5b2pr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9jfxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:36Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:36 crc kubenswrapper[4743]: I1011 00:52:36.524786 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:36 crc 
kubenswrapper[4743]: I1011 00:52:36.524919 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:36 crc kubenswrapper[4743]: I1011 00:52:36.524939 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:36 crc kubenswrapper[4743]: I1011 00:52:36.524964 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:36 crc kubenswrapper[4743]: I1011 00:52:36.524982 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:36Z","lastTransitionTime":"2025-10-11T00:52:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:36 crc kubenswrapper[4743]: I1011 00:52:36.628451 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:36 crc kubenswrapper[4743]: I1011 00:52:36.628566 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:36 crc kubenswrapper[4743]: I1011 00:52:36.628622 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:36 crc kubenswrapper[4743]: I1011 00:52:36.628647 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:36 crc kubenswrapper[4743]: I1011 00:52:36.628663 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:36Z","lastTransitionTime":"2025-10-11T00:52:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:36 crc kubenswrapper[4743]: I1011 00:52:36.732440 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:36 crc kubenswrapper[4743]: I1011 00:52:36.732502 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:36 crc kubenswrapper[4743]: I1011 00:52:36.732520 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:36 crc kubenswrapper[4743]: I1011 00:52:36.732553 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:36 crc kubenswrapper[4743]: I1011 00:52:36.732571 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:36Z","lastTransitionTime":"2025-10-11T00:52:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:36 crc kubenswrapper[4743]: I1011 00:52:36.835733 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:36 crc kubenswrapper[4743]: I1011 00:52:36.835802 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:36 crc kubenswrapper[4743]: I1011 00:52:36.835824 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:36 crc kubenswrapper[4743]: I1011 00:52:36.835897 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:36 crc kubenswrapper[4743]: I1011 00:52:36.835925 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:36Z","lastTransitionTime":"2025-10-11T00:52:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:36 crc kubenswrapper[4743]: I1011 00:52:36.939259 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:36 crc kubenswrapper[4743]: I1011 00:52:36.939374 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:36 crc kubenswrapper[4743]: I1011 00:52:36.939391 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:36 crc kubenswrapper[4743]: I1011 00:52:36.939410 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:36 crc kubenswrapper[4743]: I1011 00:52:36.939421 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:36Z","lastTransitionTime":"2025-10-11T00:52:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:37 crc kubenswrapper[4743]: I1011 00:52:37.042233 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:37 crc kubenswrapper[4743]: I1011 00:52:37.042288 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:37 crc kubenswrapper[4743]: I1011 00:52:37.042306 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:37 crc kubenswrapper[4743]: I1011 00:52:37.042333 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:37 crc kubenswrapper[4743]: I1011 00:52:37.042353 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:37Z","lastTransitionTime":"2025-10-11T00:52:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:37 crc kubenswrapper[4743]: I1011 00:52:37.072602 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:37 crc kubenswrapper[4743]: I1011 00:52:37.072663 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:37 crc kubenswrapper[4743]: I1011 00:52:37.072679 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:37 crc kubenswrapper[4743]: I1011 00:52:37.072704 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:37 crc kubenswrapper[4743]: I1011 00:52:37.072722 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:37Z","lastTransitionTime":"2025-10-11T00:52:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:37 crc kubenswrapper[4743]: E1011 00:52:37.096135 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4a117022-09d2-46e0-826b-22308ec25890\\\",\\\"systemUUID\\\":\\\"407eb137-47d1-41e8-9c72-65f09e76d21a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:37Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:37 crc kubenswrapper[4743]: I1011 00:52:37.101853 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:37 crc kubenswrapper[4743]: I1011 00:52:37.101927 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:37 crc kubenswrapper[4743]: I1011 00:52:37.101943 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:37 crc kubenswrapper[4743]: I1011 00:52:37.101965 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:37 crc kubenswrapper[4743]: I1011 00:52:37.101984 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:37Z","lastTransitionTime":"2025-10-11T00:52:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:37 crc kubenswrapper[4743]: E1011 00:52:37.122635 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4a117022-09d2-46e0-826b-22308ec25890\\\",\\\"systemUUID\\\":\\\"407eb137-47d1-41e8-9c72-65f09e76d21a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:37Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:37 crc kubenswrapper[4743]: I1011 00:52:37.128162 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:37 crc kubenswrapper[4743]: I1011 00:52:37.128241 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:37 crc kubenswrapper[4743]: I1011 00:52:37.128269 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:37 crc kubenswrapper[4743]: I1011 00:52:37.128305 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:37 crc kubenswrapper[4743]: I1011 00:52:37.128365 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:37Z","lastTransitionTime":"2025-10-11T00:52:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:37 crc kubenswrapper[4743]: E1011 00:52:37.148669 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4a117022-09d2-46e0-826b-22308ec25890\\\",\\\"systemUUID\\\":\\\"407eb137-47d1-41e8-9c72-65f09e76d21a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:37Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:37 crc kubenswrapper[4743]: I1011 00:52:37.153627 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:37 crc kubenswrapper[4743]: I1011 00:52:37.153679 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:37 crc kubenswrapper[4743]: I1011 00:52:37.153711 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:37 crc kubenswrapper[4743]: I1011 00:52:37.153743 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:37 crc kubenswrapper[4743]: I1011 00:52:37.153761 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:37Z","lastTransitionTime":"2025-10-11T00:52:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:37 crc kubenswrapper[4743]: E1011 00:52:37.174405 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4a117022-09d2-46e0-826b-22308ec25890\\\",\\\"systemUUID\\\":\\\"407eb137-47d1-41e8-9c72-65f09e76d21a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:37Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:37 crc kubenswrapper[4743]: I1011 00:52:37.179202 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:37 crc kubenswrapper[4743]: I1011 00:52:37.179253 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:37 crc kubenswrapper[4743]: I1011 00:52:37.179271 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:37 crc kubenswrapper[4743]: I1011 00:52:37.179292 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:37 crc kubenswrapper[4743]: I1011 00:52:37.179310 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:37Z","lastTransitionTime":"2025-10-11T00:52:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:37 crc kubenswrapper[4743]: E1011 00:52:37.199648 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4a117022-09d2-46e0-826b-22308ec25890\\\",\\\"systemUUID\\\":\\\"407eb137-47d1-41e8-9c72-65f09e76d21a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:37Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:37 crc kubenswrapper[4743]: E1011 00:52:37.200018 4743 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 11 00:52:37 crc kubenswrapper[4743]: I1011 00:52:37.202919 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:37 crc kubenswrapper[4743]: I1011 00:52:37.202970 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:37 crc kubenswrapper[4743]: I1011 00:52:37.202979 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:37 crc kubenswrapper[4743]: I1011 00:52:37.202997 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:37 crc kubenswrapper[4743]: I1011 00:52:37.203010 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:37Z","lastTransitionTime":"2025-10-11T00:52:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:37 crc kubenswrapper[4743]: I1011 00:52:37.306324 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:37 crc kubenswrapper[4743]: I1011 00:52:37.306359 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:37 crc kubenswrapper[4743]: I1011 00:52:37.306367 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:37 crc kubenswrapper[4743]: I1011 00:52:37.306382 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:37 crc kubenswrapper[4743]: I1011 00:52:37.306391 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:37Z","lastTransitionTime":"2025-10-11T00:52:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:37 crc kubenswrapper[4743]: I1011 00:52:37.409614 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:37 crc kubenswrapper[4743]: I1011 00:52:37.409696 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:37 crc kubenswrapper[4743]: I1011 00:52:37.409718 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:37 crc kubenswrapper[4743]: I1011 00:52:37.409748 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:37 crc kubenswrapper[4743]: I1011 00:52:37.409773 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:37Z","lastTransitionTime":"2025-10-11T00:52:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:37 crc kubenswrapper[4743]: I1011 00:52:37.512106 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:37 crc kubenswrapper[4743]: I1011 00:52:37.512177 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:37 crc kubenswrapper[4743]: I1011 00:52:37.512198 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:37 crc kubenswrapper[4743]: I1011 00:52:37.512224 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:37 crc kubenswrapper[4743]: I1011 00:52:37.512241 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:37Z","lastTransitionTime":"2025-10-11T00:52:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:37 crc kubenswrapper[4743]: I1011 00:52:37.615247 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:37 crc kubenswrapper[4743]: I1011 00:52:37.615313 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:37 crc kubenswrapper[4743]: I1011 00:52:37.615339 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:37 crc kubenswrapper[4743]: I1011 00:52:37.615367 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:37 crc kubenswrapper[4743]: I1011 00:52:37.615388 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:37Z","lastTransitionTime":"2025-10-11T00:52:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:37 crc kubenswrapper[4743]: I1011 00:52:37.659189 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 00:52:37 crc kubenswrapper[4743]: E1011 00:52:37.659422 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-11 00:53:09.659387404 +0000 UTC m=+84.312367831 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 00:52:37 crc kubenswrapper[4743]: I1011 00:52:37.718065 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:37 crc kubenswrapper[4743]: I1011 00:52:37.718134 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:37 crc kubenswrapper[4743]: I1011 00:52:37.718159 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:37 crc kubenswrapper[4743]: I1011 00:52:37.718193 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:37 crc kubenswrapper[4743]: I1011 00:52:37.718214 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:37Z","lastTransitionTime":"2025-10-11T00:52:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:37 crc kubenswrapper[4743]: I1011 00:52:37.821053 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:37 crc kubenswrapper[4743]: I1011 00:52:37.821157 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:37 crc kubenswrapper[4743]: I1011 00:52:37.821176 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:37 crc kubenswrapper[4743]: I1011 00:52:37.821200 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:37 crc kubenswrapper[4743]: I1011 00:52:37.821218 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:37Z","lastTransitionTime":"2025-10-11T00:52:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:37 crc kubenswrapper[4743]: I1011 00:52:37.861939 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 00:52:37 crc kubenswrapper[4743]: I1011 00:52:37.861997 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 00:52:37 crc kubenswrapper[4743]: I1011 00:52:37.862029 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 00:52:37 crc kubenswrapper[4743]: I1011 00:52:37.862051 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 00:52:37 crc kubenswrapper[4743]: E1011 00:52:37.862080 4743 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Oct 11 00:52:37 crc kubenswrapper[4743]: E1011 00:52:37.862167 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-11 00:53:09.862142118 +0000 UTC m=+84.515122545 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 11 00:52:37 crc kubenswrapper[4743]: E1011 00:52:37.862191 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 11 00:52:37 crc kubenswrapper[4743]: E1011 00:52:37.862213 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 11 00:52:37 crc kubenswrapper[4743]: E1011 00:52:37.862224 4743 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 11 00:52:37 crc kubenswrapper[4743]: E1011 00:52:37.862286 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2025-10-11 00:53:09.862269752 +0000 UTC m=+84.515250149 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 11 00:52:37 crc kubenswrapper[4743]: E1011 00:52:37.862365 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 11 00:52:37 crc kubenswrapper[4743]: E1011 00:52:37.862430 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 11 00:52:37 crc kubenswrapper[4743]: E1011 00:52:37.862451 4743 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 11 00:52:37 crc kubenswrapper[4743]: E1011 00:52:37.862373 4743 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 11 00:52:37 crc kubenswrapper[4743]: E1011 00:52:37.862569 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2025-10-11 00:53:09.862538689 +0000 UTC m=+84.515519126 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 11 00:52:37 crc kubenswrapper[4743]: E1011 00:52:37.862676 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-11 00:53:09.862658822 +0000 UTC m=+84.515639259 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 11 00:52:37 crc kubenswrapper[4743]: I1011 00:52:37.923913 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:37 crc kubenswrapper[4743]: I1011 00:52:37.923967 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:37 crc kubenswrapper[4743]: I1011 00:52:37.923987 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:37 crc kubenswrapper[4743]: I1011 00:52:37.924081 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:37 crc kubenswrapper[4743]: I1011 
00:52:37.924111 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:37Z","lastTransitionTime":"2025-10-11T00:52:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:38 crc kubenswrapper[4743]: I1011 00:52:38.027119 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:38 crc kubenswrapper[4743]: I1011 00:52:38.027182 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:38 crc kubenswrapper[4743]: I1011 00:52:38.027200 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:38 crc kubenswrapper[4743]: I1011 00:52:38.027224 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:38 crc kubenswrapper[4743]: I1011 00:52:38.027242 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:38Z","lastTransitionTime":"2025-10-11T00:52:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:38 crc kubenswrapper[4743]: I1011 00:52:38.091472 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cb5z5" Oct 11 00:52:38 crc kubenswrapper[4743]: I1011 00:52:38.091538 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 00:52:38 crc kubenswrapper[4743]: I1011 00:52:38.091575 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 00:52:38 crc kubenswrapper[4743]: E1011 00:52:38.091709 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cb5z5" podUID="b02b8636-a5c4-447d-b1cf-401b3dcfa02b" Oct 11 00:52:38 crc kubenswrapper[4743]: I1011 00:52:38.091789 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 00:52:38 crc kubenswrapper[4743]: E1011 00:52:38.092007 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 11 00:52:38 crc kubenswrapper[4743]: E1011 00:52:38.092158 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 11 00:52:38 crc kubenswrapper[4743]: E1011 00:52:38.092955 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 11 00:52:38 crc kubenswrapper[4743]: I1011 00:52:38.129786 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:38 crc kubenswrapper[4743]: I1011 00:52:38.129840 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:38 crc kubenswrapper[4743]: I1011 00:52:38.129896 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:38 crc kubenswrapper[4743]: I1011 00:52:38.129921 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:38 crc kubenswrapper[4743]: I1011 00:52:38.129937 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:38Z","lastTransitionTime":"2025-10-11T00:52:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:38 crc kubenswrapper[4743]: I1011 00:52:38.164932 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b02b8636-a5c4-447d-b1cf-401b3dcfa02b-metrics-certs\") pod \"network-metrics-daemon-cb5z5\" (UID: \"b02b8636-a5c4-447d-b1cf-401b3dcfa02b\") " pod="openshift-multus/network-metrics-daemon-cb5z5" Oct 11 00:52:38 crc kubenswrapper[4743]: E1011 00:52:38.165141 4743 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 11 00:52:38 crc kubenswrapper[4743]: E1011 00:52:38.165228 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b02b8636-a5c4-447d-b1cf-401b3dcfa02b-metrics-certs podName:b02b8636-a5c4-447d-b1cf-401b3dcfa02b nodeName:}" failed. No retries permitted until 2025-10-11 00:52:54.165204532 +0000 UTC m=+68.818184959 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b02b8636-a5c4-447d-b1cf-401b3dcfa02b-metrics-certs") pod "network-metrics-daemon-cb5z5" (UID: "b02b8636-a5c4-447d-b1cf-401b3dcfa02b") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 11 00:52:38 crc kubenswrapper[4743]: I1011 00:52:38.233126 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:38 crc kubenswrapper[4743]: I1011 00:52:38.233182 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:38 crc kubenswrapper[4743]: I1011 00:52:38.233200 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:38 crc kubenswrapper[4743]: I1011 00:52:38.233228 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:38 crc kubenswrapper[4743]: I1011 00:52:38.233246 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:38Z","lastTransitionTime":"2025-10-11T00:52:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:38 crc kubenswrapper[4743]: I1011 00:52:38.336678 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:38 crc kubenswrapper[4743]: I1011 00:52:38.336711 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:38 crc kubenswrapper[4743]: I1011 00:52:38.336720 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:38 crc kubenswrapper[4743]: I1011 00:52:38.336734 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:38 crc kubenswrapper[4743]: I1011 00:52:38.336743 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:38Z","lastTransitionTime":"2025-10-11T00:52:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:38 crc kubenswrapper[4743]: I1011 00:52:38.439123 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:38 crc kubenswrapper[4743]: I1011 00:52:38.439473 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:38 crc kubenswrapper[4743]: I1011 00:52:38.439635 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:38 crc kubenswrapper[4743]: I1011 00:52:38.439781 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:38 crc kubenswrapper[4743]: I1011 00:52:38.439950 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:38Z","lastTransitionTime":"2025-10-11T00:52:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:38 crc kubenswrapper[4743]: I1011 00:52:38.542504 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:38 crc kubenswrapper[4743]: I1011 00:52:38.542838 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:38 crc kubenswrapper[4743]: I1011 00:52:38.543126 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:38 crc kubenswrapper[4743]: I1011 00:52:38.543343 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:38 crc kubenswrapper[4743]: I1011 00:52:38.543582 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:38Z","lastTransitionTime":"2025-10-11T00:52:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:38 crc kubenswrapper[4743]: I1011 00:52:38.646502 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:38 crc kubenswrapper[4743]: I1011 00:52:38.646907 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:38 crc kubenswrapper[4743]: I1011 00:52:38.647099 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:38 crc kubenswrapper[4743]: I1011 00:52:38.647315 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:38 crc kubenswrapper[4743]: I1011 00:52:38.647525 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:38Z","lastTransitionTime":"2025-10-11T00:52:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:38 crc kubenswrapper[4743]: I1011 00:52:38.751633 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:38 crc kubenswrapper[4743]: I1011 00:52:38.751735 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:38 crc kubenswrapper[4743]: I1011 00:52:38.751795 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:38 crc kubenswrapper[4743]: I1011 00:52:38.751821 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:38 crc kubenswrapper[4743]: I1011 00:52:38.751902 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:38Z","lastTransitionTime":"2025-10-11T00:52:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:38 crc kubenswrapper[4743]: I1011 00:52:38.854458 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:38 crc kubenswrapper[4743]: I1011 00:52:38.854893 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:38 crc kubenswrapper[4743]: I1011 00:52:38.855039 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:38 crc kubenswrapper[4743]: I1011 00:52:38.855218 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:38 crc kubenswrapper[4743]: I1011 00:52:38.855421 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:38Z","lastTransitionTime":"2025-10-11T00:52:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:38 crc kubenswrapper[4743]: I1011 00:52:38.958985 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:38 crc kubenswrapper[4743]: I1011 00:52:38.959043 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:38 crc kubenswrapper[4743]: I1011 00:52:38.959059 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:38 crc kubenswrapper[4743]: I1011 00:52:38.959083 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:38 crc kubenswrapper[4743]: I1011 00:52:38.959101 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:38Z","lastTransitionTime":"2025-10-11T00:52:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:39 crc kubenswrapper[4743]: I1011 00:52:39.062486 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:39 crc kubenswrapper[4743]: I1011 00:52:39.062884 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:39 crc kubenswrapper[4743]: I1011 00:52:39.063064 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:39 crc kubenswrapper[4743]: I1011 00:52:39.063214 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:39 crc kubenswrapper[4743]: I1011 00:52:39.063349 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:39Z","lastTransitionTime":"2025-10-11T00:52:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:39 crc kubenswrapper[4743]: I1011 00:52:39.166852 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:39 crc kubenswrapper[4743]: I1011 00:52:39.166942 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:39 crc kubenswrapper[4743]: I1011 00:52:39.166960 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:39 crc kubenswrapper[4743]: I1011 00:52:39.166985 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:39 crc kubenswrapper[4743]: I1011 00:52:39.167006 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:39Z","lastTransitionTime":"2025-10-11T00:52:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:39 crc kubenswrapper[4743]: I1011 00:52:39.270250 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:39 crc kubenswrapper[4743]: I1011 00:52:39.270314 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:39 crc kubenswrapper[4743]: I1011 00:52:39.270331 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:39 crc kubenswrapper[4743]: I1011 00:52:39.270356 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:39 crc kubenswrapper[4743]: I1011 00:52:39.270374 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:39Z","lastTransitionTime":"2025-10-11T00:52:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:39 crc kubenswrapper[4743]: I1011 00:52:39.373807 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:39 crc kubenswrapper[4743]: I1011 00:52:39.373921 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:39 crc kubenswrapper[4743]: I1011 00:52:39.373944 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:39 crc kubenswrapper[4743]: I1011 00:52:39.373972 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:39 crc kubenswrapper[4743]: I1011 00:52:39.373991 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:39Z","lastTransitionTime":"2025-10-11T00:52:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:39 crc kubenswrapper[4743]: I1011 00:52:39.477381 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:39 crc kubenswrapper[4743]: I1011 00:52:39.477459 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:39 crc kubenswrapper[4743]: I1011 00:52:39.477482 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:39 crc kubenswrapper[4743]: I1011 00:52:39.477514 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:39 crc kubenswrapper[4743]: I1011 00:52:39.477540 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:39Z","lastTransitionTime":"2025-10-11T00:52:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:39 crc kubenswrapper[4743]: I1011 00:52:39.581084 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:39 crc kubenswrapper[4743]: I1011 00:52:39.581146 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:39 crc kubenswrapper[4743]: I1011 00:52:39.581167 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:39 crc kubenswrapper[4743]: I1011 00:52:39.581192 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:39 crc kubenswrapper[4743]: I1011 00:52:39.581209 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:39Z","lastTransitionTime":"2025-10-11T00:52:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:39 crc kubenswrapper[4743]: I1011 00:52:39.684366 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:39 crc kubenswrapper[4743]: I1011 00:52:39.684434 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:39 crc kubenswrapper[4743]: I1011 00:52:39.684452 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:39 crc kubenswrapper[4743]: I1011 00:52:39.684478 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:39 crc kubenswrapper[4743]: I1011 00:52:39.684496 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:39Z","lastTransitionTime":"2025-10-11T00:52:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:39 crc kubenswrapper[4743]: I1011 00:52:39.787641 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:39 crc kubenswrapper[4743]: I1011 00:52:39.787708 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:39 crc kubenswrapper[4743]: I1011 00:52:39.787726 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:39 crc kubenswrapper[4743]: I1011 00:52:39.787751 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:39 crc kubenswrapper[4743]: I1011 00:52:39.787769 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:39Z","lastTransitionTime":"2025-10-11T00:52:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:39 crc kubenswrapper[4743]: I1011 00:52:39.890895 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:39 crc kubenswrapper[4743]: I1011 00:52:39.890976 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:39 crc kubenswrapper[4743]: I1011 00:52:39.891002 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:39 crc kubenswrapper[4743]: I1011 00:52:39.891030 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:39 crc kubenswrapper[4743]: I1011 00:52:39.891049 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:39Z","lastTransitionTime":"2025-10-11T00:52:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:39 crc kubenswrapper[4743]: I1011 00:52:39.993930 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:39 crc kubenswrapper[4743]: I1011 00:52:39.993988 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:39 crc kubenswrapper[4743]: I1011 00:52:39.994005 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:39 crc kubenswrapper[4743]: I1011 00:52:39.994028 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:39 crc kubenswrapper[4743]: I1011 00:52:39.994048 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:39Z","lastTransitionTime":"2025-10-11T00:52:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:40 crc kubenswrapper[4743]: I1011 00:52:40.091651 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 00:52:40 crc kubenswrapper[4743]: I1011 00:52:40.091682 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 00:52:40 crc kubenswrapper[4743]: I1011 00:52:40.091719 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cb5z5" Oct 11 00:52:40 crc kubenswrapper[4743]: E1011 00:52:40.091827 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 11 00:52:40 crc kubenswrapper[4743]: I1011 00:52:40.092356 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 00:52:40 crc kubenswrapper[4743]: E1011 00:52:40.092509 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cb5z5" podUID="b02b8636-a5c4-447d-b1cf-401b3dcfa02b" Oct 11 00:52:40 crc kubenswrapper[4743]: E1011 00:52:40.092394 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 11 00:52:40 crc kubenswrapper[4743]: E1011 00:52:40.092892 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 11 00:52:40 crc kubenswrapper[4743]: I1011 00:52:40.097851 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:40 crc kubenswrapper[4743]: I1011 00:52:40.097946 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:40 crc kubenswrapper[4743]: I1011 00:52:40.097977 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:40 crc kubenswrapper[4743]: I1011 00:52:40.098048 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:40 crc kubenswrapper[4743]: I1011 00:52:40.098075 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:40Z","lastTransitionTime":"2025-10-11T00:52:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:40 crc kubenswrapper[4743]: I1011 00:52:40.221224 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:40 crc kubenswrapper[4743]: I1011 00:52:40.221284 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:40 crc kubenswrapper[4743]: I1011 00:52:40.221302 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:40 crc kubenswrapper[4743]: I1011 00:52:40.221329 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:40 crc kubenswrapper[4743]: I1011 00:52:40.221348 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:40Z","lastTransitionTime":"2025-10-11T00:52:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:40 crc kubenswrapper[4743]: I1011 00:52:40.324945 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:40 crc kubenswrapper[4743]: I1011 00:52:40.324997 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:40 crc kubenswrapper[4743]: I1011 00:52:40.325010 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:40 crc kubenswrapper[4743]: I1011 00:52:40.325030 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:40 crc kubenswrapper[4743]: I1011 00:52:40.325041 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:40Z","lastTransitionTime":"2025-10-11T00:52:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:40 crc kubenswrapper[4743]: I1011 00:52:40.428761 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:40 crc kubenswrapper[4743]: I1011 00:52:40.428798 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:40 crc kubenswrapper[4743]: I1011 00:52:40.428810 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:40 crc kubenswrapper[4743]: I1011 00:52:40.428825 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:40 crc kubenswrapper[4743]: I1011 00:52:40.428835 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:40Z","lastTransitionTime":"2025-10-11T00:52:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:40 crc kubenswrapper[4743]: I1011 00:52:40.531996 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:40 crc kubenswrapper[4743]: I1011 00:52:40.532058 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:40 crc kubenswrapper[4743]: I1011 00:52:40.532073 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:40 crc kubenswrapper[4743]: I1011 00:52:40.532095 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:40 crc kubenswrapper[4743]: I1011 00:52:40.532113 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:40Z","lastTransitionTime":"2025-10-11T00:52:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:40 crc kubenswrapper[4743]: I1011 00:52:40.635710 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:40 crc kubenswrapper[4743]: I1011 00:52:40.635790 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:40 crc kubenswrapper[4743]: I1011 00:52:40.635814 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:40 crc kubenswrapper[4743]: I1011 00:52:40.635844 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:40 crc kubenswrapper[4743]: I1011 00:52:40.635907 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:40Z","lastTransitionTime":"2025-10-11T00:52:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:40 crc kubenswrapper[4743]: I1011 00:52:40.738483 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:40 crc kubenswrapper[4743]: I1011 00:52:40.738568 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:40 crc kubenswrapper[4743]: I1011 00:52:40.738580 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:40 crc kubenswrapper[4743]: I1011 00:52:40.738621 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:40 crc kubenswrapper[4743]: I1011 00:52:40.738637 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:40Z","lastTransitionTime":"2025-10-11T00:52:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:40 crc kubenswrapper[4743]: I1011 00:52:40.842207 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:40 crc kubenswrapper[4743]: I1011 00:52:40.842273 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:40 crc kubenswrapper[4743]: I1011 00:52:40.842293 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:40 crc kubenswrapper[4743]: I1011 00:52:40.842322 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:40 crc kubenswrapper[4743]: I1011 00:52:40.842346 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:40Z","lastTransitionTime":"2025-10-11T00:52:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:40 crc kubenswrapper[4743]: I1011 00:52:40.945014 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:40 crc kubenswrapper[4743]: I1011 00:52:40.945108 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:40 crc kubenswrapper[4743]: I1011 00:52:40.945125 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:40 crc kubenswrapper[4743]: I1011 00:52:40.945150 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:40 crc kubenswrapper[4743]: I1011 00:52:40.945167 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:40Z","lastTransitionTime":"2025-10-11T00:52:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:41 crc kubenswrapper[4743]: I1011 00:52:41.048661 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:41 crc kubenswrapper[4743]: I1011 00:52:41.048740 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:41 crc kubenswrapper[4743]: I1011 00:52:41.048757 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:41 crc kubenswrapper[4743]: I1011 00:52:41.048787 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:41 crc kubenswrapper[4743]: I1011 00:52:41.048807 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:41Z","lastTransitionTime":"2025-10-11T00:52:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:41 crc kubenswrapper[4743]: I1011 00:52:41.152531 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:41 crc kubenswrapper[4743]: I1011 00:52:41.152593 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:41 crc kubenswrapper[4743]: I1011 00:52:41.152643 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:41 crc kubenswrapper[4743]: I1011 00:52:41.152675 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:41 crc kubenswrapper[4743]: I1011 00:52:41.152694 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:41Z","lastTransitionTime":"2025-10-11T00:52:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:41 crc kubenswrapper[4743]: I1011 00:52:41.255369 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:41 crc kubenswrapper[4743]: I1011 00:52:41.255461 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:41 crc kubenswrapper[4743]: I1011 00:52:41.255481 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:41 crc kubenswrapper[4743]: I1011 00:52:41.255505 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:41 crc kubenswrapper[4743]: I1011 00:52:41.255521 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:41Z","lastTransitionTime":"2025-10-11T00:52:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:41 crc kubenswrapper[4743]: I1011 00:52:41.359090 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:41 crc kubenswrapper[4743]: I1011 00:52:41.359157 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:41 crc kubenswrapper[4743]: I1011 00:52:41.359181 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:41 crc kubenswrapper[4743]: I1011 00:52:41.359212 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:41 crc kubenswrapper[4743]: I1011 00:52:41.359238 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:41Z","lastTransitionTime":"2025-10-11T00:52:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:41 crc kubenswrapper[4743]: I1011 00:52:41.462168 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:41 crc kubenswrapper[4743]: I1011 00:52:41.462236 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:41 crc kubenswrapper[4743]: I1011 00:52:41.462276 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:41 crc kubenswrapper[4743]: I1011 00:52:41.462306 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:41 crc kubenswrapper[4743]: I1011 00:52:41.462327 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:41Z","lastTransitionTime":"2025-10-11T00:52:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:41 crc kubenswrapper[4743]: I1011 00:52:41.565031 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:41 crc kubenswrapper[4743]: I1011 00:52:41.565094 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:41 crc kubenswrapper[4743]: I1011 00:52:41.565112 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:41 crc kubenswrapper[4743]: I1011 00:52:41.565142 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:41 crc kubenswrapper[4743]: I1011 00:52:41.565161 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:41Z","lastTransitionTime":"2025-10-11T00:52:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:41 crc kubenswrapper[4743]: I1011 00:52:41.668536 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:41 crc kubenswrapper[4743]: I1011 00:52:41.668613 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:41 crc kubenswrapper[4743]: I1011 00:52:41.668633 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:41 crc kubenswrapper[4743]: I1011 00:52:41.668659 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:41 crc kubenswrapper[4743]: I1011 00:52:41.668677 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:41Z","lastTransitionTime":"2025-10-11T00:52:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:41 crc kubenswrapper[4743]: I1011 00:52:41.771822 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:41 crc kubenswrapper[4743]: I1011 00:52:41.771895 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:41 crc kubenswrapper[4743]: I1011 00:52:41.771914 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:41 crc kubenswrapper[4743]: I1011 00:52:41.771937 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:41 crc kubenswrapper[4743]: I1011 00:52:41.771954 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:41Z","lastTransitionTime":"2025-10-11T00:52:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:41 crc kubenswrapper[4743]: I1011 00:52:41.875326 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:41 crc kubenswrapper[4743]: I1011 00:52:41.875394 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:41 crc kubenswrapper[4743]: I1011 00:52:41.875420 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:41 crc kubenswrapper[4743]: I1011 00:52:41.875461 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:41 crc kubenswrapper[4743]: I1011 00:52:41.875483 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:41Z","lastTransitionTime":"2025-10-11T00:52:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:41 crc kubenswrapper[4743]: I1011 00:52:41.978585 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:41 crc kubenswrapper[4743]: I1011 00:52:41.978646 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:41 crc kubenswrapper[4743]: I1011 00:52:41.978667 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:41 crc kubenswrapper[4743]: I1011 00:52:41.978697 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:41 crc kubenswrapper[4743]: I1011 00:52:41.978716 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:41Z","lastTransitionTime":"2025-10-11T00:52:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:42 crc kubenswrapper[4743]: I1011 00:52:42.082162 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:42 crc kubenswrapper[4743]: I1011 00:52:42.082213 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:42 crc kubenswrapper[4743]: I1011 00:52:42.082230 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:42 crc kubenswrapper[4743]: I1011 00:52:42.082252 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:42 crc kubenswrapper[4743]: I1011 00:52:42.082269 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:42Z","lastTransitionTime":"2025-10-11T00:52:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:42 crc kubenswrapper[4743]: I1011 00:52:42.090753 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 00:52:42 crc kubenswrapper[4743]: I1011 00:52:42.090836 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 00:52:42 crc kubenswrapper[4743]: E1011 00:52:42.090941 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 11 00:52:42 crc kubenswrapper[4743]: I1011 00:52:42.090969 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 00:52:42 crc kubenswrapper[4743]: E1011 00:52:42.091038 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 11 00:52:42 crc kubenswrapper[4743]: E1011 00:52:42.091211 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 11 00:52:42 crc kubenswrapper[4743]: I1011 00:52:42.091597 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cb5z5" Oct 11 00:52:42 crc kubenswrapper[4743]: E1011 00:52:42.091800 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cb5z5" podUID="b02b8636-a5c4-447d-b1cf-401b3dcfa02b" Oct 11 00:52:42 crc kubenswrapper[4743]: I1011 00:52:42.184786 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:42 crc kubenswrapper[4743]: I1011 00:52:42.184822 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:42 crc kubenswrapper[4743]: I1011 00:52:42.184834 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:42 crc kubenswrapper[4743]: I1011 00:52:42.184852 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:42 crc kubenswrapper[4743]: I1011 00:52:42.184888 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:42Z","lastTransitionTime":"2025-10-11T00:52:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:42 crc kubenswrapper[4743]: I1011 00:52:42.287165 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:42 crc kubenswrapper[4743]: I1011 00:52:42.287201 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:42 crc kubenswrapper[4743]: I1011 00:52:42.287213 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:42 crc kubenswrapper[4743]: I1011 00:52:42.287230 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:42 crc kubenswrapper[4743]: I1011 00:52:42.287242 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:42Z","lastTransitionTime":"2025-10-11T00:52:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:42 crc kubenswrapper[4743]: I1011 00:52:42.390287 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:42 crc kubenswrapper[4743]: I1011 00:52:42.390351 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:42 crc kubenswrapper[4743]: I1011 00:52:42.390371 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:42 crc kubenswrapper[4743]: I1011 00:52:42.390398 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:42 crc kubenswrapper[4743]: I1011 00:52:42.390416 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:42Z","lastTransitionTime":"2025-10-11T00:52:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:42 crc kubenswrapper[4743]: I1011 00:52:42.493915 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:42 crc kubenswrapper[4743]: I1011 00:52:42.493968 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:42 crc kubenswrapper[4743]: I1011 00:52:42.493986 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:42 crc kubenswrapper[4743]: I1011 00:52:42.494009 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:42 crc kubenswrapper[4743]: I1011 00:52:42.494026 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:42Z","lastTransitionTime":"2025-10-11T00:52:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:44 crc kubenswrapper[4743]: I1011 00:52:44.090810 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 00:52:44 crc kubenswrapper[4743]: I1011 00:52:44.090902 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 00:52:44 crc kubenswrapper[4743]: I1011 00:52:44.090893 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cb5z5" Oct 11 00:52:44 crc kubenswrapper[4743]: I1011 00:52:44.090810 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 00:52:44 crc kubenswrapper[4743]: E1011 00:52:44.091044 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 11 00:52:44 crc kubenswrapper[4743]: E1011 00:52:44.091192 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 11 00:52:44 crc kubenswrapper[4743]: E1011 00:52:44.091403 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cb5z5" podUID="b02b8636-a5c4-447d-b1cf-401b3dcfa02b" Oct 11 00:52:44 crc kubenswrapper[4743]: E1011 00:52:44.091698 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 11 00:52:44 crc kubenswrapper[4743]: I1011 00:52:44.148257 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:44 crc kubenswrapper[4743]: I1011 00:52:44.148348 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:44 crc kubenswrapper[4743]: I1011 00:52:44.148368 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:44 crc kubenswrapper[4743]: I1011 00:52:44.148394 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:44 crc kubenswrapper[4743]: I1011 00:52:44.148441 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:44Z","lastTransitionTime":"2025-10-11T00:52:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:45 crc kubenswrapper[4743]: I1011 00:52:45.491066 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:45 crc kubenswrapper[4743]: I1011 00:52:45.491162 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:45 crc kubenswrapper[4743]: I1011 00:52:45.491186 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:45 crc kubenswrapper[4743]: I1011 00:52:45.491217 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:45 crc kubenswrapper[4743]: I1011 00:52:45.491235 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:45Z","lastTransitionTime":"2025-10-11T00:52:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:45 crc kubenswrapper[4743]: I1011 00:52:45.594113 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:45 crc kubenswrapper[4743]: I1011 00:52:45.594197 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:45 crc kubenswrapper[4743]: I1011 00:52:45.594215 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:45 crc kubenswrapper[4743]: I1011 00:52:45.594244 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:45 crc kubenswrapper[4743]: I1011 00:52:45.594261 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:45Z","lastTransitionTime":"2025-10-11T00:52:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:45 crc kubenswrapper[4743]: I1011 00:52:45.697043 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:45 crc kubenswrapper[4743]: I1011 00:52:45.697124 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:45 crc kubenswrapper[4743]: I1011 00:52:45.697148 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:45 crc kubenswrapper[4743]: I1011 00:52:45.697174 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:45 crc kubenswrapper[4743]: I1011 00:52:45.697192 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:45Z","lastTransitionTime":"2025-10-11T00:52:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:45 crc kubenswrapper[4743]: I1011 00:52:45.799828 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:45 crc kubenswrapper[4743]: I1011 00:52:45.799943 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:45 crc kubenswrapper[4743]: I1011 00:52:45.799968 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:45 crc kubenswrapper[4743]: I1011 00:52:45.800001 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:45 crc kubenswrapper[4743]: I1011 00:52:45.800026 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:45Z","lastTransitionTime":"2025-10-11T00:52:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:45 crc kubenswrapper[4743]: I1011 00:52:45.902929 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:45 crc kubenswrapper[4743]: I1011 00:52:45.902989 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:45 crc kubenswrapper[4743]: I1011 00:52:45.903012 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:45 crc kubenswrapper[4743]: I1011 00:52:45.903044 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:45 crc kubenswrapper[4743]: I1011 00:52:45.903066 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:45Z","lastTransitionTime":"2025-10-11T00:52:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:46 crc kubenswrapper[4743]: I1011 00:52:46.006131 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:46 crc kubenswrapper[4743]: I1011 00:52:46.006199 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:46 crc kubenswrapper[4743]: I1011 00:52:46.006225 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:46 crc kubenswrapper[4743]: I1011 00:52:46.006261 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:46 crc kubenswrapper[4743]: I1011 00:52:46.006284 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:46Z","lastTransitionTime":"2025-10-11T00:52:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:46 crc kubenswrapper[4743]: I1011 00:52:46.090677 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 00:52:46 crc kubenswrapper[4743]: E1011 00:52:46.090922 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 11 00:52:46 crc kubenswrapper[4743]: I1011 00:52:46.091023 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cb5z5" Oct 11 00:52:46 crc kubenswrapper[4743]: I1011 00:52:46.091657 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 00:52:46 crc kubenswrapper[4743]: I1011 00:52:46.092076 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 00:52:46 crc kubenswrapper[4743]: E1011 00:52:46.093173 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cb5z5" podUID="b02b8636-a5c4-447d-b1cf-401b3dcfa02b" Oct 11 00:52:46 crc kubenswrapper[4743]: E1011 00:52:46.094311 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 11 00:52:46 crc kubenswrapper[4743]: E1011 00:52:46.094504 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 11 00:52:46 crc kubenswrapper[4743]: I1011 00:52:46.096424 4743 scope.go:117] "RemoveContainer" containerID="f8d34980f388c932c37e8d5a9122045fe57b4dc1cba358666d3558519a785497" Oct 11 00:52:46 crc kubenswrapper[4743]: E1011 00:52:46.097566 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-48ljj_openshift-ovn-kubernetes(9ed16b35-862f-47f2-9e32-63c98f868fb8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" podUID="9ed16b35-862f-47f2-9e32-63c98f868fb8" Oct 11 00:52:46 crc kubenswrapper[4743]: I1011 00:52:46.109617 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:46 crc kubenswrapper[4743]: I1011 00:52:46.109675 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:46 crc kubenswrapper[4743]: I1011 00:52:46.109693 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:46 crc kubenswrapper[4743]: I1011 00:52:46.109718 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:46 crc kubenswrapper[4743]: I1011 00:52:46.109735 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:46Z","lastTransitionTime":"2025-10-11T00:52:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:46 crc kubenswrapper[4743]: I1011 00:52:46.118076 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bca7230a67c579c0fc54921d292f52e18ce2c25f79cd4a351a9f6b576df3553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab9cd20bc5b28791e21984a0edf4fbf50acc05d6160780d0696698715242b5d0\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:46Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:46 crc kubenswrapper[4743]: I1011 00:52:46.138451 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ptjnt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cb3d634-b381-4ee0-a819-ce6f87fa8afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c47ab12f326ce6568bfd06d8beebdeffc07d464b747a288a14793db096ccc6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4dnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b62556627f75a8ed41c68a7fb1982461fd9fb
35965014b72ec15492e763e42b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4dnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ptjnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:46Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:46 crc kubenswrapper[4743]: I1011 00:52:46.155915 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cb5z5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b02b8636-a5c4-447d-b1cf-401b3dcfa02b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2zdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2zdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:22Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cb5z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:46Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:46 crc 
kubenswrapper[4743]: I1011 00:52:46.176505 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cd23d7ad070c4d2f029567ac2186d797b082b8da8ff2c9811982a1c5c7b0ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:46Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:46 crc kubenswrapper[4743]: I1011 00:52:46.195788 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:46Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:46 crc kubenswrapper[4743]: I1011 00:52:46.212244 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:46 crc kubenswrapper[4743]: I1011 00:52:46.212307 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:46 crc kubenswrapper[4743]: I1011 00:52:46.212328 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:46 crc kubenswrapper[4743]: I1011 00:52:46.212353 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:46 crc kubenswrapper[4743]: I1011 00:52:46.212370 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:46Z","lastTransitionTime":"2025-10-11T00:52:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:46 crc kubenswrapper[4743]: I1011 00:52:46.218910 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9jfxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8c603f4-717c-4554-992a-8338b3bef24d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853ca9eee244aad8c7006298d6e541ea07022917bee14083b2ded233009257ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5b2pr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9jfxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:46Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:46 crc kubenswrapper[4743]: I1011 00:52:46.241789 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc432b8f-1fc8-4b7d-a202-bf1452921fc3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ed857549a3f84b624b8bf411ac4359bad74bbf8eea28bda8ebeda3b12dd34e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static
-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://748faceccab50f961795c16fb7c31665af75d570094d1dd2c765bb2215b65c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc8d989925344208a60f51034c7c486de0165d5c5a9a1f966df45ac5685c89f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34de48136b54818c60eb929d4b5374689378190620a6502448759f172856d6af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imag
eID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:46Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:46 crc kubenswrapper[4743]: I1011 00:52:46.261098 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd049f7c9c78e28d7cfdeb7087cf419e5db915fecd46a83ae0bdc8317fff9728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-11T00:52:46Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:46 crc kubenswrapper[4743]: I1011 00:52:46.280356 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:46Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:46 crc kubenswrapper[4743]: I1011 00:52:46.298004 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"add92263-e252-446b-95de-092585b4357f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be9668b466043d0e86521a2ea925416a4987cb051ca0ab8764fd0fd3095991f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d42sg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16b18bcf80747537e2a7a31adc90eec7fdba85d8
166f8cfc53709a1dba33de8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d42sg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cvm72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:46Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:46 crc kubenswrapper[4743]: I1011 00:52:46.316121 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:46 crc kubenswrapper[4743]: I1011 00:52:46.316209 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:46 crc kubenswrapper[4743]: I1011 00:52:46.316228 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:46 crc 
kubenswrapper[4743]: I1011 00:52:46.316287 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:46 crc kubenswrapper[4743]: I1011 00:52:46.316307 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:46Z","lastTransitionTime":"2025-10-11T00:52:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:46 crc kubenswrapper[4743]: I1011 00:52:46.321817 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6wcnk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06a7b971-8779-491c-8d3f-e7d5b4d60968\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38b5a40ddd5390cc392fcf2240304a72163dd797511640ada5ad39a59c0f4283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
eaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8adfcc8627f1a6ce588d50eabcc56c6814c3cff75d50144def98466d2dabb69a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8adfcc8627f1a6ce588d50eabcc56c6814c3cff75d50144def98466d2dabb69a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://800cdd03b73991fe9c1cfbe9a392b5f08854ca2f5f9e5106022bb67307578ad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://800cdd03b73991fe9c1cfbe9a392b5f08854ca2f5f9e5106022bb67307578ad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49661bf59f8e1d64e5980f0ed00dcc50c4c2ab52cc2232ade890883f0d0158d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49661bf59f8e1d64e5980f0ed00dcc50c4c2ab52cc2232ade890883f0d0158d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27d3d6d9f65eb7b710148bc4e077c74a15207b637727163afe631acfe3ec90f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27d3d6d9f65eb7b710148bc4e077c74a15207b637727163afe631acfe3ec90f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00edafd52ba2a97f21af491b5233bc2ad945e6b77106eb94a947b777a0129cfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00edafd52ba2a97f21af491b5233bc2ad945e6b77106eb94a947b777a0129cfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://276d1e264f541d55720a57945800ec969959c0828071dcda4fb1c0f13b17118b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27
6d1e264f541d55720a57945800ec969959c0828071dcda4fb1c0f13b17118b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6wcnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:46Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:46 crc kubenswrapper[4743]: I1011 00:52:46.353328 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ed16b35-862f-47f2-9e32-63c98f868fb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d54b05ec17db8c8c6c3ff8e4d703de8f1f1f0d7e9b5b0133afdf30e3c2b8acee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4127195cac8b12296c2aaefc500beb5f746bee518f3885c094e7bca83ade5e6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://487448bc7b1826df7dbc3aeac42ad10d6aed3390c27c95dfda1b81fe09e27f35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acfe8e798b39769fe2643c6f761b388dc220ccfc707f0023dd680196514b8885\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17db7cfee85ea1051eba348aa0ec7bc8e1354c49e3e4740745a51d3db0ffcf63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d385e354cb831573c89886ade9127287f0974b7a7c244ead9e5547e0211fd9fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d34980f388c932c37e8d5a9122045fe57b4dc1cba358666d3558519a785497\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8d34980f388c932c37e8d5a9122045fe57b4dc1cba358666d3558519a785497\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-11T00:52:33Z\\\",\\\"message\\\":\\\"d syncing service metrics on namespace openshift-kube-apiserver-operator for network=default : 3.881381ms\\\\nI1011 00:52:33.160604 6362 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-scheduler/scheduler\\\\\\\"}\\\\nI1011 00:52:33.160615 6362 services_controller.go:360] Finished syncing service scheduler on namespace openshift-kube-scheduler for network=default : 2.970317ms\\\\nI1011 00:52:33.165904 6362 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1011 00:52:33.165984 6362 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1011 00:52:33.166020 6362 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1011 00:52:33.166766 6362 handler.go:208] Removed *v1.Node event handler 2\\\\nI1011 00:52:33.166850 6362 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1011 00:52:33.166973 6362 factory.go:656] Stopping watch factory\\\\nI1011 00:52:33.167000 6362 ovnkube.go:599] Stopped ovnkube\\\\nI1011 00:52:33.167036 6362 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1011 00:52:33.167068 6362 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1011 00:52:33.167200 6362 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-48ljj_openshift-ovn-kubernetes(9ed16b35-862f-47f2-9e32-63c98f868fb8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecec5075382354c47a6ec8f53263fe6c244c7aadfeab1dda60dd2e88cc8724b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f300eaf5d1c09f3637
f07c6d71c96b49e955d8319841834440872f13c061e736\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-48ljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:46Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:46 crc kubenswrapper[4743]: I1011 00:52:46.373676 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:46Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:46 crc kubenswrapper[4743]: I1011 00:52:46.407669 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"83857db2-dda0-4b62-9336-3393a2c23f3c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a46e8adc9aa9255991b607e98eaf6fc411aeb2a6b0edf45e3e9a7935a2eae5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97b59a1efc62279df4a9766fae0e700a435812160cf3a40a915c967f7a99c9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe9d2bc75aa162a0cab825a1f13cf35f20491a0317000d5a9775977fb7f6b556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf3820ae4ab59d420d2e7f001b2953f751c6dd75f90b5fcc76fdaee8c8df722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f14caad841305b82f04cf14411e2308edbcdd8ea2261ff8401edc95900855ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://818ca392b918c66a8ca013f3ec5b938595fc78b726af7ae31524c09dbe9302f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://818ca392b918c66a8ca013f3ec5b938595fc78b726af7ae31524c09dbe9302f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44506333e8bd3a20777cf0c382b7652d6d3fa1cdc13f056412cf4899e25115e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44506333e8bd3a20777cf0c382b7652d6d3fa1cdc13f056412cf4899e25115e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e870a1289513fe9e864779c13e63c0991fdaded247b35ac75b7810ab27683ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e870a1289513fe9e864779c13e63c0991fdaded247b35ac75b7810ab27683ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:46Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:46 crc kubenswrapper[4743]: I1011 00:52:46.419330 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:46 crc kubenswrapper[4743]: I1011 00:52:46.419410 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:46 crc kubenswrapper[4743]: I1011 00:52:46.419434 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:46 crc kubenswrapper[4743]: I1011 00:52:46.419464 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:46 crc kubenswrapper[4743]: I1011 00:52:46.419487 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:46Z","lastTransitionTime":"2025-10-11T00:52:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:46 crc kubenswrapper[4743]: I1011 00:52:46.431418 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a657286e-3a6d-4265-ae33-a7d2ce8b64ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1558bde7a3b8b01c9ce7f6907ecc43abc6e27587236be7b41487968effa33bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2616059cd6d57ff41cdaa2e1b35e0e8b474398c434022ba6b9e926e19e64c0f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0392332521983f1ea8939df48e05618035ca97895122067daef61ee1fdb7af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b297361a1a2f13aa53d7a5a58948dbe86e8e00f888e6a69cc884154828c29a41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://530c39a66f06ca33cba6a3ab75cb68148425f5a732b503d054b6c74a9d48fabf\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-11T00:52:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1011 00:51:59.674497 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1011 00:51:59.678201 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3177302691/tls.crt::/tmp/serving-cert-3177302691/tls.key\\\\\\\"\\\\nI1011 00:52:05.851548 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1011 00:52:05.855373 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1011 00:52:05.855414 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1011 00:52:05.855455 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1011 00:52:05.855472 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1011 00:52:05.866907 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1011 00:52:05.866935 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 00:52:05.866939 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 00:52:05.866944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1011 00:52:05.866947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1011 00:52:05.866951 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1011 00:52:05.866954 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1011 00:52:05.866970 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1011 00:52:05.870518 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1c0227825fa743bb310a1dfb08333cea5ee16a2a2f30b77c97e2f28a2319542\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d411bef56dd0ccdad159fc018c15ad4f0df6adee56150b1143e2dcf442a0fe18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d411bef56dd0ccdad159fc018c15ad4f0df6adee56150b1143e2dcf442a0fe18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:46Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:46 crc kubenswrapper[4743]: I1011 00:52:46.453761 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vlxgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d677d122-c8be-4938-8d2c-bde4a088a63a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dbaee9cca1154a094e092eaaf1dea2aeb6a61fbf9323611275485f75ea911e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95k6c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vlxgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:46Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:46 crc kubenswrapper[4743]: I1011 00:52:46.470891 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-t9nsf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f07b1e57-3c09-4e75-866d-a4292db4e151\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc297cc894f36c7afa9ce15a8e9c71b2d7bf66042c55d0c7686f20b6a3ece7be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c97lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:14Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-t9nsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:46Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:46 crc kubenswrapper[4743]: I1011 00:52:46.491648 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab5ac12d-37cf-48f3-bb33-b93b4096c706\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4335571969a3cb66f2fefdad63b77b4a31fc2631aba5ba427b2af4db8b6c6f0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://895152fd89f4c35c7d379d3a93c1b5f185275cd151d27b442b8f06280f3f74a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2bc6f9e58ae313b026b5597d2569b343931bc066ac8b3751c62f39c8d849bf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3532ffcf956621ad56477cbb9ff70c2855091a6ad996d3a15fa3c9a28943cea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://c3532ffcf956621ad56477cbb9ff70c2855091a6ad996d3a15fa3c9a28943cea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:46Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:46 crc kubenswrapper[4743]: I1011 00:52:46.522353 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:46 crc kubenswrapper[4743]: I1011 00:52:46.522395 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:46 crc kubenswrapper[4743]: I1011 00:52:46.522408 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:46 crc kubenswrapper[4743]: I1011 00:52:46.522426 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:46 crc kubenswrapper[4743]: I1011 00:52:46.522438 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:46Z","lastTransitionTime":"2025-10-11T00:52:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:46 crc kubenswrapper[4743]: I1011 00:52:46.624488 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:46 crc kubenswrapper[4743]: I1011 00:52:46.624561 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:46 crc kubenswrapper[4743]: I1011 00:52:46.624585 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:46 crc kubenswrapper[4743]: I1011 00:52:46.624616 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:46 crc kubenswrapper[4743]: I1011 00:52:46.624642 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:46Z","lastTransitionTime":"2025-10-11T00:52:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:46 crc kubenswrapper[4743]: I1011 00:52:46.727564 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:46 crc kubenswrapper[4743]: I1011 00:52:46.727615 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:46 crc kubenswrapper[4743]: I1011 00:52:46.727629 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:46 crc kubenswrapper[4743]: I1011 00:52:46.727651 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:46 crc kubenswrapper[4743]: I1011 00:52:46.727666 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:46Z","lastTransitionTime":"2025-10-11T00:52:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:46 crc kubenswrapper[4743]: I1011 00:52:46.829917 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:46 crc kubenswrapper[4743]: I1011 00:52:46.829995 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:46 crc kubenswrapper[4743]: I1011 00:52:46.830012 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:46 crc kubenswrapper[4743]: I1011 00:52:46.830033 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:46 crc kubenswrapper[4743]: I1011 00:52:46.830048 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:46Z","lastTransitionTime":"2025-10-11T00:52:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:46 crc kubenswrapper[4743]: I1011 00:52:46.932745 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:46 crc kubenswrapper[4743]: I1011 00:52:46.932826 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:46 crc kubenswrapper[4743]: I1011 00:52:46.932840 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:46 crc kubenswrapper[4743]: I1011 00:52:46.932875 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:46 crc kubenswrapper[4743]: I1011 00:52:46.932888 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:46Z","lastTransitionTime":"2025-10-11T00:52:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:47 crc kubenswrapper[4743]: I1011 00:52:47.035047 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:47 crc kubenswrapper[4743]: I1011 00:52:47.035118 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:47 crc kubenswrapper[4743]: I1011 00:52:47.035142 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:47 crc kubenswrapper[4743]: I1011 00:52:47.035171 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:47 crc kubenswrapper[4743]: I1011 00:52:47.035195 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:47Z","lastTransitionTime":"2025-10-11T00:52:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:47 crc kubenswrapper[4743]: I1011 00:52:47.138238 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:47 crc kubenswrapper[4743]: I1011 00:52:47.138321 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:47 crc kubenswrapper[4743]: I1011 00:52:47.138347 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:47 crc kubenswrapper[4743]: I1011 00:52:47.138373 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:47 crc kubenswrapper[4743]: I1011 00:52:47.138390 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:47Z","lastTransitionTime":"2025-10-11T00:52:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:47 crc kubenswrapper[4743]: I1011 00:52:47.241513 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:47 crc kubenswrapper[4743]: I1011 00:52:47.241616 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:47 crc kubenswrapper[4743]: I1011 00:52:47.241684 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:47 crc kubenswrapper[4743]: I1011 00:52:47.241719 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:47 crc kubenswrapper[4743]: I1011 00:52:47.241742 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:47Z","lastTransitionTime":"2025-10-11T00:52:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:47 crc kubenswrapper[4743]: I1011 00:52:47.281083 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:47 crc kubenswrapper[4743]: I1011 00:52:47.281157 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:47 crc kubenswrapper[4743]: I1011 00:52:47.281193 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:47 crc kubenswrapper[4743]: I1011 00:52:47.281226 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:47 crc kubenswrapper[4743]: I1011 00:52:47.281248 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:47Z","lastTransitionTime":"2025-10-11T00:52:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:47 crc kubenswrapper[4743]: E1011 00:52:47.302218 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4a117022-09d2-46e0-826b-22308ec25890\\\",\\\"systemUUID\\\":\\\"407eb137-47d1-41e8-9c72-65f09e76d21a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:47Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:47 crc kubenswrapper[4743]: I1011 00:52:47.307388 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:47 crc kubenswrapper[4743]: I1011 00:52:47.307483 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:47 crc kubenswrapper[4743]: I1011 00:52:47.307506 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:47 crc kubenswrapper[4743]: I1011 00:52:47.307539 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:47 crc kubenswrapper[4743]: I1011 00:52:47.307561 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:47Z","lastTransitionTime":"2025-10-11T00:52:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:47 crc kubenswrapper[4743]: E1011 00:52:47.328192 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4a117022-09d2-46e0-826b-22308ec25890\\\",\\\"systemUUID\\\":\\\"407eb137-47d1-41e8-9c72-65f09e76d21a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:47Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:47 crc kubenswrapper[4743]: I1011 00:52:47.333419 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:47 crc kubenswrapper[4743]: I1011 00:52:47.333481 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:47 crc kubenswrapper[4743]: I1011 00:52:47.333498 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:47 crc kubenswrapper[4743]: I1011 00:52:47.333524 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:47 crc kubenswrapper[4743]: I1011 00:52:47.333542 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:47Z","lastTransitionTime":"2025-10-11T00:52:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:47 crc kubenswrapper[4743]: E1011 00:52:47.354883 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4a117022-09d2-46e0-826b-22308ec25890\\\",\\\"systemUUID\\\":\\\"407eb137-47d1-41e8-9c72-65f09e76d21a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:47Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:47 crc kubenswrapper[4743]: I1011 00:52:47.360407 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:47 crc kubenswrapper[4743]: I1011 00:52:47.360457 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:47 crc kubenswrapper[4743]: I1011 00:52:47.360480 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:47 crc kubenswrapper[4743]: I1011 00:52:47.360514 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:47 crc kubenswrapper[4743]: I1011 00:52:47.360537 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:47Z","lastTransitionTime":"2025-10-11T00:52:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:47 crc kubenswrapper[4743]: E1011 00:52:47.381234 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4a117022-09d2-46e0-826b-22308ec25890\\\",\\\"systemUUID\\\":\\\"407eb137-47d1-41e8-9c72-65f09e76d21a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:47Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:47 crc kubenswrapper[4743]: I1011 00:52:47.386559 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:47 crc kubenswrapper[4743]: I1011 00:52:47.386626 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:47 crc kubenswrapper[4743]: I1011 00:52:47.386684 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:47 crc kubenswrapper[4743]: I1011 00:52:47.386718 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:47 crc kubenswrapper[4743]: I1011 00:52:47.386742 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:47Z","lastTransitionTime":"2025-10-11T00:52:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:47 crc kubenswrapper[4743]: E1011 00:52:47.413850 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4a117022-09d2-46e0-826b-22308ec25890\\\",\\\"systemUUID\\\":\\\"407eb137-47d1-41e8-9c72-65f09e76d21a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:47Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:47 crc kubenswrapper[4743]: E1011 00:52:47.414139 4743 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 11 00:52:47 crc kubenswrapper[4743]: I1011 00:52:47.416454 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:47 crc kubenswrapper[4743]: I1011 00:52:47.416526 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:47 crc kubenswrapper[4743]: I1011 00:52:47.416544 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:47 crc kubenswrapper[4743]: I1011 00:52:47.416570 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:47 crc kubenswrapper[4743]: I1011 00:52:47.416587 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:47Z","lastTransitionTime":"2025-10-11T00:52:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:47 crc kubenswrapper[4743]: I1011 00:52:47.519405 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:47 crc kubenswrapper[4743]: I1011 00:52:47.519459 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:47 crc kubenswrapper[4743]: I1011 00:52:47.519476 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:47 crc kubenswrapper[4743]: I1011 00:52:47.519500 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:47 crc kubenswrapper[4743]: I1011 00:52:47.519518 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:47Z","lastTransitionTime":"2025-10-11T00:52:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:47 crc kubenswrapper[4743]: I1011 00:52:47.622208 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:47 crc kubenswrapper[4743]: I1011 00:52:47.622311 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:47 crc kubenswrapper[4743]: I1011 00:52:47.622338 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:47 crc kubenswrapper[4743]: I1011 00:52:47.622369 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:47 crc kubenswrapper[4743]: I1011 00:52:47.622390 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:47Z","lastTransitionTime":"2025-10-11T00:52:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:47 crc kubenswrapper[4743]: I1011 00:52:47.725621 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:47 crc kubenswrapper[4743]: I1011 00:52:47.725691 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:47 crc kubenswrapper[4743]: I1011 00:52:47.725713 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:47 crc kubenswrapper[4743]: I1011 00:52:47.725743 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:47 crc kubenswrapper[4743]: I1011 00:52:47.725764 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:47Z","lastTransitionTime":"2025-10-11T00:52:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:47 crc kubenswrapper[4743]: I1011 00:52:47.828683 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:47 crc kubenswrapper[4743]: I1011 00:52:47.828750 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:47 crc kubenswrapper[4743]: I1011 00:52:47.828768 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:47 crc kubenswrapper[4743]: I1011 00:52:47.828794 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:47 crc kubenswrapper[4743]: I1011 00:52:47.828815 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:47Z","lastTransitionTime":"2025-10-11T00:52:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:47 crc kubenswrapper[4743]: I1011 00:52:47.932365 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:47 crc kubenswrapper[4743]: I1011 00:52:47.932438 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:47 crc kubenswrapper[4743]: I1011 00:52:47.932462 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:47 crc kubenswrapper[4743]: I1011 00:52:47.932490 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:47 crc kubenswrapper[4743]: I1011 00:52:47.932512 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:47Z","lastTransitionTime":"2025-10-11T00:52:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:48 crc kubenswrapper[4743]: I1011 00:52:48.035742 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:48 crc kubenswrapper[4743]: I1011 00:52:48.035817 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:48 crc kubenswrapper[4743]: I1011 00:52:48.035840 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:48 crc kubenswrapper[4743]: I1011 00:52:48.035916 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:48 crc kubenswrapper[4743]: I1011 00:52:48.035970 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:48Z","lastTransitionTime":"2025-10-11T00:52:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:48 crc kubenswrapper[4743]: I1011 00:52:48.093501 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 00:52:48 crc kubenswrapper[4743]: I1011 00:52:48.093579 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cb5z5" Oct 11 00:52:48 crc kubenswrapper[4743]: I1011 00:52:48.093501 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 00:52:48 crc kubenswrapper[4743]: E1011 00:52:48.093703 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 11 00:52:48 crc kubenswrapper[4743]: I1011 00:52:48.093649 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 00:52:48 crc kubenswrapper[4743]: E1011 00:52:48.093824 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cb5z5" podUID="b02b8636-a5c4-447d-b1cf-401b3dcfa02b" Oct 11 00:52:48 crc kubenswrapper[4743]: E1011 00:52:48.093996 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 11 00:52:48 crc kubenswrapper[4743]: E1011 00:52:48.094266 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 11 00:52:48 crc kubenswrapper[4743]: I1011 00:52:48.139295 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:48 crc kubenswrapper[4743]: I1011 00:52:48.139653 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:48 crc kubenswrapper[4743]: I1011 00:52:48.139829 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:48 crc kubenswrapper[4743]: I1011 00:52:48.140071 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:48 crc kubenswrapper[4743]: I1011 00:52:48.140255 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:48Z","lastTransitionTime":"2025-10-11T00:52:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:48 crc kubenswrapper[4743]: I1011 00:52:48.243714 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:48 crc kubenswrapper[4743]: I1011 00:52:48.243772 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:48 crc kubenswrapper[4743]: I1011 00:52:48.243792 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:48 crc kubenswrapper[4743]: I1011 00:52:48.243817 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:48 crc kubenswrapper[4743]: I1011 00:52:48.243834 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:48Z","lastTransitionTime":"2025-10-11T00:52:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:48 crc kubenswrapper[4743]: I1011 00:52:48.346515 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:48 crc kubenswrapper[4743]: I1011 00:52:48.346567 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:48 crc kubenswrapper[4743]: I1011 00:52:48.346578 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:48 crc kubenswrapper[4743]: I1011 00:52:48.346593 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:48 crc kubenswrapper[4743]: I1011 00:52:48.346602 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:48Z","lastTransitionTime":"2025-10-11T00:52:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:48 crc kubenswrapper[4743]: I1011 00:52:48.449396 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:48 crc kubenswrapper[4743]: I1011 00:52:48.449462 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:48 crc kubenswrapper[4743]: I1011 00:52:48.449472 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:48 crc kubenswrapper[4743]: I1011 00:52:48.449488 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:48 crc kubenswrapper[4743]: I1011 00:52:48.449498 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:48Z","lastTransitionTime":"2025-10-11T00:52:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:48 crc kubenswrapper[4743]: I1011 00:52:48.552376 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:48 crc kubenswrapper[4743]: I1011 00:52:48.552432 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:48 crc kubenswrapper[4743]: I1011 00:52:48.552451 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:48 crc kubenswrapper[4743]: I1011 00:52:48.552482 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:48 crc kubenswrapper[4743]: I1011 00:52:48.552506 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:48Z","lastTransitionTime":"2025-10-11T00:52:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:48 crc kubenswrapper[4743]: I1011 00:52:48.655611 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:48 crc kubenswrapper[4743]: I1011 00:52:48.655687 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:48 crc kubenswrapper[4743]: I1011 00:52:48.655708 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:48 crc kubenswrapper[4743]: I1011 00:52:48.655739 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:48 crc kubenswrapper[4743]: I1011 00:52:48.655761 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:48Z","lastTransitionTime":"2025-10-11T00:52:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:48 crc kubenswrapper[4743]: I1011 00:52:48.759472 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:48 crc kubenswrapper[4743]: I1011 00:52:48.759547 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:48 crc kubenswrapper[4743]: I1011 00:52:48.759569 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:48 crc kubenswrapper[4743]: I1011 00:52:48.759600 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:48 crc kubenswrapper[4743]: I1011 00:52:48.759620 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:48Z","lastTransitionTime":"2025-10-11T00:52:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:48 crc kubenswrapper[4743]: I1011 00:52:48.862950 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:48 crc kubenswrapper[4743]: I1011 00:52:48.863038 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:48 crc kubenswrapper[4743]: I1011 00:52:48.863056 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:48 crc kubenswrapper[4743]: I1011 00:52:48.863078 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:48 crc kubenswrapper[4743]: I1011 00:52:48.863094 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:48Z","lastTransitionTime":"2025-10-11T00:52:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:48 crc kubenswrapper[4743]: I1011 00:52:48.966169 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:48 crc kubenswrapper[4743]: I1011 00:52:48.966232 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:48 crc kubenswrapper[4743]: I1011 00:52:48.966253 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:48 crc kubenswrapper[4743]: I1011 00:52:48.966278 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:48 crc kubenswrapper[4743]: I1011 00:52:48.966299 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:48Z","lastTransitionTime":"2025-10-11T00:52:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:49 crc kubenswrapper[4743]: I1011 00:52:49.068897 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:49 crc kubenswrapper[4743]: I1011 00:52:49.068961 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:49 crc kubenswrapper[4743]: I1011 00:52:49.068977 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:49 crc kubenswrapper[4743]: I1011 00:52:49.069003 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:49 crc kubenswrapper[4743]: I1011 00:52:49.069020 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:49Z","lastTransitionTime":"2025-10-11T00:52:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:49 crc kubenswrapper[4743]: I1011 00:52:49.171659 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:49 crc kubenswrapper[4743]: I1011 00:52:49.172040 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:49 crc kubenswrapper[4743]: I1011 00:52:49.172200 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:49 crc kubenswrapper[4743]: I1011 00:52:49.172338 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:49 crc kubenswrapper[4743]: I1011 00:52:49.172463 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:49Z","lastTransitionTime":"2025-10-11T00:52:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:49 crc kubenswrapper[4743]: I1011 00:52:49.276208 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:49 crc kubenswrapper[4743]: I1011 00:52:49.276267 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:49 crc kubenswrapper[4743]: I1011 00:52:49.276294 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:49 crc kubenswrapper[4743]: I1011 00:52:49.276316 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:49 crc kubenswrapper[4743]: I1011 00:52:49.276328 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:49Z","lastTransitionTime":"2025-10-11T00:52:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:49 crc kubenswrapper[4743]: I1011 00:52:49.379635 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:49 crc kubenswrapper[4743]: I1011 00:52:49.379673 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:49 crc kubenswrapper[4743]: I1011 00:52:49.379703 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:49 crc kubenswrapper[4743]: I1011 00:52:49.379813 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:49 crc kubenswrapper[4743]: I1011 00:52:49.379833 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:49Z","lastTransitionTime":"2025-10-11T00:52:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:49 crc kubenswrapper[4743]: I1011 00:52:49.482212 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:49 crc kubenswrapper[4743]: I1011 00:52:49.482262 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:49 crc kubenswrapper[4743]: I1011 00:52:49.482275 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:49 crc kubenswrapper[4743]: I1011 00:52:49.482293 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:49 crc kubenswrapper[4743]: I1011 00:52:49.482305 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:49Z","lastTransitionTime":"2025-10-11T00:52:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:49 crc kubenswrapper[4743]: I1011 00:52:49.585451 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:49 crc kubenswrapper[4743]: I1011 00:52:49.585502 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:49 crc kubenswrapper[4743]: I1011 00:52:49.585515 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:49 crc kubenswrapper[4743]: I1011 00:52:49.585538 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:49 crc kubenswrapper[4743]: I1011 00:52:49.585551 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:49Z","lastTransitionTime":"2025-10-11T00:52:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:49 crc kubenswrapper[4743]: I1011 00:52:49.688965 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:49 crc kubenswrapper[4743]: I1011 00:52:49.689033 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:49 crc kubenswrapper[4743]: I1011 00:52:49.689117 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:49 crc kubenswrapper[4743]: I1011 00:52:49.689149 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:49 crc kubenswrapper[4743]: I1011 00:52:49.689169 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:49Z","lastTransitionTime":"2025-10-11T00:52:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:49 crc kubenswrapper[4743]: I1011 00:52:49.792310 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:49 crc kubenswrapper[4743]: I1011 00:52:49.792394 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:49 crc kubenswrapper[4743]: I1011 00:52:49.792414 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:49 crc kubenswrapper[4743]: I1011 00:52:49.792445 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:49 crc kubenswrapper[4743]: I1011 00:52:49.792468 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:49Z","lastTransitionTime":"2025-10-11T00:52:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:49 crc kubenswrapper[4743]: I1011 00:52:49.897717 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:49 crc kubenswrapper[4743]: I1011 00:52:49.897762 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:49 crc kubenswrapper[4743]: I1011 00:52:49.897774 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:49 crc kubenswrapper[4743]: I1011 00:52:49.897793 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:49 crc kubenswrapper[4743]: I1011 00:52:49.897806 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:49Z","lastTransitionTime":"2025-10-11T00:52:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:50 crc kubenswrapper[4743]: I1011 00:52:49.999995 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:50 crc kubenswrapper[4743]: I1011 00:52:50.000052 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:50 crc kubenswrapper[4743]: I1011 00:52:50.000067 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:50 crc kubenswrapper[4743]: I1011 00:52:50.000092 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:50 crc kubenswrapper[4743]: I1011 00:52:50.000110 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:50Z","lastTransitionTime":"2025-10-11T00:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:50 crc kubenswrapper[4743]: I1011 00:52:50.091354 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cb5z5" Oct 11 00:52:50 crc kubenswrapper[4743]: I1011 00:52:50.091374 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 00:52:50 crc kubenswrapper[4743]: I1011 00:52:50.091420 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 00:52:50 crc kubenswrapper[4743]: E1011 00:52:50.091653 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cb5z5" podUID="b02b8636-a5c4-447d-b1cf-401b3dcfa02b" Oct 11 00:52:50 crc kubenswrapper[4743]: E1011 00:52:50.091885 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 11 00:52:50 crc kubenswrapper[4743]: E1011 00:52:50.091924 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 11 00:52:50 crc kubenswrapper[4743]: I1011 00:52:50.092113 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 00:52:50 crc kubenswrapper[4743]: E1011 00:52:50.092233 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 11 00:52:50 crc kubenswrapper[4743]: I1011 00:52:50.102940 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:50 crc kubenswrapper[4743]: I1011 00:52:50.102963 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:50 crc kubenswrapper[4743]: I1011 00:52:50.102972 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:50 crc kubenswrapper[4743]: I1011 00:52:50.102986 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:50 crc kubenswrapper[4743]: I1011 00:52:50.103008 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:50Z","lastTransitionTime":"2025-10-11T00:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:50 crc kubenswrapper[4743]: I1011 00:52:50.206184 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:50 crc kubenswrapper[4743]: I1011 00:52:50.206251 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:50 crc kubenswrapper[4743]: I1011 00:52:50.206269 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:50 crc kubenswrapper[4743]: I1011 00:52:50.206303 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:50 crc kubenswrapper[4743]: I1011 00:52:50.206323 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:50Z","lastTransitionTime":"2025-10-11T00:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:50 crc kubenswrapper[4743]: I1011 00:52:50.309425 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:50 crc kubenswrapper[4743]: I1011 00:52:50.309490 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:50 crc kubenswrapper[4743]: I1011 00:52:50.309507 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:50 crc kubenswrapper[4743]: I1011 00:52:50.309535 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:50 crc kubenswrapper[4743]: I1011 00:52:50.309552 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:50Z","lastTransitionTime":"2025-10-11T00:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:50 crc kubenswrapper[4743]: I1011 00:52:50.412576 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:50 crc kubenswrapper[4743]: I1011 00:52:50.412648 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:50 crc kubenswrapper[4743]: I1011 00:52:50.412737 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:50 crc kubenswrapper[4743]: I1011 00:52:50.412767 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:50 crc kubenswrapper[4743]: I1011 00:52:50.412787 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:50Z","lastTransitionTime":"2025-10-11T00:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:50 crc kubenswrapper[4743]: I1011 00:52:50.515941 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:50 crc kubenswrapper[4743]: I1011 00:52:50.515990 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:50 crc kubenswrapper[4743]: I1011 00:52:50.516002 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:50 crc kubenswrapper[4743]: I1011 00:52:50.516021 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:50 crc kubenswrapper[4743]: I1011 00:52:50.516034 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:50Z","lastTransitionTime":"2025-10-11T00:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:50 crc kubenswrapper[4743]: I1011 00:52:50.619507 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:50 crc kubenswrapper[4743]: I1011 00:52:50.619548 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:50 crc kubenswrapper[4743]: I1011 00:52:50.619558 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:50 crc kubenswrapper[4743]: I1011 00:52:50.619576 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:50 crc kubenswrapper[4743]: I1011 00:52:50.619586 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:50Z","lastTransitionTime":"2025-10-11T00:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:50 crc kubenswrapper[4743]: I1011 00:52:50.723168 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:50 crc kubenswrapper[4743]: I1011 00:52:50.723205 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:50 crc kubenswrapper[4743]: I1011 00:52:50.723220 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:50 crc kubenswrapper[4743]: I1011 00:52:50.723236 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:50 crc kubenswrapper[4743]: I1011 00:52:50.723247 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:50Z","lastTransitionTime":"2025-10-11T00:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:50 crc kubenswrapper[4743]: I1011 00:52:50.825750 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:50 crc kubenswrapper[4743]: I1011 00:52:50.825809 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:50 crc kubenswrapper[4743]: I1011 00:52:50.825826 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:50 crc kubenswrapper[4743]: I1011 00:52:50.825851 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:50 crc kubenswrapper[4743]: I1011 00:52:50.825903 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:50Z","lastTransitionTime":"2025-10-11T00:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:50 crc kubenswrapper[4743]: I1011 00:52:50.929506 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:50 crc kubenswrapper[4743]: I1011 00:52:50.929577 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:50 crc kubenswrapper[4743]: I1011 00:52:50.929601 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:50 crc kubenswrapper[4743]: I1011 00:52:50.929626 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:50 crc kubenswrapper[4743]: I1011 00:52:50.929685 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:50Z","lastTransitionTime":"2025-10-11T00:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:51 crc kubenswrapper[4743]: I1011 00:52:51.032810 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:51 crc kubenswrapper[4743]: I1011 00:52:51.032898 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:51 crc kubenswrapper[4743]: I1011 00:52:51.032924 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:51 crc kubenswrapper[4743]: I1011 00:52:51.032988 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:51 crc kubenswrapper[4743]: I1011 00:52:51.033012 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:51Z","lastTransitionTime":"2025-10-11T00:52:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:51 crc kubenswrapper[4743]: I1011 00:52:51.136390 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:51 crc kubenswrapper[4743]: I1011 00:52:51.136463 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:51 crc kubenswrapper[4743]: I1011 00:52:51.136488 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:51 crc kubenswrapper[4743]: I1011 00:52:51.136521 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:51 crc kubenswrapper[4743]: I1011 00:52:51.136545 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:51Z","lastTransitionTime":"2025-10-11T00:52:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:51 crc kubenswrapper[4743]: I1011 00:52:51.239485 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:51 crc kubenswrapper[4743]: I1011 00:52:51.239537 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:51 crc kubenswrapper[4743]: I1011 00:52:51.239554 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:51 crc kubenswrapper[4743]: I1011 00:52:51.239579 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:51 crc kubenswrapper[4743]: I1011 00:52:51.239596 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:51Z","lastTransitionTime":"2025-10-11T00:52:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:51 crc kubenswrapper[4743]: I1011 00:52:51.342770 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:51 crc kubenswrapper[4743]: I1011 00:52:51.342806 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:51 crc kubenswrapper[4743]: I1011 00:52:51.342816 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:51 crc kubenswrapper[4743]: I1011 00:52:51.342834 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:51 crc kubenswrapper[4743]: I1011 00:52:51.342845 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:51Z","lastTransitionTime":"2025-10-11T00:52:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:51 crc kubenswrapper[4743]: I1011 00:52:51.446623 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:51 crc kubenswrapper[4743]: I1011 00:52:51.446682 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:51 crc kubenswrapper[4743]: I1011 00:52:51.446699 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:51 crc kubenswrapper[4743]: I1011 00:52:51.446724 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:51 crc kubenswrapper[4743]: I1011 00:52:51.446742 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:51Z","lastTransitionTime":"2025-10-11T00:52:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:51 crc kubenswrapper[4743]: I1011 00:52:51.549707 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:51 crc kubenswrapper[4743]: I1011 00:52:51.549752 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:51 crc kubenswrapper[4743]: I1011 00:52:51.549762 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:51 crc kubenswrapper[4743]: I1011 00:52:51.549779 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:51 crc kubenswrapper[4743]: I1011 00:52:51.549790 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:51Z","lastTransitionTime":"2025-10-11T00:52:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:51 crc kubenswrapper[4743]: I1011 00:52:51.652970 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:51 crc kubenswrapper[4743]: I1011 00:52:51.653392 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:51 crc kubenswrapper[4743]: I1011 00:52:51.653534 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:51 crc kubenswrapper[4743]: I1011 00:52:51.653674 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:51 crc kubenswrapper[4743]: I1011 00:52:51.653795 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:51Z","lastTransitionTime":"2025-10-11T00:52:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:51 crc kubenswrapper[4743]: I1011 00:52:51.757064 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:51 crc kubenswrapper[4743]: I1011 00:52:51.757118 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:51 crc kubenswrapper[4743]: I1011 00:52:51.757160 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:51 crc kubenswrapper[4743]: I1011 00:52:51.757182 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:51 crc kubenswrapper[4743]: I1011 00:52:51.757198 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:51Z","lastTransitionTime":"2025-10-11T00:52:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:51 crc kubenswrapper[4743]: I1011 00:52:51.859981 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:51 crc kubenswrapper[4743]: I1011 00:52:51.860046 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:51 crc kubenswrapper[4743]: I1011 00:52:51.860060 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:51 crc kubenswrapper[4743]: I1011 00:52:51.860081 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:51 crc kubenswrapper[4743]: I1011 00:52:51.860093 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:51Z","lastTransitionTime":"2025-10-11T00:52:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:51 crc kubenswrapper[4743]: I1011 00:52:51.963029 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:51 crc kubenswrapper[4743]: I1011 00:52:51.963089 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:51 crc kubenswrapper[4743]: I1011 00:52:51.963112 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:51 crc kubenswrapper[4743]: I1011 00:52:51.963144 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:51 crc kubenswrapper[4743]: I1011 00:52:51.963164 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:51Z","lastTransitionTime":"2025-10-11T00:52:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:52 crc kubenswrapper[4743]: I1011 00:52:52.065633 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:52 crc kubenswrapper[4743]: I1011 00:52:52.065686 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:52 crc kubenswrapper[4743]: I1011 00:52:52.065711 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:52 crc kubenswrapper[4743]: I1011 00:52:52.065732 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:52 crc kubenswrapper[4743]: I1011 00:52:52.065741 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:52Z","lastTransitionTime":"2025-10-11T00:52:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:52 crc kubenswrapper[4743]: I1011 00:52:52.091028 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 00:52:52 crc kubenswrapper[4743]: I1011 00:52:52.091071 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cb5z5" Oct 11 00:52:52 crc kubenswrapper[4743]: I1011 00:52:52.091076 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 00:52:52 crc kubenswrapper[4743]: E1011 00:52:52.091186 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 11 00:52:52 crc kubenswrapper[4743]: I1011 00:52:52.091246 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 00:52:52 crc kubenswrapper[4743]: E1011 00:52:52.091330 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cb5z5" podUID="b02b8636-a5c4-447d-b1cf-401b3dcfa02b" Oct 11 00:52:52 crc kubenswrapper[4743]: E1011 00:52:52.091418 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 11 00:52:52 crc kubenswrapper[4743]: E1011 00:52:52.091555 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 11 00:52:52 crc kubenswrapper[4743]: I1011 00:52:52.167596 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:52 crc kubenswrapper[4743]: I1011 00:52:52.167640 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:52 crc kubenswrapper[4743]: I1011 00:52:52.167650 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:52 crc kubenswrapper[4743]: I1011 00:52:52.167666 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:52 crc kubenswrapper[4743]: I1011 00:52:52.167676 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:52Z","lastTransitionTime":"2025-10-11T00:52:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:52 crc kubenswrapper[4743]: I1011 00:52:52.269475 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:52 crc kubenswrapper[4743]: I1011 00:52:52.269528 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:52 crc kubenswrapper[4743]: I1011 00:52:52.269549 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:52 crc kubenswrapper[4743]: I1011 00:52:52.269573 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:52 crc kubenswrapper[4743]: I1011 00:52:52.269591 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:52Z","lastTransitionTime":"2025-10-11T00:52:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:52 crc kubenswrapper[4743]: I1011 00:52:52.371647 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:52 crc kubenswrapper[4743]: I1011 00:52:52.371679 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:52 crc kubenswrapper[4743]: I1011 00:52:52.371686 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:52 crc kubenswrapper[4743]: I1011 00:52:52.371702 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:52 crc kubenswrapper[4743]: I1011 00:52:52.371712 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:52Z","lastTransitionTime":"2025-10-11T00:52:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:52 crc kubenswrapper[4743]: I1011 00:52:52.474688 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:52 crc kubenswrapper[4743]: I1011 00:52:52.474758 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:52 crc kubenswrapper[4743]: I1011 00:52:52.474777 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:52 crc kubenswrapper[4743]: I1011 00:52:52.474810 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:52 crc kubenswrapper[4743]: I1011 00:52:52.474828 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:52Z","lastTransitionTime":"2025-10-11T00:52:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:52 crc kubenswrapper[4743]: I1011 00:52:52.577636 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:52 crc kubenswrapper[4743]: I1011 00:52:52.577692 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:52 crc kubenswrapper[4743]: I1011 00:52:52.577710 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:52 crc kubenswrapper[4743]: I1011 00:52:52.577733 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:52 crc kubenswrapper[4743]: I1011 00:52:52.577749 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:52Z","lastTransitionTime":"2025-10-11T00:52:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:52 crc kubenswrapper[4743]: I1011 00:52:52.680385 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:52 crc kubenswrapper[4743]: I1011 00:52:52.680428 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:52 crc kubenswrapper[4743]: I1011 00:52:52.680439 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:52 crc kubenswrapper[4743]: I1011 00:52:52.680459 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:52 crc kubenswrapper[4743]: I1011 00:52:52.680471 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:52Z","lastTransitionTime":"2025-10-11T00:52:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:52 crc kubenswrapper[4743]: I1011 00:52:52.783195 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:52 crc kubenswrapper[4743]: I1011 00:52:52.783244 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:52 crc kubenswrapper[4743]: I1011 00:52:52.783256 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:52 crc kubenswrapper[4743]: I1011 00:52:52.783276 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:52 crc kubenswrapper[4743]: I1011 00:52:52.783288 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:52Z","lastTransitionTime":"2025-10-11T00:52:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:52 crc kubenswrapper[4743]: I1011 00:52:52.886183 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:52 crc kubenswrapper[4743]: I1011 00:52:52.886239 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:52 crc kubenswrapper[4743]: I1011 00:52:52.886255 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:52 crc kubenswrapper[4743]: I1011 00:52:52.886279 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:52 crc kubenswrapper[4743]: I1011 00:52:52.886296 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:52Z","lastTransitionTime":"2025-10-11T00:52:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:52 crc kubenswrapper[4743]: I1011 00:52:52.988362 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:52 crc kubenswrapper[4743]: I1011 00:52:52.988413 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:52 crc kubenswrapper[4743]: I1011 00:52:52.988429 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:52 crc kubenswrapper[4743]: I1011 00:52:52.988448 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:52 crc kubenswrapper[4743]: I1011 00:52:52.988460 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:52Z","lastTransitionTime":"2025-10-11T00:52:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:53 crc kubenswrapper[4743]: I1011 00:52:53.091114 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:53 crc kubenswrapper[4743]: I1011 00:52:53.091153 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:53 crc kubenswrapper[4743]: I1011 00:52:53.091165 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:53 crc kubenswrapper[4743]: I1011 00:52:53.091179 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:53 crc kubenswrapper[4743]: I1011 00:52:53.091189 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:53Z","lastTransitionTime":"2025-10-11T00:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:53 crc kubenswrapper[4743]: I1011 00:52:53.194377 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:53 crc kubenswrapper[4743]: I1011 00:52:53.194456 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:53 crc kubenswrapper[4743]: I1011 00:52:53.194478 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:53 crc kubenswrapper[4743]: I1011 00:52:53.194505 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:53 crc kubenswrapper[4743]: I1011 00:52:53.194523 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:53Z","lastTransitionTime":"2025-10-11T00:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:53 crc kubenswrapper[4743]: I1011 00:52:53.298837 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:53 crc kubenswrapper[4743]: I1011 00:52:53.298888 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:53 crc kubenswrapper[4743]: I1011 00:52:53.298898 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:53 crc kubenswrapper[4743]: I1011 00:52:53.298914 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:53 crc kubenswrapper[4743]: I1011 00:52:53.298926 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:53Z","lastTransitionTime":"2025-10-11T00:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:53 crc kubenswrapper[4743]: I1011 00:52:53.401756 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:53 crc kubenswrapper[4743]: I1011 00:52:53.401830 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:53 crc kubenswrapper[4743]: I1011 00:52:53.401907 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:53 crc kubenswrapper[4743]: I1011 00:52:53.401950 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:53 crc kubenswrapper[4743]: I1011 00:52:53.401973 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:53Z","lastTransitionTime":"2025-10-11T00:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:53 crc kubenswrapper[4743]: I1011 00:52:53.505153 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:53 crc kubenswrapper[4743]: I1011 00:52:53.505215 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:53 crc kubenswrapper[4743]: I1011 00:52:53.505232 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:53 crc kubenswrapper[4743]: I1011 00:52:53.505256 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:53 crc kubenswrapper[4743]: I1011 00:52:53.505275 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:53Z","lastTransitionTime":"2025-10-11T00:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:53 crc kubenswrapper[4743]: I1011 00:52:53.609421 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:53 crc kubenswrapper[4743]: I1011 00:52:53.609489 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:53 crc kubenswrapper[4743]: I1011 00:52:53.609516 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:53 crc kubenswrapper[4743]: I1011 00:52:53.609548 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:53 crc kubenswrapper[4743]: I1011 00:52:53.609566 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:53Z","lastTransitionTime":"2025-10-11T00:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:53 crc kubenswrapper[4743]: I1011 00:52:53.712987 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:53 crc kubenswrapper[4743]: I1011 00:52:53.713043 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:53 crc kubenswrapper[4743]: I1011 00:52:53.713056 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:53 crc kubenswrapper[4743]: I1011 00:52:53.713078 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:53 crc kubenswrapper[4743]: I1011 00:52:53.713091 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:53Z","lastTransitionTime":"2025-10-11T00:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:53 crc kubenswrapper[4743]: I1011 00:52:53.816218 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:53 crc kubenswrapper[4743]: I1011 00:52:53.816280 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:53 crc kubenswrapper[4743]: I1011 00:52:53.816302 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:53 crc kubenswrapper[4743]: I1011 00:52:53.816331 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:53 crc kubenswrapper[4743]: I1011 00:52:53.816370 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:53Z","lastTransitionTime":"2025-10-11T00:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:53 crc kubenswrapper[4743]: I1011 00:52:53.920675 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:53 crc kubenswrapper[4743]: I1011 00:52:53.920818 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:53 crc kubenswrapper[4743]: I1011 00:52:53.920931 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:53 crc kubenswrapper[4743]: I1011 00:52:53.920966 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:53 crc kubenswrapper[4743]: I1011 00:52:53.920990 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:53Z","lastTransitionTime":"2025-10-11T00:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:54 crc kubenswrapper[4743]: I1011 00:52:54.025132 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:54 crc kubenswrapper[4743]: I1011 00:52:54.025196 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:54 crc kubenswrapper[4743]: I1011 00:52:54.025207 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:54 crc kubenswrapper[4743]: I1011 00:52:54.025224 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:54 crc kubenswrapper[4743]: I1011 00:52:54.025234 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:54Z","lastTransitionTime":"2025-10-11T00:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:54 crc kubenswrapper[4743]: I1011 00:52:54.091645 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cb5z5" Oct 11 00:52:54 crc kubenswrapper[4743]: I1011 00:52:54.091739 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 00:52:54 crc kubenswrapper[4743]: I1011 00:52:54.091750 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 00:52:54 crc kubenswrapper[4743]: I1011 00:52:54.091802 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 00:52:54 crc kubenswrapper[4743]: E1011 00:52:54.091933 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cb5z5" podUID="b02b8636-a5c4-447d-b1cf-401b3dcfa02b" Oct 11 00:52:54 crc kubenswrapper[4743]: E1011 00:52:54.092065 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 11 00:52:54 crc kubenswrapper[4743]: E1011 00:52:54.092256 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 11 00:52:54 crc kubenswrapper[4743]: E1011 00:52:54.092590 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 11 00:52:54 crc kubenswrapper[4743]: I1011 00:52:54.127830 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:54 crc kubenswrapper[4743]: I1011 00:52:54.128289 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:54 crc kubenswrapper[4743]: I1011 00:52:54.128430 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:54 crc kubenswrapper[4743]: I1011 00:52:54.128567 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:54 crc kubenswrapper[4743]: I1011 00:52:54.128683 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:54Z","lastTransitionTime":"2025-10-11T00:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:54 crc kubenswrapper[4743]: I1011 00:52:54.231528 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:54 crc kubenswrapper[4743]: I1011 00:52:54.231575 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:54 crc kubenswrapper[4743]: I1011 00:52:54.231589 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:54 crc kubenswrapper[4743]: I1011 00:52:54.231610 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:54 crc kubenswrapper[4743]: I1011 00:52:54.231623 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:54Z","lastTransitionTime":"2025-10-11T00:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:54 crc kubenswrapper[4743]: I1011 00:52:54.245231 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b02b8636-a5c4-447d-b1cf-401b3dcfa02b-metrics-certs\") pod \"network-metrics-daemon-cb5z5\" (UID: \"b02b8636-a5c4-447d-b1cf-401b3dcfa02b\") " pod="openshift-multus/network-metrics-daemon-cb5z5" Oct 11 00:52:54 crc kubenswrapper[4743]: E1011 00:52:54.245440 4743 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 11 00:52:54 crc kubenswrapper[4743]: E1011 00:52:54.245539 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b02b8636-a5c4-447d-b1cf-401b3dcfa02b-metrics-certs podName:b02b8636-a5c4-447d-b1cf-401b3dcfa02b nodeName:}" failed. No retries permitted until 2025-10-11 00:53:26.24551351 +0000 UTC m=+100.898493917 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b02b8636-a5c4-447d-b1cf-401b3dcfa02b-metrics-certs") pod "network-metrics-daemon-cb5z5" (UID: "b02b8636-a5c4-447d-b1cf-401b3dcfa02b") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 11 00:52:54 crc kubenswrapper[4743]: I1011 00:52:54.334404 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:54 crc kubenswrapper[4743]: I1011 00:52:54.334447 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:54 crc kubenswrapper[4743]: I1011 00:52:54.334458 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:54 crc kubenswrapper[4743]: I1011 00:52:54.334475 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:54 crc kubenswrapper[4743]: I1011 00:52:54.334487 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:54Z","lastTransitionTime":"2025-10-11T00:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:54 crc kubenswrapper[4743]: I1011 00:52:54.437850 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:54 crc kubenswrapper[4743]: I1011 00:52:54.438801 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:54 crc kubenswrapper[4743]: I1011 00:52:54.439005 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:54 crc kubenswrapper[4743]: I1011 00:52:54.439147 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:54 crc kubenswrapper[4743]: I1011 00:52:54.439306 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:54Z","lastTransitionTime":"2025-10-11T00:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:54 crc kubenswrapper[4743]: I1011 00:52:54.542089 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:54 crc kubenswrapper[4743]: I1011 00:52:54.542156 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:54 crc kubenswrapper[4743]: I1011 00:52:54.542178 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:54 crc kubenswrapper[4743]: I1011 00:52:54.542203 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:54 crc kubenswrapper[4743]: I1011 00:52:54.542220 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:54Z","lastTransitionTime":"2025-10-11T00:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:54 crc kubenswrapper[4743]: I1011 00:52:54.645932 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:54 crc kubenswrapper[4743]: I1011 00:52:54.646011 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:54 crc kubenswrapper[4743]: I1011 00:52:54.646028 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:54 crc kubenswrapper[4743]: I1011 00:52:54.646048 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:54 crc kubenswrapper[4743]: I1011 00:52:54.646059 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:54Z","lastTransitionTime":"2025-10-11T00:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:54 crc kubenswrapper[4743]: I1011 00:52:54.749836 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:54 crc kubenswrapper[4743]: I1011 00:52:54.750196 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:54 crc kubenswrapper[4743]: I1011 00:52:54.750286 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:54 crc kubenswrapper[4743]: I1011 00:52:54.750362 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:54 crc kubenswrapper[4743]: I1011 00:52:54.750434 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:54Z","lastTransitionTime":"2025-10-11T00:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:54 crc kubenswrapper[4743]: I1011 00:52:54.852547 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:54 crc kubenswrapper[4743]: I1011 00:52:54.853122 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:54 crc kubenswrapper[4743]: I1011 00:52:54.853274 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:54 crc kubenswrapper[4743]: I1011 00:52:54.853449 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:54 crc kubenswrapper[4743]: I1011 00:52:54.853600 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:54Z","lastTransitionTime":"2025-10-11T00:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:54 crc kubenswrapper[4743]: I1011 00:52:54.957225 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:54 crc kubenswrapper[4743]: I1011 00:52:54.957283 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:54 crc kubenswrapper[4743]: I1011 00:52:54.957301 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:54 crc kubenswrapper[4743]: I1011 00:52:54.957319 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:54 crc kubenswrapper[4743]: I1011 00:52:54.957332 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:54Z","lastTransitionTime":"2025-10-11T00:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:55 crc kubenswrapper[4743]: I1011 00:52:55.060807 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:55 crc kubenswrapper[4743]: I1011 00:52:55.060907 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:55 crc kubenswrapper[4743]: I1011 00:52:55.060928 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:55 crc kubenswrapper[4743]: I1011 00:52:55.060955 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:55 crc kubenswrapper[4743]: I1011 00:52:55.060973 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:55Z","lastTransitionTime":"2025-10-11T00:52:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:55 crc kubenswrapper[4743]: I1011 00:52:55.164979 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:55 crc kubenswrapper[4743]: I1011 00:52:55.165030 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:55 crc kubenswrapper[4743]: I1011 00:52:55.165046 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:55 crc kubenswrapper[4743]: I1011 00:52:55.165075 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:55 crc kubenswrapper[4743]: I1011 00:52:55.165097 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:55Z","lastTransitionTime":"2025-10-11T00:52:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:55 crc kubenswrapper[4743]: I1011 00:52:55.268657 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:55 crc kubenswrapper[4743]: I1011 00:52:55.268712 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:55 crc kubenswrapper[4743]: I1011 00:52:55.268723 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:55 crc kubenswrapper[4743]: I1011 00:52:55.268743 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:55 crc kubenswrapper[4743]: I1011 00:52:55.268758 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:55Z","lastTransitionTime":"2025-10-11T00:52:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:55 crc kubenswrapper[4743]: I1011 00:52:55.372825 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:55 crc kubenswrapper[4743]: I1011 00:52:55.372934 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:55 crc kubenswrapper[4743]: I1011 00:52:55.372962 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:55 crc kubenswrapper[4743]: I1011 00:52:55.372991 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:55 crc kubenswrapper[4743]: I1011 00:52:55.373010 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:55Z","lastTransitionTime":"2025-10-11T00:52:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:55 crc kubenswrapper[4743]: I1011 00:52:55.477345 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:55 crc kubenswrapper[4743]: I1011 00:52:55.477410 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:55 crc kubenswrapper[4743]: I1011 00:52:55.477427 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:55 crc kubenswrapper[4743]: I1011 00:52:55.477452 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:55 crc kubenswrapper[4743]: I1011 00:52:55.477480 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:55Z","lastTransitionTime":"2025-10-11T00:52:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:55 crc kubenswrapper[4743]: I1011 00:52:55.587553 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:55 crc kubenswrapper[4743]: I1011 00:52:55.587625 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:55 crc kubenswrapper[4743]: I1011 00:52:55.587640 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:55 crc kubenswrapper[4743]: I1011 00:52:55.587659 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:55 crc kubenswrapper[4743]: I1011 00:52:55.587672 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:55Z","lastTransitionTime":"2025-10-11T00:52:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:55 crc kubenswrapper[4743]: I1011 00:52:55.690937 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:55 crc kubenswrapper[4743]: I1011 00:52:55.691019 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:55 crc kubenswrapper[4743]: I1011 00:52:55.691042 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:55 crc kubenswrapper[4743]: I1011 00:52:55.691075 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:55 crc kubenswrapper[4743]: I1011 00:52:55.691098 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:55Z","lastTransitionTime":"2025-10-11T00:52:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:55 crc kubenswrapper[4743]: I1011 00:52:55.793801 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:55 crc kubenswrapper[4743]: I1011 00:52:55.793837 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:55 crc kubenswrapper[4743]: I1011 00:52:55.793848 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:55 crc kubenswrapper[4743]: I1011 00:52:55.793878 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:55 crc kubenswrapper[4743]: I1011 00:52:55.793888 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:55Z","lastTransitionTime":"2025-10-11T00:52:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:55 crc kubenswrapper[4743]: I1011 00:52:55.896399 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:55 crc kubenswrapper[4743]: I1011 00:52:55.896447 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:55 crc kubenswrapper[4743]: I1011 00:52:55.896459 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:55 crc kubenswrapper[4743]: I1011 00:52:55.896478 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:55 crc kubenswrapper[4743]: I1011 00:52:55.896487 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:55Z","lastTransitionTime":"2025-10-11T00:52:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.000422 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.000500 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.000516 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.000538 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.000553 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:56Z","lastTransitionTime":"2025-10-11T00:52:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.090763 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.090926 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.091017 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 00:52:56 crc kubenswrapper[4743]: E1011 00:52:56.091193 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.091228 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cb5z5" Oct 11 00:52:56 crc kubenswrapper[4743]: E1011 00:52:56.091304 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 11 00:52:56 crc kubenswrapper[4743]: E1011 00:52:56.091397 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cb5z5" podUID="b02b8636-a5c4-447d-b1cf-401b3dcfa02b" Oct 11 00:52:56 crc kubenswrapper[4743]: E1011 00:52:56.091599 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.102846 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.102944 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.102961 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.102991 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.103007 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:56Z","lastTransitionTime":"2025-10-11T00:52:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.112006 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd049f7c9c78e28d7cfdeb7087cf419e5db915fecd46a83ae0bdc8317fff9728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:56Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.125517 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:56Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.142512 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"add92263-e252-446b-95de-092585b4357f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be9668b466043d0e86521a2ea925416a4987cb051ca0ab8764fd0fd3095991f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d42sg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16b18bcf80747537e2a7a31adc90eec7fdba85d8
166f8cfc53709a1dba33de8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d42sg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cvm72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:56Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.155840 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6wcnk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06a7b971-8779-491c-8d3f-e7d5b4d60968\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38b5a40ddd5390cc392fcf2240304a72163dd797511640ada5ad39a59c0f4283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8adfcc8627f1a6ce588d50eabcc56c6814c3cff75d50144def98466d2dabb69a\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8adfcc8627f1a6ce588d50eabcc56c6814c3cff75d50144def98466d2dabb69a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://800cdd03b73991fe9c1cfbe9a392b5f08854ca2f5f9e5106022bb67307578ad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://800cdd03b73991fe9c1cfbe9a392b5f08854ca2f5f9e5106022bb67307578ad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:10Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49661bf59f8e1d64e5980f0ed00dcc50c4c2ab52cc2232ade890883f0d0158d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49661bf59f8e1d64e5980f0ed00dcc50c4c2ab52cc2232ade890883f0d0158d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27d3d
6d9f65eb7b710148bc4e077c74a15207b637727163afe631acfe3ec90f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27d3d6d9f65eb7b710148bc4e077c74a15207b637727163afe631acfe3ec90f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00edafd52ba2a97f21af491b5233bc2ad945e6b77106eb94a947b777a0129cfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00edafd52ba2a97f21af491b5233bc2ad945e6b77106eb94a947b777a0129cfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://276d1e264f541d55720a57945800ec969959c0828071dcda4fb1c0f13b17118b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://276d1e264f541d55720a57945800ec969959c0828071dcda4fb1c0f13b17118b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6wcnk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:56Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.178478 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ed16b35-862f-47f2-9e32-63c98f868fb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d54b05ec17db8c8c6c3ff8e4d703de8f1f1f0d7e9b5b0133afdf30e3c2b8acee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4127195cac8b12296c2aaefc500beb5f746bee518f3885c094e7bca83ade5e6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://487448bc7b1826df7dbc3aeac42ad10d6aed3390c27c95dfda1b81fe09e27f35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acfe8e798b39769fe2643c6f761b388dc220ccfc707f0023dd680196514b8885\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17db7cfee85ea1051eba348aa0ec7bc8e1354c49e3e4740745a51d3db0ffcf63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d385e354cb831573c89886ade9127287f0974b7a7c244ead9e5547e0211fd9fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d34980f388c932c37e8d5a9122045fe57b4dc1cba358666d3558519a785497\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8d34980f388c932c37e8d5a9122045fe57b4dc1cba358666d3558519a785497\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-11T00:52:33Z\\\",\\\"message\\\":\\\"d syncing service metrics on namespace openshift-kube-apiserver-operator for network=default : 3.881381ms\\\\nI1011 00:52:33.160604 6362 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-scheduler/scheduler\\\\\\\"}\\\\nI1011 00:52:33.160615 6362 services_controller.go:360] Finished syncing service scheduler on namespace openshift-kube-scheduler for network=default : 2.970317ms\\\\nI1011 00:52:33.165904 6362 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1011 00:52:33.165984 6362 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1011 00:52:33.166020 6362 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1011 00:52:33.166766 6362 handler.go:208] Removed *v1.Node event handler 2\\\\nI1011 00:52:33.166850 6362 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1011 00:52:33.166973 6362 factory.go:656] Stopping watch factory\\\\nI1011 00:52:33.167000 6362 ovnkube.go:599] Stopped ovnkube\\\\nI1011 00:52:33.167036 6362 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1011 00:52:33.167068 6362 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1011 00:52:33.167200 6362 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-48ljj_openshift-ovn-kubernetes(9ed16b35-862f-47f2-9e32-63c98f868fb8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecec5075382354c47a6ec8f53263fe6c244c7aadfeab1dda60dd2e88cc8724b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f300eaf5d1c09f3637
f07c6d71c96b49e955d8319841834440872f13c061e736\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-48ljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:56Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.192392 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:56Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.205935 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.206025 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.206037 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:56 crc 
kubenswrapper[4743]: I1011 00:52:56.206054 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.206066 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:56Z","lastTransitionTime":"2025-10-11T00:52:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.215707 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83857db2-dda0-4b62-9336-3393a2c23f3c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a46e8adc9aa9255991b607e98eaf6fc411aeb2a6b0edf45e3e9a7935a2eae5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97b59a1efc62279df4a9766fae0e700a435812160cf3a40a915c967f7a99c9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe9d2bc75aa162a0cab825a1f13cf35f20491a0317000d5a9775977fb7f6b556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf3820ae4ab59d420d2e7f001b2953f751c6dd75f90b5fcc76fdaee8c8df722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f14caad841305b82f04cf14411e2308edbcdd8ea2261ff8401edc95900855ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/
\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://818ca392b918c66a8ca013f3ec5b938595fc78b726af7ae31524c09dbe9302f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://818ca392b918c66a8ca013f3ec5b938595fc78b726af7ae31524c09dbe9302f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44506333e8bd3a20777cf0c382b7652d6d3fa1cdc13f056412cf4899e25115e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44506333e8bd3a20777cf0c382b7652d6d3fa1cdc13f056412cf4899e25115e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e870a1289513fe9e864779c13e63c0991fdaded247b35ac75b7810ab27683ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e870a1289513fe9e864779c13e63c0991fdaded247b35ac75b7810ab27683ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:56Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.230515 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a657286e-3a6d-4265-ae33-a7d2ce8b64ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1558bde7a3b8b01c9ce7f6907ecc43abc6e27587236be7b41487968effa33bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2616059cd6d57ff41cdaa2e1b35e0e8b474398c434022ba6b9e926e19e64c0f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0392332521983f1ea8939df48e05618035ca97895122067daef61ee1fdb7af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b297361a1a2f13aa53d7a5a58948dbe86e8e00f888e6a69cc884154828c29a41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://530c39a66f06ca33cba6a3ab75cb68148425f5a732b503d054b6c74a9d48fabf\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-11T00:52:05Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1011 00:51:59.674497 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1011 00:51:59.678201 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3177302691/tls.crt::/tmp/serving-cert-3177302691/tls.key\\\\\\\"\\\\nI1011 00:52:05.851548 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1011 00:52:05.855373 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1011 00:52:05.855414 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1011 00:52:05.855455 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1011 00:52:05.855472 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1011 00:52:05.866907 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1011 00:52:05.866935 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 00:52:05.866939 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 00:52:05.866944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1011 00:52:05.866947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1011 00:52:05.866951 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1011 00:52:05.866954 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1011 00:52:05.866970 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1011 00:52:05.870518 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1c0227825fa743bb310a1dfb08333cea5ee16a2a2f30b77c97e2f28a2319542\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d411bef56dd0ccdad159fc018c15ad4f0df6adee56150b1143e2dcf442a0fe18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d411bef56dd0ccdad159fc018c15ad4f0
df6adee56150b1143e2dcf442a0fe18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:56Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.246187 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vlxgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d677d122-c8be-4938-8d2c-bde4a088a63a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dbaee9cca1154a094e092eaaf1dea2aeb6a61fbf9323611275485f75ea911e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95k6c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vlxgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:56Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.260804 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-t9nsf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f07b1e57-3c09-4e75-866d-a4292db4e151\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc297cc894f36c7afa9ce15a8e9c71b2d7bf66042c55d0c7686f20b6a3ece7be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c97lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:14Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-t9nsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:56Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.278647 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab5ac12d-37cf-48f3-bb33-b93b4096c706\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4335571969a3cb66f2fefdad63b77b4a31fc2631aba5ba427b2af4db8b6c6f0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://895152fd89f4c35c7d379d3a93c1b5f185275cd151d27b442b8f06280f3f74a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2bc6f9e58ae313b026b5597d2569b343931bc066ac8b3751c62f39c8d849bf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3532ffcf956621ad56477cbb9ff70c2855091a6ad996d3a15fa3c9a28943cea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://c3532ffcf956621ad56477cbb9ff70c2855091a6ad996d3a15fa3c9a28943cea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:56Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.295766 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bca7230a67c579c0fc54921d292f52e18ce2c25f79cd4a351a9f6b576df3553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab9cd20bc5b28791e21984a0edf4fbf50acc05d6160780d0696698715242b5d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:56Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.308971 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.309018 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.309032 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.309055 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.309069 4743 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:56Z","lastTransitionTime":"2025-10-11T00:52:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.313526 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ptjnt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cb3d634-b381-4ee0-a819-ce6f87fa8afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c47ab12f326ce6568bfd06d8beebdeffc07d464b747a288a14793db096ccc6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4dnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b62556627f75a8ed41c68a7fb1982461fd9fb35965014b72ec15492e763e42b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4dnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ptjnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:56Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.328821 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cb5z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b02b8636-a5c4-447d-b1cf-401b3dcfa02b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2zdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2zdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:22Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cb5z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:56Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:56 crc 
kubenswrapper[4743]: I1011 00:52:56.350987 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cd23d7ad070c4d2f029567ac2186d797b082b8da8ff2c9811982a1c5c7b0ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:56Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.369619 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:56Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.387115 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9jfxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8c603f4-717c-4554-992a-8338b3bef24d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853ca9eee244aad8c7006298d6e541ea07022917bee14083b2ded233009257ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5b2pr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9jfxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:56Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.406616 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc432b8f-1fc8-4b7d-a202-bf1452921fc3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ed857549a3f84b624b8bf411ac4359bad74bbf8eea28bda8ebeda3b12dd34e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://748faceccab50f961795c16fb7c31665af75d570094d1dd2c765bb2215b65c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc8d989925344208a60f51034c7c486de0165d5c5a9a1f966df45ac5685c89f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34de48136b54818c60eb929d4b5374689378190620a6502448759f172856d6af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:56Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.411900 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.411975 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.411996 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.412023 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.412036 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:56Z","lastTransitionTime":"2025-10-11T00:52:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.514824 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.514884 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.514894 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.514912 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.514923 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:56Z","lastTransitionTime":"2025-10-11T00:52:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.589314 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9jfxn_e8c603f4-717c-4554-992a-8338b3bef24d/kube-multus/0.log" Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.589395 4743 generic.go:334] "Generic (PLEG): container finished" podID="e8c603f4-717c-4554-992a-8338b3bef24d" containerID="853ca9eee244aad8c7006298d6e541ea07022917bee14083b2ded233009257ad" exitCode=1 Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.589444 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9jfxn" event={"ID":"e8c603f4-717c-4554-992a-8338b3bef24d","Type":"ContainerDied","Data":"853ca9eee244aad8c7006298d6e541ea07022917bee14083b2ded233009257ad"} Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.590059 4743 scope.go:117] "RemoveContainer" containerID="853ca9eee244aad8c7006298d6e541ea07022917bee14083b2ded233009257ad" Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.606447 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc432b8f-1fc8-4b7d-a202-bf1452921fc3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ed857549a3f84b624b8bf411ac4359bad74bbf8eea28bda8ebeda3b12dd34e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://748faceccab50f961795c16fb7c31665af75d570094d1dd2c765bb2215b65c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc8d989925344208a60f51034c7c486de0165d5c5a9a1f966df45ac5685c89f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34de48136b54818c60eb929d4b5374689378190620a6502448759f172856d6af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:56Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.617990 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.618164 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.618184 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.618210 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.618228 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:56Z","lastTransitionTime":"2025-10-11T00:52:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.625286 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:56Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.640481 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9jfxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8c603f4-717c-4554-992a-8338b3bef24d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853ca9eee244aad8c7006298d6e541ea07022917bee14083b2ded233009257ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://853ca9eee244aad8c7006298d6e541ea07022917bee14083b2ded233009257ad\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-11T00:52:55Z\\\",\\\"message\\\":\\\"2025-10-11T00:52:10+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_bf700974-1b5c-49f8-b8b9-3e4987959237\\\\n2025-10-11T00:52:10+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_bf700974-1b5c-49f8-b8b9-3e4987959237 to /host/opt/cni/bin/\\\\n2025-10-11T00:52:10Z [verbose] multus-daemon started\\\\n2025-10-11T00:52:10Z [verbose] Readiness Indicator file check\\\\n2025-10-11T00:52:55Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5b2pr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9jfxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:56Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.666377 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6wcnk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06a7b971-8779-491c-8d3f-e7d5b4d60968\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38b5a40ddd5390cc392fcf2240304a72163dd797511640ada5ad39a59c0f4283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"
kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8adfcc8627f1a6ce588d50eabcc56c6814c3cff75d50144def98466d2dabb69a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8adfcc8627f1a6ce588d50eabcc56c6814c3cff75d50144def98466d2dabb69a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://800cdd03b73991fe9c1cfbe9a392b5f08854ca2f5f9e5106022bb67307578ad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddf
bb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://800cdd03b73991fe9c1cfbe9a392b5f08854ca2f5f9e5106022bb67307578ad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49661bf59f8e1d64e5980f0ed00dcc50c4c2ab52cc2232ade890883f0d0158d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49661bf59f8e1d64e5980f0ed00dcc50c4c2ab52cc2232ade890883f0d0158d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:11Z\\\",\\\"reason\\
\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27d3d6d9f65eb7b710148bc4e077c74a15207b637727163afe631acfe3ec90f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27d3d6d9f65eb7b710148bc4e077c74a15207b637727163afe631acfe3ec90f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00edafd52ba2a97f21af491b5233bc2ad945e6b77106eb94a947b777a0129cfb\\\",\\\"image\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00edafd52ba2a97f21af491b5233bc2ad945e6b77106eb94a947b777a0129cfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://276d1e264f541d55720a57945800ec969959c0828071dcda4fb1c0f13b17118b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://276d1e264f541d55720a57945800ec969959c0828071dcda4fb1c0f13b17118b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mount
Path\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6wcnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:56Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.706443 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ed16b35-862f-47f2-9e32-63c98f868fb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d54b05ec17db8c8c6c3ff8e4d703de8f1f1f0d7e9b5b0133afdf30e3c2b8acee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4127195cac8b12296c2aaefc500beb5f746bee518f3885c094e7bca83ade5e6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://487448bc7b1826df7dbc3aeac42ad10d6aed3390c27c95dfda1b81fe09e27f35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acfe8e798b39769fe2643c6f761b388dc220ccfc707f0023dd680196514b8885\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17db7cfee85ea1051eba348aa0ec7bc8e1354c49e3e4740745a51d3db0ffcf63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d385e354cb831573c89886ade9127287f0974b7a7c244ead9e5547e0211fd9fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d34980f388c932c37e8d5a9122045fe57b4dc1cba358666d3558519a785497\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8d34980f388c932c37e8d5a9122045fe57b4dc1cba358666d3558519a785497\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-11T00:52:33Z\\\",\\\"message\\\":\\\"d syncing service metrics on namespace openshift-kube-apiserver-operator for 
network=default : 3.881381ms\\\\nI1011 00:52:33.160604 6362 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-scheduler/scheduler\\\\\\\"}\\\\nI1011 00:52:33.160615 6362 services_controller.go:360] Finished syncing service scheduler on namespace openshift-kube-scheduler for network=default : 2.970317ms\\\\nI1011 00:52:33.165904 6362 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1011 00:52:33.165984 6362 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1011 00:52:33.166020 6362 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1011 00:52:33.166766 6362 handler.go:208] Removed *v1.Node event handler 2\\\\nI1011 00:52:33.166850 6362 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1011 00:52:33.166973 6362 factory.go:656] Stopping watch factory\\\\nI1011 00:52:33.167000 6362 ovnkube.go:599] Stopped ovnkube\\\\nI1011 00:52:33.167036 6362 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1011 00:52:33.167068 6362 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1011 00:52:33.167200 6362 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-48ljj_openshift-ovn-kubernetes(9ed16b35-862f-47f2-9e32-63c98f868fb8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecec5075382354c47a6ec8f53263fe6c244c7aadfeab1dda60dd2e88cc8724b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f300eaf5d1c09f3637
f07c6d71c96b49e955d8319841834440872f13c061e736\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-48ljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:56Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.722811 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.722893 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.722912 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.722938 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.722957 4743 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:56Z","lastTransitionTime":"2025-10-11T00:52:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.729675 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:56Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.746328 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd049f7c9c78e28d7cfdeb7087cf419e5db915fecd46a83ae0bdc8317fff9728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-11T00:52:56Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.758983 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:56Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.768973 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"add92263-e252-446b-95de-092585b4357f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be9668b466043d0e86521a2ea925416a4987cb051ca0ab8764fd0fd3095991f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d42sg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16b18bcf80747537e2a7a31adc90eec7fdba85d8
166f8cfc53709a1dba33de8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d42sg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cvm72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:56Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.777838 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vlxgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d677d122-c8be-4938-8d2c-bde4a088a63a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dbaee9cca1154a094e092eaaf1dea2aeb6a61fbf9323611275485f75ea911e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95k6c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vlxgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:56Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.786594 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-t9nsf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f07b1e57-3c09-4e75-866d-a4292db4e151\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc297cc894f36c7afa9ce15a8e9c71b2d7bf66042c55d0c7686f20b6a3ece7be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c97lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:14Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-t9nsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:56Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.808119 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab5ac12d-37cf-48f3-bb33-b93b4096c706\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4335571969a3cb66f2fefdad63b77b4a31fc2631aba5ba427b2af4db8b6c6f0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://895152fd89f4c35c7d379d3a93c1b5f185275cd151d27b442b8f06280f3f74a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2bc6f9e58ae313b026b5597d2569b343931bc066ac8b3751c62f39c8d849bf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3532ffcf956621ad56477cbb9ff70c2855091a6ad996d3a15fa3c9a28943cea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://c3532ffcf956621ad56477cbb9ff70c2855091a6ad996d3a15fa3c9a28943cea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:56Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.825829 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.825915 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.825928 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.825948 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.825959 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:56Z","lastTransitionTime":"2025-10-11T00:52:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.834943 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83857db2-dda0-4b62-9336-3393a2c23f3c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a46e8adc9aa9255991b607e98eaf6fc411aeb2a6b0edf45e3e9a7935a2eae5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97b59a1efc62279df4a9766fae0e700a435812160cf3a40a915c967f7a99c9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe9d2bc75aa162a0cab825a1f13cf35f20491a0317000d5a9775977fb7f6b556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf3820ae4ab59d420d2e7f001b2953f751c6dd75f90b5fcc76fdaee8c8df722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f14caad841305b82f04cf14411e2308edbcdd8ea2261ff8401edc95900855ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://818ca392b918c66a8ca013f3ec5b938595fc78b726af7ae31524c09dbe9302f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://818ca392b918c66a8ca013f3ec5b938595fc78b726af7ae31524c09dbe9302f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44506333e8bd3a20777cf0c382b7652d6d3fa1cdc13f056412cf4899e25115e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44506333e8bd3a20777cf0c382b7652d6d3fa1cdc13f056412cf4899e25115e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e870a1289513fe9e864779c13e63c0991fdaded247b35ac75b7810ab27683ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e870a1289513fe9e864779c13e63c0991fdaded247b35ac75b7810ab27683ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-11T00:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:56Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.854369 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a657286e-3a6d-4265-ae33-a7d2ce8b64ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1558bde7a3b8b01c9ce7f6907ecc43abc6e27587236be7b41487968effa33bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2616059cd6d57ff41cdaa2e1b35e0e8b474398c434022ba6b9e926e19e64c0f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0392332521983f1ea8939df48e05618035ca97895122067daef61ee1fdb7af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b297361a1a2f13aa53d7a5a58948dbe86e8e00f888e6a69cc884154828c29a41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://530c39a66f06ca33cba6a3ab75cb68148425f5a732b503d054b6c74a9d48fabf\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-11T00:52:05Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1011 00:51:59.674497 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1011 00:51:59.678201 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3177302691/tls.crt::/tmp/serving-cert-3177302691/tls.key\\\\\\\"\\\\nI1011 00:52:05.851548 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1011 00:52:05.855373 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1011 00:52:05.855414 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1011 00:52:05.855455 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1011 00:52:05.855472 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1011 00:52:05.866907 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1011 00:52:05.866935 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 00:52:05.866939 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 00:52:05.866944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1011 00:52:05.866947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1011 00:52:05.866951 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1011 00:52:05.866954 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1011 00:52:05.866970 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1011 00:52:05.870518 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1c0227825fa743bb310a1dfb08333cea5ee16a2a2f30b77c97e2f28a2319542\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d411bef56dd0ccdad159fc018c15ad4f0df6adee56150b1143e2dcf442a0fe18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d411bef56dd0ccdad159fc018c15ad4f0
df6adee56150b1143e2dcf442a0fe18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:56Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.870900 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cd23d7ad070c4d2f029567ac2186d797b082b8da8ff2c9811982a1c5c7b0ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:56Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.892391 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bca7230a67c579c0fc54921d292f52e18ce2c25f79cd4a351a9f6b576df3553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://ab9cd20bc5b28791e21984a0edf4fbf50acc05d6160780d0696698715242b5d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:56Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.908010 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ptjnt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cb3d634-b381-4ee0-a819-ce6f87fa8afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c47ab12f326ce6568bfd06d8beebdeffc07d464b747a288a14793db096ccc6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4dnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b62556627f75a8ed41c68a7fb1982461fd9fb
35965014b72ec15492e763e42b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4dnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ptjnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:56Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.923328 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cb5z5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b02b8636-a5c4-447d-b1cf-401b3dcfa02b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2zdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2zdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:22Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cb5z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:56Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:56 crc 
kubenswrapper[4743]: I1011 00:52:56.928641 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.928733 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.928762 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.928797 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 11 00:52:56 crc kubenswrapper[4743]: I1011 00:52:56.928821 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:56Z","lastTransitionTime":"2025-10-11T00:52:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.031699 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.031750 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.031762 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.031780 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.031795 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:57Z","lastTransitionTime":"2025-10-11T00:52:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.134455 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.134518 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.134537 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.134561 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.134576 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:57Z","lastTransitionTime":"2025-10-11T00:52:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.238205 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.238275 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.238287 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.238308 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.238321 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:57Z","lastTransitionTime":"2025-10-11T00:52:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.343819 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.343893 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.343910 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.343934 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.343947 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:57Z","lastTransitionTime":"2025-10-11T00:52:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.447517 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.447663 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.447686 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.447746 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.447777 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:57Z","lastTransitionTime":"2025-10-11T00:52:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.551027 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.551093 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.551111 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.551143 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.551163 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:57Z","lastTransitionTime":"2025-10-11T00:52:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.619326 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9jfxn_e8c603f4-717c-4554-992a-8338b3bef24d/kube-multus/0.log" Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.619391 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9jfxn" event={"ID":"e8c603f4-717c-4554-992a-8338b3bef24d","Type":"ContainerStarted","Data":"21793d4ae38fc6e714912dbedbba8c45e37482a03cbd76d10461c41851e16896"} Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.644037 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ed16b35-862f-47f2-9e32-63c98f868fb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d54b05ec17db8c8c6c3ff8e4d703de8f1f1f0d7e9b5b0133afdf30e3c2b8acee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4127195cac8b12296c2aaefc500beb5f746bee518f3885c094e7bca83ade5e6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://487448bc7b1826df7dbc3aeac42ad10d6aed3390c27c95dfda1b81fe09e27f35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acfe8e798b39769fe2643c6f761b388dc220ccfc707f0023dd680196514b8885\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17db7cfee85ea1051eba348aa0ec7bc8e1354c49e3e4740745a51d3db0ffcf63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d385e354cb831573c89886ade9127287f0974b7a7c244ead9e5547e0211fd9fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d34980f388c932c37e8d5a9122045fe57b4dc1cba358666d3558519a785497\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8d34980f388c932c37e8d5a9122045fe57b4dc1cba358666d3558519a785497\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-11T00:52:33Z\\\",\\\"message\\\":\\\"d syncing service metrics on namespace openshift-kube-apiserver-operator for network=default : 3.881381ms\\\\nI1011 00:52:33.160604 6362 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-scheduler/scheduler\\\\\\\"}\\\\nI1011 00:52:33.160615 6362 services_controller.go:360] Finished syncing service scheduler on namespace openshift-kube-scheduler for network=default : 2.970317ms\\\\nI1011 00:52:33.165904 6362 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1011 00:52:33.165984 6362 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1011 00:52:33.166020 6362 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1011 00:52:33.166766 6362 handler.go:208] Removed *v1.Node event handler 2\\\\nI1011 00:52:33.166850 6362 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1011 00:52:33.166973 6362 factory.go:656] Stopping watch factory\\\\nI1011 00:52:33.167000 6362 ovnkube.go:599] Stopped ovnkube\\\\nI1011 00:52:33.167036 6362 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1011 00:52:33.167068 6362 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1011 00:52:33.167200 6362 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-48ljj_openshift-ovn-kubernetes(9ed16b35-862f-47f2-9e32-63c98f868fb8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecec5075382354c47a6ec8f53263fe6c244c7aadfeab1dda60dd2e88cc8724b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f300eaf5d1c09f3637
f07c6d71c96b49e955d8319841834440872f13c061e736\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-48ljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:57Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.653498 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.653571 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.653590 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.653616 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.653636 4743 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:57Z","lastTransitionTime":"2025-10-11T00:52:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.664323 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:57Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.679011 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd049f7c9c78e28d7cfdeb7087cf419e5db915fecd46a83ae0bdc8317fff9728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-11T00:52:57Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.690471 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:57Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.700816 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"add92263-e252-446b-95de-092585b4357f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be9668b466043d0e86521a2ea925416a4987cb051ca0ab8764fd0fd3095991f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d42sg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16b18bcf80747537e2a7a31adc90eec7fdba85d8
166f8cfc53709a1dba33de8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d42sg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cvm72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:57Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.715134 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6wcnk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06a7b971-8779-491c-8d3f-e7d5b4d60968\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38b5a40ddd5390cc392fcf2240304a72163dd797511640ada5ad39a59c0f4283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8adfcc8627f1a6ce588d50eabcc56c6814c3cff75d50144def98466d2dabb69a\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8adfcc8627f1a6ce588d50eabcc56c6814c3cff75d50144def98466d2dabb69a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://800cdd03b73991fe9c1cfbe9a392b5f08854ca2f5f9e5106022bb67307578ad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://800cdd03b73991fe9c1cfbe9a392b5f08854ca2f5f9e5106022bb67307578ad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:10Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49661bf59f8e1d64e5980f0ed00dcc50c4c2ab52cc2232ade890883f0d0158d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49661bf59f8e1d64e5980f0ed00dcc50c4c2ab52cc2232ade890883f0d0158d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27d3d
6d9f65eb7b710148bc4e077c74a15207b637727163afe631acfe3ec90f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27d3d6d9f65eb7b710148bc4e077c74a15207b637727163afe631acfe3ec90f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00edafd52ba2a97f21af491b5233bc2ad945e6b77106eb94a947b777a0129cfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00edafd52ba2a97f21af491b5233bc2ad945e6b77106eb94a947b777a0129cfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://276d1e264f541d55720a57945800ec969959c0828071dcda4fb1c0f13b17118b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://276d1e264f541d55720a57945800ec969959c0828071dcda4fb1c0f13b17118b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6wcnk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:57Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.726111 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-t9nsf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f07b1e57-3c09-4e75-866d-a4292db4e151\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc297cc894f36c7afa9ce15a8e9c71b2d7bf66042c55d0c7686f20b6a3ece7be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-10-11T00:52:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c97lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:14Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-t9nsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:57Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.738212 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab5ac12d-37cf-48f3-bb33-b93b4096c706\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4335571969a3cb66f2fefdad63b77b4a31fc2631aba5ba427b2af4db8b6c6f0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://895152fd89f4c35c7d379d3a93c1b5f185275cd151d27b442b8f06280f3f74a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2bc6f9e58ae313b026b5597d2569b343931bc066ac8b3751c62f39c8d849bf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3532ffcf956621ad56477cbb9ff70c2855091a6ad996d3a15fa3c9a28943cea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://c3532ffcf956621ad56477cbb9ff70c2855091a6ad996d3a15fa3c9a28943cea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:57Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.757892 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83857db2-dda0-4b62-9336-3393a2c23f3c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a46e8adc9aa9255991b607e98eaf6fc411aeb2a6b0edf45e3e9a7935a2eae5b\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97b59a1efc62279df4a9766fae0e700a435812160cf3a40a915c967f7a99c9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe9d2bc75aa162a0cab825a1f13cf35f20491a0317000d5a9775977fb7f6b556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf3820ae4ab59d420d2e7f001b2953f751c6dd75f90b5fcc76fdaee8c8df722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f14caad841305b82f04cf14411e2308edbcdd8ea2261ff8401edc95900855ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://818ca392b918c66a8ca013f3ec5b938595fc78b726af7ae31524c09dbe9302f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://818ca392b918c66a8ca013f3ec5b938595fc78b726af7ae31524c09dbe9302f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44506333e8bd3a20777cf0c382b7652d6d3fa1cdc13f056412cf4899e25115e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44506333e8bd3a20777cf0c382b7652d6d3fa1cdc13f056412cf4899e25115e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}}},{\\\"co
ntainerID\\\":\\\"cri-o://e870a1289513fe9e864779c13e63c0991fdaded247b35ac75b7810ab27683ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e870a1289513fe9e864779c13e63c0991fdaded247b35ac75b7810ab27683ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:57Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.762833 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.762907 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:57 crc kubenswrapper[4743]: 
I1011 00:52:57.762925 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.762949 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.762964 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:57Z","lastTransitionTime":"2025-10-11T00:52:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.773264 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.773331 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.773343 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.773363 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.773375 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:57Z","lastTransitionTime":"2025-10-11T00:52:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.773367 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a657286e-3a6d-4265-ae33-a7d2ce8b64ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1558bde7a3b8b01c9ce7f6907ecc43abc6e27587236be7b41487968effa33bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2616059cd6d57ff41cdaa2e1b35e0e8b474398c434022ba6b9e926e19e64c0f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0392332521983f1ea8939df48e05618035ca97895122067daef61ee1fdb7af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b297361a1a2f13aa53d7a5a58948dbe86e8e00f888e6a69cc884154828c29a41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://530c39a66f06ca33cba6a3ab75cb68148425f5a732b503d054b6c74a9d48fabf\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-11T00:52:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1011 00:51:59.674497 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1011 00:51:59.678201 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3177302691/tls.crt::/tmp/serving-cert-3177302691/tls.key\\\\\\\"\\\\nI1011 00:52:05.851548 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1011 00:52:05.855373 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1011 00:52:05.855414 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1011 00:52:05.855455 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1011 00:52:05.855472 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1011 00:52:05.866907 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1011 00:52:05.866935 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 00:52:05.866939 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 00:52:05.866944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1011 00:52:05.866947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1011 00:52:05.866951 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1011 00:52:05.866954 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1011 00:52:05.866970 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1011 00:52:05.870518 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1c0227825fa743bb310a1dfb08333cea5ee16a2a2f30b77c97e2f28a2319542\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d411bef56dd0ccdad159fc018c15ad4f0df6adee56150b1143e2dcf442a0fe18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d411bef56dd0ccdad159fc018c15ad4f0df6adee56150b1143e2dcf442a0fe18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:57Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:57 crc kubenswrapper[4743]: E1011 00:52:57.784788 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4a117022-09d2-46e0-826b-22308ec25890\\\",\\\"systemUUID\\\":\\\"407eb137-47d1-41e8-9c72-65f09e76d21a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:57Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.786316 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vlxgw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d677d122-c8be-4938-8d2c-bde4a088a63a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dbaee9cca1154a094e092eaaf1dea2aeb6a61fbf9323611275485f75ea911e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"st
ate\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95k6c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vlxgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:57Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.787962 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.788028 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.788043 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.788070 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.788086 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:57Z","lastTransitionTime":"2025-10-11T00:52:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.798718 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cd23d7ad070c4d2f029567ac2186d797b082b8da8ff2c9811982a1c5c7b0ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\"
,\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:57Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:57 crc kubenswrapper[4743]: E1011 00:52:57.800352 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4a117022-09d2-46e0-826b-22308ec25890\\\",\\\"systemUUID\\\":\\\"407eb137-47d1-41e8-9c72-65f09e76d21a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:57Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.803811 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.803866 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.803879 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.803902 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.803915 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:57Z","lastTransitionTime":"2025-10-11T00:52:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.810629 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bca7230a67c579c0fc54921d292f52e18ce2c25f79cd4a351a9f6b576df3553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab9cd20bc5b28791e21984a0edf4fbf50acc05d6160780d0696698715242b5d0\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:57Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:57 crc kubenswrapper[4743]: E1011 00:52:57.813464 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4a117022-09d2-46e0-826b-22308ec25890\\\",\\\"systemUUID\\\":\\\"407eb137-47d1-41e8-9c72-65f09e76d21a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:57Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.816516 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.816578 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.816598 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.816620 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.816635 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:57Z","lastTransitionTime":"2025-10-11T00:52:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.820795 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ptjnt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cb3d634-b381-4ee0-a819-ce6f87fa8afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c47ab12f326ce6568bfd06d8beebdeffc07d464b747a288a14793db096ccc6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4dnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b62556627f75a8ed41c68a7fb1982461fd9fb35965014b72ec15492e763e42b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4dnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ptjnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:57Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:57 crc kubenswrapper[4743]: E1011 00:52:57.827964 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4a117022-09d2-46e0-826b-22308ec25890\\\",\\\"systemUUID\\\":\\\"407eb137-47d1-41e8-9c72-65f09e76d21a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:57Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.830511 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cb5z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b02b8636-a5c4-447d-b1cf-401b3dcfa02b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2zdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2zdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:22Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cb5z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:57Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:57 crc 
kubenswrapper[4743]: I1011 00:52:57.831074 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.831108 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.831121 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.831139 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.831154 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:57Z","lastTransitionTime":"2025-10-11T00:52:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.843087 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc432b8f-1fc8-4b7d-a202-bf1452921fc3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ed857549a3f84b624b8bf411ac4359bad74bbf8eea28bda8ebeda3b12dd34e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://748faceccab
50f961795c16fb7c31665af75d570094d1dd2c765bb2215b65c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc8d989925344208a60f51034c7c486de0165d5c5a9a1f966df45ac5685c89f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34de48136b54818c60eb929d4b5374689378190620a6502448759f172856d6af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:57Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:57 crc kubenswrapper[4743]: E1011 00:52:57.845486 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:52:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4a117022-09d2-46e0-826b-22308ec25890\\\",\\\"systemUUID\\\":\\\"407eb137-47d1-41e8-9c72-65f09e76d21a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:57Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:57 crc kubenswrapper[4743]: E1011 00:52:57.845644 4743 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.855957 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:57Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.866521 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.866546 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.866557 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.866573 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.866586 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:57Z","lastTransitionTime":"2025-10-11T00:52:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.868802 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9jfxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8c603f4-717c-4554-992a-8338b3bef24d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21793d4ae38fc6e714912dbedbba8c45e37482a03cbd76d10461c41851e16896\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://853ca9eee244aad8c7006298d6e541ea07022917bee14083b2ded233009257ad\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-11T00:52:55Z\\\",\\\"message\\\":\\\"2025-10-11T00:52:10+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ 
to /host/opt/cni/bin/upgrade_bf700974-1b5c-49f8-b8b9-3e4987959237\\\\n2025-10-11T00:52:10+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_bf700974-1b5c-49f8-b8b9-3e4987959237 to /host/opt/cni/bin/\\\\n2025-10-11T00:52:10Z [verbose] multus-daemon started\\\\n2025-10-11T00:52:10Z [verbose] Readiness Indicator file check\\\\n2025-10-11T00:52:55Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:09Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/mult
us.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5b2pr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9jfxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:52:57Z is after 2025-08-24T17:21:41Z" Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.968839 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.968897 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.968909 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.968924 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:57 crc kubenswrapper[4743]: I1011 00:52:57.968937 4743 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:57Z","lastTransitionTime":"2025-10-11T00:52:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:58 crc kubenswrapper[4743]: I1011 00:52:58.072773 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:58 crc kubenswrapper[4743]: I1011 00:52:58.072809 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:58 crc kubenswrapper[4743]: I1011 00:52:58.072822 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:58 crc kubenswrapper[4743]: I1011 00:52:58.072841 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:58 crc kubenswrapper[4743]: I1011 00:52:58.072875 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:58Z","lastTransitionTime":"2025-10-11T00:52:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:52:58 crc kubenswrapper[4743]: I1011 00:52:58.091309 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 00:52:58 crc kubenswrapper[4743]: I1011 00:52:58.091310 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cb5z5" Oct 11 00:52:58 crc kubenswrapper[4743]: I1011 00:52:58.091406 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 00:52:58 crc kubenswrapper[4743]: I1011 00:52:58.091424 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 00:52:58 crc kubenswrapper[4743]: E1011 00:52:58.091630 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 11 00:52:58 crc kubenswrapper[4743]: E1011 00:52:58.091673 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cb5z5" podUID="b02b8636-a5c4-447d-b1cf-401b3dcfa02b" Oct 11 00:52:58 crc kubenswrapper[4743]: E1011 00:52:58.091731 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 11 00:52:58 crc kubenswrapper[4743]: E1011 00:52:58.091935 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 11 00:52:58 crc kubenswrapper[4743]: I1011 00:52:58.175740 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:58 crc kubenswrapper[4743]: I1011 00:52:58.175797 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:58 crc kubenswrapper[4743]: I1011 00:52:58.175817 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:58 crc kubenswrapper[4743]: I1011 00:52:58.175843 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:58 crc kubenswrapper[4743]: I1011 00:52:58.175891 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:58Z","lastTransitionTime":"2025-10-11T00:52:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:58 crc kubenswrapper[4743]: I1011 00:52:58.279053 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:58 crc kubenswrapper[4743]: I1011 00:52:58.279147 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:58 crc kubenswrapper[4743]: I1011 00:52:58.279173 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:58 crc kubenswrapper[4743]: I1011 00:52:58.279206 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:58 crc kubenswrapper[4743]: I1011 00:52:58.279230 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:58Z","lastTransitionTime":"2025-10-11T00:52:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:58 crc kubenswrapper[4743]: I1011 00:52:58.382584 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:58 crc kubenswrapper[4743]: I1011 00:52:58.382654 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:58 crc kubenswrapper[4743]: I1011 00:52:58.382672 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:58 crc kubenswrapper[4743]: I1011 00:52:58.382699 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:58 crc kubenswrapper[4743]: I1011 00:52:58.382722 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:58Z","lastTransitionTime":"2025-10-11T00:52:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:58 crc kubenswrapper[4743]: I1011 00:52:58.492039 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:58 crc kubenswrapper[4743]: I1011 00:52:58.492125 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:58 crc kubenswrapper[4743]: I1011 00:52:58.492146 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:58 crc kubenswrapper[4743]: I1011 00:52:58.492175 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:58 crc kubenswrapper[4743]: I1011 00:52:58.492196 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:58Z","lastTransitionTime":"2025-10-11T00:52:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:58 crc kubenswrapper[4743]: I1011 00:52:58.595330 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:58 crc kubenswrapper[4743]: I1011 00:52:58.595361 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:58 crc kubenswrapper[4743]: I1011 00:52:58.595369 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:58 crc kubenswrapper[4743]: I1011 00:52:58.595384 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:58 crc kubenswrapper[4743]: I1011 00:52:58.595395 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:58Z","lastTransitionTime":"2025-10-11T00:52:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:58 crc kubenswrapper[4743]: I1011 00:52:58.698829 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:58 crc kubenswrapper[4743]: I1011 00:52:58.698921 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:58 crc kubenswrapper[4743]: I1011 00:52:58.698939 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:58 crc kubenswrapper[4743]: I1011 00:52:58.698965 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:58 crc kubenswrapper[4743]: I1011 00:52:58.698985 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:58Z","lastTransitionTime":"2025-10-11T00:52:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:58 crc kubenswrapper[4743]: I1011 00:52:58.802185 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:58 crc kubenswrapper[4743]: I1011 00:52:58.802233 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:58 crc kubenswrapper[4743]: I1011 00:52:58.802245 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:58 crc kubenswrapper[4743]: I1011 00:52:58.802261 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:58 crc kubenswrapper[4743]: I1011 00:52:58.802273 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:58Z","lastTransitionTime":"2025-10-11T00:52:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:58 crc kubenswrapper[4743]: I1011 00:52:58.905739 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:58 crc kubenswrapper[4743]: I1011 00:52:58.905808 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:58 crc kubenswrapper[4743]: I1011 00:52:58.905827 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:58 crc kubenswrapper[4743]: I1011 00:52:58.906958 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:58 crc kubenswrapper[4743]: I1011 00:52:58.907029 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:58Z","lastTransitionTime":"2025-10-11T00:52:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:59 crc kubenswrapper[4743]: I1011 00:52:59.010398 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:59 crc kubenswrapper[4743]: I1011 00:52:59.010460 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:59 crc kubenswrapper[4743]: I1011 00:52:59.010477 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:59 crc kubenswrapper[4743]: I1011 00:52:59.010504 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:59 crc kubenswrapper[4743]: I1011 00:52:59.010521 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:59Z","lastTransitionTime":"2025-10-11T00:52:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:59 crc kubenswrapper[4743]: I1011 00:52:59.114152 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:59 crc kubenswrapper[4743]: I1011 00:52:59.114210 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:59 crc kubenswrapper[4743]: I1011 00:52:59.114228 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:59 crc kubenswrapper[4743]: I1011 00:52:59.114254 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:59 crc kubenswrapper[4743]: I1011 00:52:59.114273 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:59Z","lastTransitionTime":"2025-10-11T00:52:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:59 crc kubenswrapper[4743]: I1011 00:52:59.217787 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:59 crc kubenswrapper[4743]: I1011 00:52:59.217844 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:59 crc kubenswrapper[4743]: I1011 00:52:59.217885 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:59 crc kubenswrapper[4743]: I1011 00:52:59.217908 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:59 crc kubenswrapper[4743]: I1011 00:52:59.217923 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:59Z","lastTransitionTime":"2025-10-11T00:52:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:59 crc kubenswrapper[4743]: I1011 00:52:59.321174 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:59 crc kubenswrapper[4743]: I1011 00:52:59.321222 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:59 crc kubenswrapper[4743]: I1011 00:52:59.321240 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:59 crc kubenswrapper[4743]: I1011 00:52:59.321265 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:59 crc kubenswrapper[4743]: I1011 00:52:59.321282 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:59Z","lastTransitionTime":"2025-10-11T00:52:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:59 crc kubenswrapper[4743]: I1011 00:52:59.424747 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:59 crc kubenswrapper[4743]: I1011 00:52:59.424821 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:59 crc kubenswrapper[4743]: I1011 00:52:59.424837 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:59 crc kubenswrapper[4743]: I1011 00:52:59.424893 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:59 crc kubenswrapper[4743]: I1011 00:52:59.424913 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:59Z","lastTransitionTime":"2025-10-11T00:52:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:59 crc kubenswrapper[4743]: I1011 00:52:59.529809 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:59 crc kubenswrapper[4743]: I1011 00:52:59.529892 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:59 crc kubenswrapper[4743]: I1011 00:52:59.529912 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:59 crc kubenswrapper[4743]: I1011 00:52:59.529937 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:59 crc kubenswrapper[4743]: I1011 00:52:59.529957 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:59Z","lastTransitionTime":"2025-10-11T00:52:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:59 crc kubenswrapper[4743]: I1011 00:52:59.632945 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:59 crc kubenswrapper[4743]: I1011 00:52:59.633823 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:59 crc kubenswrapper[4743]: I1011 00:52:59.634002 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:59 crc kubenswrapper[4743]: I1011 00:52:59.634180 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:59 crc kubenswrapper[4743]: I1011 00:52:59.634344 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:59Z","lastTransitionTime":"2025-10-11T00:52:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:59 crc kubenswrapper[4743]: I1011 00:52:59.737721 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:59 crc kubenswrapper[4743]: I1011 00:52:59.738106 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:59 crc kubenswrapper[4743]: I1011 00:52:59.738297 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:59 crc kubenswrapper[4743]: I1011 00:52:59.738491 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:59 crc kubenswrapper[4743]: I1011 00:52:59.738951 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:59Z","lastTransitionTime":"2025-10-11T00:52:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:59 crc kubenswrapper[4743]: I1011 00:52:59.842683 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:59 crc kubenswrapper[4743]: I1011 00:52:59.842739 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:59 crc kubenswrapper[4743]: I1011 00:52:59.842755 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:59 crc kubenswrapper[4743]: I1011 00:52:59.842780 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:59 crc kubenswrapper[4743]: I1011 00:52:59.842799 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:59Z","lastTransitionTime":"2025-10-11T00:52:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:52:59 crc kubenswrapper[4743]: I1011 00:52:59.945637 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:52:59 crc kubenswrapper[4743]: I1011 00:52:59.945688 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:52:59 crc kubenswrapper[4743]: I1011 00:52:59.945706 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:52:59 crc kubenswrapper[4743]: I1011 00:52:59.945726 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:52:59 crc kubenswrapper[4743]: I1011 00:52:59.945742 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:52:59Z","lastTransitionTime":"2025-10-11T00:52:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 11 00:53:00 crc kubenswrapper[4743]: I1011 00:53:00.048036 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 11 00:53:00 crc kubenswrapper[4743]: I1011 00:53:00.048098 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 11 00:53:00 crc kubenswrapper[4743]: I1011 00:53:00.048117 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 11 00:53:00 crc kubenswrapper[4743]: I1011 00:53:00.048146 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 11 00:53:00 crc kubenswrapper[4743]: I1011 00:53:00.048165 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:00Z","lastTransitionTime":"2025-10-11T00:53:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 11 00:53:00 crc kubenswrapper[4743]: I1011 00:53:00.091715 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 11 00:53:00 crc kubenswrapper[4743]: I1011 00:53:00.091788 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cb5z5"
Oct 11 00:53:00 crc kubenswrapper[4743]: I1011 00:53:00.092015 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 11 00:53:00 crc kubenswrapper[4743]: E1011 00:53:00.092116 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cb5z5" podUID="b02b8636-a5c4-447d-b1cf-401b3dcfa02b"
Oct 11 00:53:00 crc kubenswrapper[4743]: E1011 00:53:00.092230 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 11 00:53:00 crc kubenswrapper[4743]: E1011 00:53:00.091957 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 11 00:53:00 crc kubenswrapper[4743]: I1011 00:53:00.092495 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 11 00:53:00 crc kubenswrapper[4743]: E1011 00:53:00.092789 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 11 00:53:00 crc kubenswrapper[4743]: I1011 00:53:00.151291 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 11 00:53:00 crc kubenswrapper[4743]: I1011 00:53:00.151506 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 11 00:53:00 crc kubenswrapper[4743]: I1011 00:53:00.151667 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 11 00:53:00 crc kubenswrapper[4743]: I1011 00:53:00.151811 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 11 00:53:00 crc kubenswrapper[4743]: I1011 00:53:00.151973 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:00Z","lastTransitionTime":"2025-10-11T00:53:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 11 00:53:00 crc kubenswrapper[4743]: I1011 00:53:00.255007 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 11 00:53:00 crc kubenswrapper[4743]: I1011 00:53:00.255349 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 11 00:53:00 crc kubenswrapper[4743]: I1011 00:53:00.255474 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 11 00:53:00 crc kubenswrapper[4743]: I1011 00:53:00.255609 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 11 00:53:00 crc kubenswrapper[4743]: I1011 00:53:00.255783 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:00Z","lastTransitionTime":"2025-10-11T00:53:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 11 00:53:00 crc kubenswrapper[4743]: I1011 00:53:00.359162 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 11 00:53:00 crc kubenswrapper[4743]: I1011 00:53:00.359206 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 11 00:53:00 crc kubenswrapper[4743]: I1011 00:53:00.359225 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 11 00:53:00 crc kubenswrapper[4743]: I1011 00:53:00.359250 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 11 00:53:00 crc kubenswrapper[4743]: I1011 00:53:00.359268 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:00Z","lastTransitionTime":"2025-10-11T00:53:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 11 00:53:00 crc kubenswrapper[4743]: I1011 00:53:00.461809 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 11 00:53:00 crc kubenswrapper[4743]: I1011 00:53:00.461890 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 11 00:53:00 crc kubenswrapper[4743]: I1011 00:53:00.461908 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 11 00:53:00 crc kubenswrapper[4743]: I1011 00:53:00.461928 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 11 00:53:00 crc kubenswrapper[4743]: I1011 00:53:00.461945 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:00Z","lastTransitionTime":"2025-10-11T00:53:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 11 00:53:00 crc kubenswrapper[4743]: I1011 00:53:00.565270 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 11 00:53:00 crc kubenswrapper[4743]: I1011 00:53:00.565325 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 11 00:53:00 crc kubenswrapper[4743]: I1011 00:53:00.565342 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 11 00:53:00 crc kubenswrapper[4743]: I1011 00:53:00.565366 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 11 00:53:00 crc kubenswrapper[4743]: I1011 00:53:00.565388 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:00Z","lastTransitionTime":"2025-10-11T00:53:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 11 00:53:00 crc kubenswrapper[4743]: I1011 00:53:00.668081 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 11 00:53:00 crc kubenswrapper[4743]: I1011 00:53:00.668111 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 11 00:53:00 crc kubenswrapper[4743]: I1011 00:53:00.668145 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 11 00:53:00 crc kubenswrapper[4743]: I1011 00:53:00.668160 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 11 00:53:00 crc kubenswrapper[4743]: I1011 00:53:00.668168 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:00Z","lastTransitionTime":"2025-10-11T00:53:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 11 00:53:00 crc kubenswrapper[4743]: I1011 00:53:00.770532 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 11 00:53:00 crc kubenswrapper[4743]: I1011 00:53:00.770584 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 11 00:53:00 crc kubenswrapper[4743]: I1011 00:53:00.770600 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 11 00:53:00 crc kubenswrapper[4743]: I1011 00:53:00.770628 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 11 00:53:00 crc kubenswrapper[4743]: I1011 00:53:00.770644 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:00Z","lastTransitionTime":"2025-10-11T00:53:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 11 00:53:00 crc kubenswrapper[4743]: I1011 00:53:00.873423 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 11 00:53:00 crc kubenswrapper[4743]: I1011 00:53:00.873482 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 11 00:53:00 crc kubenswrapper[4743]: I1011 00:53:00.873498 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 11 00:53:00 crc kubenswrapper[4743]: I1011 00:53:00.873537 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 11 00:53:00 crc kubenswrapper[4743]: I1011 00:53:00.873553 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:00Z","lastTransitionTime":"2025-10-11T00:53:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 11 00:53:00 crc kubenswrapper[4743]: I1011 00:53:00.976673 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 11 00:53:00 crc kubenswrapper[4743]: I1011 00:53:00.976719 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 11 00:53:00 crc kubenswrapper[4743]: I1011 00:53:00.976732 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 11 00:53:00 crc kubenswrapper[4743]: I1011 00:53:00.976747 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 11 00:53:00 crc kubenswrapper[4743]: I1011 00:53:00.976764 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:00Z","lastTransitionTime":"2025-10-11T00:53:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 11 00:53:01 crc kubenswrapper[4743]: I1011 00:53:01.079875 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 11 00:53:01 crc kubenswrapper[4743]: I1011 00:53:01.079931 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 11 00:53:01 crc kubenswrapper[4743]: I1011 00:53:01.079942 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 11 00:53:01 crc kubenswrapper[4743]: I1011 00:53:01.079959 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 11 00:53:01 crc kubenswrapper[4743]: I1011 00:53:01.079971 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:01Z","lastTransitionTime":"2025-10-11T00:53:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 11 00:53:01 crc kubenswrapper[4743]: I1011 00:53:01.092449 4743 scope.go:117] "RemoveContainer" containerID="f8d34980f388c932c37e8d5a9122045fe57b4dc1cba358666d3558519a785497"
Oct 11 00:53:01 crc kubenswrapper[4743]: I1011 00:53:01.184807 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 11 00:53:01 crc kubenswrapper[4743]: I1011 00:53:01.184909 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 11 00:53:01 crc kubenswrapper[4743]: I1011 00:53:01.184933 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 11 00:53:01 crc kubenswrapper[4743]: I1011 00:53:01.184968 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 11 00:53:01 crc kubenswrapper[4743]: I1011 00:53:01.184993 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:01Z","lastTransitionTime":"2025-10-11T00:53:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 11 00:53:01 crc kubenswrapper[4743]: I1011 00:53:01.288774 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 11 00:53:01 crc kubenswrapper[4743]: I1011 00:53:01.288840 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 11 00:53:01 crc kubenswrapper[4743]: I1011 00:53:01.288902 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 11 00:53:01 crc kubenswrapper[4743]: I1011 00:53:01.288936 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 11 00:53:01 crc kubenswrapper[4743]: I1011 00:53:01.288959 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:01Z","lastTransitionTime":"2025-10-11T00:53:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 11 00:53:01 crc kubenswrapper[4743]: I1011 00:53:01.393597 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 11 00:53:01 crc kubenswrapper[4743]: I1011 00:53:01.393673 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 11 00:53:01 crc kubenswrapper[4743]: I1011 00:53:01.393696 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 11 00:53:01 crc kubenswrapper[4743]: I1011 00:53:01.393730 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 11 00:53:01 crc kubenswrapper[4743]: I1011 00:53:01.393756 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:01Z","lastTransitionTime":"2025-10-11T00:53:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 11 00:53:01 crc kubenswrapper[4743]: I1011 00:53:01.496947 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 11 00:53:01 crc kubenswrapper[4743]: I1011 00:53:01.497056 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 11 00:53:01 crc kubenswrapper[4743]: I1011 00:53:01.497078 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 11 00:53:01 crc kubenswrapper[4743]: I1011 00:53:01.497143 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 11 00:53:01 crc kubenswrapper[4743]: I1011 00:53:01.497165 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:01Z","lastTransitionTime":"2025-10-11T00:53:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 11 00:53:01 crc kubenswrapper[4743]: I1011 00:53:01.600958 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 11 00:53:01 crc kubenswrapper[4743]: I1011 00:53:01.601071 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 11 00:53:01 crc kubenswrapper[4743]: I1011 00:53:01.601099 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 11 00:53:01 crc kubenswrapper[4743]: I1011 00:53:01.601129 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 11 00:53:01 crc kubenswrapper[4743]: I1011 00:53:01.601152 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:01Z","lastTransitionTime":"2025-10-11T00:53:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:01 crc kubenswrapper[4743]: I1011 00:53:01.639762 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-48ljj_9ed16b35-862f-47f2-9e32-63c98f868fb8/ovnkube-controller/2.log" Oct 11 00:53:01 crc kubenswrapper[4743]: I1011 00:53:01.644277 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" event={"ID":"9ed16b35-862f-47f2-9e32-63c98f868fb8","Type":"ContainerStarted","Data":"268a98616dcd60ad334eba93f4b3443073f8a901429f6b98e7f89c2740c81da6"} Oct 11 00:53:01 crc kubenswrapper[4743]: I1011 00:53:01.645489 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" Oct 11 00:53:01 crc kubenswrapper[4743]: I1011 00:53:01.665240 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cd23d7ad070c4d2f029567ac2186d797b082b8da8ff2c9811982a1c5c7b0ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa388
11c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:01Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:01 crc kubenswrapper[4743]: I1011 00:53:01.686684 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bca7230a67c579c0fc54921d292f52e18ce2c25f79cd4a351a9f6b576df3553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab9cd20bc5b28791e21984a0edf4fbf50acc05d6160780d0696698715242b5d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:01Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:01 crc kubenswrapper[4743]: I1011 00:53:01.704129 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:01 crc kubenswrapper[4743]: I1011 00:53:01.704212 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:01 crc kubenswrapper[4743]: I1011 00:53:01.704228 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:01 crc kubenswrapper[4743]: I1011 00:53:01.704254 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:01 crc kubenswrapper[4743]: I1011 00:53:01.704270 4743 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:01Z","lastTransitionTime":"2025-10-11T00:53:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:53:01 crc kubenswrapper[4743]: I1011 00:53:01.713575 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ptjnt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cb3d634-b381-4ee0-a819-ce6f87fa8afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c47ab12f326ce6568bfd06d8beebdeffc07d464b747a288a14793db096ccc6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4dnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b62556627f75a8ed41c68a7fb1982461fd9fb35965014b72ec15492e763e42b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4dnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ptjnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:01Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:01 crc kubenswrapper[4743]: I1011 00:53:01.730148 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cb5z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b02b8636-a5c4-447d-b1cf-401b3dcfa02b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2zdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2zdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:22Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cb5z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:01Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:01 crc 
kubenswrapper[4743]: I1011 00:53:01.770126 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc432b8f-1fc8-4b7d-a202-bf1452921fc3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ed857549a3f84b624b8bf411ac4359bad74bbf8eea28bda8ebeda3b12dd34e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://748faceccab50f961795c16fb7c31665af75d570094d1dd2c765bb2215b65c31\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc8d989925344208a60f51034c7c486de0165d5c5a9a1f966df45ac5685c89f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34de48136b54818c60eb929d4b5374689378190620a6502448759f172856d6af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:01Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:01 crc kubenswrapper[4743]: I1011 00:53:01.799347 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:01Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:01 crc kubenswrapper[4743]: I1011 00:53:01.806885 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:01 crc kubenswrapper[4743]: I1011 00:53:01.806918 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:01 crc 
kubenswrapper[4743]: I1011 00:53:01.806931 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:01 crc kubenswrapper[4743]: I1011 00:53:01.806950 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:01 crc kubenswrapper[4743]: I1011 00:53:01.806961 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:01Z","lastTransitionTime":"2025-10-11T00:53:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:53:01 crc kubenswrapper[4743]: I1011 00:53:01.813078 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9jfxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8c603f4-717c-4554-992a-8338b3bef24d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21793d4ae38fc6e714912dbedbba8c45e37482a03cbd76d10461c41851e16896\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://853ca9eee244aad8c7006298d6e541ea07022917bee14083b2ded233009257ad\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-11T00:52:55Z\\\",\\\"message\\\":\\\"2025-10-11T00:52:10+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_bf700974-1b5c-49f8-b8b9-3e4987959237\\\\n2025-10-11T00:52:10+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_bf700974-1b5c-49f8-b8b9-3e4987959237 to /host/opt/cni/bin/\\\\n2025-10-11T00:52:10Z [verbose] multus-daemon started\\\\n2025-10-11T00:52:10Z [verbose] 
Readiness Indicator file check\\\\n2025-10-11T00:52:55Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:09Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5b2pr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9jfxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:01Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:01 crc kubenswrapper[4743]: I1011 00:53:01.827193 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:01Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:01 crc kubenswrapper[4743]: I1011 00:53:01.842517 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd049f7c9c78e28d7cfdeb7087cf419e5db915fecd46a83ae0bdc8317fff9728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-11T00:53:01Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:01 crc kubenswrapper[4743]: I1011 00:53:01.857919 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:01Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:01 crc kubenswrapper[4743]: I1011 00:53:01.874059 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"add92263-e252-446b-95de-092585b4357f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be9668b466043d0e86521a2ea925416a4987cb051ca0ab8764fd0fd3095991f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d42sg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16b18bcf80747537e2a7a31adc90eec7fdba85d8
166f8cfc53709a1dba33de8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d42sg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cvm72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:01Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:01 crc kubenswrapper[4743]: I1011 00:53:01.889757 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6wcnk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06a7b971-8779-491c-8d3f-e7d5b4d60968\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38b5a40ddd5390cc392fcf2240304a72163dd797511640ada5ad39a59c0f4283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8adfcc8627f1a6ce588d50eabcc56c6814c3cff75d50144def98466d2dabb69a\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8adfcc8627f1a6ce588d50eabcc56c6814c3cff75d50144def98466d2dabb69a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://800cdd03b73991fe9c1cfbe9a392b5f08854ca2f5f9e5106022bb67307578ad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://800cdd03b73991fe9c1cfbe9a392b5f08854ca2f5f9e5106022bb67307578ad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:10Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49661bf59f8e1d64e5980f0ed00dcc50c4c2ab52cc2232ade890883f0d0158d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49661bf59f8e1d64e5980f0ed00dcc50c4c2ab52cc2232ade890883f0d0158d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27d3d
6d9f65eb7b710148bc4e077c74a15207b637727163afe631acfe3ec90f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27d3d6d9f65eb7b710148bc4e077c74a15207b637727163afe631acfe3ec90f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00edafd52ba2a97f21af491b5233bc2ad945e6b77106eb94a947b777a0129cfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00edafd52ba2a97f21af491b5233bc2ad945e6b77106eb94a947b777a0129cfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://276d1e264f541d55720a57945800ec969959c0828071dcda4fb1c0f13b17118b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://276d1e264f541d55720a57945800ec969959c0828071dcda4fb1c0f13b17118b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6wcnk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:01Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:01 crc kubenswrapper[4743]: I1011 00:53:01.909381 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:01 crc kubenswrapper[4743]: I1011 00:53:01.909414 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:01 crc kubenswrapper[4743]: I1011 00:53:01.909425 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:01 crc kubenswrapper[4743]: I1011 00:53:01.909440 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:01 crc kubenswrapper[4743]: I1011 00:53:01.909450 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:01Z","lastTransitionTime":"2025-10-11T00:53:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:01 crc kubenswrapper[4743]: I1011 00:53:01.914377 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ed16b35-862f-47f2-9e32-63c98f868fb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d54b05ec17db8c8c6c3ff8e4d703de8f1f1f0d7e9b5b0133afdf30e3c2b8acee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4127195cac8b12296c2aaefc500beb5f746bee518f3885c094e7bca83ade5e6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://487448bc7b1826df7dbc3aeac42ad10d6aed3390c27c95dfda1b81fe09e27f35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acfe8e798b39769fe2643c6f761b388dc220ccfc707f0023dd680196514b8885\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17db7cfee85ea1051eba348aa0ec7bc8e1354c49e3e4740745a51d3db0ffcf63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d385e354cb831573c89886ade9127287f0974b7a7c244ead9e5547e0211fd9fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://268a98616dcd60ad334eba93f4b3443073f8a901429f6b98e7f89c2740c81da6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8d34980f388c932c37e8d5a9122045fe57b4dc1cba358666d3558519a785497\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-11T00:52:33Z\\\",\\\"message\\\":\\\"d syncing service metrics on namespace openshift-kube-apiserver-operator for network=default : 3.881381ms\\\\nI1011 00:52:33.160604 6362 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-scheduler/scheduler\\\\\\\"}\\\\nI1011 00:52:33.160615 6362 services_controller.go:360] Finished syncing service scheduler on namespace openshift-kube-scheduler for network=default : 2.970317ms\\\\nI1011 00:52:33.165904 6362 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1011 00:52:33.165984 6362 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1011 00:52:33.166020 6362 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1011 00:52:33.166766 6362 handler.go:208] Removed *v1.Node event handler 2\\\\nI1011 00:52:33.166850 6362 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1011 00:52:33.166973 6362 factory.go:656] Stopping watch factory\\\\nI1011 00:52:33.167000 6362 ovnkube.go:599] Stopped ovnkube\\\\nI1011 00:52:33.167036 6362 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1011 00:52:33.167068 6362 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1011 00:52:33.167200 6362 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:53:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecec5075382354c47a6ec8f53263fe6c244c7aadfeab1dda60dd2e88cc8724b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-48ljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:01Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:01 crc kubenswrapper[4743]: I1011 00:53:01.931027 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab5ac12d-37cf-48f3-bb33-b93b4096c706\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4335571969a3cb66f2fefdad63b77b4a31fc2631aba5ba427b2af4db8b6c6f0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://895152fd89f4c35c7d379d3a93c1b5f185275cd151d27b442b8f06280f3f74a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2bc6f9e58ae313b026b5597d2569b343931bc066ac8b3751c62f39c8d849bf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3532ffcf956621ad56477cbb9ff70c2855091a6ad996d3a15fa3c9a28943cea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://c3532ffcf956621ad56477cbb9ff70c2855091a6ad996d3a15fa3c9a28943cea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:01Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:01 crc kubenswrapper[4743]: I1011 00:53:01.958588 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83857db2-dda0-4b62-9336-3393a2c23f3c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a46e8adc9aa9255991b607e98eaf6fc411aeb2a6b0edf45e3e9a7935a2eae5b\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97b59a1efc62279df4a9766fae0e700a435812160cf3a40a915c967f7a99c9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe9d2bc75aa162a0cab825a1f13cf35f20491a0317000d5a9775977fb7f6b556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf3820ae4ab59d420d2e7f001b2953f751c6dd75f90b5fcc76fdaee8c8df722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f14caad841305b82f04cf14411e2308edbcdd8ea2261ff8401edc95900855ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://818ca392b918c66a8ca013f3ec5b938595fc78b726af7ae31524c09dbe9302f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://818ca392b918c66a8ca013f3ec5b938595fc78b726af7ae31524c09dbe9302f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44506333e8bd3a20777cf0c382b7652d6d3fa1cdc13f056412cf4899e25115e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44506333e8bd3a20777cf0c382b7652d6d3fa1cdc13f056412cf4899e25115e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}}},{\\\"co
ntainerID\\\":\\\"cri-o://e870a1289513fe9e864779c13e63c0991fdaded247b35ac75b7810ab27683ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e870a1289513fe9e864779c13e63c0991fdaded247b35ac75b7810ab27683ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:01Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:01 crc kubenswrapper[4743]: I1011 00:53:01.979598 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a657286e-3a6d-4265-ae33-a7d2ce8b64ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1558bde7a3b8b01c9ce7f6907ecc43abc6e27587236be7b41487968effa33bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2616059cd6d57ff41cdaa2e1b35e0e8b474398c434022ba6b9e926e19e64c0f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0392332521983f1ea8939df48e05618035ca97895122067daef61ee1fdb7af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b297361a1a2f13aa53d7a5a58948dbe86e8e00f888e6a69cc884154828c29a41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://530c39a66f06ca33cba6a3ab75cb68148425f5a732b503d054b6c74a9d48fabf\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-11T00:52:05Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1011 00:51:59.674497 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1011 00:51:59.678201 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3177302691/tls.crt::/tmp/serving-cert-3177302691/tls.key\\\\\\\"\\\\nI1011 00:52:05.851548 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1011 00:52:05.855373 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1011 00:52:05.855414 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1011 00:52:05.855455 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1011 00:52:05.855472 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1011 00:52:05.866907 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1011 00:52:05.866935 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 00:52:05.866939 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 00:52:05.866944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1011 00:52:05.866947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1011 00:52:05.866951 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1011 00:52:05.866954 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1011 00:52:05.866970 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1011 00:52:05.870518 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1c0227825fa743bb310a1dfb08333cea5ee16a2a2f30b77c97e2f28a2319542\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d411bef56dd0ccdad159fc018c15ad4f0df6adee56150b1143e2dcf442a0fe18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d411bef56dd0ccdad159fc018c15ad4f0
df6adee56150b1143e2dcf442a0fe18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:01Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:02 crc kubenswrapper[4743]: I1011 00:53:02.006432 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vlxgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d677d122-c8be-4938-8d2c-bde4a088a63a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dbaee9cca1154a094e092eaaf1dea2aeb6a61fbf9323611275485f75ea911e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95k6c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vlxgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:01Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:02 crc kubenswrapper[4743]: I1011 00:53:02.011503 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:02 crc kubenswrapper[4743]: I1011 00:53:02.011552 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:02 crc kubenswrapper[4743]: I1011 00:53:02.011564 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:02 crc kubenswrapper[4743]: I1011 00:53:02.011578 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:02 crc kubenswrapper[4743]: I1011 00:53:02.011587 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:02Z","lastTransitionTime":"2025-10-11T00:53:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:02 crc kubenswrapper[4743]: I1011 00:53:02.019803 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-t9nsf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f07b1e57-3c09-4e75-866d-a4292db4e151\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc297cc894f36c7afa9ce15a8e9c71b2d7bf66042c55d0c7686f20b6a3ece7be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c97lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:14Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-t9nsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:02Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:02 crc kubenswrapper[4743]: I1011 00:53:02.091232 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 00:53:02 crc kubenswrapper[4743]: I1011 00:53:02.091282 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 00:53:02 crc kubenswrapper[4743]: I1011 00:53:02.091253 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cb5z5" Oct 11 00:53:02 crc kubenswrapper[4743]: E1011 00:53:02.091357 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 11 00:53:02 crc kubenswrapper[4743]: I1011 00:53:02.091239 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 00:53:02 crc kubenswrapper[4743]: E1011 00:53:02.091475 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 11 00:53:02 crc kubenswrapper[4743]: E1011 00:53:02.091752 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cb5z5" podUID="b02b8636-a5c4-447d-b1cf-401b3dcfa02b" Oct 11 00:53:02 crc kubenswrapper[4743]: E1011 00:53:02.091726 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 11 00:53:02 crc kubenswrapper[4743]: I1011 00:53:02.113824 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:02 crc kubenswrapper[4743]: I1011 00:53:02.113868 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:02 crc kubenswrapper[4743]: I1011 00:53:02.113877 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:02 crc kubenswrapper[4743]: I1011 00:53:02.113893 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:02 crc kubenswrapper[4743]: I1011 00:53:02.113903 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:02Z","lastTransitionTime":"2025-10-11T00:53:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:02 crc kubenswrapper[4743]: I1011 00:53:02.218132 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:02 crc kubenswrapper[4743]: I1011 00:53:02.218192 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:02 crc kubenswrapper[4743]: I1011 00:53:02.218209 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:02 crc kubenswrapper[4743]: I1011 00:53:02.218417 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:02 crc kubenswrapper[4743]: I1011 00:53:02.218440 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:02Z","lastTransitionTime":"2025-10-11T00:53:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:02 crc kubenswrapper[4743]: I1011 00:53:02.322299 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:02 crc kubenswrapper[4743]: I1011 00:53:02.322358 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:02 crc kubenswrapper[4743]: I1011 00:53:02.322375 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:02 crc kubenswrapper[4743]: I1011 00:53:02.322400 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:02 crc kubenswrapper[4743]: I1011 00:53:02.322421 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:02Z","lastTransitionTime":"2025-10-11T00:53:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:02 crc kubenswrapper[4743]: I1011 00:53:02.425229 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:02 crc kubenswrapper[4743]: I1011 00:53:02.425272 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:02 crc kubenswrapper[4743]: I1011 00:53:02.425282 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:02 crc kubenswrapper[4743]: I1011 00:53:02.425315 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:02 crc kubenswrapper[4743]: I1011 00:53:02.425326 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:02Z","lastTransitionTime":"2025-10-11T00:53:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:02 crc kubenswrapper[4743]: I1011 00:53:02.527639 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:02 crc kubenswrapper[4743]: I1011 00:53:02.527713 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:02 crc kubenswrapper[4743]: I1011 00:53:02.527729 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:02 crc kubenswrapper[4743]: I1011 00:53:02.527750 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:02 crc kubenswrapper[4743]: I1011 00:53:02.527766 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:02Z","lastTransitionTime":"2025-10-11T00:53:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:02 crc kubenswrapper[4743]: I1011 00:53:02.631233 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:02 crc kubenswrapper[4743]: I1011 00:53:02.631307 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:02 crc kubenswrapper[4743]: I1011 00:53:02.631327 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:02 crc kubenswrapper[4743]: I1011 00:53:02.631356 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:02 crc kubenswrapper[4743]: I1011 00:53:02.631376 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:02Z","lastTransitionTime":"2025-10-11T00:53:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:02 crc kubenswrapper[4743]: I1011 00:53:02.650978 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-48ljj_9ed16b35-862f-47f2-9e32-63c98f868fb8/ovnkube-controller/3.log" Oct 11 00:53:02 crc kubenswrapper[4743]: I1011 00:53:02.651916 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-48ljj_9ed16b35-862f-47f2-9e32-63c98f868fb8/ovnkube-controller/2.log" Oct 11 00:53:02 crc kubenswrapper[4743]: I1011 00:53:02.655368 4743 generic.go:334] "Generic (PLEG): container finished" podID="9ed16b35-862f-47f2-9e32-63c98f868fb8" containerID="268a98616dcd60ad334eba93f4b3443073f8a901429f6b98e7f89c2740c81da6" exitCode=1 Oct 11 00:53:02 crc kubenswrapper[4743]: I1011 00:53:02.655420 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" event={"ID":"9ed16b35-862f-47f2-9e32-63c98f868fb8","Type":"ContainerDied","Data":"268a98616dcd60ad334eba93f4b3443073f8a901429f6b98e7f89c2740c81da6"} Oct 11 00:53:02 crc kubenswrapper[4743]: I1011 00:53:02.655461 4743 scope.go:117] "RemoveContainer" containerID="f8d34980f388c932c37e8d5a9122045fe57b4dc1cba358666d3558519a785497" Oct 11 00:53:02 crc kubenswrapper[4743]: I1011 00:53:02.656573 4743 scope.go:117] "RemoveContainer" containerID="268a98616dcd60ad334eba93f4b3443073f8a901429f6b98e7f89c2740c81da6" Oct 11 00:53:02 crc kubenswrapper[4743]: E1011 00:53:02.656847 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-48ljj_openshift-ovn-kubernetes(9ed16b35-862f-47f2-9e32-63c98f868fb8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" podUID="9ed16b35-862f-47f2-9e32-63c98f868fb8" Oct 11 00:53:02 crc kubenswrapper[4743]: I1011 00:53:02.689220 4743 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab5ac12d-37cf-48f3-bb33-b93b4096c706\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4335571969a3cb66f2fefdad63b77b4a31fc2631aba5ba427b2af4db8b6c6f0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://895152fd89f4c35c7d379d3a93c1b5f185275cd151d27b442b8f06280f3f74a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a938
0066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2bc6f9e58ae313b026b5597d2569b343931bc066ac8b3751c62f39c8d849bf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3532ffcf956621ad56477cbb9ff70c2855091a6ad996d3a15fa3c9a28943cea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3532ffcf956621ad56477cbb9ff70c2855091a6ad996d3a15fa3c9a28943cea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:02Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:02 crc kubenswrapper[4743]: I1011 00:53:02.714746 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"83857db2-dda0-4b62-9336-3393a2c23f3c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a46e8adc9aa9255991b607e98eaf6fc411aeb2a6b0edf45e3e9a7935a2eae5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97b59a1efc62279df4a9766fae0e700a435812160cf3a40a915c967f7a99c9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe9d2bc75aa162a0cab825a1f13cf35f20491a0317000d5a9775977fb7f6b556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf3820ae4ab59d420d2e7f001b2953f751c6dd75f90b5fcc76fdaee8c8df722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f14caad841305b82f04cf14411e2308edbcdd8ea2261ff8401edc95900855ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://818ca392b918c66a8ca013f3ec5b938595fc78b726af7ae31524c09dbe9302f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://818ca392b918c66a8ca013f3ec5b938595fc78b726af7ae31524c09dbe9302f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44506333e8bd3a20777cf0c382b7652d6d3fa1cdc13f056412cf4899e25115e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44506333e8bd3a20777cf0c382b7652d6d3fa1cdc13f056412cf4899e25115e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e870a1289513fe9e864779c13e63c0991fdaded247b35ac75b7810ab27683ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e870a1289513fe9e864779c13e63c0991fdaded247b35ac75b7810ab27683ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:02Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:02 crc kubenswrapper[4743]: I1011 00:53:02.735210 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:02 crc kubenswrapper[4743]: I1011 00:53:02.735278 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:02 crc kubenswrapper[4743]: I1011 00:53:02.735302 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:02 crc kubenswrapper[4743]: I1011 00:53:02.735332 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:02 crc kubenswrapper[4743]: I1011 00:53:02.735350 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:02Z","lastTransitionTime":"2025-10-11T00:53:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:02 crc kubenswrapper[4743]: I1011 00:53:02.735419 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a657286e-3a6d-4265-ae33-a7d2ce8b64ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1558bde7a3b8b01c9ce7f6907ecc43abc6e27587236be7b41487968effa33bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2616059cd6d57ff41cdaa2e1b35e0e8b474398c434022ba6b9e926e19e64c0f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0392332521983f1ea8939df48e05618035ca97895122067daef61ee1fdb7af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b297361a1a2f13aa53d7a5a58948dbe86e8e00f888e6a69cc884154828c29a41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://530c39a66f06ca33cba6a3ab75cb68148425f5a732b503d054b6c74a9d48fabf\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-11T00:52:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1011 00:51:59.674497 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1011 00:51:59.678201 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3177302691/tls.crt::/tmp/serving-cert-3177302691/tls.key\\\\\\\"\\\\nI1011 00:52:05.851548 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1011 00:52:05.855373 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1011 00:52:05.855414 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1011 00:52:05.855455 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1011 00:52:05.855472 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1011 00:52:05.866907 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1011 00:52:05.866935 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 00:52:05.866939 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 00:52:05.866944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1011 00:52:05.866947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1011 00:52:05.866951 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1011 00:52:05.866954 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1011 00:52:05.866970 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1011 00:52:05.870518 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1c0227825fa743bb310a1dfb08333cea5ee16a2a2f30b77c97e2f28a2319542\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d411bef56dd0ccdad159fc018c15ad4f0df6adee56150b1143e2dcf442a0fe18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d411bef56dd0ccdad159fc018c15ad4f0df6adee56150b1143e2dcf442a0fe18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:02Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:02 crc kubenswrapper[4743]: I1011 00:53:02.750851 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vlxgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d677d122-c8be-4938-8d2c-bde4a088a63a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dbaee9cca1154a094e092eaaf1dea2aeb6a61fbf9323611275485f75ea911e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95k6c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vlxgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:02Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:02 crc kubenswrapper[4743]: I1011 00:53:02.763587 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-t9nsf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f07b1e57-3c09-4e75-866d-a4292db4e151\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc297cc894f36c7afa9ce15a8e9c71b2d7bf66042c55d0c7686f20b6a3ece7be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c97lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:14Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-t9nsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:02Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:02 crc kubenswrapper[4743]: I1011 00:53:02.783819 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cd23d7ad070c4d2f029567ac2186d797b082b8da8ff2c9811982a1c5c7b0ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:02Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:02 crc kubenswrapper[4743]: I1011 00:53:02.801131 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bca7230a67c579c0fc54921d292f52e18ce2c25f79cd4a351a9f6b576df3553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://ab9cd20bc5b28791e21984a0edf4fbf50acc05d6160780d0696698715242b5d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:02Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:02 crc kubenswrapper[4743]: I1011 00:53:02.818517 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ptjnt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cb3d634-b381-4ee0-a819-ce6f87fa8afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c47ab12f326ce6568bfd06d8beebdeffc07d464b747a288a14793db096ccc6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4dnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b62556627f75a8ed41c68a7fb1982461fd9fb
35965014b72ec15492e763e42b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4dnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ptjnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:02Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:02 crc kubenswrapper[4743]: I1011 00:53:02.837784 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cb5z5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b02b8636-a5c4-447d-b1cf-401b3dcfa02b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2zdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2zdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:22Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cb5z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:02Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:02 crc 
kubenswrapper[4743]: I1011 00:53:02.840204 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:02 crc kubenswrapper[4743]: I1011 00:53:02.840277 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:02 crc kubenswrapper[4743]: I1011 00:53:02.840296 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:02 crc kubenswrapper[4743]: I1011 00:53:02.840323 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:02 crc kubenswrapper[4743]: I1011 00:53:02.840342 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:02Z","lastTransitionTime":"2025-10-11T00:53:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:02 crc kubenswrapper[4743]: I1011 00:53:02.859652 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc432b8f-1fc8-4b7d-a202-bf1452921fc3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ed857549a3f84b624b8bf411ac4359bad74bbf8eea28bda8ebeda3b12dd34e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://748faceccab
50f961795c16fb7c31665af75d570094d1dd2c765bb2215b65c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc8d989925344208a60f51034c7c486de0165d5c5a9a1f966df45ac5685c89f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34de48136b54818c60eb929d4b5374689378190620a6502448759f172856d6af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:02Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:02 crc kubenswrapper[4743]: I1011 00:53:02.879890 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:02Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:02 crc kubenswrapper[4743]: I1011 00:53:02.900011 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9jfxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8c603f4-717c-4554-992a-8338b3bef24d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21793d4ae38fc6e714912dbedbba8c45e37482a03cbd76d10461c41851e16896\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://853ca9eee244aad8c7006298d6e541ea07022917bee14083b2ded233009257ad\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-11T00:52:55Z\\\",\\\"message\\\":\\\"2025-10-11T00:52:10+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_bf700974-1b5c-49f8-b8b9-3e4987959237\\\\n2025-10-11T00:52:10+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_bf700974-1b5c-49f8-b8b9-3e4987959237 to /host/opt/cni/bin/\\\\n2025-10-11T00:52:10Z [verbose] multus-daemon started\\\\n2025-10-11T00:52:10Z [verbose] 
Readiness Indicator file check\\\\n2025-10-11T00:52:55Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:09Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5b2pr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9jfxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:02Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:02 crc kubenswrapper[4743]: I1011 00:53:02.944447 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:02 crc kubenswrapper[4743]: I1011 00:53:02.944518 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:02 crc kubenswrapper[4743]: I1011 00:53:02.944537 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:02 crc kubenswrapper[4743]: I1011 00:53:02.944569 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:02 crc kubenswrapper[4743]: I1011 00:53:02.944591 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:02Z","lastTransitionTime":"2025-10-11T00:53:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:02 crc kubenswrapper[4743]: I1011 00:53:02.946655 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:02Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:02 crc kubenswrapper[4743]: I1011 00:53:02.963234 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd049f7c9c78e28d7cfdeb7087cf419e5db915fecd46a83ae0bdc8317fff9728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-11T00:53:02Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:02 crc kubenswrapper[4743]: I1011 00:53:02.983495 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:02Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:03 crc kubenswrapper[4743]: I1011 00:53:03.001232 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"add92263-e252-446b-95de-092585b4357f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be9668b466043d0e86521a2ea925416a4987cb051ca0ab8764fd0fd3095991f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d42sg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16b18bcf80747537e2a7a31adc90eec7fdba85d8
166f8cfc53709a1dba33de8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d42sg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cvm72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:02Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:03 crc kubenswrapper[4743]: I1011 00:53:03.020456 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6wcnk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06a7b971-8779-491c-8d3f-e7d5b4d60968\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38b5a40ddd5390cc392fcf2240304a72163dd797511640ada5ad39a59c0f4283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8adfcc8627f1a6ce588d50eabcc56c6814c3cff75d50144def98466d2dabb69a\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8adfcc8627f1a6ce588d50eabcc56c6814c3cff75d50144def98466d2dabb69a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://800cdd03b73991fe9c1cfbe9a392b5f08854ca2f5f9e5106022bb67307578ad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://800cdd03b73991fe9c1cfbe9a392b5f08854ca2f5f9e5106022bb67307578ad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:10Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49661bf59f8e1d64e5980f0ed00dcc50c4c2ab52cc2232ade890883f0d0158d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49661bf59f8e1d64e5980f0ed00dcc50c4c2ab52cc2232ade890883f0d0158d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27d3d
6d9f65eb7b710148bc4e077c74a15207b637727163afe631acfe3ec90f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27d3d6d9f65eb7b710148bc4e077c74a15207b637727163afe631acfe3ec90f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00edafd52ba2a97f21af491b5233bc2ad945e6b77106eb94a947b777a0129cfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00edafd52ba2a97f21af491b5233bc2ad945e6b77106eb94a947b777a0129cfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://276d1e264f541d55720a57945800ec969959c0828071dcda4fb1c0f13b17118b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://276d1e264f541d55720a57945800ec969959c0828071dcda4fb1c0f13b17118b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6wcnk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:03Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:03 crc kubenswrapper[4743]: I1011 00:53:03.047888 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:03 crc kubenswrapper[4743]: I1011 00:53:03.047996 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:03 crc kubenswrapper[4743]: I1011 00:53:03.048030 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:03 crc kubenswrapper[4743]: I1011 00:53:03.048069 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:03 crc kubenswrapper[4743]: I1011 00:53:03.048094 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:03Z","lastTransitionTime":"2025-10-11T00:53:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:03 crc kubenswrapper[4743]: I1011 00:53:03.052959 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ed16b35-862f-47f2-9e32-63c98f868fb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d54b05ec17db8c8c6c3ff8e4d703de8f1f1f0d7e9b5b0133afdf30e3c2b8acee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4127195cac8b12296c2aaefc500beb5f746bee518f3885c094e7bca83ade5e6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://487448bc7b1826df7dbc3aeac42ad10d6aed3390c27c95dfda1b81fe09e27f35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acfe8e798b39769fe2643c6f761b388dc220ccfc707f0023dd680196514b8885\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17db7cfee85ea1051eba348aa0ec7bc8e1354c49e3e4740745a51d3db0ffcf63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d385e354cb831573c89886ade9127287f0974b7a7c244ead9e5547e0211fd9fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://268a98616dcd60ad334eba93f4b3443073f8a901429f6b98e7f89c2740c81da6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8d34980f388c932c37e8d5a9122045fe57b4dc1cba358666d3558519a785497\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-11T00:52:33Z\\\",\\\"message\\\":\\\"d syncing service metrics on namespace openshift-kube-apiserver-operator for network=default : 3.881381ms\\\\nI1011 00:52:33.160604 6362 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-scheduler/scheduler\\\\\\\"}\\\\nI1011 00:52:33.160615 6362 services_controller.go:360] Finished syncing service scheduler on namespace openshift-kube-scheduler for network=default : 2.970317ms\\\\nI1011 00:52:33.165904 6362 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1011 00:52:33.165984 6362 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1011 00:52:33.166020 6362 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1011 00:52:33.166766 6362 handler.go:208] Removed *v1.Node event handler 2\\\\nI1011 00:52:33.166850 6362 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1011 00:52:33.166973 6362 factory.go:656] Stopping watch factory\\\\nI1011 00:52:33.167000 6362 ovnkube.go:599] Stopped ovnkube\\\\nI1011 00:52:33.167036 6362 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1011 00:52:33.167068 6362 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1011 00:52:33.167200 6362 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://268a98616dcd60ad334eba93f4b3443073f8a901429f6b98e7f89c2740c81da6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-11T00:53:02Z\\\",\\\"message\\\":\\\"== {73135118-cf1b-4568-bd31-2f50308bf69d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1011 00:53:02.211217 6723 model_client.go:382] Update operations generated as: [{Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.4 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {43933d5e-3c3b-4ff8-8926-04ac25de450e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1011 00:53:02.211203 6723 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T00:53:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kub
e-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecec5075382354c47a6ec8f53263fe6c244c7aadfeab1dda60dd2e88cc8724b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736\\\
",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-48ljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:03Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:03 crc kubenswrapper[4743]: I1011 00:53:03.151375 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:03 crc kubenswrapper[4743]: I1011 00:53:03.151443 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:03 crc kubenswrapper[4743]: I1011 00:53:03.151462 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:03 crc kubenswrapper[4743]: I1011 00:53:03.151488 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:03 crc kubenswrapper[4743]: I1011 00:53:03.151507 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:03Z","lastTransitionTime":"2025-10-11T00:53:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:53:03 crc kubenswrapper[4743]: I1011 00:53:03.255558 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:03 crc kubenswrapper[4743]: I1011 00:53:03.255618 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:03 crc kubenswrapper[4743]: I1011 00:53:03.255632 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:03 crc kubenswrapper[4743]: I1011 00:53:03.255656 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:03 crc kubenswrapper[4743]: I1011 00:53:03.255671 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:03Z","lastTransitionTime":"2025-10-11T00:53:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:03 crc kubenswrapper[4743]: I1011 00:53:03.361365 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:03 crc kubenswrapper[4743]: I1011 00:53:03.361430 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:03 crc kubenswrapper[4743]: I1011 00:53:03.361448 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:03 crc kubenswrapper[4743]: I1011 00:53:03.361477 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:03 crc kubenswrapper[4743]: I1011 00:53:03.361499 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:03Z","lastTransitionTime":"2025-10-11T00:53:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:03 crc kubenswrapper[4743]: I1011 00:53:03.464850 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:03 crc kubenswrapper[4743]: I1011 00:53:03.464965 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:03 crc kubenswrapper[4743]: I1011 00:53:03.464996 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:03 crc kubenswrapper[4743]: I1011 00:53:03.465055 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:03 crc kubenswrapper[4743]: I1011 00:53:03.465087 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:03Z","lastTransitionTime":"2025-10-11T00:53:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:03 crc kubenswrapper[4743]: I1011 00:53:03.569001 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:03 crc kubenswrapper[4743]: I1011 00:53:03.569046 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:03 crc kubenswrapper[4743]: I1011 00:53:03.569057 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:03 crc kubenswrapper[4743]: I1011 00:53:03.569076 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:03 crc kubenswrapper[4743]: I1011 00:53:03.569088 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:03Z","lastTransitionTime":"2025-10-11T00:53:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:03 crc kubenswrapper[4743]: I1011 00:53:03.662548 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-48ljj_9ed16b35-862f-47f2-9e32-63c98f868fb8/ovnkube-controller/3.log" Oct 11 00:53:03 crc kubenswrapper[4743]: I1011 00:53:03.668181 4743 scope.go:117] "RemoveContainer" containerID="268a98616dcd60ad334eba93f4b3443073f8a901429f6b98e7f89c2740c81da6" Oct 11 00:53:03 crc kubenswrapper[4743]: E1011 00:53:03.668379 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-48ljj_openshift-ovn-kubernetes(9ed16b35-862f-47f2-9e32-63c98f868fb8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" podUID="9ed16b35-862f-47f2-9e32-63c98f868fb8" Oct 11 00:53:03 crc kubenswrapper[4743]: I1011 00:53:03.671671 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:03 crc kubenswrapper[4743]: I1011 00:53:03.671699 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:03 crc kubenswrapper[4743]: I1011 00:53:03.671710 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:03 crc kubenswrapper[4743]: I1011 00:53:03.671728 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:03 crc kubenswrapper[4743]: I1011 00:53:03.671741 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:03Z","lastTransitionTime":"2025-10-11T00:53:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:53:03 crc kubenswrapper[4743]: I1011 00:53:03.693399 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc432b8f-1fc8-4b7d-a202-bf1452921fc3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ed857549a3f84b624b8bf411ac4359bad74bbf8eea28bda8ebeda3b12dd34e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\
\"}]},{\\\"containerID\\\":\\\"cri-o://748faceccab50f961795c16fb7c31665af75d570094d1dd2c765bb2215b65c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc8d989925344208a60f51034c7c486de0165d5c5a9a1f966df45ac5685c89f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34de48136b54818c60eb929d4b5374689378190620a6502448759f172856d6af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-c
luster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:03Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:03 crc kubenswrapper[4743]: I1011 00:53:03.710001 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:03Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:03 crc kubenswrapper[4743]: I1011 00:53:03.728687 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9jfxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8c603f4-717c-4554-992a-8338b3bef24d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21793d4ae38fc6e714912dbedbba8c45e37482a03cbd76d10461c41851e16896\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://853ca9eee244aad8c7006298d6e541ea07022917bee14083b2ded233009257ad\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-11T00:52:55Z\\\",\\\"message\\\":\\\"2025-10-11T00:52:10+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_bf700974-1b5c-49f8-b8b9-3e4987959237\\\\n2025-10-11T00:52:10+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_bf700974-1b5c-49f8-b8b9-3e4987959237 to /host/opt/cni/bin/\\\\n2025-10-11T00:52:10Z [verbose] multus-daemon started\\\\n2025-10-11T00:52:10Z [verbose] 
Readiness Indicator file check\\\\n2025-10-11T00:52:55Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:09Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5b2pr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9jfxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:03Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:03 crc kubenswrapper[4743]: I1011 00:53:03.745890 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:03Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:03 crc kubenswrapper[4743]: I1011 00:53:03.764345 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd049f7c9c78e28d7cfdeb7087cf419e5db915fecd46a83ae0bdc8317fff9728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-11T00:53:03Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:03 crc kubenswrapper[4743]: I1011 00:53:03.775000 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:03 crc kubenswrapper[4743]: I1011 00:53:03.775073 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:03 crc kubenswrapper[4743]: I1011 00:53:03.775100 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:03 crc kubenswrapper[4743]: I1011 00:53:03.775135 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:03 crc kubenswrapper[4743]: I1011 00:53:03.775159 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:03Z","lastTransitionTime":"2025-10-11T00:53:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:03 crc kubenswrapper[4743]: I1011 00:53:03.787198 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:03Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:03 crc kubenswrapper[4743]: I1011 00:53:03.806731 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"add92263-e252-446b-95de-092585b4357f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be9668b466043d0e86521a2ea925416a4987cb051ca0ab8764fd0fd3095991f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d42sg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16b18bcf80747537e2a7a31adc90eec7fdba85d8
166f8cfc53709a1dba33de8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d42sg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cvm72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:03Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:03 crc kubenswrapper[4743]: I1011 00:53:03.832589 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6wcnk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06a7b971-8779-491c-8d3f-e7d5b4d60968\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38b5a40ddd5390cc392fcf2240304a72163dd797511640ada5ad39a59c0f4283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8adfcc8627f1a6ce588d50eabcc56c6814c3cff75d50144def98466d2dabb69a\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8adfcc8627f1a6ce588d50eabcc56c6814c3cff75d50144def98466d2dabb69a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://800cdd03b73991fe9c1cfbe9a392b5f08854ca2f5f9e5106022bb67307578ad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://800cdd03b73991fe9c1cfbe9a392b5f08854ca2f5f9e5106022bb67307578ad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:10Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49661bf59f8e1d64e5980f0ed00dcc50c4c2ab52cc2232ade890883f0d0158d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49661bf59f8e1d64e5980f0ed00dcc50c4c2ab52cc2232ade890883f0d0158d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27d3d
6d9f65eb7b710148bc4e077c74a15207b637727163afe631acfe3ec90f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27d3d6d9f65eb7b710148bc4e077c74a15207b637727163afe631acfe3ec90f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00edafd52ba2a97f21af491b5233bc2ad945e6b77106eb94a947b777a0129cfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00edafd52ba2a97f21af491b5233bc2ad945e6b77106eb94a947b777a0129cfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://276d1e264f541d55720a57945800ec969959c0828071dcda4fb1c0f13b17118b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://276d1e264f541d55720a57945800ec969959c0828071dcda4fb1c0f13b17118b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6wcnk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:03Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:03 crc kubenswrapper[4743]: I1011 00:53:03.867389 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ed16b35-862f-47f2-9e32-63c98f868fb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d54b05ec17db8c8c6c3ff8e4d703de8f1f1f0d7e9b5b0133afdf30e3c2b8acee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4127195cac8b12296c2aaefc500beb5f746bee518f3885c094e7bca83ade5e6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://487448bc7b1826df7dbc3aeac42ad10d6aed3390c27c95dfda1b81fe09e27f35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acfe8e798b39769fe2643c6f761b388dc220ccfc707f0023dd680196514b8885\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17db7cfee85ea1051eba348aa0ec7bc8e1354c49e3e4740745a51d3db0ffcf63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d385e354cb831573c89886ade9127287f0974b7a7c244ead9e5547e0211fd9fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://268a98616dcd60ad334eba93f4b3443073f8a901429f6b98e7f89c2740c81da6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://268a98616dcd60ad334eba93f4b3443073f8a901429f6b98e7f89c2740c81da6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-11T00:53:02Z\\\",\\\"message\\\":\\\"== {73135118-cf1b-4568-bd31-2f50308bf69d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e 
Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1011 00:53:02.211217 6723 model_client.go:382] Update operations generated as: [{Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.4 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {43933d5e-3c3b-4ff8-8926-04ac25de450e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1011 00:53:02.211203 6723 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T00:53:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-48ljj_openshift-ovn-kubernetes(9ed16b35-862f-47f2-9e32-63c98f868fb8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecec5075382354c47a6ec8f53263fe6c244c7aadfeab1dda60dd2e88cc8724b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f300eaf5d1c09f3637
f07c6d71c96b49e955d8319841834440872f13c061e736\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-48ljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:03Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:03 crc kubenswrapper[4743]: I1011 00:53:03.877913 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:03 crc kubenswrapper[4743]: I1011 00:53:03.878124 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:03 crc kubenswrapper[4743]: I1011 00:53:03.878232 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:03 crc kubenswrapper[4743]: I1011 00:53:03.878342 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:03 crc kubenswrapper[4743]: I1011 00:53:03.878523 4743 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:03Z","lastTransitionTime":"2025-10-11T00:53:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:53:03 crc kubenswrapper[4743]: I1011 00:53:03.889170 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab5ac12d-37cf-48f3-bb33-b93b4096c706\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4335571969a3cb66f2fefdad63b77b4a31fc2631aba5ba427b2af4db8b6c6f0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\
\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://895152fd89f4c35c7d379d3a93c1b5f185275cd151d27b442b8f06280f3f74a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2bc6f9e58ae313b026b5597d2569b343931bc066ac8b3751c62f39c8d849bf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.1
26.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3532ffcf956621ad56477cbb9ff70c2855091a6ad996d3a15fa3c9a28943cea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3532ffcf956621ad56477cbb9ff70c2855091a6ad996d3a15fa3c9a28943cea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:03Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:03 crc kubenswrapper[4743]: I1011 00:53:03.917670 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"83857db2-dda0-4b62-9336-3393a2c23f3c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a46e8adc9aa9255991b607e98eaf6fc411aeb2a6b0edf45e3e9a7935a2eae5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97b59a1efc62279df4a9766fae0e700a435812160cf3a40a915c967f7a99c9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe9d2bc75aa162a0cab825a1f13cf35f20491a0317000d5a9775977fb7f6b556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf3820ae4ab59d420d2e7f001b2953f751c6dd75f90b5fcc76fdaee8c8df722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f14caad841305b82f04cf14411e2308edbcdd8ea2261ff8401edc95900855ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://818ca392b918c66a8ca013f3ec5b938595fc78b726af7ae31524c09dbe9302f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://818ca392b918c66a8ca013f3ec5b938595fc78b726af7ae31524c09dbe9302f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44506333e8bd3a20777cf0c382b7652d6d3fa1cdc13f056412cf4899e25115e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44506333e8bd3a20777cf0c382b7652d6d3fa1cdc13f056412cf4899e25115e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e870a1289513fe9e864779c13e63c0991fdaded247b35ac75b7810ab27683ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e870a1289513fe9e864779c13e63c0991fdaded247b35ac75b7810ab27683ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:03Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:03 crc kubenswrapper[4743]: I1011 00:53:03.941697 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a657286e-3a6d-4265-ae33-a7d2ce8b64ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1558bde7a3b8b01c9ce7f6907ecc43abc6e27587236be7b41487968effa33bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2616059cd6d57ff41cdaa2e1b35e0e8b474398c434022ba6b9e926e19e64c0f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0392332521983f1ea8939df48e05618035ca97895122067daef61ee1fdb7af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b297361a1a2f13aa53d7a5a58948dbe86e8e00f888e6a69cc884154828c29a41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://530c39a66f06ca33cba6a3ab75cb68148425f5a732b503d054b6c74a9d48fabf\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-11T00:52:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1011 00:51:59.674497 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1011 00:51:59.678201 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3177302691/tls.crt::/tmp/serving-cert-3177302691/tls.key\\\\\\\"\\\\nI1011 00:52:05.851548 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1011 00:52:05.855373 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1011 00:52:05.855414 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1011 00:52:05.855455 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1011 00:52:05.855472 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1011 00:52:05.866907 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1011 00:52:05.866935 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 00:52:05.866939 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 00:52:05.866944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1011 00:52:05.866947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1011 00:52:05.866951 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1011 00:52:05.866954 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1011 00:52:05.866970 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1011 00:52:05.870518 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1c0227825fa743bb310a1dfb08333cea5ee16a2a2f30b77c97e2f28a2319542\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d411bef56dd0ccdad159fc018c15ad4f0df6adee56150b1143e2dcf442a0fe18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d411bef56dd0ccdad159fc018c15ad4f0df6adee56150b1143e2dcf442a0fe18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:03Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:03 crc kubenswrapper[4743]: I1011 00:53:03.959663 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vlxgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d677d122-c8be-4938-8d2c-bde4a088a63a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dbaee9cca1154a094e092eaaf1dea2aeb6a61fbf9323611275485f75ea911e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95k6c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vlxgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:03Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:03 crc kubenswrapper[4743]: I1011 00:53:03.976261 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-t9nsf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f07b1e57-3c09-4e75-866d-a4292db4e151\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc297cc894f36c7afa9ce15a8e9c71b2d7bf66042c55d0c7686f20b6a3ece7be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c97lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:14Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-t9nsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:03Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:03 crc kubenswrapper[4743]: I1011 00:53:03.981563 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:03 crc kubenswrapper[4743]: I1011 00:53:03.981729 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:03 crc kubenswrapper[4743]: I1011 00:53:03.981755 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:03 crc kubenswrapper[4743]: I1011 00:53:03.981782 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:03 crc kubenswrapper[4743]: I1011 00:53:03.981800 4743 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:03Z","lastTransitionTime":"2025-10-11T00:53:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:53:04 crc kubenswrapper[4743]: I1011 00:53:04.001417 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cd23d7ad070c4d2f029567ac2186d797b082b8da8ff2c9811982a1c5c7b0ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:03Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:04 crc kubenswrapper[4743]: I1011 00:53:04.023066 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bca7230a67c579c0fc54921d292f52e18ce2c25f79cd4a351a9f6b576df3553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab9cd20bc5b28791e21984a0edf4fbf50acc05d6160780d0696698715242b5d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:04Z is after 
2025-08-24T17:21:41Z" Oct 11 00:53:04 crc kubenswrapper[4743]: I1011 00:53:04.044057 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ptjnt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cb3d634-b381-4ee0-a819-ce6f87fa8afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c47ab12f326ce6568bfd06d8beebdeffc07d464b747a288a14793db096ccc6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"moun
tPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4dnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b62556627f75a8ed41c68a7fb1982461fd9fb35965014b72ec15492e763e42b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4dnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ptjnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:04Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:04 crc kubenswrapper[4743]: I1011 00:53:04.062169 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cb5z5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b02b8636-a5c4-447d-b1cf-401b3dcfa02b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2zdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2zdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:22Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cb5z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:04Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:04 crc 
kubenswrapper[4743]: I1011 00:53:04.085469 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:04 crc kubenswrapper[4743]: I1011 00:53:04.085526 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:04 crc kubenswrapper[4743]: I1011 00:53:04.085545 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:04 crc kubenswrapper[4743]: I1011 00:53:04.085573 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:04 crc kubenswrapper[4743]: I1011 00:53:04.085594 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:04Z","lastTransitionTime":"2025-10-11T00:53:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:53:04 crc kubenswrapper[4743]: I1011 00:53:04.091698 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cb5z5" Oct 11 00:53:04 crc kubenswrapper[4743]: I1011 00:53:04.091716 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 00:53:04 crc kubenswrapper[4743]: E1011 00:53:04.091920 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cb5z5" podUID="b02b8636-a5c4-447d-b1cf-401b3dcfa02b" Oct 11 00:53:04 crc kubenswrapper[4743]: I1011 00:53:04.092038 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 00:53:04 crc kubenswrapper[4743]: E1011 00:53:04.092195 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 11 00:53:04 crc kubenswrapper[4743]: I1011 00:53:04.092211 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 00:53:04 crc kubenswrapper[4743]: E1011 00:53:04.092407 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 11 00:53:04 crc kubenswrapper[4743]: E1011 00:53:04.092504 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 11 00:53:04 crc kubenswrapper[4743]: I1011 00:53:04.188586 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:04 crc kubenswrapper[4743]: I1011 00:53:04.188677 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:04 crc kubenswrapper[4743]: I1011 00:53:04.188699 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:04 crc kubenswrapper[4743]: I1011 00:53:04.188729 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:04 crc kubenswrapper[4743]: I1011 00:53:04.188750 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:04Z","lastTransitionTime":"2025-10-11T00:53:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:04 crc kubenswrapper[4743]: I1011 00:53:04.291797 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:04 crc kubenswrapper[4743]: I1011 00:53:04.291904 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:04 crc kubenswrapper[4743]: I1011 00:53:04.291925 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:04 crc kubenswrapper[4743]: I1011 00:53:04.291953 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:04 crc kubenswrapper[4743]: I1011 00:53:04.291977 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:04Z","lastTransitionTime":"2025-10-11T00:53:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:04 crc kubenswrapper[4743]: I1011 00:53:04.395545 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:04 crc kubenswrapper[4743]: I1011 00:53:04.395615 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:04 crc kubenswrapper[4743]: I1011 00:53:04.395633 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:04 crc kubenswrapper[4743]: I1011 00:53:04.395664 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:04 crc kubenswrapper[4743]: I1011 00:53:04.395683 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:04Z","lastTransitionTime":"2025-10-11T00:53:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:04 crc kubenswrapper[4743]: I1011 00:53:04.499777 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:04 crc kubenswrapper[4743]: I1011 00:53:04.499910 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:04 crc kubenswrapper[4743]: I1011 00:53:04.499935 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:04 crc kubenswrapper[4743]: I1011 00:53:04.499970 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:04 crc kubenswrapper[4743]: I1011 00:53:04.499995 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:04Z","lastTransitionTime":"2025-10-11T00:53:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:04 crc kubenswrapper[4743]: I1011 00:53:04.603942 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:04 crc kubenswrapper[4743]: I1011 00:53:04.604017 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:04 crc kubenswrapper[4743]: I1011 00:53:04.604052 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:04 crc kubenswrapper[4743]: I1011 00:53:04.604071 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:04 crc kubenswrapper[4743]: I1011 00:53:04.604083 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:04Z","lastTransitionTime":"2025-10-11T00:53:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:04 crc kubenswrapper[4743]: I1011 00:53:04.706840 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:04 crc kubenswrapper[4743]: I1011 00:53:04.706954 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:04 crc kubenswrapper[4743]: I1011 00:53:04.707010 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:04 crc kubenswrapper[4743]: I1011 00:53:04.707042 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:04 crc kubenswrapper[4743]: I1011 00:53:04.707066 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:04Z","lastTransitionTime":"2025-10-11T00:53:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:04 crc kubenswrapper[4743]: I1011 00:53:04.811430 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:04 crc kubenswrapper[4743]: I1011 00:53:04.811536 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:04 crc kubenswrapper[4743]: I1011 00:53:04.811558 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:04 crc kubenswrapper[4743]: I1011 00:53:04.811587 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:04 crc kubenswrapper[4743]: I1011 00:53:04.811612 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:04Z","lastTransitionTime":"2025-10-11T00:53:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:04 crc kubenswrapper[4743]: I1011 00:53:04.915567 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:04 crc kubenswrapper[4743]: I1011 00:53:04.915639 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:04 crc kubenswrapper[4743]: I1011 00:53:04.915657 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:04 crc kubenswrapper[4743]: I1011 00:53:04.915684 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:04 crc kubenswrapper[4743]: I1011 00:53:04.915703 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:04Z","lastTransitionTime":"2025-10-11T00:53:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:05 crc kubenswrapper[4743]: I1011 00:53:05.018921 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:05 crc kubenswrapper[4743]: I1011 00:53:05.019012 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:05 crc kubenswrapper[4743]: I1011 00:53:05.019062 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:05 crc kubenswrapper[4743]: I1011 00:53:05.019088 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:05 crc kubenswrapper[4743]: I1011 00:53:05.019103 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:05Z","lastTransitionTime":"2025-10-11T00:53:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:05 crc kubenswrapper[4743]: I1011 00:53:05.122421 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:05 crc kubenswrapper[4743]: I1011 00:53:05.122487 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:05 crc kubenswrapper[4743]: I1011 00:53:05.122505 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:05 crc kubenswrapper[4743]: I1011 00:53:05.122533 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:05 crc kubenswrapper[4743]: I1011 00:53:05.122552 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:05Z","lastTransitionTime":"2025-10-11T00:53:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:05 crc kubenswrapper[4743]: I1011 00:53:05.226934 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:05 crc kubenswrapper[4743]: I1011 00:53:05.226999 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:05 crc kubenswrapper[4743]: I1011 00:53:05.227018 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:05 crc kubenswrapper[4743]: I1011 00:53:05.227046 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:05 crc kubenswrapper[4743]: I1011 00:53:05.227066 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:05Z","lastTransitionTime":"2025-10-11T00:53:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:05 crc kubenswrapper[4743]: I1011 00:53:05.330276 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:05 crc kubenswrapper[4743]: I1011 00:53:05.330749 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:05 crc kubenswrapper[4743]: I1011 00:53:05.330926 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:05 crc kubenswrapper[4743]: I1011 00:53:05.331076 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:05 crc kubenswrapper[4743]: I1011 00:53:05.331209 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:05Z","lastTransitionTime":"2025-10-11T00:53:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:05 crc kubenswrapper[4743]: I1011 00:53:05.434697 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:05 crc kubenswrapper[4743]: I1011 00:53:05.435102 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:05 crc kubenswrapper[4743]: I1011 00:53:05.435290 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:05 crc kubenswrapper[4743]: I1011 00:53:05.435450 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:05 crc kubenswrapper[4743]: I1011 00:53:05.435569 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:05Z","lastTransitionTime":"2025-10-11T00:53:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:05 crc kubenswrapper[4743]: I1011 00:53:05.539025 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:05 crc kubenswrapper[4743]: I1011 00:53:05.539444 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:05 crc kubenswrapper[4743]: I1011 00:53:05.539591 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:05 crc kubenswrapper[4743]: I1011 00:53:05.539730 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:05 crc kubenswrapper[4743]: I1011 00:53:05.540014 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:05Z","lastTransitionTime":"2025-10-11T00:53:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:05 crc kubenswrapper[4743]: I1011 00:53:05.642695 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:05 crc kubenswrapper[4743]: I1011 00:53:05.642773 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:05 crc kubenswrapper[4743]: I1011 00:53:05.642800 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:05 crc kubenswrapper[4743]: I1011 00:53:05.642831 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:05 crc kubenswrapper[4743]: I1011 00:53:05.642887 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:05Z","lastTransitionTime":"2025-10-11T00:53:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:05 crc kubenswrapper[4743]: I1011 00:53:05.745795 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:05 crc kubenswrapper[4743]: I1011 00:53:05.746651 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:05 crc kubenswrapper[4743]: I1011 00:53:05.746947 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:05 crc kubenswrapper[4743]: I1011 00:53:05.747185 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:05 crc kubenswrapper[4743]: I1011 00:53:05.747712 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:05Z","lastTransitionTime":"2025-10-11T00:53:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:05 crc kubenswrapper[4743]: I1011 00:53:05.851126 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:05 crc kubenswrapper[4743]: I1011 00:53:05.851168 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:05 crc kubenswrapper[4743]: I1011 00:53:05.851177 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:05 crc kubenswrapper[4743]: I1011 00:53:05.851193 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:05 crc kubenswrapper[4743]: I1011 00:53:05.851205 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:05Z","lastTransitionTime":"2025-10-11T00:53:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:05 crc kubenswrapper[4743]: I1011 00:53:05.953964 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:05 crc kubenswrapper[4743]: I1011 00:53:05.954027 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:05 crc kubenswrapper[4743]: I1011 00:53:05.954047 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:05 crc kubenswrapper[4743]: I1011 00:53:05.954073 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:05 crc kubenswrapper[4743]: I1011 00:53:05.954090 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:05Z","lastTransitionTime":"2025-10-11T00:53:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:06 crc kubenswrapper[4743]: I1011 00:53:06.058490 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:06 crc kubenswrapper[4743]: I1011 00:53:06.058821 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:06 crc kubenswrapper[4743]: I1011 00:53:06.058842 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:06 crc kubenswrapper[4743]: I1011 00:53:06.058904 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:06 crc kubenswrapper[4743]: I1011 00:53:06.058927 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:06Z","lastTransitionTime":"2025-10-11T00:53:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:53:06 crc kubenswrapper[4743]: I1011 00:53:06.091266 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 00:53:06 crc kubenswrapper[4743]: I1011 00:53:06.091329 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cb5z5" Oct 11 00:53:06 crc kubenswrapper[4743]: E1011 00:53:06.091602 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 11 00:53:06 crc kubenswrapper[4743]: I1011 00:53:06.091674 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 00:53:06 crc kubenswrapper[4743]: I1011 00:53:06.091677 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 00:53:06 crc kubenswrapper[4743]: E1011 00:53:06.092058 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cb5z5" podUID="b02b8636-a5c4-447d-b1cf-401b3dcfa02b" Oct 11 00:53:06 crc kubenswrapper[4743]: E1011 00:53:06.092157 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 11 00:53:06 crc kubenswrapper[4743]: E1011 00:53:06.092694 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 11 00:53:06 crc kubenswrapper[4743]: I1011 00:53:06.123993 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:06Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:06 crc kubenswrapper[4743]: I1011 00:53:06.144928 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd049f7c9c78e28d7cfdeb7087cf419e5db915fecd46a83ae0bdc8317fff9728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-11T00:53:06Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:06 crc kubenswrapper[4743]: I1011 00:53:06.162433 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:06 crc kubenswrapper[4743]: I1011 00:53:06.162498 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:06 crc kubenswrapper[4743]: I1011 00:53:06.162521 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:06 crc kubenswrapper[4743]: I1011 00:53:06.162551 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:06 crc kubenswrapper[4743]: I1011 00:53:06.162575 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:06Z","lastTransitionTime":"2025-10-11T00:53:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:06 crc kubenswrapper[4743]: I1011 00:53:06.169383 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:06Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:06 crc kubenswrapper[4743]: I1011 00:53:06.188144 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"add92263-e252-446b-95de-092585b4357f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be9668b466043d0e86521a2ea925416a4987cb051ca0ab8764fd0fd3095991f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d42sg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16b18bcf80747537e2a7a31adc90eec7fdba85d8
166f8cfc53709a1dba33de8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d42sg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cvm72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:06Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:06 crc kubenswrapper[4743]: I1011 00:53:06.217101 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6wcnk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06a7b971-8779-491c-8d3f-e7d5b4d60968\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38b5a40ddd5390cc392fcf2240304a72163dd797511640ada5ad39a59c0f4283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8adfcc8627f1a6ce588d50eabcc56c6814c3cff75d50144def98466d2dabb69a\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8adfcc8627f1a6ce588d50eabcc56c6814c3cff75d50144def98466d2dabb69a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://800cdd03b73991fe9c1cfbe9a392b5f08854ca2f5f9e5106022bb67307578ad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://800cdd03b73991fe9c1cfbe9a392b5f08854ca2f5f9e5106022bb67307578ad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:10Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49661bf59f8e1d64e5980f0ed00dcc50c4c2ab52cc2232ade890883f0d0158d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49661bf59f8e1d64e5980f0ed00dcc50c4c2ab52cc2232ade890883f0d0158d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27d3d
6d9f65eb7b710148bc4e077c74a15207b637727163afe631acfe3ec90f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27d3d6d9f65eb7b710148bc4e077c74a15207b637727163afe631acfe3ec90f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00edafd52ba2a97f21af491b5233bc2ad945e6b77106eb94a947b777a0129cfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00edafd52ba2a97f21af491b5233bc2ad945e6b77106eb94a947b777a0129cfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://276d1e264f541d55720a57945800ec969959c0828071dcda4fb1c0f13b17118b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://276d1e264f541d55720a57945800ec969959c0828071dcda4fb1c0f13b17118b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6wcnk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:06Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:06 crc kubenswrapper[4743]: I1011 00:53:06.252016 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ed16b35-862f-47f2-9e32-63c98f868fb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d54b05ec17db8c8c6c3ff8e4d703de8f1f1f0d7e9b5b0133afdf30e3c2b8acee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4127195cac8b12296c2aaefc500beb5f746bee518f3885c094e7bca83ade5e6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://487448bc7b1826df7dbc3aeac42ad10d6aed3390c27c95dfda1b81fe09e27f35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acfe8e798b39769fe2643c6f761b388dc220ccfc707f0023dd680196514b8885\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17db7cfee85ea1051eba348aa0ec7bc8e1354c49e3e4740745a51d3db0ffcf63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d385e354cb831573c89886ade9127287f0974b7a7c244ead9e5547e0211fd9fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://268a98616dcd60ad334eba93f4b3443073f8a901429f6b98e7f89c2740c81da6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://268a98616dcd60ad334eba93f4b3443073f8a901429f6b98e7f89c2740c81da6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-11T00:53:02Z\\\",\\\"message\\\":\\\"== {73135118-cf1b-4568-bd31-2f50308bf69d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e 
Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1011 00:53:02.211217 6723 model_client.go:382] Update operations generated as: [{Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.4 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {43933d5e-3c3b-4ff8-8926-04ac25de450e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1011 00:53:02.211203 6723 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T00:53:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-48ljj_openshift-ovn-kubernetes(9ed16b35-862f-47f2-9e32-63c98f868fb8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecec5075382354c47a6ec8f53263fe6c244c7aadfeab1dda60dd2e88cc8724b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f300eaf5d1c09f3637
f07c6d71c96b49e955d8319841834440872f13c061e736\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-48ljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:06Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:06 crc kubenswrapper[4743]: I1011 00:53:06.265752 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:06 crc kubenswrapper[4743]: I1011 00:53:06.265816 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:06 crc kubenswrapper[4743]: I1011 00:53:06.265834 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:06 crc kubenswrapper[4743]: I1011 00:53:06.265900 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:06 crc kubenswrapper[4743]: I1011 00:53:06.265921 4743 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:06Z","lastTransitionTime":"2025-10-11T00:53:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:53:06 crc kubenswrapper[4743]: I1011 00:53:06.273056 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab5ac12d-37cf-48f3-bb33-b93b4096c706\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4335571969a3cb66f2fefdad63b77b4a31fc2631aba5ba427b2af4db8b6c6f0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\
\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://895152fd89f4c35c7d379d3a93c1b5f185275cd151d27b442b8f06280f3f74a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2bc6f9e58ae313b026b5597d2569b343931bc066ac8b3751c62f39c8d849bf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.1
26.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3532ffcf956621ad56477cbb9ff70c2855091a6ad996d3a15fa3c9a28943cea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3532ffcf956621ad56477cbb9ff70c2855091a6ad996d3a15fa3c9a28943cea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:06Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:06 crc kubenswrapper[4743]: I1011 00:53:06.307580 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"83857db2-dda0-4b62-9336-3393a2c23f3c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a46e8adc9aa9255991b607e98eaf6fc411aeb2a6b0edf45e3e9a7935a2eae5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97b59a1efc62279df4a9766fae0e700a435812160cf3a40a915c967f7a99c9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe9d2bc75aa162a0cab825a1f13cf35f20491a0317000d5a9775977fb7f6b556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf3820ae4ab59d420d2e7f001b2953f751c6dd75f90b5fcc76fdaee8c8df722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f14caad841305b82f04cf14411e2308edbcdd8ea2261ff8401edc95900855ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://818ca392b918c66a8ca013f3ec5b938595fc78b726af7ae31524c09dbe9302f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://818ca392b918c66a8ca013f3ec5b938595fc78b726af7ae31524c09dbe9302f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44506333e8bd3a20777cf0c382b7652d6d3fa1cdc13f056412cf4899e25115e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44506333e8bd3a20777cf0c382b7652d6d3fa1cdc13f056412cf4899e25115e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e870a1289513fe9e864779c13e63c0991fdaded247b35ac75b7810ab27683ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e870a1289513fe9e864779c13e63c0991fdaded247b35ac75b7810ab27683ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:06Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:06 crc kubenswrapper[4743]: I1011 00:53:06.330967 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a657286e-3a6d-4265-ae33-a7d2ce8b64ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1558bde7a3b8b01c9ce7f6907ecc43abc6e27587236be7b41487968effa33bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2616059cd6d57ff41cdaa2e1b35e0e8b474398c434022ba6b9e926e19e64c0f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0392332521983f1ea8939df48e05618035ca97895122067daef61ee1fdb7af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b297361a1a2f13aa53d7a5a58948dbe86e8e00f888e6a69cc884154828c29a41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://530c39a66f06ca33cba6a3ab75cb68148425f5a732b503d054b6c74a9d48fabf\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-11T00:52:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1011 00:51:59.674497 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1011 00:51:59.678201 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3177302691/tls.crt::/tmp/serving-cert-3177302691/tls.key\\\\\\\"\\\\nI1011 00:52:05.851548 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1011 00:52:05.855373 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1011 00:52:05.855414 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1011 00:52:05.855455 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1011 00:52:05.855472 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1011 00:52:05.866907 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1011 00:52:05.866935 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 00:52:05.866939 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 00:52:05.866944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1011 00:52:05.866947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1011 00:52:05.866951 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1011 00:52:05.866954 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1011 00:52:05.866970 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1011 00:52:05.870518 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1c0227825fa743bb310a1dfb08333cea5ee16a2a2f30b77c97e2f28a2319542\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d411bef56dd0ccdad159fc018c15ad4f0df6adee56150b1143e2dcf442a0fe18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d411bef56dd0ccdad159fc018c15ad4f0df6adee56150b1143e2dcf442a0fe18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:06Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:06 crc kubenswrapper[4743]: I1011 00:53:06.350085 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vlxgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d677d122-c8be-4938-8d2c-bde4a088a63a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dbaee9cca1154a094e092eaaf1dea2aeb6a61fbf9323611275485f75ea911e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95k6c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vlxgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:06Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:06 crc kubenswrapper[4743]: I1011 00:53:06.369544 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:06 crc kubenswrapper[4743]: I1011 00:53:06.369606 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:06 crc kubenswrapper[4743]: I1011 00:53:06.369623 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:06 crc kubenswrapper[4743]: I1011 00:53:06.369647 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:06 crc kubenswrapper[4743]: I1011 00:53:06.369665 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:06Z","lastTransitionTime":"2025-10-11T00:53:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:06 crc kubenswrapper[4743]: I1011 00:53:06.370097 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-t9nsf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f07b1e57-3c09-4e75-866d-a4292db4e151\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc297cc894f36c7afa9ce15a8e9c71b2d7bf66042c55d0c7686f20b6a3ece7be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c97lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:14Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-t9nsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:06Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:06 crc kubenswrapper[4743]: I1011 00:53:06.393270 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cd23d7ad070c4d2f029567ac2186d797b082b8da8ff2c9811982a1c5c7b0ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c
04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:06Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:06 crc kubenswrapper[4743]: I1011 00:53:06.416001 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bca7230a67c579c0fc54921d292f52e18ce2c25f79cd4a351a9f6b576df3553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab9cd20bc5b28791e21984a0edf4fbf50acc05d6160780d0696698715242b5d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:06Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:06 crc kubenswrapper[4743]: I1011 00:53:06.439000 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ptjnt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cb3d634-b381-4ee0-a819-ce6f87fa8afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c47ab12f326ce6568bfd06d8beebdeffc07d464b747a288a14793db096ccc6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4dnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b62556627f75a8ed41c68a7fb1982461fd9fb
35965014b72ec15492e763e42b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4dnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ptjnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:06Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:06 crc kubenswrapper[4743]: I1011 00:53:06.459414 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cb5z5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b02b8636-a5c4-447d-b1cf-401b3dcfa02b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2zdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2zdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:22Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cb5z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:06Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:06 crc 
kubenswrapper[4743]: I1011 00:53:06.472673 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:06 crc kubenswrapper[4743]: I1011 00:53:06.472737 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:06 crc kubenswrapper[4743]: I1011 00:53:06.472755 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:06 crc kubenswrapper[4743]: I1011 00:53:06.472785 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:06 crc kubenswrapper[4743]: I1011 00:53:06.472807 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:06Z","lastTransitionTime":"2025-10-11T00:53:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:06 crc kubenswrapper[4743]: I1011 00:53:06.482002 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc432b8f-1fc8-4b7d-a202-bf1452921fc3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ed857549a3f84b624b8bf411ac4359bad74bbf8eea28bda8ebeda3b12dd34e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://748faceccab
50f961795c16fb7c31665af75d570094d1dd2c765bb2215b65c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc8d989925344208a60f51034c7c486de0165d5c5a9a1f966df45ac5685c89f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34de48136b54818c60eb929d4b5374689378190620a6502448759f172856d6af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:06Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:06 crc kubenswrapper[4743]: I1011 00:53:06.503239 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:06Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:06 crc kubenswrapper[4743]: I1011 00:53:06.525943 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9jfxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8c603f4-717c-4554-992a-8338b3bef24d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21793d4ae38fc6e714912dbedbba8c45e37482a03cbd76d10461c41851e16896\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://853ca9eee244aad8c7006298d6e541ea07022917bee14083b2ded233009257ad\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-11T00:52:55Z\\\",\\\"message\\\":\\\"2025-10-11T00:52:10+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_bf700974-1b5c-49f8-b8b9-3e4987959237\\\\n2025-10-11T00:52:10+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_bf700974-1b5c-49f8-b8b9-3e4987959237 to /host/opt/cni/bin/\\\\n2025-10-11T00:52:10Z [verbose] multus-daemon started\\\\n2025-10-11T00:52:10Z [verbose] 
Readiness Indicator file check\\\\n2025-10-11T00:52:55Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:09Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5b2pr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9jfxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:06Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:06 crc kubenswrapper[4743]: I1011 00:53:06.576818 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:06 crc kubenswrapper[4743]: I1011 00:53:06.576914 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:06 crc kubenswrapper[4743]: I1011 00:53:06.576934 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:06 crc kubenswrapper[4743]: I1011 00:53:06.576963 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:06 crc kubenswrapper[4743]: I1011 00:53:06.576984 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:06Z","lastTransitionTime":"2025-10-11T00:53:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:06 crc kubenswrapper[4743]: I1011 00:53:06.680956 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:06 crc kubenswrapper[4743]: I1011 00:53:06.681031 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:06 crc kubenswrapper[4743]: I1011 00:53:06.681053 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:06 crc kubenswrapper[4743]: I1011 00:53:06.681082 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:06 crc kubenswrapper[4743]: I1011 00:53:06.681102 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:06Z","lastTransitionTime":"2025-10-11T00:53:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:06 crc kubenswrapper[4743]: I1011 00:53:06.784352 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:06 crc kubenswrapper[4743]: I1011 00:53:06.784467 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:06 crc kubenswrapper[4743]: I1011 00:53:06.784487 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:06 crc kubenswrapper[4743]: I1011 00:53:06.784517 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:06 crc kubenswrapper[4743]: I1011 00:53:06.784540 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:06Z","lastTransitionTime":"2025-10-11T00:53:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:06 crc kubenswrapper[4743]: I1011 00:53:06.887719 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:06 crc kubenswrapper[4743]: I1011 00:53:06.887782 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:06 crc kubenswrapper[4743]: I1011 00:53:06.887802 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:06 crc kubenswrapper[4743]: I1011 00:53:06.887830 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:06 crc kubenswrapper[4743]: I1011 00:53:06.887851 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:06Z","lastTransitionTime":"2025-10-11T00:53:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:06 crc kubenswrapper[4743]: I1011 00:53:06.991886 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:06 crc kubenswrapper[4743]: I1011 00:53:06.991943 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:06 crc kubenswrapper[4743]: I1011 00:53:06.991957 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:06 crc kubenswrapper[4743]: I1011 00:53:06.991977 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:06 crc kubenswrapper[4743]: I1011 00:53:06.991990 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:06Z","lastTransitionTime":"2025-10-11T00:53:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:07 crc kubenswrapper[4743]: I1011 00:53:07.096283 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:07 crc kubenswrapper[4743]: I1011 00:53:07.096335 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:07 crc kubenswrapper[4743]: I1011 00:53:07.096352 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:07 crc kubenswrapper[4743]: I1011 00:53:07.096376 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:07 crc kubenswrapper[4743]: I1011 00:53:07.096396 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:07Z","lastTransitionTime":"2025-10-11T00:53:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:07 crc kubenswrapper[4743]: I1011 00:53:07.200701 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:07 crc kubenswrapper[4743]: I1011 00:53:07.200771 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:07 crc kubenswrapper[4743]: I1011 00:53:07.200795 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:07 crc kubenswrapper[4743]: I1011 00:53:07.200828 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:07 crc kubenswrapper[4743]: I1011 00:53:07.200851 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:07Z","lastTransitionTime":"2025-10-11T00:53:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:07 crc kubenswrapper[4743]: I1011 00:53:07.304753 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:07 crc kubenswrapper[4743]: I1011 00:53:07.304825 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:07 crc kubenswrapper[4743]: I1011 00:53:07.304838 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:07 crc kubenswrapper[4743]: I1011 00:53:07.304887 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:07 crc kubenswrapper[4743]: I1011 00:53:07.304910 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:07Z","lastTransitionTime":"2025-10-11T00:53:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:07 crc kubenswrapper[4743]: I1011 00:53:07.408422 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:07 crc kubenswrapper[4743]: I1011 00:53:07.408498 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:07 crc kubenswrapper[4743]: I1011 00:53:07.408516 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:07 crc kubenswrapper[4743]: I1011 00:53:07.408583 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:07 crc kubenswrapper[4743]: I1011 00:53:07.408602 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:07Z","lastTransitionTime":"2025-10-11T00:53:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:07 crc kubenswrapper[4743]: I1011 00:53:07.511612 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:07 crc kubenswrapper[4743]: I1011 00:53:07.511728 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:07 crc kubenswrapper[4743]: I1011 00:53:07.511755 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:07 crc kubenswrapper[4743]: I1011 00:53:07.511794 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:07 crc kubenswrapper[4743]: I1011 00:53:07.511823 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:07Z","lastTransitionTime":"2025-10-11T00:53:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:07 crc kubenswrapper[4743]: I1011 00:53:07.616808 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:07 crc kubenswrapper[4743]: I1011 00:53:07.616898 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:07 crc kubenswrapper[4743]: I1011 00:53:07.616916 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:07 crc kubenswrapper[4743]: I1011 00:53:07.616942 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:07 crc kubenswrapper[4743]: I1011 00:53:07.616962 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:07Z","lastTransitionTime":"2025-10-11T00:53:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:07 crc kubenswrapper[4743]: I1011 00:53:07.719901 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:07 crc kubenswrapper[4743]: I1011 00:53:07.720040 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:07 crc kubenswrapper[4743]: I1011 00:53:07.720063 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:07 crc kubenswrapper[4743]: I1011 00:53:07.720091 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:07 crc kubenswrapper[4743]: I1011 00:53:07.720113 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:07Z","lastTransitionTime":"2025-10-11T00:53:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:07 crc kubenswrapper[4743]: I1011 00:53:07.823649 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:07 crc kubenswrapper[4743]: I1011 00:53:07.823718 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:07 crc kubenswrapper[4743]: I1011 00:53:07.823736 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:07 crc kubenswrapper[4743]: I1011 00:53:07.823763 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:07 crc kubenswrapper[4743]: I1011 00:53:07.823781 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:07Z","lastTransitionTime":"2025-10-11T00:53:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:07 crc kubenswrapper[4743]: I1011 00:53:07.926974 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:07 crc kubenswrapper[4743]: I1011 00:53:07.927064 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:07 crc kubenswrapper[4743]: I1011 00:53:07.927087 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:07 crc kubenswrapper[4743]: I1011 00:53:07.927119 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:07 crc kubenswrapper[4743]: I1011 00:53:07.927148 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:07Z","lastTransitionTime":"2025-10-11T00:53:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:08 crc kubenswrapper[4743]: I1011 00:53:08.023621 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:08 crc kubenswrapper[4743]: I1011 00:53:08.023690 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:08 crc kubenswrapper[4743]: I1011 00:53:08.023709 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:08 crc kubenswrapper[4743]: I1011 00:53:08.023737 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:08 crc kubenswrapper[4743]: I1011 00:53:08.023755 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:08Z","lastTransitionTime":"2025-10-11T00:53:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:08 crc kubenswrapper[4743]: E1011 00:53:08.047216 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:53:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:53:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:53:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:53:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:53:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:53:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:53:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:53:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4a117022-09d2-46e0-826b-22308ec25890\\\",\\\"systemUUID\\\":\\\"407eb137-47d1-41e8-9c72-65f09e76d21a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:08Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:08 crc kubenswrapper[4743]: I1011 00:53:08.053418 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:08 crc kubenswrapper[4743]: I1011 00:53:08.053483 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:08 crc kubenswrapper[4743]: I1011 00:53:08.053496 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:08 crc kubenswrapper[4743]: I1011 00:53:08.053524 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:08 crc kubenswrapper[4743]: I1011 00:53:08.053538 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:08Z","lastTransitionTime":"2025-10-11T00:53:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:08 crc kubenswrapper[4743]: E1011 00:53:08.084245 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:53:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:53:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:53:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:53:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:53:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:53:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:53:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:53:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4a117022-09d2-46e0-826b-22308ec25890\\\",\\\"systemUUID\\\":\\\"407eb137-47d1-41e8-9c72-65f09e76d21a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:08Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:08 crc kubenswrapper[4743]: I1011 00:53:08.090352 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:08 crc kubenswrapper[4743]: I1011 00:53:08.090400 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:08 crc kubenswrapper[4743]: I1011 00:53:08.090419 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:08 crc kubenswrapper[4743]: I1011 00:53:08.090447 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:08 crc kubenswrapper[4743]: I1011 00:53:08.090469 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:08Z","lastTransitionTime":"2025-10-11T00:53:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:53:08 crc kubenswrapper[4743]: I1011 00:53:08.091836 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cb5z5" Oct 11 00:53:08 crc kubenswrapper[4743]: E1011 00:53:08.092142 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cb5z5" podUID="b02b8636-a5c4-447d-b1cf-401b3dcfa02b" Oct 11 00:53:08 crc kubenswrapper[4743]: I1011 00:53:08.092274 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 00:53:08 crc kubenswrapper[4743]: I1011 00:53:08.092281 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 00:53:08 crc kubenswrapper[4743]: E1011 00:53:08.092451 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 11 00:53:08 crc kubenswrapper[4743]: I1011 00:53:08.092517 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 00:53:08 crc kubenswrapper[4743]: E1011 00:53:08.092614 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 11 00:53:08 crc kubenswrapper[4743]: E1011 00:53:08.092768 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 11 00:53:08 crc kubenswrapper[4743]: E1011 00:53:08.108934 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:53:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:53:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:53:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:53:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:53:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:53:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:53:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:53:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4a117022-09d2-46e0-826b-22308ec25890\\\",\\\"systemUUID\\\":\\\"407eb137-47d1-41e8-9c72-65f09e76d21a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:08Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:08 crc kubenswrapper[4743]: I1011 00:53:08.115152 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:08 crc kubenswrapper[4743]: I1011 00:53:08.115208 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:08 crc kubenswrapper[4743]: I1011 00:53:08.115222 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:08 crc kubenswrapper[4743]: I1011 00:53:08.115281 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:08 crc kubenswrapper[4743]: I1011 00:53:08.115304 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:08Z","lastTransitionTime":"2025-10-11T00:53:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:08 crc kubenswrapper[4743]: E1011 00:53:08.138104 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:53:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:53:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:53:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:53:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:53:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:53:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:53:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:53:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4a117022-09d2-46e0-826b-22308ec25890\\\",\\\"systemUUID\\\":\\\"407eb137-47d1-41e8-9c72-65f09e76d21a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:08Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:08 crc kubenswrapper[4743]: I1011 00:53:08.143351 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:08 crc kubenswrapper[4743]: I1011 00:53:08.143436 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:08 crc kubenswrapper[4743]: I1011 00:53:08.143455 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:08 crc kubenswrapper[4743]: I1011 00:53:08.143476 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:08 crc kubenswrapper[4743]: I1011 00:53:08.143489 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:08Z","lastTransitionTime":"2025-10-11T00:53:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:08 crc kubenswrapper[4743]: E1011 00:53:08.160472 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:53:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:53:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:53:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:53:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:53:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:53:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:53:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:53:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4a117022-09d2-46e0-826b-22308ec25890\\\",\\\"systemUUID\\\":\\\"407eb137-47d1-41e8-9c72-65f09e76d21a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:08Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:08 crc kubenswrapper[4743]: E1011 00:53:08.160635 4743 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 11 00:53:08 crc kubenswrapper[4743]: I1011 00:53:08.163215 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:08 crc kubenswrapper[4743]: I1011 00:53:08.163248 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:08 crc kubenswrapper[4743]: I1011 00:53:08.163262 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:08 crc kubenswrapper[4743]: I1011 00:53:08.163277 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:08 crc kubenswrapper[4743]: I1011 00:53:08.163290 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:08Z","lastTransitionTime":"2025-10-11T00:53:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:08 crc kubenswrapper[4743]: I1011 00:53:08.266443 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:08 crc kubenswrapper[4743]: I1011 00:53:08.266507 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:08 crc kubenswrapper[4743]: I1011 00:53:08.266533 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:08 crc kubenswrapper[4743]: I1011 00:53:08.266562 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:08 crc kubenswrapper[4743]: I1011 00:53:08.266584 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:08Z","lastTransitionTime":"2025-10-11T00:53:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:08 crc kubenswrapper[4743]: I1011 00:53:08.369393 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:08 crc kubenswrapper[4743]: I1011 00:53:08.369540 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:08 crc kubenswrapper[4743]: I1011 00:53:08.369605 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:08 crc kubenswrapper[4743]: I1011 00:53:08.369633 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:08 crc kubenswrapper[4743]: I1011 00:53:08.369652 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:08Z","lastTransitionTime":"2025-10-11T00:53:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:08 crc kubenswrapper[4743]: I1011 00:53:08.474236 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:08 crc kubenswrapper[4743]: I1011 00:53:08.474316 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:08 crc kubenswrapper[4743]: I1011 00:53:08.474341 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:08 crc kubenswrapper[4743]: I1011 00:53:08.474381 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:08 crc kubenswrapper[4743]: I1011 00:53:08.474409 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:08Z","lastTransitionTime":"2025-10-11T00:53:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:08 crc kubenswrapper[4743]: I1011 00:53:08.577022 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:08 crc kubenswrapper[4743]: I1011 00:53:08.577083 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:08 crc kubenswrapper[4743]: I1011 00:53:08.577099 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:08 crc kubenswrapper[4743]: I1011 00:53:08.577123 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:08 crc kubenswrapper[4743]: I1011 00:53:08.577145 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:08Z","lastTransitionTime":"2025-10-11T00:53:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:08 crc kubenswrapper[4743]: I1011 00:53:08.680564 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:08 crc kubenswrapper[4743]: I1011 00:53:08.680663 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:08 crc kubenswrapper[4743]: I1011 00:53:08.680681 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:08 crc kubenswrapper[4743]: I1011 00:53:08.680714 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:08 crc kubenswrapper[4743]: I1011 00:53:08.680732 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:08Z","lastTransitionTime":"2025-10-11T00:53:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:08 crc kubenswrapper[4743]: I1011 00:53:08.785061 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:08 crc kubenswrapper[4743]: I1011 00:53:08.785124 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:08 crc kubenswrapper[4743]: I1011 00:53:08.785142 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:08 crc kubenswrapper[4743]: I1011 00:53:08.785171 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:08 crc kubenswrapper[4743]: I1011 00:53:08.785192 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:08Z","lastTransitionTime":"2025-10-11T00:53:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:08 crc kubenswrapper[4743]: I1011 00:53:08.888811 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:08 crc kubenswrapper[4743]: I1011 00:53:08.888902 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:08 crc kubenswrapper[4743]: I1011 00:53:08.888922 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:08 crc kubenswrapper[4743]: I1011 00:53:08.888986 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:08 crc kubenswrapper[4743]: I1011 00:53:08.889006 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:08Z","lastTransitionTime":"2025-10-11T00:53:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:08 crc kubenswrapper[4743]: I1011 00:53:08.992495 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:08 crc kubenswrapper[4743]: I1011 00:53:08.992561 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:08 crc kubenswrapper[4743]: I1011 00:53:08.992634 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:08 crc kubenswrapper[4743]: I1011 00:53:08.992662 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:08 crc kubenswrapper[4743]: I1011 00:53:08.992682 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:08Z","lastTransitionTime":"2025-10-11T00:53:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:09 crc kubenswrapper[4743]: I1011 00:53:09.096950 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:09 crc kubenswrapper[4743]: I1011 00:53:09.097044 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:09 crc kubenswrapper[4743]: I1011 00:53:09.097065 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:09 crc kubenswrapper[4743]: I1011 00:53:09.097096 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:09 crc kubenswrapper[4743]: I1011 00:53:09.097118 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:09Z","lastTransitionTime":"2025-10-11T00:53:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:09 crc kubenswrapper[4743]: I1011 00:53:09.199717 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:09 crc kubenswrapper[4743]: I1011 00:53:09.199795 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:09 crc kubenswrapper[4743]: I1011 00:53:09.199818 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:09 crc kubenswrapper[4743]: I1011 00:53:09.199849 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:09 crc kubenswrapper[4743]: I1011 00:53:09.199895 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:09Z","lastTransitionTime":"2025-10-11T00:53:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:09 crc kubenswrapper[4743]: I1011 00:53:09.303619 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:09 crc kubenswrapper[4743]: I1011 00:53:09.303698 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:09 crc kubenswrapper[4743]: I1011 00:53:09.303715 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:09 crc kubenswrapper[4743]: I1011 00:53:09.303741 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:09 crc kubenswrapper[4743]: I1011 00:53:09.303760 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:09Z","lastTransitionTime":"2025-10-11T00:53:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:09 crc kubenswrapper[4743]: I1011 00:53:09.406647 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:09 crc kubenswrapper[4743]: I1011 00:53:09.406740 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:09 crc kubenswrapper[4743]: I1011 00:53:09.406765 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:09 crc kubenswrapper[4743]: I1011 00:53:09.406800 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:09 crc kubenswrapper[4743]: I1011 00:53:09.406829 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:09Z","lastTransitionTime":"2025-10-11T00:53:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:09 crc kubenswrapper[4743]: I1011 00:53:09.509779 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:09 crc kubenswrapper[4743]: I1011 00:53:09.509840 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:09 crc kubenswrapper[4743]: I1011 00:53:09.509887 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:09 crc kubenswrapper[4743]: I1011 00:53:09.509918 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:09 crc kubenswrapper[4743]: I1011 00:53:09.509938 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:09Z","lastTransitionTime":"2025-10-11T00:53:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 11 00:53:09 crc kubenswrapper[4743]: I1011 00:53:09.612574 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 11 00:53:09 crc kubenswrapper[4743]: I1011 00:53:09.612643 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 11 00:53:09 crc kubenswrapper[4743]: I1011 00:53:09.612662 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 11 00:53:09 crc kubenswrapper[4743]: I1011 00:53:09.612689 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 11 00:53:09 crc kubenswrapper[4743]: I1011 00:53:09.612710 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:09Z","lastTransitionTime":"2025-10-11T00:53:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 11 00:53:09 crc kubenswrapper[4743]: I1011 00:53:09.661120 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 11 00:53:09 crc kubenswrapper[4743]: E1011 00:53:09.662177 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 00:54:13.662114164 +0000 UTC m=+148.315094601 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 11 00:53:09 crc kubenswrapper[4743]: I1011 00:53:09.716306 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 11 00:53:09 crc kubenswrapper[4743]: I1011 00:53:09.716374 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 11 00:53:09 crc kubenswrapper[4743]: I1011 00:53:09.716394 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 11 00:53:09 crc kubenswrapper[4743]: I1011 00:53:09.716423 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 11 00:53:09 crc kubenswrapper[4743]: I1011 00:53:09.716444 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:09Z","lastTransitionTime":"2025-10-11T00:53:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 11 00:53:09 crc kubenswrapper[4743]: I1011 00:53:09.824804 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 11 00:53:09 crc kubenswrapper[4743]: I1011 00:53:09.824914 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 11 00:53:09 crc kubenswrapper[4743]: I1011 00:53:09.824935 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 11 00:53:09 crc kubenswrapper[4743]: I1011 00:53:09.824968 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 11 00:53:09 crc kubenswrapper[4743]: I1011 00:53:09.824987 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:09Z","lastTransitionTime":"2025-10-11T00:53:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 11 00:53:09 crc kubenswrapper[4743]: I1011 00:53:09.863630 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 11 00:53:09 crc kubenswrapper[4743]: I1011 00:53:09.863710 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 11 00:53:09 crc kubenswrapper[4743]: I1011 00:53:09.863784 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 11 00:53:09 crc kubenswrapper[4743]: I1011 00:53:09.863831 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 11 00:53:09 crc kubenswrapper[4743]: E1011 00:53:09.863983 4743 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Oct 11 00:53:09 crc kubenswrapper[4743]: E1011 00:53:09.864062 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Oct 11 00:53:09 crc kubenswrapper[4743]: E1011 00:53:09.864100 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Oct 11 00:53:09 crc kubenswrapper[4743]: E1011 00:53:09.864105 4743 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Oct 11 00:53:09 crc kubenswrapper[4743]: E1011 00:53:09.864126 4743 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 11 00:53:09 crc kubenswrapper[4743]: E1011 00:53:09.864129 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-11 00:54:13.864099597 +0000 UTC m=+148.517080174 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Oct 11 00:53:09 crc kubenswrapper[4743]: E1011 00:53:09.864123 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Oct 11 00:53:09 crc kubenswrapper[4743]: E1011 00:53:09.864264 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-11 00:54:13.86422485 +0000 UTC m=+148.517205287 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 11 00:53:09 crc kubenswrapper[4743]: E1011 00:53:09.864294 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Oct 11 00:53:09 crc kubenswrapper[4743]: E1011 00:53:09.864310 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-11 00:54:13.864294062 +0000 UTC m=+148.517274499 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Oct 11 00:53:09 crc kubenswrapper[4743]: E1011 00:53:09.864329 4743 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 11 00:53:09 crc kubenswrapper[4743]: E1011 00:53:09.864417 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-11 00:54:13.864395865 +0000 UTC m=+148.517376302 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 11 00:53:09 crc kubenswrapper[4743]: I1011 00:53:09.928424 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 11 00:53:09 crc kubenswrapper[4743]: I1011 00:53:09.928520 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 11 00:53:09 crc kubenswrapper[4743]: I1011 00:53:09.928543 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 11 00:53:09 crc kubenswrapper[4743]: I1011 00:53:09.928581 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 11 00:53:09 crc kubenswrapper[4743]: I1011 00:53:09.928602 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:09Z","lastTransitionTime":"2025-10-11T00:53:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 11 00:53:10 crc kubenswrapper[4743]: I1011 00:53:10.031450 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 11 00:53:10 crc kubenswrapper[4743]: I1011 00:53:10.031517 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 11 00:53:10 crc kubenswrapper[4743]: I1011 00:53:10.031540 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 11 00:53:10 crc kubenswrapper[4743]: I1011 00:53:10.031572 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 11 00:53:10 crc kubenswrapper[4743]: I1011 00:53:10.031594 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:10Z","lastTransitionTime":"2025-10-11T00:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 11 00:53:10 crc kubenswrapper[4743]: I1011 00:53:10.091526 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 11 00:53:10 crc kubenswrapper[4743]: I1011 00:53:10.091587 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 11 00:53:10 crc kubenswrapper[4743]: I1011 00:53:10.091587 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 11 00:53:10 crc kubenswrapper[4743]: I1011 00:53:10.091781 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cb5z5"
Oct 11 00:53:10 crc kubenswrapper[4743]: E1011 00:53:10.091778 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 11 00:53:10 crc kubenswrapper[4743]: E1011 00:53:10.092001 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 11 00:53:10 crc kubenswrapper[4743]: E1011 00:53:10.092315 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cb5z5" podUID="b02b8636-a5c4-447d-b1cf-401b3dcfa02b"
Oct 11 00:53:10 crc kubenswrapper[4743]: E1011 00:53:10.092425 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 11 00:53:10 crc kubenswrapper[4743]: I1011 00:53:10.135064 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 11 00:53:10 crc kubenswrapper[4743]: I1011 00:53:10.135154 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 11 00:53:10 crc kubenswrapper[4743]: I1011 00:53:10.135175 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 11 00:53:10 crc kubenswrapper[4743]: I1011 00:53:10.135204 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 11 00:53:10 crc kubenswrapper[4743]: I1011 00:53:10.135227 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:10Z","lastTransitionTime":"2025-10-11T00:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 11 00:53:10 crc kubenswrapper[4743]: I1011 00:53:10.238799 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 11 00:53:10 crc kubenswrapper[4743]: I1011 00:53:10.238918 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 11 00:53:10 crc kubenswrapper[4743]: I1011 00:53:10.238947 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 11 00:53:10 crc kubenswrapper[4743]: I1011 00:53:10.238980 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 11 00:53:10 crc kubenswrapper[4743]: I1011 00:53:10.239007 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:10Z","lastTransitionTime":"2025-10-11T00:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 11 00:53:10 crc kubenswrapper[4743]: I1011 00:53:10.342313 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 11 00:53:10 crc kubenswrapper[4743]: I1011 00:53:10.342382 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 11 00:53:10 crc kubenswrapper[4743]: I1011 00:53:10.342407 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 11 00:53:10 crc kubenswrapper[4743]: I1011 00:53:10.342438 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 11 00:53:10 crc kubenswrapper[4743]: I1011 00:53:10.342461 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:10Z","lastTransitionTime":"2025-10-11T00:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 11 00:53:10 crc kubenswrapper[4743]: I1011 00:53:10.446182 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 11 00:53:10 crc kubenswrapper[4743]: I1011 00:53:10.446230 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 11 00:53:10 crc kubenswrapper[4743]: I1011 00:53:10.446247 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 11 00:53:10 crc kubenswrapper[4743]: I1011 00:53:10.446273 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 11 00:53:10 crc kubenswrapper[4743]: I1011 00:53:10.446289 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:10Z","lastTransitionTime":"2025-10-11T00:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 11 00:53:10 crc kubenswrapper[4743]: I1011 00:53:10.549194 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 11 00:53:10 crc kubenswrapper[4743]: I1011 00:53:10.549255 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 11 00:53:10 crc kubenswrapper[4743]: I1011 00:53:10.549274 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 11 00:53:10 crc kubenswrapper[4743]: I1011 00:53:10.549304 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 11 00:53:10 crc kubenswrapper[4743]: I1011 00:53:10.549326 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:10Z","lastTransitionTime":"2025-10-11T00:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 11 00:53:10 crc kubenswrapper[4743]: I1011 00:53:10.652985 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 11 00:53:10 crc kubenswrapper[4743]: I1011 00:53:10.653050 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 11 00:53:10 crc kubenswrapper[4743]: I1011 00:53:10.653067 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 11 00:53:10 crc kubenswrapper[4743]: I1011 00:53:10.653091 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 11 00:53:10 crc kubenswrapper[4743]: I1011 00:53:10.653105 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:10Z","lastTransitionTime":"2025-10-11T00:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 11 00:53:10 crc kubenswrapper[4743]: I1011 00:53:10.756458 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 11 00:53:10 crc kubenswrapper[4743]: I1011 00:53:10.756514 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 11 00:53:10 crc kubenswrapper[4743]: I1011 00:53:10.756526 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 11 00:53:10 crc kubenswrapper[4743]: I1011 00:53:10.756546 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 11 00:53:10 crc kubenswrapper[4743]: I1011 00:53:10.756558 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:10Z","lastTransitionTime":"2025-10-11T00:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 11 00:53:10 crc kubenswrapper[4743]: I1011 00:53:10.860670 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 11 00:53:10 crc kubenswrapper[4743]: I1011 00:53:10.860719 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 11 00:53:10 crc kubenswrapper[4743]: I1011 00:53:10.860728 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 11 00:53:10 crc kubenswrapper[4743]: I1011 00:53:10.860746 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 11 00:53:10 crc kubenswrapper[4743]: I1011 00:53:10.860758 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:10Z","lastTransitionTime":"2025-10-11T00:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 11 00:53:10 crc kubenswrapper[4743]: I1011 00:53:10.963609 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 11 00:53:10 crc kubenswrapper[4743]: I1011 00:53:10.963675 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 11 00:53:10 crc kubenswrapper[4743]: I1011 00:53:10.963694 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 11 00:53:10 crc kubenswrapper[4743]: I1011 00:53:10.963720 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 11 00:53:10 crc kubenswrapper[4743]: I1011 00:53:10.963740 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:10Z","lastTransitionTime":"2025-10-11T00:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 11 00:53:11 crc kubenswrapper[4743]: I1011 00:53:11.067968 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 11 00:53:11 crc kubenswrapper[4743]: I1011 00:53:11.068086 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 11 00:53:11 crc kubenswrapper[4743]: I1011 00:53:11.068109 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 11 00:53:11 crc kubenswrapper[4743]: I1011 00:53:11.068183 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 11 00:53:11 crc kubenswrapper[4743]: I1011 00:53:11.068202 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:11Z","lastTransitionTime":"2025-10-11T00:53:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 11 00:53:11 crc kubenswrapper[4743]: I1011 00:53:11.172364 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 11 00:53:11 crc kubenswrapper[4743]: I1011 00:53:11.172433 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 11 00:53:11 crc kubenswrapper[4743]: I1011 00:53:11.172451 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 11 00:53:11 crc kubenswrapper[4743]: I1011 00:53:11.172481 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 11 00:53:11 crc kubenswrapper[4743]: I1011 00:53:11.172500 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:11Z","lastTransitionTime":"2025-10-11T00:53:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 11 00:53:11 crc kubenswrapper[4743]: I1011 00:53:11.276078 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 11 00:53:11 crc kubenswrapper[4743]: I1011 00:53:11.276249 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 11 00:53:11 crc kubenswrapper[4743]: I1011 00:53:11.276274 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 11 00:53:11 crc kubenswrapper[4743]: I1011 00:53:11.276304 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 11 00:53:11 crc kubenswrapper[4743]: I1011 00:53:11.276326 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:11Z","lastTransitionTime":"2025-10-11T00:53:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 11 00:53:11 crc kubenswrapper[4743]: I1011 00:53:11.380511 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 11 00:53:11 crc kubenswrapper[4743]: I1011 00:53:11.380585 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 11 00:53:11 crc kubenswrapper[4743]: I1011 00:53:11.380611 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 11 00:53:11 crc kubenswrapper[4743]: I1011 00:53:11.380640 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 11 00:53:11 crc kubenswrapper[4743]: I1011 00:53:11.380663 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:11Z","lastTransitionTime":"2025-10-11T00:53:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 11 00:53:11 crc kubenswrapper[4743]: I1011 00:53:11.484005 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 11 00:53:11 crc kubenswrapper[4743]: I1011 00:53:11.484105 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 11 00:53:11 crc kubenswrapper[4743]: I1011 00:53:11.484134 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 11 00:53:11 crc kubenswrapper[4743]: I1011 00:53:11.484172 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 11 00:53:11 crc kubenswrapper[4743]: I1011 00:53:11.484195 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:11Z","lastTransitionTime":"2025-10-11T00:53:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 11 00:53:11 crc kubenswrapper[4743]: I1011 00:53:11.588084 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 11 00:53:11 crc kubenswrapper[4743]: I1011 00:53:11.588146 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 11 00:53:11 crc kubenswrapper[4743]: I1011 00:53:11.588164 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 11 00:53:11 crc kubenswrapper[4743]: I1011 00:53:11.588189 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 11 00:53:11 crc kubenswrapper[4743]: I1011 00:53:11.588208 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:11Z","lastTransitionTime":"2025-10-11T00:53:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 11 00:53:11 crc kubenswrapper[4743]: I1011 00:53:11.692678 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 11 00:53:11 crc kubenswrapper[4743]: I1011 00:53:11.692751 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 11 00:53:11 crc kubenswrapper[4743]: I1011 00:53:11.692769 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 11 00:53:11 crc kubenswrapper[4743]: I1011 00:53:11.692801 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 11 00:53:11 crc kubenswrapper[4743]: I1011 00:53:11.692822 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:11Z","lastTransitionTime":"2025-10-11T00:53:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 11 00:53:11 crc kubenswrapper[4743]: I1011 00:53:11.795392 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 11 00:53:11 crc kubenswrapper[4743]: I1011 00:53:11.795457 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 11 00:53:11 crc kubenswrapper[4743]: I1011 00:53:11.795477 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 11 00:53:11 crc kubenswrapper[4743]: I1011 00:53:11.795508 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 11 00:53:11 crc kubenswrapper[4743]: I1011 00:53:11.795535 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:11Z","lastTransitionTime":"2025-10-11T00:53:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 11 00:53:11 crc kubenswrapper[4743]: I1011 00:53:11.898884 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 11 00:53:11 crc kubenswrapper[4743]: I1011 00:53:11.898945 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 11 00:53:11 crc kubenswrapper[4743]: I1011 00:53:11.898958 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 11 00:53:11 crc kubenswrapper[4743]: I1011 00:53:11.898979 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 11 00:53:11 crc kubenswrapper[4743]: I1011 00:53:11.898993 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:11Z","lastTransitionTime":"2025-10-11T00:53:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:12 crc kubenswrapper[4743]: I1011 00:53:12.001812 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:12 crc kubenswrapper[4743]: I1011 00:53:12.002449 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:12 crc kubenswrapper[4743]: I1011 00:53:12.002469 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:12 crc kubenswrapper[4743]: I1011 00:53:12.002556 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:12 crc kubenswrapper[4743]: I1011 00:53:12.002577 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:12Z","lastTransitionTime":"2025-10-11T00:53:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:53:12 crc kubenswrapper[4743]: I1011 00:53:12.091623 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 00:53:12 crc kubenswrapper[4743]: E1011 00:53:12.091797 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 11 00:53:12 crc kubenswrapper[4743]: I1011 00:53:12.091929 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 00:53:12 crc kubenswrapper[4743]: I1011 00:53:12.091954 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cb5z5" Oct 11 00:53:12 crc kubenswrapper[4743]: I1011 00:53:12.092055 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 00:53:12 crc kubenswrapper[4743]: E1011 00:53:12.092180 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 11 00:53:12 crc kubenswrapper[4743]: E1011 00:53:12.092319 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 11 00:53:12 crc kubenswrapper[4743]: E1011 00:53:12.092424 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cb5z5" podUID="b02b8636-a5c4-447d-b1cf-401b3dcfa02b" Oct 11 00:53:12 crc kubenswrapper[4743]: I1011 00:53:12.106459 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:12 crc kubenswrapper[4743]: I1011 00:53:12.106506 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:12 crc kubenswrapper[4743]: I1011 00:53:12.106517 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:12 crc kubenswrapper[4743]: I1011 00:53:12.106536 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:12 crc kubenswrapper[4743]: I1011 00:53:12.106549 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:12Z","lastTransitionTime":"2025-10-11T00:53:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:12 crc kubenswrapper[4743]: I1011 00:53:12.209328 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:12 crc kubenswrapper[4743]: I1011 00:53:12.209408 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:12 crc kubenswrapper[4743]: I1011 00:53:12.209427 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:12 crc kubenswrapper[4743]: I1011 00:53:12.209456 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:12 crc kubenswrapper[4743]: I1011 00:53:12.209481 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:12Z","lastTransitionTime":"2025-10-11T00:53:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:12 crc kubenswrapper[4743]: I1011 00:53:12.313115 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:12 crc kubenswrapper[4743]: I1011 00:53:12.313222 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:12 crc kubenswrapper[4743]: I1011 00:53:12.313259 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:12 crc kubenswrapper[4743]: I1011 00:53:12.313293 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:12 crc kubenswrapper[4743]: I1011 00:53:12.313318 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:12Z","lastTransitionTime":"2025-10-11T00:53:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:12 crc kubenswrapper[4743]: I1011 00:53:12.417062 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:12 crc kubenswrapper[4743]: I1011 00:53:12.417146 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:12 crc kubenswrapper[4743]: I1011 00:53:12.417169 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:12 crc kubenswrapper[4743]: I1011 00:53:12.417212 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:12 crc kubenswrapper[4743]: I1011 00:53:12.417237 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:12Z","lastTransitionTime":"2025-10-11T00:53:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:12 crc kubenswrapper[4743]: I1011 00:53:12.522908 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:12 crc kubenswrapper[4743]: I1011 00:53:12.522973 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:12 crc kubenswrapper[4743]: I1011 00:53:12.522995 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:12 crc kubenswrapper[4743]: I1011 00:53:12.523030 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:12 crc kubenswrapper[4743]: I1011 00:53:12.523054 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:12Z","lastTransitionTime":"2025-10-11T00:53:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:12 crc kubenswrapper[4743]: I1011 00:53:12.627201 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:12 crc kubenswrapper[4743]: I1011 00:53:12.627548 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:12 crc kubenswrapper[4743]: I1011 00:53:12.627696 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:12 crc kubenswrapper[4743]: I1011 00:53:12.627879 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:12 crc kubenswrapper[4743]: I1011 00:53:12.628029 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:12Z","lastTransitionTime":"2025-10-11T00:53:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:12 crc kubenswrapper[4743]: I1011 00:53:12.731447 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:12 crc kubenswrapper[4743]: I1011 00:53:12.731668 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:12 crc kubenswrapper[4743]: I1011 00:53:12.731796 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:12 crc kubenswrapper[4743]: I1011 00:53:12.732011 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:12 crc kubenswrapper[4743]: I1011 00:53:12.732169 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:12Z","lastTransitionTime":"2025-10-11T00:53:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:12 crc kubenswrapper[4743]: I1011 00:53:12.835279 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:12 crc kubenswrapper[4743]: I1011 00:53:12.835537 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:12 crc kubenswrapper[4743]: I1011 00:53:12.835674 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:12 crc kubenswrapper[4743]: I1011 00:53:12.835818 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:12 crc kubenswrapper[4743]: I1011 00:53:12.836049 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:12Z","lastTransitionTime":"2025-10-11T00:53:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:12 crc kubenswrapper[4743]: I1011 00:53:12.939245 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:12 crc kubenswrapper[4743]: I1011 00:53:12.939609 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:12 crc kubenswrapper[4743]: I1011 00:53:12.939768 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:12 crc kubenswrapper[4743]: I1011 00:53:12.939952 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:12 crc kubenswrapper[4743]: I1011 00:53:12.940103 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:12Z","lastTransitionTime":"2025-10-11T00:53:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:13 crc kubenswrapper[4743]: I1011 00:53:13.043911 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:13 crc kubenswrapper[4743]: I1011 00:53:13.043973 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:13 crc kubenswrapper[4743]: I1011 00:53:13.043990 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:13 crc kubenswrapper[4743]: I1011 00:53:13.044015 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:13 crc kubenswrapper[4743]: I1011 00:53:13.044032 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:13Z","lastTransitionTime":"2025-10-11T00:53:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:13 crc kubenswrapper[4743]: I1011 00:53:13.147694 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:13 crc kubenswrapper[4743]: I1011 00:53:13.147756 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:13 crc kubenswrapper[4743]: I1011 00:53:13.147775 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:13 crc kubenswrapper[4743]: I1011 00:53:13.147803 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:13 crc kubenswrapper[4743]: I1011 00:53:13.147821 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:13Z","lastTransitionTime":"2025-10-11T00:53:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:13 crc kubenswrapper[4743]: I1011 00:53:13.251456 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:13 crc kubenswrapper[4743]: I1011 00:53:13.251521 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:13 crc kubenswrapper[4743]: I1011 00:53:13.251539 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:13 crc kubenswrapper[4743]: I1011 00:53:13.251566 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:13 crc kubenswrapper[4743]: I1011 00:53:13.251585 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:13Z","lastTransitionTime":"2025-10-11T00:53:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:13 crc kubenswrapper[4743]: I1011 00:53:13.355761 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:13 crc kubenswrapper[4743]: I1011 00:53:13.355898 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:13 crc kubenswrapper[4743]: I1011 00:53:13.355923 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:13 crc kubenswrapper[4743]: I1011 00:53:13.355957 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:13 crc kubenswrapper[4743]: I1011 00:53:13.355977 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:13Z","lastTransitionTime":"2025-10-11T00:53:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:13 crc kubenswrapper[4743]: I1011 00:53:13.459526 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:13 crc kubenswrapper[4743]: I1011 00:53:13.459607 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:13 crc kubenswrapper[4743]: I1011 00:53:13.459632 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:13 crc kubenswrapper[4743]: I1011 00:53:13.459665 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:13 crc kubenswrapper[4743]: I1011 00:53:13.459688 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:13Z","lastTransitionTime":"2025-10-11T00:53:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:13 crc kubenswrapper[4743]: I1011 00:53:13.563083 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:13 crc kubenswrapper[4743]: I1011 00:53:13.563136 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:13 crc kubenswrapper[4743]: I1011 00:53:13.563158 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:13 crc kubenswrapper[4743]: I1011 00:53:13.563197 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:13 crc kubenswrapper[4743]: I1011 00:53:13.563248 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:13Z","lastTransitionTime":"2025-10-11T00:53:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:13 crc kubenswrapper[4743]: I1011 00:53:13.665986 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:13 crc kubenswrapper[4743]: I1011 00:53:13.666056 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:13 crc kubenswrapper[4743]: I1011 00:53:13.666076 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:13 crc kubenswrapper[4743]: I1011 00:53:13.666102 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:13 crc kubenswrapper[4743]: I1011 00:53:13.666121 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:13Z","lastTransitionTime":"2025-10-11T00:53:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:13 crc kubenswrapper[4743]: I1011 00:53:13.770205 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:13 crc kubenswrapper[4743]: I1011 00:53:13.770274 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:13 crc kubenswrapper[4743]: I1011 00:53:13.770294 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:13 crc kubenswrapper[4743]: I1011 00:53:13.770321 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:13 crc kubenswrapper[4743]: I1011 00:53:13.770340 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:13Z","lastTransitionTime":"2025-10-11T00:53:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:13 crc kubenswrapper[4743]: I1011 00:53:13.874013 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:13 crc kubenswrapper[4743]: I1011 00:53:13.874090 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:13 crc kubenswrapper[4743]: I1011 00:53:13.874111 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:13 crc kubenswrapper[4743]: I1011 00:53:13.874148 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:13 crc kubenswrapper[4743]: I1011 00:53:13.874171 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:13Z","lastTransitionTime":"2025-10-11T00:53:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:13 crc kubenswrapper[4743]: I1011 00:53:13.979840 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:13 crc kubenswrapper[4743]: I1011 00:53:13.979946 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:13 crc kubenswrapper[4743]: I1011 00:53:13.979964 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:13 crc kubenswrapper[4743]: I1011 00:53:13.979991 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:13 crc kubenswrapper[4743]: I1011 00:53:13.980010 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:13Z","lastTransitionTime":"2025-10-11T00:53:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:14 crc kubenswrapper[4743]: I1011 00:53:14.083215 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:14 crc kubenswrapper[4743]: I1011 00:53:14.083274 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:14 crc kubenswrapper[4743]: I1011 00:53:14.083291 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:14 crc kubenswrapper[4743]: I1011 00:53:14.083316 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:14 crc kubenswrapper[4743]: I1011 00:53:14.083338 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:14Z","lastTransitionTime":"2025-10-11T00:53:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:53:14 crc kubenswrapper[4743]: I1011 00:53:14.091440 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 00:53:14 crc kubenswrapper[4743]: I1011 00:53:14.091518 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 00:53:14 crc kubenswrapper[4743]: I1011 00:53:14.091544 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 00:53:14 crc kubenswrapper[4743]: I1011 00:53:14.091564 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cb5z5" Oct 11 00:53:14 crc kubenswrapper[4743]: E1011 00:53:14.092151 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 11 00:53:14 crc kubenswrapper[4743]: E1011 00:53:14.092338 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 11 00:53:14 crc kubenswrapper[4743]: E1011 00:53:14.092515 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cb5z5" podUID="b02b8636-a5c4-447d-b1cf-401b3dcfa02b" Oct 11 00:53:14 crc kubenswrapper[4743]: E1011 00:53:14.092611 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 11 00:53:14 crc kubenswrapper[4743]: I1011 00:53:14.186890 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:14 crc kubenswrapper[4743]: I1011 00:53:14.186998 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:14 crc kubenswrapper[4743]: I1011 00:53:14.187021 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:14 crc kubenswrapper[4743]: I1011 00:53:14.187048 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:14 crc kubenswrapper[4743]: I1011 00:53:14.187067 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:14Z","lastTransitionTime":"2025-10-11T00:53:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:14 crc kubenswrapper[4743]: I1011 00:53:14.290186 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:14 crc kubenswrapper[4743]: I1011 00:53:14.290250 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:14 crc kubenswrapper[4743]: I1011 00:53:14.290270 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:14 crc kubenswrapper[4743]: I1011 00:53:14.290296 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:14 crc kubenswrapper[4743]: I1011 00:53:14.290315 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:14Z","lastTransitionTime":"2025-10-11T00:53:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:14 crc kubenswrapper[4743]: I1011 00:53:14.394141 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:14 crc kubenswrapper[4743]: I1011 00:53:14.394207 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:14 crc kubenswrapper[4743]: I1011 00:53:14.394225 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:14 crc kubenswrapper[4743]: I1011 00:53:14.394252 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:14 crc kubenswrapper[4743]: I1011 00:53:14.394270 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:14Z","lastTransitionTime":"2025-10-11T00:53:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:14 crc kubenswrapper[4743]: I1011 00:53:14.497657 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:14 crc kubenswrapper[4743]: I1011 00:53:14.497724 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:14 crc kubenswrapper[4743]: I1011 00:53:14.497741 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:14 crc kubenswrapper[4743]: I1011 00:53:14.497768 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:14 crc kubenswrapper[4743]: I1011 00:53:14.497787 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:14Z","lastTransitionTime":"2025-10-11T00:53:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:14 crc kubenswrapper[4743]: I1011 00:53:14.601363 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:14 crc kubenswrapper[4743]: I1011 00:53:14.601422 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:14 crc kubenswrapper[4743]: I1011 00:53:14.601438 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:14 crc kubenswrapper[4743]: I1011 00:53:14.601463 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:14 crc kubenswrapper[4743]: I1011 00:53:14.601481 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:14Z","lastTransitionTime":"2025-10-11T00:53:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:14 crc kubenswrapper[4743]: I1011 00:53:14.705390 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:14 crc kubenswrapper[4743]: I1011 00:53:14.705524 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:14 crc kubenswrapper[4743]: I1011 00:53:14.705550 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:14 crc kubenswrapper[4743]: I1011 00:53:14.705575 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:14 crc kubenswrapper[4743]: I1011 00:53:14.705592 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:14Z","lastTransitionTime":"2025-10-11T00:53:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:14 crc kubenswrapper[4743]: I1011 00:53:14.808967 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:14 crc kubenswrapper[4743]: I1011 00:53:14.809009 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:14 crc kubenswrapper[4743]: I1011 00:53:14.809021 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:14 crc kubenswrapper[4743]: I1011 00:53:14.809040 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:14 crc kubenswrapper[4743]: I1011 00:53:14.809057 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:14Z","lastTransitionTime":"2025-10-11T00:53:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:14 crc kubenswrapper[4743]: I1011 00:53:14.912511 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:14 crc kubenswrapper[4743]: I1011 00:53:14.912585 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:14 crc kubenswrapper[4743]: I1011 00:53:14.912609 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:14 crc kubenswrapper[4743]: I1011 00:53:14.912641 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:14 crc kubenswrapper[4743]: I1011 00:53:14.912662 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:14Z","lastTransitionTime":"2025-10-11T00:53:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:15 crc kubenswrapper[4743]: I1011 00:53:15.016052 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:15 crc kubenswrapper[4743]: I1011 00:53:15.016142 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:15 crc kubenswrapper[4743]: I1011 00:53:15.016169 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:15 crc kubenswrapper[4743]: I1011 00:53:15.016202 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:15 crc kubenswrapper[4743]: I1011 00:53:15.016232 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:15Z","lastTransitionTime":"2025-10-11T00:53:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:15 crc kubenswrapper[4743]: I1011 00:53:15.093250 4743 scope.go:117] "RemoveContainer" containerID="268a98616dcd60ad334eba93f4b3443073f8a901429f6b98e7f89c2740c81da6" Oct 11 00:53:15 crc kubenswrapper[4743]: E1011 00:53:15.093700 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-48ljj_openshift-ovn-kubernetes(9ed16b35-862f-47f2-9e32-63c98f868fb8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" podUID="9ed16b35-862f-47f2-9e32-63c98f868fb8" Oct 11 00:53:15 crc kubenswrapper[4743]: I1011 00:53:15.117904 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Oct 11 00:53:15 crc kubenswrapper[4743]: I1011 00:53:15.120919 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:15 crc kubenswrapper[4743]: I1011 00:53:15.120974 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:15 crc kubenswrapper[4743]: I1011 00:53:15.120994 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:15 crc kubenswrapper[4743]: I1011 00:53:15.121021 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:15 crc kubenswrapper[4743]: I1011 00:53:15.121044 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:15Z","lastTransitionTime":"2025-10-11T00:53:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:15 crc kubenswrapper[4743]: I1011 00:53:15.224959 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:15 crc kubenswrapper[4743]: I1011 00:53:15.225047 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:15 crc kubenswrapper[4743]: I1011 00:53:15.225069 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:15 crc kubenswrapper[4743]: I1011 00:53:15.225102 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:15 crc kubenswrapper[4743]: I1011 00:53:15.225126 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:15Z","lastTransitionTime":"2025-10-11T00:53:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:15 crc kubenswrapper[4743]: I1011 00:53:15.328931 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:15 crc kubenswrapper[4743]: I1011 00:53:15.329003 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:15 crc kubenswrapper[4743]: I1011 00:53:15.329021 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:15 crc kubenswrapper[4743]: I1011 00:53:15.329049 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:15 crc kubenswrapper[4743]: I1011 00:53:15.329069 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:15Z","lastTransitionTime":"2025-10-11T00:53:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:15 crc kubenswrapper[4743]: I1011 00:53:15.433470 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:15 crc kubenswrapper[4743]: I1011 00:53:15.433553 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:15 crc kubenswrapper[4743]: I1011 00:53:15.433578 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:15 crc kubenswrapper[4743]: I1011 00:53:15.433615 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:15 crc kubenswrapper[4743]: I1011 00:53:15.433646 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:15Z","lastTransitionTime":"2025-10-11T00:53:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:15 crc kubenswrapper[4743]: I1011 00:53:15.536604 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:15 crc kubenswrapper[4743]: I1011 00:53:15.536673 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:15 crc kubenswrapper[4743]: I1011 00:53:15.536696 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:15 crc kubenswrapper[4743]: I1011 00:53:15.536731 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:15 crc kubenswrapper[4743]: I1011 00:53:15.536754 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:15Z","lastTransitionTime":"2025-10-11T00:53:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:15 crc kubenswrapper[4743]: I1011 00:53:15.640248 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:15 crc kubenswrapper[4743]: I1011 00:53:15.640326 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:15 crc kubenswrapper[4743]: I1011 00:53:15.640344 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:15 crc kubenswrapper[4743]: I1011 00:53:15.640377 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:15 crc kubenswrapper[4743]: I1011 00:53:15.640398 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:15Z","lastTransitionTime":"2025-10-11T00:53:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:15 crc kubenswrapper[4743]: I1011 00:53:15.743540 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:15 crc kubenswrapper[4743]: I1011 00:53:15.743606 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:15 crc kubenswrapper[4743]: I1011 00:53:15.743629 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:15 crc kubenswrapper[4743]: I1011 00:53:15.743659 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:15 crc kubenswrapper[4743]: I1011 00:53:15.743705 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:15Z","lastTransitionTime":"2025-10-11T00:53:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:15 crc kubenswrapper[4743]: I1011 00:53:15.848267 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:15 crc kubenswrapper[4743]: I1011 00:53:15.848319 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:15 crc kubenswrapper[4743]: I1011 00:53:15.848329 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:15 crc kubenswrapper[4743]: I1011 00:53:15.848351 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:15 crc kubenswrapper[4743]: I1011 00:53:15.848371 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:15Z","lastTransitionTime":"2025-10-11T00:53:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:15 crc kubenswrapper[4743]: I1011 00:53:15.951182 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:15 crc kubenswrapper[4743]: I1011 00:53:15.951250 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:15 crc kubenswrapper[4743]: I1011 00:53:15.951273 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:15 crc kubenswrapper[4743]: I1011 00:53:15.951305 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:15 crc kubenswrapper[4743]: I1011 00:53:15.951325 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:15Z","lastTransitionTime":"2025-10-11T00:53:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:16 crc kubenswrapper[4743]: I1011 00:53:16.054695 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:16 crc kubenswrapper[4743]: I1011 00:53:16.054750 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:16 crc kubenswrapper[4743]: I1011 00:53:16.054765 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:16 crc kubenswrapper[4743]: I1011 00:53:16.054785 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:16 crc kubenswrapper[4743]: I1011 00:53:16.054796 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:16Z","lastTransitionTime":"2025-10-11T00:53:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:53:16 crc kubenswrapper[4743]: I1011 00:53:16.091325 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 00:53:16 crc kubenswrapper[4743]: I1011 00:53:16.091467 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 00:53:16 crc kubenswrapper[4743]: I1011 00:53:16.091350 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cb5z5" Oct 11 00:53:16 crc kubenswrapper[4743]: E1011 00:53:16.091556 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 11 00:53:16 crc kubenswrapper[4743]: I1011 00:53:16.091671 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 00:53:16 crc kubenswrapper[4743]: E1011 00:53:16.091791 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 11 00:53:16 crc kubenswrapper[4743]: E1011 00:53:16.091972 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cb5z5" podUID="b02b8636-a5c4-447d-b1cf-401b3dcfa02b" Oct 11 00:53:16 crc kubenswrapper[4743]: E1011 00:53:16.092080 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 11 00:53:16 crc kubenswrapper[4743]: I1011 00:53:16.114386 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:16Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:16 crc kubenswrapper[4743]: I1011 00:53:16.135364 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd049f7c9c78e28d7cfdeb7087cf419e5db915fecd46a83ae0bdc8317fff9728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-11T00:53:16Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:16 crc kubenswrapper[4743]: I1011 00:53:16.156037 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:16Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:16 crc kubenswrapper[4743]: I1011 00:53:16.158914 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:16 crc kubenswrapper[4743]: I1011 00:53:16.158966 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:16 crc kubenswrapper[4743]: I1011 00:53:16.158984 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:16 crc kubenswrapper[4743]: I1011 00:53:16.159009 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:16 crc kubenswrapper[4743]: I1011 00:53:16.159026 4743 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:16Z","lastTransitionTime":"2025-10-11T00:53:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:53:16 crc kubenswrapper[4743]: I1011 00:53:16.173944 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"add92263-e252-446b-95de-092585b4357f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be9668b466043d0e86521a2ea925416a4987cb051ca0ab8764fd0fd3095991f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d42sg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16b18bcf80747537e2a7a31adc90eec7fdba85d8166f8cfc53709a1dba33de8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d42sg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cvm72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-11T00:53:16Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:16 crc kubenswrapper[4743]: I1011 00:53:16.198195 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6wcnk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06a7b971-8779-491c-8d3f-e7d5b4d60968\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38b5a40ddd5390cc392fcf2240304a72163dd797511640ada5ad39a59c0f4283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8adfcc8627f1a6ce588d50eabcc56c6814c3cff75d50144def98466d2dabb69a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8adfcc8627f1a6ce588d50eabcc56c6814c3cff75d50144def98466d2dabb69a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://800cdd03b73991fe9c1cfbe9a392b5f08854ca2f5f9e5106022bb67307578ad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"
terminated\\\":{\\\"containerID\\\":\\\"cri-o://800cdd03b73991fe9c1cfbe9a392b5f08854ca2f5f9e5106022bb67307578ad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49661bf59f8e1d64e5980f0ed00dcc50c4c2ab52cc2232ade890883f0d0158d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49661bf59f8e1d64e5980f0ed00dcc50c4c2ab52cc2232ade890883f0d0158d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27d3d6d9f65eb7b710148bc4e077c74a15207b637727163afe631acfe3ec90f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27d3d6d9f65eb7b710148bc4e077c74a15207b637727163afe631acfe3ec90f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00edafd52ba2a97f21af491b5233bc2ad945e6b77106eb94a947b777a0129cfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,
\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00edafd52ba2a97f21af491b5233bc2ad945e6b77106eb94a947b777a0129cfb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://276d1e264f541d55720a57945800ec969959c0828071dcda4fb1c0f13b17118b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://276d1e264f541d55720a57945800ec969959c0828071dcda4fb1c0f13b17118b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jkrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6wcnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:16Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:16 crc kubenswrapper[4743]: I1011 00:53:16.232179 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ed16b35-862f-47f2-9e32-63c98f868fb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d54b05ec17db8c8c6c3ff8e4d703de8f1f1f0d7e9b5b0133afdf30e3c2b8acee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4127195cac8b12296c2aaefc500beb5f746bee518f3885c094e7bca83ade5e6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://487448bc7b1826df7dbc3aeac42ad10d6aed3390c27c95dfda1b81fe09e27f35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acfe8e798b39769fe2643c6f761b388dc220ccfc707f0023dd680196514b8885\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17db7cfee85ea1051eba348aa0ec7bc8e1354c49e3e4740745a51d3db0ffcf63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d385e354cb831573c89886ade9127287f0974b7a7c244ead9e5547e0211fd9fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://268a98616dcd60ad334eba93f4b3443073f8a901429f6b98e7f89c2740c81da6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://268a98616dcd60ad334eba93f4b3443073f8a901429f6b98e7f89c2740c81da6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-11T00:53:02Z\\\",\\\"message\\\":\\\"== {73135118-cf1b-4568-bd31-2f50308bf69d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e 
Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1011 00:53:02.211217 6723 model_client.go:382] Update operations generated as: [{Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.4 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {43933d5e-3c3b-4ff8-8926-04ac25de450e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1011 00:53:02.211203 6723 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T00:53:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-48ljj_openshift-ovn-kubernetes(9ed16b35-862f-47f2-9e32-63c98f868fb8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecec5075382354c47a6ec8f53263fe6c244c7aadfeab1dda60dd2e88cc8724b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f300eaf5d1c09f3637
f07c6d71c96b49e955d8319841834440872f13c061e736\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xw92h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-48ljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:16Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:16 crc kubenswrapper[4743]: I1011 00:53:16.252531 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab5ac12d-37cf-48f3-bb33-b93b4096c706\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4335571969a3cb66f2fefdad63b77b4a31fc2631aba5ba427b2af4db8b6c6f0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://895152fd89f4c35c7d379d3a93c1b5f185275cd151d27b442b8f06280f3f74a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2bc6f9e58ae313b026b5597d2569b343931bc066ac8b3751c62f39c8d849bf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3532ffcf956621ad56477cbb9ff70c2855091a6ad996d3a15fa3c9a28943cea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://c3532ffcf956621ad56477cbb9ff70c2855091a6ad996d3a15fa3c9a28943cea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:16Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:16 crc kubenswrapper[4743]: I1011 00:53:16.262118 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:16 crc kubenswrapper[4743]: I1011 00:53:16.262208 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:16 crc kubenswrapper[4743]: I1011 00:53:16.262232 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:16 crc kubenswrapper[4743]: I1011 00:53:16.262295 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:16 crc kubenswrapper[4743]: I1011 00:53:16.262319 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:16Z","lastTransitionTime":"2025-10-11T00:53:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:16 crc kubenswrapper[4743]: I1011 00:53:16.266988 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9427484d-7bed-49b3-ab70-55c84fb02a93\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://496005e7eab2c6d4e670411965f6761742df49342e67085345d2a0bc7edb484c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceb1b6fd70dcc7f3acfd121861698011e37b5edd0a022432ed140d9508ed1f1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ceb1b6fd70dcc7f3acfd121861698011e37b5edd0a022432ed140d9508ed1f1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:16Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:16 crc kubenswrapper[4743]: I1011 00:53:16.300600 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"83857db2-dda0-4b62-9336-3393a2c23f3c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a46e8adc9aa9255991b607e98eaf6fc411aeb2a6b0edf45e3e9a7935a2eae5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97b59a1efc62279df4a9766fae0e700a435812160cf3a40a915c967f7a99c9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe9d2bc75aa162a0cab825a1f13cf35f20491a0317000d5a9775977fb7f6b556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf3820ae4ab59d420d2e7f001b2953f751c6dd75f90b5fcc76fdaee8c8df722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f14caad841305b82f04cf14411e2308edbcdd8ea2261ff8401edc95900855ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://818ca392b918c66a8ca013f3ec5b938595fc78b726af7ae31524c09dbe9302f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://818ca392b918c66a8ca013f3ec5b938595fc78b726af7ae31524c09dbe9302f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44506333e8bd3a20777cf0c382b7652d6d3fa1cdc13f056412cf4899e25115e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44506333e8bd3a20777cf0c382b7652d6d3fa1cdc13f056412cf4899e25115e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e870a1289513fe9e864779c13e63c0991fdaded247b35ac75b7810ab27683ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e870a1289513fe9e864779c13e63c0991fdaded247b35ac75b7810ab27683ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:16Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:16 crc kubenswrapper[4743]: I1011 00:53:16.317026 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a657286e-3a6d-4265-ae33-a7d2ce8b64ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1558bde7a3b8b01c9ce7f6907ecc43abc6e27587236be7b41487968effa33bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2616059cd6d57ff41cdaa2e1b35e0e8b474398c434022ba6b9e926e19e64c0f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0392332521983f1ea8939df48e05618035ca97895122067daef61ee1fdb7af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b297361a1a2f13aa53d7a5a58948dbe86e8e00f888e6a69cc884154828c29a41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://530c39a66f06ca33cba6a3ab75cb68148425f5a732b503d054b6c74a9d48fabf\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-11T00:52:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1011 00:51:59.674497 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1011 00:51:59.678201 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3177302691/tls.crt::/tmp/serving-cert-3177302691/tls.key\\\\\\\"\\\\nI1011 00:52:05.851548 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1011 00:52:05.855373 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1011 00:52:05.855414 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1011 00:52:05.855455 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1011 00:52:05.855472 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1011 00:52:05.866907 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1011 00:52:05.866935 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 00:52:05.866939 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1011 00:52:05.866944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1011 00:52:05.866947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1011 00:52:05.866951 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1011 00:52:05.866954 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1011 00:52:05.866970 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1011 00:52:05.870518 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1c0227825fa743bb310a1dfb08333cea5ee16a2a2f30b77c97e2f28a2319542\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d411bef56dd0ccdad159fc018c15ad4f0df6adee56150b1143e2dcf442a0fe18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d411bef56dd0ccdad159fc018c15ad4f0df6adee56150b1143e2dcf442a0fe18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-11T00:51:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:16Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:16 crc kubenswrapper[4743]: I1011 00:53:16.332288 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vlxgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d677d122-c8be-4938-8d2c-bde4a088a63a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dbaee9cca1154a094e092eaaf1dea2aeb6a61fbf9323611275485f75ea911e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95k6c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vlxgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:16Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:16 crc kubenswrapper[4743]: I1011 00:53:16.345056 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-t9nsf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f07b1e57-3c09-4e75-866d-a4292db4e151\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc297cc894f36c7afa9ce15a8e9c71b2d7bf66042c55d0c7686f20b6a3ece7be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c97lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:14Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-t9nsf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:16Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:16 crc kubenswrapper[4743]: I1011 00:53:16.366037 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:16 crc kubenswrapper[4743]: I1011 00:53:16.366279 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:16 crc kubenswrapper[4743]: I1011 00:53:16.366371 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:16 crc kubenswrapper[4743]: I1011 00:53:16.366458 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:16 crc kubenswrapper[4743]: I1011 00:53:16.366539 4743 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:16Z","lastTransitionTime":"2025-10-11T00:53:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:53:16 crc kubenswrapper[4743]: I1011 00:53:16.366425 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cd23d7ad070c4d2f029567ac2186d797b082b8da8ff2c9811982a1c5c7b0ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:16Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:16 crc kubenswrapper[4743]: I1011 00:53:16.389830 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bca7230a67c579c0fc54921d292f52e18ce2c25f79cd4a351a9f6b576df3553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab9cd20bc5b28791e21984a0edf4fbf50acc05d6160780d0696698715242b5d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:16Z is after 
2025-08-24T17:21:41Z" Oct 11 00:53:16 crc kubenswrapper[4743]: I1011 00:53:16.409048 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ptjnt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cb3d634-b381-4ee0-a819-ce6f87fa8afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c47ab12f326ce6568bfd06d8beebdeffc07d464b747a288a14793db096ccc6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"moun
tPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4dnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b62556627f75a8ed41c68a7fb1982461fd9fb35965014b72ec15492e763e42b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4dnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ptjnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:16Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:16 crc kubenswrapper[4743]: I1011 00:53:16.433283 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cb5z5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b02b8636-a5c4-447d-b1cf-401b3dcfa02b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2zdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2zdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:22Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cb5z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:16Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:16 crc 
kubenswrapper[4743]: I1011 00:53:16.453054 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc432b8f-1fc8-4b7d-a202-bf1452921fc3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45ed857549a3f84b624b8bf411ac4359bad74bbf8eea28bda8ebeda3b12dd34e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://748faceccab50f961795c16fb7c31665af75d570094d1dd2c765bb2215b65c31\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc8d989925344208a60f51034c7c486de0165d5c5a9a1f966df45ac5685c89f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34de48136b54818c60eb929d4b5374689378190620a6502448759f172856d6af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:51:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:16Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:16 crc kubenswrapper[4743]: I1011 00:53:16.469476 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:16 crc kubenswrapper[4743]: I1011 00:53:16.469641 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:16 crc kubenswrapper[4743]: I1011 00:53:16.469723 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:16 crc kubenswrapper[4743]: I1011 00:53:16.469810 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:16 crc kubenswrapper[4743]: I1011 00:53:16.469927 4743 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:16Z","lastTransitionTime":"2025-10-11T00:53:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:53:16 crc kubenswrapper[4743]: I1011 00:53:16.475001 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:06Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:16Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:16 crc kubenswrapper[4743]: I1011 00:53:16.497368 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9jfxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8c603f4-717c-4554-992a-8338b3bef24d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-11T00:52:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21793d4ae38fc6e714912dbedbba8c45e37482a03cbd76d10461c41851e16896\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://853ca9eee244aad8c7006298d6e541ea07022917bee14083b2ded233009257ad\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-11T00:52:55Z\\\",\\\"message\\\":\\\"2025-10-11T00:52:10+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_bf700974-1b5c-49f8-b8b9-3e4987959237\\\\n2025-10-11T00:52:10+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_bf700974-1b5c-49f8-b8b9-3e4987959237 to /host/opt/cni/bin/\\\\n2025-10-11T00:52:10Z [verbose] multus-daemon started\\\\n2025-10-11T00:52:10Z [verbose] 
Readiness Indicator file check\\\\n2025-10-11T00:52:55Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-11T00:52:09Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-11T00:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5b2pr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-11T00:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9jfxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:16Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:16 crc kubenswrapper[4743]: I1011 00:53:16.574126 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:16 crc kubenswrapper[4743]: I1011 00:53:16.574177 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:16 crc kubenswrapper[4743]: I1011 00:53:16.574196 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:16 crc kubenswrapper[4743]: I1011 00:53:16.574223 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:16 crc kubenswrapper[4743]: I1011 00:53:16.574243 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:16Z","lastTransitionTime":"2025-10-11T00:53:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:16 crc kubenswrapper[4743]: I1011 00:53:16.677579 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:16 crc kubenswrapper[4743]: I1011 00:53:16.677650 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:16 crc kubenswrapper[4743]: I1011 00:53:16.677673 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:16 crc kubenswrapper[4743]: I1011 00:53:16.677706 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:16 crc kubenswrapper[4743]: I1011 00:53:16.677732 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:16Z","lastTransitionTime":"2025-10-11T00:53:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:16 crc kubenswrapper[4743]: I1011 00:53:16.780758 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:16 crc kubenswrapper[4743]: I1011 00:53:16.780801 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:16 crc kubenswrapper[4743]: I1011 00:53:16.780812 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:16 crc kubenswrapper[4743]: I1011 00:53:16.780830 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:16 crc kubenswrapper[4743]: I1011 00:53:16.780842 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:16Z","lastTransitionTime":"2025-10-11T00:53:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:16 crc kubenswrapper[4743]: I1011 00:53:16.884727 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:16 crc kubenswrapper[4743]: I1011 00:53:16.884786 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:16 crc kubenswrapper[4743]: I1011 00:53:16.884803 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:16 crc kubenswrapper[4743]: I1011 00:53:16.884829 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:16 crc kubenswrapper[4743]: I1011 00:53:16.884849 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:16Z","lastTransitionTime":"2025-10-11T00:53:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:16 crc kubenswrapper[4743]: I1011 00:53:16.987622 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:16 crc kubenswrapper[4743]: I1011 00:53:16.987693 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:16 crc kubenswrapper[4743]: I1011 00:53:16.987722 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:16 crc kubenswrapper[4743]: I1011 00:53:16.987753 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:16 crc kubenswrapper[4743]: I1011 00:53:16.987775 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:16Z","lastTransitionTime":"2025-10-11T00:53:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:17 crc kubenswrapper[4743]: I1011 00:53:17.090928 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:17 crc kubenswrapper[4743]: I1011 00:53:17.090996 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:17 crc kubenswrapper[4743]: I1011 00:53:17.091021 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:17 crc kubenswrapper[4743]: I1011 00:53:17.091049 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:17 crc kubenswrapper[4743]: I1011 00:53:17.091069 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:17Z","lastTransitionTime":"2025-10-11T00:53:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:17 crc kubenswrapper[4743]: I1011 00:53:17.194129 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:17 crc kubenswrapper[4743]: I1011 00:53:17.194191 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:17 crc kubenswrapper[4743]: I1011 00:53:17.194208 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:17 crc kubenswrapper[4743]: I1011 00:53:17.194232 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:17 crc kubenswrapper[4743]: I1011 00:53:17.194251 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:17Z","lastTransitionTime":"2025-10-11T00:53:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:17 crc kubenswrapper[4743]: I1011 00:53:17.296918 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:17 crc kubenswrapper[4743]: I1011 00:53:17.297012 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:17 crc kubenswrapper[4743]: I1011 00:53:17.297062 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:17 crc kubenswrapper[4743]: I1011 00:53:17.297087 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:17 crc kubenswrapper[4743]: I1011 00:53:17.297104 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:17Z","lastTransitionTime":"2025-10-11T00:53:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:17 crc kubenswrapper[4743]: I1011 00:53:17.399723 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:17 crc kubenswrapper[4743]: I1011 00:53:17.400258 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:17 crc kubenswrapper[4743]: I1011 00:53:17.400557 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:17 crc kubenswrapper[4743]: I1011 00:53:17.400761 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:17 crc kubenswrapper[4743]: I1011 00:53:17.401002 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:17Z","lastTransitionTime":"2025-10-11T00:53:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:17 crc kubenswrapper[4743]: I1011 00:53:17.505110 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:17 crc kubenswrapper[4743]: I1011 00:53:17.505189 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:17 crc kubenswrapper[4743]: I1011 00:53:17.505207 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:17 crc kubenswrapper[4743]: I1011 00:53:17.505236 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:17 crc kubenswrapper[4743]: I1011 00:53:17.505257 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:17Z","lastTransitionTime":"2025-10-11T00:53:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:17 crc kubenswrapper[4743]: I1011 00:53:17.608159 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:17 crc kubenswrapper[4743]: I1011 00:53:17.608218 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:17 crc kubenswrapper[4743]: I1011 00:53:17.608236 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:17 crc kubenswrapper[4743]: I1011 00:53:17.608262 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:17 crc kubenswrapper[4743]: I1011 00:53:17.608282 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:17Z","lastTransitionTime":"2025-10-11T00:53:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:17 crc kubenswrapper[4743]: I1011 00:53:17.711143 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:17 crc kubenswrapper[4743]: I1011 00:53:17.711207 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:17 crc kubenswrapper[4743]: I1011 00:53:17.711224 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:17 crc kubenswrapper[4743]: I1011 00:53:17.711245 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:17 crc kubenswrapper[4743]: I1011 00:53:17.711264 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:17Z","lastTransitionTime":"2025-10-11T00:53:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:17 crc kubenswrapper[4743]: I1011 00:53:17.814206 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:17 crc kubenswrapper[4743]: I1011 00:53:17.814273 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:17 crc kubenswrapper[4743]: I1011 00:53:17.814297 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:17 crc kubenswrapper[4743]: I1011 00:53:17.814329 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:17 crc kubenswrapper[4743]: I1011 00:53:17.814352 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:17Z","lastTransitionTime":"2025-10-11T00:53:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:17 crc kubenswrapper[4743]: I1011 00:53:17.917276 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:17 crc kubenswrapper[4743]: I1011 00:53:17.917321 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:17 crc kubenswrapper[4743]: I1011 00:53:17.917337 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:17 crc kubenswrapper[4743]: I1011 00:53:17.917358 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:17 crc kubenswrapper[4743]: I1011 00:53:17.917374 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:17Z","lastTransitionTime":"2025-10-11T00:53:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:18 crc kubenswrapper[4743]: I1011 00:53:18.020036 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:18 crc kubenswrapper[4743]: I1011 00:53:18.020088 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:18 crc kubenswrapper[4743]: I1011 00:53:18.020106 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:18 crc kubenswrapper[4743]: I1011 00:53:18.020146 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:18 crc kubenswrapper[4743]: I1011 00:53:18.020164 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:18Z","lastTransitionTime":"2025-10-11T00:53:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:53:18 crc kubenswrapper[4743]: I1011 00:53:18.091090 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 00:53:18 crc kubenswrapper[4743]: I1011 00:53:18.091088 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 00:53:18 crc kubenswrapper[4743]: I1011 00:53:18.091221 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cb5z5" Oct 11 00:53:18 crc kubenswrapper[4743]: I1011 00:53:18.091286 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 00:53:18 crc kubenswrapper[4743]: E1011 00:53:18.091647 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 11 00:53:18 crc kubenswrapper[4743]: E1011 00:53:18.091651 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 11 00:53:18 crc kubenswrapper[4743]: E1011 00:53:18.091792 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 11 00:53:18 crc kubenswrapper[4743]: E1011 00:53:18.091820 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cb5z5" podUID="b02b8636-a5c4-447d-b1cf-401b3dcfa02b" Oct 11 00:53:18 crc kubenswrapper[4743]: I1011 00:53:18.122266 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:18 crc kubenswrapper[4743]: I1011 00:53:18.122330 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:18 crc kubenswrapper[4743]: I1011 00:53:18.122347 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:18 crc kubenswrapper[4743]: I1011 00:53:18.122373 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:18 crc kubenswrapper[4743]: I1011 00:53:18.122392 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:18Z","lastTransitionTime":"2025-10-11T00:53:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:18 crc kubenswrapper[4743]: I1011 00:53:18.225198 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:18 crc kubenswrapper[4743]: I1011 00:53:18.225261 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:18 crc kubenswrapper[4743]: I1011 00:53:18.225277 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:18 crc kubenswrapper[4743]: I1011 00:53:18.225301 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:18 crc kubenswrapper[4743]: I1011 00:53:18.225318 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:18Z","lastTransitionTime":"2025-10-11T00:53:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:18 crc kubenswrapper[4743]: I1011 00:53:18.328485 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:18 crc kubenswrapper[4743]: I1011 00:53:18.328546 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:18 crc kubenswrapper[4743]: I1011 00:53:18.328565 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:18 crc kubenswrapper[4743]: I1011 00:53:18.328590 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:18 crc kubenswrapper[4743]: I1011 00:53:18.328608 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:18Z","lastTransitionTime":"2025-10-11T00:53:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:18 crc kubenswrapper[4743]: I1011 00:53:18.388471 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:18 crc kubenswrapper[4743]: I1011 00:53:18.388520 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:18 crc kubenswrapper[4743]: I1011 00:53:18.388532 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:18 crc kubenswrapper[4743]: I1011 00:53:18.388550 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:18 crc kubenswrapper[4743]: I1011 00:53:18.388561 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:18Z","lastTransitionTime":"2025-10-11T00:53:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:18 crc kubenswrapper[4743]: E1011 00:53:18.407714 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:53:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:53:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:53:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:53:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:53:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:53:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:53:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:53:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4a117022-09d2-46e0-826b-22308ec25890\\\",\\\"systemUUID\\\":\\\"407eb137-47d1-41e8-9c72-65f09e76d21a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:18Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:18 crc kubenswrapper[4743]: I1011 00:53:18.413095 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:18 crc kubenswrapper[4743]: I1011 00:53:18.413131 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:18 crc kubenswrapper[4743]: I1011 00:53:18.413140 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:18 crc kubenswrapper[4743]: I1011 00:53:18.413155 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:18 crc kubenswrapper[4743]: I1011 00:53:18.413164 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:18Z","lastTransitionTime":"2025-10-11T00:53:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:18 crc kubenswrapper[4743]: E1011 00:53:18.430030 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:53:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:53:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:53:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:53:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:53:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:53:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:53:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:53:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4a117022-09d2-46e0-826b-22308ec25890\\\",\\\"systemUUID\\\":\\\"407eb137-47d1-41e8-9c72-65f09e76d21a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:18Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:18 crc kubenswrapper[4743]: I1011 00:53:18.434568 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:18 crc kubenswrapper[4743]: I1011 00:53:18.434609 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:18 crc kubenswrapper[4743]: I1011 00:53:18.434626 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:18 crc kubenswrapper[4743]: I1011 00:53:18.434648 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:18 crc kubenswrapper[4743]: I1011 00:53:18.434668 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:18Z","lastTransitionTime":"2025-10-11T00:53:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:18 crc kubenswrapper[4743]: E1011 00:53:18.454383 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:53:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:53:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:53:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:53:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:53:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:53:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:53:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:53:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4a117022-09d2-46e0-826b-22308ec25890\\\",\\\"systemUUID\\\":\\\"407eb137-47d1-41e8-9c72-65f09e76d21a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:18Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:18 crc kubenswrapper[4743]: I1011 00:53:18.458667 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:18 crc kubenswrapper[4743]: I1011 00:53:18.458727 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:18 crc kubenswrapper[4743]: I1011 00:53:18.458744 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:18 crc kubenswrapper[4743]: I1011 00:53:18.458769 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:18 crc kubenswrapper[4743]: I1011 00:53:18.458785 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:18Z","lastTransitionTime":"2025-10-11T00:53:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:18 crc kubenswrapper[4743]: E1011 00:53:18.479014 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:53:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:53:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:53:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:53:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:53:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:53:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:53:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:53:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4a117022-09d2-46e0-826b-22308ec25890\\\",\\\"systemUUID\\\":\\\"407eb137-47d1-41e8-9c72-65f09e76d21a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:18Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:18 crc kubenswrapper[4743]: I1011 00:53:18.484508 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:18 crc kubenswrapper[4743]: I1011 00:53:18.484550 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:18 crc kubenswrapper[4743]: I1011 00:53:18.484562 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:18 crc kubenswrapper[4743]: I1011 00:53:18.484582 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:18 crc kubenswrapper[4743]: I1011 00:53:18.484596 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:18Z","lastTransitionTime":"2025-10-11T00:53:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:18 crc kubenswrapper[4743]: E1011 00:53:18.503082 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:53:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:53:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:53:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:53:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:53:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:53:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-11T00:53:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-11T00:53:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4a117022-09d2-46e0-826b-22308ec25890\\\",\\\"systemUUID\\\":\\\"407eb137-47d1-41e8-9c72-65f09e76d21a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-11T00:53:18Z is after 2025-08-24T17:21:41Z" Oct 11 00:53:18 crc kubenswrapper[4743]: E1011 00:53:18.503306 4743 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 11 00:53:18 crc kubenswrapper[4743]: I1011 00:53:18.505665 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:18 crc kubenswrapper[4743]: I1011 00:53:18.505766 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:18 crc kubenswrapper[4743]: I1011 00:53:18.505833 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:18 crc kubenswrapper[4743]: I1011 00:53:18.505927 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:18 crc kubenswrapper[4743]: I1011 00:53:18.505956 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:18Z","lastTransitionTime":"2025-10-11T00:53:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:18 crc kubenswrapper[4743]: I1011 00:53:18.609359 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:18 crc kubenswrapper[4743]: I1011 00:53:18.609426 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:18 crc kubenswrapper[4743]: I1011 00:53:18.609445 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:18 crc kubenswrapper[4743]: I1011 00:53:18.609476 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:18 crc kubenswrapper[4743]: I1011 00:53:18.609497 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:18Z","lastTransitionTime":"2025-10-11T00:53:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:18 crc kubenswrapper[4743]: I1011 00:53:18.712065 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:18 crc kubenswrapper[4743]: I1011 00:53:18.712123 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:18 crc kubenswrapper[4743]: I1011 00:53:18.712141 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:18 crc kubenswrapper[4743]: I1011 00:53:18.712169 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:18 crc kubenswrapper[4743]: I1011 00:53:18.712187 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:18Z","lastTransitionTime":"2025-10-11T00:53:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:18 crc kubenswrapper[4743]: I1011 00:53:18.814590 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:18 crc kubenswrapper[4743]: I1011 00:53:18.814654 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:18 crc kubenswrapper[4743]: I1011 00:53:18.814671 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:18 crc kubenswrapper[4743]: I1011 00:53:18.814697 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:18 crc kubenswrapper[4743]: I1011 00:53:18.814714 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:18Z","lastTransitionTime":"2025-10-11T00:53:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:18 crc kubenswrapper[4743]: I1011 00:53:18.918465 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:18 crc kubenswrapper[4743]: I1011 00:53:18.918534 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:18 crc kubenswrapper[4743]: I1011 00:53:18.918552 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:18 crc kubenswrapper[4743]: I1011 00:53:18.918576 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:18 crc kubenswrapper[4743]: I1011 00:53:18.918594 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:18Z","lastTransitionTime":"2025-10-11T00:53:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:19 crc kubenswrapper[4743]: I1011 00:53:19.021688 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:19 crc kubenswrapper[4743]: I1011 00:53:19.021744 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:19 crc kubenswrapper[4743]: I1011 00:53:19.021760 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:19 crc kubenswrapper[4743]: I1011 00:53:19.021785 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:19 crc kubenswrapper[4743]: I1011 00:53:19.021803 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:19Z","lastTransitionTime":"2025-10-11T00:53:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:19 crc kubenswrapper[4743]: I1011 00:53:19.125669 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:19 crc kubenswrapper[4743]: I1011 00:53:19.125735 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:19 crc kubenswrapper[4743]: I1011 00:53:19.125759 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:19 crc kubenswrapper[4743]: I1011 00:53:19.125792 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:19 crc kubenswrapper[4743]: I1011 00:53:19.125814 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:19Z","lastTransitionTime":"2025-10-11T00:53:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:19 crc kubenswrapper[4743]: I1011 00:53:19.228540 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:19 crc kubenswrapper[4743]: I1011 00:53:19.228605 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:19 crc kubenswrapper[4743]: I1011 00:53:19.228628 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:19 crc kubenswrapper[4743]: I1011 00:53:19.228660 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:19 crc kubenswrapper[4743]: I1011 00:53:19.228679 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:19Z","lastTransitionTime":"2025-10-11T00:53:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:19 crc kubenswrapper[4743]: I1011 00:53:19.331927 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:19 crc kubenswrapper[4743]: I1011 00:53:19.331988 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:19 crc kubenswrapper[4743]: I1011 00:53:19.332011 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:19 crc kubenswrapper[4743]: I1011 00:53:19.332043 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:19 crc kubenswrapper[4743]: I1011 00:53:19.332067 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:19Z","lastTransitionTime":"2025-10-11T00:53:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:19 crc kubenswrapper[4743]: I1011 00:53:19.435569 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:19 crc kubenswrapper[4743]: I1011 00:53:19.435631 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:19 crc kubenswrapper[4743]: I1011 00:53:19.435649 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:19 crc kubenswrapper[4743]: I1011 00:53:19.435681 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:19 crc kubenswrapper[4743]: I1011 00:53:19.435698 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:19Z","lastTransitionTime":"2025-10-11T00:53:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:19 crc kubenswrapper[4743]: I1011 00:53:19.538691 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:19 crc kubenswrapper[4743]: I1011 00:53:19.538756 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:19 crc kubenswrapper[4743]: I1011 00:53:19.538774 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:19 crc kubenswrapper[4743]: I1011 00:53:19.538802 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:19 crc kubenswrapper[4743]: I1011 00:53:19.538820 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:19Z","lastTransitionTime":"2025-10-11T00:53:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:19 crc kubenswrapper[4743]: I1011 00:53:19.641896 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:19 crc kubenswrapper[4743]: I1011 00:53:19.641968 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:19 crc kubenswrapper[4743]: I1011 00:53:19.641988 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:19 crc kubenswrapper[4743]: I1011 00:53:19.642014 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:19 crc kubenswrapper[4743]: I1011 00:53:19.642032 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:19Z","lastTransitionTime":"2025-10-11T00:53:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:19 crc kubenswrapper[4743]: I1011 00:53:19.746461 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:19 crc kubenswrapper[4743]: I1011 00:53:19.746514 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:19 crc kubenswrapper[4743]: I1011 00:53:19.746532 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:19 crc kubenswrapper[4743]: I1011 00:53:19.746567 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:19 crc kubenswrapper[4743]: I1011 00:53:19.746588 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:19Z","lastTransitionTime":"2025-10-11T00:53:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:19 crc kubenswrapper[4743]: I1011 00:53:19.850408 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:19 crc kubenswrapper[4743]: I1011 00:53:19.850517 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:19 crc kubenswrapper[4743]: I1011 00:53:19.850542 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:19 crc kubenswrapper[4743]: I1011 00:53:19.850575 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:19 crc kubenswrapper[4743]: I1011 00:53:19.850597 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:19Z","lastTransitionTime":"2025-10-11T00:53:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:19 crc kubenswrapper[4743]: I1011 00:53:19.953157 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:19 crc kubenswrapper[4743]: I1011 00:53:19.953211 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:19 crc kubenswrapper[4743]: I1011 00:53:19.953247 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:19 crc kubenswrapper[4743]: I1011 00:53:19.953265 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:19 crc kubenswrapper[4743]: I1011 00:53:19.953276 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:19Z","lastTransitionTime":"2025-10-11T00:53:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:20 crc kubenswrapper[4743]: I1011 00:53:20.056139 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:20 crc kubenswrapper[4743]: I1011 00:53:20.056190 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:20 crc kubenswrapper[4743]: I1011 00:53:20.056199 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:20 crc kubenswrapper[4743]: I1011 00:53:20.056214 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:20 crc kubenswrapper[4743]: I1011 00:53:20.056224 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:20Z","lastTransitionTime":"2025-10-11T00:53:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:53:20 crc kubenswrapper[4743]: I1011 00:53:20.091033 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 00:53:20 crc kubenswrapper[4743]: I1011 00:53:20.091085 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 00:53:20 crc kubenswrapper[4743]: E1011 00:53:20.091159 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 11 00:53:20 crc kubenswrapper[4743]: I1011 00:53:20.091033 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cb5z5" Oct 11 00:53:20 crc kubenswrapper[4743]: E1011 00:53:20.091316 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 11 00:53:20 crc kubenswrapper[4743]: E1011 00:53:20.091504 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cb5z5" podUID="b02b8636-a5c4-447d-b1cf-401b3dcfa02b" Oct 11 00:53:20 crc kubenswrapper[4743]: I1011 00:53:20.091537 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 00:53:20 crc kubenswrapper[4743]: E1011 00:53:20.091718 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 11 00:53:20 crc kubenswrapper[4743]: I1011 00:53:20.159003 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:20 crc kubenswrapper[4743]: I1011 00:53:20.159094 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:20 crc kubenswrapper[4743]: I1011 00:53:20.159110 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:20 crc kubenswrapper[4743]: I1011 00:53:20.159134 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:20 crc kubenswrapper[4743]: I1011 00:53:20.159151 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:20Z","lastTransitionTime":"2025-10-11T00:53:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:20 crc kubenswrapper[4743]: I1011 00:53:20.261819 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:20 crc kubenswrapper[4743]: I1011 00:53:20.262038 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:20 crc kubenswrapper[4743]: I1011 00:53:20.262058 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:20 crc kubenswrapper[4743]: I1011 00:53:20.262145 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:20 crc kubenswrapper[4743]: I1011 00:53:20.262172 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:20Z","lastTransitionTime":"2025-10-11T00:53:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:20 crc kubenswrapper[4743]: I1011 00:53:20.365111 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:20 crc kubenswrapper[4743]: I1011 00:53:20.365188 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:20 crc kubenswrapper[4743]: I1011 00:53:20.365205 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:20 crc kubenswrapper[4743]: I1011 00:53:20.365231 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:20 crc kubenswrapper[4743]: I1011 00:53:20.365248 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:20Z","lastTransitionTime":"2025-10-11T00:53:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:20 crc kubenswrapper[4743]: I1011 00:53:20.468518 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:20 crc kubenswrapper[4743]: I1011 00:53:20.468584 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:20 crc kubenswrapper[4743]: I1011 00:53:20.468606 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:20 crc kubenswrapper[4743]: I1011 00:53:20.468636 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:20 crc kubenswrapper[4743]: I1011 00:53:20.468663 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:20Z","lastTransitionTime":"2025-10-11T00:53:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:20 crc kubenswrapper[4743]: I1011 00:53:20.571458 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:20 crc kubenswrapper[4743]: I1011 00:53:20.571522 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:20 crc kubenswrapper[4743]: I1011 00:53:20.571554 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:20 crc kubenswrapper[4743]: I1011 00:53:20.571579 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:20 crc kubenswrapper[4743]: I1011 00:53:20.571595 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:20Z","lastTransitionTime":"2025-10-11T00:53:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:20 crc kubenswrapper[4743]: I1011 00:53:20.674932 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:20 crc kubenswrapper[4743]: I1011 00:53:20.674989 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:20 crc kubenswrapper[4743]: I1011 00:53:20.675018 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:20 crc kubenswrapper[4743]: I1011 00:53:20.675040 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:20 crc kubenswrapper[4743]: I1011 00:53:20.675055 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:20Z","lastTransitionTime":"2025-10-11T00:53:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:20 crc kubenswrapper[4743]: I1011 00:53:20.777832 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:20 crc kubenswrapper[4743]: I1011 00:53:20.777918 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:20 crc kubenswrapper[4743]: I1011 00:53:20.777937 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:20 crc kubenswrapper[4743]: I1011 00:53:20.777961 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:20 crc kubenswrapper[4743]: I1011 00:53:20.777977 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:20Z","lastTransitionTime":"2025-10-11T00:53:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:20 crc kubenswrapper[4743]: I1011 00:53:20.880307 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:20 crc kubenswrapper[4743]: I1011 00:53:20.880357 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:20 crc kubenswrapper[4743]: I1011 00:53:20.880376 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:20 crc kubenswrapper[4743]: I1011 00:53:20.880402 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:20 crc kubenswrapper[4743]: I1011 00:53:20.880422 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:20Z","lastTransitionTime":"2025-10-11T00:53:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:20 crc kubenswrapper[4743]: I1011 00:53:20.983774 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:20 crc kubenswrapper[4743]: I1011 00:53:20.983879 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:20 crc kubenswrapper[4743]: I1011 00:53:20.983903 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:20 crc kubenswrapper[4743]: I1011 00:53:20.983934 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:20 crc kubenswrapper[4743]: I1011 00:53:20.983952 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:20Z","lastTransitionTime":"2025-10-11T00:53:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:21 crc kubenswrapper[4743]: I1011 00:53:21.086752 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:21 crc kubenswrapper[4743]: I1011 00:53:21.086816 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:21 crc kubenswrapper[4743]: I1011 00:53:21.086838 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:21 crc kubenswrapper[4743]: I1011 00:53:21.086888 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:21 crc kubenswrapper[4743]: I1011 00:53:21.086906 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:21Z","lastTransitionTime":"2025-10-11T00:53:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:21 crc kubenswrapper[4743]: I1011 00:53:21.189408 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:21 crc kubenswrapper[4743]: I1011 00:53:21.189440 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:21 crc kubenswrapper[4743]: I1011 00:53:21.189451 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:21 crc kubenswrapper[4743]: I1011 00:53:21.189469 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:21 crc kubenswrapper[4743]: I1011 00:53:21.189480 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:21Z","lastTransitionTime":"2025-10-11T00:53:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:21 crc kubenswrapper[4743]: I1011 00:53:21.292214 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:21 crc kubenswrapper[4743]: I1011 00:53:21.292257 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:21 crc kubenswrapper[4743]: I1011 00:53:21.292273 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:21 crc kubenswrapper[4743]: I1011 00:53:21.292297 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:21 crc kubenswrapper[4743]: I1011 00:53:21.292313 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:21Z","lastTransitionTime":"2025-10-11T00:53:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:21 crc kubenswrapper[4743]: I1011 00:53:21.396215 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:21 crc kubenswrapper[4743]: I1011 00:53:21.396261 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:21 crc kubenswrapper[4743]: I1011 00:53:21.396278 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:21 crc kubenswrapper[4743]: I1011 00:53:21.396301 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:21 crc kubenswrapper[4743]: I1011 00:53:21.396317 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:21Z","lastTransitionTime":"2025-10-11T00:53:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:21 crc kubenswrapper[4743]: I1011 00:53:21.498586 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:21 crc kubenswrapper[4743]: I1011 00:53:21.498639 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:21 crc kubenswrapper[4743]: I1011 00:53:21.498656 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:21 crc kubenswrapper[4743]: I1011 00:53:21.498678 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:21 crc kubenswrapper[4743]: I1011 00:53:21.498697 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:21Z","lastTransitionTime":"2025-10-11T00:53:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:21 crc kubenswrapper[4743]: I1011 00:53:21.603162 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:21 crc kubenswrapper[4743]: I1011 00:53:21.603229 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:21 crc kubenswrapper[4743]: I1011 00:53:21.603248 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:21 crc kubenswrapper[4743]: I1011 00:53:21.603275 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:21 crc kubenswrapper[4743]: I1011 00:53:21.603293 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:21Z","lastTransitionTime":"2025-10-11T00:53:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:21 crc kubenswrapper[4743]: I1011 00:53:21.705935 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:21 crc kubenswrapper[4743]: I1011 00:53:21.705999 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:21 crc kubenswrapper[4743]: I1011 00:53:21.706020 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:21 crc kubenswrapper[4743]: I1011 00:53:21.706049 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:21 crc kubenswrapper[4743]: I1011 00:53:21.706066 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:21Z","lastTransitionTime":"2025-10-11T00:53:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:21 crc kubenswrapper[4743]: I1011 00:53:21.808842 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:21 crc kubenswrapper[4743]: I1011 00:53:21.808944 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:21 crc kubenswrapper[4743]: I1011 00:53:21.808964 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:21 crc kubenswrapper[4743]: I1011 00:53:21.808992 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:21 crc kubenswrapper[4743]: I1011 00:53:21.809012 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:21Z","lastTransitionTime":"2025-10-11T00:53:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:21 crc kubenswrapper[4743]: I1011 00:53:21.911492 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:21 crc kubenswrapper[4743]: I1011 00:53:21.911566 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:21 crc kubenswrapper[4743]: I1011 00:53:21.911589 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:21 crc kubenswrapper[4743]: I1011 00:53:21.911618 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:21 crc kubenswrapper[4743]: I1011 00:53:21.911638 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:21Z","lastTransitionTime":"2025-10-11T00:53:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:22 crc kubenswrapper[4743]: I1011 00:53:22.014253 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:22 crc kubenswrapper[4743]: I1011 00:53:22.014310 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:22 crc kubenswrapper[4743]: I1011 00:53:22.014352 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:22 crc kubenswrapper[4743]: I1011 00:53:22.014380 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:22 crc kubenswrapper[4743]: I1011 00:53:22.014397 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:22Z","lastTransitionTime":"2025-10-11T00:53:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:53:22 crc kubenswrapper[4743]: I1011 00:53:22.091574 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cb5z5" Oct 11 00:53:22 crc kubenswrapper[4743]: E1011 00:53:22.091782 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cb5z5" podUID="b02b8636-a5c4-447d-b1cf-401b3dcfa02b" Oct 11 00:53:22 crc kubenswrapper[4743]: I1011 00:53:22.092100 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 00:53:22 crc kubenswrapper[4743]: I1011 00:53:22.092158 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 00:53:22 crc kubenswrapper[4743]: E1011 00:53:22.092203 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 11 00:53:22 crc kubenswrapper[4743]: I1011 00:53:22.092272 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 00:53:22 crc kubenswrapper[4743]: E1011 00:53:22.092384 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 11 00:53:22 crc kubenswrapper[4743]: E1011 00:53:22.092530 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 11 00:53:22 crc kubenswrapper[4743]: I1011 00:53:22.116979 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:22 crc kubenswrapper[4743]: I1011 00:53:22.117050 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:22 crc kubenswrapper[4743]: I1011 00:53:22.117072 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:22 crc kubenswrapper[4743]: I1011 00:53:22.117099 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:22 crc kubenswrapper[4743]: I1011 00:53:22.117122 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:22Z","lastTransitionTime":"2025-10-11T00:53:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:22 crc kubenswrapper[4743]: I1011 00:53:22.220311 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:22 crc kubenswrapper[4743]: I1011 00:53:22.220674 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:22 crc kubenswrapper[4743]: I1011 00:53:22.220834 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:22 crc kubenswrapper[4743]: I1011 00:53:22.221060 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:22 crc kubenswrapper[4743]: I1011 00:53:22.221228 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:22Z","lastTransitionTime":"2025-10-11T00:53:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:22 crc kubenswrapper[4743]: I1011 00:53:22.324411 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:22 crc kubenswrapper[4743]: I1011 00:53:22.324463 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:22 crc kubenswrapper[4743]: I1011 00:53:22.324484 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:22 crc kubenswrapper[4743]: I1011 00:53:22.324511 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:22 crc kubenswrapper[4743]: I1011 00:53:22.324532 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:22Z","lastTransitionTime":"2025-10-11T00:53:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:22 crc kubenswrapper[4743]: I1011 00:53:22.427928 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:22 crc kubenswrapper[4743]: I1011 00:53:22.427979 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:22 crc kubenswrapper[4743]: I1011 00:53:22.427997 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:22 crc kubenswrapper[4743]: I1011 00:53:22.428023 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:22 crc kubenswrapper[4743]: I1011 00:53:22.428039 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:22Z","lastTransitionTime":"2025-10-11T00:53:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:22 crc kubenswrapper[4743]: I1011 00:53:22.531281 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:22 crc kubenswrapper[4743]: I1011 00:53:22.531340 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:22 crc kubenswrapper[4743]: I1011 00:53:22.531356 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:22 crc kubenswrapper[4743]: I1011 00:53:22.531382 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:22 crc kubenswrapper[4743]: I1011 00:53:22.531400 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:22Z","lastTransitionTime":"2025-10-11T00:53:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:22 crc kubenswrapper[4743]: I1011 00:53:22.635060 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:22 crc kubenswrapper[4743]: I1011 00:53:22.635135 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:22 crc kubenswrapper[4743]: I1011 00:53:22.635155 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:22 crc kubenswrapper[4743]: I1011 00:53:22.635195 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:22 crc kubenswrapper[4743]: I1011 00:53:22.635215 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:22Z","lastTransitionTime":"2025-10-11T00:53:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:22 crc kubenswrapper[4743]: I1011 00:53:22.739275 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:22 crc kubenswrapper[4743]: I1011 00:53:22.739384 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:22 crc kubenswrapper[4743]: I1011 00:53:22.739408 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:22 crc kubenswrapper[4743]: I1011 00:53:22.739434 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:22 crc kubenswrapper[4743]: I1011 00:53:22.739452 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:22Z","lastTransitionTime":"2025-10-11T00:53:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:22 crc kubenswrapper[4743]: I1011 00:53:22.841798 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:22 crc kubenswrapper[4743]: I1011 00:53:22.841837 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:22 crc kubenswrapper[4743]: I1011 00:53:22.841850 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:22 crc kubenswrapper[4743]: I1011 00:53:22.841910 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:22 crc kubenswrapper[4743]: I1011 00:53:22.841922 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:22Z","lastTransitionTime":"2025-10-11T00:53:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:22 crc kubenswrapper[4743]: I1011 00:53:22.944363 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:22 crc kubenswrapper[4743]: I1011 00:53:22.944434 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:22 crc kubenswrapper[4743]: I1011 00:53:22.944457 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:22 crc kubenswrapper[4743]: I1011 00:53:22.944489 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:22 crc kubenswrapper[4743]: I1011 00:53:22.944509 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:22Z","lastTransitionTime":"2025-10-11T00:53:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:23 crc kubenswrapper[4743]: I1011 00:53:23.047393 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:23 crc kubenswrapper[4743]: I1011 00:53:23.047468 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:23 crc kubenswrapper[4743]: I1011 00:53:23.047491 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:23 crc kubenswrapper[4743]: I1011 00:53:23.047526 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:23 crc kubenswrapper[4743]: I1011 00:53:23.047551 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:23Z","lastTransitionTime":"2025-10-11T00:53:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:23 crc kubenswrapper[4743]: I1011 00:53:23.150956 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:23 crc kubenswrapper[4743]: I1011 00:53:23.151010 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:23 crc kubenswrapper[4743]: I1011 00:53:23.151031 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:23 crc kubenswrapper[4743]: I1011 00:53:23.151056 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:23 crc kubenswrapper[4743]: I1011 00:53:23.151074 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:23Z","lastTransitionTime":"2025-10-11T00:53:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:23 crc kubenswrapper[4743]: I1011 00:53:23.254737 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:23 crc kubenswrapper[4743]: I1011 00:53:23.254801 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:23 crc kubenswrapper[4743]: I1011 00:53:23.254818 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:23 crc kubenswrapper[4743]: I1011 00:53:23.254845 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:23 crc kubenswrapper[4743]: I1011 00:53:23.254907 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:23Z","lastTransitionTime":"2025-10-11T00:53:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:23 crc kubenswrapper[4743]: I1011 00:53:23.358154 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:23 crc kubenswrapper[4743]: I1011 00:53:23.358216 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:23 crc kubenswrapper[4743]: I1011 00:53:23.358234 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:23 crc kubenswrapper[4743]: I1011 00:53:23.358259 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:23 crc kubenswrapper[4743]: I1011 00:53:23.358276 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:23Z","lastTransitionTime":"2025-10-11T00:53:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:23 crc kubenswrapper[4743]: I1011 00:53:23.462230 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:23 crc kubenswrapper[4743]: I1011 00:53:23.462301 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:23 crc kubenswrapper[4743]: I1011 00:53:23.462318 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:23 crc kubenswrapper[4743]: I1011 00:53:23.462342 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:23 crc kubenswrapper[4743]: I1011 00:53:23.462360 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:23Z","lastTransitionTime":"2025-10-11T00:53:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:23 crc kubenswrapper[4743]: I1011 00:53:23.565573 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:23 crc kubenswrapper[4743]: I1011 00:53:23.565636 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:23 crc kubenswrapper[4743]: I1011 00:53:23.565655 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:23 crc kubenswrapper[4743]: I1011 00:53:23.565679 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:23 crc kubenswrapper[4743]: I1011 00:53:23.565700 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:23Z","lastTransitionTime":"2025-10-11T00:53:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:23 crc kubenswrapper[4743]: I1011 00:53:23.668547 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:23 crc kubenswrapper[4743]: I1011 00:53:23.668610 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:23 crc kubenswrapper[4743]: I1011 00:53:23.668629 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:23 crc kubenswrapper[4743]: I1011 00:53:23.668651 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:23 crc kubenswrapper[4743]: I1011 00:53:23.668669 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:23Z","lastTransitionTime":"2025-10-11T00:53:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:23 crc kubenswrapper[4743]: I1011 00:53:23.771044 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:23 crc kubenswrapper[4743]: I1011 00:53:23.771119 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:23 crc kubenswrapper[4743]: I1011 00:53:23.771139 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:23 crc kubenswrapper[4743]: I1011 00:53:23.771170 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:23 crc kubenswrapper[4743]: I1011 00:53:23.771191 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:23Z","lastTransitionTime":"2025-10-11T00:53:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:23 crc kubenswrapper[4743]: I1011 00:53:23.874190 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:23 crc kubenswrapper[4743]: I1011 00:53:23.874266 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:23 crc kubenswrapper[4743]: I1011 00:53:23.874303 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:23 crc kubenswrapper[4743]: I1011 00:53:23.874324 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:23 crc kubenswrapper[4743]: I1011 00:53:23.874337 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:23Z","lastTransitionTime":"2025-10-11T00:53:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:23 crc kubenswrapper[4743]: I1011 00:53:23.977693 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:23 crc kubenswrapper[4743]: I1011 00:53:23.977762 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:23 crc kubenswrapper[4743]: I1011 00:53:23.977785 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:23 crc kubenswrapper[4743]: I1011 00:53:23.977813 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:23 crc kubenswrapper[4743]: I1011 00:53:23.977833 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:23Z","lastTransitionTime":"2025-10-11T00:53:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:24 crc kubenswrapper[4743]: I1011 00:53:24.081280 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:24 crc kubenswrapper[4743]: I1011 00:53:24.081345 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:24 crc kubenswrapper[4743]: I1011 00:53:24.081362 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:24 crc kubenswrapper[4743]: I1011 00:53:24.081387 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:24 crc kubenswrapper[4743]: I1011 00:53:24.081405 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:24Z","lastTransitionTime":"2025-10-11T00:53:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:53:24 crc kubenswrapper[4743]: I1011 00:53:24.090948 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 00:53:24 crc kubenswrapper[4743]: I1011 00:53:24.090986 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cb5z5" Oct 11 00:53:24 crc kubenswrapper[4743]: I1011 00:53:24.091092 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 00:53:24 crc kubenswrapper[4743]: E1011 00:53:24.091134 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 11 00:53:24 crc kubenswrapper[4743]: I1011 00:53:24.091194 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 00:53:24 crc kubenswrapper[4743]: E1011 00:53:24.091435 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cb5z5" podUID="b02b8636-a5c4-447d-b1cf-401b3dcfa02b" Oct 11 00:53:24 crc kubenswrapper[4743]: E1011 00:53:24.091527 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 11 00:53:24 crc kubenswrapper[4743]: E1011 00:53:24.091747 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 11 00:53:24 crc kubenswrapper[4743]: I1011 00:53:24.185105 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:24 crc kubenswrapper[4743]: I1011 00:53:24.185164 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:24 crc kubenswrapper[4743]: I1011 00:53:24.185181 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:24 crc kubenswrapper[4743]: I1011 00:53:24.185206 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:24 crc kubenswrapper[4743]: I1011 00:53:24.185223 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:24Z","lastTransitionTime":"2025-10-11T00:53:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:24 crc kubenswrapper[4743]: I1011 00:53:24.288666 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:24 crc kubenswrapper[4743]: I1011 00:53:24.288714 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:24 crc kubenswrapper[4743]: I1011 00:53:24.288730 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:24 crc kubenswrapper[4743]: I1011 00:53:24.288754 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:24 crc kubenswrapper[4743]: I1011 00:53:24.288774 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:24Z","lastTransitionTime":"2025-10-11T00:53:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:24 crc kubenswrapper[4743]: I1011 00:53:24.391824 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:24 crc kubenswrapper[4743]: I1011 00:53:24.391941 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:24 crc kubenswrapper[4743]: I1011 00:53:24.391962 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:24 crc kubenswrapper[4743]: I1011 00:53:24.391989 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:24 crc kubenswrapper[4743]: I1011 00:53:24.392009 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:24Z","lastTransitionTime":"2025-10-11T00:53:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:24 crc kubenswrapper[4743]: I1011 00:53:24.494759 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:24 crc kubenswrapper[4743]: I1011 00:53:24.494814 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:24 crc kubenswrapper[4743]: I1011 00:53:24.494831 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:24 crc kubenswrapper[4743]: I1011 00:53:24.494881 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:24 crc kubenswrapper[4743]: I1011 00:53:24.494900 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:24Z","lastTransitionTime":"2025-10-11T00:53:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:24 crc kubenswrapper[4743]: I1011 00:53:24.597357 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:24 crc kubenswrapper[4743]: I1011 00:53:24.597464 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:24 crc kubenswrapper[4743]: I1011 00:53:24.597483 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:24 crc kubenswrapper[4743]: I1011 00:53:24.597520 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:24 crc kubenswrapper[4743]: I1011 00:53:24.597538 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:24Z","lastTransitionTime":"2025-10-11T00:53:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:24 crc kubenswrapper[4743]: I1011 00:53:24.700192 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:24 crc kubenswrapper[4743]: I1011 00:53:24.700247 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:24 crc kubenswrapper[4743]: I1011 00:53:24.700263 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:24 crc kubenswrapper[4743]: I1011 00:53:24.700287 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:24 crc kubenswrapper[4743]: I1011 00:53:24.700304 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:24Z","lastTransitionTime":"2025-10-11T00:53:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:24 crc kubenswrapper[4743]: I1011 00:53:24.802587 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:24 crc kubenswrapper[4743]: I1011 00:53:24.802620 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:24 crc kubenswrapper[4743]: I1011 00:53:24.802631 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:24 crc kubenswrapper[4743]: I1011 00:53:24.802646 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:24 crc kubenswrapper[4743]: I1011 00:53:24.802656 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:24Z","lastTransitionTime":"2025-10-11T00:53:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:24 crc kubenswrapper[4743]: I1011 00:53:24.904774 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:24 crc kubenswrapper[4743]: I1011 00:53:24.904824 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:24 crc kubenswrapper[4743]: I1011 00:53:24.904839 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:24 crc kubenswrapper[4743]: I1011 00:53:24.904883 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:24 crc kubenswrapper[4743]: I1011 00:53:24.904900 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:24Z","lastTransitionTime":"2025-10-11T00:53:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:25 crc kubenswrapper[4743]: I1011 00:53:25.007050 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:25 crc kubenswrapper[4743]: I1011 00:53:25.007119 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:25 crc kubenswrapper[4743]: I1011 00:53:25.007141 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:25 crc kubenswrapper[4743]: I1011 00:53:25.007168 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:25 crc kubenswrapper[4743]: I1011 00:53:25.007190 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:25Z","lastTransitionTime":"2025-10-11T00:53:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:25 crc kubenswrapper[4743]: I1011 00:53:25.109668 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:25 crc kubenswrapper[4743]: I1011 00:53:25.109735 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:25 crc kubenswrapper[4743]: I1011 00:53:25.109754 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:25 crc kubenswrapper[4743]: I1011 00:53:25.109779 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:25 crc kubenswrapper[4743]: I1011 00:53:25.109798 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:25Z","lastTransitionTime":"2025-10-11T00:53:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:25 crc kubenswrapper[4743]: I1011 00:53:25.213372 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:25 crc kubenswrapper[4743]: I1011 00:53:25.213490 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:25 crc kubenswrapper[4743]: I1011 00:53:25.213512 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:25 crc kubenswrapper[4743]: I1011 00:53:25.213544 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:25 crc kubenswrapper[4743]: I1011 00:53:25.213565 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:25Z","lastTransitionTime":"2025-10-11T00:53:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:25 crc kubenswrapper[4743]: I1011 00:53:25.315675 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:25 crc kubenswrapper[4743]: I1011 00:53:25.315730 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:25 crc kubenswrapper[4743]: I1011 00:53:25.315746 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:25 crc kubenswrapper[4743]: I1011 00:53:25.315768 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:25 crc kubenswrapper[4743]: I1011 00:53:25.315785 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:25Z","lastTransitionTime":"2025-10-11T00:53:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:25 crc kubenswrapper[4743]: I1011 00:53:25.419640 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:25 crc kubenswrapper[4743]: I1011 00:53:25.419702 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:25 crc kubenswrapper[4743]: I1011 00:53:25.419720 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:25 crc kubenswrapper[4743]: I1011 00:53:25.419747 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:25 crc kubenswrapper[4743]: I1011 00:53:25.419764 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:25Z","lastTransitionTime":"2025-10-11T00:53:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:25 crc kubenswrapper[4743]: I1011 00:53:25.523481 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:25 crc kubenswrapper[4743]: I1011 00:53:25.523548 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:25 crc kubenswrapper[4743]: I1011 00:53:25.523572 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:25 crc kubenswrapper[4743]: I1011 00:53:25.523608 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:25 crc kubenswrapper[4743]: I1011 00:53:25.523630 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:25Z","lastTransitionTime":"2025-10-11T00:53:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:25 crc kubenswrapper[4743]: I1011 00:53:25.626702 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:25 crc kubenswrapper[4743]: I1011 00:53:25.626751 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:25 crc kubenswrapper[4743]: I1011 00:53:25.626767 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:25 crc kubenswrapper[4743]: I1011 00:53:25.626791 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:25 crc kubenswrapper[4743]: I1011 00:53:25.626809 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:25Z","lastTransitionTime":"2025-10-11T00:53:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:25 crc kubenswrapper[4743]: I1011 00:53:25.730138 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:25 crc kubenswrapper[4743]: I1011 00:53:25.730190 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:25 crc kubenswrapper[4743]: I1011 00:53:25.730207 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:25 crc kubenswrapper[4743]: I1011 00:53:25.730232 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:25 crc kubenswrapper[4743]: I1011 00:53:25.730249 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:25Z","lastTransitionTime":"2025-10-11T00:53:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:25 crc kubenswrapper[4743]: I1011 00:53:25.833390 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:25 crc kubenswrapper[4743]: I1011 00:53:25.833459 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:25 crc kubenswrapper[4743]: I1011 00:53:25.833482 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:25 crc kubenswrapper[4743]: I1011 00:53:25.833518 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:25 crc kubenswrapper[4743]: I1011 00:53:25.833540 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:25Z","lastTransitionTime":"2025-10-11T00:53:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:25 crc kubenswrapper[4743]: I1011 00:53:25.937071 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:25 crc kubenswrapper[4743]: I1011 00:53:25.937155 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:25 crc kubenswrapper[4743]: I1011 00:53:25.937179 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:25 crc kubenswrapper[4743]: I1011 00:53:25.937211 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:25 crc kubenswrapper[4743]: I1011 00:53:25.937235 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:25Z","lastTransitionTime":"2025-10-11T00:53:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:26 crc kubenswrapper[4743]: I1011 00:53:26.041137 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:26 crc kubenswrapper[4743]: I1011 00:53:26.041203 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:26 crc kubenswrapper[4743]: I1011 00:53:26.041220 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:26 crc kubenswrapper[4743]: I1011 00:53:26.041251 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:26 crc kubenswrapper[4743]: I1011 00:53:26.041268 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:26Z","lastTransitionTime":"2025-10-11T00:53:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:53:26 crc kubenswrapper[4743]: I1011 00:53:26.091055 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 00:53:26 crc kubenswrapper[4743]: I1011 00:53:26.091100 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cb5z5" Oct 11 00:53:26 crc kubenswrapper[4743]: I1011 00:53:26.091126 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 00:53:26 crc kubenswrapper[4743]: I1011 00:53:26.091070 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 00:53:26 crc kubenswrapper[4743]: E1011 00:53:26.091272 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 11 00:53:26 crc kubenswrapper[4743]: E1011 00:53:26.091412 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 11 00:53:26 crc kubenswrapper[4743]: E1011 00:53:26.091547 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cb5z5" podUID="b02b8636-a5c4-447d-b1cf-401b3dcfa02b" Oct 11 00:53:26 crc kubenswrapper[4743]: E1011 00:53:26.091691 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 11 00:53:26 crc kubenswrapper[4743]: I1011 00:53:26.146032 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:26 crc kubenswrapper[4743]: I1011 00:53:26.146127 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:26 crc kubenswrapper[4743]: I1011 00:53:26.146179 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:26 crc kubenswrapper[4743]: I1011 00:53:26.146213 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:26 crc kubenswrapper[4743]: I1011 00:53:26.146280 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:26Z","lastTransitionTime":"2025-10-11T00:53:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:26 crc kubenswrapper[4743]: I1011 00:53:26.179080 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ptjnt" podStartSLOduration=79.179044856 podStartE2EDuration="1m19.179044856s" podCreationTimestamp="2025-10-11 00:52:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 00:53:26.16014008 +0000 UTC m=+100.813120537" watchObservedRunningTime="2025-10-11 00:53:26.179044856 +0000 UTC m=+100.832025303" Oct 11 00:53:26 crc kubenswrapper[4743]: I1011 00:53:26.229565 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=76.229546177 podStartE2EDuration="1m16.229546177s" podCreationTimestamp="2025-10-11 00:52:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 00:53:26.206391511 +0000 UTC m=+100.859371948" watchObservedRunningTime="2025-10-11 00:53:26.229546177 +0000 UTC m=+100.882526584" Oct 11 00:53:26 crc kubenswrapper[4743]: I1011 00:53:26.248474 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:26 crc kubenswrapper[4743]: I1011 00:53:26.248530 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:26 crc kubenswrapper[4743]: I1011 00:53:26.248546 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:26 crc kubenswrapper[4743]: I1011 00:53:26.248571 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:26 crc kubenswrapper[4743]: I1011 00:53:26.248589 4743 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:26Z","lastTransitionTime":"2025-10-11T00:53:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:53:26 crc kubenswrapper[4743]: I1011 00:53:26.276224 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-9jfxn" podStartSLOduration=80.276204699 podStartE2EDuration="1m20.276204699s" podCreationTimestamp="2025-10-11 00:52:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 00:53:26.25718615 +0000 UTC m=+100.910166587" watchObservedRunningTime="2025-10-11 00:53:26.276204699 +0000 UTC m=+100.929185106" Oct 11 00:53:26 crc kubenswrapper[4743]: I1011 00:53:26.292187 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b02b8636-a5c4-447d-b1cf-401b3dcfa02b-metrics-certs\") pod \"network-metrics-daemon-cb5z5\" (UID: \"b02b8636-a5c4-447d-b1cf-401b3dcfa02b\") " pod="openshift-multus/network-metrics-daemon-cb5z5" Oct 11 00:53:26 crc kubenswrapper[4743]: E1011 00:53:26.292346 4743 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 11 00:53:26 crc kubenswrapper[4743]: E1011 00:53:26.292405 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b02b8636-a5c4-447d-b1cf-401b3dcfa02b-metrics-certs podName:b02b8636-a5c4-447d-b1cf-401b3dcfa02b nodeName:}" failed. No retries permitted until 2025-10-11 00:54:30.292389967 +0000 UTC m=+164.945370374 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b02b8636-a5c4-447d-b1cf-401b3dcfa02b-metrics-certs") pod "network-metrics-daemon-cb5z5" (UID: "b02b8636-a5c4-447d-b1cf-401b3dcfa02b") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 11 00:53:26 crc kubenswrapper[4743]: I1011 00:53:26.348582 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podStartSLOduration=80.348560099 podStartE2EDuration="1m20.348560099s" podCreationTimestamp="2025-10-11 00:52:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 00:53:26.327838022 +0000 UTC m=+100.980818429" watchObservedRunningTime="2025-10-11 00:53:26.348560099 +0000 UTC m=+101.001540506" Oct 11 00:53:26 crc kubenswrapper[4743]: I1011 00:53:26.351390 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:26 crc kubenswrapper[4743]: I1011 00:53:26.351416 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:26 crc kubenswrapper[4743]: I1011 00:53:26.351426 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:26 crc kubenswrapper[4743]: I1011 00:53:26.351441 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:26 crc kubenswrapper[4743]: I1011 00:53:26.351453 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:26Z","lastTransitionTime":"2025-10-11T00:53:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:53:26 crc kubenswrapper[4743]: I1011 00:53:26.387811 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-6wcnk" podStartSLOduration=80.38778308 podStartE2EDuration="1m20.38778308s" podCreationTimestamp="2025-10-11 00:52:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 00:53:26.349508646 +0000 UTC m=+101.002489053" watchObservedRunningTime="2025-10-11 00:53:26.38778308 +0000 UTC m=+101.040763517" Oct 11 00:53:26 crc kubenswrapper[4743]: I1011 00:53:26.414910 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=51.414887158 podStartE2EDuration="51.414887158s" podCreationTimestamp="2025-10-11 00:52:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 00:53:26.403875336 +0000 UTC m=+101.056855773" watchObservedRunningTime="2025-10-11 00:53:26.414887158 +0000 UTC m=+101.067867575" Oct 11 00:53:26 crc kubenswrapper[4743]: I1011 00:53:26.449261 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=11.44923293 podStartE2EDuration="11.44923293s" podCreationTimestamp="2025-10-11 00:53:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 00:53:26.415048943 +0000 UTC m=+101.068029420" watchObservedRunningTime="2025-10-11 00:53:26.44923293 +0000 UTC m=+101.102213367" Oct 11 00:53:26 crc kubenswrapper[4743]: I1011 00:53:26.450077 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-etcd/etcd-crc" podStartSLOduration=77.450064744 podStartE2EDuration="1m17.450064744s" podCreationTimestamp="2025-10-11 00:52:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 00:53:26.446001349 +0000 UTC m=+101.098981786" watchObservedRunningTime="2025-10-11 00:53:26.450064744 +0000 UTC m=+101.103045181" Oct 11 00:53:26 crc kubenswrapper[4743]: I1011 00:53:26.453544 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:26 crc kubenswrapper[4743]: I1011 00:53:26.453581 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:26 crc kubenswrapper[4743]: I1011 00:53:26.453590 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:26 crc kubenswrapper[4743]: I1011 00:53:26.453605 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:26 crc kubenswrapper[4743]: I1011 00:53:26.453614 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:26Z","lastTransitionTime":"2025-10-11T00:53:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:26 crc kubenswrapper[4743]: I1011 00:53:26.480500 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=80.480483366 podStartE2EDuration="1m20.480483366s" podCreationTimestamp="2025-10-11 00:52:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 00:53:26.467850518 +0000 UTC m=+101.120830925" watchObservedRunningTime="2025-10-11 00:53:26.480483366 +0000 UTC m=+101.133463773" Oct 11 00:53:26 crc kubenswrapper[4743]: I1011 00:53:26.481240 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-vlxgw" podStartSLOduration=81.481235307 podStartE2EDuration="1m21.481235307s" podCreationTimestamp="2025-10-11 00:52:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 00:53:26.481110054 +0000 UTC m=+101.134090461" watchObservedRunningTime="2025-10-11 00:53:26.481235307 +0000 UTC m=+101.134215724" Oct 11 00:53:26 crc kubenswrapper[4743]: I1011 00:53:26.492503 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-t9nsf" podStartSLOduration=80.492482886 podStartE2EDuration="1m20.492482886s" podCreationTimestamp="2025-10-11 00:52:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 00:53:26.491415276 +0000 UTC m=+101.144395683" watchObservedRunningTime="2025-10-11 00:53:26.492482886 +0000 UTC m=+101.145463343" Oct 11 00:53:26 crc kubenswrapper[4743]: I1011 00:53:26.556481 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:26 crc kubenswrapper[4743]: I1011 00:53:26.556520 4743 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:26 crc kubenswrapper[4743]: I1011 00:53:26.556532 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:26 crc kubenswrapper[4743]: I1011 00:53:26.556551 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:26 crc kubenswrapper[4743]: I1011 00:53:26.556564 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:26Z","lastTransitionTime":"2025-10-11T00:53:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:53:26 crc kubenswrapper[4743]: I1011 00:53:26.659899 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:26 crc kubenswrapper[4743]: I1011 00:53:26.659956 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:26 crc kubenswrapper[4743]: I1011 00:53:26.659973 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:26 crc kubenswrapper[4743]: I1011 00:53:26.659996 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:26 crc kubenswrapper[4743]: I1011 00:53:26.660017 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:26Z","lastTransitionTime":"2025-10-11T00:53:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:53:26 crc kubenswrapper[4743]: I1011 00:53:26.765238 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:26 crc kubenswrapper[4743]: I1011 00:53:26.765278 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:26 crc kubenswrapper[4743]: I1011 00:53:26.765290 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:26 crc kubenswrapper[4743]: I1011 00:53:26.765308 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:26 crc kubenswrapper[4743]: I1011 00:53:26.765324 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:26Z","lastTransitionTime":"2025-10-11T00:53:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:26 crc kubenswrapper[4743]: I1011 00:53:26.875414 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:26 crc kubenswrapper[4743]: I1011 00:53:26.875450 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:26 crc kubenswrapper[4743]: I1011 00:53:26.875461 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:26 crc kubenswrapper[4743]: I1011 00:53:26.875477 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:26 crc kubenswrapper[4743]: I1011 00:53:26.875486 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:26Z","lastTransitionTime":"2025-10-11T00:53:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:26 crc kubenswrapper[4743]: I1011 00:53:26.978357 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:26 crc kubenswrapper[4743]: I1011 00:53:26.978437 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:26 crc kubenswrapper[4743]: I1011 00:53:26.978468 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:26 crc kubenswrapper[4743]: I1011 00:53:26.978502 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:26 crc kubenswrapper[4743]: I1011 00:53:26.978527 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:26Z","lastTransitionTime":"2025-10-11T00:53:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:27 crc kubenswrapper[4743]: I1011 00:53:27.081506 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:27 crc kubenswrapper[4743]: I1011 00:53:27.081583 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:27 crc kubenswrapper[4743]: I1011 00:53:27.081607 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:27 crc kubenswrapper[4743]: I1011 00:53:27.081644 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:27 crc kubenswrapper[4743]: I1011 00:53:27.081666 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:27Z","lastTransitionTime":"2025-10-11T00:53:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:27 crc kubenswrapper[4743]: I1011 00:53:27.184849 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:27 crc kubenswrapper[4743]: I1011 00:53:27.184938 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:27 crc kubenswrapper[4743]: I1011 00:53:27.184956 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:27 crc kubenswrapper[4743]: I1011 00:53:27.184981 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:27 crc kubenswrapper[4743]: I1011 00:53:27.184998 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:27Z","lastTransitionTime":"2025-10-11T00:53:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:27 crc kubenswrapper[4743]: I1011 00:53:27.288379 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:27 crc kubenswrapper[4743]: I1011 00:53:27.288442 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:27 crc kubenswrapper[4743]: I1011 00:53:27.288459 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:27 crc kubenswrapper[4743]: I1011 00:53:27.288484 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:27 crc kubenswrapper[4743]: I1011 00:53:27.288501 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:27Z","lastTransitionTime":"2025-10-11T00:53:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:27 crc kubenswrapper[4743]: I1011 00:53:27.392076 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:27 crc kubenswrapper[4743]: I1011 00:53:27.392121 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:27 crc kubenswrapper[4743]: I1011 00:53:27.392132 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:27 crc kubenswrapper[4743]: I1011 00:53:27.392150 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:27 crc kubenswrapper[4743]: I1011 00:53:27.392161 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:27Z","lastTransitionTime":"2025-10-11T00:53:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:27 crc kubenswrapper[4743]: I1011 00:53:27.494689 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:27 crc kubenswrapper[4743]: I1011 00:53:27.494757 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:27 crc kubenswrapper[4743]: I1011 00:53:27.494774 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:27 crc kubenswrapper[4743]: I1011 00:53:27.494798 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:27 crc kubenswrapper[4743]: I1011 00:53:27.494815 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:27Z","lastTransitionTime":"2025-10-11T00:53:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:27 crc kubenswrapper[4743]: I1011 00:53:27.597544 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:27 crc kubenswrapper[4743]: I1011 00:53:27.597625 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:27 crc kubenswrapper[4743]: I1011 00:53:27.597647 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:27 crc kubenswrapper[4743]: I1011 00:53:27.597673 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:27 crc kubenswrapper[4743]: I1011 00:53:27.597690 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:27Z","lastTransitionTime":"2025-10-11T00:53:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:27 crc kubenswrapper[4743]: I1011 00:53:27.700433 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:27 crc kubenswrapper[4743]: I1011 00:53:27.700512 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:27 crc kubenswrapper[4743]: I1011 00:53:27.700533 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:27 crc kubenswrapper[4743]: I1011 00:53:27.700561 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:27 crc kubenswrapper[4743]: I1011 00:53:27.700580 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:27Z","lastTransitionTime":"2025-10-11T00:53:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:27 crc kubenswrapper[4743]: I1011 00:53:27.803986 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:27 crc kubenswrapper[4743]: I1011 00:53:27.804048 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:27 crc kubenswrapper[4743]: I1011 00:53:27.804067 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:27 crc kubenswrapper[4743]: I1011 00:53:27.804090 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:27 crc kubenswrapper[4743]: I1011 00:53:27.804107 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:27Z","lastTransitionTime":"2025-10-11T00:53:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:27 crc kubenswrapper[4743]: I1011 00:53:27.907263 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:27 crc kubenswrapper[4743]: I1011 00:53:27.907316 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:27 crc kubenswrapper[4743]: I1011 00:53:27.907333 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:27 crc kubenswrapper[4743]: I1011 00:53:27.907359 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:27 crc kubenswrapper[4743]: I1011 00:53:27.907376 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:27Z","lastTransitionTime":"2025-10-11T00:53:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:28 crc kubenswrapper[4743]: I1011 00:53:28.010017 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:28 crc kubenswrapper[4743]: I1011 00:53:28.010073 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:28 crc kubenswrapper[4743]: I1011 00:53:28.010090 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:28 crc kubenswrapper[4743]: I1011 00:53:28.010114 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:28 crc kubenswrapper[4743]: I1011 00:53:28.010132 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:28Z","lastTransitionTime":"2025-10-11T00:53:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:53:28 crc kubenswrapper[4743]: I1011 00:53:28.091068 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cb5z5" Oct 11 00:53:28 crc kubenswrapper[4743]: I1011 00:53:28.091195 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 00:53:28 crc kubenswrapper[4743]: I1011 00:53:28.091102 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 00:53:28 crc kubenswrapper[4743]: I1011 00:53:28.091079 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 00:53:28 crc kubenswrapper[4743]: E1011 00:53:28.091335 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cb5z5" podUID="b02b8636-a5c4-447d-b1cf-401b3dcfa02b" Oct 11 00:53:28 crc kubenswrapper[4743]: E1011 00:53:28.091449 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 11 00:53:28 crc kubenswrapper[4743]: E1011 00:53:28.091557 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 11 00:53:28 crc kubenswrapper[4743]: E1011 00:53:28.092119 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 11 00:53:28 crc kubenswrapper[4743]: I1011 00:53:28.112686 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:28 crc kubenswrapper[4743]: I1011 00:53:28.112740 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:28 crc kubenswrapper[4743]: I1011 00:53:28.112760 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:28 crc kubenswrapper[4743]: I1011 00:53:28.112783 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:28 crc kubenswrapper[4743]: I1011 00:53:28.112800 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:28Z","lastTransitionTime":"2025-10-11T00:53:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:28 crc kubenswrapper[4743]: I1011 00:53:28.215673 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:28 crc kubenswrapper[4743]: I1011 00:53:28.215756 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:28 crc kubenswrapper[4743]: I1011 00:53:28.215780 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:28 crc kubenswrapper[4743]: I1011 00:53:28.215812 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:28 crc kubenswrapper[4743]: I1011 00:53:28.215836 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:28Z","lastTransitionTime":"2025-10-11T00:53:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:28 crc kubenswrapper[4743]: I1011 00:53:28.319168 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:28 crc kubenswrapper[4743]: I1011 00:53:28.319234 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:28 crc kubenswrapper[4743]: I1011 00:53:28.319254 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:28 crc kubenswrapper[4743]: I1011 00:53:28.319281 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:28 crc kubenswrapper[4743]: I1011 00:53:28.319300 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:28Z","lastTransitionTime":"2025-10-11T00:53:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:28 crc kubenswrapper[4743]: I1011 00:53:28.422372 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:28 crc kubenswrapper[4743]: I1011 00:53:28.422429 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:28 crc kubenswrapper[4743]: I1011 00:53:28.422447 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:28 crc kubenswrapper[4743]: I1011 00:53:28.422474 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:28 crc kubenswrapper[4743]: I1011 00:53:28.422493 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:28Z","lastTransitionTime":"2025-10-11T00:53:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:28 crc kubenswrapper[4743]: I1011 00:53:28.525450 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:28 crc kubenswrapper[4743]: I1011 00:53:28.525494 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:28 crc kubenswrapper[4743]: I1011 00:53:28.525505 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:28 crc kubenswrapper[4743]: I1011 00:53:28.525523 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:28 crc kubenswrapper[4743]: I1011 00:53:28.525536 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:28Z","lastTransitionTime":"2025-10-11T00:53:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 11 00:53:28 crc kubenswrapper[4743]: I1011 00:53:28.526758 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 11 00:53:28 crc kubenswrapper[4743]: I1011 00:53:28.526827 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 11 00:53:28 crc kubenswrapper[4743]: I1011 00:53:28.526849 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 11 00:53:28 crc kubenswrapper[4743]: I1011 00:53:28.526919 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 11 00:53:28 crc kubenswrapper[4743]: I1011 00:53:28.526992 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-11T00:53:28Z","lastTransitionTime":"2025-10-11T00:53:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 11 00:53:28 crc kubenswrapper[4743]: I1011 00:53:28.591797 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-2cjz2"] Oct 11 00:53:28 crc kubenswrapper[4743]: I1011 00:53:28.592317 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2cjz2" Oct 11 00:53:28 crc kubenswrapper[4743]: I1011 00:53:28.594468 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Oct 11 00:53:28 crc kubenswrapper[4743]: I1011 00:53:28.595571 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Oct 11 00:53:28 crc kubenswrapper[4743]: I1011 00:53:28.595788 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Oct 11 00:53:28 crc kubenswrapper[4743]: I1011 00:53:28.596371 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Oct 11 00:53:28 crc kubenswrapper[4743]: I1011 00:53:28.616806 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/aaa1c884-4356-4bd6-91b2-55c0e8b9ff63-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-2cjz2\" (UID: \"aaa1c884-4356-4bd6-91b2-55c0e8b9ff63\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2cjz2" Oct 11 00:53:28 crc kubenswrapper[4743]: I1011 00:53:28.616993 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/aaa1c884-4356-4bd6-91b2-55c0e8b9ff63-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-2cjz2\" (UID: \"aaa1c884-4356-4bd6-91b2-55c0e8b9ff63\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2cjz2" Oct 11 00:53:28 crc kubenswrapper[4743]: I1011 00:53:28.617061 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: 
\"kubernetes.io/host-path/aaa1c884-4356-4bd6-91b2-55c0e8b9ff63-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-2cjz2\" (UID: \"aaa1c884-4356-4bd6-91b2-55c0e8b9ff63\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2cjz2" Oct 11 00:53:28 crc kubenswrapper[4743]: I1011 00:53:28.617352 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/aaa1c884-4356-4bd6-91b2-55c0e8b9ff63-service-ca\") pod \"cluster-version-operator-5c965bbfc6-2cjz2\" (UID: \"aaa1c884-4356-4bd6-91b2-55c0e8b9ff63\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2cjz2" Oct 11 00:53:28 crc kubenswrapper[4743]: I1011 00:53:28.617525 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aaa1c884-4356-4bd6-91b2-55c0e8b9ff63-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-2cjz2\" (UID: \"aaa1c884-4356-4bd6-91b2-55c0e8b9ff63\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2cjz2" Oct 11 00:53:28 crc kubenswrapper[4743]: I1011 00:53:28.718587 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aaa1c884-4356-4bd6-91b2-55c0e8b9ff63-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-2cjz2\" (UID: \"aaa1c884-4356-4bd6-91b2-55c0e8b9ff63\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2cjz2" Oct 11 00:53:28 crc kubenswrapper[4743]: I1011 00:53:28.718671 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/aaa1c884-4356-4bd6-91b2-55c0e8b9ff63-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-2cjz2\" (UID: \"aaa1c884-4356-4bd6-91b2-55c0e8b9ff63\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2cjz2" Oct 11 00:53:28 crc kubenswrapper[4743]: I1011 00:53:28.718720 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/aaa1c884-4356-4bd6-91b2-55c0e8b9ff63-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-2cjz2\" (UID: \"aaa1c884-4356-4bd6-91b2-55c0e8b9ff63\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2cjz2" Oct 11 00:53:28 crc kubenswrapper[4743]: I1011 00:53:28.718762 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/aaa1c884-4356-4bd6-91b2-55c0e8b9ff63-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-2cjz2\" (UID: \"aaa1c884-4356-4bd6-91b2-55c0e8b9ff63\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2cjz2" Oct 11 00:53:28 crc kubenswrapper[4743]: I1011 00:53:28.718811 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/aaa1c884-4356-4bd6-91b2-55c0e8b9ff63-service-ca\") pod \"cluster-version-operator-5c965bbfc6-2cjz2\" (UID: \"aaa1c884-4356-4bd6-91b2-55c0e8b9ff63\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2cjz2" Oct 11 00:53:28 crc kubenswrapper[4743]: I1011 00:53:28.718942 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/aaa1c884-4356-4bd6-91b2-55c0e8b9ff63-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-2cjz2\" (UID: \"aaa1c884-4356-4bd6-91b2-55c0e8b9ff63\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2cjz2" Oct 11 00:53:28 crc kubenswrapper[4743]: I1011 00:53:28.718972 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: 
\"kubernetes.io/host-path/aaa1c884-4356-4bd6-91b2-55c0e8b9ff63-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-2cjz2\" (UID: \"aaa1c884-4356-4bd6-91b2-55c0e8b9ff63\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2cjz2" Oct 11 00:53:28 crc kubenswrapper[4743]: I1011 00:53:28.720370 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/aaa1c884-4356-4bd6-91b2-55c0e8b9ff63-service-ca\") pod \"cluster-version-operator-5c965bbfc6-2cjz2\" (UID: \"aaa1c884-4356-4bd6-91b2-55c0e8b9ff63\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2cjz2" Oct 11 00:53:28 crc kubenswrapper[4743]: I1011 00:53:28.727144 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aaa1c884-4356-4bd6-91b2-55c0e8b9ff63-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-2cjz2\" (UID: \"aaa1c884-4356-4bd6-91b2-55c0e8b9ff63\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2cjz2" Oct 11 00:53:28 crc kubenswrapper[4743]: I1011 00:53:28.747062 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/aaa1c884-4356-4bd6-91b2-55c0e8b9ff63-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-2cjz2\" (UID: \"aaa1c884-4356-4bd6-91b2-55c0e8b9ff63\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2cjz2" Oct 11 00:53:28 crc kubenswrapper[4743]: I1011 00:53:28.908404 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2cjz2" Oct 11 00:53:28 crc kubenswrapper[4743]: W1011 00:53:28.931292 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaaa1c884_4356_4bd6_91b2_55c0e8b9ff63.slice/crio-b60426ad3e7d2bcc4b8874d77b02ad19dfc217e97bd3bf2409e701c9969ca727 WatchSource:0}: Error finding container b60426ad3e7d2bcc4b8874d77b02ad19dfc217e97bd3bf2409e701c9969ca727: Status 404 returned error can't find the container with id b60426ad3e7d2bcc4b8874d77b02ad19dfc217e97bd3bf2409e701c9969ca727 Oct 11 00:53:29 crc kubenswrapper[4743]: I1011 00:53:29.092752 4743 scope.go:117] "RemoveContainer" containerID="268a98616dcd60ad334eba93f4b3443073f8a901429f6b98e7f89c2740c81da6" Oct 11 00:53:29 crc kubenswrapper[4743]: E1011 00:53:29.093292 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-48ljj_openshift-ovn-kubernetes(9ed16b35-862f-47f2-9e32-63c98f868fb8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" podUID="9ed16b35-862f-47f2-9e32-63c98f868fb8" Oct 11 00:53:29 crc kubenswrapper[4743]: I1011 00:53:29.768172 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2cjz2" event={"ID":"aaa1c884-4356-4bd6-91b2-55c0e8b9ff63","Type":"ContainerStarted","Data":"ec4fa677b316117f87b719323d3c9ac37e79e9c9f92812680cbd474fb8cd5472"} Oct 11 00:53:29 crc kubenswrapper[4743]: I1011 00:53:29.768255 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2cjz2" event={"ID":"aaa1c884-4356-4bd6-91b2-55c0e8b9ff63","Type":"ContainerStarted","Data":"b60426ad3e7d2bcc4b8874d77b02ad19dfc217e97bd3bf2409e701c9969ca727"} Oct 11 00:53:30 crc kubenswrapper[4743]: 
I1011 00:53:30.091206 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cb5z5" Oct 11 00:53:30 crc kubenswrapper[4743]: E1011 00:53:30.091397 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cb5z5" podUID="b02b8636-a5c4-447d-b1cf-401b3dcfa02b" Oct 11 00:53:30 crc kubenswrapper[4743]: I1011 00:53:30.091783 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 00:53:30 crc kubenswrapper[4743]: E1011 00:53:30.091960 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 11 00:53:30 crc kubenswrapper[4743]: I1011 00:53:30.092206 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 00:53:30 crc kubenswrapper[4743]: E1011 00:53:30.092344 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 11 00:53:30 crc kubenswrapper[4743]: I1011 00:53:30.092743 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 00:53:30 crc kubenswrapper[4743]: E1011 00:53:30.092944 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 11 00:53:32 crc kubenswrapper[4743]: I1011 00:53:32.091043 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 00:53:32 crc kubenswrapper[4743]: I1011 00:53:32.091063 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cb5z5" Oct 11 00:53:32 crc kubenswrapper[4743]: E1011 00:53:32.091620 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 11 00:53:32 crc kubenswrapper[4743]: I1011 00:53:32.091129 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 00:53:32 crc kubenswrapper[4743]: E1011 00:53:32.091890 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 11 00:53:32 crc kubenswrapper[4743]: I1011 00:53:32.091087 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 00:53:32 crc kubenswrapper[4743]: E1011 00:53:32.092041 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 11 00:53:32 crc kubenswrapper[4743]: E1011 00:53:32.091749 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cb5z5" podUID="b02b8636-a5c4-447d-b1cf-401b3dcfa02b" Oct 11 00:53:34 crc kubenswrapper[4743]: I1011 00:53:34.091069 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 00:53:34 crc kubenswrapper[4743]: E1011 00:53:34.091402 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 11 00:53:34 crc kubenswrapper[4743]: I1011 00:53:34.091446 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 00:53:34 crc kubenswrapper[4743]: I1011 00:53:34.091485 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 00:53:34 crc kubenswrapper[4743]: I1011 00:53:34.091503 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cb5z5" Oct 11 00:53:34 crc kubenswrapper[4743]: E1011 00:53:34.091601 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 11 00:53:34 crc kubenswrapper[4743]: E1011 00:53:34.091714 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cb5z5" podUID="b02b8636-a5c4-447d-b1cf-401b3dcfa02b" Oct 11 00:53:34 crc kubenswrapper[4743]: E1011 00:53:34.091850 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 11 00:53:36 crc kubenswrapper[4743]: I1011 00:53:36.090902 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 00:53:36 crc kubenswrapper[4743]: I1011 00:53:36.091061 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 00:53:36 crc kubenswrapper[4743]: E1011 00:53:36.092924 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 11 00:53:36 crc kubenswrapper[4743]: I1011 00:53:36.093190 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 00:53:36 crc kubenswrapper[4743]: I1011 00:53:36.093283 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cb5z5" Oct 11 00:53:36 crc kubenswrapper[4743]: E1011 00:53:36.093413 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 11 00:53:36 crc kubenswrapper[4743]: E1011 00:53:36.093758 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cb5z5" podUID="b02b8636-a5c4-447d-b1cf-401b3dcfa02b" Oct 11 00:53:36 crc kubenswrapper[4743]: E1011 00:53:36.094029 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 11 00:53:38 crc kubenswrapper[4743]: I1011 00:53:38.091461 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cb5z5" Oct 11 00:53:38 crc kubenswrapper[4743]: E1011 00:53:38.091620 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cb5z5" podUID="b02b8636-a5c4-447d-b1cf-401b3dcfa02b" Oct 11 00:53:38 crc kubenswrapper[4743]: I1011 00:53:38.091743 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 00:53:38 crc kubenswrapper[4743]: I1011 00:53:38.091773 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 00:53:38 crc kubenswrapper[4743]: E1011 00:53:38.091986 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 11 00:53:38 crc kubenswrapper[4743]: E1011 00:53:38.092132 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 11 00:53:38 crc kubenswrapper[4743]: I1011 00:53:38.092975 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 00:53:38 crc kubenswrapper[4743]: E1011 00:53:38.093038 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 11 00:53:40 crc kubenswrapper[4743]: I1011 00:53:40.091397 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cb5z5" Oct 11 00:53:40 crc kubenswrapper[4743]: I1011 00:53:40.091424 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 00:53:40 crc kubenswrapper[4743]: I1011 00:53:40.091501 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 00:53:40 crc kubenswrapper[4743]: I1011 00:53:40.091511 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 00:53:40 crc kubenswrapper[4743]: E1011 00:53:40.091826 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 11 00:53:40 crc kubenswrapper[4743]: E1011 00:53:40.092070 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cb5z5" podUID="b02b8636-a5c4-447d-b1cf-401b3dcfa02b" Oct 11 00:53:40 crc kubenswrapper[4743]: E1011 00:53:40.092183 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 11 00:53:40 crc kubenswrapper[4743]: E1011 00:53:40.092282 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 11 00:53:42 crc kubenswrapper[4743]: I1011 00:53:42.090721 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 00:53:42 crc kubenswrapper[4743]: I1011 00:53:42.090810 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 00:53:42 crc kubenswrapper[4743]: I1011 00:53:42.090923 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cb5z5" Oct 11 00:53:42 crc kubenswrapper[4743]: E1011 00:53:42.091121 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 11 00:53:42 crc kubenswrapper[4743]: E1011 00:53:42.091031 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 11 00:53:42 crc kubenswrapper[4743]: E1011 00:53:42.092171 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cb5z5" podUID="b02b8636-a5c4-447d-b1cf-401b3dcfa02b" Oct 11 00:53:42 crc kubenswrapper[4743]: I1011 00:53:42.092222 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 00:53:42 crc kubenswrapper[4743]: E1011 00:53:42.092369 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 11 00:53:42 crc kubenswrapper[4743]: I1011 00:53:42.093304 4743 scope.go:117] "RemoveContainer" containerID="268a98616dcd60ad334eba93f4b3443073f8a901429f6b98e7f89c2740c81da6" Oct 11 00:53:42 crc kubenswrapper[4743]: E1011 00:53:42.093600 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-48ljj_openshift-ovn-kubernetes(9ed16b35-862f-47f2-9e32-63c98f868fb8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" podUID="9ed16b35-862f-47f2-9e32-63c98f868fb8" Oct 11 00:53:42 crc kubenswrapper[4743]: I1011 00:53:42.821625 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9jfxn_e8c603f4-717c-4554-992a-8338b3bef24d/kube-multus/1.log" Oct 11 00:53:42 crc kubenswrapper[4743]: I1011 00:53:42.822667 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9jfxn_e8c603f4-717c-4554-992a-8338b3bef24d/kube-multus/0.log" Oct 11 00:53:42 crc kubenswrapper[4743]: I1011 00:53:42.822847 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9jfxn" event={"ID":"e8c603f4-717c-4554-992a-8338b3bef24d","Type":"ContainerDied","Data":"21793d4ae38fc6e714912dbedbba8c45e37482a03cbd76d10461c41851e16896"} Oct 11 00:53:42 crc kubenswrapper[4743]: I1011 
00:53:42.822945 4743 scope.go:117] "RemoveContainer" containerID="853ca9eee244aad8c7006298d6e541ea07022917bee14083b2ded233009257ad" Oct 11 00:53:42 crc kubenswrapper[4743]: I1011 00:53:42.822763 4743 generic.go:334] "Generic (PLEG): container finished" podID="e8c603f4-717c-4554-992a-8338b3bef24d" containerID="21793d4ae38fc6e714912dbedbba8c45e37482a03cbd76d10461c41851e16896" exitCode=1 Oct 11 00:53:42 crc kubenswrapper[4743]: I1011 00:53:42.823917 4743 scope.go:117] "RemoveContainer" containerID="21793d4ae38fc6e714912dbedbba8c45e37482a03cbd76d10461c41851e16896" Oct 11 00:53:42 crc kubenswrapper[4743]: E1011 00:53:42.824247 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-9jfxn_openshift-multus(e8c603f4-717c-4554-992a-8338b3bef24d)\"" pod="openshift-multus/multus-9jfxn" podUID="e8c603f4-717c-4554-992a-8338b3bef24d" Oct 11 00:53:42 crc kubenswrapper[4743]: I1011 00:53:42.851111 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2cjz2" podStartSLOduration=96.851095381 podStartE2EDuration="1m36.851095381s" podCreationTimestamp="2025-10-11 00:52:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 00:53:29.789333917 +0000 UTC m=+104.442314354" watchObservedRunningTime="2025-10-11 00:53:42.851095381 +0000 UTC m=+117.504075768" Oct 11 00:53:43 crc kubenswrapper[4743]: I1011 00:53:43.830429 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9jfxn_e8c603f4-717c-4554-992a-8338b3bef24d/kube-multus/1.log" Oct 11 00:53:44 crc kubenswrapper[4743]: I1011 00:53:44.091203 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cb5z5" Oct 11 00:53:44 crc kubenswrapper[4743]: I1011 00:53:44.091249 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 00:53:44 crc kubenswrapper[4743]: I1011 00:53:44.091280 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 00:53:44 crc kubenswrapper[4743]: E1011 00:53:44.091368 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cb5z5" podUID="b02b8636-a5c4-447d-b1cf-401b3dcfa02b" Oct 11 00:53:44 crc kubenswrapper[4743]: I1011 00:53:44.091539 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 00:53:44 crc kubenswrapper[4743]: E1011 00:53:44.091532 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 11 00:53:44 crc kubenswrapper[4743]: E1011 00:53:44.091666 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 11 00:53:44 crc kubenswrapper[4743]: E1011 00:53:44.091940 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 11 00:53:46 crc kubenswrapper[4743]: I1011 00:53:46.091772 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 00:53:46 crc kubenswrapper[4743]: I1011 00:53:46.091779 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 00:53:46 crc kubenswrapper[4743]: I1011 00:53:46.091781 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cb5z5" Oct 11 00:53:46 crc kubenswrapper[4743]: I1011 00:53:46.091796 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 00:53:46 crc kubenswrapper[4743]: E1011 00:53:46.093012 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 11 00:53:46 crc kubenswrapper[4743]: E1011 00:53:46.093118 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 11 00:53:46 crc kubenswrapper[4743]: E1011 00:53:46.093243 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cb5z5" podUID="b02b8636-a5c4-447d-b1cf-401b3dcfa02b" Oct 11 00:53:46 crc kubenswrapper[4743]: E1011 00:53:46.093345 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 11 00:53:46 crc kubenswrapper[4743]: E1011 00:53:46.116940 4743 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Oct 11 00:53:46 crc kubenswrapper[4743]: E1011 00:53:46.234421 4743 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Oct 11 00:53:48 crc kubenswrapper[4743]: I1011 00:53:48.090977 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 00:53:48 crc kubenswrapper[4743]: I1011 00:53:48.091030 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 00:53:48 crc kubenswrapper[4743]: I1011 00:53:48.091050 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 00:53:48 crc kubenswrapper[4743]: E1011 00:53:48.091161 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 11 00:53:48 crc kubenswrapper[4743]: I1011 00:53:48.091232 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cb5z5" Oct 11 00:53:48 crc kubenswrapper[4743]: E1011 00:53:48.091399 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 11 00:53:48 crc kubenswrapper[4743]: E1011 00:53:48.091545 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cb5z5" podUID="b02b8636-a5c4-447d-b1cf-401b3dcfa02b" Oct 11 00:53:48 crc kubenswrapper[4743]: E1011 00:53:48.091713 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 11 00:53:50 crc kubenswrapper[4743]: I1011 00:53:50.091141 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 00:53:50 crc kubenswrapper[4743]: I1011 00:53:50.091253 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cb5z5" Oct 11 00:53:50 crc kubenswrapper[4743]: E1011 00:53:50.091331 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 11 00:53:50 crc kubenswrapper[4743]: I1011 00:53:50.091642 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 00:53:50 crc kubenswrapper[4743]: E1011 00:53:50.091742 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cb5z5" podUID="b02b8636-a5c4-447d-b1cf-401b3dcfa02b" Oct 11 00:53:50 crc kubenswrapper[4743]: I1011 00:53:50.092027 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 00:53:50 crc kubenswrapper[4743]: E1011 00:53:50.092241 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 11 00:53:50 crc kubenswrapper[4743]: E1011 00:53:50.092638 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 11 00:53:51 crc kubenswrapper[4743]: E1011 00:53:51.235495 4743 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 11 00:53:52 crc kubenswrapper[4743]: I1011 00:53:52.091140 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 00:53:52 crc kubenswrapper[4743]: I1011 00:53:52.091140 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cb5z5" Oct 11 00:53:52 crc kubenswrapper[4743]: E1011 00:53:52.091313 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 11 00:53:52 crc kubenswrapper[4743]: I1011 00:53:52.091419 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 00:53:52 crc kubenswrapper[4743]: E1011 00:53:52.091568 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cb5z5" podUID="b02b8636-a5c4-447d-b1cf-401b3dcfa02b" Oct 11 00:53:52 crc kubenswrapper[4743]: E1011 00:53:52.091986 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 11 00:53:52 crc kubenswrapper[4743]: I1011 00:53:52.091281 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 00:53:52 crc kubenswrapper[4743]: E1011 00:53:52.092421 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 11 00:53:54 crc kubenswrapper[4743]: I1011 00:53:54.091162 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 00:53:54 crc kubenswrapper[4743]: I1011 00:53:54.091213 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 00:53:54 crc kubenswrapper[4743]: I1011 00:53:54.091235 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cb5z5" Oct 11 00:53:54 crc kubenswrapper[4743]: E1011 00:53:54.091384 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 11 00:53:54 crc kubenswrapper[4743]: I1011 00:53:54.091487 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 00:53:54 crc kubenswrapper[4743]: E1011 00:53:54.091628 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 11 00:53:54 crc kubenswrapper[4743]: E1011 00:53:54.091788 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cb5z5" podUID="b02b8636-a5c4-447d-b1cf-401b3dcfa02b" Oct 11 00:53:54 crc kubenswrapper[4743]: E1011 00:53:54.091985 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 11 00:53:54 crc kubenswrapper[4743]: I1011 00:53:54.093231 4743 scope.go:117] "RemoveContainer" containerID="268a98616dcd60ad334eba93f4b3443073f8a901429f6b98e7f89c2740c81da6" Oct 11 00:53:54 crc kubenswrapper[4743]: I1011 00:53:54.871286 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-48ljj_9ed16b35-862f-47f2-9e32-63c98f868fb8/ovnkube-controller/3.log" Oct 11 00:53:54 crc kubenswrapper[4743]: I1011 00:53:54.875039 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" event={"ID":"9ed16b35-862f-47f2-9e32-63c98f868fb8","Type":"ContainerStarted","Data":"4f3ab6c36ef4c34891a3cc561fdea33a7a62ede85f6d648d443467ab907ad8ae"} Oct 11 00:53:54 crc kubenswrapper[4743]: I1011 00:53:54.875559 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" Oct 11 00:53:54 crc kubenswrapper[4743]: I1011 00:53:54.915595 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" podStartSLOduration=108.91557887 podStartE2EDuration="1m48.91557887s" podCreationTimestamp="2025-10-11 00:52:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 00:53:54.915057015 +0000 UTC m=+129.568037492" 
watchObservedRunningTime="2025-10-11 00:53:54.91557887 +0000 UTC m=+129.568559277" Oct 11 00:53:55 crc kubenswrapper[4743]: I1011 00:53:55.014879 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-cb5z5"] Oct 11 00:53:55 crc kubenswrapper[4743]: I1011 00:53:55.015009 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cb5z5" Oct 11 00:53:55 crc kubenswrapper[4743]: E1011 00:53:55.015139 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cb5z5" podUID="b02b8636-a5c4-447d-b1cf-401b3dcfa02b" Oct 11 00:53:56 crc kubenswrapper[4743]: I1011 00:53:56.090883 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 00:53:56 crc kubenswrapper[4743]: I1011 00:53:56.090965 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 00:53:56 crc kubenswrapper[4743]: E1011 00:53:56.092992 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 11 00:53:56 crc kubenswrapper[4743]: I1011 00:53:56.093031 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 00:53:56 crc kubenswrapper[4743]: E1011 00:53:56.093167 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 11 00:53:56 crc kubenswrapper[4743]: E1011 00:53:56.093334 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 11 00:53:56 crc kubenswrapper[4743]: E1011 00:53:56.236196 4743 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 11 00:53:57 crc kubenswrapper[4743]: I1011 00:53:57.091261 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cb5z5" Oct 11 00:53:57 crc kubenswrapper[4743]: E1011 00:53:57.092200 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cb5z5" podUID="b02b8636-a5c4-447d-b1cf-401b3dcfa02b" Oct 11 00:53:57 crc kubenswrapper[4743]: I1011 00:53:57.092330 4743 scope.go:117] "RemoveContainer" containerID="21793d4ae38fc6e714912dbedbba8c45e37482a03cbd76d10461c41851e16896" Oct 11 00:53:57 crc kubenswrapper[4743]: I1011 00:53:57.888632 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9jfxn_e8c603f4-717c-4554-992a-8338b3bef24d/kube-multus/1.log" Oct 11 00:53:57 crc kubenswrapper[4743]: I1011 00:53:57.889250 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9jfxn" event={"ID":"e8c603f4-717c-4554-992a-8338b3bef24d","Type":"ContainerStarted","Data":"bdc42fd21a8b6982fc5516915cecbd0521737b5b4fd27556f887dbf66219ef33"} Oct 11 00:53:58 crc kubenswrapper[4743]: I1011 00:53:58.091667 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 00:53:58 crc kubenswrapper[4743]: I1011 00:53:58.091733 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 00:53:58 crc kubenswrapper[4743]: I1011 00:53:58.091684 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 00:53:58 crc kubenswrapper[4743]: E1011 00:53:58.091848 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 11 00:53:58 crc kubenswrapper[4743]: E1011 00:53:58.092030 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 11 00:53:58 crc kubenswrapper[4743]: E1011 00:53:58.092134 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 11 00:53:59 crc kubenswrapper[4743]: I1011 00:53:59.090685 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cb5z5" Oct 11 00:53:59 crc kubenswrapper[4743]: E1011 00:53:59.091294 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cb5z5" podUID="b02b8636-a5c4-447d-b1cf-401b3dcfa02b" Oct 11 00:54:00 crc kubenswrapper[4743]: I1011 00:54:00.091568 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 00:54:00 crc kubenswrapper[4743]: I1011 00:54:00.091610 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 00:54:00 crc kubenswrapper[4743]: I1011 00:54:00.091622 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 00:54:00 crc kubenswrapper[4743]: E1011 00:54:00.091757 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 11 00:54:00 crc kubenswrapper[4743]: E1011 00:54:00.091918 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 11 00:54:00 crc kubenswrapper[4743]: E1011 00:54:00.092103 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 11 00:54:01 crc kubenswrapper[4743]: I1011 00:54:01.091158 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cb5z5" Oct 11 00:54:01 crc kubenswrapper[4743]: E1011 00:54:01.091445 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cb5z5" podUID="b02b8636-a5c4-447d-b1cf-401b3dcfa02b" Oct 11 00:54:02 crc kubenswrapper[4743]: I1011 00:54:02.091798 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 00:54:02 crc kubenswrapper[4743]: I1011 00:54:02.091896 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 00:54:02 crc kubenswrapper[4743]: I1011 00:54:02.092031 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 00:54:02 crc kubenswrapper[4743]: I1011 00:54:02.094985 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Oct 11 00:54:02 crc kubenswrapper[4743]: I1011 00:54:02.095170 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Oct 11 00:54:02 crc kubenswrapper[4743]: I1011 00:54:02.095349 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Oct 11 00:54:02 crc kubenswrapper[4743]: I1011 00:54:02.095790 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Oct 11 00:54:03 crc kubenswrapper[4743]: I1011 00:54:03.091086 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cb5z5" Oct 11 00:54:03 crc kubenswrapper[4743]: I1011 00:54:03.094887 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Oct 11 00:54:03 crc kubenswrapper[4743]: I1011 00:54:03.095202 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.339696 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.396287 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-pruner-29335680-fmgl6"] Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.396782 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29335680-fmgl6" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.399517 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"pruner-dockercfg-p7bcw" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.400406 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"serviceca" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.402968 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-lxzlb"] Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.403474 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-lxzlb" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.405283 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-p6l9k"] Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.412788 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-wlvjw"] Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.413556 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p6l9k" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.421041 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-xwxfx"] Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.422304 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.422509 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-xwxfx" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.422718 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.422722 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-wlvjw" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.424213 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.424303 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.424453 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.424699 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.425104 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.425518 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.425752 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.426160 4743 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.428948 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-cgd8f"] Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.429990 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cgd8f" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.435363 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-468m5"] Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.436283 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-468m5" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.437688 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.438003 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.438235 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.440739 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-fv4x7"] Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.441365 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-fv4x7" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.441994 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-229tw"] Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.442646 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-229tw" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.445235 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lq56c"] Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.445829 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lq56c" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.447691 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-tljzf"] Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.448482 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-tljzf" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.448829 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kzxfv"] Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.449337 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kzxfv" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.449432 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.451258 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.451424 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-qksm9"] Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.451980 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-qksm9" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.452912 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.453837 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.454077 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.454187 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.454284 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.454497 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.454775 4743 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console"/"openshift-service-ca.crt" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.455051 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.455223 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.455271 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.455380 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.455427 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.455530 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.455568 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.455629 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.456092 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.456104 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.456206 4743 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.456274 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.456420 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.456571 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.457563 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.457680 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.457782 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.457906 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.457957 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.458042 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.458138 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.458212 4743 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.458271 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.458379 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.458475 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.458508 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.459036 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.459196 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.455386 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.459426 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.459576 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.459585 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.459786 4743 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.459914 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.459999 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.460089 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.461031 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.461159 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.461160 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.461325 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.461333 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.461548 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.463146 4743 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-config-operator/openshift-config-operator-7777fb866f-wswrq"] Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.463664 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wswrq" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.463728 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-rtt9m"] Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.464226 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-rtt9m" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.465115 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.465244 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.465631 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.465919 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.468129 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.470805 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.470935 4743 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-dns-operator/dns-operator-744455d44c-9npzd"] Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.471407 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-85wtv"] Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.482341 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-9npzd" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.488342 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-24m6m"] Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.491566 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-sjtsw"] Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.491840 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-24m6m" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.500118 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-85wtv" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.535803 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.536157 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.536360 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gd5tn"] Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.536621 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f677q"] Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.536847 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-f2gjt"] Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.537120 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-lxzlb"] Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.537199 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-f2gjt" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.537334 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gd5tn" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.537412 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7f689d34-10fa-427c-8db0-cfc9324ae9de-metrics-tls\") pod \"dns-operator-744455d44c-9npzd\" (UID: \"7f689d34-10fa-427c-8db0-cfc9324ae9de\") " pod="openshift-dns-operator/dns-operator-744455d44c-9npzd" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.537450 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f677q" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.537450 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3abfab8d-967c-42bf-9f48-bd69bbe6f8f2-node-pullsecrets\") pod \"apiserver-76f77b778f-xwxfx\" (UID: \"3abfab8d-967c-42bf-9f48-bd69bbe6f8f2\") " pod="openshift-apiserver/apiserver-76f77b778f-xwxfx" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.537479 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/703e145f-c49a-40b7-b50f-a4902208d939-config\") pod \"openshift-apiserver-operator-796bbdcf4f-kzxfv\" (UID: \"703e145f-c49a-40b7-b50f-a4902208d939\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kzxfv" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.537505 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/61413bd4-c42d-4336-92db-d443ed8ea1de-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-lq56c\" (UID: \"61413bd4-c42d-4336-92db-d443ed8ea1de\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lq56c" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.537521 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-sjtsw" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.537528 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/0c2c2460-fedf-4109-8fd9-986749f1e021-available-featuregates\") pod \"openshift-config-operator-7777fb866f-wswrq\" (UID: \"0c2c2460-fedf-4109-8fd9-986749f1e021\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wswrq" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.537552 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b3d2036-8156-4d9f-9e11-2a8133ab5295-config\") pod \"console-operator-58897d9998-wlvjw\" (UID: \"8b3d2036-8156-4d9f-9e11-2a8133ab5295\") " pod="openshift-console-operator/console-operator-58897d9998-wlvjw" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.537576 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3abfab8d-967c-42bf-9f48-bd69bbe6f8f2-etcd-client\") pod \"apiserver-76f77b778f-xwxfx\" (UID: \"3abfab8d-967c-42bf-9f48-bd69bbe6f8f2\") " pod="openshift-apiserver/apiserver-76f77b778f-xwxfx" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.537599 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s59h4\" (UniqueName: \"kubernetes.io/projected/61413bd4-c42d-4336-92db-d443ed8ea1de-kube-api-access-s59h4\") pod \"openshift-controller-manager-operator-756b6f6bc6-lq56c\" (UID: \"61413bd4-c42d-4336-92db-d443ed8ea1de\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lq56c" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.537632 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b5776f9a-8455-4c34-8496-0b4c4e821135-service-ca\") pod \"console-f9d7485db-468m5\" (UID: \"b5776f9a-8455-4c34-8496-0b4c4e821135\") " pod="openshift-console/console-f9d7485db-468m5" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.537655 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61413bd4-c42d-4336-92db-d443ed8ea1de-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-lq56c\" (UID: \"61413bd4-c42d-4336-92db-d443ed8ea1de\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lq56c" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.537678 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9x79\" (UniqueName: \"kubernetes.io/projected/79462f0e-13e0-4ee7-af5f-02e6e5cd849d-kube-api-access-l9x79\") pod \"machine-api-operator-5694c8668f-tljzf\" (UID: \"79462f0e-13e0-4ee7-af5f-02e6e5cd849d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tljzf" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.537701 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/121b64d6-750f-4720-8297-6f3a91dc3a3a-etcd-service-ca\") pod \"etcd-operator-b45778765-229tw\" (UID: \"121b64d6-750f-4720-8297-6f3a91dc3a3a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-229tw" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.537722 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-f7p6h\" (UniqueName: \"kubernetes.io/projected/8693817b-7cf6-486d-a055-93c4c0308d95-kube-api-access-f7p6h\") pod \"controller-manager-879f6c89f-fv4x7\" (UID: \"8693817b-7cf6-486d-a055-93c4c0308d95\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fv4x7" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.537748 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/149214a5-565a-452f-abbb-1479919b6104-serving-cert\") pod \"authentication-operator-69f744f599-rtt9m\" (UID: \"149214a5-565a-452f-abbb-1479919b6104\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rtt9m" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.537771 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bn665\" (UniqueName: \"kubernetes.io/projected/149214a5-565a-452f-abbb-1479919b6104-kube-api-access-bn665\") pod \"authentication-operator-69f744f599-rtt9m\" (UID: \"149214a5-565a-452f-abbb-1479919b6104\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rtt9m" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.537791 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/3abfab8d-967c-42bf-9f48-bd69bbe6f8f2-audit\") pod \"apiserver-76f77b778f-xwxfx\" (UID: \"3abfab8d-967c-42bf-9f48-bd69bbe6f8f2\") " pod="openshift-apiserver/apiserver-76f77b778f-xwxfx" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.537813 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rxkv\" (UniqueName: \"kubernetes.io/projected/b5776f9a-8455-4c34-8496-0b4c4e821135-kube-api-access-6rxkv\") pod \"console-f9d7485db-468m5\" (UID: \"b5776f9a-8455-4c34-8496-0b4c4e821135\") " 
pod="openshift-console/console-f9d7485db-468m5" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.537835 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b5776f9a-8455-4c34-8496-0b4c4e821135-oauth-serving-cert\") pod \"console-f9d7485db-468m5\" (UID: \"b5776f9a-8455-4c34-8496-0b4c4e821135\") " pod="openshift-console/console-f9d7485db-468m5" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.537877 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/3abfab8d-967c-42bf-9f48-bd69bbe6f8f2-image-import-ca\") pod \"apiserver-76f77b778f-xwxfx\" (UID: \"3abfab8d-967c-42bf-9f48-bd69bbe6f8f2\") " pod="openshift-apiserver/apiserver-76f77b778f-xwxfx" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.537899 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/9b43d826-75ca-4c81-9f93-11b4398b96fa-machine-approver-tls\") pod \"machine-approver-56656f9798-p6l9k\" (UID: \"9b43d826-75ca-4c81-9f93-11b4398b96fa\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p6l9k" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.537921 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4v7x\" (UniqueName: \"kubernetes.io/projected/58a40e2a-9789-4d4c-8817-8aa7920baa39-kube-api-access-l4v7x\") pod \"apiserver-7bbb656c7d-cgd8f\" (UID: \"58a40e2a-9789-4d4c-8817-8aa7920baa39\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cgd8f" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.538016 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/3abfab8d-967c-42bf-9f48-bd69bbe6f8f2-config\") pod \"apiserver-76f77b778f-xwxfx\" (UID: \"3abfab8d-967c-42bf-9f48-bd69bbe6f8f2\") " pod="openshift-apiserver/apiserver-76f77b778f-xwxfx" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.538061 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9b43d826-75ca-4c81-9f93-11b4398b96fa-auth-proxy-config\") pod \"machine-approver-56656f9798-p6l9k\" (UID: \"9b43d826-75ca-4c81-9f93-11b4398b96fa\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p6l9k" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.538098 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b3d2036-8156-4d9f-9e11-2a8133ab5295-serving-cert\") pod \"console-operator-58897d9998-wlvjw\" (UID: \"8b3d2036-8156-4d9f-9e11-2a8133ab5295\") " pod="openshift-console-operator/console-operator-58897d9998-wlvjw" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.538133 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9wfk\" (UniqueName: \"kubernetes.io/projected/e8a051b9-029d-4b92-a9a1-380c8d18f051-kube-api-access-l9wfk\") pod \"downloads-7954f5f757-qksm9\" (UID: \"e8a051b9-029d-4b92-a9a1-380c8d18f051\") " pod="openshift-console/downloads-7954f5f757-qksm9" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.538163 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b43d826-75ca-4c81-9f93-11b4398b96fa-config\") pod \"machine-approver-56656f9798-p6l9k\" (UID: \"9b43d826-75ca-4c81-9f93-11b4398b96fa\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p6l9k" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 
00:54:09.538184 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0f26ac0d-8683-415e-850c-5aef3da4b59f-serviceca\") pod \"image-pruner-29335680-fmgl6\" (UID: \"0f26ac0d-8683-415e-850c-5aef3da4b59f\") " pod="openshift-image-registry/image-pruner-29335680-fmgl6" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.538211 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r962j\" (UniqueName: \"kubernetes.io/projected/0c2c2460-fedf-4109-8fd9-986749f1e021-kube-api-access-r962j\") pod \"openshift-config-operator-7777fb866f-wswrq\" (UID: \"0c2c2460-fedf-4109-8fd9-986749f1e021\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wswrq" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.538236 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/149214a5-565a-452f-abbb-1479919b6104-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-rtt9m\" (UID: \"149214a5-565a-452f-abbb-1479919b6104\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rtt9m" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.538262 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b5776f9a-8455-4c34-8496-0b4c4e821135-console-serving-cert\") pod \"console-f9d7485db-468m5\" (UID: \"b5776f9a-8455-4c34-8496-0b4c4e821135\") " pod="openshift-console/console-f9d7485db-468m5" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.538282 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3abfab8d-967c-42bf-9f48-bd69bbe6f8f2-audit-dir\") pod 
\"apiserver-76f77b778f-xwxfx\" (UID: \"3abfab8d-967c-42bf-9f48-bd69bbe6f8f2\") " pod="openshift-apiserver/apiserver-76f77b778f-xwxfx" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.538304 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glbks\" (UniqueName: \"kubernetes.io/projected/3abfab8d-967c-42bf-9f48-bd69bbe6f8f2-kube-api-access-glbks\") pod \"apiserver-76f77b778f-xwxfx\" (UID: \"3abfab8d-967c-42bf-9f48-bd69bbe6f8f2\") " pod="openshift-apiserver/apiserver-76f77b778f-xwxfx" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.538323 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8693817b-7cf6-486d-a055-93c4c0308d95-client-ca\") pod \"controller-manager-879f6c89f-fv4x7\" (UID: \"8693817b-7cf6-486d-a055-93c4c0308d95\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fv4x7" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.538358 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/58a40e2a-9789-4d4c-8817-8aa7920baa39-audit-dir\") pod \"apiserver-7bbb656c7d-cgd8f\" (UID: \"58a40e2a-9789-4d4c-8817-8aa7920baa39\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cgd8f" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.538381 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8b3d2036-8156-4d9f-9e11-2a8133ab5295-trusted-ca\") pod \"console-operator-58897d9998-wlvjw\" (UID: \"8b3d2036-8156-4d9f-9e11-2a8133ab5295\") " pod="openshift-console-operator/console-operator-58897d9998-wlvjw" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.538403 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3abfab8d-967c-42bf-9f48-bd69bbe6f8f2-trusted-ca-bundle\") pod \"apiserver-76f77b778f-xwxfx\" (UID: \"3abfab8d-967c-42bf-9f48-bd69bbe6f8f2\") " pod="openshift-apiserver/apiserver-76f77b778f-xwxfx" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.538425 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhlsf\" (UniqueName: \"kubernetes.io/projected/703e145f-c49a-40b7-b50f-a4902208d939-kube-api-access-xhlsf\") pod \"openshift-apiserver-operator-796bbdcf4f-kzxfv\" (UID: \"703e145f-c49a-40b7-b50f-a4902208d939\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kzxfv" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.538444 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58a40e2a-9789-4d4c-8817-8aa7920baa39-serving-cert\") pod \"apiserver-7bbb656c7d-cgd8f\" (UID: \"58a40e2a-9789-4d4c-8817-8aa7920baa39\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cgd8f" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.538516 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3abfab8d-967c-42bf-9f48-bd69bbe6f8f2-etcd-serving-ca\") pod \"apiserver-76f77b778f-xwxfx\" (UID: \"3abfab8d-967c-42bf-9f48-bd69bbe6f8f2\") " pod="openshift-apiserver/apiserver-76f77b778f-xwxfx" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.538542 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxhz4\" (UniqueName: \"kubernetes.io/projected/a0dfb25d-8789-4754-83a0-0e3ee1888e3e-kube-api-access-nxhz4\") pod \"cluster-samples-operator-665b6dd947-lxzlb\" (UID: \"a0dfb25d-8789-4754-83a0-0e3ee1888e3e\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-lxzlb" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.538565 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c2c2460-fedf-4109-8fd9-986749f1e021-serving-cert\") pod \"openshift-config-operator-7777fb866f-wswrq\" (UID: \"0c2c2460-fedf-4109-8fd9-986749f1e021\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wswrq" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.538588 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58a40e2a-9789-4d4c-8817-8aa7920baa39-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-cgd8f\" (UID: \"58a40e2a-9789-4d4c-8817-8aa7920baa39\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cgd8f" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.538642 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b5776f9a-8455-4c34-8496-0b4c4e821135-console-oauth-config\") pod \"console-f9d7485db-468m5\" (UID: \"b5776f9a-8455-4c34-8496-0b4c4e821135\") " pod="openshift-console/console-f9d7485db-468m5" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.538669 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3abfab8d-967c-42bf-9f48-bd69bbe6f8f2-serving-cert\") pod \"apiserver-76f77b778f-xwxfx\" (UID: \"3abfab8d-967c-42bf-9f48-bd69bbe6f8f2\") " pod="openshift-apiserver/apiserver-76f77b778f-xwxfx" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.538731 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/149214a5-565a-452f-abbb-1479919b6104-config\") pod \"authentication-operator-69f744f599-rtt9m\" (UID: \"149214a5-565a-452f-abbb-1479919b6104\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rtt9m" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.538758 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxpbw\" (UniqueName: \"kubernetes.io/projected/7f689d34-10fa-427c-8db0-cfc9324ae9de-kube-api-access-gxpbw\") pod \"dns-operator-744455d44c-9npzd\" (UID: \"7f689d34-10fa-427c-8db0-cfc9324ae9de\") " pod="openshift-dns-operator/dns-operator-744455d44c-9npzd" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.538779 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/703e145f-c49a-40b7-b50f-a4902208d939-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-kzxfv\" (UID: \"703e145f-c49a-40b7-b50f-a4902208d939\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kzxfv" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.538806 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79462f0e-13e0-4ee7-af5f-02e6e5cd849d-config\") pod \"machine-api-operator-5694c8668f-tljzf\" (UID: \"79462f0e-13e0-4ee7-af5f-02e6e5cd849d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tljzf" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.538886 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/121b64d6-750f-4720-8297-6f3a91dc3a3a-config\") pod \"etcd-operator-b45778765-229tw\" (UID: \"121b64d6-750f-4720-8297-6f3a91dc3a3a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-229tw" Oct 11 00:54:09 crc 
kubenswrapper[4743]: I1011 00:54:09.538916 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbbx5\" (UniqueName: \"kubernetes.io/projected/0f26ac0d-8683-415e-850c-5aef3da4b59f-kube-api-access-fbbx5\") pod \"image-pruner-29335680-fmgl6\" (UID: \"0f26ac0d-8683-415e-850c-5aef3da4b59f\") " pod="openshift-image-registry/image-pruner-29335680-fmgl6" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.539142 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0dfb25d-8789-4754-83a0-0e3ee1888e3e-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-lxzlb\" (UID: \"a0dfb25d-8789-4754-83a0-0e3ee1888e3e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-lxzlb" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.539189 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44zhq\" (UniqueName: \"kubernetes.io/projected/121b64d6-750f-4720-8297-6f3a91dc3a3a-kube-api-access-44zhq\") pod \"etcd-operator-b45778765-229tw\" (UID: \"121b64d6-750f-4720-8297-6f3a91dc3a3a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-229tw" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.539222 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8693817b-7cf6-486d-a055-93c4c0308d95-config\") pod \"controller-manager-879f6c89f-fv4x7\" (UID: \"8693817b-7cf6-486d-a055-93c4c0308d95\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fv4x7" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.539276 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8693817b-7cf6-486d-a055-93c4c0308d95-serving-cert\") pod \"controller-manager-879f6c89f-fv4x7\" (UID: \"8693817b-7cf6-486d-a055-93c4c0308d95\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fv4x7" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.539344 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/58a40e2a-9789-4d4c-8817-8aa7920baa39-etcd-client\") pod \"apiserver-7bbb656c7d-cgd8f\" (UID: \"58a40e2a-9789-4d4c-8817-8aa7920baa39\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cgd8f" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.539373 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/58a40e2a-9789-4d4c-8817-8aa7920baa39-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-cgd8f\" (UID: \"58a40e2a-9789-4d4c-8817-8aa7920baa39\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cgd8f" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.539426 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b5776f9a-8455-4c34-8496-0b4c4e821135-trusted-ca-bundle\") pod \"console-f9d7485db-468m5\" (UID: \"b5776f9a-8455-4c34-8496-0b4c4e821135\") " pod="openshift-console/console-f9d7485db-468m5" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.539467 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/149214a5-565a-452f-abbb-1479919b6104-service-ca-bundle\") pod \"authentication-operator-69f744f599-rtt9m\" (UID: \"149214a5-565a-452f-abbb-1479919b6104\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rtt9m" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 
00:54:09.539506 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3abfab8d-967c-42bf-9f48-bd69bbe6f8f2-encryption-config\") pod \"apiserver-76f77b778f-xwxfx\" (UID: \"3abfab8d-967c-42bf-9f48-bd69bbe6f8f2\") " pod="openshift-apiserver/apiserver-76f77b778f-xwxfx" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.539575 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/121b64d6-750f-4720-8297-6f3a91dc3a3a-etcd-client\") pod \"etcd-operator-b45778765-229tw\" (UID: \"121b64d6-750f-4720-8297-6f3a91dc3a3a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-229tw" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.539659 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8693817b-7cf6-486d-a055-93c4c0308d95-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-fv4x7\" (UID: \"8693817b-7cf6-486d-a055-93c4c0308d95\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fv4x7" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.539726 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/79462f0e-13e0-4ee7-af5f-02e6e5cd849d-images\") pod \"machine-api-operator-5694c8668f-tljzf\" (UID: \"79462f0e-13e0-4ee7-af5f-02e6e5cd849d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tljzf" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.539910 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/121b64d6-750f-4720-8297-6f3a91dc3a3a-serving-cert\") pod \"etcd-operator-b45778765-229tw\" (UID: 
\"121b64d6-750f-4720-8297-6f3a91dc3a3a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-229tw" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.540002 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.540014 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6qv9\" (UniqueName: \"kubernetes.io/projected/9b43d826-75ca-4c81-9f93-11b4398b96fa-kube-api-access-z6qv9\") pod \"machine-approver-56656f9798-p6l9k\" (UID: \"9b43d826-75ca-4c81-9f93-11b4398b96fa\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p6l9k" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.540042 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/58a40e2a-9789-4d4c-8817-8aa7920baa39-audit-policies\") pod \"apiserver-7bbb656c7d-cgd8f\" (UID: \"58a40e2a-9789-4d4c-8817-8aa7920baa39\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cgd8f" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.540079 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/58a40e2a-9789-4d4c-8817-8aa7920baa39-encryption-config\") pod \"apiserver-7bbb656c7d-cgd8f\" (UID: \"58a40e2a-9789-4d4c-8817-8aa7920baa39\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cgd8f" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.540099 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.540106 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qrxh\" (UniqueName: 
\"kubernetes.io/projected/8b3d2036-8156-4d9f-9e11-2a8133ab5295-kube-api-access-6qrxh\") pod \"console-operator-58897d9998-wlvjw\" (UID: \"8b3d2036-8156-4d9f-9e11-2a8133ab5295\") " pod="openshift-console-operator/console-operator-58897d9998-wlvjw" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.540130 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b5776f9a-8455-4c34-8496-0b4c4e821135-console-config\") pod \"console-f9d7485db-468m5\" (UID: \"b5776f9a-8455-4c34-8496-0b4c4e821135\") " pod="openshift-console/console-f9d7485db-468m5" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.540152 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/121b64d6-750f-4720-8297-6f3a91dc3a3a-etcd-ca\") pod \"etcd-operator-b45778765-229tw\" (UID: \"121b64d6-750f-4720-8297-6f3a91dc3a3a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-229tw" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.540173 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/79462f0e-13e0-4ee7-af5f-02e6e5cd849d-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-tljzf\" (UID: \"79462f0e-13e0-4ee7-af5f-02e6e5cd849d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tljzf" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.545576 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.545866 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.546065 4743 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.552122 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9w57f"] Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.559286 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rdwmd"] Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.559670 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29335680-fmgl6"] Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.559744 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-v98q2"] Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.560008 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9w57f" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.555474 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.560372 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-v98q2" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.560242 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-bt5pp"] Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.555593 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.560754 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rdwmd" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.561209 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-bt5pp" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.561319 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.546571 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.546619 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.546700 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.546777 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.546814 4743 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.546982 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.547025 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.547081 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.548129 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.550092 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.550768 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.553361 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.553406 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.553583 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.553610 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.554376 4743 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.555711 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.556594 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.546503 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.565190 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-bc6rx"] Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.565820 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-6srbj"] Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.566196 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-6srbj" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.566215 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bc6rx" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.566832 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rt2nd"] Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.567600 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rt2nd" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.568656 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-z2mc9"] Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.571257 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-wm7rl"] Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.571368 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-z2mc9" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.571940 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-wm7rl" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.578886 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cqmp6"] Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.579515 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-s8jrj"] Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.579944 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-s8jrj" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.580261 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-62mz2"] Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.580366 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cqmp6" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.580889 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-62mz2" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.587760 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-65mgg"] Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.589484 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-65mgg" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.591208 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29335725-pg67v"] Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.592705 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29335725-pg67v" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.595300 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ccjcp"] Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.595701 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.597666 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ccjcp" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.604761 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.613930 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.613913 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.614701 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.616878 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-4k997"] Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.617584 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-4k997" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.617932 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xgtfn"] Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.618325 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xgtfn" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.618883 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-wlvjw"] Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.624163 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.626320 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-xwxfx"] Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.627392 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-smtjk"] Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.628057 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-smtjk" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.628127 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-cgd8f"] Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.629365 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-fv4x7"] Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.630448 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-229tw"] Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.636081 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-wswrq"] Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.636972 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-qksm9"] Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.637897 4743 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f677q"] Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.638739 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.638901 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-24m6m"] Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.640553 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7f689d34-10fa-427c-8db0-cfc9324ae9de-metrics-tls\") pod \"dns-operator-744455d44c-9npzd\" (UID: \"7f689d34-10fa-427c-8db0-cfc9324ae9de\") " pod="openshift-dns-operator/dns-operator-744455d44c-9npzd" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.640581 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3abfab8d-967c-42bf-9f48-bd69bbe6f8f2-node-pullsecrets\") pod \"apiserver-76f77b778f-xwxfx\" (UID: \"3abfab8d-967c-42bf-9f48-bd69bbe6f8f2\") " pod="openshift-apiserver/apiserver-76f77b778f-xwxfx" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.640601 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/703e145f-c49a-40b7-b50f-a4902208d939-config\") pod \"openshift-apiserver-operator-796bbdcf4f-kzxfv\" (UID: \"703e145f-c49a-40b7-b50f-a4902208d939\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kzxfv" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.640618 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/61413bd4-c42d-4336-92db-d443ed8ea1de-serving-cert\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-lq56c\" (UID: \"61413bd4-c42d-4336-92db-d443ed8ea1de\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lq56c" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.640636 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/0c2c2460-fedf-4109-8fd9-986749f1e021-available-featuregates\") pod \"openshift-config-operator-7777fb866f-wswrq\" (UID: \"0c2c2460-fedf-4109-8fd9-986749f1e021\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wswrq" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.640654 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b3d2036-8156-4d9f-9e11-2a8133ab5295-config\") pod \"console-operator-58897d9998-wlvjw\" (UID: \"8b3d2036-8156-4d9f-9e11-2a8133ab5295\") " pod="openshift-console-operator/console-operator-58897d9998-wlvjw" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.640669 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3abfab8d-967c-42bf-9f48-bd69bbe6f8f2-etcd-client\") pod \"apiserver-76f77b778f-xwxfx\" (UID: \"3abfab8d-967c-42bf-9f48-bd69bbe6f8f2\") " pod="openshift-apiserver/apiserver-76f77b778f-xwxfx" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.640686 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd684c3f-8689-4d7b-ab5e-ff1c14ab9747-config\") pod \"kube-apiserver-operator-766d6c64bb-9w57f\" (UID: \"bd684c3f-8689-4d7b-ab5e-ff1c14ab9747\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9w57f" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.640704 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-s59h4\" (UniqueName: \"kubernetes.io/projected/61413bd4-c42d-4336-92db-d443ed8ea1de-kube-api-access-s59h4\") pod \"openshift-controller-manager-operator-756b6f6bc6-lq56c\" (UID: \"61413bd4-c42d-4336-92db-d443ed8ea1de\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lq56c" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.640728 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b5776f9a-8455-4c34-8496-0b4c4e821135-service-ca\") pod \"console-f9d7485db-468m5\" (UID: \"b5776f9a-8455-4c34-8496-0b4c4e821135\") " pod="openshift-console/console-f9d7485db-468m5" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.640791 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rt2nd"] Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.640743 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61413bd4-c42d-4336-92db-d443ed8ea1de-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-lq56c\" (UID: \"61413bd4-c42d-4336-92db-d443ed8ea1de\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lq56c" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.640830 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9x79\" (UniqueName: \"kubernetes.io/projected/79462f0e-13e0-4ee7-af5f-02e6e5cd849d-kube-api-access-l9x79\") pod \"machine-api-operator-5694c8668f-tljzf\" (UID: \"79462f0e-13e0-4ee7-af5f-02e6e5cd849d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tljzf" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.640870 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/e59e36c1-5053-44bc-bb35-cab447a646fb-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-gd5tn\" (UID: \"e59e36c1-5053-44bc-bb35-cab447a646fb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gd5tn" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.640890 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/121b64d6-750f-4720-8297-6f3a91dc3a3a-etcd-service-ca\") pod \"etcd-operator-b45778765-229tw\" (UID: \"121b64d6-750f-4720-8297-6f3a91dc3a3a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-229tw" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.640906 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7p6h\" (UniqueName: \"kubernetes.io/projected/8693817b-7cf6-486d-a055-93c4c0308d95-kube-api-access-f7p6h\") pod \"controller-manager-879f6c89f-fv4x7\" (UID: \"8693817b-7cf6-486d-a055-93c4c0308d95\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fv4x7" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.640923 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/149214a5-565a-452f-abbb-1479919b6104-serving-cert\") pod \"authentication-operator-69f744f599-rtt9m\" (UID: \"149214a5-565a-452f-abbb-1479919b6104\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rtt9m" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.640939 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bn665\" (UniqueName: \"kubernetes.io/projected/149214a5-565a-452f-abbb-1479919b6104-kube-api-access-bn665\") pod \"authentication-operator-69f744f599-rtt9m\" (UID: \"149214a5-565a-452f-abbb-1479919b6104\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-rtt9m" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.640956 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/3abfab8d-967c-42bf-9f48-bd69bbe6f8f2-audit\") pod \"apiserver-76f77b778f-xwxfx\" (UID: \"3abfab8d-967c-42bf-9f48-bd69bbe6f8f2\") " pod="openshift-apiserver/apiserver-76f77b778f-xwxfx" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.640974 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e59e36c1-5053-44bc-bb35-cab447a646fb-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-gd5tn\" (UID: \"e59e36c1-5053-44bc-bb35-cab447a646fb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gd5tn" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.640991 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd684c3f-8689-4d7b-ab5e-ff1c14ab9747-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-9w57f\" (UID: \"bd684c3f-8689-4d7b-ab5e-ff1c14ab9747\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9w57f" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.641008 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rxkv\" (UniqueName: \"kubernetes.io/projected/b5776f9a-8455-4c34-8496-0b4c4e821135-kube-api-access-6rxkv\") pod \"console-f9d7485db-468m5\" (UID: \"b5776f9a-8455-4c34-8496-0b4c4e821135\") " pod="openshift-console/console-f9d7485db-468m5" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.641027 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/b5776f9a-8455-4c34-8496-0b4c4e821135-oauth-serving-cert\") pod \"console-f9d7485db-468m5\" (UID: \"b5776f9a-8455-4c34-8496-0b4c4e821135\") " pod="openshift-console/console-f9d7485db-468m5" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.641042 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/3abfab8d-967c-42bf-9f48-bd69bbe6f8f2-image-import-ca\") pod \"apiserver-76f77b778f-xwxfx\" (UID: \"3abfab8d-967c-42bf-9f48-bd69bbe6f8f2\") " pod="openshift-apiserver/apiserver-76f77b778f-xwxfx" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.641059 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/9b43d826-75ca-4c81-9f93-11b4398b96fa-machine-approver-tls\") pod \"machine-approver-56656f9798-p6l9k\" (UID: \"9b43d826-75ca-4c81-9f93-11b4398b96fa\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p6l9k" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.641076 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4v7x\" (UniqueName: \"kubernetes.io/projected/58a40e2a-9789-4d4c-8817-8aa7920baa39-kube-api-access-l4v7x\") pod \"apiserver-7bbb656c7d-cgd8f\" (UID: \"58a40e2a-9789-4d4c-8817-8aa7920baa39\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cgd8f" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.641091 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3abfab8d-967c-42bf-9f48-bd69bbe6f8f2-config\") pod \"apiserver-76f77b778f-xwxfx\" (UID: \"3abfab8d-967c-42bf-9f48-bd69bbe6f8f2\") " pod="openshift-apiserver/apiserver-76f77b778f-xwxfx" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.641106 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e59e36c1-5053-44bc-bb35-cab447a646fb-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-gd5tn\" (UID: \"e59e36c1-5053-44bc-bb35-cab447a646fb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gd5tn" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.641129 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9b43d826-75ca-4c81-9f93-11b4398b96fa-auth-proxy-config\") pod \"machine-approver-56656f9798-p6l9k\" (UID: \"9b43d826-75ca-4c81-9f93-11b4398b96fa\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p6l9k" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.641150 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b3d2036-8156-4d9f-9e11-2a8133ab5295-serving-cert\") pod \"console-operator-58897d9998-wlvjw\" (UID: \"8b3d2036-8156-4d9f-9e11-2a8133ab5295\") " pod="openshift-console-operator/console-operator-58897d9998-wlvjw" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.641166 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9wfk\" (UniqueName: \"kubernetes.io/projected/e8a051b9-029d-4b92-a9a1-380c8d18f051-kube-api-access-l9wfk\") pod \"downloads-7954f5f757-qksm9\" (UID: \"e8a051b9-029d-4b92-a9a1-380c8d18f051\") " pod="openshift-console/downloads-7954f5f757-qksm9" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.641181 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b43d826-75ca-4c81-9f93-11b4398b96fa-config\") pod \"machine-approver-56656f9798-p6l9k\" (UID: \"9b43d826-75ca-4c81-9f93-11b4398b96fa\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p6l9k" Oct 11 00:54:09 crc kubenswrapper[4743]: 
I1011 00:54:09.641196 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0f26ac0d-8683-415e-850c-5aef3da4b59f-serviceca\") pod \"image-pruner-29335680-fmgl6\" (UID: \"0f26ac0d-8683-415e-850c-5aef3da4b59f\") " pod="openshift-image-registry/image-pruner-29335680-fmgl6" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.641211 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r962j\" (UniqueName: \"kubernetes.io/projected/0c2c2460-fedf-4109-8fd9-986749f1e021-kube-api-access-r962j\") pod \"openshift-config-operator-7777fb866f-wswrq\" (UID: \"0c2c2460-fedf-4109-8fd9-986749f1e021\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wswrq" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.641229 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/149214a5-565a-452f-abbb-1479919b6104-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-rtt9m\" (UID: \"149214a5-565a-452f-abbb-1479919b6104\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rtt9m" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.641248 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b5776f9a-8455-4c34-8496-0b4c4e821135-console-serving-cert\") pod \"console-f9d7485db-468m5\" (UID: \"b5776f9a-8455-4c34-8496-0b4c4e821135\") " pod="openshift-console/console-f9d7485db-468m5" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.641262 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3abfab8d-967c-42bf-9f48-bd69bbe6f8f2-audit-dir\") pod \"apiserver-76f77b778f-xwxfx\" (UID: \"3abfab8d-967c-42bf-9f48-bd69bbe6f8f2\") " 
pod="openshift-apiserver/apiserver-76f77b778f-xwxfx" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.641281 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glbks\" (UniqueName: \"kubernetes.io/projected/3abfab8d-967c-42bf-9f48-bd69bbe6f8f2-kube-api-access-glbks\") pod \"apiserver-76f77b778f-xwxfx\" (UID: \"3abfab8d-967c-42bf-9f48-bd69bbe6f8f2\") " pod="openshift-apiserver/apiserver-76f77b778f-xwxfx" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.641297 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8693817b-7cf6-486d-a055-93c4c0308d95-client-ca\") pod \"controller-manager-879f6c89f-fv4x7\" (UID: \"8693817b-7cf6-486d-a055-93c4c0308d95\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fv4x7" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.641313 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/58a40e2a-9789-4d4c-8817-8aa7920baa39-audit-dir\") pod \"apiserver-7bbb656c7d-cgd8f\" (UID: \"58a40e2a-9789-4d4c-8817-8aa7920baa39\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cgd8f" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.641330 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8b3d2036-8156-4d9f-9e11-2a8133ab5295-trusted-ca\") pod \"console-operator-58897d9998-wlvjw\" (UID: \"8b3d2036-8156-4d9f-9e11-2a8133ab5295\") " pod="openshift-console-operator/console-operator-58897d9998-wlvjw" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.641344 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3abfab8d-967c-42bf-9f48-bd69bbe6f8f2-trusted-ca-bundle\") pod \"apiserver-76f77b778f-xwxfx\" (UID: 
\"3abfab8d-967c-42bf-9f48-bd69bbe6f8f2\") " pod="openshift-apiserver/apiserver-76f77b778f-xwxfx" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.641361 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhlsf\" (UniqueName: \"kubernetes.io/projected/703e145f-c49a-40b7-b50f-a4902208d939-kube-api-access-xhlsf\") pod \"openshift-apiserver-operator-796bbdcf4f-kzxfv\" (UID: \"703e145f-c49a-40b7-b50f-a4902208d939\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kzxfv" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.641375 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58a40e2a-9789-4d4c-8817-8aa7920baa39-serving-cert\") pod \"apiserver-7bbb656c7d-cgd8f\" (UID: \"58a40e2a-9789-4d4c-8817-8aa7920baa39\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cgd8f" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.641390 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3abfab8d-967c-42bf-9f48-bd69bbe6f8f2-etcd-serving-ca\") pod \"apiserver-76f77b778f-xwxfx\" (UID: \"3abfab8d-967c-42bf-9f48-bd69bbe6f8f2\") " pod="openshift-apiserver/apiserver-76f77b778f-xwxfx" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.641406 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxhz4\" (UniqueName: \"kubernetes.io/projected/a0dfb25d-8789-4754-83a0-0e3ee1888e3e-kube-api-access-nxhz4\") pod \"cluster-samples-operator-665b6dd947-lxzlb\" (UID: \"a0dfb25d-8789-4754-83a0-0e3ee1888e3e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-lxzlb" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.641421 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/0c2c2460-fedf-4109-8fd9-986749f1e021-serving-cert\") pod \"openshift-config-operator-7777fb866f-wswrq\" (UID: \"0c2c2460-fedf-4109-8fd9-986749f1e021\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wswrq" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.641436 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58a40e2a-9789-4d4c-8817-8aa7920baa39-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-cgd8f\" (UID: \"58a40e2a-9789-4d4c-8817-8aa7920baa39\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cgd8f" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.641452 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6l7ks\" (UniqueName: \"kubernetes.io/projected/e59e36c1-5053-44bc-bb35-cab447a646fb-kube-api-access-6l7ks\") pod \"cluster-image-registry-operator-dc59b4c8b-gd5tn\" (UID: \"e59e36c1-5053-44bc-bb35-cab447a646fb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gd5tn" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.641469 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b5776f9a-8455-4c34-8496-0b4c4e821135-console-oauth-config\") pod \"console-f9d7485db-468m5\" (UID: \"b5776f9a-8455-4c34-8496-0b4c4e821135\") " pod="openshift-console/console-f9d7485db-468m5" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.641484 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3abfab8d-967c-42bf-9f48-bd69bbe6f8f2-serving-cert\") pod \"apiserver-76f77b778f-xwxfx\" (UID: \"3abfab8d-967c-42bf-9f48-bd69bbe6f8f2\") " pod="openshift-apiserver/apiserver-76f77b778f-xwxfx" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.641500 4743 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/149214a5-565a-452f-abbb-1479919b6104-config\") pod \"authentication-operator-69f744f599-rtt9m\" (UID: \"149214a5-565a-452f-abbb-1479919b6104\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rtt9m" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.641517 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxpbw\" (UniqueName: \"kubernetes.io/projected/7f689d34-10fa-427c-8db0-cfc9324ae9de-kube-api-access-gxpbw\") pod \"dns-operator-744455d44c-9npzd\" (UID: \"7f689d34-10fa-427c-8db0-cfc9324ae9de\") " pod="openshift-dns-operator/dns-operator-744455d44c-9npzd" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.641534 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/703e145f-c49a-40b7-b50f-a4902208d939-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-kzxfv\" (UID: \"703e145f-c49a-40b7-b50f-a4902208d939\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kzxfv" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.641550 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79462f0e-13e0-4ee7-af5f-02e6e5cd849d-config\") pod \"machine-api-operator-5694c8668f-tljzf\" (UID: \"79462f0e-13e0-4ee7-af5f-02e6e5cd849d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tljzf" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.641574 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/121b64d6-750f-4720-8297-6f3a91dc3a3a-config\") pod \"etcd-operator-b45778765-229tw\" (UID: \"121b64d6-750f-4720-8297-6f3a91dc3a3a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-229tw" 
Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.641589 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbbx5\" (UniqueName: \"kubernetes.io/projected/0f26ac0d-8683-415e-850c-5aef3da4b59f-kube-api-access-fbbx5\") pod \"image-pruner-29335680-fmgl6\" (UID: \"0f26ac0d-8683-415e-850c-5aef3da4b59f\") " pod="openshift-image-registry/image-pruner-29335680-fmgl6" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.641605 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0dfb25d-8789-4754-83a0-0e3ee1888e3e-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-lxzlb\" (UID: \"a0dfb25d-8789-4754-83a0-0e3ee1888e3e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-lxzlb" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.641621 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44zhq\" (UniqueName: \"kubernetes.io/projected/121b64d6-750f-4720-8297-6f3a91dc3a3a-kube-api-access-44zhq\") pod \"etcd-operator-b45778765-229tw\" (UID: \"121b64d6-750f-4720-8297-6f3a91dc3a3a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-229tw" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.641618 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lq56c"] Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.641636 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8693817b-7cf6-486d-a055-93c4c0308d95-config\") pod \"controller-manager-879f6c89f-fv4x7\" (UID: \"8693817b-7cf6-486d-a055-93c4c0308d95\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fv4x7" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.641652 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/58a40e2a-9789-4d4c-8817-8aa7920baa39-etcd-client\") pod \"apiserver-7bbb656c7d-cgd8f\" (UID: \"58a40e2a-9789-4d4c-8817-8aa7920baa39\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cgd8f" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.641667 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/58a40e2a-9789-4d4c-8817-8aa7920baa39-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-cgd8f\" (UID: \"58a40e2a-9789-4d4c-8817-8aa7920baa39\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cgd8f" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.641682 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8693817b-7cf6-486d-a055-93c4c0308d95-serving-cert\") pod \"controller-manager-879f6c89f-fv4x7\" (UID: \"8693817b-7cf6-486d-a055-93c4c0308d95\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fv4x7" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.641682 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/0c2c2460-fedf-4109-8fd9-986749f1e021-available-featuregates\") pod \"openshift-config-operator-7777fb866f-wswrq\" (UID: \"0c2c2460-fedf-4109-8fd9-986749f1e021\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wswrq" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.641697 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bd684c3f-8689-4d7b-ab5e-ff1c14ab9747-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-9w57f\" (UID: \"bd684c3f-8689-4d7b-ab5e-ff1c14ab9747\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9w57f" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.641713 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/149214a5-565a-452f-abbb-1479919b6104-service-ca-bundle\") pod \"authentication-operator-69f744f599-rtt9m\" (UID: \"149214a5-565a-452f-abbb-1479919b6104\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rtt9m" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.641730 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3abfab8d-967c-42bf-9f48-bd69bbe6f8f2-encryption-config\") pod \"apiserver-76f77b778f-xwxfx\" (UID: \"3abfab8d-967c-42bf-9f48-bd69bbe6f8f2\") " pod="openshift-apiserver/apiserver-76f77b778f-xwxfx" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.641745 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b5776f9a-8455-4c34-8496-0b4c4e821135-trusted-ca-bundle\") pod \"console-f9d7485db-468m5\" (UID: \"b5776f9a-8455-4c34-8496-0b4c4e821135\") " pod="openshift-console/console-f9d7485db-468m5" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.641762 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/121b64d6-750f-4720-8297-6f3a91dc3a3a-etcd-client\") pod \"etcd-operator-b45778765-229tw\" (UID: \"121b64d6-750f-4720-8297-6f3a91dc3a3a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-229tw" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.641778 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8693817b-7cf6-486d-a055-93c4c0308d95-proxy-ca-bundles\") pod 
\"controller-manager-879f6c89f-fv4x7\" (UID: \"8693817b-7cf6-486d-a055-93c4c0308d95\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fv4x7" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.641817 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/79462f0e-13e0-4ee7-af5f-02e6e5cd849d-images\") pod \"machine-api-operator-5694c8668f-tljzf\" (UID: \"79462f0e-13e0-4ee7-af5f-02e6e5cd849d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tljzf" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.641844 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/121b64d6-750f-4720-8297-6f3a91dc3a3a-serving-cert\") pod \"etcd-operator-b45778765-229tw\" (UID: \"121b64d6-750f-4720-8297-6f3a91dc3a3a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-229tw" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.641875 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6qv9\" (UniqueName: \"kubernetes.io/projected/9b43d826-75ca-4c81-9f93-11b4398b96fa-kube-api-access-z6qv9\") pod \"machine-approver-56656f9798-p6l9k\" (UID: \"9b43d826-75ca-4c81-9f93-11b4398b96fa\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p6l9k" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.641890 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/58a40e2a-9789-4d4c-8817-8aa7920baa39-audit-policies\") pod \"apiserver-7bbb656c7d-cgd8f\" (UID: \"58a40e2a-9789-4d4c-8817-8aa7920baa39\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cgd8f" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.641909 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/58a40e2a-9789-4d4c-8817-8aa7920baa39-encryption-config\") pod \"apiserver-7bbb656c7d-cgd8f\" (UID: \"58a40e2a-9789-4d4c-8817-8aa7920baa39\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cgd8f" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.641924 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qrxh\" (UniqueName: \"kubernetes.io/projected/8b3d2036-8156-4d9f-9e11-2a8133ab5295-kube-api-access-6qrxh\") pod \"console-operator-58897d9998-wlvjw\" (UID: \"8b3d2036-8156-4d9f-9e11-2a8133ab5295\") " pod="openshift-console-operator/console-operator-58897d9998-wlvjw" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.641940 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b5776f9a-8455-4c34-8496-0b4c4e821135-console-config\") pod \"console-f9d7485db-468m5\" (UID: \"b5776f9a-8455-4c34-8496-0b4c4e821135\") " pod="openshift-console/console-f9d7485db-468m5" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.641954 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/121b64d6-750f-4720-8297-6f3a91dc3a3a-etcd-ca\") pod \"etcd-operator-b45778765-229tw\" (UID: \"121b64d6-750f-4720-8297-6f3a91dc3a3a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-229tw" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.641969 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/79462f0e-13e0-4ee7-af5f-02e6e5cd849d-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-tljzf\" (UID: \"79462f0e-13e0-4ee7-af5f-02e6e5cd849d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tljzf" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.642201 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/703e145f-c49a-40b7-b50f-a4902208d939-config\") pod \"openshift-apiserver-operator-796bbdcf4f-kzxfv\" (UID: \"703e145f-c49a-40b7-b50f-a4902208d939\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kzxfv" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.642346 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b3d2036-8156-4d9f-9e11-2a8133ab5295-config\") pod \"console-operator-58897d9998-wlvjw\" (UID: \"8b3d2036-8156-4d9f-9e11-2a8133ab5295\") " pod="openshift-console-operator/console-operator-58897d9998-wlvjw" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.642998 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58a40e2a-9789-4d4c-8817-8aa7920baa39-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-cgd8f\" (UID: \"58a40e2a-9789-4d4c-8817-8aa7920baa39\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cgd8f" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.643561 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/3abfab8d-967c-42bf-9f48-bd69bbe6f8f2-audit\") pod \"apiserver-76f77b778f-xwxfx\" (UID: \"3abfab8d-967c-42bf-9f48-bd69bbe6f8f2\") " pod="openshift-apiserver/apiserver-76f77b778f-xwxfx" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.643828 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/121b64d6-750f-4720-8297-6f3a91dc3a3a-etcd-service-ca\") pod \"etcd-operator-b45778765-229tw\" (UID: \"121b64d6-750f-4720-8297-6f3a91dc3a3a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-229tw" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.643890 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" 
(UniqueName: \"kubernetes.io/configmap/b5776f9a-8455-4c34-8496-0b4c4e821135-service-ca\") pod \"console-f9d7485db-468m5\" (UID: \"b5776f9a-8455-4c34-8496-0b4c4e821135\") " pod="openshift-console/console-f9d7485db-468m5" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.644261 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b5776f9a-8455-4c34-8496-0b4c4e821135-oauth-serving-cert\") pod \"console-f9d7485db-468m5\" (UID: \"b5776f9a-8455-4c34-8496-0b4c4e821135\") " pod="openshift-console/console-f9d7485db-468m5" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.644395 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61413bd4-c42d-4336-92db-d443ed8ea1de-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-lq56c\" (UID: \"61413bd4-c42d-4336-92db-d443ed8ea1de\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lq56c" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.644986 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/3abfab8d-967c-42bf-9f48-bd69bbe6f8f2-image-import-ca\") pod \"apiserver-76f77b778f-xwxfx\" (UID: \"3abfab8d-967c-42bf-9f48-bd69bbe6f8f2\") " pod="openshift-apiserver/apiserver-76f77b778f-xwxfx" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.645147 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/79462f0e-13e0-4ee7-af5f-02e6e5cd849d-images\") pod \"machine-api-operator-5694c8668f-tljzf\" (UID: \"79462f0e-13e0-4ee7-af5f-02e6e5cd849d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tljzf" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.646567 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/b5776f9a-8455-4c34-8496-0b4c4e821135-trusted-ca-bundle\") pod \"console-f9d7485db-468m5\" (UID: \"b5776f9a-8455-4c34-8496-0b4c4e821135\") " pod="openshift-console/console-f9d7485db-468m5" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.646905 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3abfab8d-967c-42bf-9f48-bd69bbe6f8f2-node-pullsecrets\") pod \"apiserver-76f77b778f-xwxfx\" (UID: \"3abfab8d-967c-42bf-9f48-bd69bbe6f8f2\") " pod="openshift-apiserver/apiserver-76f77b778f-xwxfx" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.647361 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-85wtv"] Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.647390 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kzxfv"] Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.647400 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rdwmd"] Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.647803 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3abfab8d-967c-42bf-9f48-bd69bbe6f8f2-etcd-client\") pod \"apiserver-76f77b778f-xwxfx\" (UID: \"3abfab8d-967c-42bf-9f48-bd69bbe6f8f2\") " pod="openshift-apiserver/apiserver-76f77b778f-xwxfx" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.649391 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/121b64d6-750f-4720-8297-6f3a91dc3a3a-etcd-client\") pod \"etcd-operator-b45778765-229tw\" (UID: \"121b64d6-750f-4720-8297-6f3a91dc3a3a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-229tw" 
Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.649688 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/121b64d6-750f-4720-8297-6f3a91dc3a3a-serving-cert\") pod \"etcd-operator-b45778765-229tw\" (UID: \"121b64d6-750f-4720-8297-6f3a91dc3a3a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-229tw" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.649909 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b5776f9a-8455-4c34-8496-0b4c4e821135-console-config\") pod \"console-f9d7485db-468m5\" (UID: \"b5776f9a-8455-4c34-8496-0b4c4e821135\") " pod="openshift-console/console-f9d7485db-468m5" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.650307 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/121b64d6-750f-4720-8297-6f3a91dc3a3a-etcd-ca\") pod \"etcd-operator-b45778765-229tw\" (UID: \"121b64d6-750f-4720-8297-6f3a91dc3a3a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-229tw" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.650341 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-f2gjt"] Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.650630 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-468m5"] Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.650825 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/121b64d6-750f-4720-8297-6f3a91dc3a3a-config\") pod \"etcd-operator-b45778765-229tw\" (UID: \"121b64d6-750f-4720-8297-6f3a91dc3a3a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-229tw" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.651432 4743 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c2c2460-fedf-4109-8fd9-986749f1e021-serving-cert\") pod \"openshift-config-operator-7777fb866f-wswrq\" (UID: \"0c2c2460-fedf-4109-8fd9-986749f1e021\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wswrq" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.651623 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-wm7rl"] Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.652148 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/79462f0e-13e0-4ee7-af5f-02e6e5cd849d-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-tljzf\" (UID: \"79462f0e-13e0-4ee7-af5f-02e6e5cd849d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tljzf" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.653145 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8693817b-7cf6-486d-a055-93c4c0308d95-serving-cert\") pod \"controller-manager-879f6c89f-fv4x7\" (UID: \"8693817b-7cf6-486d-a055-93c4c0308d95\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fv4x7" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.653445 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0dfb25d-8789-4754-83a0-0e3ee1888e3e-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-lxzlb\" (UID: \"a0dfb25d-8789-4754-83a0-0e3ee1888e3e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-lxzlb" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.653751 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/149214a5-565a-452f-abbb-1479919b6104-serving-cert\") pod \"authentication-operator-69f744f599-rtt9m\" (UID: \"149214a5-565a-452f-abbb-1479919b6104\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rtt9m" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.654030 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3abfab8d-967c-42bf-9f48-bd69bbe6f8f2-encryption-config\") pod \"apiserver-76f77b778f-xwxfx\" (UID: \"3abfab8d-967c-42bf-9f48-bd69bbe6f8f2\") " pod="openshift-apiserver/apiserver-76f77b778f-xwxfx" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.654039 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7f689d34-10fa-427c-8db0-cfc9324ae9de-metrics-tls\") pod \"dns-operator-744455d44c-9npzd\" (UID: \"7f689d34-10fa-427c-8db0-cfc9324ae9de\") " pod="openshift-dns-operator/dns-operator-744455d44c-9npzd" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.654753 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-rtt9m"] Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.654756 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/9b43d826-75ca-4c81-9f93-11b4398b96fa-machine-approver-tls\") pod \"machine-approver-56656f9798-p6l9k\" (UID: \"9b43d826-75ca-4c81-9f93-11b4398b96fa\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p6l9k" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.654849 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3abfab8d-967c-42bf-9f48-bd69bbe6f8f2-config\") pod \"apiserver-76f77b778f-xwxfx\" (UID: \"3abfab8d-967c-42bf-9f48-bd69bbe6f8f2\") " 
pod="openshift-apiserver/apiserver-76f77b778f-xwxfx" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.654990 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8693817b-7cf6-486d-a055-93c4c0308d95-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-fv4x7\" (UID: \"8693817b-7cf6-486d-a055-93c4c0308d95\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fv4x7" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.655280 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9b43d826-75ca-4c81-9f93-11b4398b96fa-auth-proxy-config\") pod \"machine-approver-56656f9798-p6l9k\" (UID: \"9b43d826-75ca-4c81-9f93-11b4398b96fa\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p6l9k" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.655381 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/149214a5-565a-452f-abbb-1479919b6104-service-ca-bundle\") pod \"authentication-operator-69f744f599-rtt9m\" (UID: \"149214a5-565a-452f-abbb-1479919b6104\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rtt9m" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.655602 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b43d826-75ca-4c81-9f93-11b4398b96fa-config\") pod \"machine-approver-56656f9798-p6l9k\" (UID: \"9b43d826-75ca-4c81-9f93-11b4398b96fa\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p6l9k" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.656469 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3abfab8d-967c-42bf-9f48-bd69bbe6f8f2-serving-cert\") pod 
\"apiserver-76f77b778f-xwxfx\" (UID: \"3abfab8d-967c-42bf-9f48-bd69bbe6f8f2\") " pod="openshift-apiserver/apiserver-76f77b778f-xwxfx" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.656697 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/149214a5-565a-452f-abbb-1479919b6104-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-rtt9m\" (UID: \"149214a5-565a-452f-abbb-1479919b6104\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rtt9m" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.657111 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-62mz2"] Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.657139 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-9npzd"] Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.657164 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3abfab8d-967c-42bf-9f48-bd69bbe6f8f2-audit-dir\") pod \"apiserver-76f77b778f-xwxfx\" (UID: \"3abfab8d-967c-42bf-9f48-bd69bbe6f8f2\") " pod="openshift-apiserver/apiserver-76f77b778f-xwxfx" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.657767 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8693817b-7cf6-486d-a055-93c4c0308d95-client-ca\") pod \"controller-manager-879f6c89f-fv4x7\" (UID: \"8693817b-7cf6-486d-a055-93c4c0308d95\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fv4x7" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.657801 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/58a40e2a-9789-4d4c-8817-8aa7920baa39-audit-dir\") pod 
\"apiserver-7bbb656c7d-cgd8f\" (UID: \"58a40e2a-9789-4d4c-8817-8aa7920baa39\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cgd8f" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.658116 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b5776f9a-8455-4c34-8496-0b4c4e821135-console-serving-cert\") pod \"console-f9d7485db-468m5\" (UID: \"b5776f9a-8455-4c34-8496-0b4c4e821135\") " pod="openshift-console/console-f9d7485db-468m5" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.658465 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0f26ac0d-8683-415e-850c-5aef3da4b59f-serviceca\") pod \"image-pruner-29335680-fmgl6\" (UID: \"0f26ac0d-8683-415e-850c-5aef3da4b59f\") " pod="openshift-image-registry/image-pruner-29335680-fmgl6" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.658753 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8b3d2036-8156-4d9f-9e11-2a8133ab5295-trusted-ca\") pod \"console-operator-58897d9998-wlvjw\" (UID: \"8b3d2036-8156-4d9f-9e11-2a8133ab5295\") " pod="openshift-console-operator/console-operator-58897d9998-wlvjw" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.659157 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3abfab8d-967c-42bf-9f48-bd69bbe6f8f2-trusted-ca-bundle\") pod \"apiserver-76f77b778f-xwxfx\" (UID: \"3abfab8d-967c-42bf-9f48-bd69bbe6f8f2\") " pod="openshift-apiserver/apiserver-76f77b778f-xwxfx" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.659486 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79462f0e-13e0-4ee7-af5f-02e6e5cd849d-config\") pod \"machine-api-operator-5694c8668f-tljzf\" (UID: 
\"79462f0e-13e0-4ee7-af5f-02e6e5cd849d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tljzf" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.659729 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8693817b-7cf6-486d-a055-93c4c0308d95-config\") pod \"controller-manager-879f6c89f-fv4x7\" (UID: \"8693817b-7cf6-486d-a055-93c4c0308d95\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fv4x7" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.660044 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/58a40e2a-9789-4d4c-8817-8aa7920baa39-audit-policies\") pod \"apiserver-7bbb656c7d-cgd8f\" (UID: \"58a40e2a-9789-4d4c-8817-8aa7920baa39\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cgd8f" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.660046 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-6zk27"] Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.660100 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b3d2036-8156-4d9f-9e11-2a8133ab5295-serving-cert\") pod \"console-operator-58897d9998-wlvjw\" (UID: \"8b3d2036-8156-4d9f-9e11-2a8133ab5295\") " pod="openshift-console-operator/console-operator-58897d9998-wlvjw" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.660232 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3abfab8d-967c-42bf-9f48-bd69bbe6f8f2-etcd-serving-ca\") pod \"apiserver-76f77b778f-xwxfx\" (UID: \"3abfab8d-967c-42bf-9f48-bd69bbe6f8f2\") " pod="openshift-apiserver/apiserver-76f77b778f-xwxfx" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.660578 4743 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/58a40e2a-9789-4d4c-8817-8aa7920baa39-etcd-client\") pod \"apiserver-7bbb656c7d-cgd8f\" (UID: \"58a40e2a-9789-4d4c-8817-8aa7920baa39\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cgd8f" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.660656 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-6zk27" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.660831 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/149214a5-565a-452f-abbb-1479919b6104-config\") pod \"authentication-operator-69f744f599-rtt9m\" (UID: \"149214a5-565a-452f-abbb-1479919b6104\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rtt9m" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.661174 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/58a40e2a-9789-4d4c-8817-8aa7920baa39-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-cgd8f\" (UID: \"58a40e2a-9789-4d4c-8817-8aa7920baa39\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cgd8f" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.662567 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/61413bd4-c42d-4336-92db-d443ed8ea1de-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-lq56c\" (UID: \"61413bd4-c42d-4336-92db-d443ed8ea1de\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lq56c" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.662848 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/58a40e2a-9789-4d4c-8817-8aa7920baa39-encryption-config\") pod 
\"apiserver-7bbb656c7d-cgd8f\" (UID: \"58a40e2a-9789-4d4c-8817-8aa7920baa39\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cgd8f" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.663638 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/703e145f-c49a-40b7-b50f-a4902208d939-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-kzxfv\" (UID: \"703e145f-c49a-40b7-b50f-a4902208d939\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kzxfv" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.663782 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.663989 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-grqvq"] Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.665018 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-grqvq" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.665123 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-sjtsw"] Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.667821 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b5776f9a-8455-4c34-8496-0b4c4e821135-console-oauth-config\") pod \"console-f9d7485db-468m5\" (UID: \"b5776f9a-8455-4c34-8496-0b4c4e821135\") " pod="openshift-console/console-f9d7485db-468m5" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.667877 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-s8jrj"] Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.667894 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cqmp6"] Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.670458 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-bc6rx"] Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.670489 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-z2mc9"] Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.670505 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-smtjk"] Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.671042 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58a40e2a-9789-4d4c-8817-8aa7920baa39-serving-cert\") pod \"apiserver-7bbb656c7d-cgd8f\" (UID: \"58a40e2a-9789-4d4c-8817-8aa7920baa39\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cgd8f" Oct 11 00:54:09 crc 
kubenswrapper[4743]: I1011 00:54:09.672667 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29335725-pg67v"] Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.673464 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ccjcp"] Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.675600 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-bt5pp"] Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.676608 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gd5tn"] Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.678174 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-65mgg"] Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.679193 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.680751 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-tljzf"] Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.682771 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-v98q2"] Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.686320 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9w57f"] Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.692913 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-4k997"] Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.694127 4743 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xgtfn"] Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.696011 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-grqvq"] Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.699416 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.699927 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-mlnnl"] Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.703311 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-mlnnl"] Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.703481 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-mlnnl" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.719070 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.738401 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.742769 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6l7ks\" (UniqueName: \"kubernetes.io/projected/e59e36c1-5053-44bc-bb35-cab447a646fb-kube-api-access-6l7ks\") pod \"cluster-image-registry-operator-dc59b4c8b-gd5tn\" (UID: \"e59e36c1-5053-44bc-bb35-cab447a646fb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gd5tn" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.743012 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bd684c3f-8689-4d7b-ab5e-ff1c14ab9747-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-9w57f\" (UID: \"bd684c3f-8689-4d7b-ab5e-ff1c14ab9747\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9w57f" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.743085 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd684c3f-8689-4d7b-ab5e-ff1c14ab9747-config\") pod \"kube-apiserver-operator-766d6c64bb-9w57f\" (UID: \"bd684c3f-8689-4d7b-ab5e-ff1c14ab9747\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9w57f" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.743136 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/e59e36c1-5053-44bc-bb35-cab447a646fb-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-gd5tn\" (UID: \"e59e36c1-5053-44bc-bb35-cab447a646fb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gd5tn" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.743166 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e59e36c1-5053-44bc-bb35-cab447a646fb-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-gd5tn\" (UID: \"e59e36c1-5053-44bc-bb35-cab447a646fb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gd5tn" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.743191 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd684c3f-8689-4d7b-ab5e-ff1c14ab9747-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-9w57f\" (UID: \"bd684c3f-8689-4d7b-ab5e-ff1c14ab9747\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9w57f" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.743224 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e59e36c1-5053-44bc-bb35-cab447a646fb-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-gd5tn\" (UID: \"e59e36c1-5053-44bc-bb35-cab447a646fb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gd5tn" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.744524 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e59e36c1-5053-44bc-bb35-cab447a646fb-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-gd5tn\" (UID: \"e59e36c1-5053-44bc-bb35-cab447a646fb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gd5tn" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.746319 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/e59e36c1-5053-44bc-bb35-cab447a646fb-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-gd5tn\" (UID: \"e59e36c1-5053-44bc-bb35-cab447a646fb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gd5tn" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.759189 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.779239 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.799314 4743 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.818603 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.839380 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.864422 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.894707 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.898573 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.919206 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.938777 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.959267 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.979264 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Oct 11 00:54:09 crc kubenswrapper[4743]: I1011 00:54:09.999111 4743 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Oct 11 00:54:10 crc kubenswrapper[4743]: I1011 00:54:10.026372 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Oct 11 00:54:10 crc kubenswrapper[4743]: I1011 00:54:10.040426 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Oct 11 00:54:10 crc kubenswrapper[4743]: I1011 00:54:10.059357 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Oct 11 00:54:10 crc kubenswrapper[4743]: I1011 00:54:10.079226 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Oct 11 00:54:10 crc kubenswrapper[4743]: I1011 00:54:10.099454 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Oct 11 00:54:10 crc kubenswrapper[4743]: I1011 00:54:10.120205 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Oct 11 00:54:10 crc kubenswrapper[4743]: I1011 00:54:10.139211 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Oct 11 00:54:10 crc kubenswrapper[4743]: I1011 00:54:10.154469 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd684c3f-8689-4d7b-ab5e-ff1c14ab9747-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-9w57f\" (UID: \"bd684c3f-8689-4d7b-ab5e-ff1c14ab9747\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9w57f" Oct 11 00:54:10 crc kubenswrapper[4743]: I1011 00:54:10.160545 4743 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Oct 11 00:54:10 crc kubenswrapper[4743]: I1011 00:54:10.165515 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd684c3f-8689-4d7b-ab5e-ff1c14ab9747-config\") pod \"kube-apiserver-operator-766d6c64bb-9w57f\" (UID: \"bd684c3f-8689-4d7b-ab5e-ff1c14ab9747\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9w57f" Oct 11 00:54:10 crc kubenswrapper[4743]: I1011 00:54:10.179318 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Oct 11 00:54:10 crc kubenswrapper[4743]: I1011 00:54:10.199585 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Oct 11 00:54:10 crc kubenswrapper[4743]: I1011 00:54:10.219238 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Oct 11 00:54:10 crc kubenswrapper[4743]: I1011 00:54:10.239427 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Oct 11 00:54:10 crc kubenswrapper[4743]: I1011 00:54:10.259681 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Oct 11 00:54:10 crc kubenswrapper[4743]: I1011 00:54:10.279934 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Oct 11 00:54:10 crc kubenswrapper[4743]: I1011 00:54:10.300479 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Oct 11 00:54:10 crc kubenswrapper[4743]: I1011 00:54:10.319503 4743 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Oct 11 00:54:10 crc kubenswrapper[4743]: I1011 00:54:10.338981 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Oct 11 00:54:10 crc kubenswrapper[4743]: I1011 00:54:10.360285 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Oct 11 00:54:10 crc kubenswrapper[4743]: I1011 00:54:10.379727 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Oct 11 00:54:10 crc kubenswrapper[4743]: I1011 00:54:10.399266 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Oct 11 00:54:10 crc kubenswrapper[4743]: I1011 00:54:10.420027 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Oct 11 00:54:10 crc kubenswrapper[4743]: I1011 00:54:10.440049 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Oct 11 00:54:10 crc kubenswrapper[4743]: I1011 00:54:10.464409 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Oct 11 00:54:10 crc kubenswrapper[4743]: I1011 00:54:10.480759 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Oct 11 00:54:10 crc kubenswrapper[4743]: I1011 00:54:10.499922 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Oct 11 00:54:10 crc kubenswrapper[4743]: I1011 00:54:10.519296 4743 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress"/"router-certs-default" Oct 11 00:54:10 crc kubenswrapper[4743]: I1011 00:54:10.539292 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Oct 11 00:54:10 crc kubenswrapper[4743]: I1011 00:54:10.559498 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Oct 11 00:54:10 crc kubenswrapper[4743]: I1011 00:54:10.578018 4743 request.go:700] Waited for 1.011440617s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/secrets?fieldSelector=metadata.name%3Dmco-proxy-tls&limit=500&resourceVersion=0 Oct 11 00:54:10 crc kubenswrapper[4743]: I1011 00:54:10.580350 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Oct 11 00:54:10 crc kubenswrapper[4743]: I1011 00:54:10.599737 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Oct 11 00:54:10 crc kubenswrapper[4743]: I1011 00:54:10.619900 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Oct 11 00:54:10 crc kubenswrapper[4743]: I1011 00:54:10.640236 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Oct 11 00:54:10 crc kubenswrapper[4743]: I1011 00:54:10.660077 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Oct 11 00:54:10 crc kubenswrapper[4743]: I1011 00:54:10.680884 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Oct 11 00:54:10 crc kubenswrapper[4743]: I1011 
00:54:10.699693 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Oct 11 00:54:10 crc kubenswrapper[4743]: I1011 00:54:10.719796 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Oct 11 00:54:10 crc kubenswrapper[4743]: I1011 00:54:10.739623 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Oct 11 00:54:10 crc kubenswrapper[4743]: I1011 00:54:10.760009 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Oct 11 00:54:10 crc kubenswrapper[4743]: I1011 00:54:10.780025 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Oct 11 00:54:10 crc kubenswrapper[4743]: I1011 00:54:10.798609 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Oct 11 00:54:10 crc kubenswrapper[4743]: I1011 00:54:10.819314 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Oct 11 00:54:10 crc kubenswrapper[4743]: I1011 00:54:10.840296 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Oct 11 00:54:10 crc kubenswrapper[4743]: I1011 00:54:10.859177 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Oct 11 00:54:10 crc kubenswrapper[4743]: I1011 00:54:10.899697 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Oct 11 00:54:10 crc kubenswrapper[4743]: I1011 00:54:10.920136 4743 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Oct 11 00:54:10 crc kubenswrapper[4743]: I1011 00:54:10.939596 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Oct 11 00:54:10 crc kubenswrapper[4743]: I1011 00:54:10.959755 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Oct 11 00:54:10 crc kubenswrapper[4743]: I1011 00:54:10.980139 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Oct 11 00:54:10 crc kubenswrapper[4743]: I1011 00:54:10.999678 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.020144 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.039189 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.059712 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.080180 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.100054 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.119781 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Oct 11 00:54:11 crc kubenswrapper[4743]: 
I1011 00:54:11.139645 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.159672 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.180354 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.199500 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.219525 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.250086 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.279526 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.298762 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.318788 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.366439 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bn665\" (UniqueName: \"kubernetes.io/projected/149214a5-565a-452f-abbb-1479919b6104-kube-api-access-bn665\") pod \"authentication-operator-69f744f599-rtt9m\" (UID: \"149214a5-565a-452f-abbb-1479919b6104\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-rtt9m" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.386487 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9x79\" (UniqueName: \"kubernetes.io/projected/79462f0e-13e0-4ee7-af5f-02e6e5cd849d-kube-api-access-l9x79\") pod \"machine-api-operator-5694c8668f-tljzf\" (UID: \"79462f0e-13e0-4ee7-af5f-02e6e5cd849d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tljzf" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.407385 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s59h4\" (UniqueName: \"kubernetes.io/projected/61413bd4-c42d-4336-92db-d443ed8ea1de-kube-api-access-s59h4\") pod \"openshift-controller-manager-operator-756b6f6bc6-lq56c\" (UID: \"61413bd4-c42d-4336-92db-d443ed8ea1de\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lq56c" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.427813 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rxkv\" (UniqueName: \"kubernetes.io/projected/b5776f9a-8455-4c34-8496-0b4c4e821135-kube-api-access-6rxkv\") pod \"console-f9d7485db-468m5\" (UID: \"b5776f9a-8455-4c34-8496-0b4c4e821135\") " pod="openshift-console/console-f9d7485db-468m5" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.437172 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lq56c" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.443094 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7p6h\" (UniqueName: \"kubernetes.io/projected/8693817b-7cf6-486d-a055-93c4c0308d95-kube-api-access-f7p6h\") pod \"controller-manager-879f6c89f-fv4x7\" (UID: \"8693817b-7cf6-486d-a055-93c4c0308d95\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fv4x7" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.462411 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxhz4\" (UniqueName: \"kubernetes.io/projected/a0dfb25d-8789-4754-83a0-0e3ee1888e3e-kube-api-access-nxhz4\") pod \"cluster-samples-operator-665b6dd947-lxzlb\" (UID: \"a0dfb25d-8789-4754-83a0-0e3ee1888e3e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-lxzlb" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.484891 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-tljzf" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.490148 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qrxh\" (UniqueName: \"kubernetes.io/projected/8b3d2036-8156-4d9f-9e11-2a8133ab5295-kube-api-access-6qrxh\") pod \"console-operator-58897d9998-wlvjw\" (UID: \"8b3d2036-8156-4d9f-9e11-2a8133ab5295\") " pod="openshift-console-operator/console-operator-58897d9998-wlvjw" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.510705 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44zhq\" (UniqueName: \"kubernetes.io/projected/121b64d6-750f-4720-8297-6f3a91dc3a3a-kube-api-access-44zhq\") pod \"etcd-operator-b45778765-229tw\" (UID: \"121b64d6-750f-4720-8297-6f3a91dc3a3a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-229tw" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.513208 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-rtt9m" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.528095 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbbx5\" (UniqueName: \"kubernetes.io/projected/0f26ac0d-8683-415e-850c-5aef3da4b59f-kube-api-access-fbbx5\") pod \"image-pruner-29335680-fmgl6\" (UID: \"0f26ac0d-8683-415e-850c-5aef3da4b59f\") " pod="openshift-image-registry/image-pruner-29335680-fmgl6" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.541174 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29335680-fmgl6" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.548353 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4v7x\" (UniqueName: \"kubernetes.io/projected/58a40e2a-9789-4d4c-8817-8aa7920baa39-kube-api-access-l4v7x\") pod \"apiserver-7bbb656c7d-cgd8f\" (UID: \"58a40e2a-9789-4d4c-8817-8aa7920baa39\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cgd8f" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.562358 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9wfk\" (UniqueName: \"kubernetes.io/projected/e8a051b9-029d-4b92-a9a1-380c8d18f051-kube-api-access-l9wfk\") pod \"downloads-7954f5f757-qksm9\" (UID: \"e8a051b9-029d-4b92-a9a1-380c8d18f051\") " pod="openshift-console/downloads-7954f5f757-qksm9" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.564189 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-lxzlb" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.591257 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r962j\" (UniqueName: \"kubernetes.io/projected/0c2c2460-fedf-4109-8fd9-986749f1e021-kube-api-access-r962j\") pod \"openshift-config-operator-7777fb866f-wswrq\" (UID: \"0c2c2460-fedf-4109-8fd9-986749f1e021\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wswrq" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.597954 4743 request.go:700] Waited for 1.938334845s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-cluster-machine-approver/serviceaccounts/machine-approver-sa/token Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.605681 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glbks\" (UniqueName: \"kubernetes.io/projected/3abfab8d-967c-42bf-9f48-bd69bbe6f8f2-kube-api-access-glbks\") pod \"apiserver-76f77b778f-xwxfx\" (UID: \"3abfab8d-967c-42bf-9f48-bd69bbe6f8f2\") " pod="openshift-apiserver/apiserver-76f77b778f-xwxfx" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.623237 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6qv9\" (UniqueName: \"kubernetes.io/projected/9b43d826-75ca-4c81-9f93-11b4398b96fa-kube-api-access-z6qv9\") pod \"machine-approver-56656f9798-p6l9k\" (UID: \"9b43d826-75ca-4c81-9f93-11b4398b96fa\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p6l9k" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.646387 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhlsf\" (UniqueName: \"kubernetes.io/projected/703e145f-c49a-40b7-b50f-a4902208d939-kube-api-access-xhlsf\") pod 
\"openshift-apiserver-operator-796bbdcf4f-kzxfv\" (UID: \"703e145f-c49a-40b7-b50f-a4902208d939\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kzxfv" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.648080 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-wlvjw" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.654650 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-xwxfx" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.659486 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.661309 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxpbw\" (UniqueName: \"kubernetes.io/projected/7f689d34-10fa-427c-8db0-cfc9324ae9de-kube-api-access-gxpbw\") pod \"dns-operator-744455d44c-9npzd\" (UID: \"7f689d34-10fa-427c-8db0-cfc9324ae9de\") " pod="openshift-dns-operator/dns-operator-744455d44c-9npzd" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.674619 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cgd8f" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.681203 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.693889 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-468m5" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.704727 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.707919 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-fv4x7" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.719650 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.721767 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-229tw" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.743348 4743 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.760344 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.779457 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.789474 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kzxfv" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.801174 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.801986 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-qksm9" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.807811 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wswrq" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.819902 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-9npzd" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.820063 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.841500 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.876717 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6l7ks\" (UniqueName: \"kubernetes.io/projected/e59e36c1-5053-44bc-bb35-cab447a646fb-kube-api-access-6l7ks\") pod \"cluster-image-registry-operator-dc59b4c8b-gd5tn\" (UID: \"e59e36c1-5053-44bc-bb35-cab447a646fb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gd5tn" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.888108 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p6l9k" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.896884 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bd684c3f-8689-4d7b-ab5e-ff1c14ab9747-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-9w57f\" (UID: \"bd684c3f-8689-4d7b-ab5e-ff1c14ab9747\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9w57f" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.914023 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-lxzlb"] Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.917169 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e59e36c1-5053-44bc-bb35-cab447a646fb-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-gd5tn\" (UID: \"e59e36c1-5053-44bc-bb35-cab447a646fb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gd5tn" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.956195 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p6l9k" event={"ID":"9b43d826-75ca-4c81-9f93-11b4398b96fa","Type":"ContainerStarted","Data":"a984bb7bb62d3bdef67518d5bca0abd504c226d17cd023c2eb62fb5395bc0787"} Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.971337 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lq56c"] Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.971822 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lv7cl\" (UniqueName: 
\"kubernetes.io/projected/122090c8-fe9b-404f-a9f3-a2ee9520091a-kube-api-access-lv7cl\") pod \"package-server-manager-789f6589d5-62mz2\" (UID: \"122090c8-fe9b-404f-a9f3-a2ee9520091a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-62mz2" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.971870 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-24m6m\" (UID: \"9d6b106d-6589-40f2-b694-eb158c541d82\") " pod="openshift-image-registry/image-registry-697d97f7c8-24m6m" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.971889 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9d6b106d-6589-40f2-b694-eb158c541d82-installation-pull-secrets\") pod \"image-registry-697d97f7c8-24m6m\" (UID: \"9d6b106d-6589-40f2-b694-eb158c541d82\") " pod="openshift-image-registry/image-registry-697d97f7c8-24m6m" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.971904 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6e2550ee-2c0c-437b-a4e3-3332ffba4e48-proxy-tls\") pod \"machine-config-operator-74547568cd-bc6rx\" (UID: \"6e2550ee-2c0c-437b-a4e3-3332ffba4e48\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bc6rx" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.972106 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1c78dcdb-20e7-4e37-95ba-c3576a63b2cf-profile-collector-cert\") pod \"olm-operator-6b444d44fb-s8jrj\" (UID: \"1c78dcdb-20e7-4e37-95ba-c3576a63b2cf\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-s8jrj" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.972139 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3f8744f8-b334-4487-a52b-c6b6f226401d-trusted-ca\") pod \"ingress-operator-5b745b69d9-85wtv\" (UID: \"3f8744f8-b334-4487-a52b-c6b6f226401d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-85wtv" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.972174 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdxls\" (UniqueName: \"kubernetes.io/projected/7ffea29d-93cb-4136-b0bc-7861c20751d4-kube-api-access-qdxls\") pod \"multus-admission-controller-857f4d67dd-wm7rl\" (UID: \"7ffea29d-93cb-4136-b0bc-7861c20751d4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-wm7rl" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.972199 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7ffea29d-93cb-4136-b0bc-7861c20751d4-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-wm7rl\" (UID: \"7ffea29d-93cb-4136-b0bc-7861c20751d4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-wm7rl" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.972216 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bc629a2-3424-4180-a506-1accee2c0246-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-bt5pp\" (UID: \"5bc629a2-3424-4180-a506-1accee2c0246\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-bt5pp" Oct 11 00:54:11 crc kubenswrapper[4743]: E1011 00:54:11.972253 4743 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-11 00:54:12.472237349 +0000 UTC m=+147.125217746 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-24m6m" (UID: "9d6b106d-6589-40f2-b694-eb158c541d82") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.972343 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mv5d7\" (UniqueName: \"kubernetes.io/projected/f3263799-b7d3-4b1b-a61b-4768a061e502-kube-api-access-mv5d7\") pod \"machine-config-controller-84d6567774-v98q2\" (UID: \"f3263799-b7d3-4b1b-a61b-4768a061e502\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-v98q2" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.972414 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/284b66a9-09b0-4bc0-bc8a-5bd32f06d088-audit-dir\") pod \"oauth-openshift-558db77b4-sjtsw\" (UID: \"284b66a9-09b0-4bc0-bc8a-5bd32f06d088\") " pod="openshift-authentication/oauth-openshift-558db77b4-sjtsw" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.972466 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f3263799-b7d3-4b1b-a61b-4768a061e502-proxy-tls\") pod \"machine-config-controller-84d6567774-v98q2\" (UID: \"f3263799-b7d3-4b1b-a61b-4768a061e502\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-v98q2" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.972486 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/284b66a9-09b0-4bc0-bc8a-5bd32f06d088-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-sjtsw\" (UID: \"284b66a9-09b0-4bc0-bc8a-5bd32f06d088\") " pod="openshift-authentication/oauth-openshift-558db77b4-sjtsw" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.972590 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9d6b106d-6589-40f2-b694-eb158c541d82-registry-certificates\") pod \"image-registry-697d97f7c8-24m6m\" (UID: \"9d6b106d-6589-40f2-b694-eb158c541d82\") " pod="openshift-image-registry/image-registry-697d97f7c8-24m6m" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.972606 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3f8744f8-b334-4487-a52b-c6b6f226401d-bound-sa-token\") pod \"ingress-operator-5b745b69d9-85wtv\" (UID: \"3f8744f8-b334-4487-a52b-c6b6f226401d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-85wtv" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.972648 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9d6b106d-6589-40f2-b694-eb158c541d82-ca-trust-extracted\") pod \"image-registry-697d97f7c8-24m6m\" (UID: \"9d6b106d-6589-40f2-b694-eb158c541d82\") " pod="openshift-image-registry/image-registry-697d97f7c8-24m6m" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.972707 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/284b66a9-09b0-4bc0-bc8a-5bd32f06d088-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-sjtsw\" (UID: \"284b66a9-09b0-4bc0-bc8a-5bd32f06d088\") " pod="openshift-authentication/oauth-openshift-558db77b4-sjtsw" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.972867 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5bc629a2-3424-4180-a506-1accee2c0246-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-bt5pp\" (UID: \"5bc629a2-3424-4180-a506-1accee2c0246\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-bt5pp" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.973362 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/006ce2c3-4b48-4621-acc0-50428bb1c862-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-rdwmd\" (UID: \"006ce2c3-4b48-4621-acc0-50428bb1c862\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rdwmd" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.973502 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/224b66ec-aec1-4d9e-96c0-b6c7ddeb37e7-signing-cabundle\") pod \"service-ca-9c57cc56f-65mgg\" (UID: \"224b66ec-aec1-4d9e-96c0-b6c7ddeb37e7\") " pod="openshift-service-ca/service-ca-9c57cc56f-65mgg" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.973535 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/5690cb54-1fbe-4d33-a809-b7bdca4df6c0-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-rt2nd\" (UID: \"5690cb54-1fbe-4d33-a809-b7bdca4df6c0\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rt2nd" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.973683 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pttq\" (UniqueName: \"kubernetes.io/projected/284b66a9-09b0-4bc0-bc8a-5bd32f06d088-kube-api-access-4pttq\") pod \"oauth-openshift-558db77b4-sjtsw\" (UID: \"284b66a9-09b0-4bc0-bc8a-5bd32f06d088\") " pod="openshift-authentication/oauth-openshift-558db77b4-sjtsw" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.973733 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1c78dcdb-20e7-4e37-95ba-c3576a63b2cf-srv-cert\") pod \"olm-operator-6b444d44fb-s8jrj\" (UID: \"1c78dcdb-20e7-4e37-95ba-c3576a63b2cf\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-s8jrj" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.973823 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/1ae9d5c5-6a97-483f-9b76-45c73259c85b-stats-auth\") pod \"router-default-5444994796-6srbj\" (UID: \"1ae9d5c5-6a97-483f-9b76-45c73259c85b\") " pod="openshift-ingress/router-default-5444994796-6srbj" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.974004 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/284b66a9-09b0-4bc0-bc8a-5bd32f06d088-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-sjtsw\" (UID: \"284b66a9-09b0-4bc0-bc8a-5bd32f06d088\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-sjtsw" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.974113 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d6b106d-6589-40f2-b694-eb158c541d82-trusted-ca\") pod \"image-registry-697d97f7c8-24m6m\" (UID: \"9d6b106d-6589-40f2-b694-eb158c541d82\") " pod="openshift-image-registry/image-registry-697d97f7c8-24m6m" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.974425 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6e2550ee-2c0c-437b-a4e3-3332ffba4e48-auth-proxy-config\") pod \"machine-config-operator-74547568cd-bc6rx\" (UID: \"6e2550ee-2c0c-437b-a4e3-3332ffba4e48\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bc6rx" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.974587 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/1ae9d5c5-6a97-483f-9b76-45c73259c85b-default-certificate\") pod \"router-default-5444994796-6srbj\" (UID: \"1ae9d5c5-6a97-483f-9b76-45c73259c85b\") " pod="openshift-ingress/router-default-5444994796-6srbj" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.974629 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f3263799-b7d3-4b1b-a61b-4768a061e502-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-v98q2\" (UID: \"f3263799-b7d3-4b1b-a61b-4768a061e502\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-v98q2" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.987997 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/284b66a9-09b0-4bc0-bc8a-5bd32f06d088-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-sjtsw\" (UID: \"284b66a9-09b0-4bc0-bc8a-5bd32f06d088\") " pod="openshift-authentication/oauth-openshift-558db77b4-sjtsw" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.988494 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1ae9d5c5-6a97-483f-9b76-45c73259c85b-metrics-certs\") pod \"router-default-5444994796-6srbj\" (UID: \"1ae9d5c5-6a97-483f-9b76-45c73259c85b\") " pod="openshift-ingress/router-default-5444994796-6srbj" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.988629 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/284b66a9-09b0-4bc0-bc8a-5bd32f06d088-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-sjtsw\" (UID: \"284b66a9-09b0-4bc0-bc8a-5bd32f06d088\") " pod="openshift-authentication/oauth-openshift-558db77b4-sjtsw" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.988710 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3f8744f8-b334-4487-a52b-c6b6f226401d-metrics-tls\") pod \"ingress-operator-5b745b69d9-85wtv\" (UID: \"3f8744f8-b334-4487-a52b-c6b6f226401d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-85wtv" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.988781 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/224b66ec-aec1-4d9e-96c0-b6c7ddeb37e7-signing-key\") pod \"service-ca-9c57cc56f-65mgg\" (UID: \"224b66ec-aec1-4d9e-96c0-b6c7ddeb37e7\") " 
pod="openshift-service-ca/service-ca-9c57cc56f-65mgg" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.988954 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68717faf-e06c-4833-9b59-3e6279b38a6d-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-f677q\" (UID: \"68717faf-e06c-4833-9b59-3e6279b38a6d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f677q" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.989035 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68717faf-e06c-4833-9b59-3e6279b38a6d-config\") pod \"kube-controller-manager-operator-78b949d7b-f677q\" (UID: \"68717faf-e06c-4833-9b59-3e6279b38a6d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f677q" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.991166 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgjjc\" (UniqueName: \"kubernetes.io/projected/eaad8a68-7587-4d59-9b94-e8dcdc1869f8-kube-api-access-pgjjc\") pod \"migrator-59844c95c7-z2mc9\" (UID: \"eaad8a68-7587-4d59-9b94-e8dcdc1869f8\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-z2mc9" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.991311 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6e2550ee-2c0c-437b-a4e3-3332ffba4e48-images\") pod \"machine-config-operator-74547568cd-bc6rx\" (UID: \"6e2550ee-2c0c-437b-a4e3-3332ffba4e48\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bc6rx" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.991444 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/122090c8-fe9b-404f-a9f3-a2ee9520091a-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-62mz2\" (UID: \"122090c8-fe9b-404f-a9f3-a2ee9520091a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-62mz2" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.991557 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/88ae6ca7-3631-4aee-820c-5c979a0d4f02-profile-collector-cert\") pod \"catalog-operator-68c6474976-cqmp6\" (UID: \"88ae6ca7-3631-4aee-820c-5c979a0d4f02\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cqmp6" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.991665 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmjtw\" (UniqueName: \"kubernetes.io/projected/006ce2c3-4b48-4621-acc0-50428bb1c862-kube-api-access-gmjtw\") pod \"kube-storage-version-migrator-operator-b67b599dd-rdwmd\" (UID: \"006ce2c3-4b48-4621-acc0-50428bb1c862\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rdwmd" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.992052 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bbkn\" (UniqueName: \"kubernetes.io/projected/1ae9d5c5-6a97-483f-9b76-45c73259c85b-kube-api-access-2bbkn\") pod \"router-default-5444994796-6srbj\" (UID: \"1ae9d5c5-6a97-483f-9b76-45c73259c85b\") " pod="openshift-ingress/router-default-5444994796-6srbj" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.992142 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/284b66a9-09b0-4bc0-bc8a-5bd32f06d088-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-sjtsw\" (UID: \"284b66a9-09b0-4bc0-bc8a-5bd32f06d088\") " pod="openshift-authentication/oauth-openshift-558db77b4-sjtsw" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.992245 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wk5j7\" (UniqueName: \"kubernetes.io/projected/88ae6ca7-3631-4aee-820c-5c979a0d4f02-kube-api-access-wk5j7\") pod \"catalog-operator-68c6474976-cqmp6\" (UID: \"88ae6ca7-3631-4aee-820c-5c979a0d4f02\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cqmp6" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.992353 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/284b66a9-09b0-4bc0-bc8a-5bd32f06d088-audit-policies\") pod \"oauth-openshift-558db77b4-sjtsw\" (UID: \"284b66a9-09b0-4bc0-bc8a-5bd32f06d088\") " pod="openshift-authentication/oauth-openshift-558db77b4-sjtsw" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.992441 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/284b66a9-09b0-4bc0-bc8a-5bd32f06d088-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-sjtsw\" (UID: \"284b66a9-09b0-4bc0-bc8a-5bd32f06d088\") " pod="openshift-authentication/oauth-openshift-558db77b4-sjtsw" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.992528 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2305c756-5c61-4d13-aac9-1ff5d3c6b2ad-config\") pod \"route-controller-manager-6576b87f9c-f2gjt\" (UID: 
\"2305c756-5c61-4d13-aac9-1ff5d3c6b2ad\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-f2gjt" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.992600 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ae9d5c5-6a97-483f-9b76-45c73259c85b-service-ca-bundle\") pod \"router-default-5444994796-6srbj\" (UID: \"1ae9d5c5-6a97-483f-9b76-45c73259c85b\") " pod="openshift-ingress/router-default-5444994796-6srbj" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.992676 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/284b66a9-09b0-4bc0-bc8a-5bd32f06d088-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-sjtsw\" (UID: \"284b66a9-09b0-4bc0-bc8a-5bd32f06d088\") " pod="openshift-authentication/oauth-openshift-558db77b4-sjtsw" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.992823 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swbdz\" (UniqueName: \"kubernetes.io/projected/9d6b106d-6589-40f2-b694-eb158c541d82-kube-api-access-swbdz\") pod \"image-registry-697d97f7c8-24m6m\" (UID: \"9d6b106d-6589-40f2-b694-eb158c541d82\") " pod="openshift-image-registry/image-registry-697d97f7c8-24m6m" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.994456 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5bc629a2-3424-4180-a506-1accee2c0246-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-bt5pp\" (UID: \"5bc629a2-3424-4180-a506-1accee2c0246\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-bt5pp" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.994565 4743 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/88ae6ca7-3631-4aee-820c-5c979a0d4f02-srv-cert\") pod \"catalog-operator-68c6474976-cqmp6\" (UID: \"88ae6ca7-3631-4aee-820c-5c979a0d4f02\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cqmp6" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.994721 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/284b66a9-09b0-4bc0-bc8a-5bd32f06d088-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-sjtsw\" (UID: \"284b66a9-09b0-4bc0-bc8a-5bd32f06d088\") " pod="openshift-authentication/oauth-openshift-558db77b4-sjtsw" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.994925 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjmcn\" (UniqueName: \"kubernetes.io/projected/2305c756-5c61-4d13-aac9-1ff5d3c6b2ad-kube-api-access-hjmcn\") pod \"route-controller-manager-6576b87f9c-f2gjt\" (UID: \"2305c756-5c61-4d13-aac9-1ff5d3c6b2ad\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-f2gjt" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.995040 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/68717faf-e06c-4833-9b59-3e6279b38a6d-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-f677q\" (UID: \"68717faf-e06c-4833-9b59-3e6279b38a6d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f677q" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.995182 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z52pv\" (UniqueName: 
\"kubernetes.io/projected/3f8744f8-b334-4487-a52b-c6b6f226401d-kube-api-access-z52pv\") pod \"ingress-operator-5b745b69d9-85wtv\" (UID: \"3f8744f8-b334-4487-a52b-c6b6f226401d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-85wtv" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.995278 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgpg8\" (UniqueName: \"kubernetes.io/projected/224b66ec-aec1-4d9e-96c0-b6c7ddeb37e7-kube-api-access-hgpg8\") pod \"service-ca-9c57cc56f-65mgg\" (UID: \"224b66ec-aec1-4d9e-96c0-b6c7ddeb37e7\") " pod="openshift-service-ca/service-ca-9c57cc56f-65mgg" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.995445 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9d6b106d-6589-40f2-b694-eb158c541d82-registry-tls\") pod \"image-registry-697d97f7c8-24m6m\" (UID: \"9d6b106d-6589-40f2-b694-eb158c541d82\") " pod="openshift-image-registry/image-registry-697d97f7c8-24m6m" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.995579 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mfxs\" (UniqueName: \"kubernetes.io/projected/5690cb54-1fbe-4d33-a809-b7bdca4df6c0-kube-api-access-2mfxs\") pod \"control-plane-machine-set-operator-78cbb6b69f-rt2nd\" (UID: \"5690cb54-1fbe-4d33-a809-b7bdca4df6c0\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rt2nd" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.995655 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2305c756-5c61-4d13-aac9-1ff5d3c6b2ad-client-ca\") pod \"route-controller-manager-6576b87f9c-f2gjt\" (UID: \"2305c756-5c61-4d13-aac9-1ff5d3c6b2ad\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-f2gjt" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.995733 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnbhz\" (UniqueName: \"kubernetes.io/projected/1c78dcdb-20e7-4e37-95ba-c3576a63b2cf-kube-api-access-hnbhz\") pod \"olm-operator-6b444d44fb-s8jrj\" (UID: \"1c78dcdb-20e7-4e37-95ba-c3576a63b2cf\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-s8jrj" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.995885 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9d6b106d-6589-40f2-b694-eb158c541d82-bound-sa-token\") pod \"image-registry-697d97f7c8-24m6m\" (UID: \"9d6b106d-6589-40f2-b694-eb158c541d82\") " pod="openshift-image-registry/image-registry-697d97f7c8-24m6m" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.995974 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kth6j\" (UniqueName: \"kubernetes.io/projected/6e2550ee-2c0c-437b-a4e3-3332ffba4e48-kube-api-access-kth6j\") pod \"machine-config-operator-74547568cd-bc6rx\" (UID: \"6e2550ee-2c0c-437b-a4e3-3332ffba4e48\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bc6rx" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.996351 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/006ce2c3-4b48-4621-acc0-50428bb1c862-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-rdwmd\" (UID: \"006ce2c3-4b48-4621-acc0-50428bb1c862\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rdwmd" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.996458 4743 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2305c756-5c61-4d13-aac9-1ff5d3c6b2ad-serving-cert\") pod \"route-controller-manager-6576b87f9c-f2gjt\" (UID: \"2305c756-5c61-4d13-aac9-1ff5d3c6b2ad\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-f2gjt" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.996571 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/284b66a9-09b0-4bc0-bc8a-5bd32f06d088-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-sjtsw\" (UID: \"284b66a9-09b0-4bc0-bc8a-5bd32f06d088\") " pod="openshift-authentication/oauth-openshift-558db77b4-sjtsw" Oct 11 00:54:11 crc kubenswrapper[4743]: I1011 00:54:11.996669 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/284b66a9-09b0-4bc0-bc8a-5bd32f06d088-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-sjtsw\" (UID: \"284b66a9-09b0-4bc0-bc8a-5bd32f06d088\") " pod="openshift-authentication/oauth-openshift-558db77b4-sjtsw" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.017536 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-tljzf"] Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.022345 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-xwxfx"] Oct 11 00:54:12 crc kubenswrapper[4743]: W1011 00:54:12.028416 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3abfab8d_967c_42bf_9f48_bd69bbe6f8f2.slice/crio-a7c214e50c8cb4fca7352d646cc7defe43ba54bb92a6a9fc4df867521e371270 WatchSource:0}: 
Error finding container a7c214e50c8cb4fca7352d646cc7defe43ba54bb92a6a9fc4df867521e371270: Status 404 returned error can't find the container with id a7c214e50c8cb4fca7352d646cc7defe43ba54bb92a6a9fc4df867521e371270 Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.033807 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29335680-fmgl6"] Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.042520 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-rtt9m"] Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.098278 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.098391 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/544ad287-fc9b-4033-b139-3893a3e10a00-apiservice-cert\") pod \"packageserver-d55dfcdfc-ccjcp\" (UID: \"544ad287-fc9b-4033-b139-3893a3e10a00\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ccjcp" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.098416 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3f8744f8-b334-4487-a52b-c6b6f226401d-trusted-ca\") pod \"ingress-operator-5b745b69d9-85wtv\" (UID: \"3f8744f8-b334-4487-a52b-c6b6f226401d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-85wtv" Oct 11 00:54:12 crc kubenswrapper[4743]: E1011 00:54:12.098448 4743 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 00:54:12.598419478 +0000 UTC m=+147.251399875 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.098485 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1c78dcdb-20e7-4e37-95ba-c3576a63b2cf-profile-collector-cert\") pod \"olm-operator-6b444d44fb-s8jrj\" (UID: \"1c78dcdb-20e7-4e37-95ba-c3576a63b2cf\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-s8jrj" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.098521 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/962b1e46-ba63-4aa2-9882-a04487a05813-config-volume\") pod \"collect-profiles-29335725-pg67v\" (UID: \"962b1e46-ba63-4aa2-9882-a04487a05813\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335725-pg67v" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.098546 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdxls\" (UniqueName: \"kubernetes.io/projected/7ffea29d-93cb-4136-b0bc-7861c20751d4-kube-api-access-qdxls\") pod \"multus-admission-controller-857f4d67dd-wm7rl\" (UID: \"7ffea29d-93cb-4136-b0bc-7861c20751d4\") " 
pod="openshift-multus/multus-admission-controller-857f4d67dd-wm7rl" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.098563 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtqv5\" (UniqueName: \"kubernetes.io/projected/544ad287-fc9b-4033-b139-3893a3e10a00-kube-api-access-jtqv5\") pod \"packageserver-d55dfcdfc-ccjcp\" (UID: \"544ad287-fc9b-4033-b139-3893a3e10a00\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ccjcp" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.098580 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7ffea29d-93cb-4136-b0bc-7861c20751d4-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-wm7rl\" (UID: \"7ffea29d-93cb-4136-b0bc-7861c20751d4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-wm7rl" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.098601 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bc629a2-3424-4180-a506-1accee2c0246-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-bt5pp\" (UID: \"5bc629a2-3424-4180-a506-1accee2c0246\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-bt5pp" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.098618 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mv5d7\" (UniqueName: \"kubernetes.io/projected/f3263799-b7d3-4b1b-a61b-4768a061e502-kube-api-access-mv5d7\") pod \"machine-config-controller-84d6567774-v98q2\" (UID: \"f3263799-b7d3-4b1b-a61b-4768a061e502\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-v98q2" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.098634 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" 
(UniqueName: \"kubernetes.io/host-path/284b66a9-09b0-4bc0-bc8a-5bd32f06d088-audit-dir\") pod \"oauth-openshift-558db77b4-sjtsw\" (UID: \"284b66a9-09b0-4bc0-bc8a-5bd32f06d088\") " pod="openshift-authentication/oauth-openshift-558db77b4-sjtsw" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.098655 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f3263799-b7d3-4b1b-a61b-4768a061e502-proxy-tls\") pod \"machine-config-controller-84d6567774-v98q2\" (UID: \"f3263799-b7d3-4b1b-a61b-4768a061e502\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-v98q2" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.098671 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/284b66a9-09b0-4bc0-bc8a-5bd32f06d088-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-sjtsw\" (UID: \"284b66a9-09b0-4bc0-bc8a-5bd32f06d088\") " pod="openshift-authentication/oauth-openshift-558db77b4-sjtsw" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.098689 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/37c86724-d259-449c-baf5-b8965ea658a9-node-bootstrap-token\") pod \"machine-config-server-6zk27\" (UID: \"37c86724-d259-449c-baf5-b8965ea658a9\") " pod="openshift-machine-config-operator/machine-config-server-6zk27" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.098716 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9d6b106d-6589-40f2-b694-eb158c541d82-registry-certificates\") pod \"image-registry-697d97f7c8-24m6m\" (UID: \"9d6b106d-6589-40f2-b694-eb158c541d82\") " pod="openshift-image-registry/image-registry-697d97f7c8-24m6m" 
Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.098732 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3f8744f8-b334-4487-a52b-c6b6f226401d-bound-sa-token\") pod \"ingress-operator-5b745b69d9-85wtv\" (UID: \"3f8744f8-b334-4487-a52b-c6b6f226401d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-85wtv" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.098747 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/4a97f1cb-48e1-4049-be4a-0151201d5cc3-mountpoint-dir\") pod \"csi-hostpathplugin-grqvq\" (UID: \"4a97f1cb-48e1-4049-be4a-0151201d5cc3\") " pod="hostpath-provisioner/csi-hostpathplugin-grqvq" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.098766 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/4a97f1cb-48e1-4049-be4a-0151201d5cc3-csi-data-dir\") pod \"csi-hostpathplugin-grqvq\" (UID: \"4a97f1cb-48e1-4049-be4a-0151201d5cc3\") " pod="hostpath-provisioner/csi-hostpathplugin-grqvq" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.098792 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9d6b106d-6589-40f2-b694-eb158c541d82-ca-trust-extracted\") pod \"image-registry-697d97f7c8-24m6m\" (UID: \"9d6b106d-6589-40f2-b694-eb158c541d82\") " pod="openshift-image-registry/image-registry-697d97f7c8-24m6m" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.098809 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/284b66a9-09b0-4bc0-bc8a-5bd32f06d088-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-sjtsw\" (UID: 
\"284b66a9-09b0-4bc0-bc8a-5bd32f06d088\") " pod="openshift-authentication/oauth-openshift-558db77b4-sjtsw" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.098831 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5bc629a2-3424-4180-a506-1accee2c0246-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-bt5pp\" (UID: \"5bc629a2-3424-4180-a506-1accee2c0246\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-bt5pp" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.098847 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/962b1e46-ba63-4aa2-9882-a04487a05813-secret-volume\") pod \"collect-profiles-29335725-pg67v\" (UID: \"962b1e46-ba63-4aa2-9882-a04487a05813\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335725-pg67v" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.098885 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/16bef631-ee0f-4346-bb9b-c6eb48a09448-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xgtfn\" (UID: \"16bef631-ee0f-4346-bb9b-c6eb48a09448\") " pod="openshift-marketplace/marketplace-operator-79b997595-xgtfn" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.098903 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/4a97f1cb-48e1-4049-be4a-0151201d5cc3-socket-dir\") pod \"csi-hostpathplugin-grqvq\" (UID: \"4a97f1cb-48e1-4049-be4a-0151201d5cc3\") " pod="hostpath-provisioner/csi-hostpathplugin-grqvq" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.098921 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mvml\" (UniqueName: \"kubernetes.io/projected/b489193c-9eab-4025-8c8f-23b3b42bae0e-kube-api-access-9mvml\") pod \"dns-default-smtjk\" (UID: \"b489193c-9eab-4025-8c8f-23b3b42bae0e\") " pod="openshift-dns/dns-default-smtjk" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.098943 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/006ce2c3-4b48-4621-acc0-50428bb1c862-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-rdwmd\" (UID: \"006ce2c3-4b48-4621-acc0-50428bb1c862\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rdwmd" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.098980 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/5690cb54-1fbe-4d33-a809-b7bdca4df6c0-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-rt2nd\" (UID: \"5690cb54-1fbe-4d33-a809-b7bdca4df6c0\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rt2nd" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.098996 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/224b66ec-aec1-4d9e-96c0-b6c7ddeb37e7-signing-cabundle\") pod \"service-ca-9c57cc56f-65mgg\" (UID: \"224b66ec-aec1-4d9e-96c0-b6c7ddeb37e7\") " pod="openshift-service-ca/service-ca-9c57cc56f-65mgg" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.099013 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pttq\" (UniqueName: \"kubernetes.io/projected/284b66a9-09b0-4bc0-bc8a-5bd32f06d088-kube-api-access-4pttq\") pod \"oauth-openshift-558db77b4-sjtsw\" (UID: 
\"284b66a9-09b0-4bc0-bc8a-5bd32f06d088\") " pod="openshift-authentication/oauth-openshift-558db77b4-sjtsw" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.099030 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1c78dcdb-20e7-4e37-95ba-c3576a63b2cf-srv-cert\") pod \"olm-operator-6b444d44fb-s8jrj\" (UID: \"1c78dcdb-20e7-4e37-95ba-c3576a63b2cf\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-s8jrj" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.099049 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/1ae9d5c5-6a97-483f-9b76-45c73259c85b-stats-auth\") pod \"router-default-5444994796-6srbj\" (UID: \"1ae9d5c5-6a97-483f-9b76-45c73259c85b\") " pod="openshift-ingress/router-default-5444994796-6srbj" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.099064 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/284b66a9-09b0-4bc0-bc8a-5bd32f06d088-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-sjtsw\" (UID: \"284b66a9-09b0-4bc0-bc8a-5bd32f06d088\") " pod="openshift-authentication/oauth-openshift-558db77b4-sjtsw" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.099163 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d6b106d-6589-40f2-b694-eb158c541d82-trusted-ca\") pod \"image-registry-697d97f7c8-24m6m\" (UID: \"9d6b106d-6589-40f2-b694-eb158c541d82\") " pod="openshift-image-registry/image-registry-697d97f7c8-24m6m" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.099211 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/6e2550ee-2c0c-437b-a4e3-3332ffba4e48-auth-proxy-config\") pod \"machine-config-operator-74547568cd-bc6rx\" (UID: \"6e2550ee-2c0c-437b-a4e3-3332ffba4e48\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bc6rx" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.099419 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3f8744f8-b334-4487-a52b-c6b6f226401d-trusted-ca\") pod \"ingress-operator-5b745b69d9-85wtv\" (UID: \"3f8744f8-b334-4487-a52b-c6b6f226401d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-85wtv" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.099978 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6e2550ee-2c0c-437b-a4e3-3332ffba4e48-auth-proxy-config\") pod \"machine-config-operator-74547568cd-bc6rx\" (UID: \"6e2550ee-2c0c-437b-a4e3-3332ffba4e48\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bc6rx" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.100026 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b489193c-9eab-4025-8c8f-23b3b42bae0e-metrics-tls\") pod \"dns-default-smtjk\" (UID: \"b489193c-9eab-4025-8c8f-23b3b42bae0e\") " pod="openshift-dns/dns-default-smtjk" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.100087 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/1ae9d5c5-6a97-483f-9b76-45c73259c85b-default-certificate\") pod \"router-default-5444994796-6srbj\" (UID: \"1ae9d5c5-6a97-483f-9b76-45c73259c85b\") " pod="openshift-ingress/router-default-5444994796-6srbj" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.100249 4743 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/224b66ec-aec1-4d9e-96c0-b6c7ddeb37e7-signing-cabundle\") pod \"service-ca-9c57cc56f-65mgg\" (UID: \"224b66ec-aec1-4d9e-96c0-b6c7ddeb37e7\") " pod="openshift-service-ca/service-ca-9c57cc56f-65mgg" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.100358 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/284b66a9-09b0-4bc0-bc8a-5bd32f06d088-audit-dir\") pod \"oauth-openshift-558db77b4-sjtsw\" (UID: \"284b66a9-09b0-4bc0-bc8a-5bd32f06d088\") " pod="openshift-authentication/oauth-openshift-558db77b4-sjtsw" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.100398 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9d6b106d-6589-40f2-b694-eb158c541d82-ca-trust-extracted\") pod \"image-registry-697d97f7c8-24m6m\" (UID: \"9d6b106d-6589-40f2-b694-eb158c541d82\") " pod="openshift-image-registry/image-registry-697d97f7c8-24m6m" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.102553 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d6b106d-6589-40f2-b694-eb158c541d82-trusted-ca\") pod \"image-registry-697d97f7c8-24m6m\" (UID: \"9d6b106d-6589-40f2-b694-eb158c541d82\") " pod="openshift-image-registry/image-registry-697d97f7c8-24m6m" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.102733 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlwdg\" (UniqueName: \"kubernetes.io/projected/37c86724-d259-449c-baf5-b8965ea658a9-kube-api-access-qlwdg\") pod \"machine-config-server-6zk27\" (UID: \"37c86724-d259-449c-baf5-b8965ea658a9\") " pod="openshift-machine-config-operator/machine-config-server-6zk27" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.102794 
4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f3263799-b7d3-4b1b-a61b-4768a061e502-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-v98q2\" (UID: \"f3263799-b7d3-4b1b-a61b-4768a061e502\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-v98q2" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.102909 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/284b66a9-09b0-4bc0-bc8a-5bd32f06d088-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-sjtsw\" (UID: \"284b66a9-09b0-4bc0-bc8a-5bd32f06d088\") " pod="openshift-authentication/oauth-openshift-558db77b4-sjtsw" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.102932 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lk7df\" (UniqueName: \"kubernetes.io/projected/4a97f1cb-48e1-4049-be4a-0151201d5cc3-kube-api-access-lk7df\") pod \"csi-hostpathplugin-grqvq\" (UID: \"4a97f1cb-48e1-4049-be4a-0151201d5cc3\") " pod="hostpath-provisioner/csi-hostpathplugin-grqvq" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.104142 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f3263799-b7d3-4b1b-a61b-4768a061e502-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-v98q2\" (UID: \"f3263799-b7d3-4b1b-a61b-4768a061e502\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-v98q2" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.104571 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9d6b106d-6589-40f2-b694-eb158c541d82-registry-certificates\") pod 
\"image-registry-697d97f7c8-24m6m\" (UID: \"9d6b106d-6589-40f2-b694-eb158c541d82\") " pod="openshift-image-registry/image-registry-697d97f7c8-24m6m" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.104719 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1ae9d5c5-6a97-483f-9b76-45c73259c85b-metrics-certs\") pod \"router-default-5444994796-6srbj\" (UID: \"1ae9d5c5-6a97-483f-9b76-45c73259c85b\") " pod="openshift-ingress/router-default-5444994796-6srbj" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.106524 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8msh\" (UniqueName: \"kubernetes.io/projected/962b1e46-ba63-4aa2-9882-a04487a05813-kube-api-access-n8msh\") pod \"collect-profiles-29335725-pg67v\" (UID: \"962b1e46-ba63-4aa2-9882-a04487a05813\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335725-pg67v" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.106653 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/284b66a9-09b0-4bc0-bc8a-5bd32f06d088-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-sjtsw\" (UID: \"284b66a9-09b0-4bc0-bc8a-5bd32f06d088\") " pod="openshift-authentication/oauth-openshift-558db77b4-sjtsw" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.106749 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/544ad287-fc9b-4033-b139-3893a3e10a00-webhook-cert\") pod \"packageserver-d55dfcdfc-ccjcp\" (UID: \"544ad287-fc9b-4033-b139-3893a3e10a00\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ccjcp" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.106842 4743 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3f8744f8-b334-4487-a52b-c6b6f226401d-metrics-tls\") pod \"ingress-operator-5b745b69d9-85wtv\" (UID: \"3f8744f8-b334-4487-a52b-c6b6f226401d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-85wtv" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.107033 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/224b66ec-aec1-4d9e-96c0-b6c7ddeb37e7-signing-key\") pod \"service-ca-9c57cc56f-65mgg\" (UID: \"224b66ec-aec1-4d9e-96c0-b6c7ddeb37e7\") " pod="openshift-service-ca/service-ca-9c57cc56f-65mgg" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.107132 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68717faf-e06c-4833-9b59-3e6279b38a6d-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-f677q\" (UID: \"68717faf-e06c-4833-9b59-3e6279b38a6d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f677q" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.107222 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68717faf-e06c-4833-9b59-3e6279b38a6d-config\") pod \"kube-controller-manager-operator-78b949d7b-f677q\" (UID: \"68717faf-e06c-4833-9b59-3e6279b38a6d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f677q" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.107311 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgjjc\" (UniqueName: \"kubernetes.io/projected/eaad8a68-7587-4d59-9b94-e8dcdc1869f8-kube-api-access-pgjjc\") pod \"migrator-59844c95c7-z2mc9\" (UID: \"eaad8a68-7587-4d59-9b94-e8dcdc1869f8\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-z2mc9" 
Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.107400 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6e2550ee-2c0c-437b-a4e3-3332ffba4e48-images\") pod \"machine-config-operator-74547568cd-bc6rx\" (UID: \"6e2550ee-2c0c-437b-a4e3-3332ffba4e48\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bc6rx" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.107490 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/122090c8-fe9b-404f-a9f3-a2ee9520091a-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-62mz2\" (UID: \"122090c8-fe9b-404f-a9f3-a2ee9520091a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-62mz2" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.107583 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/88ae6ca7-3631-4aee-820c-5c979a0d4f02-profile-collector-cert\") pod \"catalog-operator-68c6474976-cqmp6\" (UID: \"88ae6ca7-3631-4aee-820c-5c979a0d4f02\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cqmp6" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.107676 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmjtw\" (UniqueName: \"kubernetes.io/projected/006ce2c3-4b48-4621-acc0-50428bb1c862-kube-api-access-gmjtw\") pod \"kube-storage-version-migrator-operator-b67b599dd-rdwmd\" (UID: \"006ce2c3-4b48-4621-acc0-50428bb1c862\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rdwmd" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.107766 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-2bbkn\" (UniqueName: \"kubernetes.io/projected/1ae9d5c5-6a97-483f-9b76-45c73259c85b-kube-api-access-2bbkn\") pod \"router-default-5444994796-6srbj\" (UID: \"1ae9d5c5-6a97-483f-9b76-45c73259c85b\") " pod="openshift-ingress/router-default-5444994796-6srbj" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.107867 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/284b66a9-09b0-4bc0-bc8a-5bd32f06d088-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-sjtsw\" (UID: \"284b66a9-09b0-4bc0-bc8a-5bd32f06d088\") " pod="openshift-authentication/oauth-openshift-558db77b4-sjtsw" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.107946 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/4a97f1cb-48e1-4049-be4a-0151201d5cc3-plugins-dir\") pod \"csi-hostpathplugin-grqvq\" (UID: \"4a97f1cb-48e1-4049-be4a-0151201d5cc3\") " pod="hostpath-provisioner/csi-hostpathplugin-grqvq" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.108057 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wk5j7\" (UniqueName: \"kubernetes.io/projected/88ae6ca7-3631-4aee-820c-5c979a0d4f02-kube-api-access-wk5j7\") pod \"catalog-operator-68c6474976-cqmp6\" (UID: \"88ae6ca7-3631-4aee-820c-5c979a0d4f02\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cqmp6" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.108163 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/284b66a9-09b0-4bc0-bc8a-5bd32f06d088-audit-policies\") pod \"oauth-openshift-558db77b4-sjtsw\" (UID: \"284b66a9-09b0-4bc0-bc8a-5bd32f06d088\") " pod="openshift-authentication/oauth-openshift-558db77b4-sjtsw" Oct 11 
00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.108256 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/37c86724-d259-449c-baf5-b8965ea658a9-certs\") pod \"machine-config-server-6zk27\" (UID: \"37c86724-d259-449c-baf5-b8965ea658a9\") " pod="openshift-machine-config-operator/machine-config-server-6zk27" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.108342 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0073037-950f-40df-bcb6-d9fd0aceccb2-config\") pod \"service-ca-operator-777779d784-4k997\" (UID: \"e0073037-950f-40df-bcb6-d9fd0aceccb2\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4k997" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.108436 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/284b66a9-09b0-4bc0-bc8a-5bd32f06d088-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-sjtsw\" (UID: \"284b66a9-09b0-4bc0-bc8a-5bd32f06d088\") " pod="openshift-authentication/oauth-openshift-558db77b4-sjtsw" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.108548 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2305c756-5c61-4d13-aac9-1ff5d3c6b2ad-config\") pod \"route-controller-manager-6576b87f9c-f2gjt\" (UID: \"2305c756-5c61-4d13-aac9-1ff5d3c6b2ad\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-f2gjt" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.108658 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/284b66a9-09b0-4bc0-bc8a-5bd32f06d088-v4-0-config-system-serving-cert\") pod 
\"oauth-openshift-558db77b4-sjtsw\" (UID: \"284b66a9-09b0-4bc0-bc8a-5bd32f06d088\") " pod="openshift-authentication/oauth-openshift-558db77b4-sjtsw" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.108748 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/544ad287-fc9b-4033-b139-3893a3e10a00-tmpfs\") pod \"packageserver-d55dfcdfc-ccjcp\" (UID: \"544ad287-fc9b-4033-b139-3893a3e10a00\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ccjcp" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.108848 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ae9d5c5-6a97-483f-9b76-45c73259c85b-service-ca-bundle\") pod \"router-default-5444994796-6srbj\" (UID: \"1ae9d5c5-6a97-483f-9b76-45c73259c85b\") " pod="openshift-ingress/router-default-5444994796-6srbj" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.109037 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/4a97f1cb-48e1-4049-be4a-0151201d5cc3-registration-dir\") pod \"csi-hostpathplugin-grqvq\" (UID: \"4a97f1cb-48e1-4049-be4a-0151201d5cc3\") " pod="hostpath-provisioner/csi-hostpathplugin-grqvq" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.109132 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swbdz\" (UniqueName: \"kubernetes.io/projected/9d6b106d-6589-40f2-b694-eb158c541d82-kube-api-access-swbdz\") pod \"image-registry-697d97f7c8-24m6m\" (UID: \"9d6b106d-6589-40f2-b694-eb158c541d82\") " pod="openshift-image-registry/image-registry-697d97f7c8-24m6m" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.109205 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" 
(UniqueName: \"kubernetes.io/secret/284b66a9-09b0-4bc0-bc8a-5bd32f06d088-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-sjtsw\" (UID: \"284b66a9-09b0-4bc0-bc8a-5bd32f06d088\") " pod="openshift-authentication/oauth-openshift-558db77b4-sjtsw" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.107357 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/284b66a9-09b0-4bc0-bc8a-5bd32f06d088-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-sjtsw\" (UID: \"284b66a9-09b0-4bc0-bc8a-5bd32f06d088\") " pod="openshift-authentication/oauth-openshift-558db77b4-sjtsw" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.109218 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5bc629a2-3424-4180-a506-1accee2c0246-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-bt5pp\" (UID: \"5bc629a2-3424-4180-a506-1accee2c0246\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-bt5pp" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.109414 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/88ae6ca7-3631-4aee-820c-5c979a0d4f02-srv-cert\") pod \"catalog-operator-68c6474976-cqmp6\" (UID: \"88ae6ca7-3631-4aee-820c-5c979a0d4f02\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cqmp6" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.109494 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/887d134b-7eff-41e9-abf8-f0e70fd4c0e1-cert\") pod \"ingress-canary-mlnnl\" (UID: \"887d134b-7eff-41e9-abf8-f0e70fd4c0e1\") " pod="openshift-ingress-canary/ingress-canary-mlnnl" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.109593 
4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntzb8\" (UniqueName: \"kubernetes.io/projected/e0073037-950f-40df-bcb6-d9fd0aceccb2-kube-api-access-ntzb8\") pod \"service-ca-operator-777779d784-4k997\" (UID: \"e0073037-950f-40df-bcb6-d9fd0aceccb2\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4k997" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.109740 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/284b66a9-09b0-4bc0-bc8a-5bd32f06d088-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-sjtsw\" (UID: \"284b66a9-09b0-4bc0-bc8a-5bd32f06d088\") " pod="openshift-authentication/oauth-openshift-558db77b4-sjtsw" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.109824 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0073037-950f-40df-bcb6-d9fd0aceccb2-serving-cert\") pod \"service-ca-operator-777779d784-4k997\" (UID: \"e0073037-950f-40df-bcb6-d9fd0aceccb2\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4k997" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.109931 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjmcn\" (UniqueName: \"kubernetes.io/projected/2305c756-5c61-4d13-aac9-1ff5d3c6b2ad-kube-api-access-hjmcn\") pod \"route-controller-manager-6576b87f9c-f2gjt\" (UID: \"2305c756-5c61-4d13-aac9-1ff5d3c6b2ad\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-f2gjt" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.110021 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/68717faf-e06c-4833-9b59-3e6279b38a6d-kube-api-access\") pod 
\"kube-controller-manager-operator-78b949d7b-f677q\" (UID: \"68717faf-e06c-4833-9b59-3e6279b38a6d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f677q" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.110111 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z52pv\" (UniqueName: \"kubernetes.io/projected/3f8744f8-b334-4487-a52b-c6b6f226401d-kube-api-access-z52pv\") pod \"ingress-operator-5b745b69d9-85wtv\" (UID: \"3f8744f8-b334-4487-a52b-c6b6f226401d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-85wtv" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.110202 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgpg8\" (UniqueName: \"kubernetes.io/projected/224b66ec-aec1-4d9e-96c0-b6c7ddeb37e7-kube-api-access-hgpg8\") pod \"service-ca-9c57cc56f-65mgg\" (UID: \"224b66ec-aec1-4d9e-96c0-b6c7ddeb37e7\") " pod="openshift-service-ca/service-ca-9c57cc56f-65mgg" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.110388 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9d6b106d-6589-40f2-b694-eb158c541d82-registry-tls\") pod \"image-registry-697d97f7c8-24m6m\" (UID: \"9d6b106d-6589-40f2-b694-eb158c541d82\") " pod="openshift-image-registry/image-registry-697d97f7c8-24m6m" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.110469 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mfxs\" (UniqueName: \"kubernetes.io/projected/5690cb54-1fbe-4d33-a809-b7bdca4df6c0-kube-api-access-2mfxs\") pod \"control-plane-machine-set-operator-78cbb6b69f-rt2nd\" (UID: \"5690cb54-1fbe-4d33-a809-b7bdca4df6c0\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rt2nd" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.110547 4743 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2305c756-5c61-4d13-aac9-1ff5d3c6b2ad-client-ca\") pod \"route-controller-manager-6576b87f9c-f2gjt\" (UID: \"2305c756-5c61-4d13-aac9-1ff5d3c6b2ad\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-f2gjt" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.110636 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnbhz\" (UniqueName: \"kubernetes.io/projected/1c78dcdb-20e7-4e37-95ba-c3576a63b2cf-kube-api-access-hnbhz\") pod \"olm-operator-6b444d44fb-s8jrj\" (UID: \"1c78dcdb-20e7-4e37-95ba-c3576a63b2cf\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-s8jrj" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.110730 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b489193c-9eab-4025-8c8f-23b3b42bae0e-config-volume\") pod \"dns-default-smtjk\" (UID: \"b489193c-9eab-4025-8c8f-23b3b42bae0e\") " pod="openshift-dns/dns-default-smtjk" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.110838 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9d6b106d-6589-40f2-b694-eb158c541d82-bound-sa-token\") pod \"image-registry-697d97f7c8-24m6m\" (UID: \"9d6b106d-6589-40f2-b694-eb158c541d82\") " pod="openshift-image-registry/image-registry-697d97f7c8-24m6m" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.110934 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kth6j\" (UniqueName: \"kubernetes.io/projected/6e2550ee-2c0c-437b-a4e3-3332ffba4e48-kube-api-access-kth6j\") pod \"machine-config-operator-74547568cd-bc6rx\" (UID: \"6e2550ee-2c0c-437b-a4e3-3332ffba4e48\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bc6rx" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.111024 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmpvd\" (UniqueName: \"kubernetes.io/projected/887d134b-7eff-41e9-abf8-f0e70fd4c0e1-kube-api-access-hmpvd\") pod \"ingress-canary-mlnnl\" (UID: \"887d134b-7eff-41e9-abf8-f0e70fd4c0e1\") " pod="openshift-ingress-canary/ingress-canary-mlnnl" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.111118 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/006ce2c3-4b48-4621-acc0-50428bb1c862-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-rdwmd\" (UID: \"006ce2c3-4b48-4621-acc0-50428bb1c862\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rdwmd" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.111215 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2305c756-5c61-4d13-aac9-1ff5d3c6b2ad-serving-cert\") pod \"route-controller-manager-6576b87f9c-f2gjt\" (UID: \"2305c756-5c61-4d13-aac9-1ff5d3c6b2ad\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-f2gjt" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.111306 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/16bef631-ee0f-4346-bb9b-c6eb48a09448-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xgtfn\" (UID: \"16bef631-ee0f-4346-bb9b-c6eb48a09448\") " pod="openshift-marketplace/marketplace-operator-79b997595-xgtfn" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.111397 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/284b66a9-09b0-4bc0-bc8a-5bd32f06d088-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-sjtsw\" (UID: \"284b66a9-09b0-4bc0-bc8a-5bd32f06d088\") " pod="openshift-authentication/oauth-openshift-558db77b4-sjtsw" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.111495 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/284b66a9-09b0-4bc0-bc8a-5bd32f06d088-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-sjtsw\" (UID: \"284b66a9-09b0-4bc0-bc8a-5bd32f06d088\") " pod="openshift-authentication/oauth-openshift-558db77b4-sjtsw" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.111592 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zlgv\" (UniqueName: \"kubernetes.io/projected/16bef631-ee0f-4346-bb9b-c6eb48a09448-kube-api-access-2zlgv\") pod \"marketplace-operator-79b997595-xgtfn\" (UID: \"16bef631-ee0f-4346-bb9b-c6eb48a09448\") " pod="openshift-marketplace/marketplace-operator-79b997595-xgtfn" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.111691 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lv7cl\" (UniqueName: \"kubernetes.io/projected/122090c8-fe9b-404f-a9f3-a2ee9520091a-kube-api-access-lv7cl\") pod \"package-server-manager-789f6589d5-62mz2\" (UID: \"122090c8-fe9b-404f-a9f3-a2ee9520091a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-62mz2" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.111781 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9d6b106d-6589-40f2-b694-eb158c541d82-installation-pull-secrets\") pod \"image-registry-697d97f7c8-24m6m\" (UID: 
\"9d6b106d-6589-40f2-b694-eb158c541d82\") " pod="openshift-image-registry/image-registry-697d97f7c8-24m6m" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.111882 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6e2550ee-2c0c-437b-a4e3-3332ffba4e48-proxy-tls\") pod \"machine-config-operator-74547568cd-bc6rx\" (UID: \"6e2550ee-2c0c-437b-a4e3-3332ffba4e48\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bc6rx" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.112202 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-24m6m\" (UID: \"9d6b106d-6589-40f2-b694-eb158c541d82\") " pod="openshift-image-registry/image-registry-697d97f7c8-24m6m" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.112350 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2305c756-5c61-4d13-aac9-1ff5d3c6b2ad-config\") pod \"route-controller-manager-6576b87f9c-f2gjt\" (UID: \"2305c756-5c61-4d13-aac9-1ff5d3c6b2ad\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-f2gjt" Oct 11 00:54:12 crc kubenswrapper[4743]: E1011 00:54:12.112662 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-11 00:54:12.612650141 +0000 UTC m=+147.265630538 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-24m6m" (UID: "9d6b106d-6589-40f2-b694-eb158c541d82") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.112693 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/284b66a9-09b0-4bc0-bc8a-5bd32f06d088-audit-policies\") pod \"oauth-openshift-558db77b4-sjtsw\" (UID: \"284b66a9-09b0-4bc0-bc8a-5bd32f06d088\") " pod="openshift-authentication/oauth-openshift-558db77b4-sjtsw" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.113500 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/1ae9d5c5-6a97-483f-9b76-45c73259c85b-default-certificate\") pod \"router-default-5444994796-6srbj\" (UID: \"1ae9d5c5-6a97-483f-9b76-45c73259c85b\") " pod="openshift-ingress/router-default-5444994796-6srbj" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.108933 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1ae9d5c5-6a97-483f-9b76-45c73259c85b-metrics-certs\") pod \"router-default-5444994796-6srbj\" (UID: \"1ae9d5c5-6a97-483f-9b76-45c73259c85b\") " pod="openshift-ingress/router-default-5444994796-6srbj" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.113932 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/1ae9d5c5-6a97-483f-9b76-45c73259c85b-stats-auth\") pod \"router-default-5444994796-6srbj\" (UID: \"1ae9d5c5-6a97-483f-9b76-45c73259c85b\") " 
pod="openshift-ingress/router-default-5444994796-6srbj" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.114375 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/122090c8-fe9b-404f-a9f3-a2ee9520091a-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-62mz2\" (UID: \"122090c8-fe9b-404f-a9f3-a2ee9520091a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-62mz2" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.114770 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1c78dcdb-20e7-4e37-95ba-c3576a63b2cf-srv-cert\") pod \"olm-operator-6b444d44fb-s8jrj\" (UID: \"1c78dcdb-20e7-4e37-95ba-c3576a63b2cf\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-s8jrj" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.114797 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68717faf-e06c-4833-9b59-3e6279b38a6d-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-f677q\" (UID: \"68717faf-e06c-4833-9b59-3e6279b38a6d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f677q" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.109379 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6e2550ee-2c0c-437b-a4e3-3332ffba4e48-images\") pod \"machine-config-operator-74547568cd-bc6rx\" (UID: \"6e2550ee-2c0c-437b-a4e3-3332ffba4e48\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bc6rx" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.115290 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/284b66a9-09b0-4bc0-bc8a-5bd32f06d088-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-sjtsw\" (UID: \"284b66a9-09b0-4bc0-bc8a-5bd32f06d088\") " pod="openshift-authentication/oauth-openshift-558db77b4-sjtsw" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.115758 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/006ce2c3-4b48-4621-acc0-50428bb1c862-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-rdwmd\" (UID: \"006ce2c3-4b48-4621-acc0-50428bb1c862\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rdwmd" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.116620 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9d6b106d-6589-40f2-b694-eb158c541d82-registry-tls\") pod \"image-registry-697d97f7c8-24m6m\" (UID: \"9d6b106d-6589-40f2-b694-eb158c541d82\") " pod="openshift-image-registry/image-registry-697d97f7c8-24m6m" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.110513 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/284b66a9-09b0-4bc0-bc8a-5bd32f06d088-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-sjtsw\" (UID: \"284b66a9-09b0-4bc0-bc8a-5bd32f06d088\") " pod="openshift-authentication/oauth-openshift-558db77b4-sjtsw" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.116959 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/284b66a9-09b0-4bc0-bc8a-5bd32f06d088-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-sjtsw\" (UID: \"284b66a9-09b0-4bc0-bc8a-5bd32f06d088\") " pod="openshift-authentication/oauth-openshift-558db77b4-sjtsw" Oct 11 00:54:12 crc 
kubenswrapper[4743]: I1011 00:54:12.117189 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1c78dcdb-20e7-4e37-95ba-c3576a63b2cf-profile-collector-cert\") pod \"olm-operator-6b444d44fb-s8jrj\" (UID: \"1c78dcdb-20e7-4e37-95ba-c3576a63b2cf\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-s8jrj" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.117719 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/284b66a9-09b0-4bc0-bc8a-5bd32f06d088-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-sjtsw\" (UID: \"284b66a9-09b0-4bc0-bc8a-5bd32f06d088\") " pod="openshift-authentication/oauth-openshift-558db77b4-sjtsw" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.118419 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/5690cb54-1fbe-4d33-a809-b7bdca4df6c0-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-rt2nd\" (UID: \"5690cb54-1fbe-4d33-a809-b7bdca4df6c0\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rt2nd" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.109466 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68717faf-e06c-4833-9b59-3e6279b38a6d-config\") pod \"kube-controller-manager-operator-78b949d7b-f677q\" (UID: \"68717faf-e06c-4833-9b59-3e6279b38a6d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f677q" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.118463 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/7ffea29d-93cb-4136-b0bc-7861c20751d4-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-wm7rl\" (UID: \"7ffea29d-93cb-4136-b0bc-7861c20751d4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-wm7rl" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.110037 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ae9d5c5-6a97-483f-9b76-45c73259c85b-service-ca-bundle\") pod \"router-default-5444994796-6srbj\" (UID: \"1ae9d5c5-6a97-483f-9b76-45c73259c85b\") " pod="openshift-ingress/router-default-5444994796-6srbj" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.105195 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bc629a2-3424-4180-a506-1accee2c0246-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-bt5pp\" (UID: \"5bc629a2-3424-4180-a506-1accee2c0246\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-bt5pp" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.118668 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/284b66a9-09b0-4bc0-bc8a-5bd32f06d088-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-sjtsw\" (UID: \"284b66a9-09b0-4bc0-bc8a-5bd32f06d088\") " pod="openshift-authentication/oauth-openshift-558db77b4-sjtsw" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.119264 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5bc629a2-3424-4180-a506-1accee2c0246-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-bt5pp\" (UID: \"5bc629a2-3424-4180-a506-1accee2c0246\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-bt5pp" Oct 11 00:54:12 
crc kubenswrapper[4743]: I1011 00:54:12.119276 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/006ce2c3-4b48-4621-acc0-50428bb1c862-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-rdwmd\" (UID: \"006ce2c3-4b48-4621-acc0-50428bb1c862\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rdwmd" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.120099 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2305c756-5c61-4d13-aac9-1ff5d3c6b2ad-client-ca\") pod \"route-controller-manager-6576b87f9c-f2gjt\" (UID: \"2305c756-5c61-4d13-aac9-1ff5d3c6b2ad\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-f2gjt" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.122166 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/224b66ec-aec1-4d9e-96c0-b6c7ddeb37e7-signing-key\") pod \"service-ca-9c57cc56f-65mgg\" (UID: \"224b66ec-aec1-4d9e-96c0-b6c7ddeb37e7\") " pod="openshift-service-ca/service-ca-9c57cc56f-65mgg" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.122696 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f3263799-b7d3-4b1b-a61b-4768a061e502-proxy-tls\") pod \"machine-config-controller-84d6567774-v98q2\" (UID: \"f3263799-b7d3-4b1b-a61b-4768a061e502\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-v98q2" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.122814 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/284b66a9-09b0-4bc0-bc8a-5bd32f06d088-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-sjtsw\" (UID: 
\"284b66a9-09b0-4bc0-bc8a-5bd32f06d088\") " pod="openshift-authentication/oauth-openshift-558db77b4-sjtsw" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.124533 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/88ae6ca7-3631-4aee-820c-5c979a0d4f02-srv-cert\") pod \"catalog-operator-68c6474976-cqmp6\" (UID: \"88ae6ca7-3631-4aee-820c-5c979a0d4f02\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cqmp6" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.129336 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6e2550ee-2c0c-437b-a4e3-3332ffba4e48-proxy-tls\") pod \"machine-config-operator-74547568cd-bc6rx\" (UID: \"6e2550ee-2c0c-437b-a4e3-3332ffba4e48\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bc6rx" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.129464 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/284b66a9-09b0-4bc0-bc8a-5bd32f06d088-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-sjtsw\" (UID: \"284b66a9-09b0-4bc0-bc8a-5bd32f06d088\") " pod="openshift-authentication/oauth-openshift-558db77b4-sjtsw" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.129568 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/88ae6ca7-3631-4aee-820c-5c979a0d4f02-profile-collector-cert\") pod \"catalog-operator-68c6474976-cqmp6\" (UID: \"88ae6ca7-3631-4aee-820c-5c979a0d4f02\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cqmp6" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.129744 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/9d6b106d-6589-40f2-b694-eb158c541d82-installation-pull-secrets\") pod \"image-registry-697d97f7c8-24m6m\" (UID: \"9d6b106d-6589-40f2-b694-eb158c541d82\") " pod="openshift-image-registry/image-registry-697d97f7c8-24m6m" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.129889 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/284b66a9-09b0-4bc0-bc8a-5bd32f06d088-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-sjtsw\" (UID: \"284b66a9-09b0-4bc0-bc8a-5bd32f06d088\") " pod="openshift-authentication/oauth-openshift-558db77b4-sjtsw" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.129849 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2305c756-5c61-4d13-aac9-1ff5d3c6b2ad-serving-cert\") pod \"route-controller-manager-6576b87f9c-f2gjt\" (UID: \"2305c756-5c61-4d13-aac9-1ff5d3c6b2ad\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-f2gjt" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.130037 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3f8744f8-b334-4487-a52b-c6b6f226401d-metrics-tls\") pod \"ingress-operator-5b745b69d9-85wtv\" (UID: \"3f8744f8-b334-4487-a52b-c6b6f226401d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-85wtv" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.130180 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/284b66a9-09b0-4bc0-bc8a-5bd32f06d088-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-sjtsw\" (UID: \"284b66a9-09b0-4bc0-bc8a-5bd32f06d088\") " pod="openshift-authentication/oauth-openshift-558db77b4-sjtsw" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 
00:54:12.133019 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pttq\" (UniqueName: \"kubernetes.io/projected/284b66a9-09b0-4bc0-bc8a-5bd32f06d088-kube-api-access-4pttq\") pod \"oauth-openshift-558db77b4-sjtsw\" (UID: \"284b66a9-09b0-4bc0-bc8a-5bd32f06d088\") " pod="openshift-authentication/oauth-openshift-558db77b4-sjtsw" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.145065 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gd5tn" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.156517 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdxls\" (UniqueName: \"kubernetes.io/projected/7ffea29d-93cb-4136-b0bc-7861c20751d4-kube-api-access-qdxls\") pod \"multus-admission-controller-857f4d67dd-wm7rl\" (UID: \"7ffea29d-93cb-4136-b0bc-7861c20751d4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-wm7rl" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.157631 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-sjtsw" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.170940 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9w57f" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.175639 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5bc629a2-3424-4180-a506-1accee2c0246-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-bt5pp\" (UID: \"5bc629a2-3424-4180-a506-1accee2c0246\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-bt5pp" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.189748 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-bt5pp" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.194180 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3f8744f8-b334-4487-a52b-c6b6f226401d-bound-sa-token\") pod \"ingress-operator-5b745b69d9-85wtv\" (UID: \"3f8744f8-b334-4487-a52b-c6b6f226401d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-85wtv" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.215452 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.215768 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0073037-950f-40df-bcb6-d9fd0aceccb2-serving-cert\") pod \"service-ca-operator-777779d784-4k997\" (UID: \"e0073037-950f-40df-bcb6-d9fd0aceccb2\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-4k997" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.215904 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b489193c-9eab-4025-8c8f-23b3b42bae0e-config-volume\") pod \"dns-default-smtjk\" (UID: \"b489193c-9eab-4025-8c8f-23b3b42bae0e\") " pod="openshift-dns/dns-default-smtjk" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.215992 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmpvd\" (UniqueName: \"kubernetes.io/projected/887d134b-7eff-41e9-abf8-f0e70fd4c0e1-kube-api-access-hmpvd\") pod \"ingress-canary-mlnnl\" (UID: \"887d134b-7eff-41e9-abf8-f0e70fd4c0e1\") " pod="openshift-ingress-canary/ingress-canary-mlnnl" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.216018 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/16bef631-ee0f-4346-bb9b-c6eb48a09448-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xgtfn\" (UID: \"16bef631-ee0f-4346-bb9b-c6eb48a09448\") " pod="openshift-marketplace/marketplace-operator-79b997595-xgtfn" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.216066 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zlgv\" (UniqueName: \"kubernetes.io/projected/16bef631-ee0f-4346-bb9b-c6eb48a09448-kube-api-access-2zlgv\") pod \"marketplace-operator-79b997595-xgtfn\" (UID: \"16bef631-ee0f-4346-bb9b-c6eb48a09448\") " pod="openshift-marketplace/marketplace-operator-79b997595-xgtfn" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.216163 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/544ad287-fc9b-4033-b139-3893a3e10a00-apiservice-cert\") pod 
\"packageserver-d55dfcdfc-ccjcp\" (UID: \"544ad287-fc9b-4033-b139-3893a3e10a00\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ccjcp" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.216188 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/962b1e46-ba63-4aa2-9882-a04487a05813-config-volume\") pod \"collect-profiles-29335725-pg67v\" (UID: \"962b1e46-ba63-4aa2-9882-a04487a05813\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335725-pg67v" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.216237 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtqv5\" (UniqueName: \"kubernetes.io/projected/544ad287-fc9b-4033-b139-3893a3e10a00-kube-api-access-jtqv5\") pod \"packageserver-d55dfcdfc-ccjcp\" (UID: \"544ad287-fc9b-4033-b139-3893a3e10a00\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ccjcp" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.216402 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/37c86724-d259-449c-baf5-b8965ea658a9-node-bootstrap-token\") pod \"machine-config-server-6zk27\" (UID: \"37c86724-d259-449c-baf5-b8965ea658a9\") " pod="openshift-machine-config-operator/machine-config-server-6zk27" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.216450 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/4a97f1cb-48e1-4049-be4a-0151201d5cc3-mountpoint-dir\") pod \"csi-hostpathplugin-grqvq\" (UID: \"4a97f1cb-48e1-4049-be4a-0151201d5cc3\") " pod="hostpath-provisioner/csi-hostpathplugin-grqvq" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.216476 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: 
\"kubernetes.io/host-path/4a97f1cb-48e1-4049-be4a-0151201d5cc3-csi-data-dir\") pod \"csi-hostpathplugin-grqvq\" (UID: \"4a97f1cb-48e1-4049-be4a-0151201d5cc3\") " pod="hostpath-provisioner/csi-hostpathplugin-grqvq" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.216535 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/962b1e46-ba63-4aa2-9882-a04487a05813-secret-volume\") pod \"collect-profiles-29335725-pg67v\" (UID: \"962b1e46-ba63-4aa2-9882-a04487a05813\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335725-pg67v" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.216557 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/16bef631-ee0f-4346-bb9b-c6eb48a09448-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xgtfn\" (UID: \"16bef631-ee0f-4346-bb9b-c6eb48a09448\") " pod="openshift-marketplace/marketplace-operator-79b997595-xgtfn" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.216579 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/4a97f1cb-48e1-4049-be4a-0151201d5cc3-socket-dir\") pod \"csi-hostpathplugin-grqvq\" (UID: \"4a97f1cb-48e1-4049-be4a-0151201d5cc3\") " pod="hostpath-provisioner/csi-hostpathplugin-grqvq" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.216626 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mvml\" (UniqueName: \"kubernetes.io/projected/b489193c-9eab-4025-8c8f-23b3b42bae0e-kube-api-access-9mvml\") pod \"dns-default-smtjk\" (UID: \"b489193c-9eab-4025-8c8f-23b3b42bae0e\") " pod="openshift-dns/dns-default-smtjk" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.216694 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" 
(UniqueName: \"kubernetes.io/secret/b489193c-9eab-4025-8c8f-23b3b42bae0e-metrics-tls\") pod \"dns-default-smtjk\" (UID: \"b489193c-9eab-4025-8c8f-23b3b42bae0e\") " pod="openshift-dns/dns-default-smtjk" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.216727 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlwdg\" (UniqueName: \"kubernetes.io/projected/37c86724-d259-449c-baf5-b8965ea658a9-kube-api-access-qlwdg\") pod \"machine-config-server-6zk27\" (UID: \"37c86724-d259-449c-baf5-b8965ea658a9\") " pod="openshift-machine-config-operator/machine-config-server-6zk27" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.216748 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lk7df\" (UniqueName: \"kubernetes.io/projected/4a97f1cb-48e1-4049-be4a-0151201d5cc3-kube-api-access-lk7df\") pod \"csi-hostpathplugin-grqvq\" (UID: \"4a97f1cb-48e1-4049-be4a-0151201d5cc3\") " pod="hostpath-provisioner/csi-hostpathplugin-grqvq" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.216798 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8msh\" (UniqueName: \"kubernetes.io/projected/962b1e46-ba63-4aa2-9882-a04487a05813-kube-api-access-n8msh\") pod \"collect-profiles-29335725-pg67v\" (UID: \"962b1e46-ba63-4aa2-9882-a04487a05813\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335725-pg67v" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.216798 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b489193c-9eab-4025-8c8f-23b3b42bae0e-config-volume\") pod \"dns-default-smtjk\" (UID: \"b489193c-9eab-4025-8c8f-23b3b42bae0e\") " pod="openshift-dns/dns-default-smtjk" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.216823 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" 
(UniqueName: \"kubernetes.io/secret/544ad287-fc9b-4033-b139-3893a3e10a00-webhook-cert\") pod \"packageserver-d55dfcdfc-ccjcp\" (UID: \"544ad287-fc9b-4033-b139-3893a3e10a00\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ccjcp" Oct 11 00:54:12 crc kubenswrapper[4743]: E1011 00:54:12.216891 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 00:54:12.716878476 +0000 UTC m=+147.369858873 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.216943 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/4a97f1cb-48e1-4049-be4a-0151201d5cc3-plugins-dir\") pod \"csi-hostpathplugin-grqvq\" (UID: \"4a97f1cb-48e1-4049-be4a-0151201d5cc3\") " pod="hostpath-provisioner/csi-hostpathplugin-grqvq" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.216966 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/37c86724-d259-449c-baf5-b8965ea658a9-certs\") pod \"machine-config-server-6zk27\" (UID: \"37c86724-d259-449c-baf5-b8965ea658a9\") " pod="openshift-machine-config-operator/machine-config-server-6zk27" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.216982 4743 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0073037-950f-40df-bcb6-d9fd0aceccb2-config\") pod \"service-ca-operator-777779d784-4k997\" (UID: \"e0073037-950f-40df-bcb6-d9fd0aceccb2\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4k997" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.217003 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/544ad287-fc9b-4033-b139-3893a3e10a00-tmpfs\") pod \"packageserver-d55dfcdfc-ccjcp\" (UID: \"544ad287-fc9b-4033-b139-3893a3e10a00\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ccjcp" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.217021 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/4a97f1cb-48e1-4049-be4a-0151201d5cc3-registration-dir\") pod \"csi-hostpathplugin-grqvq\" (UID: \"4a97f1cb-48e1-4049-be4a-0151201d5cc3\") " pod="hostpath-provisioner/csi-hostpathplugin-grqvq" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.217044 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/887d134b-7eff-41e9-abf8-f0e70fd4c0e1-cert\") pod \"ingress-canary-mlnnl\" (UID: \"887d134b-7eff-41e9-abf8-f0e70fd4c0e1\") " pod="openshift-ingress-canary/ingress-canary-mlnnl" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.217062 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntzb8\" (UniqueName: \"kubernetes.io/projected/e0073037-950f-40df-bcb6-d9fd0aceccb2-kube-api-access-ntzb8\") pod \"service-ca-operator-777779d784-4k997\" (UID: \"e0073037-950f-40df-bcb6-d9fd0aceccb2\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4k997" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.218169 4743 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/4a97f1cb-48e1-4049-be4a-0151201d5cc3-csi-data-dir\") pod \"csi-hostpathplugin-grqvq\" (UID: \"4a97f1cb-48e1-4049-be4a-0151201d5cc3\") " pod="hostpath-provisioner/csi-hostpathplugin-grqvq" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.220125 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0073037-950f-40df-bcb6-d9fd0aceccb2-serving-cert\") pod \"service-ca-operator-777779d784-4k997\" (UID: \"e0073037-950f-40df-bcb6-d9fd0aceccb2\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4k997" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.220278 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/4a97f1cb-48e1-4049-be4a-0151201d5cc3-plugins-dir\") pod \"csi-hostpathplugin-grqvq\" (UID: \"4a97f1cb-48e1-4049-be4a-0151201d5cc3\") " pod="hostpath-provisioner/csi-hostpathplugin-grqvq" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.222005 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-cgd8f"] Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.222524 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/962b1e46-ba63-4aa2-9882-a04487a05813-config-volume\") pod \"collect-profiles-29335725-pg67v\" (UID: \"962b1e46-ba63-4aa2-9882-a04487a05813\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335725-pg67v" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.222631 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mv5d7\" (UniqueName: \"kubernetes.io/projected/f3263799-b7d3-4b1b-a61b-4768a061e502-kube-api-access-mv5d7\") pod \"machine-config-controller-84d6567774-v98q2\" (UID: 
\"f3263799-b7d3-4b1b-a61b-4768a061e502\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-v98q2" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.222694 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/4a97f1cb-48e1-4049-be4a-0151201d5cc3-mountpoint-dir\") pod \"csi-hostpathplugin-grqvq\" (UID: \"4a97f1cb-48e1-4049-be4a-0151201d5cc3\") " pod="hostpath-provisioner/csi-hostpathplugin-grqvq" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.222934 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/544ad287-fc9b-4033-b139-3893a3e10a00-webhook-cert\") pod \"packageserver-d55dfcdfc-ccjcp\" (UID: \"544ad287-fc9b-4033-b139-3893a3e10a00\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ccjcp" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.222962 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/4a97f1cb-48e1-4049-be4a-0151201d5cc3-socket-dir\") pod \"csi-hostpathplugin-grqvq\" (UID: \"4a97f1cb-48e1-4049-be4a-0151201d5cc3\") " pod="hostpath-provisioner/csi-hostpathplugin-grqvq" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.223186 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/37c86724-d259-449c-baf5-b8965ea658a9-node-bootstrap-token\") pod \"machine-config-server-6zk27\" (UID: \"37c86724-d259-449c-baf5-b8965ea658a9\") " pod="openshift-machine-config-operator/machine-config-server-6zk27" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.223599 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0073037-950f-40df-bcb6-d9fd0aceccb2-config\") pod \"service-ca-operator-777779d784-4k997\" (UID: 
\"e0073037-950f-40df-bcb6-d9fd0aceccb2\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4k997" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.223658 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/4a97f1cb-48e1-4049-be4a-0151201d5cc3-registration-dir\") pod \"csi-hostpathplugin-grqvq\" (UID: \"4a97f1cb-48e1-4049-be4a-0151201d5cc3\") " pod="hostpath-provisioner/csi-hostpathplugin-grqvq" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.223848 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/16bef631-ee0f-4346-bb9b-c6eb48a09448-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xgtfn\" (UID: \"16bef631-ee0f-4346-bb9b-c6eb48a09448\") " pod="openshift-marketplace/marketplace-operator-79b997595-xgtfn" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.224147 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-wm7rl" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.224796 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/16bef631-ee0f-4346-bb9b-c6eb48a09448-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xgtfn\" (UID: \"16bef631-ee0f-4346-bb9b-c6eb48a09448\") " pod="openshift-marketplace/marketplace-operator-79b997595-xgtfn" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.225244 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/37c86724-d259-449c-baf5-b8965ea658a9-certs\") pod \"machine-config-server-6zk27\" (UID: \"37c86724-d259-449c-baf5-b8965ea658a9\") " pod="openshift-machine-config-operator/machine-config-server-6zk27" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.229277 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/544ad287-fc9b-4033-b139-3893a3e10a00-apiservice-cert\") pod \"packageserver-d55dfcdfc-ccjcp\" (UID: \"544ad287-fc9b-4033-b139-3893a3e10a00\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ccjcp" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.231893 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/887d134b-7eff-41e9-abf8-f0e70fd4c0e1-cert\") pod \"ingress-canary-mlnnl\" (UID: \"887d134b-7eff-41e9-abf8-f0e70fd4c0e1\") " pod="openshift-ingress-canary/ingress-canary-mlnnl" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.231923 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b489193c-9eab-4025-8c8f-23b3b42bae0e-metrics-tls\") pod \"dns-default-smtjk\" (UID: \"b489193c-9eab-4025-8c8f-23b3b42bae0e\") " 
pod="openshift-dns/dns-default-smtjk" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.234257 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/544ad287-fc9b-4033-b139-3893a3e10a00-tmpfs\") pod \"packageserver-d55dfcdfc-ccjcp\" (UID: \"544ad287-fc9b-4033-b139-3893a3e10a00\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ccjcp" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.238088 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/962b1e46-ba63-4aa2-9882-a04487a05813-secret-volume\") pod \"collect-profiles-29335725-pg67v\" (UID: \"962b1e46-ba63-4aa2-9882-a04487a05813\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335725-pg67v" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.241525 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-468m5"] Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.249240 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-wlvjw"] Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.256657 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-229tw"] Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.260961 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgjjc\" (UniqueName: \"kubernetes.io/projected/eaad8a68-7587-4d59-9b94-e8dcdc1869f8-kube-api-access-pgjjc\") pod \"migrator-59844c95c7-z2mc9\" (UID: \"eaad8a68-7587-4d59-9b94-e8dcdc1869f8\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-z2mc9" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.278685 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wk5j7\" (UniqueName: 
\"kubernetes.io/projected/88ae6ca7-3631-4aee-820c-5c979a0d4f02-kube-api-access-wk5j7\") pod \"catalog-operator-68c6474976-cqmp6\" (UID: \"88ae6ca7-3631-4aee-820c-5c979a0d4f02\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cqmp6" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.310811 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmjtw\" (UniqueName: \"kubernetes.io/projected/006ce2c3-4b48-4621-acc0-50428bb1c862-kube-api-access-gmjtw\") pod \"kube-storage-version-migrator-operator-b67b599dd-rdwmd\" (UID: \"006ce2c3-4b48-4621-acc0-50428bb1c862\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rdwmd" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.318150 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-24m6m\" (UID: \"9d6b106d-6589-40f2-b694-eb158c541d82\") " pod="openshift-image-registry/image-registry-697d97f7c8-24m6m" Oct 11 00:54:12 crc kubenswrapper[4743]: E1011 00:54:12.318433 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-11 00:54:12.818422421 +0000 UTC m=+147.471402818 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-24m6m" (UID: "9d6b106d-6589-40f2-b694-eb158c541d82") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.320338 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bbkn\" (UniqueName: \"kubernetes.io/projected/1ae9d5c5-6a97-483f-9b76-45c73259c85b-kube-api-access-2bbkn\") pod \"router-default-5444994796-6srbj\" (UID: \"1ae9d5c5-6a97-483f-9b76-45c73259c85b\") " pod="openshift-ingress/router-default-5444994796-6srbj" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.324786 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-qksm9"] Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.326085 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-fv4x7"] Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.343207 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgpg8\" (UniqueName: \"kubernetes.io/projected/224b66ec-aec1-4d9e-96c0-b6c7ddeb37e7-kube-api-access-hgpg8\") pod \"service-ca-9c57cc56f-65mgg\" (UID: \"224b66ec-aec1-4d9e-96c0-b6c7ddeb37e7\") " pod="openshift-service-ca/service-ca-9c57cc56f-65mgg" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.349827 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-wswrq"] Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.353451 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kzxfv"] Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.363316 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-9npzd"] Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.365063 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swbdz\" (UniqueName: \"kubernetes.io/projected/9d6b106d-6589-40f2-b694-eb158c541d82-kube-api-access-swbdz\") pod \"image-registry-697d97f7c8-24m6m\" (UID: \"9d6b106d-6589-40f2-b694-eb158c541d82\") " pod="openshift-image-registry/image-registry-697d97f7c8-24m6m" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.383284 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mfxs\" (UniqueName: \"kubernetes.io/projected/5690cb54-1fbe-4d33-a809-b7bdca4df6c0-kube-api-access-2mfxs\") pod \"control-plane-machine-set-operator-78cbb6b69f-rt2nd\" (UID: \"5690cb54-1fbe-4d33-a809-b7bdca4df6c0\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rt2nd" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.393806 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lv7cl\" (UniqueName: \"kubernetes.io/projected/122090c8-fe9b-404f-a9f3-a2ee9520091a-kube-api-access-lv7cl\") pod \"package-server-manager-789f6589d5-62mz2\" (UID: \"122090c8-fe9b-404f-a9f3-a2ee9520091a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-62mz2" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.415690 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjmcn\" (UniqueName: \"kubernetes.io/projected/2305c756-5c61-4d13-aac9-1ff5d3c6b2ad-kube-api-access-hjmcn\") pod \"route-controller-manager-6576b87f9c-f2gjt\" (UID: \"2305c756-5c61-4d13-aac9-1ff5d3c6b2ad\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-f2gjt" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.418666 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 00:54:12 crc kubenswrapper[4743]: E1011 00:54:12.418770 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 00:54:12.918756659 +0000 UTC m=+147.571737046 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.419678 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-24m6m\" (UID: \"9d6b106d-6589-40f2-b694-eb158c541d82\") " pod="openshift-image-registry/image-registry-697d97f7c8-24m6m" Oct 11 00:54:12 crc kubenswrapper[4743]: E1011 00:54:12.419944 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2025-10-11 00:54:12.919936574 +0000 UTC m=+147.572916971 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-24m6m" (UID: "9d6b106d-6589-40f2-b694-eb158c541d82") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.437560 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-f2gjt" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.438066 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/68717faf-e06c-4833-9b59-3e6279b38a6d-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-f677q\" (UID: \"68717faf-e06c-4833-9b59-3e6279b38a6d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f677q" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.463442 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9d6b106d-6589-40f2-b694-eb158c541d82-bound-sa-token\") pod \"image-registry-697d97f7c8-24m6m\" (UID: \"9d6b106d-6589-40f2-b694-eb158c541d82\") " pod="openshift-image-registry/image-registry-697d97f7c8-24m6m" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.464542 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f677q" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.476983 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-v98q2" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.478173 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z52pv\" (UniqueName: \"kubernetes.io/projected/3f8744f8-b334-4487-a52b-c6b6f226401d-kube-api-access-z52pv\") pod \"ingress-operator-5b745b69d9-85wtv\" (UID: \"3f8744f8-b334-4487-a52b-c6b6f226401d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-85wtv" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.484034 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rdwmd" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.496141 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-6srbj" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.501271 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kth6j\" (UniqueName: \"kubernetes.io/projected/6e2550ee-2c0c-437b-a4e3-3332ffba4e48-kube-api-access-kth6j\") pod \"machine-config-operator-74547568cd-bc6rx\" (UID: \"6e2550ee-2c0c-437b-a4e3-3332ffba4e48\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bc6rx" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.503400 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bc6rx" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.508849 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rt2nd" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.511537 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnbhz\" (UniqueName: \"kubernetes.io/projected/1c78dcdb-20e7-4e37-95ba-c3576a63b2cf-kube-api-access-hnbhz\") pod \"olm-operator-6b444d44fb-s8jrj\" (UID: \"1c78dcdb-20e7-4e37-95ba-c3576a63b2cf\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-s8jrj" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.514785 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-z2mc9" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.521309 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 00:54:12 crc kubenswrapper[4743]: E1011 00:54:12.521430 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 00:54:13.021393736 +0000 UTC m=+147.674374133 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.521554 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-24m6m\" (UID: \"9d6b106d-6589-40f2-b694-eb158c541d82\") " pod="openshift-image-registry/image-registry-697d97f7c8-24m6m" Oct 11 00:54:12 crc kubenswrapper[4743]: E1011 00:54:12.521983 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-11 00:54:13.021975174 +0000 UTC m=+147.674955571 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-24m6m" (UID: "9d6b106d-6589-40f2-b694-eb158c541d82") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.527335 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-s8jrj" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.533601 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-62mz2" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.533806 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntzb8\" (UniqueName: \"kubernetes.io/projected/e0073037-950f-40df-bcb6-d9fd0aceccb2-kube-api-access-ntzb8\") pod \"service-ca-operator-777779d784-4k997\" (UID: \"e0073037-950f-40df-bcb6-d9fd0aceccb2\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4k997" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.540115 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cqmp6" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.549003 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-65mgg" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.556336 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmpvd\" (UniqueName: \"kubernetes.io/projected/887d134b-7eff-41e9-abf8-f0e70fd4c0e1-kube-api-access-hmpvd\") pod \"ingress-canary-mlnnl\" (UID: \"887d134b-7eff-41e9-abf8-f0e70fd4c0e1\") " pod="openshift-ingress-canary/ingress-canary-mlnnl" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.571752 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-4k997" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.573720 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zlgv\" (UniqueName: \"kubernetes.io/projected/16bef631-ee0f-4346-bb9b-c6eb48a09448-kube-api-access-2zlgv\") pod \"marketplace-operator-79b997595-xgtfn\" (UID: \"16bef631-ee0f-4346-bb9b-c6eb48a09448\") " pod="openshift-marketplace/marketplace-operator-79b997595-xgtfn" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.577882 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xgtfn" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.596699 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mvml\" (UniqueName: \"kubernetes.io/projected/b489193c-9eab-4025-8c8f-23b3b42bae0e-kube-api-access-9mvml\") pod \"dns-default-smtjk\" (UID: \"b489193c-9eab-4025-8c8f-23b3b42bae0e\") " pod="openshift-dns/dns-default-smtjk" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.623639 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 00:54:12 crc kubenswrapper[4743]: E1011 00:54:12.623793 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 00:54:13.123768536 +0000 UTC m=+147.776748933 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.623946 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-24m6m\" (UID: \"9d6b106d-6589-40f2-b694-eb158c541d82\") " pod="openshift-image-registry/image-registry-697d97f7c8-24m6m" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.624163 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lk7df\" (UniqueName: \"kubernetes.io/projected/4a97f1cb-48e1-4049-be4a-0151201d5cc3-kube-api-access-lk7df\") pod \"csi-hostpathplugin-grqvq\" (UID: \"4a97f1cb-48e1-4049-be4a-0151201d5cc3\") " pod="hostpath-provisioner/csi-hostpathplugin-grqvq" Oct 11 00:54:12 crc kubenswrapper[4743]: E1011 00:54:12.624373 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-11 00:54:13.124353543 +0000 UTC m=+147.777334020 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-24m6m" (UID: "9d6b106d-6589-40f2-b694-eb158c541d82") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.628568 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-mlnnl" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.635376 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlwdg\" (UniqueName: \"kubernetes.io/projected/37c86724-d259-449c-baf5-b8965ea658a9-kube-api-access-qlwdg\") pod \"machine-config-server-6zk27\" (UID: \"37c86724-d259-449c-baf5-b8965ea658a9\") " pod="openshift-machine-config-operator/machine-config-server-6zk27" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.645415 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gd5tn"] Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.652044 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8msh\" (UniqueName: \"kubernetes.io/projected/962b1e46-ba63-4aa2-9882-a04487a05813-kube-api-access-n8msh\") pod \"collect-profiles-29335725-pg67v\" (UID: \"962b1e46-ba63-4aa2-9882-a04487a05813\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335725-pg67v" Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.673556 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtqv5\" (UniqueName: \"kubernetes.io/projected/544ad287-fc9b-4033-b139-3893a3e10a00-kube-api-access-jtqv5\") pod \"packageserver-d55dfcdfc-ccjcp\" (UID: 
\"544ad287-fc9b-4033-b139-3893a3e10a00\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ccjcp" Oct 11 00:54:12 crc kubenswrapper[4743]: W1011 00:54:12.683367 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ae9d5c5_6a97_483f_9b76_45c73259c85b.slice/crio-d5c7db2cd6a64201f59d8085f784621434a9a7c3d6675cd74a2bdcae0d2e2763 WatchSource:0}: Error finding container d5c7db2cd6a64201f59d8085f784621434a9a7c3d6675cd74a2bdcae0d2e2763: Status 404 returned error can't find the container with id d5c7db2cd6a64201f59d8085f784621434a9a7c3d6675cd74a2bdcae0d2e2763 Oct 11 00:54:12 crc kubenswrapper[4743]: W1011 00:54:12.715376 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode59e36c1_5053_44bc_bb35_cab447a646fb.slice/crio-1ec78f09a5c173fa3ab8a6dcfe8261798a70ae9892ce1367336255d8681bb2a9 WatchSource:0}: Error finding container 1ec78f09a5c173fa3ab8a6dcfe8261798a70ae9892ce1367336255d8681bb2a9: Status 404 returned error can't find the container with id 1ec78f09a5c173fa3ab8a6dcfe8261798a70ae9892ce1367336255d8681bb2a9 Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.725543 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 00:54:12 crc kubenswrapper[4743]: E1011 00:54:12.725754 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 00:54:13.225683691 +0000 UTC m=+147.878664098 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.725975 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-24m6m\" (UID: \"9d6b106d-6589-40f2-b694-eb158c541d82\") " pod="openshift-image-registry/image-registry-697d97f7c8-24m6m" Oct 11 00:54:12 crc kubenswrapper[4743]: E1011 00:54:12.726257 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-11 00:54:13.226246268 +0000 UTC m=+147.879226775 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-24m6m" (UID: "9d6b106d-6589-40f2-b694-eb158c541d82") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.727409 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-f2gjt"]
Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.731933 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-85wtv"
Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.778147 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9w57f"]
Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.779924 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-sjtsw"]
Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.826784 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 11 00:54:12 crc kubenswrapper[4743]: E1011 00:54:12.826949 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 00:54:13.326927977 +0000 UTC m=+147.979908374 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.827064 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-24m6m\" (UID: \"9d6b106d-6589-40f2-b694-eb158c541d82\") " pod="openshift-image-registry/image-registry-697d97f7c8-24m6m"
Oct 11 00:54:12 crc kubenswrapper[4743]: E1011 00:54:12.827345 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-11 00:54:13.327334629 +0000 UTC m=+147.980315026 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-24m6m" (UID: "9d6b106d-6589-40f2-b694-eb158c541d82") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.855926 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29335725-pg67v"
Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.863672 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-bt5pp"]
Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.863841 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ccjcp"
Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.864299 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-wm7rl"]
Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.886725 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-smtjk"
Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.895364 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-6zk27"
Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.923604 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-grqvq"
Oct 11 00:54:12 crc kubenswrapper[4743]: I1011 00:54:12.929980 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 11 00:54:12 crc kubenswrapper[4743]: E1011 00:54:12.932407 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 00:54:13.432380158 +0000 UTC m=+148.085360555 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 11 00:54:12 crc kubenswrapper[4743]: W1011 00:54:12.969838 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd684c3f_8689_4d7b_ab5e_ff1c14ab9747.slice/crio-524e9e3c7664232f8c27349559ad77daed50f5b953a3da421b56fce6f0cbe555 WatchSource:0}: Error finding container 524e9e3c7664232f8c27349559ad77daed50f5b953a3da421b56fce6f0cbe555: Status 404 returned error can't find the container with id 524e9e3c7664232f8c27349559ad77daed50f5b953a3da421b56fce6f0cbe555
Oct 11 00:54:13 crc kubenswrapper[4743]: I1011 00:54:13.032417 4743 generic.go:334] "Generic (PLEG): container finished" podID="58a40e2a-9789-4d4c-8817-8aa7920baa39" containerID="169903dd0061185208554c78829a477021f7bfa0af31f0d455d34dae48c93ac5" exitCode=0
Oct 11 00:54:13 crc kubenswrapper[4743]: I1011 00:54:13.032501 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cgd8f" event={"ID":"58a40e2a-9789-4d4c-8817-8aa7920baa39","Type":"ContainerDied","Data":"169903dd0061185208554c78829a477021f7bfa0af31f0d455d34dae48c93ac5"}
Oct 11 00:54:13 crc kubenswrapper[4743]: I1011 00:54:13.032543 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cgd8f" event={"ID":"58a40e2a-9789-4d4c-8817-8aa7920baa39","Type":"ContainerStarted","Data":"87544e6c6b97379afb3b2907878addace218f5a4c925771280f60de0e0a55fa5"}
Oct 11 00:54:13 crc kubenswrapper[4743]: I1011 00:54:13.033536 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-24m6m\" (UID: \"9d6b106d-6589-40f2-b694-eb158c541d82\") " pod="openshift-image-registry/image-registry-697d97f7c8-24m6m"
Oct 11 00:54:13 crc kubenswrapper[4743]: E1011 00:54:13.033817 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-11 00:54:13.533805498 +0000 UTC m=+148.186785895 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-24m6m" (UID: "9d6b106d-6589-40f2-b694-eb158c541d82") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 11 00:54:13 crc kubenswrapper[4743]: I1011 00:54:13.047315 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-f2gjt" event={"ID":"2305c756-5c61-4d13-aac9-1ff5d3c6b2ad","Type":"ContainerStarted","Data":"ab726670e1efb6c2cfcdffbcc9be2764cad49feb3f410b47bb64b0629ed4d67d"}
Oct 11 00:54:13 crc kubenswrapper[4743]: I1011 00:54:13.056261 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-rtt9m" event={"ID":"149214a5-565a-452f-abbb-1479919b6104","Type":"ContainerStarted","Data":"4f57750fe8dcf1b4f39fef6fc27c183c4b66f32f2d1d895ab880a7d38445adbe"}
Oct 11 00:54:13 crc kubenswrapper[4743]: I1011 00:54:13.056302 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-rtt9m" event={"ID":"149214a5-565a-452f-abbb-1479919b6104","Type":"ContainerStarted","Data":"1179a9d087093f2a07bfb2b0a8dde15d3d10e49f4c27d548fbc5557f235fe574"}
Oct 11 00:54:13 crc kubenswrapper[4743]: I1011 00:54:13.066926 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f677q"]
Oct 11 00:54:13 crc kubenswrapper[4743]: I1011 00:54:13.082944 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-9npzd" event={"ID":"7f689d34-10fa-427c-8db0-cfc9324ae9de","Type":"ContainerStarted","Data":"e42d9e4e1b89b0afcf849896d2b28ca9f208a19a3eee8a80364e5840c242c113"}
Oct 11 00:54:13 crc kubenswrapper[4743]: I1011 00:54:13.084116 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-v98q2"]
Oct 11 00:54:13 crc kubenswrapper[4743]: I1011 00:54:13.085906 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kzxfv" event={"ID":"703e145f-c49a-40b7-b50f-a4902208d939","Type":"ContainerStarted","Data":"9e4a086925e2ba226ac59daf1634f27de542eeb8717ec08e725e6fc83d6a4136"}
Oct 11 00:54:13 crc kubenswrapper[4743]: I1011 00:54:13.085936 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kzxfv" event={"ID":"703e145f-c49a-40b7-b50f-a4902208d939","Type":"ContainerStarted","Data":"7764c266d5e50ce37d249e8c420810a42b5de64c1cd65e4a4bc70505fd0cc7e0"}
Oct 11 00:54:13 crc kubenswrapper[4743]: I1011 00:54:13.097647 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-229tw" event={"ID":"121b64d6-750f-4720-8297-6f3a91dc3a3a","Type":"ContainerStarted","Data":"490707c3665ea4cd295225dd78b739b11940c838dd4b784f737b04edcb9d9472"}
Oct 11 00:54:13 crc kubenswrapper[4743]: I1011 00:54:13.098985 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29335680-fmgl6" event={"ID":"0f26ac0d-8683-415e-850c-5aef3da4b59f","Type":"ContainerStarted","Data":"fdbb287bcae0625eba994d102bc94cd017ab3bb48ee16dfebf2df90c7556de0f"}
Oct 11 00:54:13 crc kubenswrapper[4743]: I1011 00:54:13.099030 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29335680-fmgl6" event={"ID":"0f26ac0d-8683-415e-850c-5aef3da4b59f","Type":"ContainerStarted","Data":"32d687856f9e5b99c79ac570d73ed4eb65c0644fc9f01a9a60bf8c45d039dad7"}
Oct 11 00:54:13 crc kubenswrapper[4743]: I1011 00:54:13.102645 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-468m5" event={"ID":"b5776f9a-8455-4c34-8496-0b4c4e821135","Type":"ContainerStarted","Data":"ab022353864545543708f1cd63246b729bd9128d7b912a63741aed10601ac6f9"}
Oct 11 00:54:13 crc kubenswrapper[4743]: I1011 00:54:13.102687 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-468m5" event={"ID":"b5776f9a-8455-4c34-8496-0b4c4e821135","Type":"ContainerStarted","Data":"d78d37fa4d7b8b9b5ecc1aeafdd5ffb9dc28daef3c828360238f9bd4bf381480"}
Oct 11 00:54:13 crc kubenswrapper[4743]: I1011 00:54:13.104286 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-bt5pp" event={"ID":"5bc629a2-3424-4180-a506-1accee2c0246","Type":"ContainerStarted","Data":"07aae0b956efb2bbf87d5a695e50b96799a2e19e8e4bb3b883e51abd44139e45"}
Oct 11 00:54:13 crc kubenswrapper[4743]: I1011 00:54:13.112454 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wswrq" event={"ID":"0c2c2460-fedf-4109-8fd9-986749f1e021","Type":"ContainerStarted","Data":"0fd1014e3cf9b8829f789e1080ca11288b6a21a6118f63a001a8754dba866568"}
Oct 11 00:54:13 crc kubenswrapper[4743]: I1011 00:54:13.122232 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-qksm9" event={"ID":"e8a051b9-029d-4b92-a9a1-380c8d18f051","Type":"ContainerStarted","Data":"f4cc4c512909e4e7480ed5b309aa46e6189fd88fb2761e8c9f66e4ba59382d19"}
Oct 11 00:54:13 crc kubenswrapper[4743]: I1011 00:54:13.129085 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-wlvjw" event={"ID":"8b3d2036-8156-4d9f-9e11-2a8133ab5295","Type":"ContainerStarted","Data":"14a5ef5292bb22fce0fc36a11462b38a0720b7559b694d7daa009e128cc03d77"}
Oct 11 00:54:13 crc kubenswrapper[4743]: I1011 00:54:13.129154 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-wlvjw"
Oct 11 00:54:13 crc kubenswrapper[4743]: I1011 00:54:13.129166 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-wlvjw" event={"ID":"8b3d2036-8156-4d9f-9e11-2a8133ab5295","Type":"ContainerStarted","Data":"94b276fe2615d52d38861a0ac4cd39ab9101b6883e11146984d19f7c797cebde"}
Oct 11 00:54:13 crc kubenswrapper[4743]: I1011 00:54:13.130942 4743 patch_prober.go:28] interesting pod/console-operator-58897d9998-wlvjw container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/readyz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body=
Oct 11 00:54:13 crc kubenswrapper[4743]: I1011 00:54:13.131008 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-wlvjw" podUID="8b3d2036-8156-4d9f-9e11-2a8133ab5295" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.7:8443/readyz\": dial tcp 10.217.0.7:8443: connect: connection refused"
Oct 11 00:54:13 crc kubenswrapper[4743]: I1011 00:54:13.135156 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 11 00:54:13 crc kubenswrapper[4743]: E1011 00:54:13.135637 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 00:54:13.635611951 +0000 UTC m=+148.288592348 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 11 00:54:13 crc kubenswrapper[4743]: I1011 00:54:13.137527 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gd5tn" event={"ID":"e59e36c1-5053-44bc-bb35-cab447a646fb","Type":"ContainerStarted","Data":"1ec78f09a5c173fa3ab8a6dcfe8261798a70ae9892ce1367336255d8681bb2a9"}
Oct 11 00:54:13 crc kubenswrapper[4743]: I1011 00:54:13.142121 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-lxzlb" event={"ID":"a0dfb25d-8789-4754-83a0-0e3ee1888e3e","Type":"ContainerStarted","Data":"f45a45f38ca0f8238301f5a490696c83e9c77b60ef839d4b32be0f9f7594e8d2"}
Oct 11 00:54:13 crc kubenswrapper[4743]: I1011 00:54:13.142163 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-lxzlb" event={"ID":"a0dfb25d-8789-4754-83a0-0e3ee1888e3e","Type":"ContainerStarted","Data":"0b80ca11f7ad44223cb466c3e4b980e6ec2e42b65f05b784e0f98275e5da1cd7"}
Oct 11 00:54:13 crc kubenswrapper[4743]: I1011 00:54:13.142173 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-lxzlb" event={"ID":"a0dfb25d-8789-4754-83a0-0e3ee1888e3e","Type":"ContainerStarted","Data":"78537e5b5b859e60d6a50d5f5af0f5583aa9fb1809c35c8c868e6a07c2bc45cf"}
Oct 11 00:54:13 crc kubenswrapper[4743]: I1011 00:54:13.152596 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p6l9k" event={"ID":"9b43d826-75ca-4c81-9f93-11b4398b96fa","Type":"ContainerStarted","Data":"60ca4ab18a790c80012b77a52d564f6698f2b1d77dcf7b83757edd2aec18854c"}
Oct 11 00:54:13 crc kubenswrapper[4743]: I1011 00:54:13.155646 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lq56c" event={"ID":"61413bd4-c42d-4336-92db-d443ed8ea1de","Type":"ContainerStarted","Data":"9fe7775609ba9ffb2d35967e6287ae92130a26793c2584cbb8494d699231f0b2"}
Oct 11 00:54:13 crc kubenswrapper[4743]: I1011 00:54:13.155692 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lq56c" event={"ID":"61413bd4-c42d-4336-92db-d443ed8ea1de","Type":"ContainerStarted","Data":"63f506409ed7d5fc1d576f4d06767a84cbd3e800ee01215270fe0f8a0d76e29b"}
Oct 11 00:54:13 crc kubenswrapper[4743]: I1011 00:54:13.157622 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-6srbj" event={"ID":"1ae9d5c5-6a97-483f-9b76-45c73259c85b","Type":"ContainerStarted","Data":"d5c7db2cd6a64201f59d8085f784621434a9a7c3d6675cd74a2bdcae0d2e2763"}
Oct 11 00:54:13 crc kubenswrapper[4743]: I1011 00:54:13.159417 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-sjtsw" event={"ID":"284b66a9-09b0-4bc0-bc8a-5bd32f06d088","Type":"ContainerStarted","Data":"3b95fe7e7935bb8d1343d8924f1756fedc81c2216a4f1b707f36335e3eca586e"}
Oct 11 00:54:13 crc kubenswrapper[4743]: I1011 00:54:13.167637 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-tljzf" event={"ID":"79462f0e-13e0-4ee7-af5f-02e6e5cd849d","Type":"ContainerStarted","Data":"8f22eb2f304a4ff6d92147cf5c65e6306fe3a0cc98f5d145a66fc0502c91e058"}
Oct 11 00:54:13 crc kubenswrapper[4743]: I1011 00:54:13.167679 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-tljzf" event={"ID":"79462f0e-13e0-4ee7-af5f-02e6e5cd849d","Type":"ContainerStarted","Data":"98252cdc7a99a69532d611b2ca398aa98ab21f79aed312d5fcbb78596b0a11dc"}
Oct 11 00:54:13 crc kubenswrapper[4743]: I1011 00:54:13.167689 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-tljzf" event={"ID":"79462f0e-13e0-4ee7-af5f-02e6e5cd849d","Type":"ContainerStarted","Data":"f591687c2005275515cdd6184b5b72f64233daff2132b5ddef94ee2b5357e588"}
Oct 11 00:54:13 crc kubenswrapper[4743]: I1011 00:54:13.173751 4743 generic.go:334] "Generic (PLEG): container finished" podID="3abfab8d-967c-42bf-9f48-bd69bbe6f8f2" containerID="09cf730a4192d24762273de5c65aaa6d437ca70786b4f528453be78dcdfdfc0e" exitCode=0
Oct 11 00:54:13 crc kubenswrapper[4743]: I1011 00:54:13.173829 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-xwxfx" event={"ID":"3abfab8d-967c-42bf-9f48-bd69bbe6f8f2","Type":"ContainerDied","Data":"09cf730a4192d24762273de5c65aaa6d437ca70786b4f528453be78dcdfdfc0e"}
Oct 11 00:54:13 crc kubenswrapper[4743]: I1011 00:54:13.173883 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-xwxfx" event={"ID":"3abfab8d-967c-42bf-9f48-bd69bbe6f8f2","Type":"ContainerStarted","Data":"a7c214e50c8cb4fca7352d646cc7defe43ba54bb92a6a9fc4df867521e371270"}
Oct 11 00:54:13 crc kubenswrapper[4743]: I1011 00:54:13.181744 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-fv4x7" event={"ID":"8693817b-7cf6-486d-a055-93c4c0308d95","Type":"ContainerStarted","Data":"4f7ac059cbc05beab2f9e31382692729609e8ab2a6ea74eb68eb64398b5a2280"}
Oct 11 00:54:13 crc kubenswrapper[4743]: I1011 00:54:13.181785 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-fv4x7" event={"ID":"8693817b-7cf6-486d-a055-93c4c0308d95","Type":"ContainerStarted","Data":"925237d2417c4fb065179b4e9305fcb9f85e2f8b0c6276c96581ae216da79b22"}
Oct 11 00:54:13 crc kubenswrapper[4743]: I1011 00:54:13.182556 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-fv4x7"
Oct 11 00:54:13 crc kubenswrapper[4743]: I1011 00:54:13.188052 4743 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-fv4x7 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body=
Oct 11 00:54:13 crc kubenswrapper[4743]: I1011 00:54:13.188108 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-fv4x7" podUID="8693817b-7cf6-486d-a055-93c4c0308d95" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused"
Oct 11 00:54:13 crc kubenswrapper[4743]: I1011 00:54:13.237629 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-24m6m\" (UID: \"9d6b106d-6589-40f2-b694-eb158c541d82\") " pod="openshift-image-registry/image-registry-697d97f7c8-24m6m"
Oct 11 00:54:13 crc kubenswrapper[4743]: E1011 00:54:13.238128 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-11 00:54:13.738105923 +0000 UTC m=+148.391086320 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-24m6m" (UID: "9d6b106d-6589-40f2-b694-eb158c541d82") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 11 00:54:13 crc kubenswrapper[4743]: I1011 00:54:13.336588 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-62mz2"]
Oct 11 00:54:13 crc kubenswrapper[4743]: I1011 00:54:13.339157 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 11 00:54:13 crc kubenswrapper[4743]: E1011 00:54:13.340402 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 00:54:13.840356339 +0000 UTC m=+148.493336736 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 11 00:54:13 crc kubenswrapper[4743]: W1011 00:54:13.392489 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod122090c8_fe9b_404f_a9f3_a2ee9520091a.slice/crio-1c5a77dec7286f22c2ac063c2445a8b16e7b0a2300409bb34c36e5639b99e671 WatchSource:0}: Error finding container 1c5a77dec7286f22c2ac063c2445a8b16e7b0a2300409bb34c36e5639b99e671: Status 404 returned error can't find the container with id 1c5a77dec7286f22c2ac063c2445a8b16e7b0a2300409bb34c36e5639b99e671
Oct 11 00:54:13 crc kubenswrapper[4743]: I1011 00:54:13.442022 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-24m6m\" (UID: \"9d6b106d-6589-40f2-b694-eb158c541d82\") " pod="openshift-image-registry/image-registry-697d97f7c8-24m6m"
Oct 11 00:54:13 crc kubenswrapper[4743]: E1011 00:54:13.442612 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-11 00:54:13.942600525 +0000 UTC m=+148.595580922 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-24m6m" (UID: "9d6b106d-6589-40f2-b694-eb158c541d82") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 11 00:54:13 crc kubenswrapper[4743]: I1011 00:54:13.544310 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 11 00:54:13 crc kubenswrapper[4743]: E1011 00:54:13.544468 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 00:54:14.044445478 +0000 UTC m=+148.697425875 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 11 00:54:13 crc kubenswrapper[4743]: I1011 00:54:13.544706 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-24m6m\" (UID: \"9d6b106d-6589-40f2-b694-eb158c541d82\") " pod="openshift-image-registry/image-registry-697d97f7c8-24m6m"
Oct 11 00:54:13 crc kubenswrapper[4743]: E1011 00:54:13.545071 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-11 00:54:14.045059576 +0000 UTC m=+148.698039973 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-24m6m" (UID: "9d6b106d-6589-40f2-b694-eb158c541d82") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 11 00:54:13 crc kubenswrapper[4743]: I1011 00:54:13.561124 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cqmp6"]
Oct 11 00:54:13 crc kubenswrapper[4743]: I1011 00:54:13.582130 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-z2mc9"]
Oct 11 00:54:13 crc kubenswrapper[4743]: I1011 00:54:13.645467 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 11 00:54:13 crc kubenswrapper[4743]: E1011 00:54:13.646061 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 00:54:14.146047334 +0000 UTC m=+148.799027731 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 11 00:54:13 crc kubenswrapper[4743]: I1011 00:54:13.747185 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-24m6m\" (UID: \"9d6b106d-6589-40f2-b694-eb158c541d82\") " pod="openshift-image-registry/image-registry-697d97f7c8-24m6m"
Oct 11 00:54:13 crc kubenswrapper[4743]: E1011 00:54:13.747467 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-11 00:54:14.247454785 +0000 UTC m=+148.900435182 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-24m6m" (UID: "9d6b106d-6589-40f2-b694-eb158c541d82") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 11 00:54:13 crc kubenswrapper[4743]: I1011 00:54:13.848425 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 11 00:54:13 crc kubenswrapper[4743]: E1011 00:54:13.849181 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 00:54:14.349166525 +0000 UTC m=+149.002146922 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 11 00:54:13 crc kubenswrapper[4743]: I1011 00:54:13.888919 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xgtfn"]
Oct 11 00:54:13 crc kubenswrapper[4743]: I1011 00:54:13.900195 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-65mgg"]
Oct 11 00:54:13 crc kubenswrapper[4743]: I1011 00:54:13.903333 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rdwmd"]
Oct 11 00:54:13 crc kubenswrapper[4743]: I1011 00:54:13.929339 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-s8jrj"]
Oct 11 00:54:13 crc kubenswrapper[4743]: I1011 00:54:13.931251 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-bc6rx"]
Oct 11 00:54:13 crc kubenswrapper[4743]: I1011 00:54:13.953387 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 11 00:54:13 crc kubenswrapper[4743]: I1011 00:54:13.953434 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 11 00:54:13 crc kubenswrapper[4743]: I1011 00:54:13.953492 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 11 00:54:13 crc kubenswrapper[4743]: I1011 00:54:13.953526 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 11 00:54:13 crc kubenswrapper[4743]: I1011 00:54:13.953560 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-24m6m\" (UID: \"9d6b106d-6589-40f2-b694-eb158c541d82\") " pod="openshift-image-registry/image-registry-697d97f7c8-24m6m"
Oct 11 00:54:13 crc kubenswrapper[4743]: E1011 00:54:13.953841 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-11 00:54:14.453830732 +0000 UTC m=+149.106811129 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-24m6m" (UID: "9d6b106d-6589-40f2-b694-eb158c541d82") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 11 00:54:13 crc kubenswrapper[4743]: I1011 00:54:13.962387 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 11 00:54:13 crc kubenswrapper[4743]: I1011 00:54:13.962590 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 11 00:54:13 crc kubenswrapper[4743]: I1011 00:54:13.963096 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 11 00:54:13 crc kubenswrapper[4743]: I1011 00:54:13.969277 4743 operation_generator.go:637] "MountVolume.SetUp succeeded
for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.053988 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 00:54:14 crc kubenswrapper[4743]: E1011 00:54:14.054204 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 00:54:14.554181281 +0000 UTC m=+149.207161678 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.054504 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-24m6m\" (UID: \"9d6b106d-6589-40f2-b694-eb158c541d82\") " pod="openshift-image-registry/image-registry-697d97f7c8-24m6m" Oct 11 00:54:14 crc kubenswrapper[4743]: E1011 00:54:14.054752 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-11 00:54:14.554741138 +0000 UTC m=+149.207721535 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-24m6m" (UID: "9d6b106d-6589-40f2-b694-eb158c541d82") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.128451 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-fv4x7" podStartSLOduration=128.128376461 podStartE2EDuration="2m8.128376461s" podCreationTimestamp="2025-10-11 00:52:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 00:54:14.128061182 +0000 UTC m=+148.781041579" watchObservedRunningTime="2025-10-11 00:54:14.128376461 +0000 UTC m=+148.781356858" Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.153132 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.156397 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 00:54:14 crc kubenswrapper[4743]: E1011 00:54:14.156693 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-11 00:54:14.656679324 +0000 UTC m=+149.309659721 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.158061 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.177610 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rt2nd"] Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.179229 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.202760 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-mlnnl"] Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.221078 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-4k997"] Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.266381 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-85wtv"] Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.266479 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-z2mc9" event={"ID":"eaad8a68-7587-4d59-9b94-e8dcdc1869f8","Type":"ContainerStarted","Data":"da0d98a7b83c79f595823f08afdb4fe5fd4551f8654af46dd996c7eae7743263"} Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.266520 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-z2mc9" event={"ID":"eaad8a68-7587-4d59-9b94-e8dcdc1869f8","Type":"ContainerStarted","Data":"18b3704139641633dd08d51c94b557d075210d35b375acceb0e9918495a5ca71"} Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.267062 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-24m6m\" (UID: \"9d6b106d-6589-40f2-b694-eb158c541d82\") " pod="openshift-image-registry/image-registry-697d97f7c8-24m6m" Oct 11 00:54:14 crc kubenswrapper[4743]: E1011 00:54:14.267321 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2025-10-11 00:54:14.76730976 +0000 UTC m=+149.420290157 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-24m6m" (UID: "9d6b106d-6589-40f2-b694-eb158c541d82") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.270929 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-smtjk"] Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.275673 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-lxzlb" podStartSLOduration=129.275648878 podStartE2EDuration="2m9.275648878s" podCreationTimestamp="2025-10-11 00:52:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 00:54:14.266141895 +0000 UTC m=+148.919122292" watchObservedRunningTime="2025-10-11 00:54:14.275648878 +0000 UTC m=+148.928629295" Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.287423 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-wm7rl" event={"ID":"7ffea29d-93cb-4136-b0bc-7861c20751d4","Type":"ContainerStarted","Data":"d843fe10595a84c58d88ac1939ca0d56e8ed287e17d9a32a3216266995195b98"} Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.287465 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-wm7rl" 
event={"ID":"7ffea29d-93cb-4136-b0bc-7861c20751d4","Type":"ContainerStarted","Data":"404c3ddf49b554db36c088162116419aff242659ca1f15df097664806e8a5bb2"} Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.289793 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-grqvq"] Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.297437 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gd5tn" event={"ID":"e59e36c1-5053-44bc-bb35-cab447a646fb","Type":"ContainerStarted","Data":"db2f6cffc32a96067b00affa1cf6b971c2bdfe8847c1dbd83208918ac60a6237"} Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.321472 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bc6rx" event={"ID":"6e2550ee-2c0c-437b-a4e3-3332ffba4e48","Type":"ContainerStarted","Data":"8bc67fcc4577e83ab796d57dd1140b2d3e4b3cc6ca6c9ec3e2719ec131eb2e2d"} Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.328567 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ccjcp"] Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.331588 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-f2gjt" event={"ID":"2305c756-5c61-4d13-aac9-1ff5d3c6b2ad","Type":"ContainerStarted","Data":"9909c64ed346016efe19ad99a71225803560950fbb3f81586ba367f3b5ea7856"} Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.332180 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-f2gjt" Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.337536 4743 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-f2gjt container/route-controller-manager 
namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused" start-of-body= Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.337585 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-f2gjt" podUID="2305c756-5c61-4d13-aac9-1ff5d3c6b2ad" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused" Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.346920 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-62mz2" event={"ID":"122090c8-fe9b-404f-a9f3-a2ee9520091a","Type":"ContainerStarted","Data":"a7df50d08e7f17b5267fc631cfa67eac5ac9bc0c19bec5718a629a5db84ab63c"} Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.347131 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-62mz2" event={"ID":"122090c8-fe9b-404f-a9f3-a2ee9520091a","Type":"ContainerStarted","Data":"1c5a77dec7286f22c2ac063c2445a8b16e7b0a2300409bb34c36e5639b99e671"} Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.352880 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-sjtsw" event={"ID":"284b66a9-09b0-4bc0-bc8a-5bd32f06d088","Type":"ContainerStarted","Data":"56a42a64696f571fb3083137e133846b8b1b3a756e6f826d4684e9dd64cf8b99"} Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.353783 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-sjtsw" Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.354757 4743 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-sjtsw 
container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.32:6443/healthz\": dial tcp 10.217.0.32:6443: connect: connection refused" start-of-body= Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.354788 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-sjtsw" podUID="284b66a9-09b0-4bc0-bc8a-5bd32f06d088" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.32:6443/healthz\": dial tcp 10.217.0.32:6443: connect: connection refused" Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.366449 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cgd8f" event={"ID":"58a40e2a-9789-4d4c-8817-8aa7920baa39","Type":"ContainerStarted","Data":"e6c13d255b19946d4fc51813a3d0f1c349212e5b19addf180bea89ee5f511c65"} Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.368258 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 00:54:14 crc kubenswrapper[4743]: E1011 00:54:14.368628 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 00:54:14.868612297 +0000 UTC m=+149.521592684 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 00:54:14 crc kubenswrapper[4743]: W1011 00:54:14.376756 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod544ad287_fc9b_4033_b139_3893a3e10a00.slice/crio-2b5cafba61f03f29f6124f3a9b24b5b9b356b3b5286f77cbb54ea00d75dafe77 WatchSource:0}: Error finding container 2b5cafba61f03f29f6124f3a9b24b5b9b356b3b5286f77cbb54ea00d75dafe77: Status 404 returned error can't find the container with id 2b5cafba61f03f29f6124f3a9b24b5b9b356b3b5286f77cbb54ea00d75dafe77 Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.376927 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cqmp6" event={"ID":"88ae6ca7-3631-4aee-820c-5c979a0d4f02","Type":"ContainerStarted","Data":"48fa59ab18b6003c8777aba545379c8b2c19d1be8a7a6aca35b4a42e1736e96b"} Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.377079 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cqmp6" event={"ID":"88ae6ca7-3631-4aee-820c-5c979a0d4f02","Type":"ContainerStarted","Data":"da20c0c3d62042ef6bbb11d859728423d861c4c99b9e29338ed122352ff1aea9"} Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.377977 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cqmp6" Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.380997 4743 patch_prober.go:28] interesting 
pod/catalog-operator-68c6474976-cqmp6 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:8443/healthz\": dial tcp 10.217.0.36:8443: connect: connection refused" start-of-body= Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.381039 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cqmp6" podUID="88ae6ca7-3631-4aee-820c-5c979a0d4f02" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.36:8443/healthz\": dial tcp 10.217.0.36:8443: connect: connection refused" Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.384902 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rdwmd" event={"ID":"006ce2c3-4b48-4621-acc0-50428bb1c862","Type":"ContainerStarted","Data":"1b30e84518751ff3016ac6a8f8261b770a800e2a87f4b38c920f3056ce6a2ffa"} Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.393044 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-468m5" podStartSLOduration=128.393028374 podStartE2EDuration="2m8.393028374s" podCreationTimestamp="2025-10-11 00:52:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 00:54:14.362407272 +0000 UTC m=+149.015387669" watchObservedRunningTime="2025-10-11 00:54:14.393028374 +0000 UTC m=+149.046008761" Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.400968 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29335725-pg67v"] Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.420120 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f677q" event={"ID":"68717faf-e06c-4833-9b59-3e6279b38a6d","Type":"ContainerStarted","Data":"4fd7e3075f7f2d058676fdf1329588f8089f417110d4996005abb7db5c96f4a6"} Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.420157 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f677q" event={"ID":"68717faf-e06c-4833-9b59-3e6279b38a6d","Type":"ContainerStarted","Data":"2b56be478fdf7a42eb2ae037354fb97851555cad3261773935d2a611e79fdb67"} Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.443014 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-6zk27" event={"ID":"37c86724-d259-449c-baf5-b8965ea658a9","Type":"ContainerStarted","Data":"b347b59f269402ce6acc7516cdc4a4c03a9c76a68f20643cf7c1b036f1407b5e"} Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.443049 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-6zk27" event={"ID":"37c86724-d259-449c-baf5-b8965ea658a9","Type":"ContainerStarted","Data":"863f2c754abcb13bdddc220b62a94f22d3d8336f02985ad2835ffda8c54a0b43"} Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.445439 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kzxfv" podStartSLOduration=128.445418085 podStartE2EDuration="2m8.445418085s" podCreationTimestamp="2025-10-11 00:52:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 00:54:14.436744016 +0000 UTC m=+149.089724423" watchObservedRunningTime="2025-10-11 00:54:14.445418085 +0000 UTC m=+149.098398482" Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.445620 4743 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-pruner-29335680-fmgl6" podStartSLOduration=129.445616201 podStartE2EDuration="2m9.445616201s" podCreationTimestamp="2025-10-11 00:52:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 00:54:14.41739867 +0000 UTC m=+149.070379087" watchObservedRunningTime="2025-10-11 00:54:14.445616201 +0000 UTC m=+149.098596598" Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.471313 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-24m6m\" (UID: \"9d6b106d-6589-40f2-b694-eb158c541d82\") " pod="openshift-image-registry/image-registry-697d97f7c8-24m6m" Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.471363 4743 patch_prober.go:28] interesting pod/machine-config-daemon-cvm72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.472669 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.473212 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-tljzf" podStartSLOduration=127.473193852 podStartE2EDuration="2m7.473193852s" 
podCreationTimestamp="2025-10-11 00:52:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 00:54:14.471831061 +0000 UTC m=+149.124811478" watchObservedRunningTime="2025-10-11 00:54:14.473193852 +0000 UTC m=+149.126174259" Oct 11 00:54:14 crc kubenswrapper[4743]: E1011 00:54:14.472637 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-11 00:54:14.972626115 +0000 UTC m=+149.625606512 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-24m6m" (UID: "9d6b106d-6589-40f2-b694-eb158c541d82") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.476505 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-qksm9" event={"ID":"e8a051b9-029d-4b92-a9a1-380c8d18f051","Type":"ContainerStarted","Data":"e1c06ffbcfa6dc6d77a1072065ff7f91053486f1e4e7004ae2f42b22a4c19388"} Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.477177 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-qksm9" Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.478478 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-6srbj" event={"ID":"1ae9d5c5-6a97-483f-9b76-45c73259c85b","Type":"ContainerStarted","Data":"a85e17e93498a485911611889f00f006f7ebbed91112d2712444b7f6978ff4d3"} Oct 11 00:54:14 crc 
kubenswrapper[4743]: I1011 00:54:14.484332 4743 patch_prober.go:28] interesting pod/downloads-7954f5f757-qksm9 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.484382 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qksm9" podUID="e8a051b9-029d-4b92-a9a1-380c8d18f051" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.487583 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-s8jrj" event={"ID":"1c78dcdb-20e7-4e37-95ba-c3576a63b2cf","Type":"ContainerStarted","Data":"41d9d7e0c86466885f677ba981d8d06c4065aaee7bb115a71cd82ad8e6da71d0"} Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.506655 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-6srbj" Oct 11 00:54:14 crc kubenswrapper[4743]: W1011 00:54:14.523995 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod962b1e46_ba63_4aa2_9882_a04487a05813.slice/crio-75471898b2fb952c943d0dfc4adbf9a88d039d341607e5b2f83e2c56b0e2d22f WatchSource:0}: Error finding container 75471898b2fb952c943d0dfc4adbf9a88d039d341607e5b2f83e2c56b0e2d22f: Status 404 returned error can't find the container with id 75471898b2fb952c943d0dfc4adbf9a88d039d341607e5b2f83e2c56b0e2d22f Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.524106 4743 patch_prober.go:28] interesting pod/router-default-5444994796-6srbj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with 
statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 11 00:54:14 crc kubenswrapper[4743]: [-]has-synced failed: reason withheld Oct 11 00:54:14 crc kubenswrapper[4743]: [+]process-running ok Oct 11 00:54:14 crc kubenswrapper[4743]: healthz check failed Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.524152 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6srbj" podUID="1ae9d5c5-6a97-483f-9b76-45c73259c85b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.531511 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xgtfn" event={"ID":"16bef631-ee0f-4346-bb9b-c6eb48a09448","Type":"ContainerStarted","Data":"5ba6cfb5c39fff28953038d718c294177cb4aa7cf9206839ada218d601de67b3"} Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.532481 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-xgtfn" Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.536702 4743 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-xgtfn container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.43:8080/healthz\": dial tcp 10.217.0.43:8080: connect: connection refused" start-of-body= Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.536746 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-xgtfn" podUID="16bef631-ee0f-4346-bb9b-c6eb48a09448" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.43:8080/healthz\": dial tcp 10.217.0.43:8080: connect: connection refused" Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.552961 4743 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lq56c" podStartSLOduration=128.552942548 podStartE2EDuration="2m8.552942548s" podCreationTimestamp="2025-10-11 00:52:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 00:54:14.552531585 +0000 UTC m=+149.205511982" watchObservedRunningTime="2025-10-11 00:54:14.552942548 +0000 UTC m=+149.205922935" Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.565966 4743 generic.go:334] "Generic (PLEG): container finished" podID="0c2c2460-fedf-4109-8fd9-986749f1e021" containerID="24a1513d8f780e519bd5621de45b1cdf98fbf489271247f78cf2789a202dc93b" exitCode=0 Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.566253 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wswrq" event={"ID":"0c2c2460-fedf-4109-8fd9-986749f1e021","Type":"ContainerDied","Data":"24a1513d8f780e519bd5621de45b1cdf98fbf489271247f78cf2789a202dc93b"} Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.574556 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 00:54:14 crc kubenswrapper[4743]: E1011 00:54:14.576387 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 00:54:15.076371625 +0000 UTC m=+149.729352022 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.579375 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-9npzd" event={"ID":"7f689d34-10fa-427c-8db0-cfc9324ae9de","Type":"ContainerStarted","Data":"f227b41d3e4c4f021b3c5790625d72e8bec2c55c439263201135f55607b8a826"} Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.579415 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-9npzd" event={"ID":"7f689d34-10fa-427c-8db0-cfc9324ae9de","Type":"ContainerStarted","Data":"037c7dc830ac6fc2d2a3a045e8a052357aef3aa67106a022887251078ee97cd3"} Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.601225 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-rtt9m" podStartSLOduration=128.601203545 podStartE2EDuration="2m8.601203545s" podCreationTimestamp="2025-10-11 00:52:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 00:54:14.597084842 +0000 UTC m=+149.250065239" watchObservedRunningTime="2025-10-11 00:54:14.601203545 +0000 UTC m=+149.254183942" Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.609659 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-65mgg" 
event={"ID":"224b66ec-aec1-4d9e-96c0-b6c7ddeb37e7","Type":"ContainerStarted","Data":"d7331e72e02d8d915fc1e926e1c3e6132217d8b2400f62a5bc2956c88580602b"} Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.659741 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-6zk27" podStartSLOduration=5.659728018 podStartE2EDuration="5.659728018s" podCreationTimestamp="2025-10-11 00:54:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 00:54:14.659143351 +0000 UTC m=+149.312123758" watchObservedRunningTime="2025-10-11 00:54:14.659728018 +0000 UTC m=+149.312708415" Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.662471 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-wlvjw" podStartSLOduration=128.662451569 podStartE2EDuration="2m8.662451569s" podCreationTimestamp="2025-10-11 00:52:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 00:54:14.627487838 +0000 UTC m=+149.280468235" watchObservedRunningTime="2025-10-11 00:54:14.662451569 +0000 UTC m=+149.315431956" Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.678722 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-24m6m\" (UID: \"9d6b106d-6589-40f2-b694-eb158c541d82\") " pod="openshift-image-registry/image-registry-697d97f7c8-24m6m" Oct 11 00:54:14 crc kubenswrapper[4743]: E1011 00:54:14.680032 4743 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-11 00:54:15.180017623 +0000 UTC m=+149.832998020 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-24m6m" (UID: "9d6b106d-6589-40f2-b694-eb158c541d82") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.693521 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-v98q2" event={"ID":"f3263799-b7d3-4b1b-a61b-4768a061e502","Type":"ContainerStarted","Data":"3c3efbafdc4159965f90ad67af36d6bf69f63feb5a477c661e84759cf8b7e793"} Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.693875 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-v98q2" event={"ID":"f3263799-b7d3-4b1b-a61b-4768a061e502","Type":"ContainerStarted","Data":"ce9fe7cac479e2076e966e488a2714ae103073ccbfd483f09fd4c283f89c6e20"} Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.700807 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cgd8f" podStartSLOduration=127.700789711 podStartE2EDuration="2m7.700789711s" podCreationTimestamp="2025-10-11 00:52:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 00:54:14.695411051 +0000 UTC m=+149.348391458" watchObservedRunningTime="2025-10-11 00:54:14.700789711 +0000 UTC m=+149.353770108" Oct 
11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.724217 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-xgtfn" podStartSLOduration=127.724199909 podStartE2EDuration="2m7.724199909s" podCreationTimestamp="2025-10-11 00:52:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 00:54:14.722763386 +0000 UTC m=+149.375743783" watchObservedRunningTime="2025-10-11 00:54:14.724199909 +0000 UTC m=+149.377180306" Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.796109 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 00:54:14 crc kubenswrapper[4743]: E1011 00:54:14.796293 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 00:54:15.296271965 +0000 UTC m=+149.949252362 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.797301 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-24m6m\" (UID: \"9d6b106d-6589-40f2-b694-eb158c541d82\") " pod="openshift-image-registry/image-registry-697d97f7c8-24m6m" Oct 11 00:54:14 crc kubenswrapper[4743]: E1011 00:54:14.803179 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-11 00:54:15.303160751 +0000 UTC m=+149.956141148 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-24m6m" (UID: "9d6b106d-6589-40f2-b694-eb158c541d82") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.803230 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9w57f" event={"ID":"bd684c3f-8689-4d7b-ab5e-ff1c14ab9747","Type":"ContainerStarted","Data":"53e9947d84ea02d7e748f579604f73a7d94fd9fe3c421c1afbd71b81bf87192e"} Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.803275 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9w57f" event={"ID":"bd684c3f-8689-4d7b-ab5e-ff1c14ab9747","Type":"ContainerStarted","Data":"524e9e3c7664232f8c27349559ad77daed50f5b953a3da421b56fce6f0cbe555"} Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.844872 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-sjtsw" podStartSLOduration=128.844841442 podStartE2EDuration="2m8.844841442s" podCreationTimestamp="2025-10-11 00:52:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 00:54:14.81823658 +0000 UTC m=+149.471216997" watchObservedRunningTime="2025-10-11 00:54:14.844841442 +0000 UTC m=+149.497821839" Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.849060 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-xwxfx" 
event={"ID":"3abfab8d-967c-42bf-9f48-bd69bbe6f8f2","Type":"ContainerStarted","Data":"bf1f6aa0ad999498dd24464c94aabfe589083d32124926181dfdfb3834d2b3f1"} Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.850536 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-6srbj" podStartSLOduration=128.850520361 podStartE2EDuration="2m8.850520361s" podCreationTimestamp="2025-10-11 00:52:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 00:54:14.849088799 +0000 UTC m=+149.502069196" watchObservedRunningTime="2025-10-11 00:54:14.850520361 +0000 UTC m=+149.503500758" Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.880574 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-f2gjt" podStartSLOduration=127.880558886 podStartE2EDuration="2m7.880558886s" podCreationTimestamp="2025-10-11 00:52:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 00:54:14.878142534 +0000 UTC m=+149.531122941" watchObservedRunningTime="2025-10-11 00:54:14.880558886 +0000 UTC m=+149.533539283" Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.898282 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 00:54:14 crc kubenswrapper[4743]: E1011 00:54:14.898557 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 00:54:15.398543002 +0000 UTC m=+150.051523399 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.940525 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f677q" podStartSLOduration=128.940501091 podStartE2EDuration="2m8.940501091s" podCreationTimestamp="2025-10-11 00:52:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 00:54:14.907680484 +0000 UTC m=+149.560660881" watchObservedRunningTime="2025-10-11 00:54:14.940501091 +0000 UTC m=+149.593481488" Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.952580 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p6l9k" event={"ID":"9b43d826-75ca-4c81-9f93-11b4398b96fa","Type":"ContainerStarted","Data":"7c21d58e1a0e835fd4fa64d9b0ca262d19efdda331b014171c2e4f90b01e9882"} Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.953177 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-qksm9" podStartSLOduration=128.953155088 podStartE2EDuration="2m8.953155088s" podCreationTimestamp="2025-10-11 00:52:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2025-10-11 00:54:14.943508891 +0000 UTC m=+149.596489288" watchObservedRunningTime="2025-10-11 00:54:14.953155088 +0000 UTC m=+149.606135485" Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.960508 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-bt5pp" event={"ID":"5bc629a2-3424-4180-a506-1accee2c0246","Type":"ContainerStarted","Data":"8ee1a37100fe5f6c8d3a1c9a17722f1ec35319b5f7885dcae6f79d3cd3842225"} Oct 11 00:54:14 crc kubenswrapper[4743]: I1011 00:54:14.976393 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-229tw" event={"ID":"121b64d6-750f-4720-8297-6f3a91dc3a3a","Type":"ContainerStarted","Data":"b8463408a7318c067d3ffeba56924761dcb24b0066dc39b9faf45ef57a9ba813"} Oct 11 00:54:15 crc kubenswrapper[4743]: I1011 00:54:15.000758 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-24m6m\" (UID: \"9d6b106d-6589-40f2-b694-eb158c541d82\") " pod="openshift-image-registry/image-registry-697d97f7c8-24m6m" Oct 11 00:54:15 crc kubenswrapper[4743]: E1011 00:54:15.001917 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-11 00:54:15.50190543 +0000 UTC m=+150.154885817 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-24m6m" (UID: "9d6b106d-6589-40f2-b694-eb158c541d82") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 00:54:15 crc kubenswrapper[4743]: I1011 00:54:15.015543 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cqmp6" podStartSLOduration=128.015527856 podStartE2EDuration="2m8.015527856s" podCreationTimestamp="2025-10-11 00:52:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 00:54:15.014332831 +0000 UTC m=+149.667313228" watchObservedRunningTime="2025-10-11 00:54:15.015527856 +0000 UTC m=+149.668508243" Oct 11 00:54:15 crc kubenswrapper[4743]: I1011 00:54:15.016370 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gd5tn" podStartSLOduration=129.016364881 podStartE2EDuration="2m9.016364881s" podCreationTimestamp="2025-10-11 00:52:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 00:54:14.992420288 +0000 UTC m=+149.645400705" watchObservedRunningTime="2025-10-11 00:54:15.016364881 +0000 UTC m=+149.669345268" Oct 11 00:54:15 crc kubenswrapper[4743]: I1011 00:54:15.064165 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-9npzd" podStartSLOduration=129.064151824 podStartE2EDuration="2m9.064151824s" podCreationTimestamp="2025-10-11 00:52:06 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 00:54:15.061888657 +0000 UTC m=+149.714869074" watchObservedRunningTime="2025-10-11 00:54:15.064151824 +0000 UTC m=+149.717132221" Oct 11 00:54:15 crc kubenswrapper[4743]: I1011 00:54:15.104210 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 00:54:15 crc kubenswrapper[4743]: E1011 00:54:15.105987 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 00:54:15.6059675 +0000 UTC m=+150.258947897 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 00:54:15 crc kubenswrapper[4743]: I1011 00:54:15.112299 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-fv4x7" Oct 11 00:54:15 crc kubenswrapper[4743]: I1011 00:54:15.114054 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-65mgg" podStartSLOduration=128.11404127 podStartE2EDuration="2m8.11404127s" podCreationTimestamp="2025-10-11 00:52:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 00:54:15.112656629 +0000 UTC m=+149.765637036" watchObservedRunningTime="2025-10-11 00:54:15.11404127 +0000 UTC m=+149.767021667" Oct 11 00:54:15 crc kubenswrapper[4743]: I1011 00:54:15.199368 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p6l9k" podStartSLOduration=130.199352742 podStartE2EDuration="2m10.199352742s" podCreationTimestamp="2025-10-11 00:52:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 00:54:15.15464182 +0000 UTC m=+149.807622237" watchObservedRunningTime="2025-10-11 00:54:15.199352742 +0000 UTC m=+149.852333139" Oct 11 00:54:15 crc kubenswrapper[4743]: I1011 00:54:15.206392 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-24m6m\" (UID: \"9d6b106d-6589-40f2-b694-eb158c541d82\") " pod="openshift-image-registry/image-registry-697d97f7c8-24m6m" Oct 11 00:54:15 crc kubenswrapper[4743]: E1011 00:54:15.206680 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-11 00:54:15.706668189 +0000 UTC m=+150.359648586 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-24m6m" (UID: "9d6b106d-6589-40f2-b694-eb158c541d82") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 00:54:15 crc kubenswrapper[4743]: I1011 00:54:15.219655 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9w57f" podStartSLOduration=129.219624405 podStartE2EDuration="2m9.219624405s" podCreationTimestamp="2025-10-11 00:52:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 00:54:15.19897078 +0000 UTC m=+149.851951167" watchObservedRunningTime="2025-10-11 00:54:15.219624405 +0000 UTC m=+149.872604802" Oct 11 00:54:15 crc kubenswrapper[4743]: I1011 00:54:15.295029 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-bt5pp" podStartSLOduration=129.295012191 
podStartE2EDuration="2m9.295012191s" podCreationTimestamp="2025-10-11 00:52:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 00:54:15.293825226 +0000 UTC m=+149.946805633" watchObservedRunningTime="2025-10-11 00:54:15.295012191 +0000 UTC m=+149.947992588" Oct 11 00:54:15 crc kubenswrapper[4743]: I1011 00:54:15.295285 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-229tw" podStartSLOduration=129.295279739 podStartE2EDuration="2m9.295279739s" podCreationTimestamp="2025-10-11 00:52:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 00:54:15.263951266 +0000 UTC m=+149.916931663" watchObservedRunningTime="2025-10-11 00:54:15.295279739 +0000 UTC m=+149.948260136" Oct 11 00:54:15 crc kubenswrapper[4743]: I1011 00:54:15.311508 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 00:54:15 crc kubenswrapper[4743]: E1011 00:54:15.311873 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 00:54:15.811837492 +0000 UTC m=+150.464817889 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 00:54:15 crc kubenswrapper[4743]: I1011 00:54:15.338309 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-v98q2" podStartSLOduration=128.3382924 podStartE2EDuration="2m8.3382924s" podCreationTimestamp="2025-10-11 00:52:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 00:54:15.337532197 +0000 UTC m=+149.990512604" watchObservedRunningTime="2025-10-11 00:54:15.3382924 +0000 UTC m=+149.991272797" Oct 11 00:54:15 crc kubenswrapper[4743]: I1011 00:54:15.413597 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-24m6m\" (UID: \"9d6b106d-6589-40f2-b694-eb158c541d82\") " pod="openshift-image-registry/image-registry-697d97f7c8-24m6m" Oct 11 00:54:15 crc kubenswrapper[4743]: E1011 00:54:15.413935 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-11 00:54:15.913924073 +0000 UTC m=+150.566904470 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-24m6m" (UID: "9d6b106d-6589-40f2-b694-eb158c541d82") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 00:54:15 crc kubenswrapper[4743]: I1011 00:54:15.516342 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 00:54:15 crc kubenswrapper[4743]: E1011 00:54:15.516950 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 00:54:16.016935571 +0000 UTC m=+150.669915958 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 00:54:15 crc kubenswrapper[4743]: I1011 00:54:15.518495 4743 patch_prober.go:28] interesting pod/router-default-5444994796-6srbj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 11 00:54:15 crc kubenswrapper[4743]: [-]has-synced failed: reason withheld Oct 11 00:54:15 crc kubenswrapper[4743]: [+]process-running ok Oct 11 00:54:15 crc kubenswrapper[4743]: healthz check failed Oct 11 00:54:15 crc kubenswrapper[4743]: I1011 00:54:15.518525 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6srbj" podUID="1ae9d5c5-6a97-483f-9b76-45c73259c85b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 11 00:54:15 crc kubenswrapper[4743]: I1011 00:54:15.618526 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-24m6m\" (UID: \"9d6b106d-6589-40f2-b694-eb158c541d82\") " pod="openshift-image-registry/image-registry-697d97f7c8-24m6m" Oct 11 00:54:15 crc kubenswrapper[4743]: E1011 00:54:15.618842 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-11 00:54:16.118830246 +0000 UTC m=+150.771810643 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-24m6m" (UID: "9d6b106d-6589-40f2-b694-eb158c541d82") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 00:54:15 crc kubenswrapper[4743]: I1011 00:54:15.632183 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-wlvjw" Oct 11 00:54:15 crc kubenswrapper[4743]: I1011 00:54:15.720406 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 00:54:15 crc kubenswrapper[4743]: E1011 00:54:15.720534 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 00:54:16.220515805 +0000 UTC m=+150.873496202 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 00:54:15 crc kubenswrapper[4743]: I1011 00:54:15.720908 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-24m6m\" (UID: \"9d6b106d-6589-40f2-b694-eb158c541d82\") " pod="openshift-image-registry/image-registry-697d97f7c8-24m6m" Oct 11 00:54:15 crc kubenswrapper[4743]: E1011 00:54:15.721169 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-11 00:54:16.221160894 +0000 UTC m=+150.874141291 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-24m6m" (UID: "9d6b106d-6589-40f2-b694-eb158c541d82") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 00:54:15 crc kubenswrapper[4743]: I1011 00:54:15.822291 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 00:54:15 crc kubenswrapper[4743]: E1011 00:54:15.822576 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 00:54:16.322561105 +0000 UTC m=+150.975541502 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 00:54:15 crc kubenswrapper[4743]: I1011 00:54:15.925530 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-24m6m\" (UID: \"9d6b106d-6589-40f2-b694-eb158c541d82\") " pod="openshift-image-registry/image-registry-697d97f7c8-24m6m" Oct 11 00:54:15 crc kubenswrapper[4743]: E1011 00:54:15.926039 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-11 00:54:16.426027286 +0000 UTC m=+151.079007693 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-24m6m" (UID: "9d6b106d-6589-40f2-b694-eb158c541d82") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 00:54:16 crc kubenswrapper[4743]: I1011 00:54:16.020191 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-smtjk" event={"ID":"b489193c-9eab-4025-8c8f-23b3b42bae0e","Type":"ContainerStarted","Data":"e2ef38fa98edba38d07ddddae60e16f77511452a075aa3b8abb224562cb4a634"} Oct 11 00:54:16 crc kubenswrapper[4743]: I1011 00:54:16.020228 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-smtjk" event={"ID":"b489193c-9eab-4025-8c8f-23b3b42bae0e","Type":"ContainerStarted","Data":"f7e6410685ca2abd80613c844195bd324e16c38e20c99f9a00ab7d07af46c789"} Oct 11 00:54:16 crc kubenswrapper[4743]: I1011 00:54:16.026727 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 00:54:16 crc kubenswrapper[4743]: E1011 00:54:16.026845 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 00:54:16.526829269 +0000 UTC m=+151.179809666 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 00:54:16 crc kubenswrapper[4743]: I1011 00:54:16.026992 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-24m6m\" (UID: \"9d6b106d-6589-40f2-b694-eb158c541d82\") " pod="openshift-image-registry/image-registry-697d97f7c8-24m6m" Oct 11 00:54:16 crc kubenswrapper[4743]: E1011 00:54:16.027265 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-11 00:54:16.527255812 +0000 UTC m=+151.180236199 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-24m6m" (UID: "9d6b106d-6589-40f2-b694-eb158c541d82") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 00:54:16 crc kubenswrapper[4743]: I1011 00:54:16.046115 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rt2nd" event={"ID":"5690cb54-1fbe-4d33-a809-b7bdca4df6c0","Type":"ContainerStarted","Data":"993c1770469d608a055106d8de76512f9f7a956e1c48d4be8fb095ba3fa4a3af"} Oct 11 00:54:16 crc kubenswrapper[4743]: I1011 00:54:16.046168 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rt2nd" event={"ID":"5690cb54-1fbe-4d33-a809-b7bdca4df6c0","Type":"ContainerStarted","Data":"f2fbc9b4650f6f34ca6c6162aea9e84e436397845c93e8fc6eaf2bf5397ea739"} Oct 11 00:54:16 crc kubenswrapper[4743]: I1011 00:54:16.066273 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wswrq" event={"ID":"0c2c2460-fedf-4109-8fd9-986749f1e021","Type":"ContainerStarted","Data":"992500545c8ec2c81bfa0698f4406b4a1a66a791b11d2d271eeb50c6cce19388"} Oct 11 00:54:16 crc kubenswrapper[4743]: I1011 00:54:16.066888 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wswrq" Oct 11 00:54:16 crc kubenswrapper[4743]: I1011 00:54:16.081290 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-62mz2" 
event={"ID":"122090c8-fe9b-404f-a9f3-a2ee9520091a","Type":"ContainerStarted","Data":"46ced6e9b57ff0426770e3648f64a467d0452228dfb6ab451a3fb7a82b831051"} Oct 11 00:54:16 crc kubenswrapper[4743]: I1011 00:54:16.081956 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-62mz2" Oct 11 00:54:16 crc kubenswrapper[4743]: I1011 00:54:16.100986 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rt2nd" podStartSLOduration=129.100970057 podStartE2EDuration="2m9.100970057s" podCreationTimestamp="2025-10-11 00:52:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 00:54:16.096061621 +0000 UTC m=+150.749042018" watchObservedRunningTime="2025-10-11 00:54:16.100970057 +0000 UTC m=+150.753950454" Oct 11 00:54:16 crc kubenswrapper[4743]: I1011 00:54:16.114870 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"ca331342c91ba259618d95eec5315e24b598cc0fbe4759b65d9c0b8613dc89ab"} Oct 11 00:54:16 crc kubenswrapper[4743]: I1011 00:54:16.129006 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"8d5324bf9a4f471a892b1c5e6c262285708d17991217e8cc753e8d35f0c167ac"} Oct 11 00:54:16 crc kubenswrapper[4743]: I1011 00:54:16.129038 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xgtfn" event={"ID":"16bef631-ee0f-4346-bb9b-c6eb48a09448","Type":"ContainerStarted","Data":"a17b919a62d3e30de28933f816947e9668d1a111d03144fa9f95231658c5d17c"} Oct 11 00:54:16 crc 
kubenswrapper[4743]: I1011 00:54:16.129056 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ccjcp" event={"ID":"544ad287-fc9b-4033-b139-3893a3e10a00","Type":"ContainerStarted","Data":"ef21acfbcdaae15bc269b4f697a0b6d092da9925fdd9a786306ac4dd3286cce5"} Oct 11 00:54:16 crc kubenswrapper[4743]: I1011 00:54:16.129070 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ccjcp" event={"ID":"544ad287-fc9b-4033-b139-3893a3e10a00","Type":"ContainerStarted","Data":"2b5cafba61f03f29f6124f3a9b24b5b9b356b3b5286f77cbb54ea00d75dafe77"} Oct 11 00:54:16 crc kubenswrapper[4743]: I1011 00:54:16.129590 4743 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-xgtfn container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.43:8080/healthz\": dial tcp 10.217.0.43:8080: connect: connection refused" start-of-body= Oct 11 00:54:16 crc kubenswrapper[4743]: I1011 00:54:16.129635 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-xgtfn" podUID="16bef631-ee0f-4346-bb9b-c6eb48a09448" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.43:8080/healthz\": dial tcp 10.217.0.43:8080: connect: connection refused" Oct 11 00:54:16 crc kubenswrapper[4743]: I1011 00:54:16.129706 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 00:54:16 crc kubenswrapper[4743]: I1011 00:54:16.130235 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 00:54:16 crc kubenswrapper[4743]: E1011 00:54:16.131238 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 00:54:16.631213248 +0000 UTC m=+151.284193645 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 00:54:16 crc kubenswrapper[4743]: I1011 00:54:16.150475 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-65mgg" event={"ID":"224b66ec-aec1-4d9e-96c0-b6c7ddeb37e7","Type":"ContainerStarted","Data":"890db6cbe0a13eadfe54e7f1fc593281af2db034c24e095f28916abe9cdbab05"} Oct 11 00:54:16 crc kubenswrapper[4743]: I1011 00:54:16.179239 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-4k997" event={"ID":"e0073037-950f-40df-bcb6-d9fd0aceccb2","Type":"ContainerStarted","Data":"5eeae96acea22be81750688d75cd0066c7811984d01dc95bf77d880afc227dc6"} Oct 11 00:54:16 crc kubenswrapper[4743]: I1011 00:54:16.179281 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-4k997" event={"ID":"e0073037-950f-40df-bcb6-d9fd0aceccb2","Type":"ContainerStarted","Data":"48641d6a375bf365e09c7eca26f821fbe91db64c4339aec4a11d18af295ffff6"} Oct 11 00:54:16 crc kubenswrapper[4743]: I1011 00:54:16.220127 4743 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-wm7rl" event={"ID":"7ffea29d-93cb-4136-b0bc-7861c20751d4","Type":"ContainerStarted","Data":"4d8adc51a8a9057e740a567c9ffda05038168dc05cf7bff02866d3a1e0363804"} Oct 11 00:54:16 crc kubenswrapper[4743]: I1011 00:54:16.231004 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-24m6m\" (UID: \"9d6b106d-6589-40f2-b694-eb158c541d82\") " pod="openshift-image-registry/image-registry-697d97f7c8-24m6m" Oct 11 00:54:16 crc kubenswrapper[4743]: E1011 00:54:16.233914 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-11 00:54:16.733901837 +0000 UTC m=+151.386882234 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-24m6m" (UID: "9d6b106d-6589-40f2-b694-eb158c541d82") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 00:54:16 crc kubenswrapper[4743]: I1011 00:54:16.235428 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"f09f5153a221cd3d82a1efa772b9cfe7620c189270ff06562019eafc2499d154"} Oct 11 00:54:16 crc kubenswrapper[4743]: I1011 00:54:16.235463 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"e63f11b2c960e4c385e11502b02b8ede90af139505e2712a5ffca513f12591bb"} Oct 11 00:54:16 crc kubenswrapper[4743]: I1011 00:54:16.237238 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rdwmd" event={"ID":"006ce2c3-4b48-4621-acc0-50428bb1c862","Type":"ContainerStarted","Data":"e22b1b3ded86013ae3ee05dcd1938ab05febdaa6c2ca1af5e4a3611d8167bbf7"} Oct 11 00:54:16 crc kubenswrapper[4743]: I1011 00:54:16.253772 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-v98q2" event={"ID":"f3263799-b7d3-4b1b-a61b-4768a061e502","Type":"ContainerStarted","Data":"a1c4023454dbc46b9117221df11e6effbbc32199a42837e89c9e6a6ed79e68aa"} Oct 11 00:54:16 crc kubenswrapper[4743]: I1011 00:54:16.267082 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"2170df0c30d60dc7cc0b3ee1e696c43b8b58ac498905db689b83bbf0d3e0d8b8"} Oct 11 00:54:16 crc kubenswrapper[4743]: I1011 00:54:16.267131 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"b3606a97b8bf52abf224748c367c64ce0702779aeb9a13ff662600f6531beba2"} Oct 11 00:54:16 crc kubenswrapper[4743]: I1011 00:54:16.282431 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bc6rx" event={"ID":"6e2550ee-2c0c-437b-a4e3-3332ffba4e48","Type":"ContainerStarted","Data":"eb7f0241b2b7986f158ae4e9519d44a6c79f691c5338ce4a44c52c4c4000f294"} Oct 11 00:54:16 crc kubenswrapper[4743]: I1011 00:54:16.282468 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bc6rx" event={"ID":"6e2550ee-2c0c-437b-a4e3-3332ffba4e48","Type":"ContainerStarted","Data":"57076f83a6cc0f2fe10bca3a4e198edd811c6c1799a5384c2c5831c7b52debcc"} Oct 11 00:54:16 crc kubenswrapper[4743]: I1011 00:54:16.282650 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wswrq" podStartSLOduration=130.282634358 podStartE2EDuration="2m10.282634358s" podCreationTimestamp="2025-10-11 00:52:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 00:54:16.280612568 +0000 UTC m=+150.933592965" watchObservedRunningTime="2025-10-11 00:54:16.282634358 +0000 UTC m=+150.935614755" Oct 11 00:54:16 crc kubenswrapper[4743]: I1011 00:54:16.287228 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="hostpath-provisioner/csi-hostpathplugin-grqvq" event={"ID":"4a97f1cb-48e1-4049-be4a-0151201d5cc3","Type":"ContainerStarted","Data":"adb06b37078bea335df7c4161fd3bd1e68f19142fd848a71525354d25fbff2e7"} Oct 11 00:54:16 crc kubenswrapper[4743]: I1011 00:54:16.294801 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-85wtv" event={"ID":"3f8744f8-b334-4487-a52b-c6b6f226401d","Type":"ContainerStarted","Data":"0c115d4a157d3b03d1c20bd21b26fbb01c1bded882ab6551a6a351d3603fa32a"} Oct 11 00:54:16 crc kubenswrapper[4743]: I1011 00:54:16.294832 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-85wtv" event={"ID":"3f8744f8-b334-4487-a52b-c6b6f226401d","Type":"ContainerStarted","Data":"1ce87d66d1de13cebafe7297b732f6ea5da8c399f87066786cba380006df8e52"} Oct 11 00:54:16 crc kubenswrapper[4743]: I1011 00:54:16.294842 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-85wtv" event={"ID":"3f8744f8-b334-4487-a52b-c6b6f226401d","Type":"ContainerStarted","Data":"ae3211ff9b8efd27dba4538741977d67afb8e8991210d44701e86b7915d79155"} Oct 11 00:54:16 crc kubenswrapper[4743]: I1011 00:54:16.309409 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-xwxfx" event={"ID":"3abfab8d-967c-42bf-9f48-bd69bbe6f8f2","Type":"ContainerStarted","Data":"4d1389b1f5e032f8d620c54875cfbc27f833867968e5be29bca9b44faedec01d"} Oct 11 00:54:16 crc kubenswrapper[4743]: I1011 00:54:16.326044 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-s8jrj" event={"ID":"1c78dcdb-20e7-4e37-95ba-c3576a63b2cf","Type":"ContainerStarted","Data":"b5f2057b00082de79d11e7497c486ad23c7029a12938b363753d91754902c620"} Oct 11 00:54:16 crc kubenswrapper[4743]: I1011 00:54:16.326593 4743 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-s8jrj" Oct 11 00:54:16 crc kubenswrapper[4743]: I1011 00:54:16.341249 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 00:54:16 crc kubenswrapper[4743]: E1011 00:54:16.342460 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 00:54:16.84244497 +0000 UTC m=+151.495425367 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 00:54:16 crc kubenswrapper[4743]: I1011 00:54:16.360864 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-mlnnl" event={"ID":"887d134b-7eff-41e9-abf8-f0e70fd4c0e1","Type":"ContainerStarted","Data":"19d2d3e7a56719fc505d3945c8ed9558c896480de2914655a31d60d1f9b5c566"} Oct 11 00:54:16 crc kubenswrapper[4743]: I1011 00:54:16.360912 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-mlnnl" event={"ID":"887d134b-7eff-41e9-abf8-f0e70fd4c0e1","Type":"ContainerStarted","Data":"6d01425cc52a4b56c9851113f245d838c7122f1507f75a42b82a351daeb545a2"} Oct 11 
00:54:16 crc kubenswrapper[4743]: I1011 00:54:16.361027 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-s8jrj" Oct 11 00:54:16 crc kubenswrapper[4743]: I1011 00:54:16.386016 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29335725-pg67v" event={"ID":"962b1e46-ba63-4aa2-9882-a04487a05813","Type":"ContainerStarted","Data":"057a072f6b313157d0743795a2f56c68aa7d55e7b92aaabe95560a120c6bc0cc"} Oct 11 00:54:16 crc kubenswrapper[4743]: I1011 00:54:16.386054 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29335725-pg67v" event={"ID":"962b1e46-ba63-4aa2-9882-a04487a05813","Type":"ContainerStarted","Data":"75471898b2fb952c943d0dfc4adbf9a88d039d341607e5b2f83e2c56b0e2d22f"} Oct 11 00:54:16 crc kubenswrapper[4743]: I1011 00:54:16.409880 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-z2mc9" event={"ID":"eaad8a68-7587-4d59-9b94-e8dcdc1869f8","Type":"ContainerStarted","Data":"f9636dfbb67078172188d14f59e447e35099d337f82a34eceee8597235ee919e"} Oct 11 00:54:16 crc kubenswrapper[4743]: I1011 00:54:16.412584 4743 patch_prober.go:28] interesting pod/downloads-7954f5f757-qksm9 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Oct 11 00:54:16 crc kubenswrapper[4743]: I1011 00:54:16.412620 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qksm9" podUID="e8a051b9-029d-4b92-a9a1-380c8d18f051" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Oct 11 00:54:16 crc kubenswrapper[4743]: I1011 00:54:16.430801 
4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-f2gjt" Oct 11 00:54:16 crc kubenswrapper[4743]: I1011 00:54:16.451580 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-24m6m\" (UID: \"9d6b106d-6589-40f2-b694-eb158c541d82\") " pod="openshift-image-registry/image-registry-697d97f7c8-24m6m" Oct 11 00:54:16 crc kubenswrapper[4743]: I1011 00:54:16.453157 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cqmp6" Oct 11 00:54:16 crc kubenswrapper[4743]: E1011 00:54:16.453514 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-11 00:54:16.953498978 +0000 UTC m=+151.606479375 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-24m6m" (UID: "9d6b106d-6589-40f2-b694-eb158c541d82") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 00:54:16 crc kubenswrapper[4743]: I1011 00:54:16.529121 4743 patch_prober.go:28] interesting pod/router-default-5444994796-6srbj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 11 00:54:16 crc kubenswrapper[4743]: [-]has-synced failed: reason withheld Oct 11 00:54:16 crc kubenswrapper[4743]: [+]process-running ok Oct 11 00:54:16 crc kubenswrapper[4743]: healthz check failed Oct 11 00:54:16 crc kubenswrapper[4743]: I1011 00:54:16.529433 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6srbj" podUID="1ae9d5c5-6a97-483f-9b76-45c73259c85b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 11 00:54:16 crc kubenswrapper[4743]: I1011 00:54:16.555280 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 00:54:16 crc kubenswrapper[4743]: E1011 00:54:16.556757 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-11 00:54:17.056744542 +0000 UTC m=+151.709724939 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 00:54:16 crc kubenswrapper[4743]: I1011 00:54:16.657338 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-xwxfx" Oct 11 00:54:16 crc kubenswrapper[4743]: I1011 00:54:16.657631 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-xwxfx" Oct 11 00:54:16 crc kubenswrapper[4743]: I1011 00:54:16.658203 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-24m6m\" (UID: \"9d6b106d-6589-40f2-b694-eb158c541d82\") " pod="openshift-image-registry/image-registry-697d97f7c8-24m6m" Oct 11 00:54:16 crc kubenswrapper[4743]: E1011 00:54:16.658531 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-11 00:54:17.158519514 +0000 UTC m=+151.811499911 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-24m6m" (UID: "9d6b106d-6589-40f2-b694-eb158c541d82") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 00:54:16 crc kubenswrapper[4743]: I1011 00:54:16.675114 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cgd8f" Oct 11 00:54:16 crc kubenswrapper[4743]: I1011 00:54:16.675366 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cgd8f" Oct 11 00:54:16 crc kubenswrapper[4743]: I1011 00:54:16.696461 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cgd8f" Oct 11 00:54:16 crc kubenswrapper[4743]: I1011 00:54:16.762320 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 00:54:16 crc kubenswrapper[4743]: E1011 00:54:16.762654 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 00:54:17.262638705 +0000 UTC m=+151.915619102 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 00:54:16 crc kubenswrapper[4743]: I1011 00:54:16.824245 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-62mz2" podStartSLOduration=129.8242303 podStartE2EDuration="2m9.8242303s" podCreationTimestamp="2025-10-11 00:52:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 00:54:16.821258341 +0000 UTC m=+151.474238758" watchObservedRunningTime="2025-10-11 00:54:16.8242303 +0000 UTC m=+151.477210687" Oct 11 00:54:16 crc kubenswrapper[4743]: I1011 00:54:16.865526 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-24m6m\" (UID: \"9d6b106d-6589-40f2-b694-eb158c541d82\") " pod="openshift-image-registry/image-registry-697d97f7c8-24m6m" Oct 11 00:54:16 crc kubenswrapper[4743]: E1011 00:54:16.865789 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-11 00:54:17.365778897 +0000 UTC m=+152.018759294 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-24m6m" (UID: "9d6b106d-6589-40f2-b694-eb158c541d82") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 00:54:16 crc kubenswrapper[4743]: I1011 00:54:16.877175 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-sjtsw" Oct 11 00:54:16 crc kubenswrapper[4743]: I1011 00:54:16.879266 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rdwmd" podStartSLOduration=130.879257199 podStartE2EDuration="2m10.879257199s" podCreationTimestamp="2025-10-11 00:52:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 00:54:16.843698129 +0000 UTC m=+151.496678516" watchObservedRunningTime="2025-10-11 00:54:16.879257199 +0000 UTC m=+151.532237586" Oct 11 00:54:16 crc kubenswrapper[4743]: I1011 00:54:16.907026 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-4k997" podStartSLOduration=129.907001535 podStartE2EDuration="2m9.907001535s" podCreationTimestamp="2025-10-11 00:52:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 00:54:16.880012441 +0000 UTC m=+151.532992838" watchObservedRunningTime="2025-10-11 00:54:16.907001535 +0000 UTC m=+151.559981922" Oct 11 00:54:16 crc kubenswrapper[4743]: I1011 00:54:16.908679 4743 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-xwxfx" podStartSLOduration=130.908673245 podStartE2EDuration="2m10.908673245s" podCreationTimestamp="2025-10-11 00:52:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 00:54:16.902666936 +0000 UTC m=+151.555647333" watchObservedRunningTime="2025-10-11 00:54:16.908673245 +0000 UTC m=+151.561653642" Oct 11 00:54:16 crc kubenswrapper[4743]: I1011 00:54:16.976416 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 00:54:16 crc kubenswrapper[4743]: E1011 00:54:16.976553 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 00:54:17.476529246 +0000 UTC m=+152.129509643 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 00:54:16 crc kubenswrapper[4743]: I1011 00:54:16.976615 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-24m6m\" (UID: \"9d6b106d-6589-40f2-b694-eb158c541d82\") " pod="openshift-image-registry/image-registry-697d97f7c8-24m6m" Oct 11 00:54:16 crc kubenswrapper[4743]: E1011 00:54:16.976958 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-11 00:54:17.476946388 +0000 UTC m=+152.129926785 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-24m6m" (UID: "9d6b106d-6589-40f2-b694-eb158c541d82") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 00:54:17 crc kubenswrapper[4743]: I1011 00:54:17.055669 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29335725-pg67v" podStartSLOduration=131.055652953 podStartE2EDuration="2m11.055652953s" podCreationTimestamp="2025-10-11 00:52:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 00:54:17.055256961 +0000 UTC m=+151.708237358" watchObservedRunningTime="2025-10-11 00:54:17.055652953 +0000 UTC m=+151.708633340" Oct 11 00:54:17 crc kubenswrapper[4743]: I1011 00:54:17.078461 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 00:54:17 crc kubenswrapper[4743]: E1011 00:54:17.078723 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 00:54:17.578709999 +0000 UTC m=+152.231690396 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 00:54:17 crc kubenswrapper[4743]: I1011 00:54:17.180948 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-24m6m\" (UID: \"9d6b106d-6589-40f2-b694-eb158c541d82\") " pod="openshift-image-registry/image-registry-697d97f7c8-24m6m" Oct 11 00:54:17 crc kubenswrapper[4743]: E1011 00:54:17.181313 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-11 00:54:17.681301435 +0000 UTC m=+152.334281832 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-24m6m" (UID: "9d6b106d-6589-40f2-b694-eb158c541d82") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 00:54:17 crc kubenswrapper[4743]: I1011 00:54:17.237303 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ccjcp" podStartSLOduration=130.237289373 podStartE2EDuration="2m10.237289373s" podCreationTimestamp="2025-10-11 00:52:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 00:54:17.157359972 +0000 UTC m=+151.810340369" watchObservedRunningTime="2025-10-11 00:54:17.237289373 +0000 UTC m=+151.890269770" Oct 11 00:54:17 crc kubenswrapper[4743]: I1011 00:54:17.238241 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-wm7rl" podStartSLOduration=130.238236871 podStartE2EDuration="2m10.238236871s" podCreationTimestamp="2025-10-11 00:52:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 00:54:17.235710606 +0000 UTC m=+151.888691013" watchObservedRunningTime="2025-10-11 00:54:17.238236871 +0000 UTC m=+151.891217268" Oct 11 00:54:17 crc kubenswrapper[4743]: I1011 00:54:17.291334 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" 
(UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 00:54:17 crc kubenswrapper[4743]: E1011 00:54:17.291668 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 00:54:17.791639932 +0000 UTC m=+152.444620329 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 00:54:17 crc kubenswrapper[4743]: I1011 00:54:17.349984 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-85wtv" podStartSLOduration=131.349966659 podStartE2EDuration="2m11.349966659s" podCreationTimestamp="2025-10-11 00:52:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 00:54:17.338269141 +0000 UTC m=+151.991249558" watchObservedRunningTime="2025-10-11 00:54:17.349966659 +0000 UTC m=+152.002947056" Oct 11 00:54:17 crc kubenswrapper[4743]: I1011 00:54:17.392477 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-24m6m\" (UID: \"9d6b106d-6589-40f2-b694-eb158c541d82\") " pod="openshift-image-registry/image-registry-697d97f7c8-24m6m" Oct 11 00:54:17 crc kubenswrapper[4743]: E1011 
00:54:17.392892 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-11 00:54:17.892877557 +0000 UTC m=+152.545857954 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-24m6m" (UID: "9d6b106d-6589-40f2-b694-eb158c541d82") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 00:54:17 crc kubenswrapper[4743]: I1011 00:54:17.447016 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-smtjk" event={"ID":"b489193c-9eab-4025-8c8f-23b3b42bae0e","Type":"ContainerStarted","Data":"55b99edce5106dab72e1f8419be0a977b1d1fac3f8fefc662ec57e4c065595bf"} Oct 11 00:54:17 crc kubenswrapper[4743]: I1011 00:54:17.447667 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-smtjk" Oct 11 00:54:17 crc kubenswrapper[4743]: I1011 00:54:17.464675 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-grqvq" event={"ID":"4a97f1cb-48e1-4049-be4a-0151201d5cc3","Type":"ContainerStarted","Data":"c875a440cb414caf0215116dc78ffde241b2f471ebe319f6743817557b9d740a"} Oct 11 00:54:17 crc kubenswrapper[4743]: I1011 00:54:17.464716 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ccjcp" Oct 11 00:54:17 crc kubenswrapper[4743]: I1011 00:54:17.468195 4743 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-xgtfn container/marketplace-operator namespace/openshift-marketplace: Readiness probe 
status=failure output="Get \"http://10.217.0.43:8080/healthz\": dial tcp 10.217.0.43:8080: connect: connection refused" start-of-body= Oct 11 00:54:17 crc kubenswrapper[4743]: I1011 00:54:17.468231 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-xgtfn" podUID="16bef631-ee0f-4346-bb9b-c6eb48a09448" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.43:8080/healthz\": dial tcp 10.217.0.43:8080: connect: connection refused" Oct 11 00:54:17 crc kubenswrapper[4743]: I1011 00:54:17.468288 4743 patch_prober.go:28] interesting pod/downloads-7954f5f757-qksm9 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Oct 11 00:54:17 crc kubenswrapper[4743]: I1011 00:54:17.468300 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qksm9" podUID="e8a051b9-029d-4b92-a9a1-380c8d18f051" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Oct 11 00:54:17 crc kubenswrapper[4743]: I1011 00:54:17.477009 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cgd8f" Oct 11 00:54:17 crc kubenswrapper[4743]: I1011 00:54:17.494352 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 00:54:17 crc kubenswrapper[4743]: E1011 00:54:17.494684 4743 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 00:54:17.994667939 +0000 UTC m=+152.647648336 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 00:54:17 crc kubenswrapper[4743]: I1011 00:54:17.503061 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-r8dmc"] Oct 11 00:54:17 crc kubenswrapper[4743]: I1011 00:54:17.503213 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bc6rx" podStartSLOduration=130.503198793 podStartE2EDuration="2m10.503198793s" podCreationTimestamp="2025-10-11 00:52:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 00:54:17.49872502 +0000 UTC m=+152.151705417" watchObservedRunningTime="2025-10-11 00:54:17.503198793 +0000 UTC m=+152.156179190" Oct 11 00:54:17 crc kubenswrapper[4743]: I1011 00:54:17.504009 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-r8dmc" Oct 11 00:54:17 crc kubenswrapper[4743]: I1011 00:54:17.510545 4743 patch_prober.go:28] interesting pod/router-default-5444994796-6srbj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 11 00:54:17 crc kubenswrapper[4743]: [-]has-synced failed: reason withheld Oct 11 00:54:17 crc kubenswrapper[4743]: [+]process-running ok Oct 11 00:54:17 crc kubenswrapper[4743]: healthz check failed Oct 11 00:54:17 crc kubenswrapper[4743]: I1011 00:54:17.510602 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6srbj" podUID="1ae9d5c5-6a97-483f-9b76-45c73259c85b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 11 00:54:17 crc kubenswrapper[4743]: I1011 00:54:17.533768 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-r8dmc"] Oct 11 00:54:17 crc kubenswrapper[4743]: I1011 00:54:17.530729 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 11 00:54:17 crc kubenswrapper[4743]: I1011 00:54:17.602280 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-mlnnl" podStartSLOduration=8.602264654 podStartE2EDuration="8.602264654s" podCreationTimestamp="2025-10-11 00:54:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 00:54:17.602166511 +0000 UTC m=+152.255146908" watchObservedRunningTime="2025-10-11 00:54:17.602264654 +0000 UTC m=+152.255245051" Oct 11 00:54:17 crc kubenswrapper[4743]: I1011 00:54:17.603517 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-24m6m\" (UID: \"9d6b106d-6589-40f2-b694-eb158c541d82\") " pod="openshift-image-registry/image-registry-697d97f7c8-24m6m" Oct 11 00:54:17 crc kubenswrapper[4743]: I1011 00:54:17.603689 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-z2mc9" podStartSLOduration=130.603683676 podStartE2EDuration="2m10.603683676s" podCreationTimestamp="2025-10-11 00:52:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 00:54:17.552152632 +0000 UTC m=+152.205133029" watchObservedRunningTime="2025-10-11 00:54:17.603683676 +0000 UTC m=+152.256664073" Oct 11 00:54:17 crc kubenswrapper[4743]: E1011 00:54:17.624627 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-11 00:54:18.12461183 +0000 UTC m=+152.777592227 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-24m6m" (UID: "9d6b106d-6589-40f2-b694-eb158c541d82") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 00:54:17 crc kubenswrapper[4743]: I1011 00:54:17.628871 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-drlgz"] Oct 11 00:54:17 crc kubenswrapper[4743]: I1011 00:54:17.629754 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-drlgz" Oct 11 00:54:17 crc kubenswrapper[4743]: I1011 00:54:17.642418 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 11 00:54:17 crc kubenswrapper[4743]: I1011 00:54:17.650933 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-drlgz"] Oct 11 00:54:17 crc kubenswrapper[4743]: I1011 00:54:17.711025 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-s8jrj" podStartSLOduration=130.711011513 podStartE2EDuration="2m10.711011513s" podCreationTimestamp="2025-10-11 00:52:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 00:54:17.708328863 +0000 UTC m=+152.361309260" watchObservedRunningTime="2025-10-11 00:54:17.711011513 +0000 UTC m=+152.363991910" Oct 11 00:54:17 crc kubenswrapper[4743]: I1011 00:54:17.717746 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 00:54:17 crc kubenswrapper[4743]: I1011 00:54:17.717902 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c7ded1c-c0ce-47c4-9959-b95631f067ea-catalog-content\") pod \"certified-operators-r8dmc\" (UID: \"7c7ded1c-c0ce-47c4-9959-b95631f067ea\") " pod="openshift-marketplace/certified-operators-r8dmc" Oct 11 00:54:17 crc kubenswrapper[4743]: I1011 00:54:17.717977 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c7ded1c-c0ce-47c4-9959-b95631f067ea-utilities\") pod \"certified-operators-r8dmc\" (UID: \"7c7ded1c-c0ce-47c4-9959-b95631f067ea\") " pod="openshift-marketplace/certified-operators-r8dmc" Oct 11 00:54:17 crc kubenswrapper[4743]: I1011 00:54:17.718000 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec82a5a9-025d-43e4-a1f4-6b4245bcc7ef-utilities\") pod \"community-operators-drlgz\" (UID: \"ec82a5a9-025d-43e4-a1f4-6b4245bcc7ef\") " pod="openshift-marketplace/community-operators-drlgz" Oct 11 00:54:17 crc kubenswrapper[4743]: I1011 00:54:17.718018 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llb68\" (UniqueName: \"kubernetes.io/projected/7c7ded1c-c0ce-47c4-9959-b95631f067ea-kube-api-access-llb68\") pod \"certified-operators-r8dmc\" (UID: \"7c7ded1c-c0ce-47c4-9959-b95631f067ea\") " pod="openshift-marketplace/certified-operators-r8dmc" Oct 11 00:54:17 crc kubenswrapper[4743]: I1011 00:54:17.718042 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-4qwzj\" (UniqueName: \"kubernetes.io/projected/ec82a5a9-025d-43e4-a1f4-6b4245bcc7ef-kube-api-access-4qwzj\") pod \"community-operators-drlgz\" (UID: \"ec82a5a9-025d-43e4-a1f4-6b4245bcc7ef\") " pod="openshift-marketplace/community-operators-drlgz" Oct 11 00:54:17 crc kubenswrapper[4743]: I1011 00:54:17.718068 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec82a5a9-025d-43e4-a1f4-6b4245bcc7ef-catalog-content\") pod \"community-operators-drlgz\" (UID: \"ec82a5a9-025d-43e4-a1f4-6b4245bcc7ef\") " pod="openshift-marketplace/community-operators-drlgz" Oct 11 00:54:17 crc kubenswrapper[4743]: E1011 00:54:17.718339 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 00:54:18.218145396 +0000 UTC m=+152.871125793 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 00:54:17 crc kubenswrapper[4743]: I1011 00:54:17.762033 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-smtjk" podStartSLOduration=8.762019393 podStartE2EDuration="8.762019393s" podCreationTimestamp="2025-10-11 00:54:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 00:54:17.760114076 +0000 UTC m=+152.413094473" watchObservedRunningTime="2025-10-11 00:54:17.762019393 +0000 UTC m=+152.414999790" Oct 11 00:54:17 crc kubenswrapper[4743]: I1011 00:54:17.819928 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-24m6m\" (UID: \"9d6b106d-6589-40f2-b694-eb158c541d82\") " pod="openshift-image-registry/image-registry-697d97f7c8-24m6m" Oct 11 00:54:17 crc kubenswrapper[4743]: I1011 00:54:17.819979 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c7ded1c-c0ce-47c4-9959-b95631f067ea-utilities\") pod \"certified-operators-r8dmc\" (UID: \"7c7ded1c-c0ce-47c4-9959-b95631f067ea\") " pod="openshift-marketplace/certified-operators-r8dmc" Oct 11 00:54:17 crc kubenswrapper[4743]: I1011 00:54:17.820008 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec82a5a9-025d-43e4-a1f4-6b4245bcc7ef-utilities\") pod \"community-operators-drlgz\" (UID: \"ec82a5a9-025d-43e4-a1f4-6b4245bcc7ef\") " pod="openshift-marketplace/community-operators-drlgz" Oct 11 00:54:17 crc kubenswrapper[4743]: I1011 00:54:17.820025 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llb68\" (UniqueName: \"kubernetes.io/projected/7c7ded1c-c0ce-47c4-9959-b95631f067ea-kube-api-access-llb68\") pod \"certified-operators-r8dmc\" (UID: \"7c7ded1c-c0ce-47c4-9959-b95631f067ea\") " pod="openshift-marketplace/certified-operators-r8dmc" Oct 11 00:54:17 crc kubenswrapper[4743]: I1011 00:54:17.820050 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qwzj\" (UniqueName: \"kubernetes.io/projected/ec82a5a9-025d-43e4-a1f4-6b4245bcc7ef-kube-api-access-4qwzj\") pod \"community-operators-drlgz\" (UID: \"ec82a5a9-025d-43e4-a1f4-6b4245bcc7ef\") " pod="openshift-marketplace/community-operators-drlgz" Oct 11 00:54:17 crc kubenswrapper[4743]: I1011 00:54:17.820075 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec82a5a9-025d-43e4-a1f4-6b4245bcc7ef-catalog-content\") pod \"community-operators-drlgz\" (UID: \"ec82a5a9-025d-43e4-a1f4-6b4245bcc7ef\") " pod="openshift-marketplace/community-operators-drlgz" Oct 11 00:54:17 crc kubenswrapper[4743]: I1011 00:54:17.820102 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c7ded1c-c0ce-47c4-9959-b95631f067ea-catalog-content\") pod \"certified-operators-r8dmc\" (UID: \"7c7ded1c-c0ce-47c4-9959-b95631f067ea\") " pod="openshift-marketplace/certified-operators-r8dmc" Oct 11 00:54:17 crc kubenswrapper[4743]: I1011 00:54:17.820508 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c7ded1c-c0ce-47c4-9959-b95631f067ea-catalog-content\") pod \"certified-operators-r8dmc\" (UID: \"7c7ded1c-c0ce-47c4-9959-b95631f067ea\") " pod="openshift-marketplace/certified-operators-r8dmc" Oct 11 00:54:17 crc kubenswrapper[4743]: E1011 00:54:17.820773 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-11 00:54:18.320759032 +0000 UTC m=+152.973739429 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-24m6m" (UID: "9d6b106d-6589-40f2-b694-eb158c541d82") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 00:54:17 crc kubenswrapper[4743]: I1011 00:54:17.821061 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c7ded1c-c0ce-47c4-9959-b95631f067ea-utilities\") pod \"certified-operators-r8dmc\" (UID: \"7c7ded1c-c0ce-47c4-9959-b95631f067ea\") " pod="openshift-marketplace/certified-operators-r8dmc" Oct 11 00:54:17 crc kubenswrapper[4743]: I1011 00:54:17.821056 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec82a5a9-025d-43e4-a1f4-6b4245bcc7ef-utilities\") pod \"community-operators-drlgz\" (UID: \"ec82a5a9-025d-43e4-a1f4-6b4245bcc7ef\") " pod="openshift-marketplace/community-operators-drlgz" Oct 11 00:54:17 crc kubenswrapper[4743]: I1011 00:54:17.821891 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/ec82a5a9-025d-43e4-a1f4-6b4245bcc7ef-catalog-content\") pod \"community-operators-drlgz\" (UID: \"ec82a5a9-025d-43e4-a1f4-6b4245bcc7ef\") " pod="openshift-marketplace/community-operators-drlgz" Oct 11 00:54:17 crc kubenswrapper[4743]: I1011 00:54:17.846993 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-nvz7n"] Oct 11 00:54:17 crc kubenswrapper[4743]: I1011 00:54:17.847884 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nvz7n" Oct 11 00:54:17 crc kubenswrapper[4743]: I1011 00:54:17.870684 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llb68\" (UniqueName: \"kubernetes.io/projected/7c7ded1c-c0ce-47c4-9959-b95631f067ea-kube-api-access-llb68\") pod \"certified-operators-r8dmc\" (UID: \"7c7ded1c-c0ce-47c4-9959-b95631f067ea\") " pod="openshift-marketplace/certified-operators-r8dmc" Oct 11 00:54:17 crc kubenswrapper[4743]: I1011 00:54:17.882315 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qwzj\" (UniqueName: \"kubernetes.io/projected/ec82a5a9-025d-43e4-a1f4-6b4245bcc7ef-kube-api-access-4qwzj\") pod \"community-operators-drlgz\" (UID: \"ec82a5a9-025d-43e4-a1f4-6b4245bcc7ef\") " pod="openshift-marketplace/community-operators-drlgz" Oct 11 00:54:17 crc kubenswrapper[4743]: I1011 00:54:17.887568 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-r8dmc" Oct 11 00:54:17 crc kubenswrapper[4743]: I1011 00:54:17.900073 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nvz7n"] Oct 11 00:54:17 crc kubenswrapper[4743]: I1011 00:54:17.924088 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 00:54:17 crc kubenswrapper[4743]: E1011 00:54:17.924450 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 00:54:18.42443445 +0000 UTC m=+153.077414837 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 00:54:17 crc kubenswrapper[4743]: I1011 00:54:17.998656 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-drlgz" Oct 11 00:54:18 crc kubenswrapper[4743]: I1011 00:54:18.015024 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hjs95"] Oct 11 00:54:18 crc kubenswrapper[4743]: I1011 00:54:18.027061 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-776lz\" (UniqueName: \"kubernetes.io/projected/712f7dea-8de1-43e3-802b-b4e9b521b0b6-kube-api-access-776lz\") pod \"certified-operators-nvz7n\" (UID: \"712f7dea-8de1-43e3-802b-b4e9b521b0b6\") " pod="openshift-marketplace/certified-operators-nvz7n" Oct 11 00:54:18 crc kubenswrapper[4743]: I1011 00:54:18.027111 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/712f7dea-8de1-43e3-802b-b4e9b521b0b6-utilities\") pod \"certified-operators-nvz7n\" (UID: \"712f7dea-8de1-43e3-802b-b4e9b521b0b6\") " pod="openshift-marketplace/certified-operators-nvz7n" Oct 11 00:54:18 crc kubenswrapper[4743]: I1011 00:54:18.027189 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-24m6m\" (UID: \"9d6b106d-6589-40f2-b694-eb158c541d82\") " pod="openshift-image-registry/image-registry-697d97f7c8-24m6m" Oct 11 00:54:18 crc kubenswrapper[4743]: I1011 00:54:18.027208 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/712f7dea-8de1-43e3-802b-b4e9b521b0b6-catalog-content\") pod \"certified-operators-nvz7n\" (UID: \"712f7dea-8de1-43e3-802b-b4e9b521b0b6\") " pod="openshift-marketplace/certified-operators-nvz7n" Oct 11 00:54:18 crc kubenswrapper[4743]: E1011 
00:54:18.027693 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-11 00:54:18.527680566 +0000 UTC m=+153.180660963 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-24m6m" (UID: "9d6b106d-6589-40f2-b694-eb158c541d82") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 00:54:18 crc kubenswrapper[4743]: I1011 00:54:18.029037 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hjs95" Oct 11 00:54:18 crc kubenswrapper[4743]: I1011 00:54:18.056461 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hjs95"] Oct 11 00:54:18 crc kubenswrapper[4743]: I1011 00:54:18.129946 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 00:54:18 crc kubenswrapper[4743]: I1011 00:54:18.130127 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtwzw\" (UniqueName: \"kubernetes.io/projected/de5b1941-3ffa-477d-9b7f-7e07fe3ed206-kube-api-access-vtwzw\") pod \"community-operators-hjs95\" (UID: \"de5b1941-3ffa-477d-9b7f-7e07fe3ed206\") " pod="openshift-marketplace/community-operators-hjs95" Oct 11 00:54:18 crc 
kubenswrapper[4743]: I1011 00:54:18.130171 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/712f7dea-8de1-43e3-802b-b4e9b521b0b6-catalog-content\") pod \"certified-operators-nvz7n\" (UID: \"712f7dea-8de1-43e3-802b-b4e9b521b0b6\") " pod="openshift-marketplace/certified-operators-nvz7n" Oct 11 00:54:18 crc kubenswrapper[4743]: I1011 00:54:18.130200 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-776lz\" (UniqueName: \"kubernetes.io/projected/712f7dea-8de1-43e3-802b-b4e9b521b0b6-kube-api-access-776lz\") pod \"certified-operators-nvz7n\" (UID: \"712f7dea-8de1-43e3-802b-b4e9b521b0b6\") " pod="openshift-marketplace/certified-operators-nvz7n" Oct 11 00:54:18 crc kubenswrapper[4743]: I1011 00:54:18.130217 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de5b1941-3ffa-477d-9b7f-7e07fe3ed206-catalog-content\") pod \"community-operators-hjs95\" (UID: \"de5b1941-3ffa-477d-9b7f-7e07fe3ed206\") " pod="openshift-marketplace/community-operators-hjs95" Oct 11 00:54:18 crc kubenswrapper[4743]: I1011 00:54:18.130241 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/712f7dea-8de1-43e3-802b-b4e9b521b0b6-utilities\") pod \"certified-operators-nvz7n\" (UID: \"712f7dea-8de1-43e3-802b-b4e9b521b0b6\") " pod="openshift-marketplace/certified-operators-nvz7n" Oct 11 00:54:18 crc kubenswrapper[4743]: I1011 00:54:18.130281 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de5b1941-3ffa-477d-9b7f-7e07fe3ed206-utilities\") pod \"community-operators-hjs95\" (UID: \"de5b1941-3ffa-477d-9b7f-7e07fe3ed206\") " pod="openshift-marketplace/community-operators-hjs95" Oct 11 
00:54:18 crc kubenswrapper[4743]: E1011 00:54:18.130370 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 00:54:18.630355564 +0000 UTC m=+153.283335961 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 00:54:18 crc kubenswrapper[4743]: I1011 00:54:18.130716 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/712f7dea-8de1-43e3-802b-b4e9b521b0b6-catalog-content\") pod \"certified-operators-nvz7n\" (UID: \"712f7dea-8de1-43e3-802b-b4e9b521b0b6\") " pod="openshift-marketplace/certified-operators-nvz7n" Oct 11 00:54:18 crc kubenswrapper[4743]: I1011 00:54:18.131158 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/712f7dea-8de1-43e3-802b-b4e9b521b0b6-utilities\") pod \"certified-operators-nvz7n\" (UID: \"712f7dea-8de1-43e3-802b-b4e9b521b0b6\") " pod="openshift-marketplace/certified-operators-nvz7n" Oct 11 00:54:18 crc kubenswrapper[4743]: I1011 00:54:18.148497 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-776lz\" (UniqueName: \"kubernetes.io/projected/712f7dea-8de1-43e3-802b-b4e9b521b0b6-kube-api-access-776lz\") pod \"certified-operators-nvz7n\" (UID: \"712f7dea-8de1-43e3-802b-b4e9b521b0b6\") " 
pod="openshift-marketplace/certified-operators-nvz7n" Oct 11 00:54:18 crc kubenswrapper[4743]: I1011 00:54:18.217699 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nvz7n" Oct 11 00:54:18 crc kubenswrapper[4743]: I1011 00:54:18.232489 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de5b1941-3ffa-477d-9b7f-7e07fe3ed206-utilities\") pod \"community-operators-hjs95\" (UID: \"de5b1941-3ffa-477d-9b7f-7e07fe3ed206\") " pod="openshift-marketplace/community-operators-hjs95" Oct 11 00:54:18 crc kubenswrapper[4743]: I1011 00:54:18.232565 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtwzw\" (UniqueName: \"kubernetes.io/projected/de5b1941-3ffa-477d-9b7f-7e07fe3ed206-kube-api-access-vtwzw\") pod \"community-operators-hjs95\" (UID: \"de5b1941-3ffa-477d-9b7f-7e07fe3ed206\") " pod="openshift-marketplace/community-operators-hjs95" Oct 11 00:54:18 crc kubenswrapper[4743]: I1011 00:54:18.232602 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-24m6m\" (UID: \"9d6b106d-6589-40f2-b694-eb158c541d82\") " pod="openshift-image-registry/image-registry-697d97f7c8-24m6m" Oct 11 00:54:18 crc kubenswrapper[4743]: I1011 00:54:18.232624 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de5b1941-3ffa-477d-9b7f-7e07fe3ed206-catalog-content\") pod \"community-operators-hjs95\" (UID: \"de5b1941-3ffa-477d-9b7f-7e07fe3ed206\") " pod="openshift-marketplace/community-operators-hjs95" Oct 11 00:54:18 crc kubenswrapper[4743]: I1011 00:54:18.233305 4743 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de5b1941-3ffa-477d-9b7f-7e07fe3ed206-catalog-content\") pod \"community-operators-hjs95\" (UID: \"de5b1941-3ffa-477d-9b7f-7e07fe3ed206\") " pod="openshift-marketplace/community-operators-hjs95" Oct 11 00:54:18 crc kubenswrapper[4743]: I1011 00:54:18.233509 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de5b1941-3ffa-477d-9b7f-7e07fe3ed206-utilities\") pod \"community-operators-hjs95\" (UID: \"de5b1941-3ffa-477d-9b7f-7e07fe3ed206\") " pod="openshift-marketplace/community-operators-hjs95" Oct 11 00:54:18 crc kubenswrapper[4743]: E1011 00:54:18.233932 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-11 00:54:18.733923249 +0000 UTC m=+153.386903636 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-24m6m" (UID: "9d6b106d-6589-40f2-b694-eb158c541d82") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 00:54:18 crc kubenswrapper[4743]: I1011 00:54:18.265277 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtwzw\" (UniqueName: \"kubernetes.io/projected/de5b1941-3ffa-477d-9b7f-7e07fe3ed206-kube-api-access-vtwzw\") pod \"community-operators-hjs95\" (UID: \"de5b1941-3ffa-477d-9b7f-7e07fe3ed206\") " pod="openshift-marketplace/community-operators-hjs95" Oct 11 00:54:18 crc kubenswrapper[4743]: I1011 00:54:18.336678 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 00:54:18 crc kubenswrapper[4743]: E1011 00:54:18.337063 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 00:54:18.837048421 +0000 UTC m=+153.490028818 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 00:54:18 crc kubenswrapper[4743]: I1011 00:54:18.405669 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hjs95" Oct 11 00:54:18 crc kubenswrapper[4743]: I1011 00:54:18.438794 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-24m6m\" (UID: \"9d6b106d-6589-40f2-b694-eb158c541d82\") " pod="openshift-image-registry/image-registry-697d97f7c8-24m6m" Oct 11 00:54:18 crc kubenswrapper[4743]: E1011 00:54:18.439098 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-11 00:54:18.93908679 +0000 UTC m=+153.592067187 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-24m6m" (UID: "9d6b106d-6589-40f2-b694-eb158c541d82") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 00:54:18 crc kubenswrapper[4743]: I1011 00:54:18.466289 4743 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-ccjcp container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 11 00:54:18 crc kubenswrapper[4743]: I1011 00:54:18.466343 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ccjcp" podUID="544ad287-fc9b-4033-b139-3893a3e10a00" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.35:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 11 00:54:18 crc kubenswrapper[4743]: I1011 00:54:18.482934 4743 patch_prober.go:28] interesting pod/apiserver-76f77b778f-xwxfx container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 11 00:54:18 crc kubenswrapper[4743]: [+]log ok Oct 11 00:54:18 crc kubenswrapper[4743]: [+]etcd ok Oct 11 00:54:18 crc kubenswrapper[4743]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 11 00:54:18 crc kubenswrapper[4743]: [+]poststarthook/generic-apiserver-start-informers ok Oct 11 00:54:18 crc kubenswrapper[4743]: [+]poststarthook/max-in-flight-filter ok Oct 11 00:54:18 
crc kubenswrapper[4743]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 11 00:54:18 crc kubenswrapper[4743]: [+]poststarthook/image.openshift.io-apiserver-caches ok Oct 11 00:54:18 crc kubenswrapper[4743]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Oct 11 00:54:18 crc kubenswrapper[4743]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Oct 11 00:54:18 crc kubenswrapper[4743]: [+]poststarthook/project.openshift.io-projectcache ok Oct 11 00:54:18 crc kubenswrapper[4743]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Oct 11 00:54:18 crc kubenswrapper[4743]: [+]poststarthook/openshift.io-startinformers ok Oct 11 00:54:18 crc kubenswrapper[4743]: [+]poststarthook/openshift.io-restmapperupdater ok Oct 11 00:54:18 crc kubenswrapper[4743]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 11 00:54:18 crc kubenswrapper[4743]: livez check failed Oct 11 00:54:18 crc kubenswrapper[4743]: I1011 00:54:18.482991 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-xwxfx" podUID="3abfab8d-967c-42bf-9f48-bd69bbe6f8f2" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 11 00:54:18 crc kubenswrapper[4743]: I1011 00:54:18.490146 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-grqvq" event={"ID":"4a97f1cb-48e1-4049-be4a-0151201d5cc3","Type":"ContainerStarted","Data":"e43adac4dd68be309dd819d94aafbd9c932a719c7cc07170a2b3afe6f057691f"} Oct 11 00:54:18 crc kubenswrapper[4743]: I1011 00:54:18.490194 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-grqvq" event={"ID":"4a97f1cb-48e1-4049-be4a-0151201d5cc3","Type":"ContainerStarted","Data":"b6e329dd50e012fd16dd24b276ffff4a3af171353efe0063c760f03b5b0ae435"} Oct 11 00:54:18 crc kubenswrapper[4743]: I1011 
00:54:18.533362 4743 generic.go:334] "Generic (PLEG): container finished" podID="962b1e46-ba63-4aa2-9882-a04487a05813" containerID="057a072f6b313157d0743795a2f56c68aa7d55e7b92aaabe95560a120c6bc0cc" exitCode=0 Oct 11 00:54:18 crc kubenswrapper[4743]: I1011 00:54:18.536023 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29335725-pg67v" event={"ID":"962b1e46-ba63-4aa2-9882-a04487a05813","Type":"ContainerDied","Data":"057a072f6b313157d0743795a2f56c68aa7d55e7b92aaabe95560a120c6bc0cc"} Oct 11 00:54:18 crc kubenswrapper[4743]: I1011 00:54:18.539332 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 00:54:18 crc kubenswrapper[4743]: E1011 00:54:18.539698 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 00:54:19.039684156 +0000 UTC m=+153.692664553 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 00:54:18 crc kubenswrapper[4743]: I1011 00:54:18.563115 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ccjcp" Oct 11 00:54:18 crc kubenswrapper[4743]: I1011 00:54:18.580151 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wswrq" Oct 11 00:54:18 crc kubenswrapper[4743]: I1011 00:54:18.612389 4743 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Oct 11 00:54:18 crc kubenswrapper[4743]: I1011 00:54:18.643583 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-24m6m\" (UID: \"9d6b106d-6589-40f2-b694-eb158c541d82\") " pod="openshift-image-registry/image-registry-697d97f7c8-24m6m" Oct 11 00:54:18 crc kubenswrapper[4743]: E1011 00:54:18.650290 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-11 00:54:19.15027238 +0000 UTC m=+153.803252767 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-24m6m" (UID: "9d6b106d-6589-40f2-b694-eb158c541d82") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 00:54:18 crc kubenswrapper[4743]: I1011 00:54:18.710071 4743 patch_prober.go:28] interesting pod/router-default-5444994796-6srbj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 11 00:54:18 crc kubenswrapper[4743]: [-]has-synced failed: reason withheld Oct 11 00:54:18 crc kubenswrapper[4743]: [+]process-running ok Oct 11 00:54:18 crc kubenswrapper[4743]: healthz check failed Oct 11 00:54:18 crc kubenswrapper[4743]: I1011 00:54:18.710126 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6srbj" podUID="1ae9d5c5-6a97-483f-9b76-45c73259c85b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 11 00:54:18 crc kubenswrapper[4743]: I1011 00:54:18.745200 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 00:54:18 crc kubenswrapper[4743]: E1011 00:54:18.746467 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-11 00:54:19.246452565 +0000 UTC m=+153.899432962 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 00:54:18 crc kubenswrapper[4743]: I1011 00:54:18.850645 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-24m6m\" (UID: \"9d6b106d-6589-40f2-b694-eb158c541d82\") " pod="openshift-image-registry/image-registry-697d97f7c8-24m6m" Oct 11 00:54:18 crc kubenswrapper[4743]: E1011 00:54:18.850964 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-11 00:54:19.350952638 +0000 UTC m=+154.003933035 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-24m6m" (UID: "9d6b106d-6589-40f2-b694-eb158c541d82") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 00:54:18 crc kubenswrapper[4743]: I1011 00:54:18.953230 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 00:54:18 crc kubenswrapper[4743]: E1011 00:54:18.953623 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 00:54:19.453606716 +0000 UTC m=+154.106587113 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 00:54:18 crc kubenswrapper[4743]: I1011 00:54:18.955734 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-r8dmc"] Oct 11 00:54:18 crc kubenswrapper[4743]: W1011 00:54:18.975729 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c7ded1c_c0ce_47c4_9959_b95631f067ea.slice/crio-328acba2fcf7b182090b966a9107eff5a3cbb81f95c3ac070d3db395acd6530b WatchSource:0}: Error finding container 328acba2fcf7b182090b966a9107eff5a3cbb81f95c3ac070d3db395acd6530b: Status 404 returned error can't find the container with id 328acba2fcf7b182090b966a9107eff5a3cbb81f95c3ac070d3db395acd6530b Oct 11 00:54:19 crc kubenswrapper[4743]: I1011 00:54:19.055549 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-24m6m\" (UID: \"9d6b106d-6589-40f2-b694-eb158c541d82\") " pod="openshift-image-registry/image-registry-697d97f7c8-24m6m" Oct 11 00:54:19 crc kubenswrapper[4743]: E1011 00:54:19.055883 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-11 00:54:19.555854841 +0000 UTC m=+154.208835238 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-24m6m" (UID: "9d6b106d-6589-40f2-b694-eb158c541d82") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 00:54:19 crc kubenswrapper[4743]: I1011 00:54:19.058333 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-drlgz"] Oct 11 00:54:19 crc kubenswrapper[4743]: I1011 00:54:19.115792 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nvz7n"] Oct 11 00:54:19 crc kubenswrapper[4743]: I1011 00:54:19.139385 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hjs95"] Oct 11 00:54:19 crc kubenswrapper[4743]: I1011 00:54:19.156461 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 00:54:19 crc kubenswrapper[4743]: E1011 00:54:19.156619 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 00:54:19.656597802 +0000 UTC m=+154.309578199 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 00:54:19 crc kubenswrapper[4743]: I1011 00:54:19.156730 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-24m6m\" (UID: \"9d6b106d-6589-40f2-b694-eb158c541d82\") " pod="openshift-image-registry/image-registry-697d97f7c8-24m6m" Oct 11 00:54:19 crc kubenswrapper[4743]: E1011 00:54:19.157041 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-11 00:54:19.657033425 +0000 UTC m=+154.310013822 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-24m6m" (UID: "9d6b106d-6589-40f2-b694-eb158c541d82") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 00:54:19 crc kubenswrapper[4743]: W1011 00:54:19.205193 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde5b1941_3ffa_477d_9b7f_7e07fe3ed206.slice/crio-3312bbe422c232e5b9b9480145f8f718e20f139744526f4753b449dd623ed6de WatchSource:0}: Error finding container 3312bbe422c232e5b9b9480145f8f718e20f139744526f4753b449dd623ed6de: Status 404 returned error can't find the container with id 3312bbe422c232e5b9b9480145f8f718e20f139744526f4753b449dd623ed6de Oct 11 00:54:19 crc kubenswrapper[4743]: I1011 00:54:19.258305 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 00:54:19 crc kubenswrapper[4743]: E1011 00:54:19.258610 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-11 00:54:19.75859583 +0000 UTC m=+154.411576227 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 11 00:54:19 crc kubenswrapper[4743]: I1011 00:54:19.299794 4743 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-10-11T00:54:18.612412773Z","Handler":null,"Name":""} Oct 11 00:54:19 crc kubenswrapper[4743]: I1011 00:54:19.302093 4743 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Oct 11 00:54:19 crc kubenswrapper[4743]: I1011 00:54:19.302130 4743 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Oct 11 00:54:19 crc kubenswrapper[4743]: I1011 00:54:19.359708 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-24m6m\" (UID: \"9d6b106d-6589-40f2-b694-eb158c541d82\") " pod="openshift-image-registry/image-registry-697d97f7c8-24m6m" Oct 11 00:54:19 crc kubenswrapper[4743]: I1011 00:54:19.372428 4743 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 11 00:54:19 crc kubenswrapper[4743]: I1011 00:54:19.372460 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-24m6m\" (UID: \"9d6b106d-6589-40f2-b694-eb158c541d82\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-24m6m" Oct 11 00:54:19 crc kubenswrapper[4743]: I1011 00:54:19.390364 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jb4p2"] Oct 11 00:54:19 crc kubenswrapper[4743]: I1011 00:54:19.391352 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jb4p2" Oct 11 00:54:19 crc kubenswrapper[4743]: I1011 00:54:19.394367 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 11 00:54:19 crc kubenswrapper[4743]: I1011 00:54:19.395033 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-24m6m\" (UID: \"9d6b106d-6589-40f2-b694-eb158c541d82\") " pod="openshift-image-registry/image-registry-697d97f7c8-24m6m" Oct 11 00:54:19 crc kubenswrapper[4743]: I1011 00:54:19.400799 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jb4p2"] Oct 11 00:54:19 crc kubenswrapper[4743]: I1011 00:54:19.461083 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 11 00:54:19 crc kubenswrapper[4743]: I1011 00:54:19.475359 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 11 00:54:19 crc kubenswrapper[4743]: I1011 00:54:19.503131 4743 patch_prober.go:28] interesting pod/router-default-5444994796-6srbj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 11 00:54:19 crc kubenswrapper[4743]: [-]has-synced failed: reason withheld Oct 11 00:54:19 crc kubenswrapper[4743]: [+]process-running ok Oct 11 00:54:19 crc kubenswrapper[4743]: healthz check failed Oct 11 00:54:19 crc kubenswrapper[4743]: I1011 00:54:19.503187 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6srbj" podUID="1ae9d5c5-6a97-483f-9b76-45c73259c85b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 11 00:54:19 crc kubenswrapper[4743]: I1011 00:54:19.542047 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-grqvq" event={"ID":"4a97f1cb-48e1-4049-be4a-0151201d5cc3","Type":"ContainerStarted","Data":"f0cef01c289b5021551da0ba213c1c97691b27292ab349ad983ee9b258dbc493"} Oct 11 00:54:19 crc kubenswrapper[4743]: I1011 00:54:19.543950 4743 generic.go:334] "Generic (PLEG): container finished" 
podID="de5b1941-3ffa-477d-9b7f-7e07fe3ed206" containerID="04501ac8f93cace89c12dfe74e02008693776089e55e0d7f943c972cdad90534" exitCode=0 Oct 11 00:54:19 crc kubenswrapper[4743]: I1011 00:54:19.544052 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hjs95" event={"ID":"de5b1941-3ffa-477d-9b7f-7e07fe3ed206","Type":"ContainerDied","Data":"04501ac8f93cace89c12dfe74e02008693776089e55e0d7f943c972cdad90534"} Oct 11 00:54:19 crc kubenswrapper[4743]: I1011 00:54:19.544103 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hjs95" event={"ID":"de5b1941-3ffa-477d-9b7f-7e07fe3ed206","Type":"ContainerStarted","Data":"3312bbe422c232e5b9b9480145f8f718e20f139744526f4753b449dd623ed6de"} Oct 11 00:54:19 crc kubenswrapper[4743]: I1011 00:54:19.545775 4743 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 11 00:54:19 crc kubenswrapper[4743]: I1011 00:54:19.546911 4743 generic.go:334] "Generic (PLEG): container finished" podID="7c7ded1c-c0ce-47c4-9959-b95631f067ea" containerID="19e74fe5505ca96f8a8fb14c34ecf3090c0b787767ab0143fb8ffb55a5a10ef5" exitCode=0 Oct 11 00:54:19 crc kubenswrapper[4743]: I1011 00:54:19.546957 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r8dmc" event={"ID":"7c7ded1c-c0ce-47c4-9959-b95631f067ea","Type":"ContainerDied","Data":"19e74fe5505ca96f8a8fb14c34ecf3090c0b787767ab0143fb8ffb55a5a10ef5"} Oct 11 00:54:19 crc kubenswrapper[4743]: I1011 00:54:19.546978 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r8dmc" event={"ID":"7c7ded1c-c0ce-47c4-9959-b95631f067ea","Type":"ContainerStarted","Data":"328acba2fcf7b182090b966a9107eff5a3cbb81f95c3ac070d3db395acd6530b"} Oct 11 00:54:19 crc kubenswrapper[4743]: I1011 00:54:19.548879 4743 generic.go:334] "Generic (PLEG): container finished" 
podID="ec82a5a9-025d-43e4-a1f4-6b4245bcc7ef" containerID="9b1bda191d798619c8ae3467915792f9eb3ab97b8cd570fd22cc01523c75ac69" exitCode=0 Oct 11 00:54:19 crc kubenswrapper[4743]: I1011 00:54:19.548923 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-drlgz" event={"ID":"ec82a5a9-025d-43e4-a1f4-6b4245bcc7ef","Type":"ContainerDied","Data":"9b1bda191d798619c8ae3467915792f9eb3ab97b8cd570fd22cc01523c75ac69"} Oct 11 00:54:19 crc kubenswrapper[4743]: I1011 00:54:19.548940 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-drlgz" event={"ID":"ec82a5a9-025d-43e4-a1f4-6b4245bcc7ef","Type":"ContainerStarted","Data":"9e7d864ff7d6210f6fa8bc3f4089dbe1954dae45a103cf1acaac9b2f7a8a0f57"} Oct 11 00:54:19 crc kubenswrapper[4743]: I1011 00:54:19.551312 4743 generic.go:334] "Generic (PLEG): container finished" podID="712f7dea-8de1-43e3-802b-b4e9b521b0b6" containerID="77c6419678ea74b7572a35af5b0841ced6c9503ceade00d1246027649c8320cf" exitCode=0 Oct 11 00:54:19 crc kubenswrapper[4743]: I1011 00:54:19.552077 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nvz7n" event={"ID":"712f7dea-8de1-43e3-802b-b4e9b521b0b6","Type":"ContainerDied","Data":"77c6419678ea74b7572a35af5b0841ced6c9503ceade00d1246027649c8320cf"} Oct 11 00:54:19 crc kubenswrapper[4743]: I1011 00:54:19.552105 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nvz7n" event={"ID":"712f7dea-8de1-43e3-802b-b4e9b521b0b6","Type":"ContainerStarted","Data":"7abed6724087a9fbe160e30f62148a63c6bf1e5a8c3e875c4f8e2e5ca7cce252"} Oct 11 00:54:19 crc kubenswrapper[4743]: I1011 00:54:19.562726 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgfkz\" (UniqueName: \"kubernetes.io/projected/e405bfea-f17b-4c2b-b69e-fc7284876cdc-kube-api-access-xgfkz\") pod 
\"redhat-marketplace-jb4p2\" (UID: \"e405bfea-f17b-4c2b-b69e-fc7284876cdc\") " pod="openshift-marketplace/redhat-marketplace-jb4p2" Oct 11 00:54:19 crc kubenswrapper[4743]: I1011 00:54:19.562890 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e405bfea-f17b-4c2b-b69e-fc7284876cdc-utilities\") pod \"redhat-marketplace-jb4p2\" (UID: \"e405bfea-f17b-4c2b-b69e-fc7284876cdc\") " pod="openshift-marketplace/redhat-marketplace-jb4p2" Oct 11 00:54:19 crc kubenswrapper[4743]: I1011 00:54:19.562935 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e405bfea-f17b-4c2b-b69e-fc7284876cdc-catalog-content\") pod \"redhat-marketplace-jb4p2\" (UID: \"e405bfea-f17b-4c2b-b69e-fc7284876cdc\") " pod="openshift-marketplace/redhat-marketplace-jb4p2" Oct 11 00:54:19 crc kubenswrapper[4743]: I1011 00:54:19.568600 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-grqvq" podStartSLOduration=10.568587454 podStartE2EDuration="10.568587454s" podCreationTimestamp="2025-10-11 00:54:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 00:54:19.567541682 +0000 UTC m=+154.220522079" watchObservedRunningTime="2025-10-11 00:54:19.568587454 +0000 UTC m=+154.221567841" Oct 11 00:54:19 crc kubenswrapper[4743]: I1011 00:54:19.626757 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-24m6m" Oct 11 00:54:19 crc kubenswrapper[4743]: I1011 00:54:19.668237 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e405bfea-f17b-4c2b-b69e-fc7284876cdc-utilities\") pod \"redhat-marketplace-jb4p2\" (UID: \"e405bfea-f17b-4c2b-b69e-fc7284876cdc\") " pod="openshift-marketplace/redhat-marketplace-jb4p2" Oct 11 00:54:19 crc kubenswrapper[4743]: I1011 00:54:19.668314 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e405bfea-f17b-4c2b-b69e-fc7284876cdc-catalog-content\") pod \"redhat-marketplace-jb4p2\" (UID: \"e405bfea-f17b-4c2b-b69e-fc7284876cdc\") " pod="openshift-marketplace/redhat-marketplace-jb4p2" Oct 11 00:54:19 crc kubenswrapper[4743]: I1011 00:54:19.668456 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgfkz\" (UniqueName: \"kubernetes.io/projected/e405bfea-f17b-4c2b-b69e-fc7284876cdc-kube-api-access-xgfkz\") pod \"redhat-marketplace-jb4p2\" (UID: \"e405bfea-f17b-4c2b-b69e-fc7284876cdc\") " pod="openshift-marketplace/redhat-marketplace-jb4p2" Oct 11 00:54:19 crc kubenswrapper[4743]: I1011 00:54:19.669312 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e405bfea-f17b-4c2b-b69e-fc7284876cdc-utilities\") pod \"redhat-marketplace-jb4p2\" (UID: \"e405bfea-f17b-4c2b-b69e-fc7284876cdc\") " pod="openshift-marketplace/redhat-marketplace-jb4p2" Oct 11 00:54:19 crc kubenswrapper[4743]: I1011 00:54:19.669636 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e405bfea-f17b-4c2b-b69e-fc7284876cdc-catalog-content\") pod \"redhat-marketplace-jb4p2\" (UID: \"e405bfea-f17b-4c2b-b69e-fc7284876cdc\") " 
pod="openshift-marketplace/redhat-marketplace-jb4p2" Oct 11 00:54:19 crc kubenswrapper[4743]: I1011 00:54:19.704680 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgfkz\" (UniqueName: \"kubernetes.io/projected/e405bfea-f17b-4c2b-b69e-fc7284876cdc-kube-api-access-xgfkz\") pod \"redhat-marketplace-jb4p2\" (UID: \"e405bfea-f17b-4c2b-b69e-fc7284876cdc\") " pod="openshift-marketplace/redhat-marketplace-jb4p2" Oct 11 00:54:19 crc kubenswrapper[4743]: I1011 00:54:19.745431 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jb4p2" Oct 11 00:54:19 crc kubenswrapper[4743]: I1011 00:54:19.784131 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29335725-pg67v" Oct 11 00:54:19 crc kubenswrapper[4743]: I1011 00:54:19.795384 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-67kfm"] Oct 11 00:54:19 crc kubenswrapper[4743]: E1011 00:54:19.795614 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="962b1e46-ba63-4aa2-9882-a04487a05813" containerName="collect-profiles" Oct 11 00:54:19 crc kubenswrapper[4743]: I1011 00:54:19.795630 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="962b1e46-ba63-4aa2-9882-a04487a05813" containerName="collect-profiles" Oct 11 00:54:19 crc kubenswrapper[4743]: I1011 00:54:19.795755 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="962b1e46-ba63-4aa2-9882-a04487a05813" containerName="collect-profiles" Oct 11 00:54:19 crc kubenswrapper[4743]: I1011 00:54:19.796659 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-67kfm" Oct 11 00:54:19 crc kubenswrapper[4743]: I1011 00:54:19.816567 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-67kfm"] Oct 11 00:54:19 crc kubenswrapper[4743]: I1011 00:54:19.870400 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/962b1e46-ba63-4aa2-9882-a04487a05813-secret-volume\") pod \"962b1e46-ba63-4aa2-9882-a04487a05813\" (UID: \"962b1e46-ba63-4aa2-9882-a04487a05813\") " Oct 11 00:54:19 crc kubenswrapper[4743]: I1011 00:54:19.870460 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/962b1e46-ba63-4aa2-9882-a04487a05813-config-volume\") pod \"962b1e46-ba63-4aa2-9882-a04487a05813\" (UID: \"962b1e46-ba63-4aa2-9882-a04487a05813\") " Oct 11 00:54:19 crc kubenswrapper[4743]: I1011 00:54:19.870488 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n8msh\" (UniqueName: \"kubernetes.io/projected/962b1e46-ba63-4aa2-9882-a04487a05813-kube-api-access-n8msh\") pod \"962b1e46-ba63-4aa2-9882-a04487a05813\" (UID: \"962b1e46-ba63-4aa2-9882-a04487a05813\") " Oct 11 00:54:19 crc kubenswrapper[4743]: I1011 00:54:19.879928 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/962b1e46-ba63-4aa2-9882-a04487a05813-config-volume" (OuterVolumeSpecName: "config-volume") pod "962b1e46-ba63-4aa2-9882-a04487a05813" (UID: "962b1e46-ba63-4aa2-9882-a04487a05813"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 00:54:19 crc kubenswrapper[4743]: I1011 00:54:19.884091 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-24m6m"] Oct 11 00:54:19 crc kubenswrapper[4743]: I1011 00:54:19.885557 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/962b1e46-ba63-4aa2-9882-a04487a05813-kube-api-access-n8msh" (OuterVolumeSpecName: "kube-api-access-n8msh") pod "962b1e46-ba63-4aa2-9882-a04487a05813" (UID: "962b1e46-ba63-4aa2-9882-a04487a05813"). InnerVolumeSpecName "kube-api-access-n8msh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 00:54:19 crc kubenswrapper[4743]: I1011 00:54:19.892813 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/962b1e46-ba63-4aa2-9882-a04487a05813-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "962b1e46-ba63-4aa2-9882-a04487a05813" (UID: "962b1e46-ba63-4aa2-9882-a04487a05813"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 00:54:19 crc kubenswrapper[4743]: I1011 00:54:19.893672 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 11 00:54:19 crc kubenswrapper[4743]: I1011 00:54:19.895525 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 11 00:54:19 crc kubenswrapper[4743]: I1011 00:54:19.897377 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Oct 11 00:54:19 crc kubenswrapper[4743]: I1011 00:54:19.897793 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Oct 11 00:54:19 crc kubenswrapper[4743]: I1011 00:54:19.900026 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 11 00:54:19 crc kubenswrapper[4743]: I1011 00:54:19.971556 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b7adc79-4e39-4ed1-8024-3cd0bb96f661-catalog-content\") pod \"redhat-marketplace-67kfm\" (UID: \"9b7adc79-4e39-4ed1-8024-3cd0bb96f661\") " pod="openshift-marketplace/redhat-marketplace-67kfm" Oct 11 00:54:19 crc kubenswrapper[4743]: I1011 00:54:19.971599 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5xp4\" (UniqueName: \"kubernetes.io/projected/9b7adc79-4e39-4ed1-8024-3cd0bb96f661-kube-api-access-n5xp4\") pod \"redhat-marketplace-67kfm\" (UID: \"9b7adc79-4e39-4ed1-8024-3cd0bb96f661\") " pod="openshift-marketplace/redhat-marketplace-67kfm" Oct 11 00:54:19 crc kubenswrapper[4743]: I1011 00:54:19.971638 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b7adc79-4e39-4ed1-8024-3cd0bb96f661-utilities\") pod \"redhat-marketplace-67kfm\" (UID: \"9b7adc79-4e39-4ed1-8024-3cd0bb96f661\") " pod="openshift-marketplace/redhat-marketplace-67kfm" Oct 11 00:54:19 crc kubenswrapper[4743]: I1011 00:54:19.971768 4743 reconciler_common.go:293] "Volume 
detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/962b1e46-ba63-4aa2-9882-a04487a05813-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 11 00:54:19 crc kubenswrapper[4743]: I1011 00:54:19.971797 4743 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/962b1e46-ba63-4aa2-9882-a04487a05813-config-volume\") on node \"crc\" DevicePath \"\"" Oct 11 00:54:19 crc kubenswrapper[4743]: I1011 00:54:19.971810 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n8msh\" (UniqueName: \"kubernetes.io/projected/962b1e46-ba63-4aa2-9882-a04487a05813-kube-api-access-n8msh\") on node \"crc\" DevicePath \"\"" Oct 11 00:54:19 crc kubenswrapper[4743]: I1011 00:54:19.992813 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jb4p2"] Oct 11 00:54:19 crc kubenswrapper[4743]: W1011 00:54:19.998851 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode405bfea_f17b_4c2b_b69e_fc7284876cdc.slice/crio-6b312ec61258615a36c62c7ff87b1fb25cfb7a42fdfe0803983be183945e4eb1 WatchSource:0}: Error finding container 6b312ec61258615a36c62c7ff87b1fb25cfb7a42fdfe0803983be183945e4eb1: Status 404 returned error can't find the container with id 6b312ec61258615a36c62c7ff87b1fb25cfb7a42fdfe0803983be183945e4eb1 Oct 11 00:54:20 crc kubenswrapper[4743]: I1011 00:54:20.072410 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/414ae978-d3eb-4912-918a-72f472a48b45-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"414ae978-d3eb-4912-918a-72f472a48b45\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 11 00:54:20 crc kubenswrapper[4743]: I1011 00:54:20.072761 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/414ae978-d3eb-4912-918a-72f472a48b45-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"414ae978-d3eb-4912-918a-72f472a48b45\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 11 00:54:20 crc kubenswrapper[4743]: I1011 00:54:20.072831 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b7adc79-4e39-4ed1-8024-3cd0bb96f661-catalog-content\") pod \"redhat-marketplace-67kfm\" (UID: \"9b7adc79-4e39-4ed1-8024-3cd0bb96f661\") " pod="openshift-marketplace/redhat-marketplace-67kfm" Oct 11 00:54:20 crc kubenswrapper[4743]: I1011 00:54:20.072889 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5xp4\" (UniqueName: \"kubernetes.io/projected/9b7adc79-4e39-4ed1-8024-3cd0bb96f661-kube-api-access-n5xp4\") pod \"redhat-marketplace-67kfm\" (UID: \"9b7adc79-4e39-4ed1-8024-3cd0bb96f661\") " pod="openshift-marketplace/redhat-marketplace-67kfm" Oct 11 00:54:20 crc kubenswrapper[4743]: I1011 00:54:20.072932 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b7adc79-4e39-4ed1-8024-3cd0bb96f661-utilities\") pod \"redhat-marketplace-67kfm\" (UID: \"9b7adc79-4e39-4ed1-8024-3cd0bb96f661\") " pod="openshift-marketplace/redhat-marketplace-67kfm" Oct 11 00:54:20 crc kubenswrapper[4743]: I1011 00:54:20.075251 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b7adc79-4e39-4ed1-8024-3cd0bb96f661-utilities\") pod \"redhat-marketplace-67kfm\" (UID: \"9b7adc79-4e39-4ed1-8024-3cd0bb96f661\") " pod="openshift-marketplace/redhat-marketplace-67kfm" Oct 11 00:54:20 crc kubenswrapper[4743]: I1011 00:54:20.075556 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/9b7adc79-4e39-4ed1-8024-3cd0bb96f661-catalog-content\") pod \"redhat-marketplace-67kfm\" (UID: \"9b7adc79-4e39-4ed1-8024-3cd0bb96f661\") " pod="openshift-marketplace/redhat-marketplace-67kfm" Oct 11 00:54:20 crc kubenswrapper[4743]: I1011 00:54:20.090984 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5xp4\" (UniqueName: \"kubernetes.io/projected/9b7adc79-4e39-4ed1-8024-3cd0bb96f661-kube-api-access-n5xp4\") pod \"redhat-marketplace-67kfm\" (UID: \"9b7adc79-4e39-4ed1-8024-3cd0bb96f661\") " pod="openshift-marketplace/redhat-marketplace-67kfm" Oct 11 00:54:20 crc kubenswrapper[4743]: I1011 00:54:20.105013 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Oct 11 00:54:20 crc kubenswrapper[4743]: I1011 00:54:20.124929 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-67kfm" Oct 11 00:54:20 crc kubenswrapper[4743]: I1011 00:54:20.174378 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/414ae978-d3eb-4912-918a-72f472a48b45-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"414ae978-d3eb-4912-918a-72f472a48b45\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 11 00:54:20 crc kubenswrapper[4743]: I1011 00:54:20.174463 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/414ae978-d3eb-4912-918a-72f472a48b45-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"414ae978-d3eb-4912-918a-72f472a48b45\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 11 00:54:20 crc kubenswrapper[4743]: I1011 00:54:20.174583 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/414ae978-d3eb-4912-918a-72f472a48b45-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"414ae978-d3eb-4912-918a-72f472a48b45\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 11 00:54:20 crc kubenswrapper[4743]: I1011 00:54:20.194830 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/414ae978-d3eb-4912-918a-72f472a48b45-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"414ae978-d3eb-4912-918a-72f472a48b45\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 11 00:54:20 crc kubenswrapper[4743]: I1011 00:54:20.212926 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 11 00:54:20 crc kubenswrapper[4743]: I1011 00:54:20.361986 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-67kfm"] Oct 11 00:54:20 crc kubenswrapper[4743]: I1011 00:54:20.481473 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 11 00:54:20 crc kubenswrapper[4743]: W1011 00:54:20.490344 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod414ae978_d3eb_4912_918a_72f472a48b45.slice/crio-ee0399f404c3580026ac4083097132ec7d44cf5aaee2aacab64dfe7a7dcffa82 WatchSource:0}: Error finding container ee0399f404c3580026ac4083097132ec7d44cf5aaee2aacab64dfe7a7dcffa82: Status 404 returned error can't find the container with id ee0399f404c3580026ac4083097132ec7d44cf5aaee2aacab64dfe7a7dcffa82 Oct 11 00:54:20 crc kubenswrapper[4743]: I1011 00:54:20.500661 4743 patch_prober.go:28] interesting pod/router-default-5444994796-6srbj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: 
reason withheld Oct 11 00:54:20 crc kubenswrapper[4743]: [-]has-synced failed: reason withheld Oct 11 00:54:20 crc kubenswrapper[4743]: [+]process-running ok Oct 11 00:54:20 crc kubenswrapper[4743]: healthz check failed Oct 11 00:54:20 crc kubenswrapper[4743]: I1011 00:54:20.500695 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6srbj" podUID="1ae9d5c5-6a97-483f-9b76-45c73259c85b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 11 00:54:20 crc kubenswrapper[4743]: I1011 00:54:20.563063 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29335725-pg67v" event={"ID":"962b1e46-ba63-4aa2-9882-a04487a05813","Type":"ContainerDied","Data":"75471898b2fb952c943d0dfc4adbf9a88d039d341607e5b2f83e2c56b0e2d22f"} Oct 11 00:54:20 crc kubenswrapper[4743]: I1011 00:54:20.563101 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75471898b2fb952c943d0dfc4adbf9a88d039d341607e5b2f83e2c56b0e2d22f" Oct 11 00:54:20 crc kubenswrapper[4743]: I1011 00:54:20.563160 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29335725-pg67v" Oct 11 00:54:20 crc kubenswrapper[4743]: I1011 00:54:20.565662 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-24m6m" event={"ID":"9d6b106d-6589-40f2-b694-eb158c541d82","Type":"ContainerStarted","Data":"3da0b8283ad0d2002a73e40a09d147bb70dee93ceddfdd6006d466b69da78fde"} Oct 11 00:54:20 crc kubenswrapper[4743]: I1011 00:54:20.565716 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-24m6m" event={"ID":"9d6b106d-6589-40f2-b694-eb158c541d82","Type":"ContainerStarted","Data":"d04e702924fa574dfda36716020c9ef9cc9af74bde8d2f67803333dd7998b185"} Oct 11 00:54:20 crc kubenswrapper[4743]: I1011 00:54:20.565777 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-24m6m" Oct 11 00:54:20 crc kubenswrapper[4743]: I1011 00:54:20.567296 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-67kfm" event={"ID":"9b7adc79-4e39-4ed1-8024-3cd0bb96f661","Type":"ContainerStarted","Data":"34465edd74ca7038e1977d3063f90f17baa2e488937a9e4de1a535ac180dce98"} Oct 11 00:54:20 crc kubenswrapper[4743]: I1011 00:54:20.571756 4743 generic.go:334] "Generic (PLEG): container finished" podID="e405bfea-f17b-4c2b-b69e-fc7284876cdc" containerID="c6b0ce73a1a2439c1501173eb8f60a0b29e4984619e479a4d11d2622e3ac8aad" exitCode=0 Oct 11 00:54:20 crc kubenswrapper[4743]: I1011 00:54:20.571800 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jb4p2" event={"ID":"e405bfea-f17b-4c2b-b69e-fc7284876cdc","Type":"ContainerDied","Data":"c6b0ce73a1a2439c1501173eb8f60a0b29e4984619e479a4d11d2622e3ac8aad"} Oct 11 00:54:20 crc kubenswrapper[4743]: I1011 00:54:20.571820 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-jb4p2" event={"ID":"e405bfea-f17b-4c2b-b69e-fc7284876cdc","Type":"ContainerStarted","Data":"6b312ec61258615a36c62c7ff87b1fb25cfb7a42fdfe0803983be183945e4eb1"} Oct 11 00:54:20 crc kubenswrapper[4743]: I1011 00:54:20.574688 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"414ae978-d3eb-4912-918a-72f472a48b45","Type":"ContainerStarted","Data":"ee0399f404c3580026ac4083097132ec7d44cf5aaee2aacab64dfe7a7dcffa82"} Oct 11 00:54:20 crc kubenswrapper[4743]: I1011 00:54:20.591585 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-24m6m" podStartSLOduration=134.591567503 podStartE2EDuration="2m14.591567503s" podCreationTimestamp="2025-10-11 00:52:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 00:54:20.579564636 +0000 UTC m=+155.232545033" watchObservedRunningTime="2025-10-11 00:54:20.591567503 +0000 UTC m=+155.244547900" Oct 11 00:54:20 crc kubenswrapper[4743]: I1011 00:54:20.595747 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cb8zp"] Oct 11 00:54:20 crc kubenswrapper[4743]: I1011 00:54:20.601241 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cb8zp" Oct 11 00:54:20 crc kubenswrapper[4743]: I1011 00:54:20.613328 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 11 00:54:20 crc kubenswrapper[4743]: I1011 00:54:20.621920 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cb8zp"] Oct 11 00:54:20 crc kubenswrapper[4743]: I1011 00:54:20.684614 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b87ffc73-065d-4570-867f-b91e442a4c73-catalog-content\") pod \"redhat-operators-cb8zp\" (UID: \"b87ffc73-065d-4570-867f-b91e442a4c73\") " pod="openshift-marketplace/redhat-operators-cb8zp" Oct 11 00:54:20 crc kubenswrapper[4743]: I1011 00:54:20.684670 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b87ffc73-065d-4570-867f-b91e442a4c73-utilities\") pod \"redhat-operators-cb8zp\" (UID: \"b87ffc73-065d-4570-867f-b91e442a4c73\") " pod="openshift-marketplace/redhat-operators-cb8zp" Oct 11 00:54:20 crc kubenswrapper[4743]: I1011 00:54:20.685219 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qm4k7\" (UniqueName: \"kubernetes.io/projected/b87ffc73-065d-4570-867f-b91e442a4c73-kube-api-access-qm4k7\") pod \"redhat-operators-cb8zp\" (UID: \"b87ffc73-065d-4570-867f-b91e442a4c73\") " pod="openshift-marketplace/redhat-operators-cb8zp" Oct 11 00:54:20 crc kubenswrapper[4743]: I1011 00:54:20.786393 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b87ffc73-065d-4570-867f-b91e442a4c73-utilities\") pod \"redhat-operators-cb8zp\" (UID: \"b87ffc73-065d-4570-867f-b91e442a4c73\") " 
pod="openshift-marketplace/redhat-operators-cb8zp" Oct 11 00:54:20 crc kubenswrapper[4743]: I1011 00:54:20.786472 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qm4k7\" (UniqueName: \"kubernetes.io/projected/b87ffc73-065d-4570-867f-b91e442a4c73-kube-api-access-qm4k7\") pod \"redhat-operators-cb8zp\" (UID: \"b87ffc73-065d-4570-867f-b91e442a4c73\") " pod="openshift-marketplace/redhat-operators-cb8zp" Oct 11 00:54:20 crc kubenswrapper[4743]: I1011 00:54:20.786548 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b87ffc73-065d-4570-867f-b91e442a4c73-catalog-content\") pod \"redhat-operators-cb8zp\" (UID: \"b87ffc73-065d-4570-867f-b91e442a4c73\") " pod="openshift-marketplace/redhat-operators-cb8zp" Oct 11 00:54:20 crc kubenswrapper[4743]: I1011 00:54:20.786909 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b87ffc73-065d-4570-867f-b91e442a4c73-utilities\") pod \"redhat-operators-cb8zp\" (UID: \"b87ffc73-065d-4570-867f-b91e442a4c73\") " pod="openshift-marketplace/redhat-operators-cb8zp" Oct 11 00:54:20 crc kubenswrapper[4743]: I1011 00:54:20.786940 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b87ffc73-065d-4570-867f-b91e442a4c73-catalog-content\") pod \"redhat-operators-cb8zp\" (UID: \"b87ffc73-065d-4570-867f-b91e442a4c73\") " pod="openshift-marketplace/redhat-operators-cb8zp" Oct 11 00:54:20 crc kubenswrapper[4743]: I1011 00:54:20.828917 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qm4k7\" (UniqueName: \"kubernetes.io/projected/b87ffc73-065d-4570-867f-b91e442a4c73-kube-api-access-qm4k7\") pod \"redhat-operators-cb8zp\" (UID: \"b87ffc73-065d-4570-867f-b91e442a4c73\") " pod="openshift-marketplace/redhat-operators-cb8zp" Oct 
11 00:54:20 crc kubenswrapper[4743]: I1011 00:54:20.970138 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cb8zp" Oct 11 00:54:20 crc kubenswrapper[4743]: I1011 00:54:20.995829 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wxvl7"] Oct 11 00:54:20 crc kubenswrapper[4743]: I1011 00:54:20.997537 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wxvl7" Oct 11 00:54:21 crc kubenswrapper[4743]: I1011 00:54:21.003219 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wxvl7"] Oct 11 00:54:21 crc kubenswrapper[4743]: I1011 00:54:21.090072 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d89bc6e-6240-4acc-866a-347de2b7bc0a-utilities\") pod \"redhat-operators-wxvl7\" (UID: \"7d89bc6e-6240-4acc-866a-347de2b7bc0a\") " pod="openshift-marketplace/redhat-operators-wxvl7" Oct 11 00:54:21 crc kubenswrapper[4743]: I1011 00:54:21.090123 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98sjk\" (UniqueName: \"kubernetes.io/projected/7d89bc6e-6240-4acc-866a-347de2b7bc0a-kube-api-access-98sjk\") pod \"redhat-operators-wxvl7\" (UID: \"7d89bc6e-6240-4acc-866a-347de2b7bc0a\") " pod="openshift-marketplace/redhat-operators-wxvl7" Oct 11 00:54:21 crc kubenswrapper[4743]: I1011 00:54:21.090166 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d89bc6e-6240-4acc-866a-347de2b7bc0a-catalog-content\") pod \"redhat-operators-wxvl7\" (UID: \"7d89bc6e-6240-4acc-866a-347de2b7bc0a\") " pod="openshift-marketplace/redhat-operators-wxvl7" Oct 11 00:54:21 crc kubenswrapper[4743]: I1011 
00:54:21.191854 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d89bc6e-6240-4acc-866a-347de2b7bc0a-catalog-content\") pod \"redhat-operators-wxvl7\" (UID: \"7d89bc6e-6240-4acc-866a-347de2b7bc0a\") " pod="openshift-marketplace/redhat-operators-wxvl7" Oct 11 00:54:21 crc kubenswrapper[4743]: I1011 00:54:21.191966 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d89bc6e-6240-4acc-866a-347de2b7bc0a-utilities\") pod \"redhat-operators-wxvl7\" (UID: \"7d89bc6e-6240-4acc-866a-347de2b7bc0a\") " pod="openshift-marketplace/redhat-operators-wxvl7" Oct 11 00:54:21 crc kubenswrapper[4743]: I1011 00:54:21.191993 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98sjk\" (UniqueName: \"kubernetes.io/projected/7d89bc6e-6240-4acc-866a-347de2b7bc0a-kube-api-access-98sjk\") pod \"redhat-operators-wxvl7\" (UID: \"7d89bc6e-6240-4acc-866a-347de2b7bc0a\") " pod="openshift-marketplace/redhat-operators-wxvl7" Oct 11 00:54:21 crc kubenswrapper[4743]: I1011 00:54:21.194091 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d89bc6e-6240-4acc-866a-347de2b7bc0a-utilities\") pod \"redhat-operators-wxvl7\" (UID: \"7d89bc6e-6240-4acc-866a-347de2b7bc0a\") " pod="openshift-marketplace/redhat-operators-wxvl7" Oct 11 00:54:21 crc kubenswrapper[4743]: I1011 00:54:21.194425 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d89bc6e-6240-4acc-866a-347de2b7bc0a-catalog-content\") pod \"redhat-operators-wxvl7\" (UID: \"7d89bc6e-6240-4acc-866a-347de2b7bc0a\") " pod="openshift-marketplace/redhat-operators-wxvl7" Oct 11 00:54:21 crc kubenswrapper[4743]: I1011 00:54:21.231254 4743 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-98sjk\" (UniqueName: \"kubernetes.io/projected/7d89bc6e-6240-4acc-866a-347de2b7bc0a-kube-api-access-98sjk\") pod \"redhat-operators-wxvl7\" (UID: \"7d89bc6e-6240-4acc-866a-347de2b7bc0a\") " pod="openshift-marketplace/redhat-operators-wxvl7" Oct 11 00:54:21 crc kubenswrapper[4743]: I1011 00:54:21.300411 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cb8zp"] Oct 11 00:54:21 crc kubenswrapper[4743]: I1011 00:54:21.353086 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wxvl7" Oct 11 00:54:21 crc kubenswrapper[4743]: I1011 00:54:21.507687 4743 patch_prober.go:28] interesting pod/router-default-5444994796-6srbj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 11 00:54:21 crc kubenswrapper[4743]: [-]has-synced failed: reason withheld Oct 11 00:54:21 crc kubenswrapper[4743]: [+]process-running ok Oct 11 00:54:21 crc kubenswrapper[4743]: healthz check failed Oct 11 00:54:21 crc kubenswrapper[4743]: I1011 00:54:21.507743 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6srbj" podUID="1ae9d5c5-6a97-483f-9b76-45c73259c85b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 11 00:54:21 crc kubenswrapper[4743]: I1011 00:54:21.587427 4743 generic.go:334] "Generic (PLEG): container finished" podID="b87ffc73-065d-4570-867f-b91e442a4c73" containerID="32626fd9833452489f8dfdc37d17299e3395c605374e40511c31d6f063029bfb" exitCode=0 Oct 11 00:54:21 crc kubenswrapper[4743]: I1011 00:54:21.587747 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cb8zp" 
event={"ID":"b87ffc73-065d-4570-867f-b91e442a4c73","Type":"ContainerDied","Data":"32626fd9833452489f8dfdc37d17299e3395c605374e40511c31d6f063029bfb"} Oct 11 00:54:21 crc kubenswrapper[4743]: I1011 00:54:21.587773 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cb8zp" event={"ID":"b87ffc73-065d-4570-867f-b91e442a4c73","Type":"ContainerStarted","Data":"4240f01a773773fadcc9fa0e5d12298e05b283f5f1be4ba0ccd1e69fb22dc336"} Oct 11 00:54:21 crc kubenswrapper[4743]: I1011 00:54:21.598602 4743 generic.go:334] "Generic (PLEG): container finished" podID="9b7adc79-4e39-4ed1-8024-3cd0bb96f661" containerID="c9fc6e92823ed599e63c13206db29c911e55943ed38c0edd5b6155f86f572b3a" exitCode=0 Oct 11 00:54:21 crc kubenswrapper[4743]: I1011 00:54:21.598647 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-67kfm" event={"ID":"9b7adc79-4e39-4ed1-8024-3cd0bb96f661","Type":"ContainerDied","Data":"c9fc6e92823ed599e63c13206db29c911e55943ed38c0edd5b6155f86f572b3a"} Oct 11 00:54:21 crc kubenswrapper[4743]: I1011 00:54:21.627432 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"414ae978-d3eb-4912-918a-72f472a48b45","Type":"ContainerStarted","Data":"33a422a0a0fadbe5806a616c19f915e96c281432f32d7f78538052490959c106"} Oct 11 00:54:21 crc kubenswrapper[4743]: I1011 00:54:21.640456 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wxvl7"] Oct 11 00:54:21 crc kubenswrapper[4743]: I1011 00:54:21.653445 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.653427372 podStartE2EDuration="2.653427372s" podCreationTimestamp="2025-10-11 00:54:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-11 00:54:21.650422983 +0000 UTC m=+156.303403370" watchObservedRunningTime="2025-10-11 00:54:21.653427372 +0000 UTC m=+156.306407769" Oct 11 00:54:21 crc kubenswrapper[4743]: I1011 00:54:21.659537 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-xwxfx" Oct 11 00:54:21 crc kubenswrapper[4743]: I1011 00:54:21.664518 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-xwxfx" Oct 11 00:54:21 crc kubenswrapper[4743]: I1011 00:54:21.666584 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" Oct 11 00:54:21 crc kubenswrapper[4743]: I1011 00:54:21.695018 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-468m5" Oct 11 00:54:21 crc kubenswrapper[4743]: I1011 00:54:21.695053 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-468m5" Oct 11 00:54:21 crc kubenswrapper[4743]: I1011 00:54:21.696411 4743 patch_prober.go:28] interesting pod/console-f9d7485db-468m5 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.24:8443/health\": dial tcp 10.217.0.24:8443: connect: connection refused" start-of-body= Oct 11 00:54:21 crc kubenswrapper[4743]: I1011 00:54:21.696445 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-468m5" podUID="b5776f9a-8455-4c34-8496-0b4c4e821135" containerName="console" probeResult="failure" output="Get \"https://10.217.0.24:8443/health\": dial tcp 10.217.0.24:8443: connect: connection refused" Oct 11 00:54:21 crc kubenswrapper[4743]: I1011 00:54:21.804364 4743 patch_prober.go:28] interesting pod/downloads-7954f5f757-qksm9 container/download-server namespace/openshift-console: Readiness probe 
status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Oct 11 00:54:21 crc kubenswrapper[4743]: I1011 00:54:21.804417 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qksm9" podUID="e8a051b9-029d-4b92-a9a1-380c8d18f051" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Oct 11 00:54:21 crc kubenswrapper[4743]: I1011 00:54:21.804934 4743 patch_prober.go:28] interesting pod/downloads-7954f5f757-qksm9 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Oct 11 00:54:21 crc kubenswrapper[4743]: I1011 00:54:21.804958 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-qksm9" podUID="e8a051b9-029d-4b92-a9a1-380c8d18f051" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Oct 11 00:54:22 crc kubenswrapper[4743]: I1011 00:54:22.498149 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-6srbj" Oct 11 00:54:22 crc kubenswrapper[4743]: I1011 00:54:22.501434 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-6srbj" Oct 11 00:54:22 crc kubenswrapper[4743]: I1011 00:54:22.581981 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-xgtfn" Oct 11 00:54:22 crc kubenswrapper[4743]: I1011 00:54:22.616099 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 11 00:54:22 crc kubenswrapper[4743]: I1011 
00:54:22.616826 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 11 00:54:22 crc kubenswrapper[4743]: I1011 00:54:22.621949 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 11 00:54:22 crc kubenswrapper[4743]: I1011 00:54:22.622785 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Oct 11 00:54:22 crc kubenswrapper[4743]: I1011 00:54:22.623026 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Oct 11 00:54:22 crc kubenswrapper[4743]: I1011 00:54:22.635584 4743 generic.go:334] "Generic (PLEG): container finished" podID="7d89bc6e-6240-4acc-866a-347de2b7bc0a" containerID="2f4d3c43be1e46ee9f55df140627d0c9a36385c5cc6061cf122657b8b3767fa8" exitCode=0 Oct 11 00:54:22 crc kubenswrapper[4743]: I1011 00:54:22.635628 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wxvl7" event={"ID":"7d89bc6e-6240-4acc-866a-347de2b7bc0a","Type":"ContainerDied","Data":"2f4d3c43be1e46ee9f55df140627d0c9a36385c5cc6061cf122657b8b3767fa8"} Oct 11 00:54:22 crc kubenswrapper[4743]: I1011 00:54:22.635669 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wxvl7" event={"ID":"7d89bc6e-6240-4acc-866a-347de2b7bc0a","Type":"ContainerStarted","Data":"400e311e0c0d4ba65bfc9d826bca26d578b7f0ce1503c1dc8992563f0af5b5e0"} Oct 11 00:54:22 crc kubenswrapper[4743]: I1011 00:54:22.643886 4743 generic.go:334] "Generic (PLEG): container finished" podID="414ae978-d3eb-4912-918a-72f472a48b45" containerID="33a422a0a0fadbe5806a616c19f915e96c281432f32d7f78538052490959c106" exitCode=0 Oct 11 00:54:22 crc kubenswrapper[4743]: I1011 00:54:22.643992 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"414ae978-d3eb-4912-918a-72f472a48b45","Type":"ContainerDied","Data":"33a422a0a0fadbe5806a616c19f915e96c281432f32d7f78538052490959c106"} Oct 11 00:54:22 crc kubenswrapper[4743]: I1011 00:54:22.646353 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-6srbj" Oct 11 00:54:22 crc kubenswrapper[4743]: I1011 00:54:22.728389 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9862ed4b-81c0-40c0-80de-863846831655-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"9862ed4b-81c0-40c0-80de-863846831655\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 11 00:54:22 crc kubenswrapper[4743]: I1011 00:54:22.728527 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9862ed4b-81c0-40c0-80de-863846831655-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"9862ed4b-81c0-40c0-80de-863846831655\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 11 00:54:22 crc kubenswrapper[4743]: I1011 00:54:22.829894 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9862ed4b-81c0-40c0-80de-863846831655-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"9862ed4b-81c0-40c0-80de-863846831655\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 11 00:54:22 crc kubenswrapper[4743]: I1011 00:54:22.829960 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9862ed4b-81c0-40c0-80de-863846831655-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"9862ed4b-81c0-40c0-80de-863846831655\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 
11 00:54:22 crc kubenswrapper[4743]: I1011 00:54:22.830053 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9862ed4b-81c0-40c0-80de-863846831655-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"9862ed4b-81c0-40c0-80de-863846831655\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 11 00:54:22 crc kubenswrapper[4743]: I1011 00:54:22.866604 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9862ed4b-81c0-40c0-80de-863846831655-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"9862ed4b-81c0-40c0-80de-863846831655\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 11 00:54:22 crc kubenswrapper[4743]: I1011 00:54:22.981222 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 11 00:54:23 crc kubenswrapper[4743]: I1011 00:54:23.851682 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 11 00:54:24 crc kubenswrapper[4743]: I1011 00:54:24.089994 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 11 00:54:24 crc kubenswrapper[4743]: I1011 00:54:24.177802 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/414ae978-d3eb-4912-918a-72f472a48b45-kube-api-access\") pod \"414ae978-d3eb-4912-918a-72f472a48b45\" (UID: \"414ae978-d3eb-4912-918a-72f472a48b45\") " Oct 11 00:54:24 crc kubenswrapper[4743]: I1011 00:54:24.177930 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/414ae978-d3eb-4912-918a-72f472a48b45-kubelet-dir\") pod \"414ae978-d3eb-4912-918a-72f472a48b45\" (UID: \"414ae978-d3eb-4912-918a-72f472a48b45\") " Oct 11 00:54:24 crc kubenswrapper[4743]: I1011 00:54:24.178271 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/414ae978-d3eb-4912-918a-72f472a48b45-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "414ae978-d3eb-4912-918a-72f472a48b45" (UID: "414ae978-d3eb-4912-918a-72f472a48b45"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 00:54:24 crc kubenswrapper[4743]: I1011 00:54:24.191159 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/414ae978-d3eb-4912-918a-72f472a48b45-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "414ae978-d3eb-4912-918a-72f472a48b45" (UID: "414ae978-d3eb-4912-918a-72f472a48b45"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 00:54:24 crc kubenswrapper[4743]: I1011 00:54:24.281420 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/414ae978-d3eb-4912-918a-72f472a48b45-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 11 00:54:24 crc kubenswrapper[4743]: I1011 00:54:24.281463 4743 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/414ae978-d3eb-4912-918a-72f472a48b45-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 11 00:54:24 crc kubenswrapper[4743]: I1011 00:54:24.694103 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"414ae978-d3eb-4912-918a-72f472a48b45","Type":"ContainerDied","Data":"ee0399f404c3580026ac4083097132ec7d44cf5aaee2aacab64dfe7a7dcffa82"} Oct 11 00:54:24 crc kubenswrapper[4743]: I1011 00:54:24.694274 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee0399f404c3580026ac4083097132ec7d44cf5aaee2aacab64dfe7a7dcffa82" Oct 11 00:54:24 crc kubenswrapper[4743]: I1011 00:54:24.699836 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 11 00:54:24 crc kubenswrapper[4743]: I1011 00:54:24.700311 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"9862ed4b-81c0-40c0-80de-863846831655","Type":"ContainerStarted","Data":"ddf6fee6cca240443a1ab5bc370d94ac7950148f381d7cc10cc2276691550534"} Oct 11 00:54:25 crc kubenswrapper[4743]: I1011 00:54:25.727431 4743 generic.go:334] "Generic (PLEG): container finished" podID="9862ed4b-81c0-40c0-80de-863846831655" containerID="2bf1cb3437bb604f63534d379b79db9fdffaeab210663d166724fbab8aadc0d0" exitCode=0 Oct 11 00:54:25 crc kubenswrapper[4743]: I1011 00:54:25.727728 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"9862ed4b-81c0-40c0-80de-863846831655","Type":"ContainerDied","Data":"2bf1cb3437bb604f63534d379b79db9fdffaeab210663d166724fbab8aadc0d0"} Oct 11 00:54:27 crc kubenswrapper[4743]: I1011 00:54:27.888896 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-smtjk" Oct 11 00:54:30 crc kubenswrapper[4743]: I1011 00:54:30.392286 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b02b8636-a5c4-447d-b1cf-401b3dcfa02b-metrics-certs\") pod \"network-metrics-daemon-cb5z5\" (UID: \"b02b8636-a5c4-447d-b1cf-401b3dcfa02b\") " pod="openshift-multus/network-metrics-daemon-cb5z5" Oct 11 00:54:30 crc kubenswrapper[4743]: I1011 00:54:30.413552 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b02b8636-a5c4-447d-b1cf-401b3dcfa02b-metrics-certs\") pod \"network-metrics-daemon-cb5z5\" (UID: \"b02b8636-a5c4-447d-b1cf-401b3dcfa02b\") " pod="openshift-multus/network-metrics-daemon-cb5z5" Oct 11 00:54:30 crc kubenswrapper[4743]: I1011 00:54:30.712624 
4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cb5z5" Oct 11 00:54:31 crc kubenswrapper[4743]: I1011 00:54:31.700996 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-468m5" Oct 11 00:54:31 crc kubenswrapper[4743]: I1011 00:54:31.704677 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-468m5" Oct 11 00:54:31 crc kubenswrapper[4743]: I1011 00:54:31.811408 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-qksm9" Oct 11 00:54:33 crc kubenswrapper[4743]: I1011 00:54:33.485704 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 11 00:54:33 crc kubenswrapper[4743]: I1011 00:54:33.553301 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9862ed4b-81c0-40c0-80de-863846831655-kubelet-dir\") pod \"9862ed4b-81c0-40c0-80de-863846831655\" (UID: \"9862ed4b-81c0-40c0-80de-863846831655\") " Oct 11 00:54:33 crc kubenswrapper[4743]: I1011 00:54:33.553371 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9862ed4b-81c0-40c0-80de-863846831655-kube-api-access\") pod \"9862ed4b-81c0-40c0-80de-863846831655\" (UID: \"9862ed4b-81c0-40c0-80de-863846831655\") " Oct 11 00:54:33 crc kubenswrapper[4743]: I1011 00:54:33.553681 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9862ed4b-81c0-40c0-80de-863846831655-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "9862ed4b-81c0-40c0-80de-863846831655" (UID: "9862ed4b-81c0-40c0-80de-863846831655"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 00:54:33 crc kubenswrapper[4743]: I1011 00:54:33.563537 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9862ed4b-81c0-40c0-80de-863846831655-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "9862ed4b-81c0-40c0-80de-863846831655" (UID: "9862ed4b-81c0-40c0-80de-863846831655"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 00:54:33 crc kubenswrapper[4743]: I1011 00:54:33.654631 4743 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9862ed4b-81c0-40c0-80de-863846831655-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 11 00:54:33 crc kubenswrapper[4743]: I1011 00:54:33.654704 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9862ed4b-81c0-40c0-80de-863846831655-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 11 00:54:33 crc kubenswrapper[4743]: I1011 00:54:33.817037 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"9862ed4b-81c0-40c0-80de-863846831655","Type":"ContainerDied","Data":"ddf6fee6cca240443a1ab5bc370d94ac7950148f381d7cc10cc2276691550534"} Oct 11 00:54:33 crc kubenswrapper[4743]: I1011 00:54:33.817074 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ddf6fee6cca240443a1ab5bc370d94ac7950148f381d7cc10cc2276691550534" Oct 11 00:54:33 crc kubenswrapper[4743]: I1011 00:54:33.817114 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 11 00:54:39 crc kubenswrapper[4743]: I1011 00:54:39.632237 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-24m6m" Oct 11 00:54:44 crc kubenswrapper[4743]: I1011 00:54:44.458054 4743 patch_prober.go:28] interesting pod/machine-config-daemon-cvm72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 00:54:44 crc kubenswrapper[4743]: I1011 00:54:44.458568 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 00:54:44 crc kubenswrapper[4743]: I1011 00:54:44.499234 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-cb5z5"] Oct 11 00:54:44 crc kubenswrapper[4743]: I1011 00:54:44.888912 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-drlgz" event={"ID":"ec82a5a9-025d-43e4-a1f4-6b4245bcc7ef","Type":"ContainerStarted","Data":"83c8fcf161e196c7f545a0647f73181c943a7af0bcfdd18dc7acb190acc6e9a9"} Oct 11 00:54:44 crc kubenswrapper[4743]: I1011 00:54:44.891100 4743 generic.go:334] "Generic (PLEG): container finished" podID="0f26ac0d-8683-415e-850c-5aef3da4b59f" containerID="fdbb287bcae0625eba994d102bc94cd017ab3bb48ee16dfebf2df90c7556de0f" exitCode=0 Oct 11 00:54:44 crc kubenswrapper[4743]: I1011 00:54:44.891162 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29335680-fmgl6" 
event={"ID":"0f26ac0d-8683-415e-850c-5aef3da4b59f","Type":"ContainerDied","Data":"fdbb287bcae0625eba994d102bc94cd017ab3bb48ee16dfebf2df90c7556de0f"} Oct 11 00:54:44 crc kubenswrapper[4743]: I1011 00:54:44.894021 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cb8zp" event={"ID":"b87ffc73-065d-4570-867f-b91e442a4c73","Type":"ContainerStarted","Data":"93f102bfae867e58f3e6c378f5062007eaca8ce5b58a4f29f23b6cc13e2c0ffb"} Oct 11 00:54:44 crc kubenswrapper[4743]: I1011 00:54:44.895403 4743 generic.go:334] "Generic (PLEG): container finished" podID="9b7adc79-4e39-4ed1-8024-3cd0bb96f661" containerID="3a7f04248633cb7a1fb0fa72fa8f5409cb70ebb2e21ba5ee8ab80ab66def61fa" exitCode=0 Oct 11 00:54:44 crc kubenswrapper[4743]: I1011 00:54:44.895459 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-67kfm" event={"ID":"9b7adc79-4e39-4ed1-8024-3cd0bb96f661","Type":"ContainerDied","Data":"3a7f04248633cb7a1fb0fa72fa8f5409cb70ebb2e21ba5ee8ab80ab66def61fa"} Oct 11 00:54:44 crc kubenswrapper[4743]: I1011 00:54:44.897745 4743 generic.go:334] "Generic (PLEG): container finished" podID="e405bfea-f17b-4c2b-b69e-fc7284876cdc" containerID="36d6b47b860840adbad156b7277b4d0e3c768e24e5b67732c5b1ed66780ed753" exitCode=0 Oct 11 00:54:44 crc kubenswrapper[4743]: I1011 00:54:44.897820 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jb4p2" event={"ID":"e405bfea-f17b-4c2b-b69e-fc7284876cdc","Type":"ContainerDied","Data":"36d6b47b860840adbad156b7277b4d0e3c768e24e5b67732c5b1ed66780ed753"} Oct 11 00:54:44 crc kubenswrapper[4743]: I1011 00:54:44.901105 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wxvl7" event={"ID":"7d89bc6e-6240-4acc-866a-347de2b7bc0a","Type":"ContainerStarted","Data":"da00ca160bc7183c9da8bb68bfb60368865594cca393277d0d9328553eb02d10"} Oct 11 00:54:44 crc kubenswrapper[4743]: I1011 
00:54:44.926526 4743 generic.go:334] "Generic (PLEG): container finished" podID="712f7dea-8de1-43e3-802b-b4e9b521b0b6" containerID="d4b6177c05349064d4f375c923f2a480844cdaf96464f5ef4fb65e2157a305aa" exitCode=0 Oct 11 00:54:44 crc kubenswrapper[4743]: I1011 00:54:44.926721 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nvz7n" event={"ID":"712f7dea-8de1-43e3-802b-b4e9b521b0b6","Type":"ContainerDied","Data":"d4b6177c05349064d4f375c923f2a480844cdaf96464f5ef4fb65e2157a305aa"} Oct 11 00:54:44 crc kubenswrapper[4743]: I1011 00:54:44.934142 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hjs95" event={"ID":"de5b1941-3ffa-477d-9b7f-7e07fe3ed206","Type":"ContainerStarted","Data":"f214539c566eeec4dff3304e94776c2d5605229a598427472143c604e5ef6980"} Oct 11 00:54:44 crc kubenswrapper[4743]: I1011 00:54:44.936082 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-cb5z5" event={"ID":"b02b8636-a5c4-447d-b1cf-401b3dcfa02b","Type":"ContainerStarted","Data":"1dfb8bb527abc9a06f81d912457d79560fb7a557c89767b04d846471bd6297e2"} Oct 11 00:54:44 crc kubenswrapper[4743]: I1011 00:54:44.936105 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-cb5z5" event={"ID":"b02b8636-a5c4-447d-b1cf-401b3dcfa02b","Type":"ContainerStarted","Data":"c488f73554ce6c4062229567ce56f6861684ad6e1d8e3925d673395ba912dbe4"} Oct 11 00:54:44 crc kubenswrapper[4743]: I1011 00:54:44.939146 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r8dmc" event={"ID":"7c7ded1c-c0ce-47c4-9959-b95631f067ea","Type":"ContainerStarted","Data":"57d623e4158b8ddb2f2ab443ea88307b6e28debea675636884bef3dd8aea8595"} Oct 11 00:54:45 crc kubenswrapper[4743]: I1011 00:54:45.952603 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-cb5z5" 
event={"ID":"b02b8636-a5c4-447d-b1cf-401b3dcfa02b","Type":"ContainerStarted","Data":"71d80110ea9bec325a56d1867a82ad761602d407321e7fefd86e11ef4a19fdba"} Oct 11 00:54:45 crc kubenswrapper[4743]: I1011 00:54:45.957674 4743 generic.go:334] "Generic (PLEG): container finished" podID="de5b1941-3ffa-477d-9b7f-7e07fe3ed206" containerID="f214539c566eeec4dff3304e94776c2d5605229a598427472143c604e5ef6980" exitCode=0 Oct 11 00:54:45 crc kubenswrapper[4743]: I1011 00:54:45.958120 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hjs95" event={"ID":"de5b1941-3ffa-477d-9b7f-7e07fe3ed206","Type":"ContainerDied","Data":"f214539c566eeec4dff3304e94776c2d5605229a598427472143c604e5ef6980"} Oct 11 00:54:45 crc kubenswrapper[4743]: I1011 00:54:45.960984 4743 generic.go:334] "Generic (PLEG): container finished" podID="7c7ded1c-c0ce-47c4-9959-b95631f067ea" containerID="57d623e4158b8ddb2f2ab443ea88307b6e28debea675636884bef3dd8aea8595" exitCode=0 Oct 11 00:54:45 crc kubenswrapper[4743]: I1011 00:54:45.961090 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r8dmc" event={"ID":"7c7ded1c-c0ce-47c4-9959-b95631f067ea","Type":"ContainerDied","Data":"57d623e4158b8ddb2f2ab443ea88307b6e28debea675636884bef3dd8aea8595"} Oct 11 00:54:45 crc kubenswrapper[4743]: I1011 00:54:45.966565 4743 generic.go:334] "Generic (PLEG): container finished" podID="7d89bc6e-6240-4acc-866a-347de2b7bc0a" containerID="da00ca160bc7183c9da8bb68bfb60368865594cca393277d0d9328553eb02d10" exitCode=0 Oct 11 00:54:45 crc kubenswrapper[4743]: I1011 00:54:45.966662 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wxvl7" event={"ID":"7d89bc6e-6240-4acc-866a-347de2b7bc0a","Type":"ContainerDied","Data":"da00ca160bc7183c9da8bb68bfb60368865594cca393277d0d9328553eb02d10"} Oct 11 00:54:45 crc kubenswrapper[4743]: I1011 00:54:45.989379 4743 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-multus/network-metrics-daemon-cb5z5" podStartSLOduration=159.989345438 podStartE2EDuration="2m39.989345438s" podCreationTimestamp="2025-10-11 00:52:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 00:54:45.972934779 +0000 UTC m=+180.625915246" watchObservedRunningTime="2025-10-11 00:54:45.989345438 +0000 UTC m=+180.642325895" Oct 11 00:54:45 crc kubenswrapper[4743]: I1011 00:54:45.998136 4743 generic.go:334] "Generic (PLEG): container finished" podID="ec82a5a9-025d-43e4-a1f4-6b4245bcc7ef" containerID="83c8fcf161e196c7f545a0647f73181c943a7af0bcfdd18dc7acb190acc6e9a9" exitCode=0 Oct 11 00:54:45 crc kubenswrapper[4743]: I1011 00:54:45.998263 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-drlgz" event={"ID":"ec82a5a9-025d-43e4-a1f4-6b4245bcc7ef","Type":"ContainerDied","Data":"83c8fcf161e196c7f545a0647f73181c943a7af0bcfdd18dc7acb190acc6e9a9"} Oct 11 00:54:46 crc kubenswrapper[4743]: I1011 00:54:46.014782 4743 generic.go:334] "Generic (PLEG): container finished" podID="b87ffc73-065d-4570-867f-b91e442a4c73" containerID="93f102bfae867e58f3e6c378f5062007eaca8ce5b58a4f29f23b6cc13e2c0ffb" exitCode=0 Oct 11 00:54:46 crc kubenswrapper[4743]: I1011 00:54:46.014895 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cb8zp" event={"ID":"b87ffc73-065d-4570-867f-b91e442a4c73","Type":"ContainerDied","Data":"93f102bfae867e58f3e6c378f5062007eaca8ce5b58a4f29f23b6cc13e2c0ffb"} Oct 11 00:54:46 crc kubenswrapper[4743]: I1011 00:54:46.338034 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29335680-fmgl6" Oct 11 00:54:46 crc kubenswrapper[4743]: I1011 00:54:46.414208 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0f26ac0d-8683-415e-850c-5aef3da4b59f-serviceca\") pod \"0f26ac0d-8683-415e-850c-5aef3da4b59f\" (UID: \"0f26ac0d-8683-415e-850c-5aef3da4b59f\") " Oct 11 00:54:46 crc kubenswrapper[4743]: I1011 00:54:46.414396 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbbx5\" (UniqueName: \"kubernetes.io/projected/0f26ac0d-8683-415e-850c-5aef3da4b59f-kube-api-access-fbbx5\") pod \"0f26ac0d-8683-415e-850c-5aef3da4b59f\" (UID: \"0f26ac0d-8683-415e-850c-5aef3da4b59f\") " Oct 11 00:54:46 crc kubenswrapper[4743]: I1011 00:54:46.415507 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f26ac0d-8683-415e-850c-5aef3da4b59f-serviceca" (OuterVolumeSpecName: "serviceca") pod "0f26ac0d-8683-415e-850c-5aef3da4b59f" (UID: "0f26ac0d-8683-415e-850c-5aef3da4b59f"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 00:54:46 crc kubenswrapper[4743]: I1011 00:54:46.425415 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f26ac0d-8683-415e-850c-5aef3da4b59f-kube-api-access-fbbx5" (OuterVolumeSpecName: "kube-api-access-fbbx5") pod "0f26ac0d-8683-415e-850c-5aef3da4b59f" (UID: "0f26ac0d-8683-415e-850c-5aef3da4b59f"). InnerVolumeSpecName "kube-api-access-fbbx5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 00:54:46 crc kubenswrapper[4743]: I1011 00:54:46.516039 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fbbx5\" (UniqueName: \"kubernetes.io/projected/0f26ac0d-8683-415e-850c-5aef3da4b59f-kube-api-access-fbbx5\") on node \"crc\" DevicePath \"\"" Oct 11 00:54:46 crc kubenswrapper[4743]: I1011 00:54:46.516173 4743 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0f26ac0d-8683-415e-850c-5aef3da4b59f-serviceca\") on node \"crc\" DevicePath \"\"" Oct 11 00:54:47 crc kubenswrapper[4743]: I1011 00:54:47.025129 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29335680-fmgl6" Oct 11 00:54:47 crc kubenswrapper[4743]: I1011 00:54:47.025320 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29335680-fmgl6" event={"ID":"0f26ac0d-8683-415e-850c-5aef3da4b59f","Type":"ContainerDied","Data":"32d687856f9e5b99c79ac570d73ed4eb65c0644fc9f01a9a60bf8c45d039dad7"} Oct 11 00:54:47 crc kubenswrapper[4743]: I1011 00:54:47.026346 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="32d687856f9e5b99c79ac570d73ed4eb65c0644fc9f01a9a60bf8c45d039dad7" Oct 11 00:54:48 crc kubenswrapper[4743]: I1011 00:54:48.032600 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nvz7n" event={"ID":"712f7dea-8de1-43e3-802b-b4e9b521b0b6","Type":"ContainerStarted","Data":"532873f4abda070d515b349d5ffc2aaffd79c4961b72ce76335f2641dae37497"} Oct 11 00:54:48 crc kubenswrapper[4743]: I1011 00:54:48.066185 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-nvz7n" podStartSLOduration=3.317425289 podStartE2EDuration="31.066159638s" podCreationTimestamp="2025-10-11 00:54:17 +0000 UTC" 
firstStartedPulling="2025-10-11 00:54:19.55402372 +0000 UTC m=+154.207004107" lastFinishedPulling="2025-10-11 00:54:47.302758059 +0000 UTC m=+181.955738456" observedRunningTime="2025-10-11 00:54:48.058470649 +0000 UTC m=+182.711451076" watchObservedRunningTime="2025-10-11 00:54:48.066159638 +0000 UTC m=+182.719140075" Oct 11 00:54:48 crc kubenswrapper[4743]: I1011 00:54:48.218551 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-nvz7n" Oct 11 00:54:48 crc kubenswrapper[4743]: I1011 00:54:48.218618 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-nvz7n" Oct 11 00:54:50 crc kubenswrapper[4743]: I1011 00:54:50.046396 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jb4p2" event={"ID":"e405bfea-f17b-4c2b-b69e-fc7284876cdc","Type":"ContainerStarted","Data":"6ce938bf525beadd39c9cd7e624b606c041bf5310133137678d744a156df1a8b"} Oct 11 00:54:50 crc kubenswrapper[4743]: I1011 00:54:50.057578 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-nvz7n" podUID="712f7dea-8de1-43e3-802b-b4e9b521b0b6" containerName="registry-server" probeResult="failure" output=< Oct 11 00:54:50 crc kubenswrapper[4743]: timeout: failed to connect service ":50051" within 1s Oct 11 00:54:50 crc kubenswrapper[4743]: > Oct 11 00:54:50 crc kubenswrapper[4743]: I1011 00:54:50.075695 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jb4p2" podStartSLOduration=2.499091224 podStartE2EDuration="31.075678473s" podCreationTimestamp="2025-10-11 00:54:19 +0000 UTC" firstStartedPulling="2025-10-11 00:54:20.573814294 +0000 UTC m=+155.226794691" lastFinishedPulling="2025-10-11 00:54:49.150401543 +0000 UTC m=+183.803381940" observedRunningTime="2025-10-11 00:54:50.073050575 +0000 UTC m=+184.726030992" 
watchObservedRunningTime="2025-10-11 00:54:50.075678473 +0000 UTC m=+184.728658870" Oct 11 00:54:51 crc kubenswrapper[4743]: I1011 00:54:51.053634 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-67kfm" event={"ID":"9b7adc79-4e39-4ed1-8024-3cd0bb96f661","Type":"ContainerStarted","Data":"a047a084bd2c2b688f2a6df51215376c4197b49b7ce49e3058d09c1fe870d4d0"} Oct 11 00:54:51 crc kubenswrapper[4743]: I1011 00:54:51.073683 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-67kfm" podStartSLOduration=3.491761973 podStartE2EDuration="32.073663719s" podCreationTimestamp="2025-10-11 00:54:19 +0000 UTC" firstStartedPulling="2025-10-11 00:54:21.613641357 +0000 UTC m=+156.266621754" lastFinishedPulling="2025-10-11 00:54:50.195543103 +0000 UTC m=+184.848523500" observedRunningTime="2025-10-11 00:54:51.070573457 +0000 UTC m=+185.723553864" watchObservedRunningTime="2025-10-11 00:54:51.073663719 +0000 UTC m=+185.726644126" Oct 11 00:54:52 crc kubenswrapper[4743]: I1011 00:54:52.067472 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wxvl7" event={"ID":"7d89bc6e-6240-4acc-866a-347de2b7bc0a","Type":"ContainerStarted","Data":"62c832c1b56d61416865683f5d999c1ea6185f16021f5f0592ef3589c8cd274b"} Oct 11 00:54:52 crc kubenswrapper[4743]: I1011 00:54:52.074434 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-drlgz" event={"ID":"ec82a5a9-025d-43e4-a1f4-6b4245bcc7ef","Type":"ContainerStarted","Data":"ba4fccfabd9a2caeb93221f263cf98776151d0ab20ec12b5bbf43c7c6bfe7e84"} Oct 11 00:54:52 crc kubenswrapper[4743]: I1011 00:54:52.076320 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cb8zp" 
event={"ID":"b87ffc73-065d-4570-867f-b91e442a4c73","Type":"ContainerStarted","Data":"35be2ecabe3c5e2593e5dcede3f793bcae33386ba2c57334a76d9e259cfb10e8"} Oct 11 00:54:52 crc kubenswrapper[4743]: I1011 00:54:52.081840 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hjs95" event={"ID":"de5b1941-3ffa-477d-9b7f-7e07fe3ed206","Type":"ContainerStarted","Data":"630e04b41586b2c7251db513be7ecdf9db8c21da888b1cd96b6a06136306a298"} Oct 11 00:54:52 crc kubenswrapper[4743]: I1011 00:54:52.084246 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r8dmc" event={"ID":"7c7ded1c-c0ce-47c4-9959-b95631f067ea","Type":"ContainerStarted","Data":"beae56b2fff24fb0eeac7feabc2ade2cc29afb6bf52b13415e8b1adc5a084003"} Oct 11 00:54:52 crc kubenswrapper[4743]: I1011 00:54:52.086072 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wxvl7" podStartSLOduration=3.062272576 podStartE2EDuration="32.086056835s" podCreationTimestamp="2025-10-11 00:54:20 +0000 UTC" firstStartedPulling="2025-10-11 00:54:22.637847784 +0000 UTC m=+157.290828181" lastFinishedPulling="2025-10-11 00:54:51.661632053 +0000 UTC m=+186.314612440" observedRunningTime="2025-10-11 00:54:52.084050065 +0000 UTC m=+186.737030462" watchObservedRunningTime="2025-10-11 00:54:52.086056835 +0000 UTC m=+186.739037232" Oct 11 00:54:52 crc kubenswrapper[4743]: I1011 00:54:52.103074 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-drlgz" podStartSLOduration=3.190015074 podStartE2EDuration="35.103057461s" podCreationTimestamp="2025-10-11 00:54:17 +0000 UTC" firstStartedPulling="2025-10-11 00:54:19.549920928 +0000 UTC m=+154.202901325" lastFinishedPulling="2025-10-11 00:54:51.462963315 +0000 UTC m=+186.115943712" observedRunningTime="2025-10-11 00:54:52.100083322 +0000 UTC m=+186.753063719" 
watchObservedRunningTime="2025-10-11 00:54:52.103057461 +0000 UTC m=+186.756037858" Oct 11 00:54:52 crc kubenswrapper[4743]: I1011 00:54:52.118386 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cb8zp" podStartSLOduration=2.140290754 podStartE2EDuration="32.118362767s" podCreationTimestamp="2025-10-11 00:54:20 +0000 UTC" firstStartedPulling="2025-10-11 00:54:21.590148447 +0000 UTC m=+156.243128844" lastFinishedPulling="2025-10-11 00:54:51.56822045 +0000 UTC m=+186.221200857" observedRunningTime="2025-10-11 00:54:52.116541273 +0000 UTC m=+186.769521670" watchObservedRunningTime="2025-10-11 00:54:52.118362767 +0000 UTC m=+186.771343164" Oct 11 00:54:52 crc kubenswrapper[4743]: I1011 00:54:52.142116 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hjs95" podStartSLOduration=2.947023846 podStartE2EDuration="35.142093764s" podCreationTimestamp="2025-10-11 00:54:17 +0000 UTC" firstStartedPulling="2025-10-11 00:54:19.545499056 +0000 UTC m=+154.198479453" lastFinishedPulling="2025-10-11 00:54:51.740568974 +0000 UTC m=+186.393549371" observedRunningTime="2025-10-11 00:54:52.13961768 +0000 UTC m=+186.792598077" watchObservedRunningTime="2025-10-11 00:54:52.142093764 +0000 UTC m=+186.795074181" Oct 11 00:54:52 crc kubenswrapper[4743]: I1011 00:54:52.157406 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-r8dmc" podStartSLOduration=3.215822882 podStartE2EDuration="35.157386679s" podCreationTimestamp="2025-10-11 00:54:17 +0000 UTC" firstStartedPulling="2025-10-11 00:54:19.548169915 +0000 UTC m=+154.201150312" lastFinishedPulling="2025-10-11 00:54:51.489733712 +0000 UTC m=+186.142714109" observedRunningTime="2025-10-11 00:54:52.156689879 +0000 UTC m=+186.809670276" watchObservedRunningTime="2025-10-11 00:54:52.157386679 +0000 UTC m=+186.810367076" Oct 11 00:54:52 crc 
kubenswrapper[4743]: I1011 00:54:52.537877 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-62mz2" Oct 11 00:54:54 crc kubenswrapper[4743]: I1011 00:54:54.187119 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 11 00:54:57 crc kubenswrapper[4743]: I1011 00:54:57.888447 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-r8dmc" Oct 11 00:54:57 crc kubenswrapper[4743]: I1011 00:54:57.890019 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-r8dmc" Oct 11 00:54:58 crc kubenswrapper[4743]: I1011 00:54:58.001006 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-drlgz" Oct 11 00:54:58 crc kubenswrapper[4743]: I1011 00:54:58.001368 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-drlgz" Oct 11 00:54:58 crc kubenswrapper[4743]: I1011 00:54:58.102030 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-drlgz" Oct 11 00:54:58 crc kubenswrapper[4743]: I1011 00:54:58.104103 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-r8dmc" Oct 11 00:54:58 crc kubenswrapper[4743]: I1011 00:54:58.163847 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-r8dmc" Oct 11 00:54:58 crc kubenswrapper[4743]: I1011 00:54:58.175051 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-drlgz" Oct 11 00:54:58 crc kubenswrapper[4743]: I1011 00:54:58.280593 4743 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-nvz7n" Oct 11 00:54:58 crc kubenswrapper[4743]: I1011 00:54:58.332582 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-nvz7n" Oct 11 00:54:58 crc kubenswrapper[4743]: I1011 00:54:58.406281 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hjs95" Oct 11 00:54:58 crc kubenswrapper[4743]: I1011 00:54:58.406344 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hjs95" Oct 11 00:54:58 crc kubenswrapper[4743]: I1011 00:54:58.447246 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hjs95" Oct 11 00:54:59 crc kubenswrapper[4743]: I1011 00:54:59.181874 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hjs95" Oct 11 00:54:59 crc kubenswrapper[4743]: I1011 00:54:59.745586 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jb4p2" Oct 11 00:54:59 crc kubenswrapper[4743]: I1011 00:54:59.745898 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jb4p2" Oct 11 00:54:59 crc kubenswrapper[4743]: I1011 00:54:59.804672 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jb4p2" Oct 11 00:54:59 crc kubenswrapper[4743]: I1011 00:54:59.918684 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nvz7n"] Oct 11 00:55:00 crc kubenswrapper[4743]: I1011 00:55:00.125181 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-67kfm" Oct 11 00:55:00 crc kubenswrapper[4743]: I1011 00:55:00.125288 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-67kfm" Oct 11 00:55:00 crc kubenswrapper[4743]: I1011 00:55:00.126740 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-nvz7n" podUID="712f7dea-8de1-43e3-802b-b4e9b521b0b6" containerName="registry-server" containerID="cri-o://532873f4abda070d515b349d5ffc2aaffd79c4961b72ce76335f2641dae37497" gracePeriod=2 Oct 11 00:55:00 crc kubenswrapper[4743]: I1011 00:55:00.177050 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jb4p2" Oct 11 00:55:00 crc kubenswrapper[4743]: I1011 00:55:00.180383 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-67kfm" Oct 11 00:55:00 crc kubenswrapper[4743]: I1011 00:55:00.522359 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hjs95"] Oct 11 00:55:00 crc kubenswrapper[4743]: I1011 00:55:00.971391 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cb8zp" Oct 11 00:55:00 crc kubenswrapper[4743]: I1011 00:55:00.972104 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cb8zp" Oct 11 00:55:01 crc kubenswrapper[4743]: I1011 00:55:01.027222 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cb8zp" Oct 11 00:55:01 crc kubenswrapper[4743]: I1011 00:55:01.106355 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nvz7n" Oct 11 00:55:01 crc kubenswrapper[4743]: I1011 00:55:01.137677 4743 generic.go:334] "Generic (PLEG): container finished" podID="712f7dea-8de1-43e3-802b-b4e9b521b0b6" containerID="532873f4abda070d515b349d5ffc2aaffd79c4961b72ce76335f2641dae37497" exitCode=0 Oct 11 00:55:01 crc kubenswrapper[4743]: I1011 00:55:01.138278 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nvz7n" Oct 11 00:55:01 crc kubenswrapper[4743]: I1011 00:55:01.138567 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nvz7n" event={"ID":"712f7dea-8de1-43e3-802b-b4e9b521b0b6","Type":"ContainerDied","Data":"532873f4abda070d515b349d5ffc2aaffd79c4961b72ce76335f2641dae37497"} Oct 11 00:55:01 crc kubenswrapper[4743]: I1011 00:55:01.138589 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nvz7n" event={"ID":"712f7dea-8de1-43e3-802b-b4e9b521b0b6","Type":"ContainerDied","Data":"7abed6724087a9fbe160e30f62148a63c6bf1e5a8c3e875c4f8e2e5ca7cce252"} Oct 11 00:55:01 crc kubenswrapper[4743]: I1011 00:55:01.138604 4743 scope.go:117] "RemoveContainer" containerID="532873f4abda070d515b349d5ffc2aaffd79c4961b72ce76335f2641dae37497" Oct 11 00:55:01 crc kubenswrapper[4743]: I1011 00:55:01.167327 4743 scope.go:117] "RemoveContainer" containerID="d4b6177c05349064d4f375c923f2a480844cdaf96464f5ef4fb65e2157a305aa" Oct 11 00:55:01 crc kubenswrapper[4743]: I1011 00:55:01.182885 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cb8zp" Oct 11 00:55:01 crc kubenswrapper[4743]: I1011 00:55:01.188797 4743 scope.go:117] "RemoveContainer" containerID="77c6419678ea74b7572a35af5b0841ced6c9503ceade00d1246027649c8320cf" Oct 11 00:55:01 crc kubenswrapper[4743]: I1011 00:55:01.191351 4743 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-67kfm" Oct 11 00:55:01 crc kubenswrapper[4743]: I1011 00:55:01.215483 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/712f7dea-8de1-43e3-802b-b4e9b521b0b6-catalog-content\") pod \"712f7dea-8de1-43e3-802b-b4e9b521b0b6\" (UID: \"712f7dea-8de1-43e3-802b-b4e9b521b0b6\") " Oct 11 00:55:01 crc kubenswrapper[4743]: I1011 00:55:01.215579 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-776lz\" (UniqueName: \"kubernetes.io/projected/712f7dea-8de1-43e3-802b-b4e9b521b0b6-kube-api-access-776lz\") pod \"712f7dea-8de1-43e3-802b-b4e9b521b0b6\" (UID: \"712f7dea-8de1-43e3-802b-b4e9b521b0b6\") " Oct 11 00:55:01 crc kubenswrapper[4743]: I1011 00:55:01.215643 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/712f7dea-8de1-43e3-802b-b4e9b521b0b6-utilities\") pod \"712f7dea-8de1-43e3-802b-b4e9b521b0b6\" (UID: \"712f7dea-8de1-43e3-802b-b4e9b521b0b6\") " Oct 11 00:55:01 crc kubenswrapper[4743]: I1011 00:55:01.217272 4743 scope.go:117] "RemoveContainer" containerID="532873f4abda070d515b349d5ffc2aaffd79c4961b72ce76335f2641dae37497" Oct 11 00:55:01 crc kubenswrapper[4743]: I1011 00:55:01.218239 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/712f7dea-8de1-43e3-802b-b4e9b521b0b6-utilities" (OuterVolumeSpecName: "utilities") pod "712f7dea-8de1-43e3-802b-b4e9b521b0b6" (UID: "712f7dea-8de1-43e3-802b-b4e9b521b0b6"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 00:55:01 crc kubenswrapper[4743]: E1011 00:55:01.220385 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"532873f4abda070d515b349d5ffc2aaffd79c4961b72ce76335f2641dae37497\": container with ID starting with 532873f4abda070d515b349d5ffc2aaffd79c4961b72ce76335f2641dae37497 not found: ID does not exist" containerID="532873f4abda070d515b349d5ffc2aaffd79c4961b72ce76335f2641dae37497" Oct 11 00:55:01 crc kubenswrapper[4743]: I1011 00:55:01.220454 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"532873f4abda070d515b349d5ffc2aaffd79c4961b72ce76335f2641dae37497"} err="failed to get container status \"532873f4abda070d515b349d5ffc2aaffd79c4961b72ce76335f2641dae37497\": rpc error: code = NotFound desc = could not find container \"532873f4abda070d515b349d5ffc2aaffd79c4961b72ce76335f2641dae37497\": container with ID starting with 532873f4abda070d515b349d5ffc2aaffd79c4961b72ce76335f2641dae37497 not found: ID does not exist" Oct 11 00:55:01 crc kubenswrapper[4743]: I1011 00:55:01.220529 4743 scope.go:117] "RemoveContainer" containerID="d4b6177c05349064d4f375c923f2a480844cdaf96464f5ef4fb65e2157a305aa" Oct 11 00:55:01 crc kubenswrapper[4743]: I1011 00:55:01.223509 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/712f7dea-8de1-43e3-802b-b4e9b521b0b6-kube-api-access-776lz" (OuterVolumeSpecName: "kube-api-access-776lz") pod "712f7dea-8de1-43e3-802b-b4e9b521b0b6" (UID: "712f7dea-8de1-43e3-802b-b4e9b521b0b6"). InnerVolumeSpecName "kube-api-access-776lz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 00:55:01 crc kubenswrapper[4743]: E1011 00:55:01.226350 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4b6177c05349064d4f375c923f2a480844cdaf96464f5ef4fb65e2157a305aa\": container with ID starting with d4b6177c05349064d4f375c923f2a480844cdaf96464f5ef4fb65e2157a305aa not found: ID does not exist" containerID="d4b6177c05349064d4f375c923f2a480844cdaf96464f5ef4fb65e2157a305aa" Oct 11 00:55:01 crc kubenswrapper[4743]: I1011 00:55:01.226399 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4b6177c05349064d4f375c923f2a480844cdaf96464f5ef4fb65e2157a305aa"} err="failed to get container status \"d4b6177c05349064d4f375c923f2a480844cdaf96464f5ef4fb65e2157a305aa\": rpc error: code = NotFound desc = could not find container \"d4b6177c05349064d4f375c923f2a480844cdaf96464f5ef4fb65e2157a305aa\": container with ID starting with d4b6177c05349064d4f375c923f2a480844cdaf96464f5ef4fb65e2157a305aa not found: ID does not exist" Oct 11 00:55:01 crc kubenswrapper[4743]: I1011 00:55:01.226427 4743 scope.go:117] "RemoveContainer" containerID="77c6419678ea74b7572a35af5b0841ced6c9503ceade00d1246027649c8320cf" Oct 11 00:55:01 crc kubenswrapper[4743]: E1011 00:55:01.227086 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77c6419678ea74b7572a35af5b0841ced6c9503ceade00d1246027649c8320cf\": container with ID starting with 77c6419678ea74b7572a35af5b0841ced6c9503ceade00d1246027649c8320cf not found: ID does not exist" containerID="77c6419678ea74b7572a35af5b0841ced6c9503ceade00d1246027649c8320cf" Oct 11 00:55:01 crc kubenswrapper[4743]: I1011 00:55:01.227109 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77c6419678ea74b7572a35af5b0841ced6c9503ceade00d1246027649c8320cf"} 
err="failed to get container status \"77c6419678ea74b7572a35af5b0841ced6c9503ceade00d1246027649c8320cf\": rpc error: code = NotFound desc = could not find container \"77c6419678ea74b7572a35af5b0841ced6c9503ceade00d1246027649c8320cf\": container with ID starting with 77c6419678ea74b7572a35af5b0841ced6c9503ceade00d1246027649c8320cf not found: ID does not exist" Oct 11 00:55:01 crc kubenswrapper[4743]: I1011 00:55:01.269966 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/712f7dea-8de1-43e3-802b-b4e9b521b0b6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "712f7dea-8de1-43e3-802b-b4e9b521b0b6" (UID: "712f7dea-8de1-43e3-802b-b4e9b521b0b6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 00:55:01 crc kubenswrapper[4743]: I1011 00:55:01.316832 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-776lz\" (UniqueName: \"kubernetes.io/projected/712f7dea-8de1-43e3-802b-b4e9b521b0b6-kube-api-access-776lz\") on node \"crc\" DevicePath \"\"" Oct 11 00:55:01 crc kubenswrapper[4743]: I1011 00:55:01.316908 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/712f7dea-8de1-43e3-802b-b4e9b521b0b6-utilities\") on node \"crc\" DevicePath \"\"" Oct 11 00:55:01 crc kubenswrapper[4743]: I1011 00:55:01.316923 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/712f7dea-8de1-43e3-802b-b4e9b521b0b6-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 11 00:55:01 crc kubenswrapper[4743]: I1011 00:55:01.353613 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wxvl7" Oct 11 00:55:01 crc kubenswrapper[4743]: I1011 00:55:01.353961 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wxvl7" Oct 11 
00:55:01 crc kubenswrapper[4743]: I1011 00:55:01.405806 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wxvl7" Oct 11 00:55:01 crc kubenswrapper[4743]: I1011 00:55:01.479445 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nvz7n"] Oct 11 00:55:01 crc kubenswrapper[4743]: I1011 00:55:01.486450 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-nvz7n"] Oct 11 00:55:02 crc kubenswrapper[4743]: I1011 00:55:02.100271 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="712f7dea-8de1-43e3-802b-b4e9b521b0b6" path="/var/lib/kubelet/pods/712f7dea-8de1-43e3-802b-b4e9b521b0b6/volumes" Oct 11 00:55:02 crc kubenswrapper[4743]: I1011 00:55:02.171573 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hjs95" podUID="de5b1941-3ffa-477d-9b7f-7e07fe3ed206" containerName="registry-server" containerID="cri-o://630e04b41586b2c7251db513be7ecdf9db8c21da888b1cd96b6a06136306a298" gracePeriod=2 Oct 11 00:55:02 crc kubenswrapper[4743]: I1011 00:55:02.215647 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wxvl7" Oct 11 00:55:02 crc kubenswrapper[4743]: I1011 00:55:02.320620 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-67kfm"] Oct 11 00:55:02 crc kubenswrapper[4743]: I1011 00:55:02.551039 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hjs95" Oct 11 00:55:02 crc kubenswrapper[4743]: I1011 00:55:02.636591 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vtwzw\" (UniqueName: \"kubernetes.io/projected/de5b1941-3ffa-477d-9b7f-7e07fe3ed206-kube-api-access-vtwzw\") pod \"de5b1941-3ffa-477d-9b7f-7e07fe3ed206\" (UID: \"de5b1941-3ffa-477d-9b7f-7e07fe3ed206\") " Oct 11 00:55:02 crc kubenswrapper[4743]: I1011 00:55:02.636670 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de5b1941-3ffa-477d-9b7f-7e07fe3ed206-utilities\") pod \"de5b1941-3ffa-477d-9b7f-7e07fe3ed206\" (UID: \"de5b1941-3ffa-477d-9b7f-7e07fe3ed206\") " Oct 11 00:55:02 crc kubenswrapper[4743]: I1011 00:55:02.636695 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de5b1941-3ffa-477d-9b7f-7e07fe3ed206-catalog-content\") pod \"de5b1941-3ffa-477d-9b7f-7e07fe3ed206\" (UID: \"de5b1941-3ffa-477d-9b7f-7e07fe3ed206\") " Oct 11 00:55:02 crc kubenswrapper[4743]: I1011 00:55:02.638726 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de5b1941-3ffa-477d-9b7f-7e07fe3ed206-utilities" (OuterVolumeSpecName: "utilities") pod "de5b1941-3ffa-477d-9b7f-7e07fe3ed206" (UID: "de5b1941-3ffa-477d-9b7f-7e07fe3ed206"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 00:55:02 crc kubenswrapper[4743]: I1011 00:55:02.643206 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de5b1941-3ffa-477d-9b7f-7e07fe3ed206-kube-api-access-vtwzw" (OuterVolumeSpecName: "kube-api-access-vtwzw") pod "de5b1941-3ffa-477d-9b7f-7e07fe3ed206" (UID: "de5b1941-3ffa-477d-9b7f-7e07fe3ed206"). InnerVolumeSpecName "kube-api-access-vtwzw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 00:55:02 crc kubenswrapper[4743]: I1011 00:55:02.696777 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de5b1941-3ffa-477d-9b7f-7e07fe3ed206-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "de5b1941-3ffa-477d-9b7f-7e07fe3ed206" (UID: "de5b1941-3ffa-477d-9b7f-7e07fe3ed206"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 00:55:02 crc kubenswrapper[4743]: I1011 00:55:02.738627 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vtwzw\" (UniqueName: \"kubernetes.io/projected/de5b1941-3ffa-477d-9b7f-7e07fe3ed206-kube-api-access-vtwzw\") on node \"crc\" DevicePath \"\"" Oct 11 00:55:02 crc kubenswrapper[4743]: I1011 00:55:02.738660 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de5b1941-3ffa-477d-9b7f-7e07fe3ed206-utilities\") on node \"crc\" DevicePath \"\"" Oct 11 00:55:02 crc kubenswrapper[4743]: I1011 00:55:02.738669 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de5b1941-3ffa-477d-9b7f-7e07fe3ed206-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 11 00:55:03 crc kubenswrapper[4743]: I1011 00:55:03.175488 4743 generic.go:334] "Generic (PLEG): container finished" podID="de5b1941-3ffa-477d-9b7f-7e07fe3ed206" containerID="630e04b41586b2c7251db513be7ecdf9db8c21da888b1cd96b6a06136306a298" exitCode=0 Oct 11 00:55:03 crc kubenswrapper[4743]: I1011 00:55:03.175617 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hjs95" event={"ID":"de5b1941-3ffa-477d-9b7f-7e07fe3ed206","Type":"ContainerDied","Data":"630e04b41586b2c7251db513be7ecdf9db8c21da888b1cd96b6a06136306a298"} Oct 11 00:55:03 crc kubenswrapper[4743]: I1011 00:55:03.175654 4743 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-hjs95" event={"ID":"de5b1941-3ffa-477d-9b7f-7e07fe3ed206","Type":"ContainerDied","Data":"3312bbe422c232e5b9b9480145f8f718e20f139744526f4753b449dd623ed6de"} Oct 11 00:55:03 crc kubenswrapper[4743]: I1011 00:55:03.175673 4743 scope.go:117] "RemoveContainer" containerID="630e04b41586b2c7251db513be7ecdf9db8c21da888b1cd96b6a06136306a298" Oct 11 00:55:03 crc kubenswrapper[4743]: I1011 00:55:03.176548 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hjs95" Oct 11 00:55:03 crc kubenswrapper[4743]: I1011 00:55:03.192511 4743 scope.go:117] "RemoveContainer" containerID="f214539c566eeec4dff3304e94776c2d5605229a598427472143c604e5ef6980" Oct 11 00:55:03 crc kubenswrapper[4743]: I1011 00:55:03.200054 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hjs95"] Oct 11 00:55:03 crc kubenswrapper[4743]: I1011 00:55:03.204623 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hjs95"] Oct 11 00:55:03 crc kubenswrapper[4743]: I1011 00:55:03.218490 4743 scope.go:117] "RemoveContainer" containerID="04501ac8f93cace89c12dfe74e02008693776089e55e0d7f943c972cdad90534" Oct 11 00:55:03 crc kubenswrapper[4743]: I1011 00:55:03.233121 4743 scope.go:117] "RemoveContainer" containerID="630e04b41586b2c7251db513be7ecdf9db8c21da888b1cd96b6a06136306a298" Oct 11 00:55:03 crc kubenswrapper[4743]: E1011 00:55:03.233452 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"630e04b41586b2c7251db513be7ecdf9db8c21da888b1cd96b6a06136306a298\": container with ID starting with 630e04b41586b2c7251db513be7ecdf9db8c21da888b1cd96b6a06136306a298 not found: ID does not exist" containerID="630e04b41586b2c7251db513be7ecdf9db8c21da888b1cd96b6a06136306a298" Oct 11 00:55:03 crc kubenswrapper[4743]: I1011 
00:55:03.233486 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"630e04b41586b2c7251db513be7ecdf9db8c21da888b1cd96b6a06136306a298"} err="failed to get container status \"630e04b41586b2c7251db513be7ecdf9db8c21da888b1cd96b6a06136306a298\": rpc error: code = NotFound desc = could not find container \"630e04b41586b2c7251db513be7ecdf9db8c21da888b1cd96b6a06136306a298\": container with ID starting with 630e04b41586b2c7251db513be7ecdf9db8c21da888b1cd96b6a06136306a298 not found: ID does not exist" Oct 11 00:55:03 crc kubenswrapper[4743]: I1011 00:55:03.233510 4743 scope.go:117] "RemoveContainer" containerID="f214539c566eeec4dff3304e94776c2d5605229a598427472143c604e5ef6980" Oct 11 00:55:03 crc kubenswrapper[4743]: E1011 00:55:03.234112 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f214539c566eeec4dff3304e94776c2d5605229a598427472143c604e5ef6980\": container with ID starting with f214539c566eeec4dff3304e94776c2d5605229a598427472143c604e5ef6980 not found: ID does not exist" containerID="f214539c566eeec4dff3304e94776c2d5605229a598427472143c604e5ef6980" Oct 11 00:55:03 crc kubenswrapper[4743]: I1011 00:55:03.234132 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f214539c566eeec4dff3304e94776c2d5605229a598427472143c604e5ef6980"} err="failed to get container status \"f214539c566eeec4dff3304e94776c2d5605229a598427472143c604e5ef6980\": rpc error: code = NotFound desc = could not find container \"f214539c566eeec4dff3304e94776c2d5605229a598427472143c604e5ef6980\": container with ID starting with f214539c566eeec4dff3304e94776c2d5605229a598427472143c604e5ef6980 not found: ID does not exist" Oct 11 00:55:03 crc kubenswrapper[4743]: I1011 00:55:03.234145 4743 scope.go:117] "RemoveContainer" containerID="04501ac8f93cace89c12dfe74e02008693776089e55e0d7f943c972cdad90534" Oct 11 00:55:03 crc 
kubenswrapper[4743]: E1011 00:55:03.234308 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04501ac8f93cace89c12dfe74e02008693776089e55e0d7f943c972cdad90534\": container with ID starting with 04501ac8f93cace89c12dfe74e02008693776089e55e0d7f943c972cdad90534 not found: ID does not exist" containerID="04501ac8f93cace89c12dfe74e02008693776089e55e0d7f943c972cdad90534" Oct 11 00:55:03 crc kubenswrapper[4743]: I1011 00:55:03.234324 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04501ac8f93cace89c12dfe74e02008693776089e55e0d7f943c972cdad90534"} err="failed to get container status \"04501ac8f93cace89c12dfe74e02008693776089e55e0d7f943c972cdad90534\": rpc error: code = NotFound desc = could not find container \"04501ac8f93cace89c12dfe74e02008693776089e55e0d7f943c972cdad90534\": container with ID starting with 04501ac8f93cace89c12dfe74e02008693776089e55e0d7f943c972cdad90534 not found: ID does not exist" Oct 11 00:55:04 crc kubenswrapper[4743]: I1011 00:55:04.100240 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de5b1941-3ffa-477d-9b7f-7e07fe3ed206" path="/var/lib/kubelet/pods/de5b1941-3ffa-477d-9b7f-7e07fe3ed206/volumes" Oct 11 00:55:04 crc kubenswrapper[4743]: I1011 00:55:04.182624 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-67kfm" podUID="9b7adc79-4e39-4ed1-8024-3cd0bb96f661" containerName="registry-server" containerID="cri-o://a047a084bd2c2b688f2a6df51215376c4197b49b7ce49e3058d09c1fe870d4d0" gracePeriod=2 Oct 11 00:55:04 crc kubenswrapper[4743]: I1011 00:55:04.565382 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-67kfm" Oct 11 00:55:04 crc kubenswrapper[4743]: I1011 00:55:04.659919 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b7adc79-4e39-4ed1-8024-3cd0bb96f661-utilities\") pod \"9b7adc79-4e39-4ed1-8024-3cd0bb96f661\" (UID: \"9b7adc79-4e39-4ed1-8024-3cd0bb96f661\") " Oct 11 00:55:04 crc kubenswrapper[4743]: I1011 00:55:04.660013 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b7adc79-4e39-4ed1-8024-3cd0bb96f661-catalog-content\") pod \"9b7adc79-4e39-4ed1-8024-3cd0bb96f661\" (UID: \"9b7adc79-4e39-4ed1-8024-3cd0bb96f661\") " Oct 11 00:55:04 crc kubenswrapper[4743]: I1011 00:55:04.660079 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5xp4\" (UniqueName: \"kubernetes.io/projected/9b7adc79-4e39-4ed1-8024-3cd0bb96f661-kube-api-access-n5xp4\") pod \"9b7adc79-4e39-4ed1-8024-3cd0bb96f661\" (UID: \"9b7adc79-4e39-4ed1-8024-3cd0bb96f661\") " Oct 11 00:55:04 crc kubenswrapper[4743]: I1011 00:55:04.661282 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b7adc79-4e39-4ed1-8024-3cd0bb96f661-utilities" (OuterVolumeSpecName: "utilities") pod "9b7adc79-4e39-4ed1-8024-3cd0bb96f661" (UID: "9b7adc79-4e39-4ed1-8024-3cd0bb96f661"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 00:55:04 crc kubenswrapper[4743]: I1011 00:55:04.669004 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b7adc79-4e39-4ed1-8024-3cd0bb96f661-kube-api-access-n5xp4" (OuterVolumeSpecName: "kube-api-access-n5xp4") pod "9b7adc79-4e39-4ed1-8024-3cd0bb96f661" (UID: "9b7adc79-4e39-4ed1-8024-3cd0bb96f661"). InnerVolumeSpecName "kube-api-access-n5xp4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 00:55:04 crc kubenswrapper[4743]: I1011 00:55:04.672810 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b7adc79-4e39-4ed1-8024-3cd0bb96f661-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9b7adc79-4e39-4ed1-8024-3cd0bb96f661" (UID: "9b7adc79-4e39-4ed1-8024-3cd0bb96f661"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 00:55:04 crc kubenswrapper[4743]: I1011 00:55:04.761801 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5xp4\" (UniqueName: \"kubernetes.io/projected/9b7adc79-4e39-4ed1-8024-3cd0bb96f661-kube-api-access-n5xp4\") on node \"crc\" DevicePath \"\"" Oct 11 00:55:04 crc kubenswrapper[4743]: I1011 00:55:04.761875 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b7adc79-4e39-4ed1-8024-3cd0bb96f661-utilities\") on node \"crc\" DevicePath \"\"" Oct 11 00:55:04 crc kubenswrapper[4743]: I1011 00:55:04.761890 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b7adc79-4e39-4ed1-8024-3cd0bb96f661-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 11 00:55:04 crc kubenswrapper[4743]: I1011 00:55:04.922295 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wxvl7"] Oct 11 00:55:04 crc kubenswrapper[4743]: I1011 00:55:04.922578 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wxvl7" podUID="7d89bc6e-6240-4acc-866a-347de2b7bc0a" containerName="registry-server" containerID="cri-o://62c832c1b56d61416865683f5d999c1ea6185f16021f5f0592ef3589c8cd274b" gracePeriod=2 Oct 11 00:55:05 crc kubenswrapper[4743]: I1011 00:55:05.191088 4743 generic.go:334] "Generic (PLEG): container finished" 
podID="9b7adc79-4e39-4ed1-8024-3cd0bb96f661" containerID="a047a084bd2c2b688f2a6df51215376c4197b49b7ce49e3058d09c1fe870d4d0" exitCode=0 Oct 11 00:55:05 crc kubenswrapper[4743]: I1011 00:55:05.191167 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-67kfm" event={"ID":"9b7adc79-4e39-4ed1-8024-3cd0bb96f661","Type":"ContainerDied","Data":"a047a084bd2c2b688f2a6df51215376c4197b49b7ce49e3058d09c1fe870d4d0"} Oct 11 00:55:05 crc kubenswrapper[4743]: I1011 00:55:05.191203 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-67kfm" event={"ID":"9b7adc79-4e39-4ed1-8024-3cd0bb96f661","Type":"ContainerDied","Data":"34465edd74ca7038e1977d3063f90f17baa2e488937a9e4de1a535ac180dce98"} Oct 11 00:55:05 crc kubenswrapper[4743]: I1011 00:55:05.191226 4743 scope.go:117] "RemoveContainer" containerID="a047a084bd2c2b688f2a6df51215376c4197b49b7ce49e3058d09c1fe870d4d0" Oct 11 00:55:05 crc kubenswrapper[4743]: I1011 00:55:05.191228 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-67kfm" Oct 11 00:55:05 crc kubenswrapper[4743]: I1011 00:55:05.193983 4743 generic.go:334] "Generic (PLEG): container finished" podID="7d89bc6e-6240-4acc-866a-347de2b7bc0a" containerID="62c832c1b56d61416865683f5d999c1ea6185f16021f5f0592ef3589c8cd274b" exitCode=0 Oct 11 00:55:05 crc kubenswrapper[4743]: I1011 00:55:05.194035 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wxvl7" event={"ID":"7d89bc6e-6240-4acc-866a-347de2b7bc0a","Type":"ContainerDied","Data":"62c832c1b56d61416865683f5d999c1ea6185f16021f5f0592ef3589c8cd274b"} Oct 11 00:55:05 crc kubenswrapper[4743]: I1011 00:55:05.220563 4743 scope.go:117] "RemoveContainer" containerID="3a7f04248633cb7a1fb0fa72fa8f5409cb70ebb2e21ba5ee8ab80ab66def61fa" Oct 11 00:55:05 crc kubenswrapper[4743]: I1011 00:55:05.236134 4743 scope.go:117] "RemoveContainer" containerID="c9fc6e92823ed599e63c13206db29c911e55943ed38c0edd5b6155f86f572b3a" Oct 11 00:55:05 crc kubenswrapper[4743]: I1011 00:55:05.244851 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-67kfm"] Oct 11 00:55:05 crc kubenswrapper[4743]: I1011 00:55:05.248136 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-67kfm"] Oct 11 00:55:05 crc kubenswrapper[4743]: I1011 00:55:05.259927 4743 scope.go:117] "RemoveContainer" containerID="a047a084bd2c2b688f2a6df51215376c4197b49b7ce49e3058d09c1fe870d4d0" Oct 11 00:55:05 crc kubenswrapper[4743]: E1011 00:55:05.260382 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a047a084bd2c2b688f2a6df51215376c4197b49b7ce49e3058d09c1fe870d4d0\": container with ID starting with a047a084bd2c2b688f2a6df51215376c4197b49b7ce49e3058d09c1fe870d4d0 not found: ID does not exist" containerID="a047a084bd2c2b688f2a6df51215376c4197b49b7ce49e3058d09c1fe870d4d0" 
Oct 11 00:55:05 crc kubenswrapper[4743]: I1011 00:55:05.260419 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a047a084bd2c2b688f2a6df51215376c4197b49b7ce49e3058d09c1fe870d4d0"} err="failed to get container status \"a047a084bd2c2b688f2a6df51215376c4197b49b7ce49e3058d09c1fe870d4d0\": rpc error: code = NotFound desc = could not find container \"a047a084bd2c2b688f2a6df51215376c4197b49b7ce49e3058d09c1fe870d4d0\": container with ID starting with a047a084bd2c2b688f2a6df51215376c4197b49b7ce49e3058d09c1fe870d4d0 not found: ID does not exist" Oct 11 00:55:05 crc kubenswrapper[4743]: I1011 00:55:05.260475 4743 scope.go:117] "RemoveContainer" containerID="3a7f04248633cb7a1fb0fa72fa8f5409cb70ebb2e21ba5ee8ab80ab66def61fa" Oct 11 00:55:05 crc kubenswrapper[4743]: E1011 00:55:05.260722 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a7f04248633cb7a1fb0fa72fa8f5409cb70ebb2e21ba5ee8ab80ab66def61fa\": container with ID starting with 3a7f04248633cb7a1fb0fa72fa8f5409cb70ebb2e21ba5ee8ab80ab66def61fa not found: ID does not exist" containerID="3a7f04248633cb7a1fb0fa72fa8f5409cb70ebb2e21ba5ee8ab80ab66def61fa" Oct 11 00:55:05 crc kubenswrapper[4743]: I1011 00:55:05.260760 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a7f04248633cb7a1fb0fa72fa8f5409cb70ebb2e21ba5ee8ab80ab66def61fa"} err="failed to get container status \"3a7f04248633cb7a1fb0fa72fa8f5409cb70ebb2e21ba5ee8ab80ab66def61fa\": rpc error: code = NotFound desc = could not find container \"3a7f04248633cb7a1fb0fa72fa8f5409cb70ebb2e21ba5ee8ab80ab66def61fa\": container with ID starting with 3a7f04248633cb7a1fb0fa72fa8f5409cb70ebb2e21ba5ee8ab80ab66def61fa not found: ID does not exist" Oct 11 00:55:05 crc kubenswrapper[4743]: I1011 00:55:05.260775 4743 scope.go:117] "RemoveContainer" 
containerID="c9fc6e92823ed599e63c13206db29c911e55943ed38c0edd5b6155f86f572b3a" Oct 11 00:55:05 crc kubenswrapper[4743]: E1011 00:55:05.261140 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9fc6e92823ed599e63c13206db29c911e55943ed38c0edd5b6155f86f572b3a\": container with ID starting with c9fc6e92823ed599e63c13206db29c911e55943ed38c0edd5b6155f86f572b3a not found: ID does not exist" containerID="c9fc6e92823ed599e63c13206db29c911e55943ed38c0edd5b6155f86f572b3a" Oct 11 00:55:05 crc kubenswrapper[4743]: I1011 00:55:05.261170 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9fc6e92823ed599e63c13206db29c911e55943ed38c0edd5b6155f86f572b3a"} err="failed to get container status \"c9fc6e92823ed599e63c13206db29c911e55943ed38c0edd5b6155f86f572b3a\": rpc error: code = NotFound desc = could not find container \"c9fc6e92823ed599e63c13206db29c911e55943ed38c0edd5b6155f86f572b3a\": container with ID starting with c9fc6e92823ed599e63c13206db29c911e55943ed38c0edd5b6155f86f572b3a not found: ID does not exist" Oct 11 00:55:05 crc kubenswrapper[4743]: I1011 00:55:05.262535 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wxvl7" Oct 11 00:55:05 crc kubenswrapper[4743]: I1011 00:55:05.367582 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d89bc6e-6240-4acc-866a-347de2b7bc0a-utilities\") pod \"7d89bc6e-6240-4acc-866a-347de2b7bc0a\" (UID: \"7d89bc6e-6240-4acc-866a-347de2b7bc0a\") " Oct 11 00:55:05 crc kubenswrapper[4743]: I1011 00:55:05.367663 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d89bc6e-6240-4acc-866a-347de2b7bc0a-catalog-content\") pod \"7d89bc6e-6240-4acc-866a-347de2b7bc0a\" (UID: \"7d89bc6e-6240-4acc-866a-347de2b7bc0a\") " Oct 11 00:55:05 crc kubenswrapper[4743]: I1011 00:55:05.367754 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98sjk\" (UniqueName: \"kubernetes.io/projected/7d89bc6e-6240-4acc-866a-347de2b7bc0a-kube-api-access-98sjk\") pod \"7d89bc6e-6240-4acc-866a-347de2b7bc0a\" (UID: \"7d89bc6e-6240-4acc-866a-347de2b7bc0a\") " Oct 11 00:55:05 crc kubenswrapper[4743]: I1011 00:55:05.368433 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d89bc6e-6240-4acc-866a-347de2b7bc0a-utilities" (OuterVolumeSpecName: "utilities") pod "7d89bc6e-6240-4acc-866a-347de2b7bc0a" (UID: "7d89bc6e-6240-4acc-866a-347de2b7bc0a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 00:55:05 crc kubenswrapper[4743]: I1011 00:55:05.372280 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d89bc6e-6240-4acc-866a-347de2b7bc0a-kube-api-access-98sjk" (OuterVolumeSpecName: "kube-api-access-98sjk") pod "7d89bc6e-6240-4acc-866a-347de2b7bc0a" (UID: "7d89bc6e-6240-4acc-866a-347de2b7bc0a"). InnerVolumeSpecName "kube-api-access-98sjk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 00:55:05 crc kubenswrapper[4743]: I1011 00:55:05.443497 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d89bc6e-6240-4acc-866a-347de2b7bc0a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7d89bc6e-6240-4acc-866a-347de2b7bc0a" (UID: "7d89bc6e-6240-4acc-866a-347de2b7bc0a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 00:55:05 crc kubenswrapper[4743]: I1011 00:55:05.469646 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-98sjk\" (UniqueName: \"kubernetes.io/projected/7d89bc6e-6240-4acc-866a-347de2b7bc0a-kube-api-access-98sjk\") on node \"crc\" DevicePath \"\"" Oct 11 00:55:05 crc kubenswrapper[4743]: I1011 00:55:05.469678 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d89bc6e-6240-4acc-866a-347de2b7bc0a-utilities\") on node \"crc\" DevicePath \"\"" Oct 11 00:55:05 crc kubenswrapper[4743]: I1011 00:55:05.469687 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d89bc6e-6240-4acc-866a-347de2b7bc0a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 11 00:55:06 crc kubenswrapper[4743]: I1011 00:55:06.103016 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b7adc79-4e39-4ed1-8024-3cd0bb96f661" path="/var/lib/kubelet/pods/9b7adc79-4e39-4ed1-8024-3cd0bb96f661/volumes" Oct 11 00:55:06 crc kubenswrapper[4743]: I1011 00:55:06.204227 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wxvl7" event={"ID":"7d89bc6e-6240-4acc-866a-347de2b7bc0a","Type":"ContainerDied","Data":"400e311e0c0d4ba65bfc9d826bca26d578b7f0ce1503c1dc8992563f0af5b5e0"} Oct 11 00:55:06 crc kubenswrapper[4743]: I1011 00:55:06.204304 4743 scope.go:117] "RemoveContainer" 
containerID="62c832c1b56d61416865683f5d999c1ea6185f16021f5f0592ef3589c8cd274b" Oct 11 00:55:06 crc kubenswrapper[4743]: I1011 00:55:06.206012 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wxvl7" Oct 11 00:55:06 crc kubenswrapper[4743]: I1011 00:55:06.220332 4743 scope.go:117] "RemoveContainer" containerID="da00ca160bc7183c9da8bb68bfb60368865594cca393277d0d9328553eb02d10" Oct 11 00:55:06 crc kubenswrapper[4743]: I1011 00:55:06.232541 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wxvl7"] Oct 11 00:55:06 crc kubenswrapper[4743]: I1011 00:55:06.248144 4743 scope.go:117] "RemoveContainer" containerID="2f4d3c43be1e46ee9f55df140627d0c9a36385c5cc6061cf122657b8b3767fa8" Oct 11 00:55:06 crc kubenswrapper[4743]: I1011 00:55:06.253422 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wxvl7"] Oct 11 00:55:08 crc kubenswrapper[4743]: I1011 00:55:08.105219 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d89bc6e-6240-4acc-866a-347de2b7bc0a" path="/var/lib/kubelet/pods/7d89bc6e-6240-4acc-866a-347de2b7bc0a/volumes" Oct 11 00:55:14 crc kubenswrapper[4743]: I1011 00:55:14.056152 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-fv4x7"] Oct 11 00:55:14 crc kubenswrapper[4743]: I1011 00:55:14.056654 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-fv4x7" podUID="8693817b-7cf6-486d-a055-93c4c0308d95" containerName="controller-manager" containerID="cri-o://4f7ac059cbc05beab2f9e31382692729609e8ab2a6ea74eb68eb64398b5a2280" gracePeriod=30 Oct 11 00:55:14 crc kubenswrapper[4743]: I1011 00:55:14.151167 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-f2gjt"] Oct 11 00:55:14 crc kubenswrapper[4743]: I1011 00:55:14.151639 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-f2gjt" podUID="2305c756-5c61-4d13-aac9-1ff5d3c6b2ad" containerName="route-controller-manager" containerID="cri-o://9909c64ed346016efe19ad99a71225803560950fbb3f81586ba367f3b5ea7856" gracePeriod=30 Oct 11 00:55:14 crc kubenswrapper[4743]: I1011 00:55:14.257144 4743 generic.go:334] "Generic (PLEG): container finished" podID="8693817b-7cf6-486d-a055-93c4c0308d95" containerID="4f7ac059cbc05beab2f9e31382692729609e8ab2a6ea74eb68eb64398b5a2280" exitCode=0 Oct 11 00:55:14 crc kubenswrapper[4743]: I1011 00:55:14.257183 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-fv4x7" event={"ID":"8693817b-7cf6-486d-a055-93c4c0308d95","Type":"ContainerDied","Data":"4f7ac059cbc05beab2f9e31382692729609e8ab2a6ea74eb68eb64398b5a2280"} Oct 11 00:55:14 crc kubenswrapper[4743]: I1011 00:55:14.369150 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-fv4x7" Oct 11 00:55:14 crc kubenswrapper[4743]: I1011 00:55:14.443284 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-f2gjt" Oct 11 00:55:14 crc kubenswrapper[4743]: I1011 00:55:14.458119 4743 patch_prober.go:28] interesting pod/machine-config-daemon-cvm72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 00:55:14 crc kubenswrapper[4743]: I1011 00:55:14.458170 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 00:55:14 crc kubenswrapper[4743]: I1011 00:55:14.458207 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" Oct 11 00:55:14 crc kubenswrapper[4743]: I1011 00:55:14.458719 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"16b18bcf80747537e2a7a31adc90eec7fdba85d8166f8cfc53709a1dba33de8c"} pod="openshift-machine-config-operator/machine-config-daemon-cvm72" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 11 00:55:14 crc kubenswrapper[4743]: I1011 00:55:14.458767 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" containerID="cri-o://16b18bcf80747537e2a7a31adc90eec7fdba85d8166f8cfc53709a1dba33de8c" gracePeriod=600 Oct 11 00:55:14 crc kubenswrapper[4743]: I1011 00:55:14.473398 4743 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2305c756-5c61-4d13-aac9-1ff5d3c6b2ad-config\") pod \"2305c756-5c61-4d13-aac9-1ff5d3c6b2ad\" (UID: \"2305c756-5c61-4d13-aac9-1ff5d3c6b2ad\") " Oct 11 00:55:14 crc kubenswrapper[4743]: I1011 00:55:14.473477 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2305c756-5c61-4d13-aac9-1ff5d3c6b2ad-serving-cert\") pod \"2305c756-5c61-4d13-aac9-1ff5d3c6b2ad\" (UID: \"2305c756-5c61-4d13-aac9-1ff5d3c6b2ad\") " Oct 11 00:55:14 crc kubenswrapper[4743]: I1011 00:55:14.473520 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7p6h\" (UniqueName: \"kubernetes.io/projected/8693817b-7cf6-486d-a055-93c4c0308d95-kube-api-access-f7p6h\") pod \"8693817b-7cf6-486d-a055-93c4c0308d95\" (UID: \"8693817b-7cf6-486d-a055-93c4c0308d95\") " Oct 11 00:55:14 crc kubenswrapper[4743]: I1011 00:55:14.473547 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8693817b-7cf6-486d-a055-93c4c0308d95-serving-cert\") pod \"8693817b-7cf6-486d-a055-93c4c0308d95\" (UID: \"8693817b-7cf6-486d-a055-93c4c0308d95\") " Oct 11 00:55:14 crc kubenswrapper[4743]: I1011 00:55:14.473562 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2305c756-5c61-4d13-aac9-1ff5d3c6b2ad-client-ca\") pod \"2305c756-5c61-4d13-aac9-1ff5d3c6b2ad\" (UID: \"2305c756-5c61-4d13-aac9-1ff5d3c6b2ad\") " Oct 11 00:55:14 crc kubenswrapper[4743]: I1011 00:55:14.473581 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8693817b-7cf6-486d-a055-93c4c0308d95-client-ca\") pod \"8693817b-7cf6-486d-a055-93c4c0308d95\" (UID: \"8693817b-7cf6-486d-a055-93c4c0308d95\") " Oct 
11 00:55:14 crc kubenswrapper[4743]: I1011 00:55:14.473601 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjmcn\" (UniqueName: \"kubernetes.io/projected/2305c756-5c61-4d13-aac9-1ff5d3c6b2ad-kube-api-access-hjmcn\") pod \"2305c756-5c61-4d13-aac9-1ff5d3c6b2ad\" (UID: \"2305c756-5c61-4d13-aac9-1ff5d3c6b2ad\") " Oct 11 00:55:14 crc kubenswrapper[4743]: I1011 00:55:14.473655 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8693817b-7cf6-486d-a055-93c4c0308d95-config\") pod \"8693817b-7cf6-486d-a055-93c4c0308d95\" (UID: \"8693817b-7cf6-486d-a055-93c4c0308d95\") " Oct 11 00:55:14 crc kubenswrapper[4743]: I1011 00:55:14.473677 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8693817b-7cf6-486d-a055-93c4c0308d95-proxy-ca-bundles\") pod \"8693817b-7cf6-486d-a055-93c4c0308d95\" (UID: \"8693817b-7cf6-486d-a055-93c4c0308d95\") " Oct 11 00:55:14 crc kubenswrapper[4743]: I1011 00:55:14.474253 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8693817b-7cf6-486d-a055-93c4c0308d95-client-ca" (OuterVolumeSpecName: "client-ca") pod "8693817b-7cf6-486d-a055-93c4c0308d95" (UID: "8693817b-7cf6-486d-a055-93c4c0308d95"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 00:55:14 crc kubenswrapper[4743]: I1011 00:55:14.474435 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2305c756-5c61-4d13-aac9-1ff5d3c6b2ad-client-ca" (OuterVolumeSpecName: "client-ca") pod "2305c756-5c61-4d13-aac9-1ff5d3c6b2ad" (UID: "2305c756-5c61-4d13-aac9-1ff5d3c6b2ad"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 00:55:14 crc kubenswrapper[4743]: I1011 00:55:14.474450 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2305c756-5c61-4d13-aac9-1ff5d3c6b2ad-config" (OuterVolumeSpecName: "config") pod "2305c756-5c61-4d13-aac9-1ff5d3c6b2ad" (UID: "2305c756-5c61-4d13-aac9-1ff5d3c6b2ad"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 00:55:14 crc kubenswrapper[4743]: I1011 00:55:14.474662 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8693817b-7cf6-486d-a055-93c4c0308d95-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "8693817b-7cf6-486d-a055-93c4c0308d95" (UID: "8693817b-7cf6-486d-a055-93c4c0308d95"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 00:55:14 crc kubenswrapper[4743]: I1011 00:55:14.474723 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8693817b-7cf6-486d-a055-93c4c0308d95-config" (OuterVolumeSpecName: "config") pod "8693817b-7cf6-486d-a055-93c4c0308d95" (UID: "8693817b-7cf6-486d-a055-93c4c0308d95"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 00:55:14 crc kubenswrapper[4743]: I1011 00:55:14.474444 4743 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8693817b-7cf6-486d-a055-93c4c0308d95-client-ca\") on node \"crc\" DevicePath \"\"" Oct 11 00:55:14 crc kubenswrapper[4743]: I1011 00:55:14.479191 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2305c756-5c61-4d13-aac9-1ff5d3c6b2ad-kube-api-access-hjmcn" (OuterVolumeSpecName: "kube-api-access-hjmcn") pod "2305c756-5c61-4d13-aac9-1ff5d3c6b2ad" (UID: "2305c756-5c61-4d13-aac9-1ff5d3c6b2ad"). 
InnerVolumeSpecName "kube-api-access-hjmcn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 00:55:14 crc kubenswrapper[4743]: I1011 00:55:14.479270 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2305c756-5c61-4d13-aac9-1ff5d3c6b2ad-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2305c756-5c61-4d13-aac9-1ff5d3c6b2ad" (UID: "2305c756-5c61-4d13-aac9-1ff5d3c6b2ad"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 00:55:14 crc kubenswrapper[4743]: I1011 00:55:14.479403 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8693817b-7cf6-486d-a055-93c4c0308d95-kube-api-access-f7p6h" (OuterVolumeSpecName: "kube-api-access-f7p6h") pod "8693817b-7cf6-486d-a055-93c4c0308d95" (UID: "8693817b-7cf6-486d-a055-93c4c0308d95"). InnerVolumeSpecName "kube-api-access-f7p6h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 00:55:14 crc kubenswrapper[4743]: I1011 00:55:14.480804 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8693817b-7cf6-486d-a055-93c4c0308d95-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8693817b-7cf6-486d-a055-93c4c0308d95" (UID: "8693817b-7cf6-486d-a055-93c4c0308d95"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 00:55:14 crc kubenswrapper[4743]: I1011 00:55:14.576058 4743 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8693817b-7cf6-486d-a055-93c4c0308d95-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 11 00:55:14 crc kubenswrapper[4743]: I1011 00:55:14.576319 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2305c756-5c61-4d13-aac9-1ff5d3c6b2ad-config\") on node \"crc\" DevicePath \"\"" Oct 11 00:55:14 crc kubenswrapper[4743]: I1011 00:55:14.576328 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2305c756-5c61-4d13-aac9-1ff5d3c6b2ad-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 11 00:55:14 crc kubenswrapper[4743]: I1011 00:55:14.576352 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f7p6h\" (UniqueName: \"kubernetes.io/projected/8693817b-7cf6-486d-a055-93c4c0308d95-kube-api-access-f7p6h\") on node \"crc\" DevicePath \"\"" Oct 11 00:55:14 crc kubenswrapper[4743]: I1011 00:55:14.576363 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8693817b-7cf6-486d-a055-93c4c0308d95-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 11 00:55:14 crc kubenswrapper[4743]: I1011 00:55:14.576370 4743 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2305c756-5c61-4d13-aac9-1ff5d3c6b2ad-client-ca\") on node \"crc\" DevicePath \"\"" Oct 11 00:55:14 crc kubenswrapper[4743]: I1011 00:55:14.576378 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hjmcn\" (UniqueName: \"kubernetes.io/projected/2305c756-5c61-4d13-aac9-1ff5d3c6b2ad-kube-api-access-hjmcn\") on node \"crc\" DevicePath \"\"" Oct 11 00:55:14 crc kubenswrapper[4743]: I1011 00:55:14.576386 4743 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8693817b-7cf6-486d-a055-93c4c0308d95-config\") on node \"crc\" DevicePath \"\"" Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.264092 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-fv4x7" Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.264087 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-fv4x7" event={"ID":"8693817b-7cf6-486d-a055-93c4c0308d95","Type":"ContainerDied","Data":"925237d2417c4fb065179b4e9305fcb9f85e2f8b0c6276c96581ae216da79b22"} Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.264485 4743 scope.go:117] "RemoveContainer" containerID="4f7ac059cbc05beab2f9e31382692729609e8ab2a6ea74eb68eb64398b5a2280" Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.266849 4743 generic.go:334] "Generic (PLEG): container finished" podID="add92263-e252-446b-95de-092585b4357f" containerID="16b18bcf80747537e2a7a31adc90eec7fdba85d8166f8cfc53709a1dba33de8c" exitCode=0 Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.266910 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" event={"ID":"add92263-e252-446b-95de-092585b4357f","Type":"ContainerDied","Data":"16b18bcf80747537e2a7a31adc90eec7fdba85d8166f8cfc53709a1dba33de8c"} Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.266954 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" event={"ID":"add92263-e252-446b-95de-092585b4357f","Type":"ContainerStarted","Data":"d4ccb047d6639dbadc8db37d34bacbcce79ae6b61d67f9ebe1557bf1798750cf"} Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.269042 4743 generic.go:334] "Generic (PLEG): container finished" 
podID="2305c756-5c61-4d13-aac9-1ff5d3c6b2ad" containerID="9909c64ed346016efe19ad99a71225803560950fbb3f81586ba367f3b5ea7856" exitCode=0 Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.269069 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-f2gjt" event={"ID":"2305c756-5c61-4d13-aac9-1ff5d3c6b2ad","Type":"ContainerDied","Data":"9909c64ed346016efe19ad99a71225803560950fbb3f81586ba367f3b5ea7856"} Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.269085 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-f2gjt" event={"ID":"2305c756-5c61-4d13-aac9-1ff5d3c6b2ad","Type":"ContainerDied","Data":"ab726670e1efb6c2cfcdffbcc9be2764cad49feb3f410b47bb64b0629ed4d67d"} Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.269102 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-f2gjt" Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.291990 4743 scope.go:117] "RemoveContainer" containerID="9909c64ed346016efe19ad99a71225803560950fbb3f81586ba367f3b5ea7856" Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.312824 4743 scope.go:117] "RemoveContainer" containerID="9909c64ed346016efe19ad99a71225803560950fbb3f81586ba367f3b5ea7856" Oct 11 00:55:15 crc kubenswrapper[4743]: E1011 00:55:15.313321 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9909c64ed346016efe19ad99a71225803560950fbb3f81586ba367f3b5ea7856\": container with ID starting with 9909c64ed346016efe19ad99a71225803560950fbb3f81586ba367f3b5ea7856 not found: ID does not exist" containerID="9909c64ed346016efe19ad99a71225803560950fbb3f81586ba367f3b5ea7856" Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.313366 4743 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"9909c64ed346016efe19ad99a71225803560950fbb3f81586ba367f3b5ea7856"} err="failed to get container status \"9909c64ed346016efe19ad99a71225803560950fbb3f81586ba367f3b5ea7856\": rpc error: code = NotFound desc = could not find container \"9909c64ed346016efe19ad99a71225803560950fbb3f81586ba367f3b5ea7856\": container with ID starting with 9909c64ed346016efe19ad99a71225803560950fbb3f81586ba367f3b5ea7856 not found: ID does not exist" Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.323023 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-fv4x7"] Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.323282 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-fv4x7"] Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.331717 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-f2gjt"] Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.337994 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-f2gjt"] Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.344366 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5d4c8c5d5-cxvxl"] Oct 11 00:55:15 crc kubenswrapper[4743]: E1011 00:55:15.344544 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d89bc6e-6240-4acc-866a-347de2b7bc0a" containerName="extract-content" Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.344555 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d89bc6e-6240-4acc-866a-347de2b7bc0a" containerName="extract-content" Oct 11 00:55:15 crc kubenswrapper[4743]: E1011 00:55:15.344575 4743 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="de5b1941-3ffa-477d-9b7f-7e07fe3ed206" containerName="registry-server" Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.344581 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="de5b1941-3ffa-477d-9b7f-7e07fe3ed206" containerName="registry-server" Oct 11 00:55:15 crc kubenswrapper[4743]: E1011 00:55:15.344590 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="712f7dea-8de1-43e3-802b-b4e9b521b0b6" containerName="registry-server" Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.344596 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="712f7dea-8de1-43e3-802b-b4e9b521b0b6" containerName="registry-server" Oct 11 00:55:15 crc kubenswrapper[4743]: E1011 00:55:15.344605 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f26ac0d-8683-415e-850c-5aef3da4b59f" containerName="image-pruner" Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.344612 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f26ac0d-8683-415e-850c-5aef3da4b59f" containerName="image-pruner" Oct 11 00:55:15 crc kubenswrapper[4743]: E1011 00:55:15.344619 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="712f7dea-8de1-43e3-802b-b4e9b521b0b6" containerName="extract-utilities" Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.344625 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="712f7dea-8de1-43e3-802b-b4e9b521b0b6" containerName="extract-utilities" Oct 11 00:55:15 crc kubenswrapper[4743]: E1011 00:55:15.344634 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de5b1941-3ffa-477d-9b7f-7e07fe3ed206" containerName="extract-utilities" Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.344640 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="de5b1941-3ffa-477d-9b7f-7e07fe3ed206" containerName="extract-utilities" Oct 11 00:55:15 crc kubenswrapper[4743]: E1011 00:55:15.344648 4743 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7d89bc6e-6240-4acc-866a-347de2b7bc0a" containerName="extract-utilities" Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.344654 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d89bc6e-6240-4acc-866a-347de2b7bc0a" containerName="extract-utilities" Oct 11 00:55:15 crc kubenswrapper[4743]: E1011 00:55:15.344662 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="712f7dea-8de1-43e3-802b-b4e9b521b0b6" containerName="extract-content" Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.344667 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="712f7dea-8de1-43e3-802b-b4e9b521b0b6" containerName="extract-content" Oct 11 00:55:15 crc kubenswrapper[4743]: E1011 00:55:15.344674 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8693817b-7cf6-486d-a055-93c4c0308d95" containerName="controller-manager" Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.344680 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="8693817b-7cf6-486d-a055-93c4c0308d95" containerName="controller-manager" Oct 11 00:55:15 crc kubenswrapper[4743]: E1011 00:55:15.344686 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="414ae978-d3eb-4912-918a-72f472a48b45" containerName="pruner" Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.344692 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="414ae978-d3eb-4912-918a-72f472a48b45" containerName="pruner" Oct 11 00:55:15 crc kubenswrapper[4743]: E1011 00:55:15.344700 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9862ed4b-81c0-40c0-80de-863846831655" containerName="pruner" Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.344705 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="9862ed4b-81c0-40c0-80de-863846831655" containerName="pruner" Oct 11 00:55:15 crc kubenswrapper[4743]: E1011 00:55:15.344712 4743 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2305c756-5c61-4d13-aac9-1ff5d3c6b2ad" containerName="route-controller-manager" Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.344717 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="2305c756-5c61-4d13-aac9-1ff5d3c6b2ad" containerName="route-controller-manager" Oct 11 00:55:15 crc kubenswrapper[4743]: E1011 00:55:15.344724 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b7adc79-4e39-4ed1-8024-3cd0bb96f661" containerName="extract-content" Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.344730 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b7adc79-4e39-4ed1-8024-3cd0bb96f661" containerName="extract-content" Oct 11 00:55:15 crc kubenswrapper[4743]: E1011 00:55:15.344740 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b7adc79-4e39-4ed1-8024-3cd0bb96f661" containerName="registry-server" Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.344745 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b7adc79-4e39-4ed1-8024-3cd0bb96f661" containerName="registry-server" Oct 11 00:55:15 crc kubenswrapper[4743]: E1011 00:55:15.344752 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d89bc6e-6240-4acc-866a-347de2b7bc0a" containerName="registry-server" Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.344758 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d89bc6e-6240-4acc-866a-347de2b7bc0a" containerName="registry-server" Oct 11 00:55:15 crc kubenswrapper[4743]: E1011 00:55:15.344764 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de5b1941-3ffa-477d-9b7f-7e07fe3ed206" containerName="extract-content" Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.344769 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="de5b1941-3ffa-477d-9b7f-7e07fe3ed206" containerName="extract-content" Oct 11 00:55:15 crc kubenswrapper[4743]: E1011 00:55:15.344777 4743 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="9b7adc79-4e39-4ed1-8024-3cd0bb96f661" containerName="extract-utilities" Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.344783 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b7adc79-4e39-4ed1-8024-3cd0bb96f661" containerName="extract-utilities" Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.347133 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f26ac0d-8683-415e-850c-5aef3da4b59f" containerName="image-pruner" Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.347296 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d89bc6e-6240-4acc-866a-347de2b7bc0a" containerName="registry-server" Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.347306 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b7adc79-4e39-4ed1-8024-3cd0bb96f661" containerName="registry-server" Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.347316 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="414ae978-d3eb-4912-918a-72f472a48b45" containerName="pruner" Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.347322 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="9862ed4b-81c0-40c0-80de-863846831655" containerName="pruner" Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.347331 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="2305c756-5c61-4d13-aac9-1ff5d3c6b2ad" containerName="route-controller-manager" Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.347341 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="712f7dea-8de1-43e3-802b-b4e9b521b0b6" containerName="registry-server" Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.347351 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="de5b1941-3ffa-477d-9b7f-7e07fe3ed206" containerName="registry-server" Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.347358 4743 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="8693817b-7cf6-486d-a055-93c4c0308d95" containerName="controller-manager" Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.348105 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5d4c8c5d5-cxvxl" Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.351428 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.351769 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.352203 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.352452 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.352676 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.352973 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.355140 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-85fc46bdd7-29hfk"] Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.363084 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.363895 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-85fc46bdd7-29hfk" Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.371193 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.371388 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.374459 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5d4c8c5d5-cxvxl"] Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.377548 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-85fc46bdd7-29hfk"] Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.380679 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.380771 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.380681 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.380814 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.391155 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77feb1b3-64be-403d-bf55-3f8ab1502a02-serving-cert\") pod \"route-controller-manager-85fc46bdd7-29hfk\" (UID: 
\"77feb1b3-64be-403d-bf55-3f8ab1502a02\") " pod="openshift-route-controller-manager/route-controller-manager-85fc46bdd7-29hfk" Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.391248 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/251a8f2c-3e2c-46cf-b2f8-6f0216dac488-config\") pod \"controller-manager-5d4c8c5d5-cxvxl\" (UID: \"251a8f2c-3e2c-46cf-b2f8-6f0216dac488\") " pod="openshift-controller-manager/controller-manager-5d4c8c5d5-cxvxl" Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.391315 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/77feb1b3-64be-403d-bf55-3f8ab1502a02-client-ca\") pod \"route-controller-manager-85fc46bdd7-29hfk\" (UID: \"77feb1b3-64be-403d-bf55-3f8ab1502a02\") " pod="openshift-route-controller-manager/route-controller-manager-85fc46bdd7-29hfk" Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.391369 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwlck\" (UniqueName: \"kubernetes.io/projected/251a8f2c-3e2c-46cf-b2f8-6f0216dac488-kube-api-access-wwlck\") pod \"controller-manager-5d4c8c5d5-cxvxl\" (UID: \"251a8f2c-3e2c-46cf-b2f8-6f0216dac488\") " pod="openshift-controller-manager/controller-manager-5d4c8c5d5-cxvxl" Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.391390 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77feb1b3-64be-403d-bf55-3f8ab1502a02-config\") pod \"route-controller-manager-85fc46bdd7-29hfk\" (UID: \"77feb1b3-64be-403d-bf55-3f8ab1502a02\") " pod="openshift-route-controller-manager/route-controller-manager-85fc46bdd7-29hfk" Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.391457 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7czp\" (UniqueName: \"kubernetes.io/projected/77feb1b3-64be-403d-bf55-3f8ab1502a02-kube-api-access-l7czp\") pod \"route-controller-manager-85fc46bdd7-29hfk\" (UID: \"77feb1b3-64be-403d-bf55-3f8ab1502a02\") " pod="openshift-route-controller-manager/route-controller-manager-85fc46bdd7-29hfk" Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.391478 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/251a8f2c-3e2c-46cf-b2f8-6f0216dac488-serving-cert\") pod \"controller-manager-5d4c8c5d5-cxvxl\" (UID: \"251a8f2c-3e2c-46cf-b2f8-6f0216dac488\") " pod="openshift-controller-manager/controller-manager-5d4c8c5d5-cxvxl" Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.391520 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/251a8f2c-3e2c-46cf-b2f8-6f0216dac488-client-ca\") pod \"controller-manager-5d4c8c5d5-cxvxl\" (UID: \"251a8f2c-3e2c-46cf-b2f8-6f0216dac488\") " pod="openshift-controller-manager/controller-manager-5d4c8c5d5-cxvxl" Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.391540 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/251a8f2c-3e2c-46cf-b2f8-6f0216dac488-proxy-ca-bundles\") pod \"controller-manager-5d4c8c5d5-cxvxl\" (UID: \"251a8f2c-3e2c-46cf-b2f8-6f0216dac488\") " pod="openshift-controller-manager/controller-manager-5d4c8c5d5-cxvxl" Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.492023 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7czp\" (UniqueName: \"kubernetes.io/projected/77feb1b3-64be-403d-bf55-3f8ab1502a02-kube-api-access-l7czp\") pod 
\"route-controller-manager-85fc46bdd7-29hfk\" (UID: \"77feb1b3-64be-403d-bf55-3f8ab1502a02\") " pod="openshift-route-controller-manager/route-controller-manager-85fc46bdd7-29hfk" Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.492423 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/251a8f2c-3e2c-46cf-b2f8-6f0216dac488-serving-cert\") pod \"controller-manager-5d4c8c5d5-cxvxl\" (UID: \"251a8f2c-3e2c-46cf-b2f8-6f0216dac488\") " pod="openshift-controller-manager/controller-manager-5d4c8c5d5-cxvxl" Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.493367 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/251a8f2c-3e2c-46cf-b2f8-6f0216dac488-client-ca\") pod \"controller-manager-5d4c8c5d5-cxvxl\" (UID: \"251a8f2c-3e2c-46cf-b2f8-6f0216dac488\") " pod="openshift-controller-manager/controller-manager-5d4c8c5d5-cxvxl" Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.493406 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/251a8f2c-3e2c-46cf-b2f8-6f0216dac488-proxy-ca-bundles\") pod \"controller-manager-5d4c8c5d5-cxvxl\" (UID: \"251a8f2c-3e2c-46cf-b2f8-6f0216dac488\") " pod="openshift-controller-manager/controller-manager-5d4c8c5d5-cxvxl" Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.493442 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77feb1b3-64be-403d-bf55-3f8ab1502a02-serving-cert\") pod \"route-controller-manager-85fc46bdd7-29hfk\" (UID: \"77feb1b3-64be-403d-bf55-3f8ab1502a02\") " pod="openshift-route-controller-manager/route-controller-manager-85fc46bdd7-29hfk" Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.493497 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/251a8f2c-3e2c-46cf-b2f8-6f0216dac488-config\") pod \"controller-manager-5d4c8c5d5-cxvxl\" (UID: \"251a8f2c-3e2c-46cf-b2f8-6f0216dac488\") " pod="openshift-controller-manager/controller-manager-5d4c8c5d5-cxvxl" Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.493517 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/77feb1b3-64be-403d-bf55-3f8ab1502a02-client-ca\") pod \"route-controller-manager-85fc46bdd7-29hfk\" (UID: \"77feb1b3-64be-403d-bf55-3f8ab1502a02\") " pod="openshift-route-controller-manager/route-controller-manager-85fc46bdd7-29hfk" Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.493542 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwlck\" (UniqueName: \"kubernetes.io/projected/251a8f2c-3e2c-46cf-b2f8-6f0216dac488-kube-api-access-wwlck\") pod \"controller-manager-5d4c8c5d5-cxvxl\" (UID: \"251a8f2c-3e2c-46cf-b2f8-6f0216dac488\") " pod="openshift-controller-manager/controller-manager-5d4c8c5d5-cxvxl" Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.493568 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77feb1b3-64be-403d-bf55-3f8ab1502a02-config\") pod \"route-controller-manager-85fc46bdd7-29hfk\" (UID: \"77feb1b3-64be-403d-bf55-3f8ab1502a02\") " pod="openshift-route-controller-manager/route-controller-manager-85fc46bdd7-29hfk" Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.494326 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/251a8f2c-3e2c-46cf-b2f8-6f0216dac488-client-ca\") pod \"controller-manager-5d4c8c5d5-cxvxl\" (UID: \"251a8f2c-3e2c-46cf-b2f8-6f0216dac488\") " pod="openshift-controller-manager/controller-manager-5d4c8c5d5-cxvxl" Oct 11 00:55:15 crc kubenswrapper[4743]: 
I1011 00:55:15.494582 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/251a8f2c-3e2c-46cf-b2f8-6f0216dac488-proxy-ca-bundles\") pod \"controller-manager-5d4c8c5d5-cxvxl\" (UID: \"251a8f2c-3e2c-46cf-b2f8-6f0216dac488\") " pod="openshift-controller-manager/controller-manager-5d4c8c5d5-cxvxl" Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.494958 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77feb1b3-64be-403d-bf55-3f8ab1502a02-config\") pod \"route-controller-manager-85fc46bdd7-29hfk\" (UID: \"77feb1b3-64be-403d-bf55-3f8ab1502a02\") " pod="openshift-route-controller-manager/route-controller-manager-85fc46bdd7-29hfk" Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.495158 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/77feb1b3-64be-403d-bf55-3f8ab1502a02-client-ca\") pod \"route-controller-manager-85fc46bdd7-29hfk\" (UID: \"77feb1b3-64be-403d-bf55-3f8ab1502a02\") " pod="openshift-route-controller-manager/route-controller-manager-85fc46bdd7-29hfk" Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.495379 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/251a8f2c-3e2c-46cf-b2f8-6f0216dac488-config\") pod \"controller-manager-5d4c8c5d5-cxvxl\" (UID: \"251a8f2c-3e2c-46cf-b2f8-6f0216dac488\") " pod="openshift-controller-manager/controller-manager-5d4c8c5d5-cxvxl" Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.498694 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/251a8f2c-3e2c-46cf-b2f8-6f0216dac488-serving-cert\") pod \"controller-manager-5d4c8c5d5-cxvxl\" (UID: \"251a8f2c-3e2c-46cf-b2f8-6f0216dac488\") " 
pod="openshift-controller-manager/controller-manager-5d4c8c5d5-cxvxl" Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.500311 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77feb1b3-64be-403d-bf55-3f8ab1502a02-serving-cert\") pod \"route-controller-manager-85fc46bdd7-29hfk\" (UID: \"77feb1b3-64be-403d-bf55-3f8ab1502a02\") " pod="openshift-route-controller-manager/route-controller-manager-85fc46bdd7-29hfk" Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.510485 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7czp\" (UniqueName: \"kubernetes.io/projected/77feb1b3-64be-403d-bf55-3f8ab1502a02-kube-api-access-l7czp\") pod \"route-controller-manager-85fc46bdd7-29hfk\" (UID: \"77feb1b3-64be-403d-bf55-3f8ab1502a02\") " pod="openshift-route-controller-manager/route-controller-manager-85fc46bdd7-29hfk" Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.514903 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwlck\" (UniqueName: \"kubernetes.io/projected/251a8f2c-3e2c-46cf-b2f8-6f0216dac488-kube-api-access-wwlck\") pod \"controller-manager-5d4c8c5d5-cxvxl\" (UID: \"251a8f2c-3e2c-46cf-b2f8-6f0216dac488\") " pod="openshift-controller-manager/controller-manager-5d4c8c5d5-cxvxl" Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.661666 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5d4c8c5d5-cxvxl" Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.685488 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-85fc46bdd7-29hfk" Oct 11 00:55:15 crc kubenswrapper[4743]: I1011 00:55:15.922689 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5d4c8c5d5-cxvxl"] Oct 11 00:55:15 crc kubenswrapper[4743]: W1011 00:55:15.928720 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod251a8f2c_3e2c_46cf_b2f8_6f0216dac488.slice/crio-c34f2a65d23e48d222d8ade1da5ba5dd357a2f9c6a816566959e23ed5d046837 WatchSource:0}: Error finding container c34f2a65d23e48d222d8ade1da5ba5dd357a2f9c6a816566959e23ed5d046837: Status 404 returned error can't find the container with id c34f2a65d23e48d222d8ade1da5ba5dd357a2f9c6a816566959e23ed5d046837 Oct 11 00:55:16 crc kubenswrapper[4743]: I1011 00:55:16.090251 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-85fc46bdd7-29hfk"] Oct 11 00:55:16 crc kubenswrapper[4743]: I1011 00:55:16.103399 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2305c756-5c61-4d13-aac9-1ff5d3c6b2ad" path="/var/lib/kubelet/pods/2305c756-5c61-4d13-aac9-1ff5d3c6b2ad/volumes" Oct 11 00:55:16 crc kubenswrapper[4743]: I1011 00:55:16.104707 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8693817b-7cf6-486d-a055-93c4c0308d95" path="/var/lib/kubelet/pods/8693817b-7cf6-486d-a055-93c4c0308d95/volumes" Oct 11 00:55:16 crc kubenswrapper[4743]: W1011 00:55:16.107192 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77feb1b3_64be_403d_bf55_3f8ab1502a02.slice/crio-0b70814c9f02b6c824b2edbc21129535f5bfc69a2165daf80a4637c8beda2585 WatchSource:0}: Error finding container 0b70814c9f02b6c824b2edbc21129535f5bfc69a2165daf80a4637c8beda2585: Status 404 returned error can't find the 
container with id 0b70814c9f02b6c824b2edbc21129535f5bfc69a2165daf80a4637c8beda2585 Oct 11 00:55:16 crc kubenswrapper[4743]: I1011 00:55:16.274936 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-85fc46bdd7-29hfk" event={"ID":"77feb1b3-64be-403d-bf55-3f8ab1502a02","Type":"ContainerStarted","Data":"fa0126a3df486828ef9081598c48a636704167ce0b83fcabd4f0e2a400e72bfa"} Oct 11 00:55:16 crc kubenswrapper[4743]: I1011 00:55:16.275237 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-85fc46bdd7-29hfk" event={"ID":"77feb1b3-64be-403d-bf55-3f8ab1502a02","Type":"ContainerStarted","Data":"0b70814c9f02b6c824b2edbc21129535f5bfc69a2165daf80a4637c8beda2585"} Oct 11 00:55:16 crc kubenswrapper[4743]: I1011 00:55:16.275586 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-85fc46bdd7-29hfk" Oct 11 00:55:16 crc kubenswrapper[4743]: I1011 00:55:16.276811 4743 patch_prober.go:28] interesting pod/route-controller-manager-85fc46bdd7-29hfk container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.56:8443/healthz\": dial tcp 10.217.0.56:8443: connect: connection refused" start-of-body= Oct 11 00:55:16 crc kubenswrapper[4743]: I1011 00:55:16.276878 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-85fc46bdd7-29hfk" podUID="77feb1b3-64be-403d-bf55-3f8ab1502a02" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.56:8443/healthz\": dial tcp 10.217.0.56:8443: connect: connection refused" Oct 11 00:55:16 crc kubenswrapper[4743]: I1011 00:55:16.281173 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d4c8c5d5-cxvxl" 
event={"ID":"251a8f2c-3e2c-46cf-b2f8-6f0216dac488","Type":"ContainerStarted","Data":"340ae7b4886f1177d8ed1b33a38d7f4f5fb19d26a084f72b30a6c3bcbb88a24b"} Oct 11 00:55:16 crc kubenswrapper[4743]: I1011 00:55:16.281361 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d4c8c5d5-cxvxl" event={"ID":"251a8f2c-3e2c-46cf-b2f8-6f0216dac488","Type":"ContainerStarted","Data":"c34f2a65d23e48d222d8ade1da5ba5dd357a2f9c6a816566959e23ed5d046837"} Oct 11 00:55:16 crc kubenswrapper[4743]: I1011 00:55:16.282077 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5d4c8c5d5-cxvxl" Oct 11 00:55:16 crc kubenswrapper[4743]: I1011 00:55:16.292035 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5d4c8c5d5-cxvxl" Oct 11 00:55:16 crc kubenswrapper[4743]: I1011 00:55:16.338446 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5d4c8c5d5-cxvxl" podStartSLOduration=2.338424734 podStartE2EDuration="2.338424734s" podCreationTimestamp="2025-10-11 00:55:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 00:55:16.334083282 +0000 UTC m=+210.987063679" watchObservedRunningTime="2025-10-11 00:55:16.338424734 +0000 UTC m=+210.991405132" Oct 11 00:55:16 crc kubenswrapper[4743]: I1011 00:55:16.340067 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-85fc46bdd7-29hfk" podStartSLOduration=2.340060237 podStartE2EDuration="2.340060237s" podCreationTimestamp="2025-10-11 00:55:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 00:55:16.298584849 +0000 UTC 
m=+210.951565256" watchObservedRunningTime="2025-10-11 00:55:16.340060237 +0000 UTC m=+210.993040634" Oct 11 00:55:17 crc kubenswrapper[4743]: I1011 00:55:17.290403 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-85fc46bdd7-29hfk" Oct 11 00:55:34 crc kubenswrapper[4743]: I1011 00:55:34.101718 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5d4c8c5d5-cxvxl"] Oct 11 00:55:34 crc kubenswrapper[4743]: I1011 00:55:34.102509 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5d4c8c5d5-cxvxl" podUID="251a8f2c-3e2c-46cf-b2f8-6f0216dac488" containerName="controller-manager" containerID="cri-o://340ae7b4886f1177d8ed1b33a38d7f4f5fb19d26a084f72b30a6c3bcbb88a24b" gracePeriod=30 Oct 11 00:55:34 crc kubenswrapper[4743]: I1011 00:55:34.109626 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-85fc46bdd7-29hfk"] Oct 11 00:55:34 crc kubenswrapper[4743]: I1011 00:55:34.110164 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-85fc46bdd7-29hfk" podUID="77feb1b3-64be-403d-bf55-3f8ab1502a02" containerName="route-controller-manager" containerID="cri-o://fa0126a3df486828ef9081598c48a636704167ce0b83fcabd4f0e2a400e72bfa" gracePeriod=30 Oct 11 00:55:34 crc kubenswrapper[4743]: I1011 00:55:34.397061 4743 generic.go:334] "Generic (PLEG): container finished" podID="251a8f2c-3e2c-46cf-b2f8-6f0216dac488" containerID="340ae7b4886f1177d8ed1b33a38d7f4f5fb19d26a084f72b30a6c3bcbb88a24b" exitCode=0 Oct 11 00:55:34 crc kubenswrapper[4743]: I1011 00:55:34.397233 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d4c8c5d5-cxvxl" 
event={"ID":"251a8f2c-3e2c-46cf-b2f8-6f0216dac488","Type":"ContainerDied","Data":"340ae7b4886f1177d8ed1b33a38d7f4f5fb19d26a084f72b30a6c3bcbb88a24b"} Oct 11 00:55:34 crc kubenswrapper[4743]: I1011 00:55:34.398576 4743 generic.go:334] "Generic (PLEG): container finished" podID="77feb1b3-64be-403d-bf55-3f8ab1502a02" containerID="fa0126a3df486828ef9081598c48a636704167ce0b83fcabd4f0e2a400e72bfa" exitCode=0 Oct 11 00:55:34 crc kubenswrapper[4743]: I1011 00:55:34.398609 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-85fc46bdd7-29hfk" event={"ID":"77feb1b3-64be-403d-bf55-3f8ab1502a02","Type":"ContainerDied","Data":"fa0126a3df486828ef9081598c48a636704167ce0b83fcabd4f0e2a400e72bfa"} Oct 11 00:55:34 crc kubenswrapper[4743]: I1011 00:55:34.612045 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-85fc46bdd7-29hfk" Oct 11 00:55:34 crc kubenswrapper[4743]: I1011 00:55:34.707908 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5d4c8c5d5-cxvxl" Oct 11 00:55:34 crc kubenswrapper[4743]: I1011 00:55:34.743404 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7czp\" (UniqueName: \"kubernetes.io/projected/77feb1b3-64be-403d-bf55-3f8ab1502a02-kube-api-access-l7czp\") pod \"77feb1b3-64be-403d-bf55-3f8ab1502a02\" (UID: \"77feb1b3-64be-403d-bf55-3f8ab1502a02\") " Oct 11 00:55:34 crc kubenswrapper[4743]: I1011 00:55:34.743459 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77feb1b3-64be-403d-bf55-3f8ab1502a02-config\") pod \"77feb1b3-64be-403d-bf55-3f8ab1502a02\" (UID: \"77feb1b3-64be-403d-bf55-3f8ab1502a02\") " Oct 11 00:55:34 crc kubenswrapper[4743]: I1011 00:55:34.743500 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77feb1b3-64be-403d-bf55-3f8ab1502a02-serving-cert\") pod \"77feb1b3-64be-403d-bf55-3f8ab1502a02\" (UID: \"77feb1b3-64be-403d-bf55-3f8ab1502a02\") " Oct 11 00:55:34 crc kubenswrapper[4743]: I1011 00:55:34.743549 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/77feb1b3-64be-403d-bf55-3f8ab1502a02-client-ca\") pod \"77feb1b3-64be-403d-bf55-3f8ab1502a02\" (UID: \"77feb1b3-64be-403d-bf55-3f8ab1502a02\") " Oct 11 00:55:34 crc kubenswrapper[4743]: I1011 00:55:34.744259 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77feb1b3-64be-403d-bf55-3f8ab1502a02-client-ca" (OuterVolumeSpecName: "client-ca") pod "77feb1b3-64be-403d-bf55-3f8ab1502a02" (UID: "77feb1b3-64be-403d-bf55-3f8ab1502a02"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 00:55:34 crc kubenswrapper[4743]: I1011 00:55:34.744422 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77feb1b3-64be-403d-bf55-3f8ab1502a02-config" (OuterVolumeSpecName: "config") pod "77feb1b3-64be-403d-bf55-3f8ab1502a02" (UID: "77feb1b3-64be-403d-bf55-3f8ab1502a02"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 00:55:34 crc kubenswrapper[4743]: I1011 00:55:34.748734 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77feb1b3-64be-403d-bf55-3f8ab1502a02-kube-api-access-l7czp" (OuterVolumeSpecName: "kube-api-access-l7czp") pod "77feb1b3-64be-403d-bf55-3f8ab1502a02" (UID: "77feb1b3-64be-403d-bf55-3f8ab1502a02"). InnerVolumeSpecName "kube-api-access-l7czp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 00:55:34 crc kubenswrapper[4743]: I1011 00:55:34.749030 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77feb1b3-64be-403d-bf55-3f8ab1502a02-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "77feb1b3-64be-403d-bf55-3f8ab1502a02" (UID: "77feb1b3-64be-403d-bf55-3f8ab1502a02"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 00:55:34 crc kubenswrapper[4743]: I1011 00:55:34.844435 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/251a8f2c-3e2c-46cf-b2f8-6f0216dac488-proxy-ca-bundles\") pod \"251a8f2c-3e2c-46cf-b2f8-6f0216dac488\" (UID: \"251a8f2c-3e2c-46cf-b2f8-6f0216dac488\") " Oct 11 00:55:34 crc kubenswrapper[4743]: I1011 00:55:34.844578 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wwlck\" (UniqueName: \"kubernetes.io/projected/251a8f2c-3e2c-46cf-b2f8-6f0216dac488-kube-api-access-wwlck\") pod \"251a8f2c-3e2c-46cf-b2f8-6f0216dac488\" (UID: \"251a8f2c-3e2c-46cf-b2f8-6f0216dac488\") " Oct 11 00:55:34 crc kubenswrapper[4743]: I1011 00:55:34.844635 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/251a8f2c-3e2c-46cf-b2f8-6f0216dac488-client-ca\") pod \"251a8f2c-3e2c-46cf-b2f8-6f0216dac488\" (UID: \"251a8f2c-3e2c-46cf-b2f8-6f0216dac488\") " Oct 11 00:55:34 crc kubenswrapper[4743]: I1011 00:55:34.844669 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/251a8f2c-3e2c-46cf-b2f8-6f0216dac488-config\") pod \"251a8f2c-3e2c-46cf-b2f8-6f0216dac488\" (UID: \"251a8f2c-3e2c-46cf-b2f8-6f0216dac488\") " Oct 11 00:55:34 crc kubenswrapper[4743]: I1011 00:55:34.844822 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/251a8f2c-3e2c-46cf-b2f8-6f0216dac488-serving-cert\") pod \"251a8f2c-3e2c-46cf-b2f8-6f0216dac488\" (UID: \"251a8f2c-3e2c-46cf-b2f8-6f0216dac488\") " Oct 11 00:55:34 crc kubenswrapper[4743]: I1011 00:55:34.845167 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l7czp\" (UniqueName: 
\"kubernetes.io/projected/77feb1b3-64be-403d-bf55-3f8ab1502a02-kube-api-access-l7czp\") on node \"crc\" DevicePath \"\"" Oct 11 00:55:34 crc kubenswrapper[4743]: I1011 00:55:34.845201 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77feb1b3-64be-403d-bf55-3f8ab1502a02-config\") on node \"crc\" DevicePath \"\"" Oct 11 00:55:34 crc kubenswrapper[4743]: I1011 00:55:34.845219 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77feb1b3-64be-403d-bf55-3f8ab1502a02-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 11 00:55:34 crc kubenswrapper[4743]: I1011 00:55:34.845235 4743 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/77feb1b3-64be-403d-bf55-3f8ab1502a02-client-ca\") on node \"crc\" DevicePath \"\"" Oct 11 00:55:34 crc kubenswrapper[4743]: I1011 00:55:34.845803 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/251a8f2c-3e2c-46cf-b2f8-6f0216dac488-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "251a8f2c-3e2c-46cf-b2f8-6f0216dac488" (UID: "251a8f2c-3e2c-46cf-b2f8-6f0216dac488"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 00:55:34 crc kubenswrapper[4743]: I1011 00:55:34.845848 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/251a8f2c-3e2c-46cf-b2f8-6f0216dac488-client-ca" (OuterVolumeSpecName: "client-ca") pod "251a8f2c-3e2c-46cf-b2f8-6f0216dac488" (UID: "251a8f2c-3e2c-46cf-b2f8-6f0216dac488"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 00:55:34 crc kubenswrapper[4743]: I1011 00:55:34.845914 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/251a8f2c-3e2c-46cf-b2f8-6f0216dac488-config" (OuterVolumeSpecName: "config") pod "251a8f2c-3e2c-46cf-b2f8-6f0216dac488" (UID: "251a8f2c-3e2c-46cf-b2f8-6f0216dac488"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 00:55:34 crc kubenswrapper[4743]: I1011 00:55:34.847661 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/251a8f2c-3e2c-46cf-b2f8-6f0216dac488-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "251a8f2c-3e2c-46cf-b2f8-6f0216dac488" (UID: "251a8f2c-3e2c-46cf-b2f8-6f0216dac488"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 00:55:34 crc kubenswrapper[4743]: I1011 00:55:34.847665 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/251a8f2c-3e2c-46cf-b2f8-6f0216dac488-kube-api-access-wwlck" (OuterVolumeSpecName: "kube-api-access-wwlck") pod "251a8f2c-3e2c-46cf-b2f8-6f0216dac488" (UID: "251a8f2c-3e2c-46cf-b2f8-6f0216dac488"). InnerVolumeSpecName "kube-api-access-wwlck". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 00:55:34 crc kubenswrapper[4743]: I1011 00:55:34.946349 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/251a8f2c-3e2c-46cf-b2f8-6f0216dac488-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 11 00:55:34 crc kubenswrapper[4743]: I1011 00:55:34.946417 4743 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/251a8f2c-3e2c-46cf-b2f8-6f0216dac488-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 11 00:55:34 crc kubenswrapper[4743]: I1011 00:55:34.946433 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wwlck\" (UniqueName: \"kubernetes.io/projected/251a8f2c-3e2c-46cf-b2f8-6f0216dac488-kube-api-access-wwlck\") on node \"crc\" DevicePath \"\"" Oct 11 00:55:34 crc kubenswrapper[4743]: I1011 00:55:34.946444 4743 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/251a8f2c-3e2c-46cf-b2f8-6f0216dac488-client-ca\") on node \"crc\" DevicePath \"\"" Oct 11 00:55:34 crc kubenswrapper[4743]: I1011 00:55:34.946466 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/251a8f2c-3e2c-46cf-b2f8-6f0216dac488-config\") on node \"crc\" DevicePath \"\"" Oct 11 00:55:35 crc kubenswrapper[4743]: I1011 00:55:35.360025 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5ddf76bfbf-2xj9w"] Oct 11 00:55:35 crc kubenswrapper[4743]: E1011 00:55:35.360332 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="251a8f2c-3e2c-46cf-b2f8-6f0216dac488" containerName="controller-manager" Oct 11 00:55:35 crc kubenswrapper[4743]: I1011 00:55:35.360352 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="251a8f2c-3e2c-46cf-b2f8-6f0216dac488" containerName="controller-manager" Oct 11 00:55:35 crc 
kubenswrapper[4743]: E1011 00:55:35.360376 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77feb1b3-64be-403d-bf55-3f8ab1502a02" containerName="route-controller-manager" Oct 11 00:55:35 crc kubenswrapper[4743]: I1011 00:55:35.360389 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="77feb1b3-64be-403d-bf55-3f8ab1502a02" containerName="route-controller-manager" Oct 11 00:55:35 crc kubenswrapper[4743]: I1011 00:55:35.360566 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="77feb1b3-64be-403d-bf55-3f8ab1502a02" containerName="route-controller-manager" Oct 11 00:55:35 crc kubenswrapper[4743]: I1011 00:55:35.360589 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="251a8f2c-3e2c-46cf-b2f8-6f0216dac488" containerName="controller-manager" Oct 11 00:55:35 crc kubenswrapper[4743]: I1011 00:55:35.361166 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5ddf76bfbf-2xj9w" Oct 11 00:55:35 crc kubenswrapper[4743]: I1011 00:55:35.362188 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66d89dc4c-q6z79"] Oct 11 00:55:35 crc kubenswrapper[4743]: I1011 00:55:35.362813 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-66d89dc4c-q6z79" Oct 11 00:55:35 crc kubenswrapper[4743]: I1011 00:55:35.375248 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66d89dc4c-q6z79"] Oct 11 00:55:35 crc kubenswrapper[4743]: I1011 00:55:35.377609 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5ddf76bfbf-2xj9w"] Oct 11 00:55:35 crc kubenswrapper[4743]: I1011 00:55:35.409572 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-85fc46bdd7-29hfk" event={"ID":"77feb1b3-64be-403d-bf55-3f8ab1502a02","Type":"ContainerDied","Data":"0b70814c9f02b6c824b2edbc21129535f5bfc69a2165daf80a4637c8beda2585"} Oct 11 00:55:35 crc kubenswrapper[4743]: I1011 00:55:35.409642 4743 scope.go:117] "RemoveContainer" containerID="fa0126a3df486828ef9081598c48a636704167ce0b83fcabd4f0e2a400e72bfa" Oct 11 00:55:35 crc kubenswrapper[4743]: I1011 00:55:35.409777 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-85fc46bdd7-29hfk" Oct 11 00:55:35 crc kubenswrapper[4743]: I1011 00:55:35.416147 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d4c8c5d5-cxvxl" event={"ID":"251a8f2c-3e2c-46cf-b2f8-6f0216dac488","Type":"ContainerDied","Data":"c34f2a65d23e48d222d8ade1da5ba5dd357a2f9c6a816566959e23ed5d046837"} Oct 11 00:55:35 crc kubenswrapper[4743]: I1011 00:55:35.416213 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5d4c8c5d5-cxvxl" Oct 11 00:55:35 crc kubenswrapper[4743]: I1011 00:55:35.431935 4743 scope.go:117] "RemoveContainer" containerID="340ae7b4886f1177d8ed1b33a38d7f4f5fb19d26a084f72b30a6c3bcbb88a24b" Oct 11 00:55:35 crc kubenswrapper[4743]: I1011 00:55:35.451213 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-85fc46bdd7-29hfk"] Oct 11 00:55:35 crc kubenswrapper[4743]: I1011 00:55:35.461971 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-85fc46bdd7-29hfk"] Oct 11 00:55:35 crc kubenswrapper[4743]: I1011 00:55:35.467267 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5d4c8c5d5-cxvxl"] Oct 11 00:55:35 crc kubenswrapper[4743]: I1011 00:55:35.472087 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5d4c8c5d5-cxvxl"] Oct 11 00:55:35 crc kubenswrapper[4743]: I1011 00:55:35.553844 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a7df2ac8-a6ab-49dc-98b1-7e12e16af253-proxy-ca-bundles\") pod \"controller-manager-5ddf76bfbf-2xj9w\" (UID: \"a7df2ac8-a6ab-49dc-98b1-7e12e16af253\") " pod="openshift-controller-manager/controller-manager-5ddf76bfbf-2xj9w" Oct 11 00:55:35 crc kubenswrapper[4743]: I1011 00:55:35.553950 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a7df2ac8-a6ab-49dc-98b1-7e12e16af253-serving-cert\") pod \"controller-manager-5ddf76bfbf-2xj9w\" (UID: \"a7df2ac8-a6ab-49dc-98b1-7e12e16af253\") " pod="openshift-controller-manager/controller-manager-5ddf76bfbf-2xj9w" Oct 11 00:55:35 crc kubenswrapper[4743]: I1011 00:55:35.553996 
4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea3117ba-711a-4055-bf89-d405d05a0154-config\") pod \"route-controller-manager-66d89dc4c-q6z79\" (UID: \"ea3117ba-711a-4055-bf89-d405d05a0154\") " pod="openshift-route-controller-manager/route-controller-manager-66d89dc4c-q6z79" Oct 11 00:55:35 crc kubenswrapper[4743]: I1011 00:55:35.554036 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9df8j\" (UniqueName: \"kubernetes.io/projected/ea3117ba-711a-4055-bf89-d405d05a0154-kube-api-access-9df8j\") pod \"route-controller-manager-66d89dc4c-q6z79\" (UID: \"ea3117ba-711a-4055-bf89-d405d05a0154\") " pod="openshift-route-controller-manager/route-controller-manager-66d89dc4c-q6z79" Oct 11 00:55:35 crc kubenswrapper[4743]: I1011 00:55:35.554142 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea3117ba-711a-4055-bf89-d405d05a0154-serving-cert\") pod \"route-controller-manager-66d89dc4c-q6z79\" (UID: \"ea3117ba-711a-4055-bf89-d405d05a0154\") " pod="openshift-route-controller-manager/route-controller-manager-66d89dc4c-q6z79" Oct 11 00:55:35 crc kubenswrapper[4743]: I1011 00:55:35.554217 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ea3117ba-711a-4055-bf89-d405d05a0154-client-ca\") pod \"route-controller-manager-66d89dc4c-q6z79\" (UID: \"ea3117ba-711a-4055-bf89-d405d05a0154\") " pod="openshift-route-controller-manager/route-controller-manager-66d89dc4c-q6z79" Oct 11 00:55:35 crc kubenswrapper[4743]: I1011 00:55:35.554301 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/a7df2ac8-a6ab-49dc-98b1-7e12e16af253-config\") pod \"controller-manager-5ddf76bfbf-2xj9w\" (UID: \"a7df2ac8-a6ab-49dc-98b1-7e12e16af253\") " pod="openshift-controller-manager/controller-manager-5ddf76bfbf-2xj9w" Oct 11 00:55:35 crc kubenswrapper[4743]: I1011 00:55:35.554374 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a7df2ac8-a6ab-49dc-98b1-7e12e16af253-client-ca\") pod \"controller-manager-5ddf76bfbf-2xj9w\" (UID: \"a7df2ac8-a6ab-49dc-98b1-7e12e16af253\") " pod="openshift-controller-manager/controller-manager-5ddf76bfbf-2xj9w" Oct 11 00:55:35 crc kubenswrapper[4743]: I1011 00:55:35.554408 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flbgl\" (UniqueName: \"kubernetes.io/projected/a7df2ac8-a6ab-49dc-98b1-7e12e16af253-kube-api-access-flbgl\") pod \"controller-manager-5ddf76bfbf-2xj9w\" (UID: \"a7df2ac8-a6ab-49dc-98b1-7e12e16af253\") " pod="openshift-controller-manager/controller-manager-5ddf76bfbf-2xj9w" Oct 11 00:55:35 crc kubenswrapper[4743]: I1011 00:55:35.656125 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a7df2ac8-a6ab-49dc-98b1-7e12e16af253-proxy-ca-bundles\") pod \"controller-manager-5ddf76bfbf-2xj9w\" (UID: \"a7df2ac8-a6ab-49dc-98b1-7e12e16af253\") " pod="openshift-controller-manager/controller-manager-5ddf76bfbf-2xj9w" Oct 11 00:55:35 crc kubenswrapper[4743]: I1011 00:55:35.656161 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a7df2ac8-a6ab-49dc-98b1-7e12e16af253-serving-cert\") pod \"controller-manager-5ddf76bfbf-2xj9w\" (UID: \"a7df2ac8-a6ab-49dc-98b1-7e12e16af253\") " pod="openshift-controller-manager/controller-manager-5ddf76bfbf-2xj9w" Oct 11 00:55:35 crc 
kubenswrapper[4743]: I1011 00:55:35.656182 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea3117ba-711a-4055-bf89-d405d05a0154-config\") pod \"route-controller-manager-66d89dc4c-q6z79\" (UID: \"ea3117ba-711a-4055-bf89-d405d05a0154\") " pod="openshift-route-controller-manager/route-controller-manager-66d89dc4c-q6z79" Oct 11 00:55:35 crc kubenswrapper[4743]: I1011 00:55:35.656199 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9df8j\" (UniqueName: \"kubernetes.io/projected/ea3117ba-711a-4055-bf89-d405d05a0154-kube-api-access-9df8j\") pod \"route-controller-manager-66d89dc4c-q6z79\" (UID: \"ea3117ba-711a-4055-bf89-d405d05a0154\") " pod="openshift-route-controller-manager/route-controller-manager-66d89dc4c-q6z79" Oct 11 00:55:35 crc kubenswrapper[4743]: I1011 00:55:35.656226 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea3117ba-711a-4055-bf89-d405d05a0154-serving-cert\") pod \"route-controller-manager-66d89dc4c-q6z79\" (UID: \"ea3117ba-711a-4055-bf89-d405d05a0154\") " pod="openshift-route-controller-manager/route-controller-manager-66d89dc4c-q6z79" Oct 11 00:55:35 crc kubenswrapper[4743]: I1011 00:55:35.656253 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ea3117ba-711a-4055-bf89-d405d05a0154-client-ca\") pod \"route-controller-manager-66d89dc4c-q6z79\" (UID: \"ea3117ba-711a-4055-bf89-d405d05a0154\") " pod="openshift-route-controller-manager/route-controller-manager-66d89dc4c-q6z79" Oct 11 00:55:35 crc kubenswrapper[4743]: I1011 00:55:35.656280 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7df2ac8-a6ab-49dc-98b1-7e12e16af253-config\") pod \"controller-manager-5ddf76bfbf-2xj9w\" 
(UID: \"a7df2ac8-a6ab-49dc-98b1-7e12e16af253\") " pod="openshift-controller-manager/controller-manager-5ddf76bfbf-2xj9w" Oct 11 00:55:35 crc kubenswrapper[4743]: I1011 00:55:35.656300 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a7df2ac8-a6ab-49dc-98b1-7e12e16af253-client-ca\") pod \"controller-manager-5ddf76bfbf-2xj9w\" (UID: \"a7df2ac8-a6ab-49dc-98b1-7e12e16af253\") " pod="openshift-controller-manager/controller-manager-5ddf76bfbf-2xj9w" Oct 11 00:55:35 crc kubenswrapper[4743]: I1011 00:55:35.656319 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flbgl\" (UniqueName: \"kubernetes.io/projected/a7df2ac8-a6ab-49dc-98b1-7e12e16af253-kube-api-access-flbgl\") pod \"controller-manager-5ddf76bfbf-2xj9w\" (UID: \"a7df2ac8-a6ab-49dc-98b1-7e12e16af253\") " pod="openshift-controller-manager/controller-manager-5ddf76bfbf-2xj9w" Oct 11 00:55:35 crc kubenswrapper[4743]: I1011 00:55:35.658304 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a7df2ac8-a6ab-49dc-98b1-7e12e16af253-proxy-ca-bundles\") pod \"controller-manager-5ddf76bfbf-2xj9w\" (UID: \"a7df2ac8-a6ab-49dc-98b1-7e12e16af253\") " pod="openshift-controller-manager/controller-manager-5ddf76bfbf-2xj9w" Oct 11 00:55:35 crc kubenswrapper[4743]: I1011 00:55:35.658417 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ea3117ba-711a-4055-bf89-d405d05a0154-client-ca\") pod \"route-controller-manager-66d89dc4c-q6z79\" (UID: \"ea3117ba-711a-4055-bf89-d405d05a0154\") " pod="openshift-route-controller-manager/route-controller-manager-66d89dc4c-q6z79" Oct 11 00:55:35 crc kubenswrapper[4743]: I1011 00:55:35.658467 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/a7df2ac8-a6ab-49dc-98b1-7e12e16af253-client-ca\") pod \"controller-manager-5ddf76bfbf-2xj9w\" (UID: \"a7df2ac8-a6ab-49dc-98b1-7e12e16af253\") " pod="openshift-controller-manager/controller-manager-5ddf76bfbf-2xj9w" Oct 11 00:55:35 crc kubenswrapper[4743]: I1011 00:55:35.659574 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea3117ba-711a-4055-bf89-d405d05a0154-config\") pod \"route-controller-manager-66d89dc4c-q6z79\" (UID: \"ea3117ba-711a-4055-bf89-d405d05a0154\") " pod="openshift-route-controller-manager/route-controller-manager-66d89dc4c-q6z79" Oct 11 00:55:35 crc kubenswrapper[4743]: I1011 00:55:35.659619 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7df2ac8-a6ab-49dc-98b1-7e12e16af253-config\") pod \"controller-manager-5ddf76bfbf-2xj9w\" (UID: \"a7df2ac8-a6ab-49dc-98b1-7e12e16af253\") " pod="openshift-controller-manager/controller-manager-5ddf76bfbf-2xj9w" Oct 11 00:55:35 crc kubenswrapper[4743]: I1011 00:55:35.662552 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea3117ba-711a-4055-bf89-d405d05a0154-serving-cert\") pod \"route-controller-manager-66d89dc4c-q6z79\" (UID: \"ea3117ba-711a-4055-bf89-d405d05a0154\") " pod="openshift-route-controller-manager/route-controller-manager-66d89dc4c-q6z79" Oct 11 00:55:35 crc kubenswrapper[4743]: I1011 00:55:35.667807 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a7df2ac8-a6ab-49dc-98b1-7e12e16af253-serving-cert\") pod \"controller-manager-5ddf76bfbf-2xj9w\" (UID: \"a7df2ac8-a6ab-49dc-98b1-7e12e16af253\") " pod="openshift-controller-manager/controller-manager-5ddf76bfbf-2xj9w" Oct 11 00:55:35 crc kubenswrapper[4743]: I1011 00:55:35.677766 4743 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-flbgl\" (UniqueName: \"kubernetes.io/projected/a7df2ac8-a6ab-49dc-98b1-7e12e16af253-kube-api-access-flbgl\") pod \"controller-manager-5ddf76bfbf-2xj9w\" (UID: \"a7df2ac8-a6ab-49dc-98b1-7e12e16af253\") " pod="openshift-controller-manager/controller-manager-5ddf76bfbf-2xj9w" Oct 11 00:55:35 crc kubenswrapper[4743]: I1011 00:55:35.685740 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9df8j\" (UniqueName: \"kubernetes.io/projected/ea3117ba-711a-4055-bf89-d405d05a0154-kube-api-access-9df8j\") pod \"route-controller-manager-66d89dc4c-q6z79\" (UID: \"ea3117ba-711a-4055-bf89-d405d05a0154\") " pod="openshift-route-controller-manager/route-controller-manager-66d89dc4c-q6z79" Oct 11 00:55:35 crc kubenswrapper[4743]: I1011 00:55:35.690391 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5ddf76bfbf-2xj9w" Oct 11 00:55:35 crc kubenswrapper[4743]: I1011 00:55:35.696010 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-66d89dc4c-q6z79" Oct 11 00:55:35 crc kubenswrapper[4743]: I1011 00:55:35.989508 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66d89dc4c-q6z79"] Oct 11 00:55:36 crc kubenswrapper[4743]: I1011 00:55:36.012962 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5ddf76bfbf-2xj9w"] Oct 11 00:55:36 crc kubenswrapper[4743]: W1011 00:55:36.020018 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7df2ac8_a6ab_49dc_98b1_7e12e16af253.slice/crio-516cf6bd6ba90116f2869cb3fdfcaa7bec1f505cd1cd8507fa5d831549013dfc WatchSource:0}: Error finding container 516cf6bd6ba90116f2869cb3fdfcaa7bec1f505cd1cd8507fa5d831549013dfc: Status 404 returned error can't find the container with id 516cf6bd6ba90116f2869cb3fdfcaa7bec1f505cd1cd8507fa5d831549013dfc Oct 11 00:55:36 crc kubenswrapper[4743]: I1011 00:55:36.099126 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="251a8f2c-3e2c-46cf-b2f8-6f0216dac488" path="/var/lib/kubelet/pods/251a8f2c-3e2c-46cf-b2f8-6f0216dac488/volumes" Oct 11 00:55:36 crc kubenswrapper[4743]: I1011 00:55:36.100039 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77feb1b3-64be-403d-bf55-3f8ab1502a02" path="/var/lib/kubelet/pods/77feb1b3-64be-403d-bf55-3f8ab1502a02/volumes" Oct 11 00:55:36 crc kubenswrapper[4743]: I1011 00:55:36.424695 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-66d89dc4c-q6z79" event={"ID":"ea3117ba-711a-4055-bf89-d405d05a0154","Type":"ContainerStarted","Data":"e17c49d6523ebd03966a2183efc2389710aa62df1bdae3412cbe9b1c5f286464"} Oct 11 00:55:36 crc kubenswrapper[4743]: I1011 00:55:36.424748 4743 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-route-controller-manager/route-controller-manager-66d89dc4c-q6z79" event={"ID":"ea3117ba-711a-4055-bf89-d405d05a0154","Type":"ContainerStarted","Data":"edeb8271be18efa842f5713d0c23048e61d0c133aa10e57fd638814505887c48"} Oct 11 00:55:36 crc kubenswrapper[4743]: I1011 00:55:36.424770 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-66d89dc4c-q6z79" Oct 11 00:55:36 crc kubenswrapper[4743]: I1011 00:55:36.426977 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5ddf76bfbf-2xj9w" event={"ID":"a7df2ac8-a6ab-49dc-98b1-7e12e16af253","Type":"ContainerStarted","Data":"ec6f3d9f986bb2fb82a37dab12df298e1367c7330ba09d9ac5119d35a3fa246c"} Oct 11 00:55:36 crc kubenswrapper[4743]: I1011 00:55:36.427015 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5ddf76bfbf-2xj9w" event={"ID":"a7df2ac8-a6ab-49dc-98b1-7e12e16af253","Type":"ContainerStarted","Data":"516cf6bd6ba90116f2869cb3fdfcaa7bec1f505cd1cd8507fa5d831549013dfc"} Oct 11 00:55:36 crc kubenswrapper[4743]: I1011 00:55:36.427032 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5ddf76bfbf-2xj9w" Oct 11 00:55:36 crc kubenswrapper[4743]: I1011 00:55:36.437766 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5ddf76bfbf-2xj9w" Oct 11 00:55:36 crc kubenswrapper[4743]: I1011 00:55:36.441687 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-66d89dc4c-q6z79" podStartSLOduration=2.441673603 podStartE2EDuration="2.441673603s" podCreationTimestamp="2025-10-11 00:55:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-11 00:55:36.440820451 +0000 UTC m=+231.093800848" watchObservedRunningTime="2025-10-11 00:55:36.441673603 +0000 UTC m=+231.094654000" Oct 11 00:55:36 crc kubenswrapper[4743]: I1011 00:55:36.456082 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5ddf76bfbf-2xj9w" podStartSLOduration=2.456065377 podStartE2EDuration="2.456065377s" podCreationTimestamp="2025-10-11 00:55:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 00:55:36.455303097 +0000 UTC m=+231.108283504" watchObservedRunningTime="2025-10-11 00:55:36.456065377 +0000 UTC m=+231.109045774" Oct 11 00:55:36 crc kubenswrapper[4743]: I1011 00:55:36.484802 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-66d89dc4c-q6z79" Oct 11 00:55:52 crc kubenswrapper[4743]: I1011 00:55:52.840173 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-r8dmc"] Oct 11 00:55:52 crc kubenswrapper[4743]: I1011 00:55:52.840784 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-r8dmc" podUID="7c7ded1c-c0ce-47c4-9959-b95631f067ea" containerName="registry-server" containerID="cri-o://beae56b2fff24fb0eeac7feabc2ade2cc29afb6bf52b13415e8b1adc5a084003" gracePeriod=30 Oct 11 00:55:52 crc kubenswrapper[4743]: I1011 00:55:52.849490 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-drlgz"] Oct 11 00:55:52 crc kubenswrapper[4743]: I1011 00:55:52.849702 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-drlgz" podUID="ec82a5a9-025d-43e4-a1f4-6b4245bcc7ef" containerName="registry-server" 
containerID="cri-o://ba4fccfabd9a2caeb93221f263cf98776151d0ab20ec12b5bbf43c7c6bfe7e84" gracePeriod=30 Oct 11 00:55:52 crc kubenswrapper[4743]: I1011 00:55:52.858269 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xgtfn"] Oct 11 00:55:52 crc kubenswrapper[4743]: I1011 00:55:52.858511 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-xgtfn" podUID="16bef631-ee0f-4346-bb9b-c6eb48a09448" containerName="marketplace-operator" containerID="cri-o://a17b919a62d3e30de28933f816947e9668d1a111d03144fa9f95231658c5d17c" gracePeriod=30 Oct 11 00:55:52 crc kubenswrapper[4743]: I1011 00:55:52.872733 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jb4p2"] Oct 11 00:55:52 crc kubenswrapper[4743]: I1011 00:55:52.872981 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jb4p2" podUID="e405bfea-f17b-4c2b-b69e-fc7284876cdc" containerName="registry-server" containerID="cri-o://6ce938bf525beadd39c9cd7e624b606c041bf5310133137678d744a156df1a8b" gracePeriod=30 Oct 11 00:55:52 crc kubenswrapper[4743]: I1011 00:55:52.882753 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-cpcdj"] Oct 11 00:55:52 crc kubenswrapper[4743]: I1011 00:55:52.883692 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-cpcdj" Oct 11 00:55:52 crc kubenswrapper[4743]: I1011 00:55:52.885481 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cb8zp"] Oct 11 00:55:52 crc kubenswrapper[4743]: I1011 00:55:52.885716 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cb8zp" podUID="b87ffc73-065d-4570-867f-b91e442a4c73" containerName="registry-server" containerID="cri-o://35be2ecabe3c5e2593e5dcede3f793bcae33386ba2c57334a76d9e259cfb10e8" gracePeriod=30 Oct 11 00:55:52 crc kubenswrapper[4743]: I1011 00:55:52.903780 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-cpcdj"] Oct 11 00:55:52 crc kubenswrapper[4743]: I1011 00:55:52.955984 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkfgl\" (UniqueName: \"kubernetes.io/projected/ecba19cf-13a3-40ee-8d5a-17af54a79caa-kube-api-access-hkfgl\") pod \"marketplace-operator-79b997595-cpcdj\" (UID: \"ecba19cf-13a3-40ee-8d5a-17af54a79caa\") " pod="openshift-marketplace/marketplace-operator-79b997595-cpcdj" Oct 11 00:55:52 crc kubenswrapper[4743]: I1011 00:55:52.956065 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ecba19cf-13a3-40ee-8d5a-17af54a79caa-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-cpcdj\" (UID: \"ecba19cf-13a3-40ee-8d5a-17af54a79caa\") " pod="openshift-marketplace/marketplace-operator-79b997595-cpcdj" Oct 11 00:55:52 crc kubenswrapper[4743]: I1011 00:55:52.956117 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/ecba19cf-13a3-40ee-8d5a-17af54a79caa-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-cpcdj\" (UID: \"ecba19cf-13a3-40ee-8d5a-17af54a79caa\") " pod="openshift-marketplace/marketplace-operator-79b997595-cpcdj" Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.059397 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ecba19cf-13a3-40ee-8d5a-17af54a79caa-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-cpcdj\" (UID: \"ecba19cf-13a3-40ee-8d5a-17af54a79caa\") " pod="openshift-marketplace/marketplace-operator-79b997595-cpcdj" Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.059471 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ecba19cf-13a3-40ee-8d5a-17af54a79caa-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-cpcdj\" (UID: \"ecba19cf-13a3-40ee-8d5a-17af54a79caa\") " pod="openshift-marketplace/marketplace-operator-79b997595-cpcdj" Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.059494 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkfgl\" (UniqueName: \"kubernetes.io/projected/ecba19cf-13a3-40ee-8d5a-17af54a79caa-kube-api-access-hkfgl\") pod \"marketplace-operator-79b997595-cpcdj\" (UID: \"ecba19cf-13a3-40ee-8d5a-17af54a79caa\") " pod="openshift-marketplace/marketplace-operator-79b997595-cpcdj" Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.061200 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ecba19cf-13a3-40ee-8d5a-17af54a79caa-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-cpcdj\" (UID: \"ecba19cf-13a3-40ee-8d5a-17af54a79caa\") " pod="openshift-marketplace/marketplace-operator-79b997595-cpcdj" Oct 11 00:55:53 
crc kubenswrapper[4743]: I1011 00:55:53.069846 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ecba19cf-13a3-40ee-8d5a-17af54a79caa-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-cpcdj\" (UID: \"ecba19cf-13a3-40ee-8d5a-17af54a79caa\") " pod="openshift-marketplace/marketplace-operator-79b997595-cpcdj" Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.086640 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkfgl\" (UniqueName: \"kubernetes.io/projected/ecba19cf-13a3-40ee-8d5a-17af54a79caa-kube-api-access-hkfgl\") pod \"marketplace-operator-79b997595-cpcdj\" (UID: \"ecba19cf-13a3-40ee-8d5a-17af54a79caa\") " pod="openshift-marketplace/marketplace-operator-79b997595-cpcdj" Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.110985 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-sjtsw"] Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.200791 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-cpcdj" Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.412324 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-drlgz" Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.463588 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec82a5a9-025d-43e4-a1f4-6b4245bcc7ef-utilities\") pod \"ec82a5a9-025d-43e4-a1f4-6b4245bcc7ef\" (UID: \"ec82a5a9-025d-43e4-a1f4-6b4245bcc7ef\") " Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.463668 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec82a5a9-025d-43e4-a1f4-6b4245bcc7ef-catalog-content\") pod \"ec82a5a9-025d-43e4-a1f4-6b4245bcc7ef\" (UID: \"ec82a5a9-025d-43e4-a1f4-6b4245bcc7ef\") " Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.463702 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4qwzj\" (UniqueName: \"kubernetes.io/projected/ec82a5a9-025d-43e4-a1f4-6b4245bcc7ef-kube-api-access-4qwzj\") pod \"ec82a5a9-025d-43e4-a1f4-6b4245bcc7ef\" (UID: \"ec82a5a9-025d-43e4-a1f4-6b4245bcc7ef\") " Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.464436 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec82a5a9-025d-43e4-a1f4-6b4245bcc7ef-utilities" (OuterVolumeSpecName: "utilities") pod "ec82a5a9-025d-43e4-a1f4-6b4245bcc7ef" (UID: "ec82a5a9-025d-43e4-a1f4-6b4245bcc7ef"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.470019 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec82a5a9-025d-43e4-a1f4-6b4245bcc7ef-kube-api-access-4qwzj" (OuterVolumeSpecName: "kube-api-access-4qwzj") pod "ec82a5a9-025d-43e4-a1f4-6b4245bcc7ef" (UID: "ec82a5a9-025d-43e4-a1f4-6b4245bcc7ef"). InnerVolumeSpecName "kube-api-access-4qwzj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.515590 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cb8zp" Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.523989 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec82a5a9-025d-43e4-a1f4-6b4245bcc7ef-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ec82a5a9-025d-43e4-a1f4-6b4245bcc7ef" (UID: "ec82a5a9-025d-43e4-a1f4-6b4245bcc7ef"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.540624 4743 generic.go:334] "Generic (PLEG): container finished" podID="7c7ded1c-c0ce-47c4-9959-b95631f067ea" containerID="beae56b2fff24fb0eeac7feabc2ade2cc29afb6bf52b13415e8b1adc5a084003" exitCode=0 Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.540675 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r8dmc" event={"ID":"7c7ded1c-c0ce-47c4-9959-b95631f067ea","Type":"ContainerDied","Data":"beae56b2fff24fb0eeac7feabc2ade2cc29afb6bf52b13415e8b1adc5a084003"} Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.547112 4743 generic.go:334] "Generic (PLEG): container finished" podID="e405bfea-f17b-4c2b-b69e-fc7284876cdc" containerID="6ce938bf525beadd39c9cd7e624b606c041bf5310133137678d744a156df1a8b" exitCode=0 Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.547172 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jb4p2" event={"ID":"e405bfea-f17b-4c2b-b69e-fc7284876cdc","Type":"ContainerDied","Data":"6ce938bf525beadd39c9cd7e624b606c041bf5310133137678d744a156df1a8b"} Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.559266 4743 generic.go:334] "Generic (PLEG): container finished" 
podID="ec82a5a9-025d-43e4-a1f4-6b4245bcc7ef" containerID="ba4fccfabd9a2caeb93221f263cf98776151d0ab20ec12b5bbf43c7c6bfe7e84" exitCode=0 Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.559313 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-drlgz" event={"ID":"ec82a5a9-025d-43e4-a1f4-6b4245bcc7ef","Type":"ContainerDied","Data":"ba4fccfabd9a2caeb93221f263cf98776151d0ab20ec12b5bbf43c7c6bfe7e84"} Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.559337 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-drlgz" event={"ID":"ec82a5a9-025d-43e4-a1f4-6b4245bcc7ef","Type":"ContainerDied","Data":"9e7d864ff7d6210f6fa8bc3f4089dbe1954dae45a103cf1acaac9b2f7a8a0f57"} Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.559353 4743 scope.go:117] "RemoveContainer" containerID="ba4fccfabd9a2caeb93221f263cf98776151d0ab20ec12b5bbf43c7c6bfe7e84" Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.559458 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-drlgz" Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.566382 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b87ffc73-065d-4570-867f-b91e442a4c73-catalog-content\") pod \"b87ffc73-065d-4570-867f-b91e442a4c73\" (UID: \"b87ffc73-065d-4570-867f-b91e442a4c73\") " Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.566422 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qm4k7\" (UniqueName: \"kubernetes.io/projected/b87ffc73-065d-4570-867f-b91e442a4c73-kube-api-access-qm4k7\") pod \"b87ffc73-065d-4570-867f-b91e442a4c73\" (UID: \"b87ffc73-065d-4570-867f-b91e442a4c73\") " Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.566444 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b87ffc73-065d-4570-867f-b91e442a4c73-utilities\") pod \"b87ffc73-065d-4570-867f-b91e442a4c73\" (UID: \"b87ffc73-065d-4570-867f-b91e442a4c73\") " Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.566616 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4qwzj\" (UniqueName: \"kubernetes.io/projected/ec82a5a9-025d-43e4-a1f4-6b4245bcc7ef-kube-api-access-4qwzj\") on node \"crc\" DevicePath \"\"" Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.566628 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec82a5a9-025d-43e4-a1f4-6b4245bcc7ef-utilities\") on node \"crc\" DevicePath \"\"" Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.566636 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec82a5a9-025d-43e4-a1f4-6b4245bcc7ef-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 11 00:55:53 crc kubenswrapper[4743]: 
I1011 00:55:53.568122 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b87ffc73-065d-4570-867f-b91e442a4c73-utilities" (OuterVolumeSpecName: "utilities") pod "b87ffc73-065d-4570-867f-b91e442a4c73" (UID: "b87ffc73-065d-4570-867f-b91e442a4c73"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.570151 4743 generic.go:334] "Generic (PLEG): container finished" podID="16bef631-ee0f-4346-bb9b-c6eb48a09448" containerID="a17b919a62d3e30de28933f816947e9668d1a111d03144fa9f95231658c5d17c" exitCode=0 Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.570198 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xgtfn" event={"ID":"16bef631-ee0f-4346-bb9b-c6eb48a09448","Type":"ContainerDied","Data":"a17b919a62d3e30de28933f816947e9668d1a111d03144fa9f95231658c5d17c"} Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.578023 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b87ffc73-065d-4570-867f-b91e442a4c73-kube-api-access-qm4k7" (OuterVolumeSpecName: "kube-api-access-qm4k7") pod "b87ffc73-065d-4570-867f-b91e442a4c73" (UID: "b87ffc73-065d-4570-867f-b91e442a4c73"). InnerVolumeSpecName "kube-api-access-qm4k7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.580417 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-r8dmc" Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.584534 4743 generic.go:334] "Generic (PLEG): container finished" podID="b87ffc73-065d-4570-867f-b91e442a4c73" containerID="35be2ecabe3c5e2593e5dcede3f793bcae33386ba2c57334a76d9e259cfb10e8" exitCode=0 Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.584571 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cb8zp" event={"ID":"b87ffc73-065d-4570-867f-b91e442a4c73","Type":"ContainerDied","Data":"35be2ecabe3c5e2593e5dcede3f793bcae33386ba2c57334a76d9e259cfb10e8"} Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.584596 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cb8zp" event={"ID":"b87ffc73-065d-4570-867f-b91e442a4c73","Type":"ContainerDied","Data":"4240f01a773773fadcc9fa0e5d12298e05b283f5f1be4ba0ccd1e69fb22dc336"} Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.584667 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cb8zp" Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.585629 4743 scope.go:117] "RemoveContainer" containerID="83c8fcf161e196c7f545a0647f73181c943a7af0bcfdd18dc7acb190acc6e9a9" Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.594399 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-drlgz"] Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.596912 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-drlgz"] Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.620392 4743 scope.go:117] "RemoveContainer" containerID="9b1bda191d798619c8ae3467915792f9eb3ab97b8cd570fd22cc01523c75ac69" Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.641069 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xgtfn" Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.658081 4743 scope.go:117] "RemoveContainer" containerID="ba4fccfabd9a2caeb93221f263cf98776151d0ab20ec12b5bbf43c7c6bfe7e84" Oct 11 00:55:53 crc kubenswrapper[4743]: E1011 00:55:53.659844 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba4fccfabd9a2caeb93221f263cf98776151d0ab20ec12b5bbf43c7c6bfe7e84\": container with ID starting with ba4fccfabd9a2caeb93221f263cf98776151d0ab20ec12b5bbf43c7c6bfe7e84 not found: ID does not exist" containerID="ba4fccfabd9a2caeb93221f263cf98776151d0ab20ec12b5bbf43c7c6bfe7e84" Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.659884 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba4fccfabd9a2caeb93221f263cf98776151d0ab20ec12b5bbf43c7c6bfe7e84"} err="failed to get container status \"ba4fccfabd9a2caeb93221f263cf98776151d0ab20ec12b5bbf43c7c6bfe7e84\": rpc error: code = NotFound desc = could not find container \"ba4fccfabd9a2caeb93221f263cf98776151d0ab20ec12b5bbf43c7c6bfe7e84\": container with ID starting with ba4fccfabd9a2caeb93221f263cf98776151d0ab20ec12b5bbf43c7c6bfe7e84 not found: ID does not exist" Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.659905 4743 scope.go:117] "RemoveContainer" containerID="83c8fcf161e196c7f545a0647f73181c943a7af0bcfdd18dc7acb190acc6e9a9" Oct 11 00:55:53 crc kubenswrapper[4743]: E1011 00:55:53.660102 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83c8fcf161e196c7f545a0647f73181c943a7af0bcfdd18dc7acb190acc6e9a9\": container with ID starting with 83c8fcf161e196c7f545a0647f73181c943a7af0bcfdd18dc7acb190acc6e9a9 not found: ID does not exist" containerID="83c8fcf161e196c7f545a0647f73181c943a7af0bcfdd18dc7acb190acc6e9a9" Oct 11 00:55:53 crc 
kubenswrapper[4743]: I1011 00:55:53.660128 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83c8fcf161e196c7f545a0647f73181c943a7af0bcfdd18dc7acb190acc6e9a9"} err="failed to get container status \"83c8fcf161e196c7f545a0647f73181c943a7af0bcfdd18dc7acb190acc6e9a9\": rpc error: code = NotFound desc = could not find container \"83c8fcf161e196c7f545a0647f73181c943a7af0bcfdd18dc7acb190acc6e9a9\": container with ID starting with 83c8fcf161e196c7f545a0647f73181c943a7af0bcfdd18dc7acb190acc6e9a9 not found: ID does not exist" Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.660140 4743 scope.go:117] "RemoveContainer" containerID="9b1bda191d798619c8ae3467915792f9eb3ab97b8cd570fd22cc01523c75ac69" Oct 11 00:55:53 crc kubenswrapper[4743]: E1011 00:55:53.661005 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b1bda191d798619c8ae3467915792f9eb3ab97b8cd570fd22cc01523c75ac69\": container with ID starting with 9b1bda191d798619c8ae3467915792f9eb3ab97b8cd570fd22cc01523c75ac69 not found: ID does not exist" containerID="9b1bda191d798619c8ae3467915792f9eb3ab97b8cd570fd22cc01523c75ac69" Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.661027 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b1bda191d798619c8ae3467915792f9eb3ab97b8cd570fd22cc01523c75ac69"} err="failed to get container status \"9b1bda191d798619c8ae3467915792f9eb3ab97b8cd570fd22cc01523c75ac69\": rpc error: code = NotFound desc = could not find container \"9b1bda191d798619c8ae3467915792f9eb3ab97b8cd570fd22cc01523c75ac69\": container with ID starting with 9b1bda191d798619c8ae3467915792f9eb3ab97b8cd570fd22cc01523c75ac69 not found: ID does not exist" Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.661040 4743 scope.go:117] "RemoveContainer" containerID="35be2ecabe3c5e2593e5dcede3f793bcae33386ba2c57334a76d9e259cfb10e8" Oct 11 
00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.666986 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-llb68\" (UniqueName: \"kubernetes.io/projected/7c7ded1c-c0ce-47c4-9959-b95631f067ea-kube-api-access-llb68\") pod \"7c7ded1c-c0ce-47c4-9959-b95631f067ea\" (UID: \"7c7ded1c-c0ce-47c4-9959-b95631f067ea\") " Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.667053 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/16bef631-ee0f-4346-bb9b-c6eb48a09448-marketplace-operator-metrics\") pod \"16bef631-ee0f-4346-bb9b-c6eb48a09448\" (UID: \"16bef631-ee0f-4346-bb9b-c6eb48a09448\") " Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.667106 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c7ded1c-c0ce-47c4-9959-b95631f067ea-utilities\") pod \"7c7ded1c-c0ce-47c4-9959-b95631f067ea\" (UID: \"7c7ded1c-c0ce-47c4-9959-b95631f067ea\") " Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.667176 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c7ded1c-c0ce-47c4-9959-b95631f067ea-catalog-content\") pod \"7c7ded1c-c0ce-47c4-9959-b95631f067ea\" (UID: \"7c7ded1c-c0ce-47c4-9959-b95631f067ea\") " Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.667258 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2zlgv\" (UniqueName: \"kubernetes.io/projected/16bef631-ee0f-4346-bb9b-c6eb48a09448-kube-api-access-2zlgv\") pod \"16bef631-ee0f-4346-bb9b-c6eb48a09448\" (UID: \"16bef631-ee0f-4346-bb9b-c6eb48a09448\") " Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.667292 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/16bef631-ee0f-4346-bb9b-c6eb48a09448-marketplace-trusted-ca\") pod \"16bef631-ee0f-4346-bb9b-c6eb48a09448\" (UID: \"16bef631-ee0f-4346-bb9b-c6eb48a09448\") " Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.667522 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qm4k7\" (UniqueName: \"kubernetes.io/projected/b87ffc73-065d-4570-867f-b91e442a4c73-kube-api-access-qm4k7\") on node \"crc\" DevicePath \"\"" Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.667545 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b87ffc73-065d-4570-867f-b91e442a4c73-utilities\") on node \"crc\" DevicePath \"\"" Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.667926 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c7ded1c-c0ce-47c4-9959-b95631f067ea-utilities" (OuterVolumeSpecName: "utilities") pod "7c7ded1c-c0ce-47c4-9959-b95631f067ea" (UID: "7c7ded1c-c0ce-47c4-9959-b95631f067ea"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.668276 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16bef631-ee0f-4346-bb9b-c6eb48a09448-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "16bef631-ee0f-4346-bb9b-c6eb48a09448" (UID: "16bef631-ee0f-4346-bb9b-c6eb48a09448"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.668547 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jb4p2" Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.669604 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c7ded1c-c0ce-47c4-9959-b95631f067ea-kube-api-access-llb68" (OuterVolumeSpecName: "kube-api-access-llb68") pod "7c7ded1c-c0ce-47c4-9959-b95631f067ea" (UID: "7c7ded1c-c0ce-47c4-9959-b95631f067ea"). InnerVolumeSpecName "kube-api-access-llb68". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.670601 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16bef631-ee0f-4346-bb9b-c6eb48a09448-kube-api-access-2zlgv" (OuterVolumeSpecName: "kube-api-access-2zlgv") pod "16bef631-ee0f-4346-bb9b-c6eb48a09448" (UID: "16bef631-ee0f-4346-bb9b-c6eb48a09448"). InnerVolumeSpecName "kube-api-access-2zlgv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.671134 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16bef631-ee0f-4346-bb9b-c6eb48a09448-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "16bef631-ee0f-4346-bb9b-c6eb48a09448" (UID: "16bef631-ee0f-4346-bb9b-c6eb48a09448"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.679001 4743 scope.go:117] "RemoveContainer" containerID="93f102bfae867e58f3e6c378f5062007eaca8ce5b58a4f29f23b6cc13e2c0ffb" Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.682031 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b87ffc73-065d-4570-867f-b91e442a4c73-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b87ffc73-065d-4570-867f-b91e442a4c73" (UID: "b87ffc73-065d-4570-867f-b91e442a4c73"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.695958 4743 scope.go:117] "RemoveContainer" containerID="32626fd9833452489f8dfdc37d17299e3395c605374e40511c31d6f063029bfb" Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.716518 4743 scope.go:117] "RemoveContainer" containerID="35be2ecabe3c5e2593e5dcede3f793bcae33386ba2c57334a76d9e259cfb10e8" Oct 11 00:55:53 crc kubenswrapper[4743]: E1011 00:55:53.717016 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35be2ecabe3c5e2593e5dcede3f793bcae33386ba2c57334a76d9e259cfb10e8\": container with ID starting with 35be2ecabe3c5e2593e5dcede3f793bcae33386ba2c57334a76d9e259cfb10e8 not found: ID does not exist" containerID="35be2ecabe3c5e2593e5dcede3f793bcae33386ba2c57334a76d9e259cfb10e8" Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.717042 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35be2ecabe3c5e2593e5dcede3f793bcae33386ba2c57334a76d9e259cfb10e8"} err="failed to get container status \"35be2ecabe3c5e2593e5dcede3f793bcae33386ba2c57334a76d9e259cfb10e8\": rpc error: code = NotFound desc = could not find container \"35be2ecabe3c5e2593e5dcede3f793bcae33386ba2c57334a76d9e259cfb10e8\": container with ID starting with 
35be2ecabe3c5e2593e5dcede3f793bcae33386ba2c57334a76d9e259cfb10e8 not found: ID does not exist" Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.717063 4743 scope.go:117] "RemoveContainer" containerID="93f102bfae867e58f3e6c378f5062007eaca8ce5b58a4f29f23b6cc13e2c0ffb" Oct 11 00:55:53 crc kubenswrapper[4743]: E1011 00:55:53.717253 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93f102bfae867e58f3e6c378f5062007eaca8ce5b58a4f29f23b6cc13e2c0ffb\": container with ID starting with 93f102bfae867e58f3e6c378f5062007eaca8ce5b58a4f29f23b6cc13e2c0ffb not found: ID does not exist" containerID="93f102bfae867e58f3e6c378f5062007eaca8ce5b58a4f29f23b6cc13e2c0ffb" Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.717268 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93f102bfae867e58f3e6c378f5062007eaca8ce5b58a4f29f23b6cc13e2c0ffb"} err="failed to get container status \"93f102bfae867e58f3e6c378f5062007eaca8ce5b58a4f29f23b6cc13e2c0ffb\": rpc error: code = NotFound desc = could not find container \"93f102bfae867e58f3e6c378f5062007eaca8ce5b58a4f29f23b6cc13e2c0ffb\": container with ID starting with 93f102bfae867e58f3e6c378f5062007eaca8ce5b58a4f29f23b6cc13e2c0ffb not found: ID does not exist" Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.717279 4743 scope.go:117] "RemoveContainer" containerID="32626fd9833452489f8dfdc37d17299e3395c605374e40511c31d6f063029bfb" Oct 11 00:55:53 crc kubenswrapper[4743]: E1011 00:55:53.717481 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32626fd9833452489f8dfdc37d17299e3395c605374e40511c31d6f063029bfb\": container with ID starting with 32626fd9833452489f8dfdc37d17299e3395c605374e40511c31d6f063029bfb not found: ID does not exist" containerID="32626fd9833452489f8dfdc37d17299e3395c605374e40511c31d6f063029bfb" Oct 11 00:55:53 crc 
kubenswrapper[4743]: I1011 00:55:53.717502 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32626fd9833452489f8dfdc37d17299e3395c605374e40511c31d6f063029bfb"} err="failed to get container status \"32626fd9833452489f8dfdc37d17299e3395c605374e40511c31d6f063029bfb\": rpc error: code = NotFound desc = could not find container \"32626fd9833452489f8dfdc37d17299e3395c605374e40511c31d6f063029bfb\": container with ID starting with 32626fd9833452489f8dfdc37d17299e3395c605374e40511c31d6f063029bfb not found: ID does not exist" Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.728879 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c7ded1c-c0ce-47c4-9959-b95631f067ea-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7c7ded1c-c0ce-47c4-9959-b95631f067ea" (UID: "7c7ded1c-c0ce-47c4-9959-b95631f067ea"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.768290 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e405bfea-f17b-4c2b-b69e-fc7284876cdc-catalog-content\") pod \"e405bfea-f17b-4c2b-b69e-fc7284876cdc\" (UID: \"e405bfea-f17b-4c2b-b69e-fc7284876cdc\") " Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.768364 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e405bfea-f17b-4c2b-b69e-fc7284876cdc-utilities\") pod \"e405bfea-f17b-4c2b-b69e-fc7284876cdc\" (UID: \"e405bfea-f17b-4c2b-b69e-fc7284876cdc\") " Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.768474 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xgfkz\" (UniqueName: \"kubernetes.io/projected/e405bfea-f17b-4c2b-b69e-fc7284876cdc-kube-api-access-xgfkz\") pod 
\"e405bfea-f17b-4c2b-b69e-fc7284876cdc\" (UID: \"e405bfea-f17b-4c2b-b69e-fc7284876cdc\") " Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.768680 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b87ffc73-065d-4570-867f-b91e442a4c73-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.768697 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-llb68\" (UniqueName: \"kubernetes.io/projected/7c7ded1c-c0ce-47c4-9959-b95631f067ea-kube-api-access-llb68\") on node \"crc\" DevicePath \"\"" Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.768708 4743 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/16bef631-ee0f-4346-bb9b-c6eb48a09448-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.768718 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c7ded1c-c0ce-47c4-9959-b95631f067ea-utilities\") on node \"crc\" DevicePath \"\"" Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.768726 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c7ded1c-c0ce-47c4-9959-b95631f067ea-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.768734 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2zlgv\" (UniqueName: \"kubernetes.io/projected/16bef631-ee0f-4346-bb9b-c6eb48a09448-kube-api-access-2zlgv\") on node \"crc\" DevicePath \"\"" Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.768743 4743 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/16bef631-ee0f-4346-bb9b-c6eb48a09448-marketplace-trusted-ca\") on 
node \"crc\" DevicePath \"\"" Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.769131 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e405bfea-f17b-4c2b-b69e-fc7284876cdc-utilities" (OuterVolumeSpecName: "utilities") pod "e405bfea-f17b-4c2b-b69e-fc7284876cdc" (UID: "e405bfea-f17b-4c2b-b69e-fc7284876cdc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.771311 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e405bfea-f17b-4c2b-b69e-fc7284876cdc-kube-api-access-xgfkz" (OuterVolumeSpecName: "kube-api-access-xgfkz") pod "e405bfea-f17b-4c2b-b69e-fc7284876cdc" (UID: "e405bfea-f17b-4c2b-b69e-fc7284876cdc"). InnerVolumeSpecName "kube-api-access-xgfkz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.782307 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e405bfea-f17b-4c2b-b69e-fc7284876cdc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e405bfea-f17b-4c2b-b69e-fc7284876cdc" (UID: "e405bfea-f17b-4c2b-b69e-fc7284876cdc"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.869696 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e405bfea-f17b-4c2b-b69e-fc7284876cdc-utilities\") on node \"crc\" DevicePath \"\"" Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.869800 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xgfkz\" (UniqueName: \"kubernetes.io/projected/e405bfea-f17b-4c2b-b69e-fc7284876cdc-kube-api-access-xgfkz\") on node \"crc\" DevicePath \"\"" Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.869821 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e405bfea-f17b-4c2b-b69e-fc7284876cdc-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.911024 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cb8zp"] Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.912278 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cb8zp"] Oct 11 00:55:53 crc kubenswrapper[4743]: I1011 00:55:53.937249 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-cpcdj"] Oct 11 00:55:54 crc kubenswrapper[4743]: I1011 00:55:54.047770 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5ddf76bfbf-2xj9w"] Oct 11 00:55:54 crc kubenswrapper[4743]: I1011 00:55:54.048709 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5ddf76bfbf-2xj9w" podUID="a7df2ac8-a6ab-49dc-98b1-7e12e16af253" containerName="controller-manager" containerID="cri-o://ec6f3d9f986bb2fb82a37dab12df298e1367c7330ba09d9ac5119d35a3fa246c" gracePeriod=30 Oct 11 00:55:54 crc 
kubenswrapper[4743]: I1011 00:55:54.097433 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b87ffc73-065d-4570-867f-b91e442a4c73" path="/var/lib/kubelet/pods/b87ffc73-065d-4570-867f-b91e442a4c73/volumes" Oct 11 00:55:54 crc kubenswrapper[4743]: I1011 00:55:54.098171 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec82a5a9-025d-43e4-a1f4-6b4245bcc7ef" path="/var/lib/kubelet/pods/ec82a5a9-025d-43e4-a1f4-6b4245bcc7ef/volumes" Oct 11 00:55:54 crc kubenswrapper[4743]: I1011 00:55:54.151331 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66d89dc4c-q6z79"] Oct 11 00:55:54 crc kubenswrapper[4743]: I1011 00:55:54.151574 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-66d89dc4c-q6z79" podUID="ea3117ba-711a-4055-bf89-d405d05a0154" containerName="route-controller-manager" containerID="cri-o://e17c49d6523ebd03966a2183efc2389710aa62df1bdae3412cbe9b1c5f286464" gracePeriod=30 Oct 11 00:55:54 crc kubenswrapper[4743]: I1011 00:55:54.558487 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-66d89dc4c-q6z79" Oct 11 00:55:54 crc kubenswrapper[4743]: I1011 00:55:54.565165 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5ddf76bfbf-2xj9w" Oct 11 00:55:54 crc kubenswrapper[4743]: I1011 00:55:54.590504 4743 generic.go:334] "Generic (PLEG): container finished" podID="ea3117ba-711a-4055-bf89-d405d05a0154" containerID="e17c49d6523ebd03966a2183efc2389710aa62df1bdae3412cbe9b1c5f286464" exitCode=0 Oct 11 00:55:54 crc kubenswrapper[4743]: I1011 00:55:54.590563 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-66d89dc4c-q6z79" event={"ID":"ea3117ba-711a-4055-bf89-d405d05a0154","Type":"ContainerDied","Data":"e17c49d6523ebd03966a2183efc2389710aa62df1bdae3412cbe9b1c5f286464"} Oct 11 00:55:54 crc kubenswrapper[4743]: I1011 00:55:54.590590 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-66d89dc4c-q6z79" event={"ID":"ea3117ba-711a-4055-bf89-d405d05a0154","Type":"ContainerDied","Data":"edeb8271be18efa842f5713d0c23048e61d0c133aa10e57fd638814505887c48"} Oct 11 00:55:54 crc kubenswrapper[4743]: I1011 00:55:54.590612 4743 scope.go:117] "RemoveContainer" containerID="e17c49d6523ebd03966a2183efc2389710aa62df1bdae3412cbe9b1c5f286464" Oct 11 00:55:54 crc kubenswrapper[4743]: I1011 00:55:54.590700 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-66d89dc4c-q6z79" Oct 11 00:55:54 crc kubenswrapper[4743]: I1011 00:55:54.592358 4743 generic.go:334] "Generic (PLEG): container finished" podID="a7df2ac8-a6ab-49dc-98b1-7e12e16af253" containerID="ec6f3d9f986bb2fb82a37dab12df298e1367c7330ba09d9ac5119d35a3fa246c" exitCode=0 Oct 11 00:55:54 crc kubenswrapper[4743]: I1011 00:55:54.592391 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5ddf76bfbf-2xj9w" event={"ID":"a7df2ac8-a6ab-49dc-98b1-7e12e16af253","Type":"ContainerDied","Data":"ec6f3d9f986bb2fb82a37dab12df298e1367c7330ba09d9ac5119d35a3fa246c"} Oct 11 00:55:54 crc kubenswrapper[4743]: I1011 00:55:54.592416 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5ddf76bfbf-2xj9w" event={"ID":"a7df2ac8-a6ab-49dc-98b1-7e12e16af253","Type":"ContainerDied","Data":"516cf6bd6ba90116f2869cb3fdfcaa7bec1f505cd1cd8507fa5d831549013dfc"} Oct 11 00:55:54 crc kubenswrapper[4743]: I1011 00:55:54.592454 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5ddf76bfbf-2xj9w" Oct 11 00:55:54 crc kubenswrapper[4743]: I1011 00:55:54.593994 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xgtfn" event={"ID":"16bef631-ee0f-4346-bb9b-c6eb48a09448","Type":"ContainerDied","Data":"5ba6cfb5c39fff28953038d718c294177cb4aa7cf9206839ada218d601de67b3"} Oct 11 00:55:54 crc kubenswrapper[4743]: I1011 00:55:54.594007 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xgtfn" Oct 11 00:55:54 crc kubenswrapper[4743]: I1011 00:55:54.597190 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-cpcdj" event={"ID":"ecba19cf-13a3-40ee-8d5a-17af54a79caa","Type":"ContainerStarted","Data":"2a75ea291009b42be9584cc180c483ae3f961585b3ab42a7d7884c2e7822a8c5"} Oct 11 00:55:54 crc kubenswrapper[4743]: I1011 00:55:54.597244 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-cpcdj" event={"ID":"ecba19cf-13a3-40ee-8d5a-17af54a79caa","Type":"ContainerStarted","Data":"4c67b57205b71caa94eeb448a383015dc7db7853dd0eb2deb955a76fd3260d72"} Oct 11 00:55:54 crc kubenswrapper[4743]: I1011 00:55:54.598058 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-cpcdj" Oct 11 00:55:54 crc kubenswrapper[4743]: I1011 00:55:54.601024 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r8dmc" event={"ID":"7c7ded1c-c0ce-47c4-9959-b95631f067ea","Type":"ContainerDied","Data":"328acba2fcf7b182090b966a9107eff5a3cbb81f95c3ac070d3db395acd6530b"} Oct 11 00:55:54 crc kubenswrapper[4743]: I1011 00:55:54.601149 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-r8dmc" Oct 11 00:55:54 crc kubenswrapper[4743]: I1011 00:55:54.601830 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-cpcdj" Oct 11 00:55:54 crc kubenswrapper[4743]: I1011 00:55:54.603973 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jb4p2" event={"ID":"e405bfea-f17b-4c2b-b69e-fc7284876cdc","Type":"ContainerDied","Data":"6b312ec61258615a36c62c7ff87b1fb25cfb7a42fdfe0803983be183945e4eb1"} Oct 11 00:55:54 crc kubenswrapper[4743]: I1011 00:55:54.604023 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jb4p2" Oct 11 00:55:54 crc kubenswrapper[4743]: I1011 00:55:54.613088 4743 scope.go:117] "RemoveContainer" containerID="e17c49d6523ebd03966a2183efc2389710aa62df1bdae3412cbe9b1c5f286464" Oct 11 00:55:54 crc kubenswrapper[4743]: E1011 00:55:54.613571 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e17c49d6523ebd03966a2183efc2389710aa62df1bdae3412cbe9b1c5f286464\": container with ID starting with e17c49d6523ebd03966a2183efc2389710aa62df1bdae3412cbe9b1c5f286464 not found: ID does not exist" containerID="e17c49d6523ebd03966a2183efc2389710aa62df1bdae3412cbe9b1c5f286464" Oct 11 00:55:54 crc kubenswrapper[4743]: I1011 00:55:54.613611 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e17c49d6523ebd03966a2183efc2389710aa62df1bdae3412cbe9b1c5f286464"} err="failed to get container status \"e17c49d6523ebd03966a2183efc2389710aa62df1bdae3412cbe9b1c5f286464\": rpc error: code = NotFound desc = could not find container \"e17c49d6523ebd03966a2183efc2389710aa62df1bdae3412cbe9b1c5f286464\": container with ID starting with 
e17c49d6523ebd03966a2183efc2389710aa62df1bdae3412cbe9b1c5f286464 not found: ID does not exist" Oct 11 00:55:54 crc kubenswrapper[4743]: I1011 00:55:54.613707 4743 scope.go:117] "RemoveContainer" containerID="ec6f3d9f986bb2fb82a37dab12df298e1367c7330ba09d9ac5119d35a3fa246c" Oct 11 00:55:54 crc kubenswrapper[4743]: I1011 00:55:54.640895 4743 scope.go:117] "RemoveContainer" containerID="ec6f3d9f986bb2fb82a37dab12df298e1367c7330ba09d9ac5119d35a3fa246c" Oct 11 00:55:54 crc kubenswrapper[4743]: E1011 00:55:54.642355 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec6f3d9f986bb2fb82a37dab12df298e1367c7330ba09d9ac5119d35a3fa246c\": container with ID starting with ec6f3d9f986bb2fb82a37dab12df298e1367c7330ba09d9ac5119d35a3fa246c not found: ID does not exist" containerID="ec6f3d9f986bb2fb82a37dab12df298e1367c7330ba09d9ac5119d35a3fa246c" Oct 11 00:55:54 crc kubenswrapper[4743]: I1011 00:55:54.642417 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec6f3d9f986bb2fb82a37dab12df298e1367c7330ba09d9ac5119d35a3fa246c"} err="failed to get container status \"ec6f3d9f986bb2fb82a37dab12df298e1367c7330ba09d9ac5119d35a3fa246c\": rpc error: code = NotFound desc = could not find container \"ec6f3d9f986bb2fb82a37dab12df298e1367c7330ba09d9ac5119d35a3fa246c\": container with ID starting with ec6f3d9f986bb2fb82a37dab12df298e1367c7330ba09d9ac5119d35a3fa246c not found: ID does not exist" Oct 11 00:55:54 crc kubenswrapper[4743]: I1011 00:55:54.642453 4743 scope.go:117] "RemoveContainer" containerID="a17b919a62d3e30de28933f816947e9668d1a111d03144fa9f95231658c5d17c" Oct 11 00:55:54 crc kubenswrapper[4743]: I1011 00:55:54.647922 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-cpcdj" podStartSLOduration=2.647831996 podStartE2EDuration="2.647831996s" podCreationTimestamp="2025-10-11 00:55:52 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 00:55:54.640683011 +0000 UTC m=+249.293663408" watchObservedRunningTime="2025-10-11 00:55:54.647831996 +0000 UTC m=+249.300812393" Oct 11 00:55:54 crc kubenswrapper[4743]: I1011 00:55:54.655333 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xgtfn"] Oct 11 00:55:54 crc kubenswrapper[4743]: I1011 00:55:54.669529 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xgtfn"] Oct 11 00:55:54 crc kubenswrapper[4743]: I1011 00:55:54.672259 4743 scope.go:117] "RemoveContainer" containerID="beae56b2fff24fb0eeac7feabc2ade2cc29afb6bf52b13415e8b1adc5a084003" Oct 11 00:55:54 crc kubenswrapper[4743]: I1011 00:55:54.678764 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9df8j\" (UniqueName: \"kubernetes.io/projected/ea3117ba-711a-4055-bf89-d405d05a0154-kube-api-access-9df8j\") pod \"ea3117ba-711a-4055-bf89-d405d05a0154\" (UID: \"ea3117ba-711a-4055-bf89-d405d05a0154\") " Oct 11 00:55:54 crc kubenswrapper[4743]: I1011 00:55:54.678805 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea3117ba-711a-4055-bf89-d405d05a0154-config\") pod \"ea3117ba-711a-4055-bf89-d405d05a0154\" (UID: \"ea3117ba-711a-4055-bf89-d405d05a0154\") " Oct 11 00:55:54 crc kubenswrapper[4743]: I1011 00:55:54.678824 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a7df2ac8-a6ab-49dc-98b1-7e12e16af253-client-ca\") pod \"a7df2ac8-a6ab-49dc-98b1-7e12e16af253\" (UID: \"a7df2ac8-a6ab-49dc-98b1-7e12e16af253\") " Oct 11 00:55:54 crc kubenswrapper[4743]: I1011 00:55:54.678920 4743 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a7df2ac8-a6ab-49dc-98b1-7e12e16af253-proxy-ca-bundles\") pod \"a7df2ac8-a6ab-49dc-98b1-7e12e16af253\" (UID: \"a7df2ac8-a6ab-49dc-98b1-7e12e16af253\") " Oct 11 00:55:54 crc kubenswrapper[4743]: I1011 00:55:54.678944 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-flbgl\" (UniqueName: \"kubernetes.io/projected/a7df2ac8-a6ab-49dc-98b1-7e12e16af253-kube-api-access-flbgl\") pod \"a7df2ac8-a6ab-49dc-98b1-7e12e16af253\" (UID: \"a7df2ac8-a6ab-49dc-98b1-7e12e16af253\") " Oct 11 00:55:54 crc kubenswrapper[4743]: I1011 00:55:54.678964 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a7df2ac8-a6ab-49dc-98b1-7e12e16af253-serving-cert\") pod \"a7df2ac8-a6ab-49dc-98b1-7e12e16af253\" (UID: \"a7df2ac8-a6ab-49dc-98b1-7e12e16af253\") " Oct 11 00:55:54 crc kubenswrapper[4743]: I1011 00:55:54.678978 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea3117ba-711a-4055-bf89-d405d05a0154-serving-cert\") pod \"ea3117ba-711a-4055-bf89-d405d05a0154\" (UID: \"ea3117ba-711a-4055-bf89-d405d05a0154\") " Oct 11 00:55:54 crc kubenswrapper[4743]: I1011 00:55:54.678999 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ea3117ba-711a-4055-bf89-d405d05a0154-client-ca\") pod \"ea3117ba-711a-4055-bf89-d405d05a0154\" (UID: \"ea3117ba-711a-4055-bf89-d405d05a0154\") " Oct 11 00:55:54 crc kubenswrapper[4743]: I1011 00:55:54.679025 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7df2ac8-a6ab-49dc-98b1-7e12e16af253-config\") pod \"a7df2ac8-a6ab-49dc-98b1-7e12e16af253\" (UID: 
\"a7df2ac8-a6ab-49dc-98b1-7e12e16af253\") " Oct 11 00:55:54 crc kubenswrapper[4743]: I1011 00:55:54.680988 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7df2ac8-a6ab-49dc-98b1-7e12e16af253-client-ca" (OuterVolumeSpecName: "client-ca") pod "a7df2ac8-a6ab-49dc-98b1-7e12e16af253" (UID: "a7df2ac8-a6ab-49dc-98b1-7e12e16af253"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 00:55:54 crc kubenswrapper[4743]: I1011 00:55:54.681019 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7df2ac8-a6ab-49dc-98b1-7e12e16af253-config" (OuterVolumeSpecName: "config") pod "a7df2ac8-a6ab-49dc-98b1-7e12e16af253" (UID: "a7df2ac8-a6ab-49dc-98b1-7e12e16af253"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 00:55:54 crc kubenswrapper[4743]: I1011 00:55:54.681404 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea3117ba-711a-4055-bf89-d405d05a0154-client-ca" (OuterVolumeSpecName: "client-ca") pod "ea3117ba-711a-4055-bf89-d405d05a0154" (UID: "ea3117ba-711a-4055-bf89-d405d05a0154"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 00:55:54 crc kubenswrapper[4743]: I1011 00:55:54.681647 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7df2ac8-a6ab-49dc-98b1-7e12e16af253-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "a7df2ac8-a6ab-49dc-98b1-7e12e16af253" (UID: "a7df2ac8-a6ab-49dc-98b1-7e12e16af253"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 00:55:54 crc kubenswrapper[4743]: I1011 00:55:54.682351 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea3117ba-711a-4055-bf89-d405d05a0154-config" (OuterVolumeSpecName: "config") pod "ea3117ba-711a-4055-bf89-d405d05a0154" (UID: "ea3117ba-711a-4055-bf89-d405d05a0154"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 00:55:54 crc kubenswrapper[4743]: I1011 00:55:54.687122 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7df2ac8-a6ab-49dc-98b1-7e12e16af253-kube-api-access-flbgl" (OuterVolumeSpecName: "kube-api-access-flbgl") pod "a7df2ac8-a6ab-49dc-98b1-7e12e16af253" (UID: "a7df2ac8-a6ab-49dc-98b1-7e12e16af253"). InnerVolumeSpecName "kube-api-access-flbgl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 00:55:54 crc kubenswrapper[4743]: I1011 00:55:54.687520 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7df2ac8-a6ab-49dc-98b1-7e12e16af253-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a7df2ac8-a6ab-49dc-98b1-7e12e16af253" (UID: "a7df2ac8-a6ab-49dc-98b1-7e12e16af253"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 00:55:54 crc kubenswrapper[4743]: I1011 00:55:54.688748 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jb4p2"] Oct 11 00:55:54 crc kubenswrapper[4743]: I1011 00:55:54.688901 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea3117ba-711a-4055-bf89-d405d05a0154-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ea3117ba-711a-4055-bf89-d405d05a0154" (UID: "ea3117ba-711a-4055-bf89-d405d05a0154"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 00:55:54 crc kubenswrapper[4743]: I1011 00:55:54.688944 4743 scope.go:117] "RemoveContainer" containerID="57d623e4158b8ddb2f2ab443ea88307b6e28debea675636884bef3dd8aea8595" Oct 11 00:55:54 crc kubenswrapper[4743]: I1011 00:55:54.689583 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea3117ba-711a-4055-bf89-d405d05a0154-kube-api-access-9df8j" (OuterVolumeSpecName: "kube-api-access-9df8j") pod "ea3117ba-711a-4055-bf89-d405d05a0154" (UID: "ea3117ba-711a-4055-bf89-d405d05a0154"). InnerVolumeSpecName "kube-api-access-9df8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 00:55:54 crc kubenswrapper[4743]: I1011 00:55:54.699294 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jb4p2"] Oct 11 00:55:54 crc kubenswrapper[4743]: I1011 00:55:54.703755 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-r8dmc"] Oct 11 00:55:54 crc kubenswrapper[4743]: I1011 00:55:54.706705 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-r8dmc"] Oct 11 00:55:54 crc kubenswrapper[4743]: I1011 00:55:54.719565 4743 scope.go:117] "RemoveContainer" containerID="19e74fe5505ca96f8a8fb14c34ecf3090c0b787767ab0143fb8ffb55a5a10ef5" Oct 11 00:55:54 crc kubenswrapper[4743]: I1011 00:55:54.731791 4743 scope.go:117] "RemoveContainer" containerID="6ce938bf525beadd39c9cd7e624b606c041bf5310133137678d744a156df1a8b" Oct 11 00:55:54 crc kubenswrapper[4743]: I1011 00:55:54.745384 4743 scope.go:117] "RemoveContainer" containerID="36d6b47b860840adbad156b7277b4d0e3c768e24e5b67732c5b1ed66780ed753" Oct 11 00:55:54 crc kubenswrapper[4743]: I1011 00:55:54.757181 4743 scope.go:117] "RemoveContainer" containerID="c6b0ce73a1a2439c1501173eb8f60a0b29e4984619e479a4d11d2622e3ac8aad" Oct 11 00:55:54 crc kubenswrapper[4743]: I1011 00:55:54.779844 4743 
reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a7df2ac8-a6ab-49dc-98b1-7e12e16af253-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 11 00:55:54 crc kubenswrapper[4743]: I1011 00:55:54.779962 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-flbgl\" (UniqueName: \"kubernetes.io/projected/a7df2ac8-a6ab-49dc-98b1-7e12e16af253-kube-api-access-flbgl\") on node \"crc\" DevicePath \"\"" Oct 11 00:55:54 crc kubenswrapper[4743]: I1011 00:55:54.779975 4743 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ea3117ba-711a-4055-bf89-d405d05a0154-client-ca\") on node \"crc\" DevicePath \"\"" Oct 11 00:55:54 crc kubenswrapper[4743]: I1011 00:55:54.779984 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a7df2ac8-a6ab-49dc-98b1-7e12e16af253-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 11 00:55:54 crc kubenswrapper[4743]: I1011 00:55:54.779992 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea3117ba-711a-4055-bf89-d405d05a0154-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 11 00:55:54 crc kubenswrapper[4743]: I1011 00:55:54.780001 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7df2ac8-a6ab-49dc-98b1-7e12e16af253-config\") on node \"crc\" DevicePath \"\"" Oct 11 00:55:54 crc kubenswrapper[4743]: I1011 00:55:54.780010 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9df8j\" (UniqueName: \"kubernetes.io/projected/ea3117ba-711a-4055-bf89-d405d05a0154-kube-api-access-9df8j\") on node \"crc\" DevicePath \"\"" Oct 11 00:55:54 crc kubenswrapper[4743]: I1011 00:55:54.780018 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ea3117ba-711a-4055-bf89-d405d05a0154-config\") on node \"crc\" DevicePath \"\"" Oct 11 00:55:54 crc kubenswrapper[4743]: I1011 00:55:54.780044 4743 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a7df2ac8-a6ab-49dc-98b1-7e12e16af253-client-ca\") on node \"crc\" DevicePath \"\"" Oct 11 00:55:54 crc kubenswrapper[4743]: I1011 00:55:54.918386 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5ddf76bfbf-2xj9w"] Oct 11 00:55:54 crc kubenswrapper[4743]: I1011 00:55:54.922555 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5ddf76bfbf-2xj9w"] Oct 11 00:55:54 crc kubenswrapper[4743]: I1011 00:55:54.930497 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66d89dc4c-q6z79"] Oct 11 00:55:54 crc kubenswrapper[4743]: I1011 00:55:54.932519 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66d89dc4c-q6z79"] Oct 11 00:55:55 crc kubenswrapper[4743]: I1011 00:55:55.269280 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-l6gl2"] Oct 11 00:55:55 crc kubenswrapper[4743]: E1011 00:55:55.269729 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b87ffc73-065d-4570-867f-b91e442a4c73" containerName="extract-content" Oct 11 00:55:55 crc kubenswrapper[4743]: I1011 00:55:55.269749 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="b87ffc73-065d-4570-867f-b91e442a4c73" containerName="extract-content" Oct 11 00:55:55 crc kubenswrapper[4743]: E1011 00:55:55.269764 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e405bfea-f17b-4c2b-b69e-fc7284876cdc" containerName="extract-content" Oct 11 00:55:55 crc kubenswrapper[4743]: I1011 00:55:55.269772 4743 
state_mem.go:107] "Deleted CPUSet assignment" podUID="e405bfea-f17b-4c2b-b69e-fc7284876cdc" containerName="extract-content" Oct 11 00:55:55 crc kubenswrapper[4743]: E1011 00:55:55.269788 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea3117ba-711a-4055-bf89-d405d05a0154" containerName="route-controller-manager" Oct 11 00:55:55 crc kubenswrapper[4743]: I1011 00:55:55.269797 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea3117ba-711a-4055-bf89-d405d05a0154" containerName="route-controller-manager" Oct 11 00:55:55 crc kubenswrapper[4743]: E1011 00:55:55.269809 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec82a5a9-025d-43e4-a1f4-6b4245bcc7ef" containerName="registry-server" Oct 11 00:55:55 crc kubenswrapper[4743]: I1011 00:55:55.269817 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec82a5a9-025d-43e4-a1f4-6b4245bcc7ef" containerName="registry-server" Oct 11 00:55:55 crc kubenswrapper[4743]: E1011 00:55:55.269826 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e405bfea-f17b-4c2b-b69e-fc7284876cdc" containerName="extract-utilities" Oct 11 00:55:55 crc kubenswrapper[4743]: I1011 00:55:55.269838 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="e405bfea-f17b-4c2b-b69e-fc7284876cdc" containerName="extract-utilities" Oct 11 00:55:55 crc kubenswrapper[4743]: E1011 00:55:55.269849 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec82a5a9-025d-43e4-a1f4-6b4245bcc7ef" containerName="extract-content" Oct 11 00:55:55 crc kubenswrapper[4743]: I1011 00:55:55.269878 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec82a5a9-025d-43e4-a1f4-6b4245bcc7ef" containerName="extract-content" Oct 11 00:55:55 crc kubenswrapper[4743]: E1011 00:55:55.269895 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b87ffc73-065d-4570-867f-b91e442a4c73" containerName="registry-server" Oct 11 00:55:55 crc kubenswrapper[4743]: I1011 
00:55:55.269904 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="b87ffc73-065d-4570-867f-b91e442a4c73" containerName="registry-server" Oct 11 00:55:55 crc kubenswrapper[4743]: E1011 00:55:55.269920 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16bef631-ee0f-4346-bb9b-c6eb48a09448" containerName="marketplace-operator" Oct 11 00:55:55 crc kubenswrapper[4743]: I1011 00:55:55.269931 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="16bef631-ee0f-4346-bb9b-c6eb48a09448" containerName="marketplace-operator" Oct 11 00:55:55 crc kubenswrapper[4743]: E1011 00:55:55.269948 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7df2ac8-a6ab-49dc-98b1-7e12e16af253" containerName="controller-manager" Oct 11 00:55:55 crc kubenswrapper[4743]: I1011 00:55:55.269958 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7df2ac8-a6ab-49dc-98b1-7e12e16af253" containerName="controller-manager" Oct 11 00:55:55 crc kubenswrapper[4743]: E1011 00:55:55.269975 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec82a5a9-025d-43e4-a1f4-6b4245bcc7ef" containerName="extract-utilities" Oct 11 00:55:55 crc kubenswrapper[4743]: I1011 00:55:55.269985 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec82a5a9-025d-43e4-a1f4-6b4245bcc7ef" containerName="extract-utilities" Oct 11 00:55:55 crc kubenswrapper[4743]: E1011 00:55:55.270000 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c7ded1c-c0ce-47c4-9959-b95631f067ea" containerName="extract-utilities" Oct 11 00:55:55 crc kubenswrapper[4743]: I1011 00:55:55.270010 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c7ded1c-c0ce-47c4-9959-b95631f067ea" containerName="extract-utilities" Oct 11 00:55:55 crc kubenswrapper[4743]: E1011 00:55:55.270024 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c7ded1c-c0ce-47c4-9959-b95631f067ea" containerName="registry-server" Oct 11 00:55:55 crc 
kubenswrapper[4743]: I1011 00:55:55.270034 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c7ded1c-c0ce-47c4-9959-b95631f067ea" containerName="registry-server" Oct 11 00:55:55 crc kubenswrapper[4743]: E1011 00:55:55.270047 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e405bfea-f17b-4c2b-b69e-fc7284876cdc" containerName="registry-server" Oct 11 00:55:55 crc kubenswrapper[4743]: I1011 00:55:55.270056 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="e405bfea-f17b-4c2b-b69e-fc7284876cdc" containerName="registry-server" Oct 11 00:55:55 crc kubenswrapper[4743]: E1011 00:55:55.270072 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b87ffc73-065d-4570-867f-b91e442a4c73" containerName="extract-utilities" Oct 11 00:55:55 crc kubenswrapper[4743]: I1011 00:55:55.270082 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="b87ffc73-065d-4570-867f-b91e442a4c73" containerName="extract-utilities" Oct 11 00:55:55 crc kubenswrapper[4743]: E1011 00:55:55.270094 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c7ded1c-c0ce-47c4-9959-b95631f067ea" containerName="extract-content" Oct 11 00:55:55 crc kubenswrapper[4743]: I1011 00:55:55.270104 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c7ded1c-c0ce-47c4-9959-b95631f067ea" containerName="extract-content" Oct 11 00:55:55 crc kubenswrapper[4743]: I1011 00:55:55.270236 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="b87ffc73-065d-4570-867f-b91e442a4c73" containerName="registry-server" Oct 11 00:55:55 crc kubenswrapper[4743]: I1011 00:55:55.270253 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec82a5a9-025d-43e4-a1f4-6b4245bcc7ef" containerName="registry-server" Oct 11 00:55:55 crc kubenswrapper[4743]: I1011 00:55:55.270267 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="16bef631-ee0f-4346-bb9b-c6eb48a09448" containerName="marketplace-operator" Oct 11 00:55:55 
crc kubenswrapper[4743]: I1011 00:55:55.270280 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7df2ac8-a6ab-49dc-98b1-7e12e16af253" containerName="controller-manager" Oct 11 00:55:55 crc kubenswrapper[4743]: I1011 00:55:55.270290 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="e405bfea-f17b-4c2b-b69e-fc7284876cdc" containerName="registry-server" Oct 11 00:55:55 crc kubenswrapper[4743]: I1011 00:55:55.270305 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea3117ba-711a-4055-bf89-d405d05a0154" containerName="route-controller-manager" Oct 11 00:55:55 crc kubenswrapper[4743]: I1011 00:55:55.270319 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c7ded1c-c0ce-47c4-9959-b95631f067ea" containerName="registry-server" Oct 11 00:55:55 crc kubenswrapper[4743]: I1011 00:55:55.271208 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l6gl2" Oct 11 00:55:55 crc kubenswrapper[4743]: I1011 00:55:55.272824 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 11 00:55:55 crc kubenswrapper[4743]: I1011 00:55:55.277091 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l6gl2"] Oct 11 00:55:55 crc kubenswrapper[4743]: I1011 00:55:55.376883 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-569fd8fb86-lj6x8"] Oct 11 00:55:55 crc kubenswrapper[4743]: I1011 00:55:55.377931 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-569fd8fb86-lj6x8" Oct 11 00:55:55 crc kubenswrapper[4743]: I1011 00:55:55.379170 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-577544f748-kr9zv"] Oct 11 00:55:55 crc kubenswrapper[4743]: I1011 00:55:55.379744 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-577544f748-kr9zv" Oct 11 00:55:55 crc kubenswrapper[4743]: I1011 00:55:55.381910 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 11 00:55:55 crc kubenswrapper[4743]: I1011 00:55:55.384277 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 11 00:55:55 crc kubenswrapper[4743]: I1011 00:55:55.384351 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 11 00:55:55 crc kubenswrapper[4743]: I1011 00:55:55.384598 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 11 00:55:55 crc kubenswrapper[4743]: I1011 00:55:55.384797 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 11 00:55:55 crc kubenswrapper[4743]: I1011 00:55:55.385354 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 11 00:55:55 crc kubenswrapper[4743]: I1011 00:55:55.385802 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 11 00:55:55 crc kubenswrapper[4743]: I1011 00:55:55.386071 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b619db5-930f-4298-9d1d-2c74a9e60783-utilities\") pod \"community-operators-l6gl2\" (UID: \"8b619db5-930f-4298-9d1d-2c74a9e60783\") " pod="openshift-marketplace/community-operators-l6gl2" Oct 11 00:55:55 crc kubenswrapper[4743]: I1011 00:55:55.386844 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b619db5-930f-4298-9d1d-2c74a9e60783-catalog-content\") pod \"community-operators-l6gl2\" (UID: \"8b619db5-930f-4298-9d1d-2c74a9e60783\") " pod="openshift-marketplace/community-operators-l6gl2" Oct 11 00:55:55 crc kubenswrapper[4743]: I1011 00:55:55.386989 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wk8l\" (UniqueName: \"kubernetes.io/projected/8b619db5-930f-4298-9d1d-2c74a9e60783-kube-api-access-9wk8l\") pod \"community-operators-l6gl2\" (UID: \"8b619db5-930f-4298-9d1d-2c74a9e60783\") " pod="openshift-marketplace/community-operators-l6gl2" Oct 11 00:55:55 crc kubenswrapper[4743]: I1011 00:55:55.386079 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 11 00:55:55 crc kubenswrapper[4743]: I1011 00:55:55.386943 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 11 00:55:55 crc kubenswrapper[4743]: I1011 00:55:55.387000 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 11 00:55:55 crc kubenswrapper[4743]: I1011 00:55:55.387052 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 11 00:55:55 crc kubenswrapper[4743]: I1011 00:55:55.387182 4743 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager"/"serving-cert" Oct 11 00:55:55 crc kubenswrapper[4743]: I1011 00:55:55.389040 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-577544f748-kr9zv"] Oct 11 00:55:55 crc kubenswrapper[4743]: I1011 00:55:55.393885 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 11 00:55:55 crc kubenswrapper[4743]: I1011 00:55:55.398274 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-569fd8fb86-lj6x8"] Oct 11 00:55:55 crc kubenswrapper[4743]: I1011 00:55:55.487595 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d19be4d9-f731-420a-870d-90dfc05e487e-serving-cert\") pod \"controller-manager-569fd8fb86-lj6x8\" (UID: \"d19be4d9-f731-420a-870d-90dfc05e487e\") " pod="openshift-controller-manager/controller-manager-569fd8fb86-lj6x8" Oct 11 00:55:55 crc kubenswrapper[4743]: I1011 00:55:55.487638 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b619db5-930f-4298-9d1d-2c74a9e60783-utilities\") pod \"community-operators-l6gl2\" (UID: \"8b619db5-930f-4298-9d1d-2c74a9e60783\") " pod="openshift-marketplace/community-operators-l6gl2" Oct 11 00:55:55 crc kubenswrapper[4743]: I1011 00:55:55.487668 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b619db5-930f-4298-9d1d-2c74a9e60783-catalog-content\") pod \"community-operators-l6gl2\" (UID: \"8b619db5-930f-4298-9d1d-2c74a9e60783\") " pod="openshift-marketplace/community-operators-l6gl2" Oct 11 00:55:55 crc kubenswrapper[4743]: I1011 00:55:55.487693 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7c1614de-3cce-4b11-b6d2-9e1a799e7b7e-client-ca\") pod \"route-controller-manager-577544f748-kr9zv\" (UID: \"7c1614de-3cce-4b11-b6d2-9e1a799e7b7e\") " pod="openshift-route-controller-manager/route-controller-manager-577544f748-kr9zv" Oct 11 00:55:55 crc kubenswrapper[4743]: I1011 00:55:55.487715 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcjvz\" (UniqueName: \"kubernetes.io/projected/d19be4d9-f731-420a-870d-90dfc05e487e-kube-api-access-jcjvz\") pod \"controller-manager-569fd8fb86-lj6x8\" (UID: \"d19be4d9-f731-420a-870d-90dfc05e487e\") " pod="openshift-controller-manager/controller-manager-569fd8fb86-lj6x8" Oct 11 00:55:55 crc kubenswrapper[4743]: I1011 00:55:55.487737 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c1614de-3cce-4b11-b6d2-9e1a799e7b7e-config\") pod \"route-controller-manager-577544f748-kr9zv\" (UID: \"7c1614de-3cce-4b11-b6d2-9e1a799e7b7e\") " pod="openshift-route-controller-manager/route-controller-manager-577544f748-kr9zv" Oct 11 00:55:55 crc kubenswrapper[4743]: I1011 00:55:55.487756 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d19be4d9-f731-420a-870d-90dfc05e487e-proxy-ca-bundles\") pod \"controller-manager-569fd8fb86-lj6x8\" (UID: \"d19be4d9-f731-420a-870d-90dfc05e487e\") " pod="openshift-controller-manager/controller-manager-569fd8fb86-lj6x8" Oct 11 00:55:55 crc kubenswrapper[4743]: I1011 00:55:55.488119 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7c1614de-3cce-4b11-b6d2-9e1a799e7b7e-serving-cert\") pod \"route-controller-manager-577544f748-kr9zv\" (UID: 
\"7c1614de-3cce-4b11-b6d2-9e1a799e7b7e\") " pod="openshift-route-controller-manager/route-controller-manager-577544f748-kr9zv" Oct 11 00:55:55 crc kubenswrapper[4743]: I1011 00:55:55.488269 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wk8l\" (UniqueName: \"kubernetes.io/projected/8b619db5-930f-4298-9d1d-2c74a9e60783-kube-api-access-9wk8l\") pod \"community-operators-l6gl2\" (UID: \"8b619db5-930f-4298-9d1d-2c74a9e60783\") " pod="openshift-marketplace/community-operators-l6gl2" Oct 11 00:55:55 crc kubenswrapper[4743]: I1011 00:55:55.488315 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdjmg\" (UniqueName: \"kubernetes.io/projected/7c1614de-3cce-4b11-b6d2-9e1a799e7b7e-kube-api-access-qdjmg\") pod \"route-controller-manager-577544f748-kr9zv\" (UID: \"7c1614de-3cce-4b11-b6d2-9e1a799e7b7e\") " pod="openshift-route-controller-manager/route-controller-manager-577544f748-kr9zv" Oct 11 00:55:55 crc kubenswrapper[4743]: I1011 00:55:55.488343 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d19be4d9-f731-420a-870d-90dfc05e487e-config\") pod \"controller-manager-569fd8fb86-lj6x8\" (UID: \"d19be4d9-f731-420a-870d-90dfc05e487e\") " pod="openshift-controller-manager/controller-manager-569fd8fb86-lj6x8" Oct 11 00:55:55 crc kubenswrapper[4743]: I1011 00:55:55.488336 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b619db5-930f-4298-9d1d-2c74a9e60783-catalog-content\") pod \"community-operators-l6gl2\" (UID: \"8b619db5-930f-4298-9d1d-2c74a9e60783\") " pod="openshift-marketplace/community-operators-l6gl2" Oct 11 00:55:55 crc kubenswrapper[4743]: I1011 00:55:55.488367 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" 
(UniqueName: \"kubernetes.io/configmap/d19be4d9-f731-420a-870d-90dfc05e487e-client-ca\") pod \"controller-manager-569fd8fb86-lj6x8\" (UID: \"d19be4d9-f731-420a-870d-90dfc05e487e\") " pod="openshift-controller-manager/controller-manager-569fd8fb86-lj6x8" Oct 11 00:55:55 crc kubenswrapper[4743]: I1011 00:55:55.488440 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b619db5-930f-4298-9d1d-2c74a9e60783-utilities\") pod \"community-operators-l6gl2\" (UID: \"8b619db5-930f-4298-9d1d-2c74a9e60783\") " pod="openshift-marketplace/community-operators-l6gl2" Oct 11 00:55:55 crc kubenswrapper[4743]: I1011 00:55:55.511461 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wk8l\" (UniqueName: \"kubernetes.io/projected/8b619db5-930f-4298-9d1d-2c74a9e60783-kube-api-access-9wk8l\") pod \"community-operators-l6gl2\" (UID: \"8b619db5-930f-4298-9d1d-2c74a9e60783\") " pod="openshift-marketplace/community-operators-l6gl2" Oct 11 00:55:55 crc kubenswrapper[4743]: I1011 00:55:55.589511 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7c1614de-3cce-4b11-b6d2-9e1a799e7b7e-client-ca\") pod \"route-controller-manager-577544f748-kr9zv\" (UID: \"7c1614de-3cce-4b11-b6d2-9e1a799e7b7e\") " pod="openshift-route-controller-manager/route-controller-manager-577544f748-kr9zv" Oct 11 00:55:55 crc kubenswrapper[4743]: I1011 00:55:55.589559 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcjvz\" (UniqueName: \"kubernetes.io/projected/d19be4d9-f731-420a-870d-90dfc05e487e-kube-api-access-jcjvz\") pod \"controller-manager-569fd8fb86-lj6x8\" (UID: \"d19be4d9-f731-420a-870d-90dfc05e487e\") " pod="openshift-controller-manager/controller-manager-569fd8fb86-lj6x8" Oct 11 00:55:55 crc kubenswrapper[4743]: I1011 00:55:55.589583 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c1614de-3cce-4b11-b6d2-9e1a799e7b7e-config\") pod \"route-controller-manager-577544f748-kr9zv\" (UID: \"7c1614de-3cce-4b11-b6d2-9e1a799e7b7e\") " pod="openshift-route-controller-manager/route-controller-manager-577544f748-kr9zv" Oct 11 00:55:55 crc kubenswrapper[4743]: I1011 00:55:55.589606 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d19be4d9-f731-420a-870d-90dfc05e487e-proxy-ca-bundles\") pod \"controller-manager-569fd8fb86-lj6x8\" (UID: \"d19be4d9-f731-420a-870d-90dfc05e487e\") " pod="openshift-controller-manager/controller-manager-569fd8fb86-lj6x8" Oct 11 00:55:55 crc kubenswrapper[4743]: I1011 00:55:55.589626 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7c1614de-3cce-4b11-b6d2-9e1a799e7b7e-serving-cert\") pod \"route-controller-manager-577544f748-kr9zv\" (UID: \"7c1614de-3cce-4b11-b6d2-9e1a799e7b7e\") " pod="openshift-route-controller-manager/route-controller-manager-577544f748-kr9zv" Oct 11 00:55:55 crc kubenswrapper[4743]: I1011 00:55:55.589649 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdjmg\" (UniqueName: \"kubernetes.io/projected/7c1614de-3cce-4b11-b6d2-9e1a799e7b7e-kube-api-access-qdjmg\") pod \"route-controller-manager-577544f748-kr9zv\" (UID: \"7c1614de-3cce-4b11-b6d2-9e1a799e7b7e\") " pod="openshift-route-controller-manager/route-controller-manager-577544f748-kr9zv" Oct 11 00:55:55 crc kubenswrapper[4743]: I1011 00:55:55.589665 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d19be4d9-f731-420a-870d-90dfc05e487e-config\") pod \"controller-manager-569fd8fb86-lj6x8\" (UID: \"d19be4d9-f731-420a-870d-90dfc05e487e\") " 
pod="openshift-controller-manager/controller-manager-569fd8fb86-lj6x8" Oct 11 00:55:55 crc kubenswrapper[4743]: I1011 00:55:55.589680 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d19be4d9-f731-420a-870d-90dfc05e487e-client-ca\") pod \"controller-manager-569fd8fb86-lj6x8\" (UID: \"d19be4d9-f731-420a-870d-90dfc05e487e\") " pod="openshift-controller-manager/controller-manager-569fd8fb86-lj6x8" Oct 11 00:55:55 crc kubenswrapper[4743]: I1011 00:55:55.589707 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d19be4d9-f731-420a-870d-90dfc05e487e-serving-cert\") pod \"controller-manager-569fd8fb86-lj6x8\" (UID: \"d19be4d9-f731-420a-870d-90dfc05e487e\") " pod="openshift-controller-manager/controller-manager-569fd8fb86-lj6x8" Oct 11 00:55:55 crc kubenswrapper[4743]: I1011 00:55:55.590587 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d19be4d9-f731-420a-870d-90dfc05e487e-client-ca\") pod \"controller-manager-569fd8fb86-lj6x8\" (UID: \"d19be4d9-f731-420a-870d-90dfc05e487e\") " pod="openshift-controller-manager/controller-manager-569fd8fb86-lj6x8" Oct 11 00:55:55 crc kubenswrapper[4743]: I1011 00:55:55.590931 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7c1614de-3cce-4b11-b6d2-9e1a799e7b7e-client-ca\") pod \"route-controller-manager-577544f748-kr9zv\" (UID: \"7c1614de-3cce-4b11-b6d2-9e1a799e7b7e\") " pod="openshift-route-controller-manager/route-controller-manager-577544f748-kr9zv" Oct 11 00:55:55 crc kubenswrapper[4743]: I1011 00:55:55.591060 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d19be4d9-f731-420a-870d-90dfc05e487e-proxy-ca-bundles\") pod 
\"controller-manager-569fd8fb86-lj6x8\" (UID: \"d19be4d9-f731-420a-870d-90dfc05e487e\") " pod="openshift-controller-manager/controller-manager-569fd8fb86-lj6x8" Oct 11 00:55:55 crc kubenswrapper[4743]: I1011 00:55:55.591250 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d19be4d9-f731-420a-870d-90dfc05e487e-config\") pod \"controller-manager-569fd8fb86-lj6x8\" (UID: \"d19be4d9-f731-420a-870d-90dfc05e487e\") " pod="openshift-controller-manager/controller-manager-569fd8fb86-lj6x8" Oct 11 00:55:55 crc kubenswrapper[4743]: I1011 00:55:55.591781 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c1614de-3cce-4b11-b6d2-9e1a799e7b7e-config\") pod \"route-controller-manager-577544f748-kr9zv\" (UID: \"7c1614de-3cce-4b11-b6d2-9e1a799e7b7e\") " pod="openshift-route-controller-manager/route-controller-manager-577544f748-kr9zv" Oct 11 00:55:55 crc kubenswrapper[4743]: I1011 00:55:55.592048 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-l6gl2" Oct 11 00:55:55 crc kubenswrapper[4743]: I1011 00:55:55.592732 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7c1614de-3cce-4b11-b6d2-9e1a799e7b7e-serving-cert\") pod \"route-controller-manager-577544f748-kr9zv\" (UID: \"7c1614de-3cce-4b11-b6d2-9e1a799e7b7e\") " pod="openshift-route-controller-manager/route-controller-manager-577544f748-kr9zv" Oct 11 00:55:55 crc kubenswrapper[4743]: I1011 00:55:55.597653 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d19be4d9-f731-420a-870d-90dfc05e487e-serving-cert\") pod \"controller-manager-569fd8fb86-lj6x8\" (UID: \"d19be4d9-f731-420a-870d-90dfc05e487e\") " pod="openshift-controller-manager/controller-manager-569fd8fb86-lj6x8" Oct 11 00:55:55 crc kubenswrapper[4743]: I1011 00:55:55.621792 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdjmg\" (UniqueName: \"kubernetes.io/projected/7c1614de-3cce-4b11-b6d2-9e1a799e7b7e-kube-api-access-qdjmg\") pod \"route-controller-manager-577544f748-kr9zv\" (UID: \"7c1614de-3cce-4b11-b6d2-9e1a799e7b7e\") " pod="openshift-route-controller-manager/route-controller-manager-577544f748-kr9zv" Oct 11 00:55:55 crc kubenswrapper[4743]: I1011 00:55:55.622699 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcjvz\" (UniqueName: \"kubernetes.io/projected/d19be4d9-f731-420a-870d-90dfc05e487e-kube-api-access-jcjvz\") pod \"controller-manager-569fd8fb86-lj6x8\" (UID: \"d19be4d9-f731-420a-870d-90dfc05e487e\") " pod="openshift-controller-manager/controller-manager-569fd8fb86-lj6x8" Oct 11 00:55:55 crc kubenswrapper[4743]: I1011 00:55:55.716457 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-569fd8fb86-lj6x8" Oct 11 00:55:55 crc kubenswrapper[4743]: I1011 00:55:55.731381 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-577544f748-kr9zv" Oct 11 00:55:55 crc kubenswrapper[4743]: I1011 00:55:55.979460 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l6gl2"] Oct 11 00:55:55 crc kubenswrapper[4743]: W1011 00:55:55.988010 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b619db5_930f_4298_9d1d_2c74a9e60783.slice/crio-d50f0ce5ded39068a26834756f830df713033aa8eec140091c0da2fea9b5b6db WatchSource:0}: Error finding container d50f0ce5ded39068a26834756f830df713033aa8eec140091c0da2fea9b5b6db: Status 404 returned error can't find the container with id d50f0ce5ded39068a26834756f830df713033aa8eec140091c0da2fea9b5b6db Oct 11 00:55:56 crc kubenswrapper[4743]: I1011 00:55:56.074402 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-569fd8fb86-lj6x8"] Oct 11 00:55:56 crc kubenswrapper[4743]: W1011 00:55:56.083264 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd19be4d9_f731_420a_870d_90dfc05e487e.slice/crio-9870fe5f8a9f86837e5775a4381e9166aebb7091cf9dc3c4224c5990f91b7893 WatchSource:0}: Error finding container 9870fe5f8a9f86837e5775a4381e9166aebb7091cf9dc3c4224c5990f91b7893: Status 404 returned error can't find the container with id 9870fe5f8a9f86837e5775a4381e9166aebb7091cf9dc3c4224c5990f91b7893 Oct 11 00:55:56 crc kubenswrapper[4743]: I1011 00:55:56.100666 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16bef631-ee0f-4346-bb9b-c6eb48a09448" 
path="/var/lib/kubelet/pods/16bef631-ee0f-4346-bb9b-c6eb48a09448/volumes" Oct 11 00:55:56 crc kubenswrapper[4743]: I1011 00:55:56.101297 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c7ded1c-c0ce-47c4-9959-b95631f067ea" path="/var/lib/kubelet/pods/7c7ded1c-c0ce-47c4-9959-b95631f067ea/volumes" Oct 11 00:55:56 crc kubenswrapper[4743]: I1011 00:55:56.102097 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7df2ac8-a6ab-49dc-98b1-7e12e16af253" path="/var/lib/kubelet/pods/a7df2ac8-a6ab-49dc-98b1-7e12e16af253/volumes" Oct 11 00:55:56 crc kubenswrapper[4743]: I1011 00:55:56.103407 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e405bfea-f17b-4c2b-b69e-fc7284876cdc" path="/var/lib/kubelet/pods/e405bfea-f17b-4c2b-b69e-fc7284876cdc/volumes" Oct 11 00:55:56 crc kubenswrapper[4743]: I1011 00:55:56.104453 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea3117ba-711a-4055-bf89-d405d05a0154" path="/var/lib/kubelet/pods/ea3117ba-711a-4055-bf89-d405d05a0154/volumes" Oct 11 00:55:56 crc kubenswrapper[4743]: I1011 00:55:56.131564 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-577544f748-kr9zv"] Oct 11 00:55:56 crc kubenswrapper[4743]: I1011 00:55:56.649006 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-577544f748-kr9zv" event={"ID":"7c1614de-3cce-4b11-b6d2-9e1a799e7b7e","Type":"ContainerStarted","Data":"ccf135d754df3a30c382ffee5c6860d4182648a1236fecb84755c4b6d0f5020b"} Oct 11 00:55:56 crc kubenswrapper[4743]: I1011 00:55:56.649322 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-577544f748-kr9zv" Oct 11 00:55:56 crc kubenswrapper[4743]: I1011 00:55:56.649340 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-577544f748-kr9zv" event={"ID":"7c1614de-3cce-4b11-b6d2-9e1a799e7b7e","Type":"ContainerStarted","Data":"ebef2f24a2c84ed2ae4eacf267529079f868f262ed3a17e487a30d2bf01356cc"} Oct 11 00:55:56 crc kubenswrapper[4743]: I1011 00:55:56.651201 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-569fd8fb86-lj6x8" event={"ID":"d19be4d9-f731-420a-870d-90dfc05e487e","Type":"ContainerStarted","Data":"9cda1d580be3df25989d93a09506cf7d9f1c053eb504b9c52895a0d0baf4a6b4"} Oct 11 00:55:56 crc kubenswrapper[4743]: I1011 00:55:56.651276 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-569fd8fb86-lj6x8" event={"ID":"d19be4d9-f731-420a-870d-90dfc05e487e","Type":"ContainerStarted","Data":"9870fe5f8a9f86837e5775a4381e9166aebb7091cf9dc3c4224c5990f91b7893"} Oct 11 00:55:56 crc kubenswrapper[4743]: I1011 00:55:56.651464 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-569fd8fb86-lj6x8" Oct 11 00:55:56 crc kubenswrapper[4743]: I1011 00:55:56.652734 4743 generic.go:334] "Generic (PLEG): container finished" podID="8b619db5-930f-4298-9d1d-2c74a9e60783" containerID="4c798ed080b4fd559646a6d9a02f4f977a4c18c5596b89cd20e384e98a34f169" exitCode=0 Oct 11 00:55:56 crc kubenswrapper[4743]: I1011 00:55:56.653764 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l6gl2" event={"ID":"8b619db5-930f-4298-9d1d-2c74a9e60783","Type":"ContainerDied","Data":"4c798ed080b4fd559646a6d9a02f4f977a4c18c5596b89cd20e384e98a34f169"} Oct 11 00:55:56 crc kubenswrapper[4743]: I1011 00:55:56.653807 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l6gl2" 
event={"ID":"8b619db5-930f-4298-9d1d-2c74a9e60783","Type":"ContainerStarted","Data":"d50f0ce5ded39068a26834756f830df713033aa8eec140091c0da2fea9b5b6db"} Oct 11 00:55:56 crc kubenswrapper[4743]: I1011 00:55:56.661952 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-569fd8fb86-lj6x8" Oct 11 00:55:56 crc kubenswrapper[4743]: I1011 00:55:56.667269 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-577544f748-kr9zv" podStartSLOduration=2.667251811 podStartE2EDuration="2.667251811s" podCreationTimestamp="2025-10-11 00:55:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 00:55:56.665008413 +0000 UTC m=+251.317988820" watchObservedRunningTime="2025-10-11 00:55:56.667251811 +0000 UTC m=+251.320232208" Oct 11 00:55:56 crc kubenswrapper[4743]: I1011 00:55:56.698699 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-569fd8fb86-lj6x8" podStartSLOduration=2.698679048 podStartE2EDuration="2.698679048s" podCreationTimestamp="2025-10-11 00:55:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 00:55:56.682140248 +0000 UTC m=+251.335120655" watchObservedRunningTime="2025-10-11 00:55:56.698679048 +0000 UTC m=+251.351659455" Oct 11 00:55:56 crc kubenswrapper[4743]: I1011 00:55:56.754378 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-577544f748-kr9zv" Oct 11 00:55:57 crc kubenswrapper[4743]: I1011 00:55:57.057163 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wqwgk"] Oct 11 00:55:57 crc kubenswrapper[4743]: I1011 
00:55:57.058458 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wqwgk" Oct 11 00:55:57 crc kubenswrapper[4743]: I1011 00:55:57.062352 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 11 00:55:57 crc kubenswrapper[4743]: I1011 00:55:57.070304 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wqwgk"] Oct 11 00:55:57 crc kubenswrapper[4743]: I1011 00:55:57.106926 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x57bp\" (UniqueName: \"kubernetes.io/projected/3543dfa9-ce3b-48b3-bc13-70ade6294a3b-kube-api-access-x57bp\") pod \"certified-operators-wqwgk\" (UID: \"3543dfa9-ce3b-48b3-bc13-70ade6294a3b\") " pod="openshift-marketplace/certified-operators-wqwgk" Oct 11 00:55:57 crc kubenswrapper[4743]: I1011 00:55:57.106960 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3543dfa9-ce3b-48b3-bc13-70ade6294a3b-catalog-content\") pod \"certified-operators-wqwgk\" (UID: \"3543dfa9-ce3b-48b3-bc13-70ade6294a3b\") " pod="openshift-marketplace/certified-operators-wqwgk" Oct 11 00:55:57 crc kubenswrapper[4743]: I1011 00:55:57.106992 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3543dfa9-ce3b-48b3-bc13-70ade6294a3b-utilities\") pod \"certified-operators-wqwgk\" (UID: \"3543dfa9-ce3b-48b3-bc13-70ade6294a3b\") " pod="openshift-marketplace/certified-operators-wqwgk" Oct 11 00:55:57 crc kubenswrapper[4743]: I1011 00:55:57.208185 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x57bp\" (UniqueName: 
\"kubernetes.io/projected/3543dfa9-ce3b-48b3-bc13-70ade6294a3b-kube-api-access-x57bp\") pod \"certified-operators-wqwgk\" (UID: \"3543dfa9-ce3b-48b3-bc13-70ade6294a3b\") " pod="openshift-marketplace/certified-operators-wqwgk" Oct 11 00:55:57 crc kubenswrapper[4743]: I1011 00:55:57.208234 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3543dfa9-ce3b-48b3-bc13-70ade6294a3b-catalog-content\") pod \"certified-operators-wqwgk\" (UID: \"3543dfa9-ce3b-48b3-bc13-70ade6294a3b\") " pod="openshift-marketplace/certified-operators-wqwgk" Oct 11 00:55:57 crc kubenswrapper[4743]: I1011 00:55:57.208282 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3543dfa9-ce3b-48b3-bc13-70ade6294a3b-utilities\") pod \"certified-operators-wqwgk\" (UID: \"3543dfa9-ce3b-48b3-bc13-70ade6294a3b\") " pod="openshift-marketplace/certified-operators-wqwgk" Oct 11 00:55:57 crc kubenswrapper[4743]: I1011 00:55:57.208817 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3543dfa9-ce3b-48b3-bc13-70ade6294a3b-catalog-content\") pod \"certified-operators-wqwgk\" (UID: \"3543dfa9-ce3b-48b3-bc13-70ade6294a3b\") " pod="openshift-marketplace/certified-operators-wqwgk" Oct 11 00:55:57 crc kubenswrapper[4743]: I1011 00:55:57.208842 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3543dfa9-ce3b-48b3-bc13-70ade6294a3b-utilities\") pod \"certified-operators-wqwgk\" (UID: \"3543dfa9-ce3b-48b3-bc13-70ade6294a3b\") " pod="openshift-marketplace/certified-operators-wqwgk" Oct 11 00:55:57 crc kubenswrapper[4743]: I1011 00:55:57.235226 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x57bp\" (UniqueName: 
\"kubernetes.io/projected/3543dfa9-ce3b-48b3-bc13-70ade6294a3b-kube-api-access-x57bp\") pod \"certified-operators-wqwgk\" (UID: \"3543dfa9-ce3b-48b3-bc13-70ade6294a3b\") " pod="openshift-marketplace/certified-operators-wqwgk" Oct 11 00:55:57 crc kubenswrapper[4743]: I1011 00:55:57.399127 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wqwgk" Oct 11 00:55:57 crc kubenswrapper[4743]: I1011 00:55:57.658307 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vs4lk"] Oct 11 00:55:57 crc kubenswrapper[4743]: I1011 00:55:57.659652 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vs4lk" Oct 11 00:55:57 crc kubenswrapper[4743]: I1011 00:55:57.663332 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 11 00:55:57 crc kubenswrapper[4743]: I1011 00:55:57.667720 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l6gl2" event={"ID":"8b619db5-930f-4298-9d1d-2c74a9e60783","Type":"ContainerStarted","Data":"55fe919f7d51ac77d5329f5609be168140365796147ddf79e37127329b83ac20"} Oct 11 00:55:57 crc kubenswrapper[4743]: I1011 00:55:57.672695 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vs4lk"] Oct 11 00:55:57 crc kubenswrapper[4743]: I1011 00:55:57.754556 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6061b50-92be-4942-b961-a094b28b50a9-catalog-content\") pod \"redhat-operators-vs4lk\" (UID: \"a6061b50-92be-4942-b961-a094b28b50a9\") " pod="openshift-marketplace/redhat-operators-vs4lk" Oct 11 00:55:57 crc kubenswrapper[4743]: I1011 00:55:57.754740 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6061b50-92be-4942-b961-a094b28b50a9-utilities\") pod \"redhat-operators-vs4lk\" (UID: \"a6061b50-92be-4942-b961-a094b28b50a9\") " pod="openshift-marketplace/redhat-operators-vs4lk" Oct 11 00:55:57 crc kubenswrapper[4743]: I1011 00:55:57.754791 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7s4b8\" (UniqueName: \"kubernetes.io/projected/a6061b50-92be-4942-b961-a094b28b50a9-kube-api-access-7s4b8\") pod \"redhat-operators-vs4lk\" (UID: \"a6061b50-92be-4942-b961-a094b28b50a9\") " pod="openshift-marketplace/redhat-operators-vs4lk" Oct 11 00:55:57 crc kubenswrapper[4743]: I1011 00:55:57.855916 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6061b50-92be-4942-b961-a094b28b50a9-catalog-content\") pod \"redhat-operators-vs4lk\" (UID: \"a6061b50-92be-4942-b961-a094b28b50a9\") " pod="openshift-marketplace/redhat-operators-vs4lk" Oct 11 00:55:57 crc kubenswrapper[4743]: I1011 00:55:57.855996 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6061b50-92be-4942-b961-a094b28b50a9-utilities\") pod \"redhat-operators-vs4lk\" (UID: \"a6061b50-92be-4942-b961-a094b28b50a9\") " pod="openshift-marketplace/redhat-operators-vs4lk" Oct 11 00:55:57 crc kubenswrapper[4743]: I1011 00:55:57.856025 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7s4b8\" (UniqueName: \"kubernetes.io/projected/a6061b50-92be-4942-b961-a094b28b50a9-kube-api-access-7s4b8\") pod \"redhat-operators-vs4lk\" (UID: \"a6061b50-92be-4942-b961-a094b28b50a9\") " pod="openshift-marketplace/redhat-operators-vs4lk" Oct 11 00:55:57 crc kubenswrapper[4743]: I1011 00:55:57.856755 4743 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6061b50-92be-4942-b961-a094b28b50a9-catalog-content\") pod \"redhat-operators-vs4lk\" (UID: \"a6061b50-92be-4942-b961-a094b28b50a9\") " pod="openshift-marketplace/redhat-operators-vs4lk" Oct 11 00:55:57 crc kubenswrapper[4743]: I1011 00:55:57.857078 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6061b50-92be-4942-b961-a094b28b50a9-utilities\") pod \"redhat-operators-vs4lk\" (UID: \"a6061b50-92be-4942-b961-a094b28b50a9\") " pod="openshift-marketplace/redhat-operators-vs4lk" Oct 11 00:55:57 crc kubenswrapper[4743]: I1011 00:55:57.872142 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wqwgk"] Oct 11 00:55:57 crc kubenswrapper[4743]: W1011 00:55:57.889978 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3543dfa9_ce3b_48b3_bc13_70ade6294a3b.slice/crio-2fc358adc6b3c45cfbc777f7311c53ff6d9ed9bd6ec6127e301f343138a3bcd7 WatchSource:0}: Error finding container 2fc358adc6b3c45cfbc777f7311c53ff6d9ed9bd6ec6127e301f343138a3bcd7: Status 404 returned error can't find the container with id 2fc358adc6b3c45cfbc777f7311c53ff6d9ed9bd6ec6127e301f343138a3bcd7 Oct 11 00:55:57 crc kubenswrapper[4743]: I1011 00:55:57.892102 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7s4b8\" (UniqueName: \"kubernetes.io/projected/a6061b50-92be-4942-b961-a094b28b50a9-kube-api-access-7s4b8\") pod \"redhat-operators-vs4lk\" (UID: \"a6061b50-92be-4942-b961-a094b28b50a9\") " pod="openshift-marketplace/redhat-operators-vs4lk" Oct 11 00:55:57 crc kubenswrapper[4743]: I1011 00:55:57.988569 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vs4lk" Oct 11 00:55:58 crc kubenswrapper[4743]: I1011 00:55:58.406660 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vs4lk"] Oct 11 00:55:58 crc kubenswrapper[4743]: W1011 00:55:58.417572 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6061b50_92be_4942_b961_a094b28b50a9.slice/crio-a48c716c14bb8d15ad90d2c52fba7004e44fa7c6a52e659e15cf500c32e3706b WatchSource:0}: Error finding container a48c716c14bb8d15ad90d2c52fba7004e44fa7c6a52e659e15cf500c32e3706b: Status 404 returned error can't find the container with id a48c716c14bb8d15ad90d2c52fba7004e44fa7c6a52e659e15cf500c32e3706b Oct 11 00:55:58 crc kubenswrapper[4743]: I1011 00:55:58.675383 4743 generic.go:334] "Generic (PLEG): container finished" podID="8b619db5-930f-4298-9d1d-2c74a9e60783" containerID="55fe919f7d51ac77d5329f5609be168140365796147ddf79e37127329b83ac20" exitCode=0 Oct 11 00:55:58 crc kubenswrapper[4743]: I1011 00:55:58.675660 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l6gl2" event={"ID":"8b619db5-930f-4298-9d1d-2c74a9e60783","Type":"ContainerDied","Data":"55fe919f7d51ac77d5329f5609be168140365796147ddf79e37127329b83ac20"} Oct 11 00:55:58 crc kubenswrapper[4743]: I1011 00:55:58.677081 4743 generic.go:334] "Generic (PLEG): container finished" podID="3543dfa9-ce3b-48b3-bc13-70ade6294a3b" containerID="234a8bf1a9d6746699b6a929c885f0a4eb7492734ef77ddf41aa0622665d6060" exitCode=0 Oct 11 00:55:58 crc kubenswrapper[4743]: I1011 00:55:58.677118 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wqwgk" event={"ID":"3543dfa9-ce3b-48b3-bc13-70ade6294a3b","Type":"ContainerDied","Data":"234a8bf1a9d6746699b6a929c885f0a4eb7492734ef77ddf41aa0622665d6060"} Oct 11 00:55:58 crc kubenswrapper[4743]: I1011 
00:55:58.677174 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wqwgk" event={"ID":"3543dfa9-ce3b-48b3-bc13-70ade6294a3b","Type":"ContainerStarted","Data":"2fc358adc6b3c45cfbc777f7311c53ff6d9ed9bd6ec6127e301f343138a3bcd7"} Oct 11 00:55:58 crc kubenswrapper[4743]: I1011 00:55:58.678581 4743 generic.go:334] "Generic (PLEG): container finished" podID="a6061b50-92be-4942-b961-a094b28b50a9" containerID="302a1880f75a09e0d6bdfff0930d7692ace8677144ffb2314547814e11ada7ca" exitCode=0 Oct 11 00:55:58 crc kubenswrapper[4743]: I1011 00:55:58.680391 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vs4lk" event={"ID":"a6061b50-92be-4942-b961-a094b28b50a9","Type":"ContainerDied","Data":"302a1880f75a09e0d6bdfff0930d7692ace8677144ffb2314547814e11ada7ca"} Oct 11 00:55:58 crc kubenswrapper[4743]: I1011 00:55:58.680461 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vs4lk" event={"ID":"a6061b50-92be-4942-b961-a094b28b50a9","Type":"ContainerStarted","Data":"a48c716c14bb8d15ad90d2c52fba7004e44fa7c6a52e659e15cf500c32e3706b"} Oct 11 00:55:59 crc kubenswrapper[4743]: I1011 00:55:59.462021 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ttsr5"] Oct 11 00:55:59 crc kubenswrapper[4743]: I1011 00:55:59.464519 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ttsr5" Oct 11 00:55:59 crc kubenswrapper[4743]: I1011 00:55:59.469181 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 11 00:55:59 crc kubenswrapper[4743]: I1011 00:55:59.477227 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ttsr5"] Oct 11 00:55:59 crc kubenswrapper[4743]: I1011 00:55:59.578939 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bd2c6b7-4918-4cb2-abcd-efa1523befe0-utilities\") pod \"redhat-marketplace-ttsr5\" (UID: \"3bd2c6b7-4918-4cb2-abcd-efa1523befe0\") " pod="openshift-marketplace/redhat-marketplace-ttsr5" Oct 11 00:55:59 crc kubenswrapper[4743]: I1011 00:55:59.579005 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ct7nn\" (UniqueName: \"kubernetes.io/projected/3bd2c6b7-4918-4cb2-abcd-efa1523befe0-kube-api-access-ct7nn\") pod \"redhat-marketplace-ttsr5\" (UID: \"3bd2c6b7-4918-4cb2-abcd-efa1523befe0\") " pod="openshift-marketplace/redhat-marketplace-ttsr5" Oct 11 00:55:59 crc kubenswrapper[4743]: I1011 00:55:59.579036 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bd2c6b7-4918-4cb2-abcd-efa1523befe0-catalog-content\") pod \"redhat-marketplace-ttsr5\" (UID: \"3bd2c6b7-4918-4cb2-abcd-efa1523befe0\") " pod="openshift-marketplace/redhat-marketplace-ttsr5" Oct 11 00:55:59 crc kubenswrapper[4743]: I1011 00:55:59.680059 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bd2c6b7-4918-4cb2-abcd-efa1523befe0-utilities\") pod \"redhat-marketplace-ttsr5\" (UID: 
\"3bd2c6b7-4918-4cb2-abcd-efa1523befe0\") " pod="openshift-marketplace/redhat-marketplace-ttsr5" Oct 11 00:55:59 crc kubenswrapper[4743]: I1011 00:55:59.680164 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ct7nn\" (UniqueName: \"kubernetes.io/projected/3bd2c6b7-4918-4cb2-abcd-efa1523befe0-kube-api-access-ct7nn\") pod \"redhat-marketplace-ttsr5\" (UID: \"3bd2c6b7-4918-4cb2-abcd-efa1523befe0\") " pod="openshift-marketplace/redhat-marketplace-ttsr5" Oct 11 00:55:59 crc kubenswrapper[4743]: I1011 00:55:59.680204 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bd2c6b7-4918-4cb2-abcd-efa1523befe0-catalog-content\") pod \"redhat-marketplace-ttsr5\" (UID: \"3bd2c6b7-4918-4cb2-abcd-efa1523befe0\") " pod="openshift-marketplace/redhat-marketplace-ttsr5" Oct 11 00:55:59 crc kubenswrapper[4743]: I1011 00:55:59.680588 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bd2c6b7-4918-4cb2-abcd-efa1523befe0-utilities\") pod \"redhat-marketplace-ttsr5\" (UID: \"3bd2c6b7-4918-4cb2-abcd-efa1523befe0\") " pod="openshift-marketplace/redhat-marketplace-ttsr5" Oct 11 00:55:59 crc kubenswrapper[4743]: I1011 00:55:59.680745 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bd2c6b7-4918-4cb2-abcd-efa1523befe0-catalog-content\") pod \"redhat-marketplace-ttsr5\" (UID: \"3bd2c6b7-4918-4cb2-abcd-efa1523befe0\") " pod="openshift-marketplace/redhat-marketplace-ttsr5" Oct 11 00:55:59 crc kubenswrapper[4743]: I1011 00:55:59.687362 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l6gl2" event={"ID":"8b619db5-930f-4298-9d1d-2c74a9e60783","Type":"ContainerStarted","Data":"0dc430a7a83126a44b4f105ff5fd6d1d94fe4aa10d65255587c57dad98bbb7f6"} Oct 11 
00:55:59 crc kubenswrapper[4743]: I1011 00:55:59.689379 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wqwgk" event={"ID":"3543dfa9-ce3b-48b3-bc13-70ade6294a3b","Type":"ContainerStarted","Data":"d385cb60448624ec5eca1eef01fe1ea6ac511725767568272a25f92bf0bfbb88"} Oct 11 00:55:59 crc kubenswrapper[4743]: I1011 00:55:59.691266 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vs4lk" event={"ID":"a6061b50-92be-4942-b961-a094b28b50a9","Type":"ContainerStarted","Data":"acb0cbae4e0ad2b700512171c3aec09e9e1c754de0ffcacba78407a18acf905e"} Oct 11 00:55:59 crc kubenswrapper[4743]: I1011 00:55:59.702946 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ct7nn\" (UniqueName: \"kubernetes.io/projected/3bd2c6b7-4918-4cb2-abcd-efa1523befe0-kube-api-access-ct7nn\") pod \"redhat-marketplace-ttsr5\" (UID: \"3bd2c6b7-4918-4cb2-abcd-efa1523befe0\") " pod="openshift-marketplace/redhat-marketplace-ttsr5" Oct 11 00:55:59 crc kubenswrapper[4743]: I1011 00:55:59.704351 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-l6gl2" podStartSLOduration=2.137035151 podStartE2EDuration="4.704336364s" podCreationTimestamp="2025-10-11 00:55:55 +0000 UTC" firstStartedPulling="2025-10-11 00:55:56.654759377 +0000 UTC m=+251.307739774" lastFinishedPulling="2025-10-11 00:55:59.22206057 +0000 UTC m=+253.875040987" observedRunningTime="2025-10-11 00:55:59.701546102 +0000 UTC m=+254.354526509" watchObservedRunningTime="2025-10-11 00:55:59.704336364 +0000 UTC m=+254.357316771" Oct 11 00:55:59 crc kubenswrapper[4743]: I1011 00:55:59.788640 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ttsr5" Oct 11 00:56:00 crc kubenswrapper[4743]: I1011 00:56:00.194999 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ttsr5"] Oct 11 00:56:00 crc kubenswrapper[4743]: W1011 00:56:00.200055 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3bd2c6b7_4918_4cb2_abcd_efa1523befe0.slice/crio-fecb0a0ee3803fd6d327b91df054e53e7ce9d5498b2f2a4cb21487bc2f9f4824 WatchSource:0}: Error finding container fecb0a0ee3803fd6d327b91df054e53e7ce9d5498b2f2a4cb21487bc2f9f4824: Status 404 returned error can't find the container with id fecb0a0ee3803fd6d327b91df054e53e7ce9d5498b2f2a4cb21487bc2f9f4824 Oct 11 00:56:00 crc kubenswrapper[4743]: I1011 00:56:00.699512 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ttsr5" event={"ID":"3bd2c6b7-4918-4cb2-abcd-efa1523befe0","Type":"ContainerStarted","Data":"fecb0a0ee3803fd6d327b91df054e53e7ce9d5498b2f2a4cb21487bc2f9f4824"} Oct 11 00:56:00 crc kubenswrapper[4743]: I1011 00:56:00.704767 4743 generic.go:334] "Generic (PLEG): container finished" podID="3543dfa9-ce3b-48b3-bc13-70ade6294a3b" containerID="d385cb60448624ec5eca1eef01fe1ea6ac511725767568272a25f92bf0bfbb88" exitCode=0 Oct 11 00:56:00 crc kubenswrapper[4743]: I1011 00:56:00.704823 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wqwgk" event={"ID":"3543dfa9-ce3b-48b3-bc13-70ade6294a3b","Type":"ContainerDied","Data":"d385cb60448624ec5eca1eef01fe1ea6ac511725767568272a25f92bf0bfbb88"} Oct 11 00:56:00 crc kubenswrapper[4743]: I1011 00:56:00.706845 4743 generic.go:334] "Generic (PLEG): container finished" podID="a6061b50-92be-4942-b961-a094b28b50a9" containerID="acb0cbae4e0ad2b700512171c3aec09e9e1c754de0ffcacba78407a18acf905e" exitCode=0 Oct 11 00:56:00 crc kubenswrapper[4743]: I1011 
00:56:00.707579 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vs4lk" event={"ID":"a6061b50-92be-4942-b961-a094b28b50a9","Type":"ContainerDied","Data":"acb0cbae4e0ad2b700512171c3aec09e9e1c754de0ffcacba78407a18acf905e"} Oct 11 00:56:01 crc kubenswrapper[4743]: I1011 00:56:01.720223 4743 generic.go:334] "Generic (PLEG): container finished" podID="3bd2c6b7-4918-4cb2-abcd-efa1523befe0" containerID="e81f89ad2361711d114f2bafc30be170fe1e28428cf6745ace53211b1424b4a2" exitCode=0 Oct 11 00:56:01 crc kubenswrapper[4743]: I1011 00:56:01.720841 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ttsr5" event={"ID":"3bd2c6b7-4918-4cb2-abcd-efa1523befe0","Type":"ContainerDied","Data":"e81f89ad2361711d114f2bafc30be170fe1e28428cf6745ace53211b1424b4a2"} Oct 11 00:56:01 crc kubenswrapper[4743]: I1011 00:56:01.733017 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wqwgk" event={"ID":"3543dfa9-ce3b-48b3-bc13-70ade6294a3b","Type":"ContainerStarted","Data":"eb2a5ce1c56f499fe074bb4d984f5f8d3c18a986e5f4a2787a470d37d6cbbf8f"} Oct 11 00:56:01 crc kubenswrapper[4743]: I1011 00:56:01.735028 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vs4lk" event={"ID":"a6061b50-92be-4942-b961-a094b28b50a9","Type":"ContainerStarted","Data":"adc6955c5027ef1ff03e5efcc8d50bbe43ee21bd50e436d4178b54a15b0ee9f7"} Oct 11 00:56:01 crc kubenswrapper[4743]: I1011 00:56:01.756538 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wqwgk" podStartSLOduration=2.094419893 podStartE2EDuration="4.75652154s" podCreationTimestamp="2025-10-11 00:55:57 +0000 UTC" firstStartedPulling="2025-10-11 00:55:58.678677687 +0000 UTC m=+253.331658084" lastFinishedPulling="2025-10-11 00:56:01.340779334 +0000 UTC m=+255.993759731" observedRunningTime="2025-10-11 
00:56:01.756286433 +0000 UTC m=+256.409266830" watchObservedRunningTime="2025-10-11 00:56:01.75652154 +0000 UTC m=+256.409501937" Oct 11 00:56:02 crc kubenswrapper[4743]: I1011 00:56:02.739678 4743 generic.go:334] "Generic (PLEG): container finished" podID="3bd2c6b7-4918-4cb2-abcd-efa1523befe0" containerID="271bddcb48291c9f7cb50d5fff49f96fd33e0661ee377b0976948e3554cbd80f" exitCode=0 Oct 11 00:56:02 crc kubenswrapper[4743]: I1011 00:56:02.740797 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ttsr5" event={"ID":"3bd2c6b7-4918-4cb2-abcd-efa1523befe0","Type":"ContainerDied","Data":"271bddcb48291c9f7cb50d5fff49f96fd33e0661ee377b0976948e3554cbd80f"} Oct 11 00:56:02 crc kubenswrapper[4743]: I1011 00:56:02.765686 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vs4lk" podStartSLOduration=3.169493083 podStartE2EDuration="5.765667757s" podCreationTimestamp="2025-10-11 00:55:57 +0000 UTC" firstStartedPulling="2025-10-11 00:55:58.679840077 +0000 UTC m=+253.332820464" lastFinishedPulling="2025-10-11 00:56:01.276014741 +0000 UTC m=+255.928995138" observedRunningTime="2025-10-11 00:56:01.774044405 +0000 UTC m=+256.427024822" watchObservedRunningTime="2025-10-11 00:56:02.765667757 +0000 UTC m=+257.418648154" Oct 11 00:56:04 crc kubenswrapper[4743]: I1011 00:56:04.752340 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ttsr5" event={"ID":"3bd2c6b7-4918-4cb2-abcd-efa1523befe0","Type":"ContainerStarted","Data":"0e972d77f6652100b68a4b3bdd61bee63e988f96e8d7e61d6a5c5ad7d8ce2cd2"} Oct 11 00:56:04 crc kubenswrapper[4743]: I1011 00:56:04.767071 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ttsr5" podStartSLOduration=3.461923332 podStartE2EDuration="5.767053832s" podCreationTimestamp="2025-10-11 00:55:59 +0000 UTC" firstStartedPulling="2025-10-11 
00:56:01.721764176 +0000 UTC m=+256.374744573" lastFinishedPulling="2025-10-11 00:56:04.026894666 +0000 UTC m=+258.679875073" observedRunningTime="2025-10-11 00:56:04.765376709 +0000 UTC m=+259.418357126" watchObservedRunningTime="2025-10-11 00:56:04.767053832 +0000 UTC m=+259.420034229" Oct 11 00:56:05 crc kubenswrapper[4743]: I1011 00:56:05.592648 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-l6gl2" Oct 11 00:56:05 crc kubenswrapper[4743]: I1011 00:56:05.593625 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-l6gl2" Oct 11 00:56:05 crc kubenswrapper[4743]: I1011 00:56:05.636660 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-l6gl2" Oct 11 00:56:05 crc kubenswrapper[4743]: I1011 00:56:05.829527 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-l6gl2" Oct 11 00:56:07 crc kubenswrapper[4743]: I1011 00:56:07.399607 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wqwgk" Oct 11 00:56:07 crc kubenswrapper[4743]: I1011 00:56:07.399913 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wqwgk" Oct 11 00:56:07 crc kubenswrapper[4743]: I1011 00:56:07.441071 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wqwgk" Oct 11 00:56:07 crc kubenswrapper[4743]: I1011 00:56:07.822933 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wqwgk" Oct 11 00:56:07 crc kubenswrapper[4743]: I1011 00:56:07.989906 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vs4lk" Oct 
11 00:56:07 crc kubenswrapper[4743]: I1011 00:56:07.989964 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vs4lk" Oct 11 00:56:08 crc kubenswrapper[4743]: I1011 00:56:08.042504 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vs4lk" Oct 11 00:56:08 crc kubenswrapper[4743]: I1011 00:56:08.827438 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vs4lk" Oct 11 00:56:09 crc kubenswrapper[4743]: I1011 00:56:09.789733 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ttsr5" Oct 11 00:56:09 crc kubenswrapper[4743]: I1011 00:56:09.790140 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ttsr5" Oct 11 00:56:09 crc kubenswrapper[4743]: I1011 00:56:09.854998 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ttsr5" Oct 11 00:56:10 crc kubenswrapper[4743]: I1011 00:56:10.843515 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ttsr5" Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.161776 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-sjtsw" podUID="284b66a9-09b0-4bc0-bc8a-5bd32f06d088" containerName="oauth-openshift" containerID="cri-o://56a42a64696f571fb3083137e133846b8b1b3a756e6f826d4684e9dd64cf8b99" gracePeriod=15 Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.609523 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-sjtsw" Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.638986 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-58d58b5989-vrhmd"] Oct 11 00:56:18 crc kubenswrapper[4743]: E1011 00:56:18.639168 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="284b66a9-09b0-4bc0-bc8a-5bd32f06d088" containerName="oauth-openshift" Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.639180 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="284b66a9-09b0-4bc0-bc8a-5bd32f06d088" containerName="oauth-openshift" Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.639274 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="284b66a9-09b0-4bc0-bc8a-5bd32f06d088" containerName="oauth-openshift" Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.639609 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-58d58b5989-vrhmd" Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.655742 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-58d58b5989-vrhmd"] Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.728590 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/284b66a9-09b0-4bc0-bc8a-5bd32f06d088-v4-0-config-system-session\") pod \"284b66a9-09b0-4bc0-bc8a-5bd32f06d088\" (UID: \"284b66a9-09b0-4bc0-bc8a-5bd32f06d088\") " Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.728636 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/284b66a9-09b0-4bc0-bc8a-5bd32f06d088-audit-policies\") pod \"284b66a9-09b0-4bc0-bc8a-5bd32f06d088\" (UID: \"284b66a9-09b0-4bc0-bc8a-5bd32f06d088\") " Oct 11 00:56:18 
crc kubenswrapper[4743]: I1011 00:56:18.728660 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/284b66a9-09b0-4bc0-bc8a-5bd32f06d088-v4-0-config-user-idp-0-file-data\") pod \"284b66a9-09b0-4bc0-bc8a-5bd32f06d088\" (UID: \"284b66a9-09b0-4bc0-bc8a-5bd32f06d088\") " Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.728680 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/284b66a9-09b0-4bc0-bc8a-5bd32f06d088-v4-0-config-system-router-certs\") pod \"284b66a9-09b0-4bc0-bc8a-5bd32f06d088\" (UID: \"284b66a9-09b0-4bc0-bc8a-5bd32f06d088\") " Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.728708 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/284b66a9-09b0-4bc0-bc8a-5bd32f06d088-v4-0-config-user-template-error\") pod \"284b66a9-09b0-4bc0-bc8a-5bd32f06d088\" (UID: \"284b66a9-09b0-4bc0-bc8a-5bd32f06d088\") " Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.728723 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4pttq\" (UniqueName: \"kubernetes.io/projected/284b66a9-09b0-4bc0-bc8a-5bd32f06d088-kube-api-access-4pttq\") pod \"284b66a9-09b0-4bc0-bc8a-5bd32f06d088\" (UID: \"284b66a9-09b0-4bc0-bc8a-5bd32f06d088\") " Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.728753 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/284b66a9-09b0-4bc0-bc8a-5bd32f06d088-v4-0-config-system-trusted-ca-bundle\") pod \"284b66a9-09b0-4bc0-bc8a-5bd32f06d088\" (UID: \"284b66a9-09b0-4bc0-bc8a-5bd32f06d088\") " Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.728778 4743 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/284b66a9-09b0-4bc0-bc8a-5bd32f06d088-audit-dir\") pod \"284b66a9-09b0-4bc0-bc8a-5bd32f06d088\" (UID: \"284b66a9-09b0-4bc0-bc8a-5bd32f06d088\") " Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.728802 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/284b66a9-09b0-4bc0-bc8a-5bd32f06d088-v4-0-config-user-template-provider-selection\") pod \"284b66a9-09b0-4bc0-bc8a-5bd32f06d088\" (UID: \"284b66a9-09b0-4bc0-bc8a-5bd32f06d088\") " Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.728827 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/284b66a9-09b0-4bc0-bc8a-5bd32f06d088-v4-0-config-system-cliconfig\") pod \"284b66a9-09b0-4bc0-bc8a-5bd32f06d088\" (UID: \"284b66a9-09b0-4bc0-bc8a-5bd32f06d088\") " Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.728870 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/284b66a9-09b0-4bc0-bc8a-5bd32f06d088-v4-0-config-system-ocp-branding-template\") pod \"284b66a9-09b0-4bc0-bc8a-5bd32f06d088\" (UID: \"284b66a9-09b0-4bc0-bc8a-5bd32f06d088\") " Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.728902 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/284b66a9-09b0-4bc0-bc8a-5bd32f06d088-v4-0-config-system-serving-cert\") pod \"284b66a9-09b0-4bc0-bc8a-5bd32f06d088\" (UID: \"284b66a9-09b0-4bc0-bc8a-5bd32f06d088\") " Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.728922 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/284b66a9-09b0-4bc0-bc8a-5bd32f06d088-v4-0-config-user-template-login\") pod \"284b66a9-09b0-4bc0-bc8a-5bd32f06d088\" (UID: \"284b66a9-09b0-4bc0-bc8a-5bd32f06d088\") " Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.728955 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/284b66a9-09b0-4bc0-bc8a-5bd32f06d088-v4-0-config-system-service-ca\") pod \"284b66a9-09b0-4bc0-bc8a-5bd32f06d088\" (UID: \"284b66a9-09b0-4bc0-bc8a-5bd32f06d088\") " Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.729904 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/284b66a9-09b0-4bc0-bc8a-5bd32f06d088-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "284b66a9-09b0-4bc0-bc8a-5bd32f06d088" (UID: "284b66a9-09b0-4bc0-bc8a-5bd32f06d088"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.730553 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/284b66a9-09b0-4bc0-bc8a-5bd32f06d088-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "284b66a9-09b0-4bc0-bc8a-5bd32f06d088" (UID: "284b66a9-09b0-4bc0-bc8a-5bd32f06d088"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.731366 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/284b66a9-09b0-4bc0-bc8a-5bd32f06d088-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "284b66a9-09b0-4bc0-bc8a-5bd32f06d088" (UID: "284b66a9-09b0-4bc0-bc8a-5bd32f06d088"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.731384 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/284b66a9-09b0-4bc0-bc8a-5bd32f06d088-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "284b66a9-09b0-4bc0-bc8a-5bd32f06d088" (UID: "284b66a9-09b0-4bc0-bc8a-5bd32f06d088"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.731557 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/284b66a9-09b0-4bc0-bc8a-5bd32f06d088-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "284b66a9-09b0-4bc0-bc8a-5bd32f06d088" (UID: "284b66a9-09b0-4bc0-bc8a-5bd32f06d088"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.738901 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/284b66a9-09b0-4bc0-bc8a-5bd32f06d088-kube-api-access-4pttq" (OuterVolumeSpecName: "kube-api-access-4pttq") pod "284b66a9-09b0-4bc0-bc8a-5bd32f06d088" (UID: "284b66a9-09b0-4bc0-bc8a-5bd32f06d088"). InnerVolumeSpecName "kube-api-access-4pttq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.738929 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/284b66a9-09b0-4bc0-bc8a-5bd32f06d088-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "284b66a9-09b0-4bc0-bc8a-5bd32f06d088" (UID: "284b66a9-09b0-4bc0-bc8a-5bd32f06d088"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.739251 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/284b66a9-09b0-4bc0-bc8a-5bd32f06d088-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "284b66a9-09b0-4bc0-bc8a-5bd32f06d088" (UID: "284b66a9-09b0-4bc0-bc8a-5bd32f06d088"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.740089 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/284b66a9-09b0-4bc0-bc8a-5bd32f06d088-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "284b66a9-09b0-4bc0-bc8a-5bd32f06d088" (UID: "284b66a9-09b0-4bc0-bc8a-5bd32f06d088"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.742612 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/284b66a9-09b0-4bc0-bc8a-5bd32f06d088-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "284b66a9-09b0-4bc0-bc8a-5bd32f06d088" (UID: "284b66a9-09b0-4bc0-bc8a-5bd32f06d088"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.745151 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/284b66a9-09b0-4bc0-bc8a-5bd32f06d088-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "284b66a9-09b0-4bc0-bc8a-5bd32f06d088" (UID: "284b66a9-09b0-4bc0-bc8a-5bd32f06d088"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.749262 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/284b66a9-09b0-4bc0-bc8a-5bd32f06d088-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "284b66a9-09b0-4bc0-bc8a-5bd32f06d088" (UID: "284b66a9-09b0-4bc0-bc8a-5bd32f06d088"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.749489 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/284b66a9-09b0-4bc0-bc8a-5bd32f06d088-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "284b66a9-09b0-4bc0-bc8a-5bd32f06d088" (UID: "284b66a9-09b0-4bc0-bc8a-5bd32f06d088"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.751174 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/284b66a9-09b0-4bc0-bc8a-5bd32f06d088-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "284b66a9-09b0-4bc0-bc8a-5bd32f06d088" (UID: "284b66a9-09b0-4bc0-bc8a-5bd32f06d088"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.830009 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b2e1d5e4-f3c3-47b8-832e-be8a46849f79-v4-0-config-system-router-certs\") pod \"oauth-openshift-58d58b5989-vrhmd\" (UID: \"b2e1d5e4-f3c3-47b8-832e-be8a46849f79\") " pod="openshift-authentication/oauth-openshift-58d58b5989-vrhmd" Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.830057 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b2e1d5e4-f3c3-47b8-832e-be8a46849f79-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-58d58b5989-vrhmd\" (UID: \"b2e1d5e4-f3c3-47b8-832e-be8a46849f79\") " pod="openshift-authentication/oauth-openshift-58d58b5989-vrhmd" Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.830105 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b2e1d5e4-f3c3-47b8-832e-be8a46849f79-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-58d58b5989-vrhmd\" (UID: \"b2e1d5e4-f3c3-47b8-832e-be8a46849f79\") " pod="openshift-authentication/oauth-openshift-58d58b5989-vrhmd" Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.830181 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b2e1d5e4-f3c3-47b8-832e-be8a46849f79-v4-0-config-user-template-error\") pod \"oauth-openshift-58d58b5989-vrhmd\" (UID: \"b2e1d5e4-f3c3-47b8-832e-be8a46849f79\") " pod="openshift-authentication/oauth-openshift-58d58b5989-vrhmd" Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.830203 4743 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b2e1d5e4-f3c3-47b8-832e-be8a46849f79-audit-dir\") pod \"oauth-openshift-58d58b5989-vrhmd\" (UID: \"b2e1d5e4-f3c3-47b8-832e-be8a46849f79\") " pod="openshift-authentication/oauth-openshift-58d58b5989-vrhmd" Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.830259 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b2e1d5e4-f3c3-47b8-832e-be8a46849f79-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-58d58b5989-vrhmd\" (UID: \"b2e1d5e4-f3c3-47b8-832e-be8a46849f79\") " pod="openshift-authentication/oauth-openshift-58d58b5989-vrhmd" Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.830275 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b2e1d5e4-f3c3-47b8-832e-be8a46849f79-v4-0-config-user-template-login\") pod \"oauth-openshift-58d58b5989-vrhmd\" (UID: \"b2e1d5e4-f3c3-47b8-832e-be8a46849f79\") " pod="openshift-authentication/oauth-openshift-58d58b5989-vrhmd" Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.830334 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b2e1d5e4-f3c3-47b8-832e-be8a46849f79-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-58d58b5989-vrhmd\" (UID: \"b2e1d5e4-f3c3-47b8-832e-be8a46849f79\") " pod="openshift-authentication/oauth-openshift-58d58b5989-vrhmd" Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.830355 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/b2e1d5e4-f3c3-47b8-832e-be8a46849f79-audit-policies\") pod \"oauth-openshift-58d58b5989-vrhmd\" (UID: \"b2e1d5e4-f3c3-47b8-832e-be8a46849f79\") " pod="openshift-authentication/oauth-openshift-58d58b5989-vrhmd" Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.830406 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b2e1d5e4-f3c3-47b8-832e-be8a46849f79-v4-0-config-system-session\") pod \"oauth-openshift-58d58b5989-vrhmd\" (UID: \"b2e1d5e4-f3c3-47b8-832e-be8a46849f79\") " pod="openshift-authentication/oauth-openshift-58d58b5989-vrhmd" Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.830423 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b2e1d5e4-f3c3-47b8-832e-be8a46849f79-v4-0-config-system-serving-cert\") pod \"oauth-openshift-58d58b5989-vrhmd\" (UID: \"b2e1d5e4-f3c3-47b8-832e-be8a46849f79\") " pod="openshift-authentication/oauth-openshift-58d58b5989-vrhmd" Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.830474 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcsb2\" (UniqueName: \"kubernetes.io/projected/b2e1d5e4-f3c3-47b8-832e-be8a46849f79-kube-api-access-dcsb2\") pod \"oauth-openshift-58d58b5989-vrhmd\" (UID: \"b2e1d5e4-f3c3-47b8-832e-be8a46849f79\") " pod="openshift-authentication/oauth-openshift-58d58b5989-vrhmd" Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.830493 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b2e1d5e4-f3c3-47b8-832e-be8a46849f79-v4-0-config-system-service-ca\") pod \"oauth-openshift-58d58b5989-vrhmd\" (UID: \"b2e1d5e4-f3c3-47b8-832e-be8a46849f79\") " 
pod="openshift-authentication/oauth-openshift-58d58b5989-vrhmd" Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.830513 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b2e1d5e4-f3c3-47b8-832e-be8a46849f79-v4-0-config-system-cliconfig\") pod \"oauth-openshift-58d58b5989-vrhmd\" (UID: \"b2e1d5e4-f3c3-47b8-832e-be8a46849f79\") " pod="openshift-authentication/oauth-openshift-58d58b5989-vrhmd" Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.830576 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/284b66a9-09b0-4bc0-bc8a-5bd32f06d088-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.830610 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/284b66a9-09b0-4bc0-bc8a-5bd32f06d088-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.830621 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/284b66a9-09b0-4bc0-bc8a-5bd32f06d088-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.830630 4743 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/284b66a9-09b0-4bc0-bc8a-5bd32f06d088-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.830640 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/284b66a9-09b0-4bc0-bc8a-5bd32f06d088-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 11 00:56:18 crc 
kubenswrapper[4743]: I1011 00:56:18.830649 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/284b66a9-09b0-4bc0-bc8a-5bd32f06d088-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.830657 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/284b66a9-09b0-4bc0-bc8a-5bd32f06d088-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.830666 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4pttq\" (UniqueName: \"kubernetes.io/projected/284b66a9-09b0-4bc0-bc8a-5bd32f06d088-kube-api-access-4pttq\") on node \"crc\" DevicePath \"\"" Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.830675 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/284b66a9-09b0-4bc0-bc8a-5bd32f06d088-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.830685 4743 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/284b66a9-09b0-4bc0-bc8a-5bd32f06d088-audit-dir\") on node \"crc\" DevicePath \"\"" Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.830695 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/284b66a9-09b0-4bc0-bc8a-5bd32f06d088-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.830704 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/284b66a9-09b0-4bc0-bc8a-5bd32f06d088-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.830713 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/284b66a9-09b0-4bc0-bc8a-5bd32f06d088-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.830721 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/284b66a9-09b0-4bc0-bc8a-5bd32f06d088-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.866944 4743 generic.go:334] "Generic (PLEG): container finished" podID="284b66a9-09b0-4bc0-bc8a-5bd32f06d088" containerID="56a42a64696f571fb3083137e133846b8b1b3a756e6f826d4684e9dd64cf8b99" exitCode=0 Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.866985 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-sjtsw" event={"ID":"284b66a9-09b0-4bc0-bc8a-5bd32f06d088","Type":"ContainerDied","Data":"56a42a64696f571fb3083137e133846b8b1b3a756e6f826d4684e9dd64cf8b99"} Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.867011 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-sjtsw" event={"ID":"284b66a9-09b0-4bc0-bc8a-5bd32f06d088","Type":"ContainerDied","Data":"3b95fe7e7935bb8d1343d8924f1756fedc81c2216a4f1b707f36335e3eca586e"} Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.867027 4743 scope.go:117] "RemoveContainer" containerID="56a42a64696f571fb3083137e133846b8b1b3a756e6f826d4684e9dd64cf8b99" Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.867113 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-sjtsw" Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.885896 4743 scope.go:117] "RemoveContainer" containerID="56a42a64696f571fb3083137e133846b8b1b3a756e6f826d4684e9dd64cf8b99" Oct 11 00:56:18 crc kubenswrapper[4743]: E1011 00:56:18.886530 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56a42a64696f571fb3083137e133846b8b1b3a756e6f826d4684e9dd64cf8b99\": container with ID starting with 56a42a64696f571fb3083137e133846b8b1b3a756e6f826d4684e9dd64cf8b99 not found: ID does not exist" containerID="56a42a64696f571fb3083137e133846b8b1b3a756e6f826d4684e9dd64cf8b99" Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.886564 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56a42a64696f571fb3083137e133846b8b1b3a756e6f826d4684e9dd64cf8b99"} err="failed to get container status \"56a42a64696f571fb3083137e133846b8b1b3a756e6f826d4684e9dd64cf8b99\": rpc error: code = NotFound desc = could not find container \"56a42a64696f571fb3083137e133846b8b1b3a756e6f826d4684e9dd64cf8b99\": container with ID starting with 56a42a64696f571fb3083137e133846b8b1b3a756e6f826d4684e9dd64cf8b99 not found: ID does not exist" Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.899079 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-sjtsw"] Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.901918 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-sjtsw"] Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.931657 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b2e1d5e4-f3c3-47b8-832e-be8a46849f79-v4-0-config-system-cliconfig\") pod 
\"oauth-openshift-58d58b5989-vrhmd\" (UID: \"b2e1d5e4-f3c3-47b8-832e-be8a46849f79\") " pod="openshift-authentication/oauth-openshift-58d58b5989-vrhmd" Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.931694 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b2e1d5e4-f3c3-47b8-832e-be8a46849f79-v4-0-config-system-router-certs\") pod \"oauth-openshift-58d58b5989-vrhmd\" (UID: \"b2e1d5e4-f3c3-47b8-832e-be8a46849f79\") " pod="openshift-authentication/oauth-openshift-58d58b5989-vrhmd" Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.932359 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b2e1d5e4-f3c3-47b8-832e-be8a46849f79-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-58d58b5989-vrhmd\" (UID: \"b2e1d5e4-f3c3-47b8-832e-be8a46849f79\") " pod="openshift-authentication/oauth-openshift-58d58b5989-vrhmd" Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.932421 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b2e1d5e4-f3c3-47b8-832e-be8a46849f79-v4-0-config-user-template-error\") pod \"oauth-openshift-58d58b5989-vrhmd\" (UID: \"b2e1d5e4-f3c3-47b8-832e-be8a46849f79\") " pod="openshift-authentication/oauth-openshift-58d58b5989-vrhmd" Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.932438 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b2e1d5e4-f3c3-47b8-832e-be8a46849f79-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-58d58b5989-vrhmd\" (UID: \"b2e1d5e4-f3c3-47b8-832e-be8a46849f79\") " pod="openshift-authentication/oauth-openshift-58d58b5989-vrhmd" Oct 11 00:56:18 crc kubenswrapper[4743]: 
I1011 00:56:18.932457 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b2e1d5e4-f3c3-47b8-832e-be8a46849f79-audit-dir\") pod \"oauth-openshift-58d58b5989-vrhmd\" (UID: \"b2e1d5e4-f3c3-47b8-832e-be8a46849f79\") " pod="openshift-authentication/oauth-openshift-58d58b5989-vrhmd" Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.932484 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b2e1d5e4-f3c3-47b8-832e-be8a46849f79-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-58d58b5989-vrhmd\" (UID: \"b2e1d5e4-f3c3-47b8-832e-be8a46849f79\") " pod="openshift-authentication/oauth-openshift-58d58b5989-vrhmd" Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.932501 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b2e1d5e4-f3c3-47b8-832e-be8a46849f79-v4-0-config-user-template-login\") pod \"oauth-openshift-58d58b5989-vrhmd\" (UID: \"b2e1d5e4-f3c3-47b8-832e-be8a46849f79\") " pod="openshift-authentication/oauth-openshift-58d58b5989-vrhmd" Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.932520 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b2e1d5e4-f3c3-47b8-832e-be8a46849f79-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-58d58b5989-vrhmd\" (UID: \"b2e1d5e4-f3c3-47b8-832e-be8a46849f79\") " pod="openshift-authentication/oauth-openshift-58d58b5989-vrhmd" Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.932557 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b2e1d5e4-f3c3-47b8-832e-be8a46849f79-audit-policies\") pod 
\"oauth-openshift-58d58b5989-vrhmd\" (UID: \"b2e1d5e4-f3c3-47b8-832e-be8a46849f79\") " pod="openshift-authentication/oauth-openshift-58d58b5989-vrhmd" Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.932557 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b2e1d5e4-f3c3-47b8-832e-be8a46849f79-v4-0-config-system-cliconfig\") pod \"oauth-openshift-58d58b5989-vrhmd\" (UID: \"b2e1d5e4-f3c3-47b8-832e-be8a46849f79\") " pod="openshift-authentication/oauth-openshift-58d58b5989-vrhmd" Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.932577 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b2e1d5e4-f3c3-47b8-832e-be8a46849f79-v4-0-config-system-session\") pod \"oauth-openshift-58d58b5989-vrhmd\" (UID: \"b2e1d5e4-f3c3-47b8-832e-be8a46849f79\") " pod="openshift-authentication/oauth-openshift-58d58b5989-vrhmd" Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.932593 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b2e1d5e4-f3c3-47b8-832e-be8a46849f79-v4-0-config-system-serving-cert\") pod \"oauth-openshift-58d58b5989-vrhmd\" (UID: \"b2e1d5e4-f3c3-47b8-832e-be8a46849f79\") " pod="openshift-authentication/oauth-openshift-58d58b5989-vrhmd" Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.932619 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcsb2\" (UniqueName: \"kubernetes.io/projected/b2e1d5e4-f3c3-47b8-832e-be8a46849f79-kube-api-access-dcsb2\") pod \"oauth-openshift-58d58b5989-vrhmd\" (UID: \"b2e1d5e4-f3c3-47b8-832e-be8a46849f79\") " pod="openshift-authentication/oauth-openshift-58d58b5989-vrhmd" Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.932638 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b2e1d5e4-f3c3-47b8-832e-be8a46849f79-v4-0-config-system-service-ca\") pod \"oauth-openshift-58d58b5989-vrhmd\" (UID: \"b2e1d5e4-f3c3-47b8-832e-be8a46849f79\") " pod="openshift-authentication/oauth-openshift-58d58b5989-vrhmd" Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.933138 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b2e1d5e4-f3c3-47b8-832e-be8a46849f79-v4-0-config-system-service-ca\") pod \"oauth-openshift-58d58b5989-vrhmd\" (UID: \"b2e1d5e4-f3c3-47b8-832e-be8a46849f79\") " pod="openshift-authentication/oauth-openshift-58d58b5989-vrhmd" Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.933346 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b2e1d5e4-f3c3-47b8-832e-be8a46849f79-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-58d58b5989-vrhmd\" (UID: \"b2e1d5e4-f3c3-47b8-832e-be8a46849f79\") " pod="openshift-authentication/oauth-openshift-58d58b5989-vrhmd" Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.935681 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b2e1d5e4-f3c3-47b8-832e-be8a46849f79-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-58d58b5989-vrhmd\" (UID: \"b2e1d5e4-f3c3-47b8-832e-be8a46849f79\") " pod="openshift-authentication/oauth-openshift-58d58b5989-vrhmd" Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.935735 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b2e1d5e4-f3c3-47b8-832e-be8a46849f79-audit-dir\") pod \"oauth-openshift-58d58b5989-vrhmd\" (UID: \"b2e1d5e4-f3c3-47b8-832e-be8a46849f79\") " 
pod="openshift-authentication/oauth-openshift-58d58b5989-vrhmd" Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.935757 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b2e1d5e4-f3c3-47b8-832e-be8a46849f79-v4-0-config-user-template-error\") pod \"oauth-openshift-58d58b5989-vrhmd\" (UID: \"b2e1d5e4-f3c3-47b8-832e-be8a46849f79\") " pod="openshift-authentication/oauth-openshift-58d58b5989-vrhmd" Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.936185 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b2e1d5e4-f3c3-47b8-832e-be8a46849f79-v4-0-config-system-router-certs\") pod \"oauth-openshift-58d58b5989-vrhmd\" (UID: \"b2e1d5e4-f3c3-47b8-832e-be8a46849f79\") " pod="openshift-authentication/oauth-openshift-58d58b5989-vrhmd" Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.936449 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b2e1d5e4-f3c3-47b8-832e-be8a46849f79-audit-policies\") pod \"oauth-openshift-58d58b5989-vrhmd\" (UID: \"b2e1d5e4-f3c3-47b8-832e-be8a46849f79\") " pod="openshift-authentication/oauth-openshift-58d58b5989-vrhmd" Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.936727 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b2e1d5e4-f3c3-47b8-832e-be8a46849f79-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-58d58b5989-vrhmd\" (UID: \"b2e1d5e4-f3c3-47b8-832e-be8a46849f79\") " pod="openshift-authentication/oauth-openshift-58d58b5989-vrhmd" Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.941307 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/b2e1d5e4-f3c3-47b8-832e-be8a46849f79-v4-0-config-system-session\") pod \"oauth-openshift-58d58b5989-vrhmd\" (UID: \"b2e1d5e4-f3c3-47b8-832e-be8a46849f79\") " pod="openshift-authentication/oauth-openshift-58d58b5989-vrhmd" Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.941484 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b2e1d5e4-f3c3-47b8-832e-be8a46849f79-v4-0-config-system-serving-cert\") pod \"oauth-openshift-58d58b5989-vrhmd\" (UID: \"b2e1d5e4-f3c3-47b8-832e-be8a46849f79\") " pod="openshift-authentication/oauth-openshift-58d58b5989-vrhmd" Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.941695 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b2e1d5e4-f3c3-47b8-832e-be8a46849f79-v4-0-config-user-template-login\") pod \"oauth-openshift-58d58b5989-vrhmd\" (UID: \"b2e1d5e4-f3c3-47b8-832e-be8a46849f79\") " pod="openshift-authentication/oauth-openshift-58d58b5989-vrhmd" Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.943139 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b2e1d5e4-f3c3-47b8-832e-be8a46849f79-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-58d58b5989-vrhmd\" (UID: \"b2e1d5e4-f3c3-47b8-832e-be8a46849f79\") " pod="openshift-authentication/oauth-openshift-58d58b5989-vrhmd" Oct 11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.955374 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcsb2\" (UniqueName: \"kubernetes.io/projected/b2e1d5e4-f3c3-47b8-832e-be8a46849f79-kube-api-access-dcsb2\") pod \"oauth-openshift-58d58b5989-vrhmd\" (UID: \"b2e1d5e4-f3c3-47b8-832e-be8a46849f79\") " pod="openshift-authentication/oauth-openshift-58d58b5989-vrhmd" Oct 
11 00:56:18 crc kubenswrapper[4743]: I1011 00:56:18.960782 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-58d58b5989-vrhmd" Oct 11 00:56:19 crc kubenswrapper[4743]: I1011 00:56:19.353741 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-58d58b5989-vrhmd"] Oct 11 00:56:19 crc kubenswrapper[4743]: I1011 00:56:19.875867 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-58d58b5989-vrhmd" event={"ID":"b2e1d5e4-f3c3-47b8-832e-be8a46849f79","Type":"ContainerStarted","Data":"e8c6d0d37f1b09a5ea1fe0aa7c8d9115b19f9bacda50f5c554a43a4020e12bd7"} Oct 11 00:56:19 crc kubenswrapper[4743]: I1011 00:56:19.876223 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-58d58b5989-vrhmd" Oct 11 00:56:19 crc kubenswrapper[4743]: I1011 00:56:19.876244 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-58d58b5989-vrhmd" event={"ID":"b2e1d5e4-f3c3-47b8-832e-be8a46849f79","Type":"ContainerStarted","Data":"867c36eeac1cb5ba22e9eda1c390f08a2378c9126c8f9d8af35f765fdfa81ad5"} Oct 11 00:56:19 crc kubenswrapper[4743]: I1011 00:56:19.902717 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-58d58b5989-vrhmd" podStartSLOduration=26.902698454 podStartE2EDuration="26.902698454s" podCreationTimestamp="2025-10-11 00:55:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 00:56:19.900182574 +0000 UTC m=+274.553162971" watchObservedRunningTime="2025-10-11 00:56:19.902698454 +0000 UTC m=+274.555678871" Oct 11 00:56:20 crc kubenswrapper[4743]: I1011 00:56:20.097221 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="284b66a9-09b0-4bc0-bc8a-5bd32f06d088" path="/var/lib/kubelet/pods/284b66a9-09b0-4bc0-bc8a-5bd32f06d088/volumes" Oct 11 00:56:20 crc kubenswrapper[4743]: I1011 00:56:20.185211 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-58d58b5989-vrhmd" Oct 11 00:57:14 crc kubenswrapper[4743]: I1011 00:57:14.459000 4743 patch_prober.go:28] interesting pod/machine-config-daemon-cvm72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 00:57:14 crc kubenswrapper[4743]: I1011 00:57:14.460046 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 00:57:44 crc kubenswrapper[4743]: I1011 00:57:44.458335 4743 patch_prober.go:28] interesting pod/machine-config-daemon-cvm72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 00:57:44 crc kubenswrapper[4743]: I1011 00:57:44.459124 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 00:58:14 crc kubenswrapper[4743]: I1011 00:58:14.458769 4743 patch_prober.go:28] interesting pod/machine-config-daemon-cvm72 container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 00:58:14 crc kubenswrapper[4743]: I1011 00:58:14.459573 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 00:58:14 crc kubenswrapper[4743]: I1011 00:58:14.459647 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" Oct 11 00:58:14 crc kubenswrapper[4743]: I1011 00:58:14.460614 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d4ccb047d6639dbadc8db37d34bacbcce79ae6b61d67f9ebe1557bf1798750cf"} pod="openshift-machine-config-operator/machine-config-daemon-cvm72" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 11 00:58:14 crc kubenswrapper[4743]: I1011 00:58:14.460716 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" containerID="cri-o://d4ccb047d6639dbadc8db37d34bacbcce79ae6b61d67f9ebe1557bf1798750cf" gracePeriod=600 Oct 11 00:58:14 crc kubenswrapper[4743]: I1011 00:58:14.628148 4743 generic.go:334] "Generic (PLEG): container finished" podID="add92263-e252-446b-95de-092585b4357f" containerID="d4ccb047d6639dbadc8db37d34bacbcce79ae6b61d67f9ebe1557bf1798750cf" exitCode=0 Oct 11 00:58:14 crc kubenswrapper[4743]: I1011 00:58:14.628344 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-cvm72" event={"ID":"add92263-e252-446b-95de-092585b4357f","Type":"ContainerDied","Data":"d4ccb047d6639dbadc8db37d34bacbcce79ae6b61d67f9ebe1557bf1798750cf"} Oct 11 00:58:14 crc kubenswrapper[4743]: I1011 00:58:14.628417 4743 scope.go:117] "RemoveContainer" containerID="16b18bcf80747537e2a7a31adc90eec7fdba85d8166f8cfc53709a1dba33de8c" Oct 11 00:58:15 crc kubenswrapper[4743]: I1011 00:58:15.639049 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" event={"ID":"add92263-e252-446b-95de-092585b4357f","Type":"ContainerStarted","Data":"f4a50228ff369b6861f5b6579c1f5b36360b57624267c00cfb8d313cffab1c5d"} Oct 11 00:59:17 crc kubenswrapper[4743]: I1011 00:59:17.495045 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-lhnj4"] Oct 11 00:59:17 crc kubenswrapper[4743]: I1011 00:59:17.496449 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-lhnj4" Oct 11 00:59:17 crc kubenswrapper[4743]: I1011 00:59:17.519216 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-lhnj4"] Oct 11 00:59:17 crc kubenswrapper[4743]: I1011 00:59:17.601397 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-lhnj4\" (UID: \"7d3a65af-16fb-4bd7-aba7-11266c7354d3\") " pod="openshift-image-registry/image-registry-66df7c8f76-lhnj4" Oct 11 00:59:17 crc kubenswrapper[4743]: I1011 00:59:17.601484 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7d3a65af-16fb-4bd7-aba7-11266c7354d3-ca-trust-extracted\") pod \"image-registry-66df7c8f76-lhnj4\" (UID: \"7d3a65af-16fb-4bd7-aba7-11266c7354d3\") " pod="openshift-image-registry/image-registry-66df7c8f76-lhnj4" Oct 11 00:59:17 crc kubenswrapper[4743]: I1011 00:59:17.601528 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7d3a65af-16fb-4bd7-aba7-11266c7354d3-registry-tls\") pod \"image-registry-66df7c8f76-lhnj4\" (UID: \"7d3a65af-16fb-4bd7-aba7-11266c7354d3\") " pod="openshift-image-registry/image-registry-66df7c8f76-lhnj4" Oct 11 00:59:17 crc kubenswrapper[4743]: I1011 00:59:17.601568 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7d3a65af-16fb-4bd7-aba7-11266c7354d3-installation-pull-secrets\") pod \"image-registry-66df7c8f76-lhnj4\" (UID: \"7d3a65af-16fb-4bd7-aba7-11266c7354d3\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-lhnj4" Oct 11 00:59:17 crc kubenswrapper[4743]: I1011 00:59:17.601719 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7d3a65af-16fb-4bd7-aba7-11266c7354d3-bound-sa-token\") pod \"image-registry-66df7c8f76-lhnj4\" (UID: \"7d3a65af-16fb-4bd7-aba7-11266c7354d3\") " pod="openshift-image-registry/image-registry-66df7c8f76-lhnj4" Oct 11 00:59:17 crc kubenswrapper[4743]: I1011 00:59:17.601845 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7d3a65af-16fb-4bd7-aba7-11266c7354d3-registry-certificates\") pod \"image-registry-66df7c8f76-lhnj4\" (UID: \"7d3a65af-16fb-4bd7-aba7-11266c7354d3\") " pod="openshift-image-registry/image-registry-66df7c8f76-lhnj4" Oct 11 00:59:17 crc kubenswrapper[4743]: I1011 00:59:17.601921 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7d3a65af-16fb-4bd7-aba7-11266c7354d3-trusted-ca\") pod \"image-registry-66df7c8f76-lhnj4\" (UID: \"7d3a65af-16fb-4bd7-aba7-11266c7354d3\") " pod="openshift-image-registry/image-registry-66df7c8f76-lhnj4" Oct 11 00:59:17 crc kubenswrapper[4743]: I1011 00:59:17.601955 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9gtq\" (UniqueName: \"kubernetes.io/projected/7d3a65af-16fb-4bd7-aba7-11266c7354d3-kube-api-access-d9gtq\") pod \"image-registry-66df7c8f76-lhnj4\" (UID: \"7d3a65af-16fb-4bd7-aba7-11266c7354d3\") " pod="openshift-image-registry/image-registry-66df7c8f76-lhnj4" Oct 11 00:59:17 crc kubenswrapper[4743]: I1011 00:59:17.619464 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-lhnj4\" (UID: \"7d3a65af-16fb-4bd7-aba7-11266c7354d3\") " pod="openshift-image-registry/image-registry-66df7c8f76-lhnj4" Oct 11 00:59:17 crc kubenswrapper[4743]: I1011 00:59:17.703247 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7d3a65af-16fb-4bd7-aba7-11266c7354d3-registry-tls\") pod \"image-registry-66df7c8f76-lhnj4\" (UID: \"7d3a65af-16fb-4bd7-aba7-11266c7354d3\") " pod="openshift-image-registry/image-registry-66df7c8f76-lhnj4" Oct 11 00:59:17 crc kubenswrapper[4743]: I1011 00:59:17.703303 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7d3a65af-16fb-4bd7-aba7-11266c7354d3-installation-pull-secrets\") pod \"image-registry-66df7c8f76-lhnj4\" (UID: \"7d3a65af-16fb-4bd7-aba7-11266c7354d3\") " pod="openshift-image-registry/image-registry-66df7c8f76-lhnj4" Oct 11 00:59:17 crc kubenswrapper[4743]: I1011 00:59:17.703323 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7d3a65af-16fb-4bd7-aba7-11266c7354d3-bound-sa-token\") pod \"image-registry-66df7c8f76-lhnj4\" (UID: \"7d3a65af-16fb-4bd7-aba7-11266c7354d3\") " pod="openshift-image-registry/image-registry-66df7c8f76-lhnj4" Oct 11 00:59:17 crc kubenswrapper[4743]: I1011 00:59:17.703347 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7d3a65af-16fb-4bd7-aba7-11266c7354d3-registry-certificates\") pod \"image-registry-66df7c8f76-lhnj4\" (UID: \"7d3a65af-16fb-4bd7-aba7-11266c7354d3\") " pod="openshift-image-registry/image-registry-66df7c8f76-lhnj4" Oct 11 00:59:17 crc kubenswrapper[4743]: I1011 00:59:17.703364 4743 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7d3a65af-16fb-4bd7-aba7-11266c7354d3-trusted-ca\") pod \"image-registry-66df7c8f76-lhnj4\" (UID: \"7d3a65af-16fb-4bd7-aba7-11266c7354d3\") " pod="openshift-image-registry/image-registry-66df7c8f76-lhnj4" Oct 11 00:59:17 crc kubenswrapper[4743]: I1011 00:59:17.703380 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9gtq\" (UniqueName: \"kubernetes.io/projected/7d3a65af-16fb-4bd7-aba7-11266c7354d3-kube-api-access-d9gtq\") pod \"image-registry-66df7c8f76-lhnj4\" (UID: \"7d3a65af-16fb-4bd7-aba7-11266c7354d3\") " pod="openshift-image-registry/image-registry-66df7c8f76-lhnj4" Oct 11 00:59:17 crc kubenswrapper[4743]: I1011 00:59:17.703424 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7d3a65af-16fb-4bd7-aba7-11266c7354d3-ca-trust-extracted\") pod \"image-registry-66df7c8f76-lhnj4\" (UID: \"7d3a65af-16fb-4bd7-aba7-11266c7354d3\") " pod="openshift-image-registry/image-registry-66df7c8f76-lhnj4" Oct 11 00:59:17 crc kubenswrapper[4743]: I1011 00:59:17.704132 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7d3a65af-16fb-4bd7-aba7-11266c7354d3-ca-trust-extracted\") pod \"image-registry-66df7c8f76-lhnj4\" (UID: \"7d3a65af-16fb-4bd7-aba7-11266c7354d3\") " pod="openshift-image-registry/image-registry-66df7c8f76-lhnj4" Oct 11 00:59:17 crc kubenswrapper[4743]: I1011 00:59:17.705108 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7d3a65af-16fb-4bd7-aba7-11266c7354d3-trusted-ca\") pod \"image-registry-66df7c8f76-lhnj4\" (UID: \"7d3a65af-16fb-4bd7-aba7-11266c7354d3\") " pod="openshift-image-registry/image-registry-66df7c8f76-lhnj4" Oct 11 00:59:17 crc 
kubenswrapper[4743]: I1011 00:59:17.705333 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7d3a65af-16fb-4bd7-aba7-11266c7354d3-registry-certificates\") pod \"image-registry-66df7c8f76-lhnj4\" (UID: \"7d3a65af-16fb-4bd7-aba7-11266c7354d3\") " pod="openshift-image-registry/image-registry-66df7c8f76-lhnj4" Oct 11 00:59:17 crc kubenswrapper[4743]: I1011 00:59:17.709352 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7d3a65af-16fb-4bd7-aba7-11266c7354d3-installation-pull-secrets\") pod \"image-registry-66df7c8f76-lhnj4\" (UID: \"7d3a65af-16fb-4bd7-aba7-11266c7354d3\") " pod="openshift-image-registry/image-registry-66df7c8f76-lhnj4" Oct 11 00:59:17 crc kubenswrapper[4743]: I1011 00:59:17.709466 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7d3a65af-16fb-4bd7-aba7-11266c7354d3-registry-tls\") pod \"image-registry-66df7c8f76-lhnj4\" (UID: \"7d3a65af-16fb-4bd7-aba7-11266c7354d3\") " pod="openshift-image-registry/image-registry-66df7c8f76-lhnj4" Oct 11 00:59:17 crc kubenswrapper[4743]: I1011 00:59:17.719125 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7d3a65af-16fb-4bd7-aba7-11266c7354d3-bound-sa-token\") pod \"image-registry-66df7c8f76-lhnj4\" (UID: \"7d3a65af-16fb-4bd7-aba7-11266c7354d3\") " pod="openshift-image-registry/image-registry-66df7c8f76-lhnj4" Oct 11 00:59:17 crc kubenswrapper[4743]: I1011 00:59:17.721037 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9gtq\" (UniqueName: \"kubernetes.io/projected/7d3a65af-16fb-4bd7-aba7-11266c7354d3-kube-api-access-d9gtq\") pod \"image-registry-66df7c8f76-lhnj4\" (UID: \"7d3a65af-16fb-4bd7-aba7-11266c7354d3\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-lhnj4" Oct 11 00:59:17 crc kubenswrapper[4743]: I1011 00:59:17.813160 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-lhnj4" Oct 11 00:59:18 crc kubenswrapper[4743]: I1011 00:59:18.236075 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-lhnj4"] Oct 11 00:59:19 crc kubenswrapper[4743]: I1011 00:59:19.054182 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-lhnj4" event={"ID":"7d3a65af-16fb-4bd7-aba7-11266c7354d3","Type":"ContainerStarted","Data":"ed6a2a679a6a757db6611bea300cd9846c011027d16e8143ab211d7a4abf08ad"} Oct 11 00:59:19 crc kubenswrapper[4743]: I1011 00:59:19.054535 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-lhnj4" event={"ID":"7d3a65af-16fb-4bd7-aba7-11266c7354d3","Type":"ContainerStarted","Data":"5df9c7e1815162fdf19ae09250673d94173347d8adfed503dc051030c87a6231"} Oct 11 00:59:19 crc kubenswrapper[4743]: I1011 00:59:19.054568 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-lhnj4" Oct 11 00:59:19 crc kubenswrapper[4743]: I1011 00:59:19.080177 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-lhnj4" podStartSLOduration=2.080143013 podStartE2EDuration="2.080143013s" podCreationTimestamp="2025-10-11 00:59:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 00:59:19.075326949 +0000 UTC m=+453.728307446" watchObservedRunningTime="2025-10-11 00:59:19.080143013 +0000 UTC m=+453.733123480" Oct 11 00:59:37 crc kubenswrapper[4743]: I1011 00:59:37.820214 4743 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-lhnj4" Oct 11 00:59:37 crc kubenswrapper[4743]: I1011 00:59:37.879336 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-24m6m"] Oct 11 01:00:00 crc kubenswrapper[4743]: I1011 01:00:00.150728 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29335740-fndp4"] Oct 11 01:00:00 crc kubenswrapper[4743]: I1011 01:00:00.152331 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29335740-fndp4" Oct 11 01:00:00 crc kubenswrapper[4743]: I1011 01:00:00.162373 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 11 01:00:00 crc kubenswrapper[4743]: I1011 01:00:00.163101 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29335740-fndp4"] Oct 11 01:00:00 crc kubenswrapper[4743]: I1011 01:00:00.163224 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 11 01:00:00 crc kubenswrapper[4743]: I1011 01:00:00.215406 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psn76\" (UniqueName: \"kubernetes.io/projected/2c46c590-571e-42eb-9fa4-a6ccabdc12a8-kube-api-access-psn76\") pod \"collect-profiles-29335740-fndp4\" (UID: \"2c46c590-571e-42eb-9fa4-a6ccabdc12a8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335740-fndp4" Oct 11 01:00:00 crc kubenswrapper[4743]: I1011 01:00:00.215497 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/2c46c590-571e-42eb-9fa4-a6ccabdc12a8-secret-volume\") pod \"collect-profiles-29335740-fndp4\" (UID: \"2c46c590-571e-42eb-9fa4-a6ccabdc12a8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335740-fndp4" Oct 11 01:00:00 crc kubenswrapper[4743]: I1011 01:00:00.215534 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2c46c590-571e-42eb-9fa4-a6ccabdc12a8-config-volume\") pod \"collect-profiles-29335740-fndp4\" (UID: \"2c46c590-571e-42eb-9fa4-a6ccabdc12a8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335740-fndp4" Oct 11 01:00:00 crc kubenswrapper[4743]: I1011 01:00:00.316741 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psn76\" (UniqueName: \"kubernetes.io/projected/2c46c590-571e-42eb-9fa4-a6ccabdc12a8-kube-api-access-psn76\") pod \"collect-profiles-29335740-fndp4\" (UID: \"2c46c590-571e-42eb-9fa4-a6ccabdc12a8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335740-fndp4" Oct 11 01:00:00 crc kubenswrapper[4743]: I1011 01:00:00.316806 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2c46c590-571e-42eb-9fa4-a6ccabdc12a8-secret-volume\") pod \"collect-profiles-29335740-fndp4\" (UID: \"2c46c590-571e-42eb-9fa4-a6ccabdc12a8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335740-fndp4" Oct 11 01:00:00 crc kubenswrapper[4743]: I1011 01:00:00.316841 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2c46c590-571e-42eb-9fa4-a6ccabdc12a8-config-volume\") pod \"collect-profiles-29335740-fndp4\" (UID: \"2c46c590-571e-42eb-9fa4-a6ccabdc12a8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335740-fndp4" Oct 11 01:00:00 crc kubenswrapper[4743]: 
I1011 01:00:00.317948 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2c46c590-571e-42eb-9fa4-a6ccabdc12a8-config-volume\") pod \"collect-profiles-29335740-fndp4\" (UID: \"2c46c590-571e-42eb-9fa4-a6ccabdc12a8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335740-fndp4" Oct 11 01:00:00 crc kubenswrapper[4743]: I1011 01:00:00.324830 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2c46c590-571e-42eb-9fa4-a6ccabdc12a8-secret-volume\") pod \"collect-profiles-29335740-fndp4\" (UID: \"2c46c590-571e-42eb-9fa4-a6ccabdc12a8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335740-fndp4" Oct 11 01:00:00 crc kubenswrapper[4743]: I1011 01:00:00.346817 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psn76\" (UniqueName: \"kubernetes.io/projected/2c46c590-571e-42eb-9fa4-a6ccabdc12a8-kube-api-access-psn76\") pod \"collect-profiles-29335740-fndp4\" (UID: \"2c46c590-571e-42eb-9fa4-a6ccabdc12a8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335740-fndp4" Oct 11 01:00:00 crc kubenswrapper[4743]: I1011 01:00:00.483797 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29335740-fndp4" Oct 11 01:00:00 crc kubenswrapper[4743]: I1011 01:00:00.746137 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29335740-fndp4"] Oct 11 01:00:01 crc kubenswrapper[4743]: I1011 01:00:01.301783 4743 generic.go:334] "Generic (PLEG): container finished" podID="2c46c590-571e-42eb-9fa4-a6ccabdc12a8" containerID="16c011b3dbb50faac3d0e841f0c5671da27839de7eace05fa026551671c0cca3" exitCode=0 Oct 11 01:00:01 crc kubenswrapper[4743]: I1011 01:00:01.301954 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29335740-fndp4" event={"ID":"2c46c590-571e-42eb-9fa4-a6ccabdc12a8","Type":"ContainerDied","Data":"16c011b3dbb50faac3d0e841f0c5671da27839de7eace05fa026551671c0cca3"} Oct 11 01:00:01 crc kubenswrapper[4743]: I1011 01:00:01.302243 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29335740-fndp4" event={"ID":"2c46c590-571e-42eb-9fa4-a6ccabdc12a8","Type":"ContainerStarted","Data":"d79c7489f15181143216e90fabc02245df54ad6f77e33d6e95e33173935f28da"} Oct 11 01:00:02 crc kubenswrapper[4743]: I1011 01:00:02.591315 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29335740-fndp4" Oct 11 01:00:02 crc kubenswrapper[4743]: I1011 01:00:02.653166 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2c46c590-571e-42eb-9fa4-a6ccabdc12a8-secret-volume\") pod \"2c46c590-571e-42eb-9fa4-a6ccabdc12a8\" (UID: \"2c46c590-571e-42eb-9fa4-a6ccabdc12a8\") " Oct 11 01:00:02 crc kubenswrapper[4743]: I1011 01:00:02.653220 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2c46c590-571e-42eb-9fa4-a6ccabdc12a8-config-volume\") pod \"2c46c590-571e-42eb-9fa4-a6ccabdc12a8\" (UID: \"2c46c590-571e-42eb-9fa4-a6ccabdc12a8\") " Oct 11 01:00:02 crc kubenswrapper[4743]: I1011 01:00:02.653356 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-psn76\" (UniqueName: \"kubernetes.io/projected/2c46c590-571e-42eb-9fa4-a6ccabdc12a8-kube-api-access-psn76\") pod \"2c46c590-571e-42eb-9fa4-a6ccabdc12a8\" (UID: \"2c46c590-571e-42eb-9fa4-a6ccabdc12a8\") " Oct 11 01:00:02 crc kubenswrapper[4743]: I1011 01:00:02.654524 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c46c590-571e-42eb-9fa4-a6ccabdc12a8-config-volume" (OuterVolumeSpecName: "config-volume") pod "2c46c590-571e-42eb-9fa4-a6ccabdc12a8" (UID: "2c46c590-571e-42eb-9fa4-a6ccabdc12a8"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:00:02 crc kubenswrapper[4743]: I1011 01:00:02.660704 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c46c590-571e-42eb-9fa4-a6ccabdc12a8-kube-api-access-psn76" (OuterVolumeSpecName: "kube-api-access-psn76") pod "2c46c590-571e-42eb-9fa4-a6ccabdc12a8" (UID: "2c46c590-571e-42eb-9fa4-a6ccabdc12a8"). 
InnerVolumeSpecName "kube-api-access-psn76". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:00:02 crc kubenswrapper[4743]: I1011 01:00:02.660781 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c46c590-571e-42eb-9fa4-a6ccabdc12a8-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2c46c590-571e-42eb-9fa4-a6ccabdc12a8" (UID: "2c46c590-571e-42eb-9fa4-a6ccabdc12a8"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:00:02 crc kubenswrapper[4743]: I1011 01:00:02.754819 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-psn76\" (UniqueName: \"kubernetes.io/projected/2c46c590-571e-42eb-9fa4-a6ccabdc12a8-kube-api-access-psn76\") on node \"crc\" DevicePath \"\"" Oct 11 01:00:02 crc kubenswrapper[4743]: I1011 01:00:02.754905 4743 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2c46c590-571e-42eb-9fa4-a6ccabdc12a8-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 11 01:00:02 crc kubenswrapper[4743]: I1011 01:00:02.754920 4743 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2c46c590-571e-42eb-9fa4-a6ccabdc12a8-config-volume\") on node \"crc\" DevicePath \"\"" Oct 11 01:00:02 crc kubenswrapper[4743]: I1011 01:00:02.930553 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-24m6m" podUID="9d6b106d-6589-40f2-b694-eb158c541d82" containerName="registry" containerID="cri-o://3da0b8283ad0d2002a73e40a09d147bb70dee93ceddfdd6006d466b69da78fde" gracePeriod=30 Oct 11 01:00:03 crc kubenswrapper[4743]: I1011 01:00:03.252762 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-24m6m" Oct 11 01:00:03 crc kubenswrapper[4743]: I1011 01:00:03.260543 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d6b106d-6589-40f2-b694-eb158c541d82-trusted-ca\") pod \"9d6b106d-6589-40f2-b694-eb158c541d82\" (UID: \"9d6b106d-6589-40f2-b694-eb158c541d82\") " Oct 11 01:00:03 crc kubenswrapper[4743]: I1011 01:00:03.260782 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"9d6b106d-6589-40f2-b694-eb158c541d82\" (UID: \"9d6b106d-6589-40f2-b694-eb158c541d82\") " Oct 11 01:00:03 crc kubenswrapper[4743]: I1011 01:00:03.260890 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9d6b106d-6589-40f2-b694-eb158c541d82-ca-trust-extracted\") pod \"9d6b106d-6589-40f2-b694-eb158c541d82\" (UID: \"9d6b106d-6589-40f2-b694-eb158c541d82\") " Oct 11 01:00:03 crc kubenswrapper[4743]: I1011 01:00:03.260995 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swbdz\" (UniqueName: \"kubernetes.io/projected/9d6b106d-6589-40f2-b694-eb158c541d82-kube-api-access-swbdz\") pod \"9d6b106d-6589-40f2-b694-eb158c541d82\" (UID: \"9d6b106d-6589-40f2-b694-eb158c541d82\") " Oct 11 01:00:03 crc kubenswrapper[4743]: I1011 01:00:03.261612 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d6b106d-6589-40f2-b694-eb158c541d82-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d6b106d-6589-40f2-b694-eb158c541d82" (UID: "9d6b106d-6589-40f2-b694-eb158c541d82"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:00:03 crc kubenswrapper[4743]: I1011 01:00:03.261735 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d6b106d-6589-40f2-b694-eb158c541d82-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "9d6b106d-6589-40f2-b694-eb158c541d82" (UID: "9d6b106d-6589-40f2-b694-eb158c541d82"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:00:03 crc kubenswrapper[4743]: I1011 01:00:03.261088 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9d6b106d-6589-40f2-b694-eb158c541d82-registry-certificates\") pod \"9d6b106d-6589-40f2-b694-eb158c541d82\" (UID: \"9d6b106d-6589-40f2-b694-eb158c541d82\") " Oct 11 01:00:03 crc kubenswrapper[4743]: I1011 01:00:03.262372 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9d6b106d-6589-40f2-b694-eb158c541d82-registry-tls\") pod \"9d6b106d-6589-40f2-b694-eb158c541d82\" (UID: \"9d6b106d-6589-40f2-b694-eb158c541d82\") " Oct 11 01:00:03 crc kubenswrapper[4743]: I1011 01:00:03.262443 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9d6b106d-6589-40f2-b694-eb158c541d82-bound-sa-token\") pod \"9d6b106d-6589-40f2-b694-eb158c541d82\" (UID: \"9d6b106d-6589-40f2-b694-eb158c541d82\") " Oct 11 01:00:03 crc kubenswrapper[4743]: I1011 01:00:03.262490 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9d6b106d-6589-40f2-b694-eb158c541d82-installation-pull-secrets\") pod \"9d6b106d-6589-40f2-b694-eb158c541d82\" (UID: \"9d6b106d-6589-40f2-b694-eb158c541d82\") " Oct 11 01:00:03 crc 
kubenswrapper[4743]: I1011 01:00:03.262840 4743 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9d6b106d-6589-40f2-b694-eb158c541d82-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 11 01:00:03 crc kubenswrapper[4743]: I1011 01:00:03.262967 4743 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d6b106d-6589-40f2-b694-eb158c541d82-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 11 01:00:03 crc kubenswrapper[4743]: I1011 01:00:03.267010 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d6b106d-6589-40f2-b694-eb158c541d82-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "9d6b106d-6589-40f2-b694-eb158c541d82" (UID: "9d6b106d-6589-40f2-b694-eb158c541d82"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:00:03 crc kubenswrapper[4743]: I1011 01:00:03.268132 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d6b106d-6589-40f2-b694-eb158c541d82-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "9d6b106d-6589-40f2-b694-eb158c541d82" (UID: "9d6b106d-6589-40f2-b694-eb158c541d82"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:00:03 crc kubenswrapper[4743]: I1011 01:00:03.270377 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d6b106d-6589-40f2-b694-eb158c541d82-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "9d6b106d-6589-40f2-b694-eb158c541d82" (UID: "9d6b106d-6589-40f2-b694-eb158c541d82"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:00:03 crc kubenswrapper[4743]: I1011 01:00:03.270810 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d6b106d-6589-40f2-b694-eb158c541d82-kube-api-access-swbdz" (OuterVolumeSpecName: "kube-api-access-swbdz") pod "9d6b106d-6589-40f2-b694-eb158c541d82" (UID: "9d6b106d-6589-40f2-b694-eb158c541d82"). InnerVolumeSpecName "kube-api-access-swbdz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:00:03 crc kubenswrapper[4743]: I1011 01:00:03.278075 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "9d6b106d-6589-40f2-b694-eb158c541d82" (UID: "9d6b106d-6589-40f2-b694-eb158c541d82"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 11 01:00:03 crc kubenswrapper[4743]: I1011 01:00:03.304652 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d6b106d-6589-40f2-b694-eb158c541d82-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "9d6b106d-6589-40f2-b694-eb158c541d82" (UID: "9d6b106d-6589-40f2-b694-eb158c541d82"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:00:03 crc kubenswrapper[4743]: I1011 01:00:03.317459 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29335740-fndp4" Oct 11 01:00:03 crc kubenswrapper[4743]: I1011 01:00:03.317482 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29335740-fndp4" event={"ID":"2c46c590-571e-42eb-9fa4-a6ccabdc12a8","Type":"ContainerDied","Data":"d79c7489f15181143216e90fabc02245df54ad6f77e33d6e95e33173935f28da"} Oct 11 01:00:03 crc kubenswrapper[4743]: I1011 01:00:03.317586 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d79c7489f15181143216e90fabc02245df54ad6f77e33d6e95e33173935f28da" Oct 11 01:00:03 crc kubenswrapper[4743]: I1011 01:00:03.319424 4743 generic.go:334] "Generic (PLEG): container finished" podID="9d6b106d-6589-40f2-b694-eb158c541d82" containerID="3da0b8283ad0d2002a73e40a09d147bb70dee93ceddfdd6006d466b69da78fde" exitCode=0 Oct 11 01:00:03 crc kubenswrapper[4743]: I1011 01:00:03.319481 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-24m6m" event={"ID":"9d6b106d-6589-40f2-b694-eb158c541d82","Type":"ContainerDied","Data":"3da0b8283ad0d2002a73e40a09d147bb70dee93ceddfdd6006d466b69da78fde"} Oct 11 01:00:03 crc kubenswrapper[4743]: I1011 01:00:03.319520 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-24m6m" event={"ID":"9d6b106d-6589-40f2-b694-eb158c541d82","Type":"ContainerDied","Data":"d04e702924fa574dfda36716020c9ef9cc9af74bde8d2f67803333dd7998b185"} Oct 11 01:00:03 crc kubenswrapper[4743]: I1011 01:00:03.319557 4743 scope.go:117] "RemoveContainer" containerID="3da0b8283ad0d2002a73e40a09d147bb70dee93ceddfdd6006d466b69da78fde" Oct 11 01:00:03 crc kubenswrapper[4743]: I1011 01:00:03.319719 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-24m6m" Oct 11 01:00:03 crc kubenswrapper[4743]: I1011 01:00:03.336278 4743 scope.go:117] "RemoveContainer" containerID="3da0b8283ad0d2002a73e40a09d147bb70dee93ceddfdd6006d466b69da78fde" Oct 11 01:00:03 crc kubenswrapper[4743]: E1011 01:00:03.336993 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3da0b8283ad0d2002a73e40a09d147bb70dee93ceddfdd6006d466b69da78fde\": container with ID starting with 3da0b8283ad0d2002a73e40a09d147bb70dee93ceddfdd6006d466b69da78fde not found: ID does not exist" containerID="3da0b8283ad0d2002a73e40a09d147bb70dee93ceddfdd6006d466b69da78fde" Oct 11 01:00:03 crc kubenswrapper[4743]: I1011 01:00:03.337038 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3da0b8283ad0d2002a73e40a09d147bb70dee93ceddfdd6006d466b69da78fde"} err="failed to get container status \"3da0b8283ad0d2002a73e40a09d147bb70dee93ceddfdd6006d466b69da78fde\": rpc error: code = NotFound desc = could not find container \"3da0b8283ad0d2002a73e40a09d147bb70dee93ceddfdd6006d466b69da78fde\": container with ID starting with 3da0b8283ad0d2002a73e40a09d147bb70dee93ceddfdd6006d466b69da78fde not found: ID does not exist" Oct 11 01:00:03 crc kubenswrapper[4743]: I1011 01:00:03.360085 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-24m6m"] Oct 11 01:00:03 crc kubenswrapper[4743]: I1011 01:00:03.364529 4743 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9d6b106d-6589-40f2-b694-eb158c541d82-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 11 01:00:03 crc kubenswrapper[4743]: I1011 01:00:03.364620 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swbdz\" (UniqueName: 
\"kubernetes.io/projected/9d6b106d-6589-40f2-b694-eb158c541d82-kube-api-access-swbdz\") on node \"crc\" DevicePath \"\"" Oct 11 01:00:03 crc kubenswrapper[4743]: I1011 01:00:03.364688 4743 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9d6b106d-6589-40f2-b694-eb158c541d82-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 11 01:00:03 crc kubenswrapper[4743]: I1011 01:00:03.364709 4743 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9d6b106d-6589-40f2-b694-eb158c541d82-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 11 01:00:03 crc kubenswrapper[4743]: I1011 01:00:03.364764 4743 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9d6b106d-6589-40f2-b694-eb158c541d82-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 11 01:00:03 crc kubenswrapper[4743]: I1011 01:00:03.365046 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-24m6m"] Oct 11 01:00:04 crc kubenswrapper[4743]: I1011 01:00:04.105432 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d6b106d-6589-40f2-b694-eb158c541d82" path="/var/lib/kubelet/pods/9d6b106d-6589-40f2-b694-eb158c541d82/volumes" Oct 11 01:00:14 crc kubenswrapper[4743]: I1011 01:00:14.458742 4743 patch_prober.go:28] interesting pod/machine-config-daemon-cvm72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 01:00:14 crc kubenswrapper[4743]: I1011 01:00:14.459403 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 01:00:44 crc kubenswrapper[4743]: I1011 01:00:44.458428 4743 patch_prober.go:28] interesting pod/machine-config-daemon-cvm72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 01:00:44 crc kubenswrapper[4743]: I1011 01:00:44.459064 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 01:00:58 crc kubenswrapper[4743]: I1011 01:00:58.357897 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddtlkt"] Oct 11 01:00:58 crc kubenswrapper[4743]: E1011 01:00:58.358525 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d6b106d-6589-40f2-b694-eb158c541d82" containerName="registry" Oct 11 01:00:58 crc kubenswrapper[4743]: I1011 01:00:58.358536 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d6b106d-6589-40f2-b694-eb158c541d82" containerName="registry" Oct 11 01:00:58 crc kubenswrapper[4743]: E1011 01:00:58.358548 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c46c590-571e-42eb-9fa4-a6ccabdc12a8" containerName="collect-profiles" Oct 11 01:00:58 crc kubenswrapper[4743]: I1011 01:00:58.358554 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c46c590-571e-42eb-9fa4-a6ccabdc12a8" containerName="collect-profiles" Oct 11 01:00:58 crc kubenswrapper[4743]: I1011 01:00:58.358640 4743 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="9d6b106d-6589-40f2-b694-eb158c541d82" containerName="registry" Oct 11 01:00:58 crc kubenswrapper[4743]: I1011 01:00:58.358652 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c46c590-571e-42eb-9fa4-a6ccabdc12a8" containerName="collect-profiles" Oct 11 01:00:58 crc kubenswrapper[4743]: I1011 01:00:58.359329 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddtlkt" Oct 11 01:00:58 crc kubenswrapper[4743]: I1011 01:00:58.360982 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 11 01:00:58 crc kubenswrapper[4743]: I1011 01:00:58.376828 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddtlkt"] Oct 11 01:00:58 crc kubenswrapper[4743]: I1011 01:00:58.478381 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ng27f\" (UniqueName: \"kubernetes.io/projected/f424c028-a0f6-4327-90f7-338a9d21043c-kube-api-access-ng27f\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddtlkt\" (UID: \"f424c028-a0f6-4327-90f7-338a9d21043c\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddtlkt" Oct 11 01:00:58 crc kubenswrapper[4743]: I1011 01:00:58.478441 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f424c028-a0f6-4327-90f7-338a9d21043c-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddtlkt\" (UID: \"f424c028-a0f6-4327-90f7-338a9d21043c\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddtlkt" Oct 11 01:00:58 crc kubenswrapper[4743]: I1011 01:00:58.478465 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f424c028-a0f6-4327-90f7-338a9d21043c-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddtlkt\" (UID: \"f424c028-a0f6-4327-90f7-338a9d21043c\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddtlkt" Oct 11 01:00:58 crc kubenswrapper[4743]: I1011 01:00:58.579982 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ng27f\" (UniqueName: \"kubernetes.io/projected/f424c028-a0f6-4327-90f7-338a9d21043c-kube-api-access-ng27f\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddtlkt\" (UID: \"f424c028-a0f6-4327-90f7-338a9d21043c\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddtlkt" Oct 11 01:00:58 crc kubenswrapper[4743]: I1011 01:00:58.580097 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f424c028-a0f6-4327-90f7-338a9d21043c-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddtlkt\" (UID: \"f424c028-a0f6-4327-90f7-338a9d21043c\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddtlkt" Oct 11 01:00:58 crc kubenswrapper[4743]: I1011 01:00:58.580152 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f424c028-a0f6-4327-90f7-338a9d21043c-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddtlkt\" (UID: \"f424c028-a0f6-4327-90f7-338a9d21043c\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddtlkt" Oct 11 01:00:58 crc kubenswrapper[4743]: I1011 01:00:58.580917 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f424c028-a0f6-4327-90f7-338a9d21043c-bundle\") 
pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddtlkt\" (UID: \"f424c028-a0f6-4327-90f7-338a9d21043c\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddtlkt" Oct 11 01:00:58 crc kubenswrapper[4743]: I1011 01:00:58.581510 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f424c028-a0f6-4327-90f7-338a9d21043c-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddtlkt\" (UID: \"f424c028-a0f6-4327-90f7-338a9d21043c\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddtlkt" Oct 11 01:00:58 crc kubenswrapper[4743]: I1011 01:00:58.609516 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ng27f\" (UniqueName: \"kubernetes.io/projected/f424c028-a0f6-4327-90f7-338a9d21043c-kube-api-access-ng27f\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddtlkt\" (UID: \"f424c028-a0f6-4327-90f7-338a9d21043c\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddtlkt" Oct 11 01:00:58 crc kubenswrapper[4743]: I1011 01:00:58.673668 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddtlkt" Oct 11 01:00:58 crc kubenswrapper[4743]: I1011 01:00:58.914263 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddtlkt"] Oct 11 01:00:59 crc kubenswrapper[4743]: I1011 01:00:59.694243 4743 generic.go:334] "Generic (PLEG): container finished" podID="f424c028-a0f6-4327-90f7-338a9d21043c" containerID="44f6202f73a392631c2429947db147409c125e75a6c49221fc20e59298ba07d5" exitCode=0 Oct 11 01:00:59 crc kubenswrapper[4743]: I1011 01:00:59.694398 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddtlkt" event={"ID":"f424c028-a0f6-4327-90f7-338a9d21043c","Type":"ContainerDied","Data":"44f6202f73a392631c2429947db147409c125e75a6c49221fc20e59298ba07d5"} Oct 11 01:00:59 crc kubenswrapper[4743]: I1011 01:00:59.694530 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddtlkt" event={"ID":"f424c028-a0f6-4327-90f7-338a9d21043c","Type":"ContainerStarted","Data":"0627c2d388612b6248cc98920cba11475e8c67dad9c8b2578ba612fb34c1d9cc"} Oct 11 01:00:59 crc kubenswrapper[4743]: I1011 01:00:59.697458 4743 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 11 01:01:01 crc kubenswrapper[4743]: I1011 01:01:01.708143 4743 generic.go:334] "Generic (PLEG): container finished" podID="f424c028-a0f6-4327-90f7-338a9d21043c" containerID="65e06567e073b881f02a3178538f36c53bab95591a5262e73c0ca4056e50b81b" exitCode=0 Oct 11 01:01:01 crc kubenswrapper[4743]: I1011 01:01:01.708226 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddtlkt" 
event={"ID":"f424c028-a0f6-4327-90f7-338a9d21043c","Type":"ContainerDied","Data":"65e06567e073b881f02a3178538f36c53bab95591a5262e73c0ca4056e50b81b"} Oct 11 01:01:02 crc kubenswrapper[4743]: I1011 01:01:02.716753 4743 generic.go:334] "Generic (PLEG): container finished" podID="f424c028-a0f6-4327-90f7-338a9d21043c" containerID="941cfca3745a752c6c38dc1ecd3c7911ef8f19ec7d8905fbde479c9224038dac" exitCode=0 Oct 11 01:01:02 crc kubenswrapper[4743]: I1011 01:01:02.716805 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddtlkt" event={"ID":"f424c028-a0f6-4327-90f7-338a9d21043c","Type":"ContainerDied","Data":"941cfca3745a752c6c38dc1ecd3c7911ef8f19ec7d8905fbde479c9224038dac"} Oct 11 01:01:03 crc kubenswrapper[4743]: I1011 01:01:03.974044 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddtlkt" Oct 11 01:01:04 crc kubenswrapper[4743]: I1011 01:01:04.148360 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f424c028-a0f6-4327-90f7-338a9d21043c-bundle\") pod \"f424c028-a0f6-4327-90f7-338a9d21043c\" (UID: \"f424c028-a0f6-4327-90f7-338a9d21043c\") " Oct 11 01:01:04 crc kubenswrapper[4743]: I1011 01:01:04.148443 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f424c028-a0f6-4327-90f7-338a9d21043c-util\") pod \"f424c028-a0f6-4327-90f7-338a9d21043c\" (UID: \"f424c028-a0f6-4327-90f7-338a9d21043c\") " Oct 11 01:01:04 crc kubenswrapper[4743]: I1011 01:01:04.148537 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ng27f\" (UniqueName: \"kubernetes.io/projected/f424c028-a0f6-4327-90f7-338a9d21043c-kube-api-access-ng27f\") pod \"f424c028-a0f6-4327-90f7-338a9d21043c\" (UID: 
\"f424c028-a0f6-4327-90f7-338a9d21043c\") " Oct 11 01:01:04 crc kubenswrapper[4743]: I1011 01:01:04.151049 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f424c028-a0f6-4327-90f7-338a9d21043c-bundle" (OuterVolumeSpecName: "bundle") pod "f424c028-a0f6-4327-90f7-338a9d21043c" (UID: "f424c028-a0f6-4327-90f7-338a9d21043c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:01:04 crc kubenswrapper[4743]: I1011 01:01:04.158177 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f424c028-a0f6-4327-90f7-338a9d21043c-kube-api-access-ng27f" (OuterVolumeSpecName: "kube-api-access-ng27f") pod "f424c028-a0f6-4327-90f7-338a9d21043c" (UID: "f424c028-a0f6-4327-90f7-338a9d21043c"). InnerVolumeSpecName "kube-api-access-ng27f". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:01:04 crc kubenswrapper[4743]: I1011 01:01:04.186786 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f424c028-a0f6-4327-90f7-338a9d21043c-util" (OuterVolumeSpecName: "util") pod "f424c028-a0f6-4327-90f7-338a9d21043c" (UID: "f424c028-a0f6-4327-90f7-338a9d21043c"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:01:04 crc kubenswrapper[4743]: I1011 01:01:04.250226 4743 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f424c028-a0f6-4327-90f7-338a9d21043c-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 01:01:04 crc kubenswrapper[4743]: I1011 01:01:04.250390 4743 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f424c028-a0f6-4327-90f7-338a9d21043c-util\") on node \"crc\" DevicePath \"\"" Oct 11 01:01:04 crc kubenswrapper[4743]: I1011 01:01:04.250448 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ng27f\" (UniqueName: \"kubernetes.io/projected/f424c028-a0f6-4327-90f7-338a9d21043c-kube-api-access-ng27f\") on node \"crc\" DevicePath \"\"" Oct 11 01:01:04 crc kubenswrapper[4743]: I1011 01:01:04.730545 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddtlkt" event={"ID":"f424c028-a0f6-4327-90f7-338a9d21043c","Type":"ContainerDied","Data":"0627c2d388612b6248cc98920cba11475e8c67dad9c8b2578ba612fb34c1d9cc"} Oct 11 01:01:04 crc kubenswrapper[4743]: I1011 01:01:04.730590 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddtlkt" Oct 11 01:01:04 crc kubenswrapper[4743]: I1011 01:01:04.730611 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0627c2d388612b6248cc98920cba11475e8c67dad9c8b2578ba612fb34c1d9cc" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.028643 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-48ljj"] Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.029445 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" podUID="9ed16b35-862f-47f2-9e32-63c98f868fb8" containerName="ovn-controller" containerID="cri-o://d385e354cb831573c89886ade9127287f0974b7a7c244ead9e5547e0211fd9fe" gracePeriod=30 Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.029547 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" podUID="9ed16b35-862f-47f2-9e32-63c98f868fb8" containerName="northd" containerID="cri-o://acfe8e798b39769fe2643c6f761b388dc220ccfc707f0023dd680196514b8885" gracePeriod=30 Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.029585 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" podUID="9ed16b35-862f-47f2-9e32-63c98f868fb8" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://4127195cac8b12296c2aaefc500beb5f746bee518f3885c094e7bca83ade5e6b" gracePeriod=30 Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.029548 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" podUID="9ed16b35-862f-47f2-9e32-63c98f868fb8" containerName="nbdb" containerID="cri-o://487448bc7b1826df7dbc3aeac42ad10d6aed3390c27c95dfda1b81fe09e27f35" gracePeriod=30 Oct 11 01:01:10 crc 
kubenswrapper[4743]: I1011 01:01:10.029618 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" podUID="9ed16b35-862f-47f2-9e32-63c98f868fb8" containerName="kube-rbac-proxy-node" containerID="cri-o://d54b05ec17db8c8c6c3ff8e4d703de8f1f1f0d7e9b5b0133afdf30e3c2b8acee" gracePeriod=30 Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.029645 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" podUID="9ed16b35-862f-47f2-9e32-63c98f868fb8" containerName="ovn-acl-logging" containerID="cri-o://17db7cfee85ea1051eba348aa0ec7bc8e1354c49e3e4740745a51d3db0ffcf63" gracePeriod=30 Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.029703 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" podUID="9ed16b35-862f-47f2-9e32-63c98f868fb8" containerName="sbdb" containerID="cri-o://ecec5075382354c47a6ec8f53263fe6c244c7aadfeab1dda60dd2e88cc8724b0" gracePeriod=30 Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.069551 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" podUID="9ed16b35-862f-47f2-9e32-63c98f868fb8" containerName="ovnkube-controller" containerID="cri-o://4f3ab6c36ef4c34891a3cc561fdea33a7a62ede85f6d648d443467ab907ad8ae" gracePeriod=30 Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.357511 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-48ljj_9ed16b35-862f-47f2-9e32-63c98f868fb8/ovnkube-controller/3.log" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.359170 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-48ljj_9ed16b35-862f-47f2-9e32-63c98f868fb8/ovn-acl-logging/0.log" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.359545 4743 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-48ljj_9ed16b35-862f-47f2-9e32-63c98f868fb8/ovn-controller/0.log" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.359937 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.417979 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-host-run-netns\") pod \"9ed16b35-862f-47f2-9e32-63c98f868fb8\" (UID: \"9ed16b35-862f-47f2-9e32-63c98f868fb8\") " Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.418022 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9ed16b35-862f-47f2-9e32-63c98f868fb8-ovnkube-script-lib\") pod \"9ed16b35-862f-47f2-9e32-63c98f868fb8\" (UID: \"9ed16b35-862f-47f2-9e32-63c98f868fb8\") " Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.418041 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xw92h\" (UniqueName: \"kubernetes.io/projected/9ed16b35-862f-47f2-9e32-63c98f868fb8-kube-api-access-xw92h\") pod \"9ed16b35-862f-47f2-9e32-63c98f868fb8\" (UID: \"9ed16b35-862f-47f2-9e32-63c98f868fb8\") " Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.418066 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-node-log\") pod \"9ed16b35-862f-47f2-9e32-63c98f868fb8\" (UID: \"9ed16b35-862f-47f2-9e32-63c98f868fb8\") " Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.418067 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-host-run-netns" 
(OuterVolumeSpecName: "host-run-netns") pod "9ed16b35-862f-47f2-9e32-63c98f868fb8" (UID: "9ed16b35-862f-47f2-9e32-63c98f868fb8"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.418086 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-host-cni-bin\") pod \"9ed16b35-862f-47f2-9e32-63c98f868fb8\" (UID: \"9ed16b35-862f-47f2-9e32-63c98f868fb8\") " Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.418107 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-run-openvswitch\") pod \"9ed16b35-862f-47f2-9e32-63c98f868fb8\" (UID: \"9ed16b35-862f-47f2-9e32-63c98f868fb8\") " Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.418122 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-host-run-ovn-kubernetes\") pod \"9ed16b35-862f-47f2-9e32-63c98f868fb8\" (UID: \"9ed16b35-862f-47f2-9e32-63c98f868fb8\") " Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.418137 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-host-cni-netd\") pod \"9ed16b35-862f-47f2-9e32-63c98f868fb8\" (UID: \"9ed16b35-862f-47f2-9e32-63c98f868fb8\") " Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.418154 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-var-lib-openvswitch\") pod \"9ed16b35-862f-47f2-9e32-63c98f868fb8\" (UID: 
\"9ed16b35-862f-47f2-9e32-63c98f868fb8\") " Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.418163 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "9ed16b35-862f-47f2-9e32-63c98f868fb8" (UID: "9ed16b35-862f-47f2-9e32-63c98f868fb8"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.418177 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"9ed16b35-862f-47f2-9e32-63c98f868fb8\" (UID: \"9ed16b35-862f-47f2-9e32-63c98f868fb8\") " Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.418197 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-run-systemd\") pod \"9ed16b35-862f-47f2-9e32-63c98f868fb8\" (UID: \"9ed16b35-862f-47f2-9e32-63c98f868fb8\") " Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.418199 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "9ed16b35-862f-47f2-9e32-63c98f868fb8" (UID: "9ed16b35-862f-47f2-9e32-63c98f868fb8"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.418229 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "9ed16b35-862f-47f2-9e32-63c98f868fb8" (UID: "9ed16b35-862f-47f2-9e32-63c98f868fb8"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.418250 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "9ed16b35-862f-47f2-9e32-63c98f868fb8" (UID: "9ed16b35-862f-47f2-9e32-63c98f868fb8"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.418245 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-node-log" (OuterVolumeSpecName: "node-log") pod "9ed16b35-862f-47f2-9e32-63c98f868fb8" (UID: "9ed16b35-862f-47f2-9e32-63c98f868fb8"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.418260 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "9ed16b35-862f-47f2-9e32-63c98f868fb8" (UID: "9ed16b35-862f-47f2-9e32-63c98f868fb8"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.418292 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-log-socket\") pod \"9ed16b35-862f-47f2-9e32-63c98f868fb8\" (UID: \"9ed16b35-862f-47f2-9e32-63c98f868fb8\") " Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.418308 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-log-socket" (OuterVolumeSpecName: "log-socket") pod "9ed16b35-862f-47f2-9e32-63c98f868fb8" (UID: "9ed16b35-862f-47f2-9e32-63c98f868fb8"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.418371 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9ed16b35-862f-47f2-9e32-63c98f868fb8-ovnkube-config\") pod \"9ed16b35-862f-47f2-9e32-63c98f868fb8\" (UID: \"9ed16b35-862f-47f2-9e32-63c98f868fb8\") " Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.418497 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-host-slash\") pod \"9ed16b35-862f-47f2-9e32-63c98f868fb8\" (UID: \"9ed16b35-862f-47f2-9e32-63c98f868fb8\") " Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.418263 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "9ed16b35-862f-47f2-9e32-63c98f868fb8" (UID: "9ed16b35-862f-47f2-9e32-63c98f868fb8"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.418496 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ed16b35-862f-47f2-9e32-63c98f868fb8-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "9ed16b35-862f-47f2-9e32-63c98f868fb8" (UID: "9ed16b35-862f-47f2-9e32-63c98f868fb8"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.418528 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-host-kubelet\") pod \"9ed16b35-862f-47f2-9e32-63c98f868fb8\" (UID: \"9ed16b35-862f-47f2-9e32-63c98f868fb8\") " Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.418540 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-host-slash" (OuterVolumeSpecName: "host-slash") pod "9ed16b35-862f-47f2-9e32-63c98f868fb8" (UID: "9ed16b35-862f-47f2-9e32-63c98f868fb8"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.418552 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "9ed16b35-862f-47f2-9e32-63c98f868fb8" (UID: "9ed16b35-862f-47f2-9e32-63c98f868fb8"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.418555 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9ed16b35-862f-47f2-9e32-63c98f868fb8-env-overrides\") pod \"9ed16b35-862f-47f2-9e32-63c98f868fb8\" (UID: \"9ed16b35-862f-47f2-9e32-63c98f868fb8\") " Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.418631 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9ed16b35-862f-47f2-9e32-63c98f868fb8-ovn-node-metrics-cert\") pod \"9ed16b35-862f-47f2-9e32-63c98f868fb8\" (UID: \"9ed16b35-862f-47f2-9e32-63c98f868fb8\") " Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.418659 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-systemd-units\") pod \"9ed16b35-862f-47f2-9e32-63c98f868fb8\" (UID: \"9ed16b35-862f-47f2-9e32-63c98f868fb8\") " Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.418704 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-run-ovn\") pod \"9ed16b35-862f-47f2-9e32-63c98f868fb8\" (UID: \"9ed16b35-862f-47f2-9e32-63c98f868fb8\") " Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.418723 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-etc-openvswitch\") pod \"9ed16b35-862f-47f2-9e32-63c98f868fb8\" (UID: \"9ed16b35-862f-47f2-9e32-63c98f868fb8\") " Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.418808 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "9ed16b35-862f-47f2-9e32-63c98f868fb8" (UID: "9ed16b35-862f-47f2-9e32-63c98f868fb8"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.418834 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ed16b35-862f-47f2-9e32-63c98f868fb8-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "9ed16b35-862f-47f2-9e32-63c98f868fb8" (UID: "9ed16b35-862f-47f2-9e32-63c98f868fb8"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.418824 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "9ed16b35-862f-47f2-9e32-63c98f868fb8" (UID: "9ed16b35-862f-47f2-9e32-63c98f868fb8"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.418918 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "9ed16b35-862f-47f2-9e32-63c98f868fb8" (UID: "9ed16b35-862f-47f2-9e32-63c98f868fb8"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.419025 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ed16b35-862f-47f2-9e32-63c98f868fb8-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "9ed16b35-862f-47f2-9e32-63c98f868fb8" (UID: "9ed16b35-862f-47f2-9e32-63c98f868fb8"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.419114 4743 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.419126 4743 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.419135 4743 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-host-run-netns\") on node \"crc\" DevicePath \"\"" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.419143 4743 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9ed16b35-862f-47f2-9e32-63c98f868fb8-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.419150 4743 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-node-log\") on node \"crc\" DevicePath \"\"" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.419158 4743 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-host-cni-bin\") on node \"crc\" DevicePath \"\"" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.419165 4743 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-run-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.419173 4743 
reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.419180 4743 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-host-cni-netd\") on node \"crc\" DevicePath \"\"" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.419187 4743 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.419196 4743 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.419204 4743 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-log-socket\") on node \"crc\" DevicePath \"\"" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.419212 4743 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9ed16b35-862f-47f2-9e32-63c98f868fb8-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.419220 4743 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-host-slash\") on node \"crc\" DevicePath \"\"" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.419228 4743 reconciler_common.go:293] "Volume detached for volume 
\"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-host-kubelet\") on node \"crc\" DevicePath \"\"" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.419236 4743 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9ed16b35-862f-47f2-9e32-63c98f868fb8-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.419244 4743 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-systemd-units\") on node \"crc\" DevicePath \"\"" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.423611 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ed16b35-862f-47f2-9e32-63c98f868fb8-kube-api-access-xw92h" (OuterVolumeSpecName: "kube-api-access-xw92h") pod "9ed16b35-862f-47f2-9e32-63c98f868fb8" (UID: "9ed16b35-862f-47f2-9e32-63c98f868fb8"). InnerVolumeSpecName "kube-api-access-xw92h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.429360 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ed16b35-862f-47f2-9e32-63c98f868fb8-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "9ed16b35-862f-47f2-9e32-63c98f868fb8" (UID: "9ed16b35-862f-47f2-9e32-63c98f868fb8"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.436353 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "9ed16b35-862f-47f2-9e32-63c98f868fb8" (UID: "9ed16b35-862f-47f2-9e32-63c98f868fb8"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.442318 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-9qrj4"] Oct 11 01:01:10 crc kubenswrapper[4743]: E1011 01:01:10.442497 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ed16b35-862f-47f2-9e32-63c98f868fb8" containerName="ovn-controller" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.442512 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ed16b35-862f-47f2-9e32-63c98f868fb8" containerName="ovn-controller" Oct 11 01:01:10 crc kubenswrapper[4743]: E1011 01:01:10.442521 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ed16b35-862f-47f2-9e32-63c98f868fb8" containerName="northd" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.442527 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ed16b35-862f-47f2-9e32-63c98f868fb8" containerName="northd" Oct 11 01:01:10 crc kubenswrapper[4743]: E1011 01:01:10.442534 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f424c028-a0f6-4327-90f7-338a9d21043c" containerName="util" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.442539 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f424c028-a0f6-4327-90f7-338a9d21043c" containerName="util" Oct 11 01:01:10 crc kubenswrapper[4743]: E1011 01:01:10.442547 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ed16b35-862f-47f2-9e32-63c98f868fb8" containerName="kubecfg-setup" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.442552 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ed16b35-862f-47f2-9e32-63c98f868fb8" containerName="kubecfg-setup" Oct 11 01:01:10 crc kubenswrapper[4743]: E1011 01:01:10.442558 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ed16b35-862f-47f2-9e32-63c98f868fb8" containerName="ovnkube-controller" Oct 11 01:01:10 crc 
kubenswrapper[4743]: I1011 01:01:10.442564 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ed16b35-862f-47f2-9e32-63c98f868fb8" containerName="ovnkube-controller" Oct 11 01:01:10 crc kubenswrapper[4743]: E1011 01:01:10.442652 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ed16b35-862f-47f2-9e32-63c98f868fb8" containerName="kube-rbac-proxy-ovn-metrics" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.442663 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ed16b35-862f-47f2-9e32-63c98f868fb8" containerName="kube-rbac-proxy-ovn-metrics" Oct 11 01:01:10 crc kubenswrapper[4743]: E1011 01:01:10.442672 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ed16b35-862f-47f2-9e32-63c98f868fb8" containerName="ovnkube-controller" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.442680 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ed16b35-862f-47f2-9e32-63c98f868fb8" containerName="ovnkube-controller" Oct 11 01:01:10 crc kubenswrapper[4743]: E1011 01:01:10.442692 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ed16b35-862f-47f2-9e32-63c98f868fb8" containerName="sbdb" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.442699 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ed16b35-862f-47f2-9e32-63c98f868fb8" containerName="sbdb" Oct 11 01:01:10 crc kubenswrapper[4743]: E1011 01:01:10.442707 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f424c028-a0f6-4327-90f7-338a9d21043c" containerName="pull" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.442712 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f424c028-a0f6-4327-90f7-338a9d21043c" containerName="pull" Oct 11 01:01:10 crc kubenswrapper[4743]: E1011 01:01:10.442720 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f424c028-a0f6-4327-90f7-338a9d21043c" containerName="extract" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 
01:01:10.442726 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f424c028-a0f6-4327-90f7-338a9d21043c" containerName="extract" Oct 11 01:01:10 crc kubenswrapper[4743]: E1011 01:01:10.442739 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ed16b35-862f-47f2-9e32-63c98f868fb8" containerName="ovn-acl-logging" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.442745 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ed16b35-862f-47f2-9e32-63c98f868fb8" containerName="ovn-acl-logging" Oct 11 01:01:10 crc kubenswrapper[4743]: E1011 01:01:10.442751 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ed16b35-862f-47f2-9e32-63c98f868fb8" containerName="kube-rbac-proxy-node" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.442757 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ed16b35-862f-47f2-9e32-63c98f868fb8" containerName="kube-rbac-proxy-node" Oct 11 01:01:10 crc kubenswrapper[4743]: E1011 01:01:10.442763 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ed16b35-862f-47f2-9e32-63c98f868fb8" containerName="ovnkube-controller" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.442768 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ed16b35-862f-47f2-9e32-63c98f868fb8" containerName="ovnkube-controller" Oct 11 01:01:10 crc kubenswrapper[4743]: E1011 01:01:10.442775 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ed16b35-862f-47f2-9e32-63c98f868fb8" containerName="nbdb" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.442780 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ed16b35-862f-47f2-9e32-63c98f868fb8" containerName="nbdb" Oct 11 01:01:10 crc kubenswrapper[4743]: E1011 01:01:10.442789 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ed16b35-862f-47f2-9e32-63c98f868fb8" containerName="ovnkube-controller" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.442795 4743 
state_mem.go:107] "Deleted CPUSet assignment" podUID="9ed16b35-862f-47f2-9e32-63c98f868fb8" containerName="ovnkube-controller" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.442980 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ed16b35-862f-47f2-9e32-63c98f868fb8" containerName="kube-rbac-proxy-node" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.442992 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ed16b35-862f-47f2-9e32-63c98f868fb8" containerName="kube-rbac-proxy-ovn-metrics" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.443000 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ed16b35-862f-47f2-9e32-63c98f868fb8" containerName="ovnkube-controller" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.443006 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ed16b35-862f-47f2-9e32-63c98f868fb8" containerName="northd" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.443013 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ed16b35-862f-47f2-9e32-63c98f868fb8" containerName="ovnkube-controller" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.443020 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="f424c028-a0f6-4327-90f7-338a9d21043c" containerName="extract" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.443027 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ed16b35-862f-47f2-9e32-63c98f868fb8" containerName="sbdb" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.443035 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ed16b35-862f-47f2-9e32-63c98f868fb8" containerName="ovn-controller" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.443043 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ed16b35-862f-47f2-9e32-63c98f868fb8" containerName="ovnkube-controller" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 
01:01:10.443049 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ed16b35-862f-47f2-9e32-63c98f868fb8" containerName="ovnkube-controller" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.443057 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ed16b35-862f-47f2-9e32-63c98f868fb8" containerName="nbdb" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.443063 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ed16b35-862f-47f2-9e32-63c98f868fb8" containerName="ovn-acl-logging" Oct 11 01:01:10 crc kubenswrapper[4743]: E1011 01:01:10.443145 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ed16b35-862f-47f2-9e32-63c98f868fb8" containerName="ovnkube-controller" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.443152 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ed16b35-862f-47f2-9e32-63c98f868fb8" containerName="ovnkube-controller" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.443226 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ed16b35-862f-47f2-9e32-63c98f868fb8" containerName="ovnkube-controller" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.444579 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9qrj4" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.520323 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/76209f67-b445-40c1-bdb9-7b82c1b18860-host-cni-bin\") pod \"ovnkube-node-9qrj4\" (UID: \"76209f67-b445-40c1-bdb9-7b82c1b18860\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrj4" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.520363 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/76209f67-b445-40c1-bdb9-7b82c1b18860-run-ovn\") pod \"ovnkube-node-9qrj4\" (UID: \"76209f67-b445-40c1-bdb9-7b82c1b18860\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrj4" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.520396 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/76209f67-b445-40c1-bdb9-7b82c1b18860-host-cni-netd\") pod \"ovnkube-node-9qrj4\" (UID: \"76209f67-b445-40c1-bdb9-7b82c1b18860\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrj4" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.520413 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/76209f67-b445-40c1-bdb9-7b82c1b18860-var-lib-openvswitch\") pod \"ovnkube-node-9qrj4\" (UID: \"76209f67-b445-40c1-bdb9-7b82c1b18860\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrj4" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.520440 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/76209f67-b445-40c1-bdb9-7b82c1b18860-ovn-node-metrics-cert\") pod \"ovnkube-node-9qrj4\" (UID: 
\"76209f67-b445-40c1-bdb9-7b82c1b18860\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrj4" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.520458 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/76209f67-b445-40c1-bdb9-7b82c1b18860-log-socket\") pod \"ovnkube-node-9qrj4\" (UID: \"76209f67-b445-40c1-bdb9-7b82c1b18860\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrj4" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.520476 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/76209f67-b445-40c1-bdb9-7b82c1b18860-env-overrides\") pod \"ovnkube-node-9qrj4\" (UID: \"76209f67-b445-40c1-bdb9-7b82c1b18860\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrj4" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.520490 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/76209f67-b445-40c1-bdb9-7b82c1b18860-ovnkube-config\") pod \"ovnkube-node-9qrj4\" (UID: \"76209f67-b445-40c1-bdb9-7b82c1b18860\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrj4" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.520505 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpnft\" (UniqueName: \"kubernetes.io/projected/76209f67-b445-40c1-bdb9-7b82c1b18860-kube-api-access-tpnft\") pod \"ovnkube-node-9qrj4\" (UID: \"76209f67-b445-40c1-bdb9-7b82c1b18860\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrj4" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.520522 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/76209f67-b445-40c1-bdb9-7b82c1b18860-etc-openvswitch\") pod 
\"ovnkube-node-9qrj4\" (UID: \"76209f67-b445-40c1-bdb9-7b82c1b18860\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrj4" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.520536 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/76209f67-b445-40c1-bdb9-7b82c1b18860-host-slash\") pod \"ovnkube-node-9qrj4\" (UID: \"76209f67-b445-40c1-bdb9-7b82c1b18860\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrj4" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.520553 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/76209f67-b445-40c1-bdb9-7b82c1b18860-host-run-ovn-kubernetes\") pod \"ovnkube-node-9qrj4\" (UID: \"76209f67-b445-40c1-bdb9-7b82c1b18860\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrj4" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.520566 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/76209f67-b445-40c1-bdb9-7b82c1b18860-host-run-netns\") pod \"ovnkube-node-9qrj4\" (UID: \"76209f67-b445-40c1-bdb9-7b82c1b18860\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrj4" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.520581 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/76209f67-b445-40c1-bdb9-7b82c1b18860-ovnkube-script-lib\") pod \"ovnkube-node-9qrj4\" (UID: \"76209f67-b445-40c1-bdb9-7b82c1b18860\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrj4" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.520595 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/76209f67-b445-40c1-bdb9-7b82c1b18860-node-log\") pod \"ovnkube-node-9qrj4\" (UID: \"76209f67-b445-40c1-bdb9-7b82c1b18860\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrj4" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.520614 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/76209f67-b445-40c1-bdb9-7b82c1b18860-run-systemd\") pod \"ovnkube-node-9qrj4\" (UID: \"76209f67-b445-40c1-bdb9-7b82c1b18860\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrj4" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.520629 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/76209f67-b445-40c1-bdb9-7b82c1b18860-systemd-units\") pod \"ovnkube-node-9qrj4\" (UID: \"76209f67-b445-40c1-bdb9-7b82c1b18860\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrj4" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.520645 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/76209f67-b445-40c1-bdb9-7b82c1b18860-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9qrj4\" (UID: \"76209f67-b445-40c1-bdb9-7b82c1b18860\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrj4" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.520667 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/76209f67-b445-40c1-bdb9-7b82c1b18860-run-openvswitch\") pod \"ovnkube-node-9qrj4\" (UID: \"76209f67-b445-40c1-bdb9-7b82c1b18860\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrj4" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.520683 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/76209f67-b445-40c1-bdb9-7b82c1b18860-host-kubelet\") pod \"ovnkube-node-9qrj4\" (UID: \"76209f67-b445-40c1-bdb9-7b82c1b18860\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrj4" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.520711 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xw92h\" (UniqueName: \"kubernetes.io/projected/9ed16b35-862f-47f2-9e32-63c98f868fb8-kube-api-access-xw92h\") on node \"crc\" DevicePath \"\"" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.520720 4743 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9ed16b35-862f-47f2-9e32-63c98f868fb8-run-systemd\") on node \"crc\" DevicePath \"\"" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.520729 4743 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9ed16b35-862f-47f2-9e32-63c98f868fb8-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.621783 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/76209f67-b445-40c1-bdb9-7b82c1b18860-ovn-node-metrics-cert\") pod \"ovnkube-node-9qrj4\" (UID: \"76209f67-b445-40c1-bdb9-7b82c1b18860\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrj4" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.621873 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/76209f67-b445-40c1-bdb9-7b82c1b18860-log-socket\") pod \"ovnkube-node-9qrj4\" (UID: \"76209f67-b445-40c1-bdb9-7b82c1b18860\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrj4" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.621904 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/76209f67-b445-40c1-bdb9-7b82c1b18860-env-overrides\") pod \"ovnkube-node-9qrj4\" (UID: \"76209f67-b445-40c1-bdb9-7b82c1b18860\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrj4" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.621925 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/76209f67-b445-40c1-bdb9-7b82c1b18860-ovnkube-config\") pod \"ovnkube-node-9qrj4\" (UID: \"76209f67-b445-40c1-bdb9-7b82c1b18860\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrj4" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.622307 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpnft\" (UniqueName: \"kubernetes.io/projected/76209f67-b445-40c1-bdb9-7b82c1b18860-kube-api-access-tpnft\") pod \"ovnkube-node-9qrj4\" (UID: \"76209f67-b445-40c1-bdb9-7b82c1b18860\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrj4" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.622351 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/76209f67-b445-40c1-bdb9-7b82c1b18860-etc-openvswitch\") pod \"ovnkube-node-9qrj4\" (UID: \"76209f67-b445-40c1-bdb9-7b82c1b18860\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrj4" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.622357 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/76209f67-b445-40c1-bdb9-7b82c1b18860-log-socket\") pod \"ovnkube-node-9qrj4\" (UID: \"76209f67-b445-40c1-bdb9-7b82c1b18860\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrj4" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.622384 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/76209f67-b445-40c1-bdb9-7b82c1b18860-host-slash\") pod \"ovnkube-node-9qrj4\" (UID: \"76209f67-b445-40c1-bdb9-7b82c1b18860\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrj4" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.622442 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/76209f67-b445-40c1-bdb9-7b82c1b18860-host-slash\") pod \"ovnkube-node-9qrj4\" (UID: \"76209f67-b445-40c1-bdb9-7b82c1b18860\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrj4" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.622446 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/76209f67-b445-40c1-bdb9-7b82c1b18860-host-run-ovn-kubernetes\") pod \"ovnkube-node-9qrj4\" (UID: \"76209f67-b445-40c1-bdb9-7b82c1b18860\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrj4" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.622513 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/76209f67-b445-40c1-bdb9-7b82c1b18860-host-run-netns\") pod \"ovnkube-node-9qrj4\" (UID: \"76209f67-b445-40c1-bdb9-7b82c1b18860\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrj4" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.622517 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/76209f67-b445-40c1-bdb9-7b82c1b18860-host-run-ovn-kubernetes\") pod \"ovnkube-node-9qrj4\" (UID: \"76209f67-b445-40c1-bdb9-7b82c1b18860\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrj4" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.622576 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/76209f67-b445-40c1-bdb9-7b82c1b18860-host-run-netns\") pod \"ovnkube-node-9qrj4\" (UID: \"76209f67-b445-40c1-bdb9-7b82c1b18860\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrj4" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.622582 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/76209f67-b445-40c1-bdb9-7b82c1b18860-etc-openvswitch\") pod \"ovnkube-node-9qrj4\" (UID: \"76209f67-b445-40c1-bdb9-7b82c1b18860\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrj4" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.622704 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/76209f67-b445-40c1-bdb9-7b82c1b18860-ovnkube-script-lib\") pod \"ovnkube-node-9qrj4\" (UID: \"76209f67-b445-40c1-bdb9-7b82c1b18860\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrj4" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.622766 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/76209f67-b445-40c1-bdb9-7b82c1b18860-node-log\") pod \"ovnkube-node-9qrj4\" (UID: \"76209f67-b445-40c1-bdb9-7b82c1b18860\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrj4" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.622810 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/76209f67-b445-40c1-bdb9-7b82c1b18860-run-systemd\") pod \"ovnkube-node-9qrj4\" (UID: \"76209f67-b445-40c1-bdb9-7b82c1b18860\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrj4" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.622839 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/76209f67-b445-40c1-bdb9-7b82c1b18860-systemd-units\") pod 
\"ovnkube-node-9qrj4\" (UID: \"76209f67-b445-40c1-bdb9-7b82c1b18860\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrj4" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.622902 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/76209f67-b445-40c1-bdb9-7b82c1b18860-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9qrj4\" (UID: \"76209f67-b445-40c1-bdb9-7b82c1b18860\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrj4" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.622939 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/76209f67-b445-40c1-bdb9-7b82c1b18860-node-log\") pod \"ovnkube-node-9qrj4\" (UID: \"76209f67-b445-40c1-bdb9-7b82c1b18860\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrj4" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.622956 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/76209f67-b445-40c1-bdb9-7b82c1b18860-run-openvswitch\") pod \"ovnkube-node-9qrj4\" (UID: \"76209f67-b445-40c1-bdb9-7b82c1b18860\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrj4" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.622987 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/76209f67-b445-40c1-bdb9-7b82c1b18860-run-openvswitch\") pod \"ovnkube-node-9qrj4\" (UID: \"76209f67-b445-40c1-bdb9-7b82c1b18860\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrj4" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.623035 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/76209f67-b445-40c1-bdb9-7b82c1b18860-systemd-units\") pod \"ovnkube-node-9qrj4\" (UID: 
\"76209f67-b445-40c1-bdb9-7b82c1b18860\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrj4" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.623035 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/76209f67-b445-40c1-bdb9-7b82c1b18860-run-systemd\") pod \"ovnkube-node-9qrj4\" (UID: \"76209f67-b445-40c1-bdb9-7b82c1b18860\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrj4" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.623087 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/76209f67-b445-40c1-bdb9-7b82c1b18860-host-kubelet\") pod \"ovnkube-node-9qrj4\" (UID: \"76209f67-b445-40c1-bdb9-7b82c1b18860\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrj4" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.623088 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/76209f67-b445-40c1-bdb9-7b82c1b18860-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9qrj4\" (UID: \"76209f67-b445-40c1-bdb9-7b82c1b18860\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrj4" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.623160 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/76209f67-b445-40c1-bdb9-7b82c1b18860-host-cni-bin\") pod \"ovnkube-node-9qrj4\" (UID: \"76209f67-b445-40c1-bdb9-7b82c1b18860\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrj4" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.623189 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/76209f67-b445-40c1-bdb9-7b82c1b18860-host-kubelet\") pod \"ovnkube-node-9qrj4\" (UID: \"76209f67-b445-40c1-bdb9-7b82c1b18860\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-9qrj4" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.623217 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/76209f67-b445-40c1-bdb9-7b82c1b18860-run-ovn\") pod \"ovnkube-node-9qrj4\" (UID: \"76209f67-b445-40c1-bdb9-7b82c1b18860\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrj4" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.623217 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/76209f67-b445-40c1-bdb9-7b82c1b18860-ovnkube-config\") pod \"ovnkube-node-9qrj4\" (UID: \"76209f67-b445-40c1-bdb9-7b82c1b18860\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrj4" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.623234 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/76209f67-b445-40c1-bdb9-7b82c1b18860-host-cni-bin\") pod \"ovnkube-node-9qrj4\" (UID: \"76209f67-b445-40c1-bdb9-7b82c1b18860\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrj4" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.623309 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/76209f67-b445-40c1-bdb9-7b82c1b18860-run-ovn\") pod \"ovnkube-node-9qrj4\" (UID: \"76209f67-b445-40c1-bdb9-7b82c1b18860\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrj4" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.623331 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/76209f67-b445-40c1-bdb9-7b82c1b18860-host-cni-netd\") pod \"ovnkube-node-9qrj4\" (UID: \"76209f67-b445-40c1-bdb9-7b82c1b18860\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrj4" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.623383 4743 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/76209f67-b445-40c1-bdb9-7b82c1b18860-var-lib-openvswitch\") pod \"ovnkube-node-9qrj4\" (UID: \"76209f67-b445-40c1-bdb9-7b82c1b18860\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrj4" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.623427 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/76209f67-b445-40c1-bdb9-7b82c1b18860-host-cni-netd\") pod \"ovnkube-node-9qrj4\" (UID: \"76209f67-b445-40c1-bdb9-7b82c1b18860\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrj4" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.623508 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/76209f67-b445-40c1-bdb9-7b82c1b18860-ovnkube-script-lib\") pod \"ovnkube-node-9qrj4\" (UID: \"76209f67-b445-40c1-bdb9-7b82c1b18860\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrj4" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.623737 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/76209f67-b445-40c1-bdb9-7b82c1b18860-var-lib-openvswitch\") pod \"ovnkube-node-9qrj4\" (UID: \"76209f67-b445-40c1-bdb9-7b82c1b18860\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrj4" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.625329 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/76209f67-b445-40c1-bdb9-7b82c1b18860-ovn-node-metrics-cert\") pod \"ovnkube-node-9qrj4\" (UID: \"76209f67-b445-40c1-bdb9-7b82c1b18860\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrj4" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.630537 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"env-overrides\" (UniqueName: \"kubernetes.io/configmap/76209f67-b445-40c1-bdb9-7b82c1b18860-env-overrides\") pod \"ovnkube-node-9qrj4\" (UID: \"76209f67-b445-40c1-bdb9-7b82c1b18860\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrj4" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.645122 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpnft\" (UniqueName: \"kubernetes.io/projected/76209f67-b445-40c1-bdb9-7b82c1b18860-kube-api-access-tpnft\") pod \"ovnkube-node-9qrj4\" (UID: \"76209f67-b445-40c1-bdb9-7b82c1b18860\") " pod="openshift-ovn-kubernetes/ovnkube-node-9qrj4" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.756678 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9jfxn_e8c603f4-717c-4554-992a-8338b3bef24d/kube-multus/2.log" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.757142 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9qrj4" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.757222 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9jfxn_e8c603f4-717c-4554-992a-8338b3bef24d/kube-multus/1.log" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.757267 4743 generic.go:334] "Generic (PLEG): container finished" podID="e8c603f4-717c-4554-992a-8338b3bef24d" containerID="bdc42fd21a8b6982fc5516915cecbd0521737b5b4fd27556f887dbf66219ef33" exitCode=2 Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.757318 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9jfxn" event={"ID":"e8c603f4-717c-4554-992a-8338b3bef24d","Type":"ContainerDied","Data":"bdc42fd21a8b6982fc5516915cecbd0521737b5b4fd27556f887dbf66219ef33"} Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.757351 4743 scope.go:117] "RemoveContainer" containerID="21793d4ae38fc6e714912dbedbba8c45e37482a03cbd76d10461c41851e16896" Oct 11 01:01:10 
crc kubenswrapper[4743]: I1011 01:01:10.757835 4743 scope.go:117] "RemoveContainer" containerID="bdc42fd21a8b6982fc5516915cecbd0521737b5b4fd27556f887dbf66219ef33" Oct 11 01:01:10 crc kubenswrapper[4743]: E1011 01:01:10.758042 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-9jfxn_openshift-multus(e8c603f4-717c-4554-992a-8338b3bef24d)\"" pod="openshift-multus/multus-9jfxn" podUID="e8c603f4-717c-4554-992a-8338b3bef24d" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.769141 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-48ljj_9ed16b35-862f-47f2-9e32-63c98f868fb8/ovnkube-controller/3.log" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.770788 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-48ljj_9ed16b35-862f-47f2-9e32-63c98f868fb8/ovn-acl-logging/0.log" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.771185 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-48ljj_9ed16b35-862f-47f2-9e32-63c98f868fb8/ovn-controller/0.log" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.771471 4743 generic.go:334] "Generic (PLEG): container finished" podID="9ed16b35-862f-47f2-9e32-63c98f868fb8" containerID="4f3ab6c36ef4c34891a3cc561fdea33a7a62ede85f6d648d443467ab907ad8ae" exitCode=0 Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.771496 4743 generic.go:334] "Generic (PLEG): container finished" podID="9ed16b35-862f-47f2-9e32-63c98f868fb8" containerID="ecec5075382354c47a6ec8f53263fe6c244c7aadfeab1dda60dd2e88cc8724b0" exitCode=0 Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.771505 4743 generic.go:334] "Generic (PLEG): container finished" podID="9ed16b35-862f-47f2-9e32-63c98f868fb8" 
containerID="487448bc7b1826df7dbc3aeac42ad10d6aed3390c27c95dfda1b81fe09e27f35" exitCode=0 Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.771512 4743 generic.go:334] "Generic (PLEG): container finished" podID="9ed16b35-862f-47f2-9e32-63c98f868fb8" containerID="acfe8e798b39769fe2643c6f761b388dc220ccfc707f0023dd680196514b8885" exitCode=0 Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.771519 4743 generic.go:334] "Generic (PLEG): container finished" podID="9ed16b35-862f-47f2-9e32-63c98f868fb8" containerID="4127195cac8b12296c2aaefc500beb5f746bee518f3885c094e7bca83ade5e6b" exitCode=0 Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.771526 4743 generic.go:334] "Generic (PLEG): container finished" podID="9ed16b35-862f-47f2-9e32-63c98f868fb8" containerID="d54b05ec17db8c8c6c3ff8e4d703de8f1f1f0d7e9b5b0133afdf30e3c2b8acee" exitCode=0 Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.771534 4743 generic.go:334] "Generic (PLEG): container finished" podID="9ed16b35-862f-47f2-9e32-63c98f868fb8" containerID="17db7cfee85ea1051eba348aa0ec7bc8e1354c49e3e4740745a51d3db0ffcf63" exitCode=143 Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.771542 4743 generic.go:334] "Generic (PLEG): container finished" podID="9ed16b35-862f-47f2-9e32-63c98f868fb8" containerID="d385e354cb831573c89886ade9127287f0974b7a7c244ead9e5547e0211fd9fe" exitCode=143 Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.771562 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" event={"ID":"9ed16b35-862f-47f2-9e32-63c98f868fb8","Type":"ContainerDied","Data":"4f3ab6c36ef4c34891a3cc561fdea33a7a62ede85f6d648d443467ab907ad8ae"} Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.771587 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" 
event={"ID":"9ed16b35-862f-47f2-9e32-63c98f868fb8","Type":"ContainerDied","Data":"ecec5075382354c47a6ec8f53263fe6c244c7aadfeab1dda60dd2e88cc8724b0"} Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.771599 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" event={"ID":"9ed16b35-862f-47f2-9e32-63c98f868fb8","Type":"ContainerDied","Data":"487448bc7b1826df7dbc3aeac42ad10d6aed3390c27c95dfda1b81fe09e27f35"} Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.771610 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" event={"ID":"9ed16b35-862f-47f2-9e32-63c98f868fb8","Type":"ContainerDied","Data":"acfe8e798b39769fe2643c6f761b388dc220ccfc707f0023dd680196514b8885"} Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.771620 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" event={"ID":"9ed16b35-862f-47f2-9e32-63c98f868fb8","Type":"ContainerDied","Data":"4127195cac8b12296c2aaefc500beb5f746bee518f3885c094e7bca83ade5e6b"} Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.771630 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" event={"ID":"9ed16b35-862f-47f2-9e32-63c98f868fb8","Type":"ContainerDied","Data":"d54b05ec17db8c8c6c3ff8e4d703de8f1f1f0d7e9b5b0133afdf30e3c2b8acee"} Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.771641 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4f3ab6c36ef4c34891a3cc561fdea33a7a62ede85f6d648d443467ab907ad8ae"} Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.771650 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"268a98616dcd60ad334eba93f4b3443073f8a901429f6b98e7f89c2740c81da6"} Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.771564 4743 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.771656 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ecec5075382354c47a6ec8f53263fe6c244c7aadfeab1dda60dd2e88cc8724b0"} Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.771796 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"487448bc7b1826df7dbc3aeac42ad10d6aed3390c27c95dfda1b81fe09e27f35"} Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.771818 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"acfe8e798b39769fe2643c6f761b388dc220ccfc707f0023dd680196514b8885"} Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.771826 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4127195cac8b12296c2aaefc500beb5f746bee518f3885c094e7bca83ade5e6b"} Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.771833 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d54b05ec17db8c8c6c3ff8e4d703de8f1f1f0d7e9b5b0133afdf30e3c2b8acee"} Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.771839 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"17db7cfee85ea1051eba348aa0ec7bc8e1354c49e3e4740745a51d3db0ffcf63"} Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.771847 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d385e354cb831573c89886ade9127287f0974b7a7c244ead9e5547e0211fd9fe"} Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.771913 4743 pod_container_deletor.go:114] "Failed to issue the request to remove 
container" containerID={"Type":"cri-o","ID":"f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736"} Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.771939 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" event={"ID":"9ed16b35-862f-47f2-9e32-63c98f868fb8","Type":"ContainerDied","Data":"17db7cfee85ea1051eba348aa0ec7bc8e1354c49e3e4740745a51d3db0ffcf63"} Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.771964 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4f3ab6c36ef4c34891a3cc561fdea33a7a62ede85f6d648d443467ab907ad8ae"} Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.771975 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"268a98616dcd60ad334eba93f4b3443073f8a901429f6b98e7f89c2740c81da6"} Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.771983 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ecec5075382354c47a6ec8f53263fe6c244c7aadfeab1dda60dd2e88cc8724b0"} Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.771990 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"487448bc7b1826df7dbc3aeac42ad10d6aed3390c27c95dfda1b81fe09e27f35"} Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.771999 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"acfe8e798b39769fe2643c6f761b388dc220ccfc707f0023dd680196514b8885"} Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.772007 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4127195cac8b12296c2aaefc500beb5f746bee518f3885c094e7bca83ade5e6b"} Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.772013 4743 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d54b05ec17db8c8c6c3ff8e4d703de8f1f1f0d7e9b5b0133afdf30e3c2b8acee"} Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.772020 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"17db7cfee85ea1051eba348aa0ec7bc8e1354c49e3e4740745a51d3db0ffcf63"} Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.772027 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d385e354cb831573c89886ade9127287f0974b7a7c244ead9e5547e0211fd9fe"} Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.772053 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736"} Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.772063 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" event={"ID":"9ed16b35-862f-47f2-9e32-63c98f868fb8","Type":"ContainerDied","Data":"d385e354cb831573c89886ade9127287f0974b7a7c244ead9e5547e0211fd9fe"} Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.772076 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4f3ab6c36ef4c34891a3cc561fdea33a7a62ede85f6d648d443467ab907ad8ae"} Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.772086 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"268a98616dcd60ad334eba93f4b3443073f8a901429f6b98e7f89c2740c81da6"} Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.772093 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ecec5075382354c47a6ec8f53263fe6c244c7aadfeab1dda60dd2e88cc8724b0"} Oct 11 
01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.772101 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"487448bc7b1826df7dbc3aeac42ad10d6aed3390c27c95dfda1b81fe09e27f35"} Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.772108 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"acfe8e798b39769fe2643c6f761b388dc220ccfc707f0023dd680196514b8885"} Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.772114 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4127195cac8b12296c2aaefc500beb5f746bee518f3885c094e7bca83ade5e6b"} Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.772121 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d54b05ec17db8c8c6c3ff8e4d703de8f1f1f0d7e9b5b0133afdf30e3c2b8acee"} Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.772128 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"17db7cfee85ea1051eba348aa0ec7bc8e1354c49e3e4740745a51d3db0ffcf63"} Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.772134 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d385e354cb831573c89886ade9127287f0974b7a7c244ead9e5547e0211fd9fe"} Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.772141 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736"} Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.772151 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48ljj" 
event={"ID":"9ed16b35-862f-47f2-9e32-63c98f868fb8","Type":"ContainerDied","Data":"2f5badd1be3c857cf217ca9129b12849439ab560d2d8295c9af0ad9dfafc557f"} Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.772163 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4f3ab6c36ef4c34891a3cc561fdea33a7a62ede85f6d648d443467ab907ad8ae"} Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.772171 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"268a98616dcd60ad334eba93f4b3443073f8a901429f6b98e7f89c2740c81da6"} Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.772178 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ecec5075382354c47a6ec8f53263fe6c244c7aadfeab1dda60dd2e88cc8724b0"} Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.772185 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"487448bc7b1826df7dbc3aeac42ad10d6aed3390c27c95dfda1b81fe09e27f35"} Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.772192 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"acfe8e798b39769fe2643c6f761b388dc220ccfc707f0023dd680196514b8885"} Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.772198 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4127195cac8b12296c2aaefc500beb5f746bee518f3885c094e7bca83ade5e6b"} Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.772205 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d54b05ec17db8c8c6c3ff8e4d703de8f1f1f0d7e9b5b0133afdf30e3c2b8acee"} Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.772214 4743 pod_container_deletor.go:114] "Failed 
to issue the request to remove container" containerID={"Type":"cri-o","ID":"17db7cfee85ea1051eba348aa0ec7bc8e1354c49e3e4740745a51d3db0ffcf63"} Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.772221 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d385e354cb831573c89886ade9127287f0974b7a7c244ead9e5547e0211fd9fe"} Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.772228 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736"} Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.805285 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-48ljj"] Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.810505 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-48ljj"] Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.830849 4743 scope.go:117] "RemoveContainer" containerID="4f3ab6c36ef4c34891a3cc561fdea33a7a62ede85f6d648d443467ab907ad8ae" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.853922 4743 scope.go:117] "RemoveContainer" containerID="268a98616dcd60ad334eba93f4b3443073f8a901429f6b98e7f89c2740c81da6" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.871198 4743 scope.go:117] "RemoveContainer" containerID="ecec5075382354c47a6ec8f53263fe6c244c7aadfeab1dda60dd2e88cc8724b0" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.886642 4743 scope.go:117] "RemoveContainer" containerID="487448bc7b1826df7dbc3aeac42ad10d6aed3390c27c95dfda1b81fe09e27f35" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.902303 4743 scope.go:117] "RemoveContainer" containerID="acfe8e798b39769fe2643c6f761b388dc220ccfc707f0023dd680196514b8885" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.919347 4743 scope.go:117] "RemoveContainer" 
containerID="4127195cac8b12296c2aaefc500beb5f746bee518f3885c094e7bca83ade5e6b" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.964085 4743 scope.go:117] "RemoveContainer" containerID="d54b05ec17db8c8c6c3ff8e4d703de8f1f1f0d7e9b5b0133afdf30e3c2b8acee" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.976050 4743 scope.go:117] "RemoveContainer" containerID="17db7cfee85ea1051eba348aa0ec7bc8e1354c49e3e4740745a51d3db0ffcf63" Oct 11 01:01:10 crc kubenswrapper[4743]: I1011 01:01:10.989100 4743 scope.go:117] "RemoveContainer" containerID="d385e354cb831573c89886ade9127287f0974b7a7c244ead9e5547e0211fd9fe" Oct 11 01:01:11 crc kubenswrapper[4743]: I1011 01:01:11.000489 4743 scope.go:117] "RemoveContainer" containerID="f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736" Oct 11 01:01:11 crc kubenswrapper[4743]: I1011 01:01:11.015886 4743 scope.go:117] "RemoveContainer" containerID="4f3ab6c36ef4c34891a3cc561fdea33a7a62ede85f6d648d443467ab907ad8ae" Oct 11 01:01:11 crc kubenswrapper[4743]: E1011 01:01:11.020410 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f3ab6c36ef4c34891a3cc561fdea33a7a62ede85f6d648d443467ab907ad8ae\": container with ID starting with 4f3ab6c36ef4c34891a3cc561fdea33a7a62ede85f6d648d443467ab907ad8ae not found: ID does not exist" containerID="4f3ab6c36ef4c34891a3cc561fdea33a7a62ede85f6d648d443467ab907ad8ae" Oct 11 01:01:11 crc kubenswrapper[4743]: I1011 01:01:11.020462 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f3ab6c36ef4c34891a3cc561fdea33a7a62ede85f6d648d443467ab907ad8ae"} err="failed to get container status \"4f3ab6c36ef4c34891a3cc561fdea33a7a62ede85f6d648d443467ab907ad8ae\": rpc error: code = NotFound desc = could not find container \"4f3ab6c36ef4c34891a3cc561fdea33a7a62ede85f6d648d443467ab907ad8ae\": container with ID starting with 
4f3ab6c36ef4c34891a3cc561fdea33a7a62ede85f6d648d443467ab907ad8ae not found: ID does not exist" Oct 11 01:01:11 crc kubenswrapper[4743]: I1011 01:01:11.020490 4743 scope.go:117] "RemoveContainer" containerID="268a98616dcd60ad334eba93f4b3443073f8a901429f6b98e7f89c2740c81da6" Oct 11 01:01:11 crc kubenswrapper[4743]: E1011 01:01:11.024332 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"268a98616dcd60ad334eba93f4b3443073f8a901429f6b98e7f89c2740c81da6\": container with ID starting with 268a98616dcd60ad334eba93f4b3443073f8a901429f6b98e7f89c2740c81da6 not found: ID does not exist" containerID="268a98616dcd60ad334eba93f4b3443073f8a901429f6b98e7f89c2740c81da6" Oct 11 01:01:11 crc kubenswrapper[4743]: I1011 01:01:11.024372 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"268a98616dcd60ad334eba93f4b3443073f8a901429f6b98e7f89c2740c81da6"} err="failed to get container status \"268a98616dcd60ad334eba93f4b3443073f8a901429f6b98e7f89c2740c81da6\": rpc error: code = NotFound desc = could not find container \"268a98616dcd60ad334eba93f4b3443073f8a901429f6b98e7f89c2740c81da6\": container with ID starting with 268a98616dcd60ad334eba93f4b3443073f8a901429f6b98e7f89c2740c81da6 not found: ID does not exist" Oct 11 01:01:11 crc kubenswrapper[4743]: I1011 01:01:11.024401 4743 scope.go:117] "RemoveContainer" containerID="ecec5075382354c47a6ec8f53263fe6c244c7aadfeab1dda60dd2e88cc8724b0" Oct 11 01:01:11 crc kubenswrapper[4743]: E1011 01:01:11.024701 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ecec5075382354c47a6ec8f53263fe6c244c7aadfeab1dda60dd2e88cc8724b0\": container with ID starting with ecec5075382354c47a6ec8f53263fe6c244c7aadfeab1dda60dd2e88cc8724b0 not found: ID does not exist" containerID="ecec5075382354c47a6ec8f53263fe6c244c7aadfeab1dda60dd2e88cc8724b0" Oct 11 01:01:11 crc 
kubenswrapper[4743]: I1011 01:01:11.024738 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecec5075382354c47a6ec8f53263fe6c244c7aadfeab1dda60dd2e88cc8724b0"} err="failed to get container status \"ecec5075382354c47a6ec8f53263fe6c244c7aadfeab1dda60dd2e88cc8724b0\": rpc error: code = NotFound desc = could not find container \"ecec5075382354c47a6ec8f53263fe6c244c7aadfeab1dda60dd2e88cc8724b0\": container with ID starting with ecec5075382354c47a6ec8f53263fe6c244c7aadfeab1dda60dd2e88cc8724b0 not found: ID does not exist" Oct 11 01:01:11 crc kubenswrapper[4743]: I1011 01:01:11.024763 4743 scope.go:117] "RemoveContainer" containerID="487448bc7b1826df7dbc3aeac42ad10d6aed3390c27c95dfda1b81fe09e27f35" Oct 11 01:01:11 crc kubenswrapper[4743]: E1011 01:01:11.027191 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"487448bc7b1826df7dbc3aeac42ad10d6aed3390c27c95dfda1b81fe09e27f35\": container with ID starting with 487448bc7b1826df7dbc3aeac42ad10d6aed3390c27c95dfda1b81fe09e27f35 not found: ID does not exist" containerID="487448bc7b1826df7dbc3aeac42ad10d6aed3390c27c95dfda1b81fe09e27f35" Oct 11 01:01:11 crc kubenswrapper[4743]: I1011 01:01:11.027235 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"487448bc7b1826df7dbc3aeac42ad10d6aed3390c27c95dfda1b81fe09e27f35"} err="failed to get container status \"487448bc7b1826df7dbc3aeac42ad10d6aed3390c27c95dfda1b81fe09e27f35\": rpc error: code = NotFound desc = could not find container \"487448bc7b1826df7dbc3aeac42ad10d6aed3390c27c95dfda1b81fe09e27f35\": container with ID starting with 487448bc7b1826df7dbc3aeac42ad10d6aed3390c27c95dfda1b81fe09e27f35 not found: ID does not exist" Oct 11 01:01:11 crc kubenswrapper[4743]: I1011 01:01:11.027263 4743 scope.go:117] "RemoveContainer" containerID="acfe8e798b39769fe2643c6f761b388dc220ccfc707f0023dd680196514b8885" Oct 11 
01:01:11 crc kubenswrapper[4743]: E1011 01:01:11.027482 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"acfe8e798b39769fe2643c6f761b388dc220ccfc707f0023dd680196514b8885\": container with ID starting with acfe8e798b39769fe2643c6f761b388dc220ccfc707f0023dd680196514b8885 not found: ID does not exist" containerID="acfe8e798b39769fe2643c6f761b388dc220ccfc707f0023dd680196514b8885" Oct 11 01:01:11 crc kubenswrapper[4743]: I1011 01:01:11.027513 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acfe8e798b39769fe2643c6f761b388dc220ccfc707f0023dd680196514b8885"} err="failed to get container status \"acfe8e798b39769fe2643c6f761b388dc220ccfc707f0023dd680196514b8885\": rpc error: code = NotFound desc = could not find container \"acfe8e798b39769fe2643c6f761b388dc220ccfc707f0023dd680196514b8885\": container with ID starting with acfe8e798b39769fe2643c6f761b388dc220ccfc707f0023dd680196514b8885 not found: ID does not exist" Oct 11 01:01:11 crc kubenswrapper[4743]: I1011 01:01:11.027529 4743 scope.go:117] "RemoveContainer" containerID="4127195cac8b12296c2aaefc500beb5f746bee518f3885c094e7bca83ade5e6b" Oct 11 01:01:11 crc kubenswrapper[4743]: E1011 01:01:11.027721 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4127195cac8b12296c2aaefc500beb5f746bee518f3885c094e7bca83ade5e6b\": container with ID starting with 4127195cac8b12296c2aaefc500beb5f746bee518f3885c094e7bca83ade5e6b not found: ID does not exist" containerID="4127195cac8b12296c2aaefc500beb5f746bee518f3885c094e7bca83ade5e6b" Oct 11 01:01:11 crc kubenswrapper[4743]: I1011 01:01:11.027751 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4127195cac8b12296c2aaefc500beb5f746bee518f3885c094e7bca83ade5e6b"} err="failed to get container status 
\"4127195cac8b12296c2aaefc500beb5f746bee518f3885c094e7bca83ade5e6b\": rpc error: code = NotFound desc = could not find container \"4127195cac8b12296c2aaefc500beb5f746bee518f3885c094e7bca83ade5e6b\": container with ID starting with 4127195cac8b12296c2aaefc500beb5f746bee518f3885c094e7bca83ade5e6b not found: ID does not exist" Oct 11 01:01:11 crc kubenswrapper[4743]: I1011 01:01:11.027770 4743 scope.go:117] "RemoveContainer" containerID="d54b05ec17db8c8c6c3ff8e4d703de8f1f1f0d7e9b5b0133afdf30e3c2b8acee" Oct 11 01:01:11 crc kubenswrapper[4743]: E1011 01:01:11.028099 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d54b05ec17db8c8c6c3ff8e4d703de8f1f1f0d7e9b5b0133afdf30e3c2b8acee\": container with ID starting with d54b05ec17db8c8c6c3ff8e4d703de8f1f1f0d7e9b5b0133afdf30e3c2b8acee not found: ID does not exist" containerID="d54b05ec17db8c8c6c3ff8e4d703de8f1f1f0d7e9b5b0133afdf30e3c2b8acee" Oct 11 01:01:11 crc kubenswrapper[4743]: I1011 01:01:11.028128 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d54b05ec17db8c8c6c3ff8e4d703de8f1f1f0d7e9b5b0133afdf30e3c2b8acee"} err="failed to get container status \"d54b05ec17db8c8c6c3ff8e4d703de8f1f1f0d7e9b5b0133afdf30e3c2b8acee\": rpc error: code = NotFound desc = could not find container \"d54b05ec17db8c8c6c3ff8e4d703de8f1f1f0d7e9b5b0133afdf30e3c2b8acee\": container with ID starting with d54b05ec17db8c8c6c3ff8e4d703de8f1f1f0d7e9b5b0133afdf30e3c2b8acee not found: ID does not exist" Oct 11 01:01:11 crc kubenswrapper[4743]: I1011 01:01:11.028146 4743 scope.go:117] "RemoveContainer" containerID="17db7cfee85ea1051eba348aa0ec7bc8e1354c49e3e4740745a51d3db0ffcf63" Oct 11 01:01:11 crc kubenswrapper[4743]: E1011 01:01:11.031116 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"17db7cfee85ea1051eba348aa0ec7bc8e1354c49e3e4740745a51d3db0ffcf63\": container with ID starting with 17db7cfee85ea1051eba348aa0ec7bc8e1354c49e3e4740745a51d3db0ffcf63 not found: ID does not exist" containerID="17db7cfee85ea1051eba348aa0ec7bc8e1354c49e3e4740745a51d3db0ffcf63" Oct 11 01:01:11 crc kubenswrapper[4743]: I1011 01:01:11.031147 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17db7cfee85ea1051eba348aa0ec7bc8e1354c49e3e4740745a51d3db0ffcf63"} err="failed to get container status \"17db7cfee85ea1051eba348aa0ec7bc8e1354c49e3e4740745a51d3db0ffcf63\": rpc error: code = NotFound desc = could not find container \"17db7cfee85ea1051eba348aa0ec7bc8e1354c49e3e4740745a51d3db0ffcf63\": container with ID starting with 17db7cfee85ea1051eba348aa0ec7bc8e1354c49e3e4740745a51d3db0ffcf63 not found: ID does not exist" Oct 11 01:01:11 crc kubenswrapper[4743]: I1011 01:01:11.031168 4743 scope.go:117] "RemoveContainer" containerID="d385e354cb831573c89886ade9127287f0974b7a7c244ead9e5547e0211fd9fe" Oct 11 01:01:11 crc kubenswrapper[4743]: E1011 01:01:11.031471 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d385e354cb831573c89886ade9127287f0974b7a7c244ead9e5547e0211fd9fe\": container with ID starting with d385e354cb831573c89886ade9127287f0974b7a7c244ead9e5547e0211fd9fe not found: ID does not exist" containerID="d385e354cb831573c89886ade9127287f0974b7a7c244ead9e5547e0211fd9fe" Oct 11 01:01:11 crc kubenswrapper[4743]: I1011 01:01:11.031521 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d385e354cb831573c89886ade9127287f0974b7a7c244ead9e5547e0211fd9fe"} err="failed to get container status \"d385e354cb831573c89886ade9127287f0974b7a7c244ead9e5547e0211fd9fe\": rpc error: code = NotFound desc = could not find container \"d385e354cb831573c89886ade9127287f0974b7a7c244ead9e5547e0211fd9fe\": container with ID 
starting with d385e354cb831573c89886ade9127287f0974b7a7c244ead9e5547e0211fd9fe not found: ID does not exist" Oct 11 01:01:11 crc kubenswrapper[4743]: I1011 01:01:11.031555 4743 scope.go:117] "RemoveContainer" containerID="f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736" Oct 11 01:01:11 crc kubenswrapper[4743]: E1011 01:01:11.031834 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736\": container with ID starting with f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736 not found: ID does not exist" containerID="f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736" Oct 11 01:01:11 crc kubenswrapper[4743]: I1011 01:01:11.031869 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736"} err="failed to get container status \"f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736\": rpc error: code = NotFound desc = could not find container \"f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736\": container with ID starting with f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736 not found: ID does not exist" Oct 11 01:01:11 crc kubenswrapper[4743]: I1011 01:01:11.031885 4743 scope.go:117] "RemoveContainer" containerID="4f3ab6c36ef4c34891a3cc561fdea33a7a62ede85f6d648d443467ab907ad8ae" Oct 11 01:01:11 crc kubenswrapper[4743]: I1011 01:01:11.032116 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f3ab6c36ef4c34891a3cc561fdea33a7a62ede85f6d648d443467ab907ad8ae"} err="failed to get container status \"4f3ab6c36ef4c34891a3cc561fdea33a7a62ede85f6d648d443467ab907ad8ae\": rpc error: code = NotFound desc = could not find container \"4f3ab6c36ef4c34891a3cc561fdea33a7a62ede85f6d648d443467ab907ad8ae\": 
container with ID starting with 4f3ab6c36ef4c34891a3cc561fdea33a7a62ede85f6d648d443467ab907ad8ae not found: ID does not exist" Oct 11 01:01:11 crc kubenswrapper[4743]: I1011 01:01:11.032140 4743 scope.go:117] "RemoveContainer" containerID="268a98616dcd60ad334eba93f4b3443073f8a901429f6b98e7f89c2740c81da6" Oct 11 01:01:11 crc kubenswrapper[4743]: I1011 01:01:11.032452 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"268a98616dcd60ad334eba93f4b3443073f8a901429f6b98e7f89c2740c81da6"} err="failed to get container status \"268a98616dcd60ad334eba93f4b3443073f8a901429f6b98e7f89c2740c81da6\": rpc error: code = NotFound desc = could not find container \"268a98616dcd60ad334eba93f4b3443073f8a901429f6b98e7f89c2740c81da6\": container with ID starting with 268a98616dcd60ad334eba93f4b3443073f8a901429f6b98e7f89c2740c81da6 not found: ID does not exist" Oct 11 01:01:11 crc kubenswrapper[4743]: I1011 01:01:11.032477 4743 scope.go:117] "RemoveContainer" containerID="ecec5075382354c47a6ec8f53263fe6c244c7aadfeab1dda60dd2e88cc8724b0" Oct 11 01:01:11 crc kubenswrapper[4743]: I1011 01:01:11.032678 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecec5075382354c47a6ec8f53263fe6c244c7aadfeab1dda60dd2e88cc8724b0"} err="failed to get container status \"ecec5075382354c47a6ec8f53263fe6c244c7aadfeab1dda60dd2e88cc8724b0\": rpc error: code = NotFound desc = could not find container \"ecec5075382354c47a6ec8f53263fe6c244c7aadfeab1dda60dd2e88cc8724b0\": container with ID starting with ecec5075382354c47a6ec8f53263fe6c244c7aadfeab1dda60dd2e88cc8724b0 not found: ID does not exist" Oct 11 01:01:11 crc kubenswrapper[4743]: I1011 01:01:11.032697 4743 scope.go:117] "RemoveContainer" containerID="487448bc7b1826df7dbc3aeac42ad10d6aed3390c27c95dfda1b81fe09e27f35" Oct 11 01:01:11 crc kubenswrapper[4743]: I1011 01:01:11.032920 4743 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"487448bc7b1826df7dbc3aeac42ad10d6aed3390c27c95dfda1b81fe09e27f35"} err="failed to get container status \"487448bc7b1826df7dbc3aeac42ad10d6aed3390c27c95dfda1b81fe09e27f35\": rpc error: code = NotFound desc = could not find container \"487448bc7b1826df7dbc3aeac42ad10d6aed3390c27c95dfda1b81fe09e27f35\": container with ID starting with 487448bc7b1826df7dbc3aeac42ad10d6aed3390c27c95dfda1b81fe09e27f35 not found: ID does not exist"
Oct 11 01:01:11 crc kubenswrapper[4743]: I1011 01:01:11.032948 4743 scope.go:117] "RemoveContainer" containerID="acfe8e798b39769fe2643c6f761b388dc220ccfc707f0023dd680196514b8885"
Oct 11 01:01:11 crc kubenswrapper[4743]: I1011 01:01:11.033149 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acfe8e798b39769fe2643c6f761b388dc220ccfc707f0023dd680196514b8885"} err="failed to get container status \"acfe8e798b39769fe2643c6f761b388dc220ccfc707f0023dd680196514b8885\": rpc error: code = NotFound desc = could not find container \"acfe8e798b39769fe2643c6f761b388dc220ccfc707f0023dd680196514b8885\": container with ID starting with acfe8e798b39769fe2643c6f761b388dc220ccfc707f0023dd680196514b8885 not found: ID does not exist"
Oct 11 01:01:11 crc kubenswrapper[4743]: I1011 01:01:11.033177 4743 scope.go:117] "RemoveContainer" containerID="4127195cac8b12296c2aaefc500beb5f746bee518f3885c094e7bca83ade5e6b"
Oct 11 01:01:11 crc kubenswrapper[4743]: I1011 01:01:11.033457 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4127195cac8b12296c2aaefc500beb5f746bee518f3885c094e7bca83ade5e6b"} err="failed to get container status \"4127195cac8b12296c2aaefc500beb5f746bee518f3885c094e7bca83ade5e6b\": rpc error: code = NotFound desc = could not find container \"4127195cac8b12296c2aaefc500beb5f746bee518f3885c094e7bca83ade5e6b\": container with ID starting with 4127195cac8b12296c2aaefc500beb5f746bee518f3885c094e7bca83ade5e6b not found: ID does not exist"
Oct 11 01:01:11 crc kubenswrapper[4743]: I1011 01:01:11.033502 4743 scope.go:117] "RemoveContainer" containerID="d54b05ec17db8c8c6c3ff8e4d703de8f1f1f0d7e9b5b0133afdf30e3c2b8acee"
Oct 11 01:01:11 crc kubenswrapper[4743]: I1011 01:01:11.033759 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d54b05ec17db8c8c6c3ff8e4d703de8f1f1f0d7e9b5b0133afdf30e3c2b8acee"} err="failed to get container status \"d54b05ec17db8c8c6c3ff8e4d703de8f1f1f0d7e9b5b0133afdf30e3c2b8acee\": rpc error: code = NotFound desc = could not find container \"d54b05ec17db8c8c6c3ff8e4d703de8f1f1f0d7e9b5b0133afdf30e3c2b8acee\": container with ID starting with d54b05ec17db8c8c6c3ff8e4d703de8f1f1f0d7e9b5b0133afdf30e3c2b8acee not found: ID does not exist"
Oct 11 01:01:11 crc kubenswrapper[4743]: I1011 01:01:11.033781 4743 scope.go:117] "RemoveContainer" containerID="17db7cfee85ea1051eba348aa0ec7bc8e1354c49e3e4740745a51d3db0ffcf63"
Oct 11 01:01:11 crc kubenswrapper[4743]: I1011 01:01:11.033995 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17db7cfee85ea1051eba348aa0ec7bc8e1354c49e3e4740745a51d3db0ffcf63"} err="failed to get container status \"17db7cfee85ea1051eba348aa0ec7bc8e1354c49e3e4740745a51d3db0ffcf63\": rpc error: code = NotFound desc = could not find container \"17db7cfee85ea1051eba348aa0ec7bc8e1354c49e3e4740745a51d3db0ffcf63\": container with ID starting with 17db7cfee85ea1051eba348aa0ec7bc8e1354c49e3e4740745a51d3db0ffcf63 not found: ID does not exist"
Oct 11 01:01:11 crc kubenswrapper[4743]: I1011 01:01:11.034024 4743 scope.go:117] "RemoveContainer" containerID="d385e354cb831573c89886ade9127287f0974b7a7c244ead9e5547e0211fd9fe"
Oct 11 01:01:11 crc kubenswrapper[4743]: I1011 01:01:11.034234 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d385e354cb831573c89886ade9127287f0974b7a7c244ead9e5547e0211fd9fe"} err="failed to get container status \"d385e354cb831573c89886ade9127287f0974b7a7c244ead9e5547e0211fd9fe\": rpc error: code = NotFound desc = could not find container \"d385e354cb831573c89886ade9127287f0974b7a7c244ead9e5547e0211fd9fe\": container with ID starting with d385e354cb831573c89886ade9127287f0974b7a7c244ead9e5547e0211fd9fe not found: ID does not exist"
Oct 11 01:01:11 crc kubenswrapper[4743]: I1011 01:01:11.034255 4743 scope.go:117] "RemoveContainer" containerID="f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736"
Oct 11 01:01:11 crc kubenswrapper[4743]: I1011 01:01:11.034458 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736"} err="failed to get container status \"f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736\": rpc error: code = NotFound desc = could not find container \"f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736\": container with ID starting with f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736 not found: ID does not exist"
Oct 11 01:01:11 crc kubenswrapper[4743]: I1011 01:01:11.034484 4743 scope.go:117] "RemoveContainer" containerID="4f3ab6c36ef4c34891a3cc561fdea33a7a62ede85f6d648d443467ab907ad8ae"
Oct 11 01:01:11 crc kubenswrapper[4743]: I1011 01:01:11.034707 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f3ab6c36ef4c34891a3cc561fdea33a7a62ede85f6d648d443467ab907ad8ae"} err="failed to get container status \"4f3ab6c36ef4c34891a3cc561fdea33a7a62ede85f6d648d443467ab907ad8ae\": rpc error: code = NotFound desc = could not find container \"4f3ab6c36ef4c34891a3cc561fdea33a7a62ede85f6d648d443467ab907ad8ae\": container with ID starting with 4f3ab6c36ef4c34891a3cc561fdea33a7a62ede85f6d648d443467ab907ad8ae not found: ID does not exist"
Oct 11 01:01:11 crc kubenswrapper[4743]: I1011 01:01:11.034735 4743 scope.go:117] "RemoveContainer" containerID="268a98616dcd60ad334eba93f4b3443073f8a901429f6b98e7f89c2740c81da6"
Oct 11 01:01:11 crc kubenswrapper[4743]: I1011 01:01:11.034989 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"268a98616dcd60ad334eba93f4b3443073f8a901429f6b98e7f89c2740c81da6"} err="failed to get container status \"268a98616dcd60ad334eba93f4b3443073f8a901429f6b98e7f89c2740c81da6\": rpc error: code = NotFound desc = could not find container \"268a98616dcd60ad334eba93f4b3443073f8a901429f6b98e7f89c2740c81da6\": container with ID starting with 268a98616dcd60ad334eba93f4b3443073f8a901429f6b98e7f89c2740c81da6 not found: ID does not exist"
Oct 11 01:01:11 crc kubenswrapper[4743]: I1011 01:01:11.035018 4743 scope.go:117] "RemoveContainer" containerID="ecec5075382354c47a6ec8f53263fe6c244c7aadfeab1dda60dd2e88cc8724b0"
Oct 11 01:01:11 crc kubenswrapper[4743]: I1011 01:01:11.035237 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecec5075382354c47a6ec8f53263fe6c244c7aadfeab1dda60dd2e88cc8724b0"} err="failed to get container status \"ecec5075382354c47a6ec8f53263fe6c244c7aadfeab1dda60dd2e88cc8724b0\": rpc error: code = NotFound desc = could not find container \"ecec5075382354c47a6ec8f53263fe6c244c7aadfeab1dda60dd2e88cc8724b0\": container with ID starting with ecec5075382354c47a6ec8f53263fe6c244c7aadfeab1dda60dd2e88cc8724b0 not found: ID does not exist"
Oct 11 01:01:11 crc kubenswrapper[4743]: I1011 01:01:11.035275 4743 scope.go:117] "RemoveContainer" containerID="487448bc7b1826df7dbc3aeac42ad10d6aed3390c27c95dfda1b81fe09e27f35"
Oct 11 01:01:11 crc kubenswrapper[4743]: I1011 01:01:11.035459 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"487448bc7b1826df7dbc3aeac42ad10d6aed3390c27c95dfda1b81fe09e27f35"} err="failed to get container status \"487448bc7b1826df7dbc3aeac42ad10d6aed3390c27c95dfda1b81fe09e27f35\": rpc error: code = NotFound desc = could not find container \"487448bc7b1826df7dbc3aeac42ad10d6aed3390c27c95dfda1b81fe09e27f35\": container with ID starting with 487448bc7b1826df7dbc3aeac42ad10d6aed3390c27c95dfda1b81fe09e27f35 not found: ID does not exist"
Oct 11 01:01:11 crc kubenswrapper[4743]: I1011 01:01:11.035480 4743 scope.go:117] "RemoveContainer" containerID="acfe8e798b39769fe2643c6f761b388dc220ccfc707f0023dd680196514b8885"
Oct 11 01:01:11 crc kubenswrapper[4743]: I1011 01:01:11.035683 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acfe8e798b39769fe2643c6f761b388dc220ccfc707f0023dd680196514b8885"} err="failed to get container status \"acfe8e798b39769fe2643c6f761b388dc220ccfc707f0023dd680196514b8885\": rpc error: code = NotFound desc = could not find container \"acfe8e798b39769fe2643c6f761b388dc220ccfc707f0023dd680196514b8885\": container with ID starting with acfe8e798b39769fe2643c6f761b388dc220ccfc707f0023dd680196514b8885 not found: ID does not exist"
Oct 11 01:01:11 crc kubenswrapper[4743]: I1011 01:01:11.035709 4743 scope.go:117] "RemoveContainer" containerID="4127195cac8b12296c2aaefc500beb5f746bee518f3885c094e7bca83ade5e6b"
Oct 11 01:01:11 crc kubenswrapper[4743]: I1011 01:01:11.035888 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4127195cac8b12296c2aaefc500beb5f746bee518f3885c094e7bca83ade5e6b"} err="failed to get container status \"4127195cac8b12296c2aaefc500beb5f746bee518f3885c094e7bca83ade5e6b\": rpc error: code = NotFound desc = could not find container \"4127195cac8b12296c2aaefc500beb5f746bee518f3885c094e7bca83ade5e6b\": container with ID starting with 4127195cac8b12296c2aaefc500beb5f746bee518f3885c094e7bca83ade5e6b not found: ID does not exist"
Oct 11 01:01:11 crc kubenswrapper[4743]: I1011 01:01:11.035906 4743 scope.go:117] "RemoveContainer" containerID="d54b05ec17db8c8c6c3ff8e4d703de8f1f1f0d7e9b5b0133afdf30e3c2b8acee"
Oct 11 01:01:11 crc kubenswrapper[4743]: I1011 01:01:11.036993 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d54b05ec17db8c8c6c3ff8e4d703de8f1f1f0d7e9b5b0133afdf30e3c2b8acee"} err="failed to get container status \"d54b05ec17db8c8c6c3ff8e4d703de8f1f1f0d7e9b5b0133afdf30e3c2b8acee\": rpc error: code = NotFound desc = could not find container \"d54b05ec17db8c8c6c3ff8e4d703de8f1f1f0d7e9b5b0133afdf30e3c2b8acee\": container with ID starting with d54b05ec17db8c8c6c3ff8e4d703de8f1f1f0d7e9b5b0133afdf30e3c2b8acee not found: ID does not exist"
Oct 11 01:01:11 crc kubenswrapper[4743]: I1011 01:01:11.037022 4743 scope.go:117] "RemoveContainer" containerID="17db7cfee85ea1051eba348aa0ec7bc8e1354c49e3e4740745a51d3db0ffcf63"
Oct 11 01:01:11 crc kubenswrapper[4743]: I1011 01:01:11.037226 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17db7cfee85ea1051eba348aa0ec7bc8e1354c49e3e4740745a51d3db0ffcf63"} err="failed to get container status \"17db7cfee85ea1051eba348aa0ec7bc8e1354c49e3e4740745a51d3db0ffcf63\": rpc error: code = NotFound desc = could not find container \"17db7cfee85ea1051eba348aa0ec7bc8e1354c49e3e4740745a51d3db0ffcf63\": container with ID starting with 17db7cfee85ea1051eba348aa0ec7bc8e1354c49e3e4740745a51d3db0ffcf63 not found: ID does not exist"
Oct 11 01:01:11 crc kubenswrapper[4743]: I1011 01:01:11.037245 4743 scope.go:117] "RemoveContainer" containerID="d385e354cb831573c89886ade9127287f0974b7a7c244ead9e5547e0211fd9fe"
Oct 11 01:01:11 crc kubenswrapper[4743]: I1011 01:01:11.037432 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d385e354cb831573c89886ade9127287f0974b7a7c244ead9e5547e0211fd9fe"} err="failed to get container status \"d385e354cb831573c89886ade9127287f0974b7a7c244ead9e5547e0211fd9fe\": rpc error: code = NotFound desc = could not find container \"d385e354cb831573c89886ade9127287f0974b7a7c244ead9e5547e0211fd9fe\": container with ID starting with d385e354cb831573c89886ade9127287f0974b7a7c244ead9e5547e0211fd9fe not found: ID does not exist"
Oct 11 01:01:11 crc kubenswrapper[4743]: I1011 01:01:11.037459 4743 scope.go:117] "RemoveContainer" containerID="f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736"
Oct 11 01:01:11 crc kubenswrapper[4743]: I1011 01:01:11.037623 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736"} err="failed to get container status \"f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736\": rpc error: code = NotFound desc = could not find container \"f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736\": container with ID starting with f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736 not found: ID does not exist"
Oct 11 01:01:11 crc kubenswrapper[4743]: I1011 01:01:11.037642 4743 scope.go:117] "RemoveContainer" containerID="4f3ab6c36ef4c34891a3cc561fdea33a7a62ede85f6d648d443467ab907ad8ae"
Oct 11 01:01:11 crc kubenswrapper[4743]: I1011 01:01:11.037820 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f3ab6c36ef4c34891a3cc561fdea33a7a62ede85f6d648d443467ab907ad8ae"} err="failed to get container status \"4f3ab6c36ef4c34891a3cc561fdea33a7a62ede85f6d648d443467ab907ad8ae\": rpc error: code = NotFound desc = could not find container \"4f3ab6c36ef4c34891a3cc561fdea33a7a62ede85f6d648d443467ab907ad8ae\": container with ID starting with 4f3ab6c36ef4c34891a3cc561fdea33a7a62ede85f6d648d443467ab907ad8ae not found: ID does not exist"
Oct 11 01:01:11 crc kubenswrapper[4743]: I1011 01:01:11.037838 4743 scope.go:117] "RemoveContainer" containerID="268a98616dcd60ad334eba93f4b3443073f8a901429f6b98e7f89c2740c81da6"
Oct 11 01:01:11 crc kubenswrapper[4743]: I1011 01:01:11.038056 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"268a98616dcd60ad334eba93f4b3443073f8a901429f6b98e7f89c2740c81da6"} err="failed to get container status \"268a98616dcd60ad334eba93f4b3443073f8a901429f6b98e7f89c2740c81da6\": rpc error: code = NotFound desc = could not find container \"268a98616dcd60ad334eba93f4b3443073f8a901429f6b98e7f89c2740c81da6\": container with ID starting with 268a98616dcd60ad334eba93f4b3443073f8a901429f6b98e7f89c2740c81da6 not found: ID does not exist"
Oct 11 01:01:11 crc kubenswrapper[4743]: I1011 01:01:11.038076 4743 scope.go:117] "RemoveContainer" containerID="ecec5075382354c47a6ec8f53263fe6c244c7aadfeab1dda60dd2e88cc8724b0"
Oct 11 01:01:11 crc kubenswrapper[4743]: I1011 01:01:11.038277 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecec5075382354c47a6ec8f53263fe6c244c7aadfeab1dda60dd2e88cc8724b0"} err="failed to get container status \"ecec5075382354c47a6ec8f53263fe6c244c7aadfeab1dda60dd2e88cc8724b0\": rpc error: code = NotFound desc = could not find container \"ecec5075382354c47a6ec8f53263fe6c244c7aadfeab1dda60dd2e88cc8724b0\": container with ID starting with ecec5075382354c47a6ec8f53263fe6c244c7aadfeab1dda60dd2e88cc8724b0 not found: ID does not exist"
Oct 11 01:01:11 crc kubenswrapper[4743]: I1011 01:01:11.038304 4743 scope.go:117] "RemoveContainer" containerID="487448bc7b1826df7dbc3aeac42ad10d6aed3390c27c95dfda1b81fe09e27f35"
Oct 11 01:01:11 crc kubenswrapper[4743]: I1011 01:01:11.038496 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"487448bc7b1826df7dbc3aeac42ad10d6aed3390c27c95dfda1b81fe09e27f35"} err="failed to get container status \"487448bc7b1826df7dbc3aeac42ad10d6aed3390c27c95dfda1b81fe09e27f35\": rpc error: code = NotFound desc = could not find container \"487448bc7b1826df7dbc3aeac42ad10d6aed3390c27c95dfda1b81fe09e27f35\": container with ID starting with 487448bc7b1826df7dbc3aeac42ad10d6aed3390c27c95dfda1b81fe09e27f35 not found: ID does not exist"
Oct 11 01:01:11 crc kubenswrapper[4743]: I1011 01:01:11.038522 4743 scope.go:117] "RemoveContainer" containerID="acfe8e798b39769fe2643c6f761b388dc220ccfc707f0023dd680196514b8885"
Oct 11 01:01:11 crc kubenswrapper[4743]: I1011 01:01:11.039046 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acfe8e798b39769fe2643c6f761b388dc220ccfc707f0023dd680196514b8885"} err="failed to get container status \"acfe8e798b39769fe2643c6f761b388dc220ccfc707f0023dd680196514b8885\": rpc error: code = NotFound desc = could not find container \"acfe8e798b39769fe2643c6f761b388dc220ccfc707f0023dd680196514b8885\": container with ID starting with acfe8e798b39769fe2643c6f761b388dc220ccfc707f0023dd680196514b8885 not found: ID does not exist"
Oct 11 01:01:11 crc kubenswrapper[4743]: I1011 01:01:11.039070 4743 scope.go:117] "RemoveContainer" containerID="4127195cac8b12296c2aaefc500beb5f746bee518f3885c094e7bca83ade5e6b"
Oct 11 01:01:11 crc kubenswrapper[4743]: I1011 01:01:11.039362 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4127195cac8b12296c2aaefc500beb5f746bee518f3885c094e7bca83ade5e6b"} err="failed to get container status \"4127195cac8b12296c2aaefc500beb5f746bee518f3885c094e7bca83ade5e6b\": rpc error: code = NotFound desc = could not find container \"4127195cac8b12296c2aaefc500beb5f746bee518f3885c094e7bca83ade5e6b\": container with ID starting with 4127195cac8b12296c2aaefc500beb5f746bee518f3885c094e7bca83ade5e6b not found: ID does not exist"
Oct 11 01:01:11 crc kubenswrapper[4743]: I1011 01:01:11.039389 4743 scope.go:117] "RemoveContainer" containerID="d54b05ec17db8c8c6c3ff8e4d703de8f1f1f0d7e9b5b0133afdf30e3c2b8acee"
Oct 11 01:01:11 crc kubenswrapper[4743]: I1011 01:01:11.039578 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d54b05ec17db8c8c6c3ff8e4d703de8f1f1f0d7e9b5b0133afdf30e3c2b8acee"} err="failed to get container status \"d54b05ec17db8c8c6c3ff8e4d703de8f1f1f0d7e9b5b0133afdf30e3c2b8acee\": rpc error: code = NotFound desc = could not find container \"d54b05ec17db8c8c6c3ff8e4d703de8f1f1f0d7e9b5b0133afdf30e3c2b8acee\": container with ID starting with d54b05ec17db8c8c6c3ff8e4d703de8f1f1f0d7e9b5b0133afdf30e3c2b8acee not found: ID does not exist"
Oct 11 01:01:11 crc kubenswrapper[4743]: I1011 01:01:11.039595 4743 scope.go:117] "RemoveContainer" containerID="17db7cfee85ea1051eba348aa0ec7bc8e1354c49e3e4740745a51d3db0ffcf63"
Oct 11 01:01:11 crc kubenswrapper[4743]: I1011 01:01:11.039773 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17db7cfee85ea1051eba348aa0ec7bc8e1354c49e3e4740745a51d3db0ffcf63"} err="failed to get container status \"17db7cfee85ea1051eba348aa0ec7bc8e1354c49e3e4740745a51d3db0ffcf63\": rpc error: code = NotFound desc = could not find container \"17db7cfee85ea1051eba348aa0ec7bc8e1354c49e3e4740745a51d3db0ffcf63\": container with ID starting with 17db7cfee85ea1051eba348aa0ec7bc8e1354c49e3e4740745a51d3db0ffcf63 not found: ID does not exist"
Oct 11 01:01:11 crc kubenswrapper[4743]: I1011 01:01:11.039791 4743 scope.go:117] "RemoveContainer" containerID="d385e354cb831573c89886ade9127287f0974b7a7c244ead9e5547e0211fd9fe"
Oct 11 01:01:11 crc kubenswrapper[4743]: I1011 01:01:11.040000 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d385e354cb831573c89886ade9127287f0974b7a7c244ead9e5547e0211fd9fe"} err="failed to get container status \"d385e354cb831573c89886ade9127287f0974b7a7c244ead9e5547e0211fd9fe\": rpc error: code = NotFound desc = could not find container \"d385e354cb831573c89886ade9127287f0974b7a7c244ead9e5547e0211fd9fe\": container with ID starting with d385e354cb831573c89886ade9127287f0974b7a7c244ead9e5547e0211fd9fe not found: ID does not exist"
Oct 11 01:01:11 crc kubenswrapper[4743]: I1011 01:01:11.040019 4743 scope.go:117] "RemoveContainer" containerID="f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736"
Oct 11 01:01:11 crc kubenswrapper[4743]: I1011 01:01:11.040196 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736"} err="failed to get container status \"f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736\": rpc error: code = NotFound desc = could not find container \"f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736\": container with ID starting with f300eaf5d1c09f3637f07c6d71c96b49e955d8319841834440872f13c061e736 not found: ID does not exist"
Oct 11 01:01:11 crc kubenswrapper[4743]: I1011 01:01:11.777204 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9jfxn_e8c603f4-717c-4554-992a-8338b3bef24d/kube-multus/2.log"
Oct 11 01:01:11 crc kubenswrapper[4743]: I1011 01:01:11.778396 4743 generic.go:334] "Generic (PLEG): container finished" podID="76209f67-b445-40c1-bdb9-7b82c1b18860" containerID="9b291e14fb6cf77cdd59f0154c79d02b5b2e7936a882cf0cd00e32c0ce6b5d07" exitCode=0
Oct 11 01:01:11 crc kubenswrapper[4743]: I1011 01:01:11.778431 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9qrj4" event={"ID":"76209f67-b445-40c1-bdb9-7b82c1b18860","Type":"ContainerDied","Data":"9b291e14fb6cf77cdd59f0154c79d02b5b2e7936a882cf0cd00e32c0ce6b5d07"}
Oct 11 01:01:11 crc kubenswrapper[4743]: I1011 01:01:11.778453 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9qrj4" event={"ID":"76209f67-b445-40c1-bdb9-7b82c1b18860","Type":"ContainerStarted","Data":"e12d4b1d3c5caa448c0002ead255c6edc24142fccc0a9e282fcbfdccdda7dec8"}
Oct 11 01:01:12 crc kubenswrapper[4743]: I1011 01:01:12.097525 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ed16b35-862f-47f2-9e32-63c98f868fb8" path="/var/lib/kubelet/pods/9ed16b35-862f-47f2-9e32-63c98f868fb8/volumes"
Oct 11 01:01:12 crc kubenswrapper[4743]: I1011 01:01:12.155028 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-4ftxm"]
Oct 11 01:01:12 crc kubenswrapper[4743]: I1011 01:01:12.155629 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-4ftxm"
Oct 11 01:01:12 crc kubenswrapper[4743]: I1011 01:01:12.159871 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt"
Oct 11 01:01:12 crc kubenswrapper[4743]: I1011 01:01:12.159923 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-vnp4w"
Oct 11 01:01:12 crc kubenswrapper[4743]: I1011 01:01:12.160657 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt"
Oct 11 01:01:12 crc kubenswrapper[4743]: I1011 01:01:12.242416 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7dfx\" (UniqueName: \"kubernetes.io/projected/9918e6d2-f9d4-4c2f-93ef-cc952577182b-kube-api-access-q7dfx\") pod \"obo-prometheus-operator-7c8cf85677-4ftxm\" (UID: \"9918e6d2-f9d4-4c2f-93ef-cc952577182b\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-4ftxm"
Oct 11 01:01:12 crc kubenswrapper[4743]: I1011 01:01:12.291177 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-55f6745849-chpjb"]
Oct 11 01:01:12 crc kubenswrapper[4743]: I1011 01:01:12.291849 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-55f6745849-chpjb"
Oct 11 01:01:12 crc kubenswrapper[4743]: I1011 01:01:12.293160 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-bl4mz"
Oct 11 01:01:12 crc kubenswrapper[4743]: I1011 01:01:12.293421 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert"
Oct 11 01:01:12 crc kubenswrapper[4743]: I1011 01:01:12.303899 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-55f6745849-6fxfg"]
Oct 11 01:01:12 crc kubenswrapper[4743]: I1011 01:01:12.304541 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-55f6745849-6fxfg"
Oct 11 01:01:12 crc kubenswrapper[4743]: I1011 01:01:12.343334 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/97d6cff1-f86b-4110-9c64-907a97ea4ceb-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-55f6745849-chpjb\" (UID: \"97d6cff1-f86b-4110-9c64-907a97ea4ceb\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-55f6745849-chpjb"
Oct 11 01:01:12 crc kubenswrapper[4743]: I1011 01:01:12.343376 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/97d6cff1-f86b-4110-9c64-907a97ea4ceb-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-55f6745849-chpjb\" (UID: \"97d6cff1-f86b-4110-9c64-907a97ea4ceb\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-55f6745849-chpjb"
Oct 11 01:01:12 crc kubenswrapper[4743]: I1011 01:01:12.343779 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7dfx\" (UniqueName: \"kubernetes.io/projected/9918e6d2-f9d4-4c2f-93ef-cc952577182b-kube-api-access-q7dfx\") pod \"obo-prometheus-operator-7c8cf85677-4ftxm\" (UID: \"9918e6d2-f9d4-4c2f-93ef-cc952577182b\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-4ftxm"
Oct 11 01:01:12 crc kubenswrapper[4743]: I1011 01:01:12.361577 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7dfx\" (UniqueName: \"kubernetes.io/projected/9918e6d2-f9d4-4c2f-93ef-cc952577182b-kube-api-access-q7dfx\") pod \"obo-prometheus-operator-7c8cf85677-4ftxm\" (UID: \"9918e6d2-f9d4-4c2f-93ef-cc952577182b\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-4ftxm"
Oct 11 01:01:12 crc kubenswrapper[4743]: I1011 01:01:12.444828 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/97d6cff1-f86b-4110-9c64-907a97ea4ceb-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-55f6745849-chpjb\" (UID: \"97d6cff1-f86b-4110-9c64-907a97ea4ceb\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-55f6745849-chpjb"
Oct 11 01:01:12 crc kubenswrapper[4743]: I1011 01:01:12.444883 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/97d6cff1-f86b-4110-9c64-907a97ea4ceb-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-55f6745849-chpjb\" (UID: \"97d6cff1-f86b-4110-9c64-907a97ea4ceb\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-55f6745849-chpjb"
Oct 11 01:01:12 crc kubenswrapper[4743]: I1011 01:01:12.444917 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e62c0910-f3a4-4c85-9ad5-88f6fa5262df-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-55f6745849-6fxfg\" (UID: \"e62c0910-f3a4-4c85-9ad5-88f6fa5262df\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-55f6745849-6fxfg"
Oct 11 01:01:12 crc kubenswrapper[4743]: I1011 01:01:12.444976 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e62c0910-f3a4-4c85-9ad5-88f6fa5262df-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-55f6745849-6fxfg\" (UID: \"e62c0910-f3a4-4c85-9ad5-88f6fa5262df\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-55f6745849-6fxfg"
Oct 11 01:01:12 crc kubenswrapper[4743]: I1011 01:01:12.447836 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/97d6cff1-f86b-4110-9c64-907a97ea4ceb-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-55f6745849-chpjb\" (UID: \"97d6cff1-f86b-4110-9c64-907a97ea4ceb\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-55f6745849-chpjb"
Oct 11 01:01:12 crc kubenswrapper[4743]: I1011 01:01:12.450256 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/97d6cff1-f86b-4110-9c64-907a97ea4ceb-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-55f6745849-chpjb\" (UID: \"97d6cff1-f86b-4110-9c64-907a97ea4ceb\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-55f6745849-chpjb"
Oct 11 01:01:12 crc kubenswrapper[4743]: I1011 01:01:12.474669 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-sgfrq"]
Oct 11 01:01:12 crc kubenswrapper[4743]: I1011 01:01:12.475320 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-sgfrq"
Oct 11 01:01:12 crc kubenswrapper[4743]: I1011 01:01:12.476799 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-nfcxh"
Oct 11 01:01:12 crc kubenswrapper[4743]: I1011 01:01:12.476921 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls"
Oct 11 01:01:12 crc kubenswrapper[4743]: I1011 01:01:12.493268 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-4ftxm"
Oct 11 01:01:12 crc kubenswrapper[4743]: E1011 01:01:12.514952 4743 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-7c8cf85677-4ftxm_openshift-operators_9918e6d2-f9d4-4c2f-93ef-cc952577182b_0(bf6c31dbd791890217c66d16f88c619d162fc86d5769b0e050f04b5bb8732743): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Oct 11 01:01:12 crc kubenswrapper[4743]: E1011 01:01:12.515024 4743 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-7c8cf85677-4ftxm_openshift-operators_9918e6d2-f9d4-4c2f-93ef-cc952577182b_0(bf6c31dbd791890217c66d16f88c619d162fc86d5769b0e050f04b5bb8732743): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-4ftxm"
Oct 11 01:01:12 crc kubenswrapper[4743]: E1011 01:01:12.515047 4743 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-7c8cf85677-4ftxm_openshift-operators_9918e6d2-f9d4-4c2f-93ef-cc952577182b_0(bf6c31dbd791890217c66d16f88c619d162fc86d5769b0e050f04b5bb8732743): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-4ftxm"
Oct 11 01:01:12 crc kubenswrapper[4743]: E1011 01:01:12.515089 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-7c8cf85677-4ftxm_openshift-operators(9918e6d2-f9d4-4c2f-93ef-cc952577182b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-7c8cf85677-4ftxm_openshift-operators(9918e6d2-f9d4-4c2f-93ef-cc952577182b)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-7c8cf85677-4ftxm_openshift-operators_9918e6d2-f9d4-4c2f-93ef-cc952577182b_0(bf6c31dbd791890217c66d16f88c619d162fc86d5769b0e050f04b5bb8732743): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-4ftxm" podUID="9918e6d2-f9d4-4c2f-93ef-cc952577182b"
Oct 11 01:01:12 crc kubenswrapper[4743]: I1011 01:01:12.546298 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e62c0910-f3a4-4c85-9ad5-88f6fa5262df-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-55f6745849-6fxfg\" (UID: \"e62c0910-f3a4-4c85-9ad5-88f6fa5262df\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-55f6745849-6fxfg"
Oct 11 01:01:12 crc kubenswrapper[4743]: I1011 01:01:12.546382 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e62c0910-f3a4-4c85-9ad5-88f6fa5262df-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-55f6745849-6fxfg\" (UID: \"e62c0910-f3a4-4c85-9ad5-88f6fa5262df\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-55f6745849-6fxfg"
Oct 11 01:01:12 crc kubenswrapper[4743]: I1011 01:01:12.546414 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/c4642856-48b2-4843-a11e-1a207a8c8efc-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-sgfrq\" (UID: \"c4642856-48b2-4843-a11e-1a207a8c8efc\") " pod="openshift-operators/observability-operator-cc5f78dfc-sgfrq"
Oct 11 01:01:12 crc kubenswrapper[4743]: I1011 01:01:12.546434 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6tck\" (UniqueName: \"kubernetes.io/projected/c4642856-48b2-4843-a11e-1a207a8c8efc-kube-api-access-v6tck\") pod \"observability-operator-cc5f78dfc-sgfrq\" (UID: \"c4642856-48b2-4843-a11e-1a207a8c8efc\") " pod="openshift-operators/observability-operator-cc5f78dfc-sgfrq"
Oct 11 01:01:12 crc kubenswrapper[4743]: I1011 01:01:12.549639 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e62c0910-f3a4-4c85-9ad5-88f6fa5262df-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-55f6745849-6fxfg\" (UID: \"e62c0910-f3a4-4c85-9ad5-88f6fa5262df\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-55f6745849-6fxfg"
Oct 11 01:01:12 crc kubenswrapper[4743]: I1011 01:01:12.550287 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e62c0910-f3a4-4c85-9ad5-88f6fa5262df-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-55f6745849-6fxfg\" (UID: \"e62c0910-f3a4-4c85-9ad5-88f6fa5262df\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-55f6745849-6fxfg"
Oct 11 01:01:12 crc kubenswrapper[4743]: I1011 01:01:12.576071 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-bdtnb"]
Oct 11 01:01:12 crc kubenswrapper[4743]: I1011 01:01:12.576704 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-bdtnb"
Oct 11 01:01:12 crc kubenswrapper[4743]: I1011 01:01:12.578718 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-m5k7g"
Oct 11 01:01:12 crc kubenswrapper[4743]: I1011 01:01:12.621277 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-55f6745849-chpjb"
Oct 11 01:01:12 crc kubenswrapper[4743]: E1011 01:01:12.641484 4743 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-55f6745849-chpjb_openshift-operators_97d6cff1-f86b-4110-9c64-907a97ea4ceb_0(6487c7f00c552ba67689d3132c38b8a483c50a31d4bc487515f1c40697eb19d9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Oct 11 01:01:12 crc kubenswrapper[4743]: E1011 01:01:12.641549 4743 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-55f6745849-chpjb_openshift-operators_97d6cff1-f86b-4110-9c64-907a97ea4ceb_0(6487c7f00c552ba67689d3132c38b8a483c50a31d4bc487515f1c40697eb19d9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-55f6745849-chpjb"
Oct 11 01:01:12 crc kubenswrapper[4743]: E1011 01:01:12.641571 4743 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-55f6745849-chpjb_openshift-operators_97d6cff1-f86b-4110-9c64-907a97ea4ceb_0(6487c7f00c552ba67689d3132c38b8a483c50a31d4bc487515f1c40697eb19d9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-operators/obo-prometheus-operator-admission-webhook-55f6745849-chpjb" Oct 11 01:01:12 crc kubenswrapper[4743]: E1011 01:01:12.641613 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-55f6745849-chpjb_openshift-operators(97d6cff1-f86b-4110-9c64-907a97ea4ceb)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-55f6745849-chpjb_openshift-operators(97d6cff1-f86b-4110-9c64-907a97ea4ceb)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-55f6745849-chpjb_openshift-operators_97d6cff1-f86b-4110-9c64-907a97ea4ceb_0(6487c7f00c552ba67689d3132c38b8a483c50a31d4bc487515f1c40697eb19d9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-55f6745849-chpjb" podUID="97d6cff1-f86b-4110-9c64-907a97ea4ceb" Oct 11 01:01:12 crc kubenswrapper[4743]: I1011 01:01:12.645229 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-55f6745849-6fxfg" Oct 11 01:01:12 crc kubenswrapper[4743]: I1011 01:01:12.647778 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/c4642856-48b2-4843-a11e-1a207a8c8efc-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-sgfrq\" (UID: \"c4642856-48b2-4843-a11e-1a207a8c8efc\") " pod="openshift-operators/observability-operator-cc5f78dfc-sgfrq" Oct 11 01:01:12 crc kubenswrapper[4743]: I1011 01:01:12.647820 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6tck\" (UniqueName: \"kubernetes.io/projected/c4642856-48b2-4843-a11e-1a207a8c8efc-kube-api-access-v6tck\") pod \"observability-operator-cc5f78dfc-sgfrq\" (UID: \"c4642856-48b2-4843-a11e-1a207a8c8efc\") " pod="openshift-operators/observability-operator-cc5f78dfc-sgfrq" Oct 11 01:01:12 crc kubenswrapper[4743]: I1011 01:01:12.647878 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/fe87b24f-db4d-49cd-a2de-ab949443ecea-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-bdtnb\" (UID: \"fe87b24f-db4d-49cd-a2de-ab949443ecea\") " pod="openshift-operators/perses-operator-54bc95c9fb-bdtnb" Oct 11 01:01:12 crc kubenswrapper[4743]: I1011 01:01:12.647918 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wd6w2\" (UniqueName: \"kubernetes.io/projected/fe87b24f-db4d-49cd-a2de-ab949443ecea-kube-api-access-wd6w2\") pod \"perses-operator-54bc95c9fb-bdtnb\" (UID: \"fe87b24f-db4d-49cd-a2de-ab949443ecea\") " pod="openshift-operators/perses-operator-54bc95c9fb-bdtnb" Oct 11 01:01:12 crc kubenswrapper[4743]: I1011 01:01:12.651089 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/c4642856-48b2-4843-a11e-1a207a8c8efc-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-sgfrq\" (UID: \"c4642856-48b2-4843-a11e-1a207a8c8efc\") " pod="openshift-operators/observability-operator-cc5f78dfc-sgfrq" Oct 11 01:01:12 crc kubenswrapper[4743]: E1011 01:01:12.667030 4743 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-55f6745849-6fxfg_openshift-operators_e62c0910-f3a4-4c85-9ad5-88f6fa5262df_0(c1f1f72657db8c7d120d5d70c19d6cd67c3172693a70cfac3c07d3abad7b1725): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 11 01:01:12 crc kubenswrapper[4743]: E1011 01:01:12.667100 4743 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-55f6745849-6fxfg_openshift-operators_e62c0910-f3a4-4c85-9ad5-88f6fa5262df_0(c1f1f72657db8c7d120d5d70c19d6cd67c3172693a70cfac3c07d3abad7b1725): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-55f6745849-6fxfg" Oct 11 01:01:12 crc kubenswrapper[4743]: E1011 01:01:12.667120 4743 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-55f6745849-6fxfg_openshift-operators_e62c0910-f3a4-4c85-9ad5-88f6fa5262df_0(c1f1f72657db8c7d120d5d70c19d6cd67c3172693a70cfac3c07d3abad7b1725): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-55f6745849-6fxfg" Oct 11 01:01:12 crc kubenswrapper[4743]: E1011 01:01:12.667163 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-55f6745849-6fxfg_openshift-operators(e62c0910-f3a4-4c85-9ad5-88f6fa5262df)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-55f6745849-6fxfg_openshift-operators(e62c0910-f3a4-4c85-9ad5-88f6fa5262df)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-55f6745849-6fxfg_openshift-operators_e62c0910-f3a4-4c85-9ad5-88f6fa5262df_0(c1f1f72657db8c7d120d5d70c19d6cd67c3172693a70cfac3c07d3abad7b1725): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-55f6745849-6fxfg" podUID="e62c0910-f3a4-4c85-9ad5-88f6fa5262df" Oct 11 01:01:12 crc kubenswrapper[4743]: I1011 01:01:12.672472 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6tck\" (UniqueName: \"kubernetes.io/projected/c4642856-48b2-4843-a11e-1a207a8c8efc-kube-api-access-v6tck\") pod \"observability-operator-cc5f78dfc-sgfrq\" (UID: \"c4642856-48b2-4843-a11e-1a207a8c8efc\") " pod="openshift-operators/observability-operator-cc5f78dfc-sgfrq" Oct 11 01:01:12 crc kubenswrapper[4743]: I1011 01:01:12.748809 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/fe87b24f-db4d-49cd-a2de-ab949443ecea-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-bdtnb\" (UID: \"fe87b24f-db4d-49cd-a2de-ab949443ecea\") " pod="openshift-operators/perses-operator-54bc95c9fb-bdtnb" Oct 11 01:01:12 crc kubenswrapper[4743]: I1011 01:01:12.748898 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-wd6w2\" (UniqueName: \"kubernetes.io/projected/fe87b24f-db4d-49cd-a2de-ab949443ecea-kube-api-access-wd6w2\") pod \"perses-operator-54bc95c9fb-bdtnb\" (UID: \"fe87b24f-db4d-49cd-a2de-ab949443ecea\") " pod="openshift-operators/perses-operator-54bc95c9fb-bdtnb" Oct 11 01:01:12 crc kubenswrapper[4743]: I1011 01:01:12.750037 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/fe87b24f-db4d-49cd-a2de-ab949443ecea-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-bdtnb\" (UID: \"fe87b24f-db4d-49cd-a2de-ab949443ecea\") " pod="openshift-operators/perses-operator-54bc95c9fb-bdtnb" Oct 11 01:01:12 crc kubenswrapper[4743]: I1011 01:01:12.764426 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wd6w2\" (UniqueName: \"kubernetes.io/projected/fe87b24f-db4d-49cd-a2de-ab949443ecea-kube-api-access-wd6w2\") pod \"perses-operator-54bc95c9fb-bdtnb\" (UID: \"fe87b24f-db4d-49cd-a2de-ab949443ecea\") " pod="openshift-operators/perses-operator-54bc95c9fb-bdtnb" Oct 11 01:01:12 crc kubenswrapper[4743]: I1011 01:01:12.788265 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-sgfrq" Oct 11 01:01:12 crc kubenswrapper[4743]: I1011 01:01:12.788818 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9qrj4" event={"ID":"76209f67-b445-40c1-bdb9-7b82c1b18860","Type":"ContainerStarted","Data":"5f3c58fedf5d69662cc576cf1cf51c3b694e5f39e731d9e0bed8ada4a39a03ac"} Oct 11 01:01:12 crc kubenswrapper[4743]: I1011 01:01:12.788869 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9qrj4" event={"ID":"76209f67-b445-40c1-bdb9-7b82c1b18860","Type":"ContainerStarted","Data":"9c1e48b4fa46df5c94dfd894f6e20e4a9c38f5d2dd1de96e0c428b5dae390703"} Oct 11 01:01:12 crc kubenswrapper[4743]: I1011 01:01:12.788881 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9qrj4" event={"ID":"76209f67-b445-40c1-bdb9-7b82c1b18860","Type":"ContainerStarted","Data":"2318b9046432c7b8a442868fa7ae81588b211f81d70d55455baf2a2db0dcb824"} Oct 11 01:01:12 crc kubenswrapper[4743]: I1011 01:01:12.788891 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9qrj4" event={"ID":"76209f67-b445-40c1-bdb9-7b82c1b18860","Type":"ContainerStarted","Data":"aac3dde65a0a05b2e6becea7de42dec86ff371732ad463c004a54b120e446293"} Oct 11 01:01:12 crc kubenswrapper[4743]: I1011 01:01:12.788900 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9qrj4" event={"ID":"76209f67-b445-40c1-bdb9-7b82c1b18860","Type":"ContainerStarted","Data":"efce69acabfec17d5d7f353eb95d91c98a12d6b7769d25db09cb1938b11264d2"} Oct 11 01:01:12 crc kubenswrapper[4743]: I1011 01:01:12.788908 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9qrj4" 
event={"ID":"76209f67-b445-40c1-bdb9-7b82c1b18860","Type":"ContainerStarted","Data":"76cbcd9c474199e800ae3837ebc909abb1726f02914cf2401624c93f3aff429b"} Oct 11 01:01:12 crc kubenswrapper[4743]: E1011 01:01:12.814287 4743 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-cc5f78dfc-sgfrq_openshift-operators_c4642856-48b2-4843-a11e-1a207a8c8efc_0(2c37ea922b741697c3db5ff5a022dec0abc788bd23320b260ae791c3924d990e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 11 01:01:12 crc kubenswrapper[4743]: E1011 01:01:12.814348 4743 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-cc5f78dfc-sgfrq_openshift-operators_c4642856-48b2-4843-a11e-1a207a8c8efc_0(2c37ea922b741697c3db5ff5a022dec0abc788bd23320b260ae791c3924d990e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-cc5f78dfc-sgfrq" Oct 11 01:01:12 crc kubenswrapper[4743]: E1011 01:01:12.814370 4743 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-cc5f78dfc-sgfrq_openshift-operators_c4642856-48b2-4843-a11e-1a207a8c8efc_0(2c37ea922b741697c3db5ff5a022dec0abc788bd23320b260ae791c3924d990e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/observability-operator-cc5f78dfc-sgfrq" Oct 11 01:01:12 crc kubenswrapper[4743]: E1011 01:01:12.814412 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-cc5f78dfc-sgfrq_openshift-operators(c4642856-48b2-4843-a11e-1a207a8c8efc)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-cc5f78dfc-sgfrq_openshift-operators(c4642856-48b2-4843-a11e-1a207a8c8efc)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-cc5f78dfc-sgfrq_openshift-operators_c4642856-48b2-4843-a11e-1a207a8c8efc_0(2c37ea922b741697c3db5ff5a022dec0abc788bd23320b260ae791c3924d990e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-cc5f78dfc-sgfrq" podUID="c4642856-48b2-4843-a11e-1a207a8c8efc" Oct 11 01:01:12 crc kubenswrapper[4743]: I1011 01:01:12.893613 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-bdtnb" Oct 11 01:01:12 crc kubenswrapper[4743]: E1011 01:01:12.909560 4743 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-54bc95c9fb-bdtnb_openshift-operators_fe87b24f-db4d-49cd-a2de-ab949443ecea_0(5c1d838c969fd43a4731a09184c5523dd9f2d1c9f50805783a619e7db076780f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Oct 11 01:01:12 crc kubenswrapper[4743]: E1011 01:01:12.909620 4743 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-54bc95c9fb-bdtnb_openshift-operators_fe87b24f-db4d-49cd-a2de-ab949443ecea_0(5c1d838c969fd43a4731a09184c5523dd9f2d1c9f50805783a619e7db076780f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-54bc95c9fb-bdtnb" Oct 11 01:01:12 crc kubenswrapper[4743]: E1011 01:01:12.909646 4743 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-54bc95c9fb-bdtnb_openshift-operators_fe87b24f-db4d-49cd-a2de-ab949443ecea_0(5c1d838c969fd43a4731a09184c5523dd9f2d1c9f50805783a619e7db076780f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-54bc95c9fb-bdtnb" Oct 11 01:01:12 crc kubenswrapper[4743]: E1011 01:01:12.909696 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-54bc95c9fb-bdtnb_openshift-operators(fe87b24f-db4d-49cd-a2de-ab949443ecea)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-54bc95c9fb-bdtnb_openshift-operators(fe87b24f-db4d-49cd-a2de-ab949443ecea)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-54bc95c9fb-bdtnb_openshift-operators_fe87b24f-db4d-49cd-a2de-ab949443ecea_0(5c1d838c969fd43a4731a09184c5523dd9f2d1c9f50805783a619e7db076780f): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/perses-operator-54bc95c9fb-bdtnb" podUID="fe87b24f-db4d-49cd-a2de-ab949443ecea" Oct 11 01:01:14 crc kubenswrapper[4743]: I1011 01:01:14.458707 4743 patch_prober.go:28] interesting pod/machine-config-daemon-cvm72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 01:01:14 crc kubenswrapper[4743]: I1011 01:01:14.458999 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 01:01:14 crc kubenswrapper[4743]: I1011 01:01:14.459045 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" Oct 11 01:01:14 crc kubenswrapper[4743]: I1011 01:01:14.459582 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f4a50228ff369b6861f5b6579c1f5b36360b57624267c00cfb8d313cffab1c5d"} pod="openshift-machine-config-operator/machine-config-daemon-cvm72" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 11 01:01:14 crc kubenswrapper[4743]: I1011 01:01:14.459634 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" containerID="cri-o://f4a50228ff369b6861f5b6579c1f5b36360b57624267c00cfb8d313cffab1c5d" gracePeriod=600 Oct 11 01:01:14 crc kubenswrapper[4743]: I1011 01:01:14.802273 4743 
generic.go:334] "Generic (PLEG): container finished" podID="add92263-e252-446b-95de-092585b4357f" containerID="f4a50228ff369b6861f5b6579c1f5b36360b57624267c00cfb8d313cffab1c5d" exitCode=0 Oct 11 01:01:14 crc kubenswrapper[4743]: I1011 01:01:14.802353 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" event={"ID":"add92263-e252-446b-95de-092585b4357f","Type":"ContainerDied","Data":"f4a50228ff369b6861f5b6579c1f5b36360b57624267c00cfb8d313cffab1c5d"} Oct 11 01:01:14 crc kubenswrapper[4743]: I1011 01:01:14.802689 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" event={"ID":"add92263-e252-446b-95de-092585b4357f","Type":"ContainerStarted","Data":"88127d52f7db156c5804bc403a408594bcfb43a90269eb1302483bd25dec7ebe"} Oct 11 01:01:14 crc kubenswrapper[4743]: I1011 01:01:14.802717 4743 scope.go:117] "RemoveContainer" containerID="d4ccb047d6639dbadc8db37d34bacbcce79ae6b61d67f9ebe1557bf1798750cf" Oct 11 01:01:14 crc kubenswrapper[4743]: I1011 01:01:14.808478 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9qrj4" event={"ID":"76209f67-b445-40c1-bdb9-7b82c1b18860","Type":"ContainerStarted","Data":"dd564675c32b1a1141ea69e26478829225ee71161ddc4a7ca6e29a646463ec3c"} Oct 11 01:01:17 crc kubenswrapper[4743]: I1011 01:01:17.767609 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-sgfrq"] Oct 11 01:01:17 crc kubenswrapper[4743]: I1011 01:01:17.768307 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-sgfrq" Oct 11 01:01:17 crc kubenswrapper[4743]: I1011 01:01:17.768863 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-sgfrq" Oct 11 01:01:17 crc kubenswrapper[4743]: E1011 01:01:17.824822 4743 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-cc5f78dfc-sgfrq_openshift-operators_c4642856-48b2-4843-a11e-1a207a8c8efc_0(94b0b3d3f31c172733e014a8f6065aebdf34701a2b1b6e84765b83feb9842dc4): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 11 01:01:17 crc kubenswrapper[4743]: E1011 01:01:17.824901 4743 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-cc5f78dfc-sgfrq_openshift-operators_c4642856-48b2-4843-a11e-1a207a8c8efc_0(94b0b3d3f31c172733e014a8f6065aebdf34701a2b1b6e84765b83feb9842dc4): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-cc5f78dfc-sgfrq" Oct 11 01:01:17 crc kubenswrapper[4743]: E1011 01:01:17.824921 4743 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-cc5f78dfc-sgfrq_openshift-operators_c4642856-48b2-4843-a11e-1a207a8c8efc_0(94b0b3d3f31c172733e014a8f6065aebdf34701a2b1b6e84765b83feb9842dc4): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/observability-operator-cc5f78dfc-sgfrq" Oct 11 01:01:17 crc kubenswrapper[4743]: E1011 01:01:17.824967 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-cc5f78dfc-sgfrq_openshift-operators(c4642856-48b2-4843-a11e-1a207a8c8efc)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-cc5f78dfc-sgfrq_openshift-operators(c4642856-48b2-4843-a11e-1a207a8c8efc)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-cc5f78dfc-sgfrq_openshift-operators_c4642856-48b2-4843-a11e-1a207a8c8efc_0(94b0b3d3f31c172733e014a8f6065aebdf34701a2b1b6e84765b83feb9842dc4): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-cc5f78dfc-sgfrq" podUID="c4642856-48b2-4843-a11e-1a207a8c8efc" Oct 11 01:01:17 crc kubenswrapper[4743]: I1011 01:01:17.828553 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-55f6745849-6fxfg"] Oct 11 01:01:17 crc kubenswrapper[4743]: I1011 01:01:17.828676 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-55f6745849-6fxfg" Oct 11 01:01:17 crc kubenswrapper[4743]: I1011 01:01:17.829115 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-55f6745849-6fxfg" Oct 11 01:01:17 crc kubenswrapper[4743]: I1011 01:01:17.835694 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9qrj4" event={"ID":"76209f67-b445-40c1-bdb9-7b82c1b18860","Type":"ContainerStarted","Data":"3b13e8ace1b963fa5b89d6db97257477ff341c4d2f27613ffb2c4f600dee3937"} Oct 11 01:01:17 crc kubenswrapper[4743]: I1011 01:01:17.836836 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9qrj4" Oct 11 01:01:17 crc kubenswrapper[4743]: I1011 01:01:17.836878 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9qrj4" Oct 11 01:01:17 crc kubenswrapper[4743]: I1011 01:01:17.836950 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9qrj4" Oct 11 01:01:17 crc kubenswrapper[4743]: I1011 01:01:17.850582 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-bdtnb"] Oct 11 01:01:17 crc kubenswrapper[4743]: I1011 01:01:17.850703 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-bdtnb" Oct 11 01:01:17 crc kubenswrapper[4743]: I1011 01:01:17.851139 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-bdtnb" Oct 11 01:01:17 crc kubenswrapper[4743]: I1011 01:01:17.863598 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-4ftxm"] Oct 11 01:01:17 crc kubenswrapper[4743]: I1011 01:01:17.863704 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-4ftxm" Oct 11 01:01:17 crc kubenswrapper[4743]: E1011 01:01:17.864922 4743 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-55f6745849-6fxfg_openshift-operators_e62c0910-f3a4-4c85-9ad5-88f6fa5262df_0(b7d96d4f7c778d1e4feebfad184eb7d93bfa25bcf0b4f5b3e50d27fcff07c0e8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 11 01:01:17 crc kubenswrapper[4743]: E1011 01:01:17.865030 4743 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-55f6745849-6fxfg_openshift-operators_e62c0910-f3a4-4c85-9ad5-88f6fa5262df_0(b7d96d4f7c778d1e4feebfad184eb7d93bfa25bcf0b4f5b3e50d27fcff07c0e8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-55f6745849-6fxfg" Oct 11 01:01:17 crc kubenswrapper[4743]: E1011 01:01:17.865109 4743 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-55f6745849-6fxfg_openshift-operators_e62c0910-f3a4-4c85-9ad5-88f6fa5262df_0(b7d96d4f7c778d1e4feebfad184eb7d93bfa25bcf0b4f5b3e50d27fcff07c0e8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-55f6745849-6fxfg" Oct 11 01:01:17 crc kubenswrapper[4743]: E1011 01:01:17.865206 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-55f6745849-6fxfg_openshift-operators(e62c0910-f3a4-4c85-9ad5-88f6fa5262df)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-55f6745849-6fxfg_openshift-operators(e62c0910-f3a4-4c85-9ad5-88f6fa5262df)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-55f6745849-6fxfg_openshift-operators_e62c0910-f3a4-4c85-9ad5-88f6fa5262df_0(b7d96d4f7c778d1e4feebfad184eb7d93bfa25bcf0b4f5b3e50d27fcff07c0e8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-55f6745849-6fxfg" podUID="e62c0910-f3a4-4c85-9ad5-88f6fa5262df" Oct 11 01:01:17 crc kubenswrapper[4743]: I1011 01:01:17.869375 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-4ftxm" Oct 11 01:01:17 crc kubenswrapper[4743]: I1011 01:01:17.878196 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-55f6745849-chpjb"] Oct 11 01:01:17 crc kubenswrapper[4743]: I1011 01:01:17.878296 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-55f6745849-chpjb" Oct 11 01:01:17 crc kubenswrapper[4743]: I1011 01:01:17.878671 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-55f6745849-chpjb" Oct 11 01:01:17 crc kubenswrapper[4743]: E1011 01:01:17.887560 4743 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-54bc95c9fb-bdtnb_openshift-operators_fe87b24f-db4d-49cd-a2de-ab949443ecea_0(7ee04f4078c2e8a2890e6142c978f7ceb11345ff51aad737514591e046f9203a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 11 01:01:17 crc kubenswrapper[4743]: E1011 01:01:17.887621 4743 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-54bc95c9fb-bdtnb_openshift-operators_fe87b24f-db4d-49cd-a2de-ab949443ecea_0(7ee04f4078c2e8a2890e6142c978f7ceb11345ff51aad737514591e046f9203a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-54bc95c9fb-bdtnb" Oct 11 01:01:17 crc kubenswrapper[4743]: E1011 01:01:17.887641 4743 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-54bc95c9fb-bdtnb_openshift-operators_fe87b24f-db4d-49cd-a2de-ab949443ecea_0(7ee04f4078c2e8a2890e6142c978f7ceb11345ff51aad737514591e046f9203a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/perses-operator-54bc95c9fb-bdtnb" Oct 11 01:01:17 crc kubenswrapper[4743]: E1011 01:01:17.887678 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-54bc95c9fb-bdtnb_openshift-operators(fe87b24f-db4d-49cd-a2de-ab949443ecea)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-54bc95c9fb-bdtnb_openshift-operators(fe87b24f-db4d-49cd-a2de-ab949443ecea)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-54bc95c9fb-bdtnb_openshift-operators_fe87b24f-db4d-49cd-a2de-ab949443ecea_0(7ee04f4078c2e8a2890e6142c978f7ceb11345ff51aad737514591e046f9203a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-54bc95c9fb-bdtnb" podUID="fe87b24f-db4d-49cd-a2de-ab949443ecea" Oct 11 01:01:17 crc kubenswrapper[4743]: I1011 01:01:17.892687 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-9qrj4" podStartSLOduration=7.892677481 podStartE2EDuration="7.892677481s" podCreationTimestamp="2025-10-11 01:01:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 01:01:17.890127473 +0000 UTC m=+572.543107880" watchObservedRunningTime="2025-10-11 01:01:17.892677481 +0000 UTC m=+572.545657868" Oct 11 01:01:17 crc kubenswrapper[4743]: I1011 01:01:17.895119 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9qrj4" Oct 11 01:01:17 crc kubenswrapper[4743]: I1011 01:01:17.902274 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9qrj4" Oct 11 01:01:17 crc kubenswrapper[4743]: E1011 01:01:17.920294 4743 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code 
= Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-55f6745849-chpjb_openshift-operators_97d6cff1-f86b-4110-9c64-907a97ea4ceb_0(2634b979d22490df7e9e7ec66ee48a43ed01b261b20502b3810d88eed4aa8e33): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 11 01:01:17 crc kubenswrapper[4743]: E1011 01:01:17.920622 4743 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-55f6745849-chpjb_openshift-operators_97d6cff1-f86b-4110-9c64-907a97ea4ceb_0(2634b979d22490df7e9e7ec66ee48a43ed01b261b20502b3810d88eed4aa8e33): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-55f6745849-chpjb" Oct 11 01:01:17 crc kubenswrapper[4743]: E1011 01:01:17.920645 4743 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-55f6745849-chpjb_openshift-operators_97d6cff1-f86b-4110-9c64-907a97ea4ceb_0(2634b979d22490df7e9e7ec66ee48a43ed01b261b20502b3810d88eed4aa8e33): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-55f6745849-chpjb" Oct 11 01:01:17 crc kubenswrapper[4743]: E1011 01:01:17.920691 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-55f6745849-chpjb_openshift-operators(97d6cff1-f86b-4110-9c64-907a97ea4ceb)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-55f6745849-chpjb_openshift-operators(97d6cff1-f86b-4110-9c64-907a97ea4ceb)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-55f6745849-chpjb_openshift-operators_97d6cff1-f86b-4110-9c64-907a97ea4ceb_0(2634b979d22490df7e9e7ec66ee48a43ed01b261b20502b3810d88eed4aa8e33): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-55f6745849-chpjb" podUID="97d6cff1-f86b-4110-9c64-907a97ea4ceb" Oct 11 01:01:17 crc kubenswrapper[4743]: E1011 01:01:17.923745 4743 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-7c8cf85677-4ftxm_openshift-operators_9918e6d2-f9d4-4c2f-93ef-cc952577182b_0(ac31b17a1d98f764b7bbcd815d57d09f6687e581801e16befc99a3643ac344c8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 11 01:01:17 crc kubenswrapper[4743]: E1011 01:01:17.923777 4743 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-7c8cf85677-4ftxm_openshift-operators_9918e6d2-f9d4-4c2f-93ef-cc952577182b_0(ac31b17a1d98f764b7bbcd815d57d09f6687e581801e16befc99a3643ac344c8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-7c8cf85677-4ftxm" Oct 11 01:01:17 crc kubenswrapper[4743]: E1011 01:01:17.923793 4743 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-7c8cf85677-4ftxm_openshift-operators_9918e6d2-f9d4-4c2f-93ef-cc952577182b_0(ac31b17a1d98f764b7bbcd815d57d09f6687e581801e16befc99a3643ac344c8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-4ftxm" Oct 11 01:01:17 crc kubenswrapper[4743]: E1011 01:01:17.923824 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-7c8cf85677-4ftxm_openshift-operators(9918e6d2-f9d4-4c2f-93ef-cc952577182b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-7c8cf85677-4ftxm_openshift-operators(9918e6d2-f9d4-4c2f-93ef-cc952577182b)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-7c8cf85677-4ftxm_openshift-operators_9918e6d2-f9d4-4c2f-93ef-cc952577182b_0(ac31b17a1d98f764b7bbcd815d57d09f6687e581801e16befc99a3643ac344c8): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-4ftxm" podUID="9918e6d2-f9d4-4c2f-93ef-cc952577182b" Oct 11 01:01:22 crc kubenswrapper[4743]: I1011 01:01:22.092673 4743 scope.go:117] "RemoveContainer" containerID="bdc42fd21a8b6982fc5516915cecbd0521737b5b4fd27556f887dbf66219ef33" Oct 11 01:01:22 crc kubenswrapper[4743]: E1011 01:01:22.093511 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-9jfxn_openshift-multus(e8c603f4-717c-4554-992a-8338b3bef24d)\"" pod="openshift-multus/multus-9jfxn" podUID="e8c603f4-717c-4554-992a-8338b3bef24d" Oct 11 01:01:29 crc kubenswrapper[4743]: I1011 01:01:29.091571 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-55f6745849-chpjb" Oct 11 01:01:29 crc kubenswrapper[4743]: I1011 01:01:29.092380 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-55f6745849-chpjb" Oct 11 01:01:29 crc kubenswrapper[4743]: E1011 01:01:29.119419 4743 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-55f6745849-chpjb_openshift-operators_97d6cff1-f86b-4110-9c64-907a97ea4ceb_0(337bafa52b57e6327373277c5cb013290d18989ec3dcda50f91cd01123031d83): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Oct 11 01:01:29 crc kubenswrapper[4743]: E1011 01:01:29.119788 4743 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-55f6745849-chpjb_openshift-operators_97d6cff1-f86b-4110-9c64-907a97ea4ceb_0(337bafa52b57e6327373277c5cb013290d18989ec3dcda50f91cd01123031d83): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-55f6745849-chpjb" Oct 11 01:01:29 crc kubenswrapper[4743]: E1011 01:01:29.119821 4743 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-55f6745849-chpjb_openshift-operators_97d6cff1-f86b-4110-9c64-907a97ea4ceb_0(337bafa52b57e6327373277c5cb013290d18989ec3dcda50f91cd01123031d83): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-55f6745849-chpjb" Oct 11 01:01:29 crc kubenswrapper[4743]: E1011 01:01:29.119927 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-55f6745849-chpjb_openshift-operators(97d6cff1-f86b-4110-9c64-907a97ea4ceb)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-55f6745849-chpjb_openshift-operators(97d6cff1-f86b-4110-9c64-907a97ea4ceb)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-55f6745849-chpjb_openshift-operators_97d6cff1-f86b-4110-9c64-907a97ea4ceb_0(337bafa52b57e6327373277c5cb013290d18989ec3dcda50f91cd01123031d83): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-55f6745849-chpjb" podUID="97d6cff1-f86b-4110-9c64-907a97ea4ceb" Oct 11 01:01:31 crc kubenswrapper[4743]: I1011 01:01:31.091076 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-bdtnb" Oct 11 01:01:31 crc kubenswrapper[4743]: I1011 01:01:31.091653 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-bdtnb" Oct 11 01:01:31 crc kubenswrapper[4743]: E1011 01:01:31.116375 4743 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-54bc95c9fb-bdtnb_openshift-operators_fe87b24f-db4d-49cd-a2de-ab949443ecea_0(9c0d035d13702932f8173c04b103e3b62cfee0cb98318f5e361a42ad28eb02c7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 11 01:01:31 crc kubenswrapper[4743]: E1011 01:01:31.116461 4743 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-54bc95c9fb-bdtnb_openshift-operators_fe87b24f-db4d-49cd-a2de-ab949443ecea_0(9c0d035d13702932f8173c04b103e3b62cfee0cb98318f5e361a42ad28eb02c7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-54bc95c9fb-bdtnb" Oct 11 01:01:31 crc kubenswrapper[4743]: E1011 01:01:31.116488 4743 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-54bc95c9fb-bdtnb_openshift-operators_fe87b24f-db4d-49cd-a2de-ab949443ecea_0(9c0d035d13702932f8173c04b103e3b62cfee0cb98318f5e361a42ad28eb02c7): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-operators/perses-operator-54bc95c9fb-bdtnb" Oct 11 01:01:31 crc kubenswrapper[4743]: E1011 01:01:31.116540 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-54bc95c9fb-bdtnb_openshift-operators(fe87b24f-db4d-49cd-a2de-ab949443ecea)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-54bc95c9fb-bdtnb_openshift-operators(fe87b24f-db4d-49cd-a2de-ab949443ecea)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-54bc95c9fb-bdtnb_openshift-operators_fe87b24f-db4d-49cd-a2de-ab949443ecea_0(9c0d035d13702932f8173c04b103e3b62cfee0cb98318f5e361a42ad28eb02c7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-54bc95c9fb-bdtnb" podUID="fe87b24f-db4d-49cd-a2de-ab949443ecea" Oct 11 01:01:32 crc kubenswrapper[4743]: I1011 01:01:32.091481 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-4ftxm" Oct 11 01:01:32 crc kubenswrapper[4743]: I1011 01:01:32.091545 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-sgfrq" Oct 11 01:01:32 crc kubenswrapper[4743]: I1011 01:01:32.092285 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-4ftxm" Oct 11 01:01:32 crc kubenswrapper[4743]: I1011 01:01:32.092289 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-sgfrq" Oct 11 01:01:32 crc kubenswrapper[4743]: E1011 01:01:32.141502 4743 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-cc5f78dfc-sgfrq_openshift-operators_c4642856-48b2-4843-a11e-1a207a8c8efc_0(6edd67c383b18798443eb8c951af186d113556a91d76e477fe4f5ef2d412bf54): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 11 01:01:32 crc kubenswrapper[4743]: E1011 01:01:32.141572 4743 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-cc5f78dfc-sgfrq_openshift-operators_c4642856-48b2-4843-a11e-1a207a8c8efc_0(6edd67c383b18798443eb8c951af186d113556a91d76e477fe4f5ef2d412bf54): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-cc5f78dfc-sgfrq" Oct 11 01:01:32 crc kubenswrapper[4743]: E1011 01:01:32.141595 4743 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-cc5f78dfc-sgfrq_openshift-operators_c4642856-48b2-4843-a11e-1a207a8c8efc_0(6edd67c383b18798443eb8c951af186d113556a91d76e477fe4f5ef2d412bf54): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/observability-operator-cc5f78dfc-sgfrq" Oct 11 01:01:32 crc kubenswrapper[4743]: E1011 01:01:32.141686 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-cc5f78dfc-sgfrq_openshift-operators(c4642856-48b2-4843-a11e-1a207a8c8efc)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-cc5f78dfc-sgfrq_openshift-operators(c4642856-48b2-4843-a11e-1a207a8c8efc)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-cc5f78dfc-sgfrq_openshift-operators_c4642856-48b2-4843-a11e-1a207a8c8efc_0(6edd67c383b18798443eb8c951af186d113556a91d76e477fe4f5ef2d412bf54): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-cc5f78dfc-sgfrq" podUID="c4642856-48b2-4843-a11e-1a207a8c8efc" Oct 11 01:01:32 crc kubenswrapper[4743]: E1011 01:01:32.159553 4743 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-7c8cf85677-4ftxm_openshift-operators_9918e6d2-f9d4-4c2f-93ef-cc952577182b_0(2455b52d7d1722cc037420f1b7652633289167d14350e74c7c2ee18f7120f35f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 11 01:01:32 crc kubenswrapper[4743]: E1011 01:01:32.159629 4743 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-7c8cf85677-4ftxm_openshift-operators_9918e6d2-f9d4-4c2f-93ef-cc952577182b_0(2455b52d7d1722cc037420f1b7652633289167d14350e74c7c2ee18f7120f35f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-7c8cf85677-4ftxm" Oct 11 01:01:32 crc kubenswrapper[4743]: E1011 01:01:32.159650 4743 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-7c8cf85677-4ftxm_openshift-operators_9918e6d2-f9d4-4c2f-93ef-cc952577182b_0(2455b52d7d1722cc037420f1b7652633289167d14350e74c7c2ee18f7120f35f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-4ftxm" Oct 11 01:01:32 crc kubenswrapper[4743]: E1011 01:01:32.159695 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-7c8cf85677-4ftxm_openshift-operators(9918e6d2-f9d4-4c2f-93ef-cc952577182b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-7c8cf85677-4ftxm_openshift-operators(9918e6d2-f9d4-4c2f-93ef-cc952577182b)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-7c8cf85677-4ftxm_openshift-operators_9918e6d2-f9d4-4c2f-93ef-cc952577182b_0(2455b52d7d1722cc037420f1b7652633289167d14350e74c7c2ee18f7120f35f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-4ftxm" podUID="9918e6d2-f9d4-4c2f-93ef-cc952577182b" Oct 11 01:01:33 crc kubenswrapper[4743]: I1011 01:01:33.091220 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-55f6745849-6fxfg" Oct 11 01:01:33 crc kubenswrapper[4743]: I1011 01:01:33.091916 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-55f6745849-6fxfg" Oct 11 01:01:33 crc kubenswrapper[4743]: E1011 01:01:33.116472 4743 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-55f6745849-6fxfg_openshift-operators_e62c0910-f3a4-4c85-9ad5-88f6fa5262df_0(773f9446066fb75305dc5f356dca4eb5d8354791ba0104bac30e3b55dc85aedd): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 11 01:01:33 crc kubenswrapper[4743]: E1011 01:01:33.116630 4743 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-55f6745849-6fxfg_openshift-operators_e62c0910-f3a4-4c85-9ad5-88f6fa5262df_0(773f9446066fb75305dc5f356dca4eb5d8354791ba0104bac30e3b55dc85aedd): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-55f6745849-6fxfg" Oct 11 01:01:33 crc kubenswrapper[4743]: E1011 01:01:33.116675 4743 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-55f6745849-6fxfg_openshift-operators_e62c0910-f3a4-4c85-9ad5-88f6fa5262df_0(773f9446066fb75305dc5f356dca4eb5d8354791ba0104bac30e3b55dc85aedd): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-55f6745849-6fxfg" Oct 11 01:01:33 crc kubenswrapper[4743]: E1011 01:01:33.116803 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-55f6745849-6fxfg_openshift-operators(e62c0910-f3a4-4c85-9ad5-88f6fa5262df)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-55f6745849-6fxfg_openshift-operators(e62c0910-f3a4-4c85-9ad5-88f6fa5262df)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-55f6745849-6fxfg_openshift-operators_e62c0910-f3a4-4c85-9ad5-88f6fa5262df_0(773f9446066fb75305dc5f356dca4eb5d8354791ba0104bac30e3b55dc85aedd): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-55f6745849-6fxfg" podUID="e62c0910-f3a4-4c85-9ad5-88f6fa5262df" Oct 11 01:01:36 crc kubenswrapper[4743]: I1011 01:01:36.096963 4743 scope.go:117] "RemoveContainer" containerID="bdc42fd21a8b6982fc5516915cecbd0521737b5b4fd27556f887dbf66219ef33" Oct 11 01:01:36 crc kubenswrapper[4743]: I1011 01:01:36.945713 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9jfxn_e8c603f4-717c-4554-992a-8338b3bef24d/kube-multus/2.log" Oct 11 01:01:36 crc kubenswrapper[4743]: I1011 01:01:36.946113 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9jfxn" event={"ID":"e8c603f4-717c-4554-992a-8338b3bef24d","Type":"ContainerStarted","Data":"b5330b40cd02e1052b84a5bd9229eb205f4583524655df081aedb08548ed2f23"} Oct 11 01:01:40 crc kubenswrapper[4743]: I1011 01:01:40.778514 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9qrj4" Oct 11 01:01:43 crc kubenswrapper[4743]: I1011 01:01:43.090733 4743 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-sgfrq" Oct 11 01:01:43 crc kubenswrapper[4743]: I1011 01:01:43.090740 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-55f6745849-chpjb" Oct 11 01:01:43 crc kubenswrapper[4743]: I1011 01:01:43.091366 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-sgfrq" Oct 11 01:01:43 crc kubenswrapper[4743]: I1011 01:01:43.091401 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-55f6745849-chpjb" Oct 11 01:01:43 crc kubenswrapper[4743]: I1011 01:01:43.541650 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-55f6745849-chpjb"] Oct 11 01:01:43 crc kubenswrapper[4743]: I1011 01:01:43.554356 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-sgfrq"] Oct 11 01:01:43 crc kubenswrapper[4743]: W1011 01:01:43.595086 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4642856_48b2_4843_a11e_1a207a8c8efc.slice/crio-6c1c6918b2bfb648915626dbcc85f6c9e98e9f6717810b02e5ffea1baf37f9f7 WatchSource:0}: Error finding container 6c1c6918b2bfb648915626dbcc85f6c9e98e9f6717810b02e5ffea1baf37f9f7: Status 404 returned error can't find the container with id 6c1c6918b2bfb648915626dbcc85f6c9e98e9f6717810b02e5ffea1baf37f9f7 Oct 11 01:01:44 crc kubenswrapper[4743]: I1011 01:01:44.001270 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-cc5f78dfc-sgfrq" 
event={"ID":"c4642856-48b2-4843-a11e-1a207a8c8efc","Type":"ContainerStarted","Data":"6c1c6918b2bfb648915626dbcc85f6c9e98e9f6717810b02e5ffea1baf37f9f7"} Oct 11 01:01:44 crc kubenswrapper[4743]: I1011 01:01:44.002593 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-55f6745849-chpjb" event={"ID":"97d6cff1-f86b-4110-9c64-907a97ea4ceb","Type":"ContainerStarted","Data":"ec939f84ec83b3967d4124e64ed30900a5fd6e68f00ab2a9cd1214703680dc5d"} Oct 11 01:01:44 crc kubenswrapper[4743]: I1011 01:01:44.091131 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-bdtnb" Oct 11 01:01:44 crc kubenswrapper[4743]: I1011 01:01:44.091183 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-55f6745849-6fxfg" Oct 11 01:01:44 crc kubenswrapper[4743]: I1011 01:01:44.091487 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-bdtnb" Oct 11 01:01:44 crc kubenswrapper[4743]: I1011 01:01:44.091926 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-55f6745849-6fxfg" Oct 11 01:01:44 crc kubenswrapper[4743]: I1011 01:01:44.406533 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-55f6745849-6fxfg"] Oct 11 01:01:44 crc kubenswrapper[4743]: I1011 01:01:44.576479 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-bdtnb"] Oct 11 01:01:44 crc kubenswrapper[4743]: W1011 01:01:44.583365 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe87b24f_db4d_49cd_a2de_ab949443ecea.slice/crio-9f21f9d215e64450bfcb8fce23064d0173f10517ff1f2eef99fa43f12ae53144 WatchSource:0}: Error finding container 9f21f9d215e64450bfcb8fce23064d0173f10517ff1f2eef99fa43f12ae53144: Status 404 returned error can't find the container with id 9f21f9d215e64450bfcb8fce23064d0173f10517ff1f2eef99fa43f12ae53144 Oct 11 01:01:45 crc kubenswrapper[4743]: I1011 01:01:45.012628 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-54bc95c9fb-bdtnb" event={"ID":"fe87b24f-db4d-49cd-a2de-ab949443ecea","Type":"ContainerStarted","Data":"9f21f9d215e64450bfcb8fce23064d0173f10517ff1f2eef99fa43f12ae53144"} Oct 11 01:01:45 crc kubenswrapper[4743]: I1011 01:01:45.014968 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-55f6745849-6fxfg" event={"ID":"e62c0910-f3a4-4c85-9ad5-88f6fa5262df","Type":"ContainerStarted","Data":"e37f204e2abbab40d17a6d73040d521c572c619e6bc633ff9271fe95201b0624"} Oct 11 01:01:45 crc kubenswrapper[4743]: I1011 01:01:45.090756 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-4ftxm" Oct 11 01:01:45 crc kubenswrapper[4743]: I1011 01:01:45.091935 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-4ftxm" Oct 11 01:01:45 crc kubenswrapper[4743]: I1011 01:01:45.354390 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-4ftxm"] Oct 11 01:01:46 crc kubenswrapper[4743]: I1011 01:01:46.032414 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-4ftxm" event={"ID":"9918e6d2-f9d4-4c2f-93ef-cc952577182b","Type":"ContainerStarted","Data":"0489658ad5286caf1e51d3fca809cff6b24499ea17dc5089ce238c73af14c559"} Oct 11 01:01:54 crc kubenswrapper[4743]: I1011 01:01:54.090400 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-4ftxm" event={"ID":"9918e6d2-f9d4-4c2f-93ef-cc952577182b","Type":"ContainerStarted","Data":"4fc92c6e046ff4133eb7420ba9350c9c52ae54252161e91fdb35fecaf3cc070b"} Oct 11 01:01:54 crc kubenswrapper[4743]: I1011 01:01:54.100112 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-54bc95c9fb-bdtnb" Oct 11 01:01:54 crc kubenswrapper[4743]: I1011 01:01:54.100157 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-cc5f78dfc-sgfrq" Oct 11 01:01:54 crc kubenswrapper[4743]: I1011 01:01:54.100172 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-55f6745849-6fxfg" event={"ID":"e62c0910-f3a4-4c85-9ad5-88f6fa5262df","Type":"ContainerStarted","Data":"bbdf0bb54e11cf52fd876bc7198e1303984f6b12c62821450f9c3d95ee3e8a30"} Oct 11 01:01:54 crc kubenswrapper[4743]: I1011 01:01:54.100189 4743 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-operators/perses-operator-54bc95c9fb-bdtnb" event={"ID":"fe87b24f-db4d-49cd-a2de-ab949443ecea","Type":"ContainerStarted","Data":"383ced9b1f6cdd660ae7e5fe4817cc32a0af394ead949de3a5edc45496621da9"} Oct 11 01:01:54 crc kubenswrapper[4743]: I1011 01:01:54.100201 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-cc5f78dfc-sgfrq" event={"ID":"c4642856-48b2-4843-a11e-1a207a8c8efc","Type":"ContainerStarted","Data":"d8eb49a9465e4089b9d1cfa876479eb43e69b75bb7302a6b6fd05a8cd67a60fc"} Oct 11 01:01:54 crc kubenswrapper[4743]: I1011 01:01:54.100214 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-55f6745849-chpjb" event={"ID":"97d6cff1-f86b-4110-9c64-907a97ea4ceb","Type":"ContainerStarted","Data":"1bfbe559f1acc0392b47d5c47bc3bdc8bc791f306d64793ceca89f3f673bf029"} Oct 11 01:01:54 crc kubenswrapper[4743]: I1011 01:01:54.134218 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-54bc95c9fb-bdtnb" podStartSLOduration=33.596502791 podStartE2EDuration="42.134202949s" podCreationTimestamp="2025-10-11 01:01:12 +0000 UTC" firstStartedPulling="2025-10-11 01:01:44.589471468 +0000 UTC m=+599.242451905" lastFinishedPulling="2025-10-11 01:01:53.127171666 +0000 UTC m=+607.780152063" observedRunningTime="2025-10-11 01:01:54.132994027 +0000 UTC m=+608.785974504" watchObservedRunningTime="2025-10-11 01:01:54.134202949 +0000 UTC m=+608.787183336" Oct 11 01:01:54 crc kubenswrapper[4743]: I1011 01:01:54.135694 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-4ftxm" podStartSLOduration=34.341086259 podStartE2EDuration="42.1356895s" podCreationTimestamp="2025-10-11 01:01:12 +0000 UTC" firstStartedPulling="2025-10-11 01:01:45.373948117 +0000 UTC m=+600.026928554" lastFinishedPulling="2025-10-11 
01:01:53.168551398 +0000 UTC m=+607.821531795" observedRunningTime="2025-10-11 01:01:54.114460184 +0000 UTC m=+608.767440601" watchObservedRunningTime="2025-10-11 01:01:54.1356895 +0000 UTC m=+608.788669897" Oct 11 01:01:54 crc kubenswrapper[4743]: I1011 01:01:54.153281 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-cc5f78dfc-sgfrq" Oct 11 01:01:54 crc kubenswrapper[4743]: I1011 01:01:54.169721 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-cc5f78dfc-sgfrq" podStartSLOduration=32.605443683 podStartE2EDuration="42.169689132s" podCreationTimestamp="2025-10-11 01:01:12 +0000 UTC" firstStartedPulling="2025-10-11 01:01:43.623022846 +0000 UTC m=+598.276003253" lastFinishedPulling="2025-10-11 01:01:53.187268315 +0000 UTC m=+607.840248702" observedRunningTime="2025-10-11 01:01:54.167819991 +0000 UTC m=+608.820800388" watchObservedRunningTime="2025-10-11 01:01:54.169689132 +0000 UTC m=+608.822669609" Oct 11 01:01:54 crc kubenswrapper[4743]: I1011 01:01:54.221955 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-55f6745849-chpjb" podStartSLOduration=32.645616011 podStartE2EDuration="42.221940318s" podCreationTimestamp="2025-10-11 01:01:12 +0000 UTC" firstStartedPulling="2025-10-11 01:01:43.563094371 +0000 UTC m=+598.216074798" lastFinishedPulling="2025-10-11 01:01:53.139418718 +0000 UTC m=+607.792399105" observedRunningTime="2025-10-11 01:01:54.197078524 +0000 UTC m=+608.850058921" watchObservedRunningTime="2025-10-11 01:01:54.221940318 +0000 UTC m=+608.874920715" Oct 11 01:01:54 crc kubenswrapper[4743]: I1011 01:01:54.222278 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-55f6745849-6fxfg" podStartSLOduration=33.472482118 
podStartE2EDuration="42.222273767s" podCreationTimestamp="2025-10-11 01:01:12 +0000 UTC" firstStartedPulling="2025-10-11 01:01:44.426822158 +0000 UTC m=+599.079802585" lastFinishedPulling="2025-10-11 01:01:53.176613837 +0000 UTC m=+607.829594234" observedRunningTime="2025-10-11 01:01:54.218658359 +0000 UTC m=+608.871638756" watchObservedRunningTime="2025-10-11 01:01:54.222273767 +0000 UTC m=+608.875254164" Oct 11 01:02:01 crc kubenswrapper[4743]: I1011 01:02:01.984206 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-gf47h"] Oct 11 01:02:01 crc kubenswrapper[4743]: I1011 01:02:01.985390 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-gf47h" Oct 11 01:02:01 crc kubenswrapper[4743]: I1011 01:02:01.988120 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Oct 11 01:02:01 crc kubenswrapper[4743]: I1011 01:02:01.988311 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Oct 11 01:02:01 crc kubenswrapper[4743]: I1011 01:02:01.991458 4743 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-6k4gs" Oct 11 01:02:01 crc kubenswrapper[4743]: I1011 01:02:01.993331 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-4xbkk"] Oct 11 01:02:01 crc kubenswrapper[4743]: I1011 01:02:01.994367 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-4xbkk" Oct 11 01:02:01 crc kubenswrapper[4743]: I1011 01:02:01.997986 4743 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-ngkfv" Oct 11 01:02:02 crc kubenswrapper[4743]: I1011 01:02:02.000664 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-gf47h"] Oct 11 01:02:02 crc kubenswrapper[4743]: I1011 01:02:02.010963 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-xdlw4"] Oct 11 01:02:02 crc kubenswrapper[4743]: I1011 01:02:02.011872 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-xdlw4" Oct 11 01:02:02 crc kubenswrapper[4743]: I1011 01:02:02.016686 4743 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-djtxs" Oct 11 01:02:02 crc kubenswrapper[4743]: I1011 01:02:02.021507 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mj5m\" (UniqueName: \"kubernetes.io/projected/5796afa2-031a-4046-b0ed-d2f728e700db-kube-api-access-7mj5m\") pod \"cert-manager-cainjector-7f985d654d-gf47h\" (UID: \"5796afa2-031a-4046-b0ed-d2f728e700db\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-gf47h" Oct 11 01:02:02 crc kubenswrapper[4743]: I1011 01:02:02.021587 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-4xbkk"] Oct 11 01:02:02 crc kubenswrapper[4743]: I1011 01:02:02.034154 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-xdlw4"] Oct 11 01:02:02 crc kubenswrapper[4743]: I1011 01:02:02.123132 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxc7x\" (UniqueName: 
\"kubernetes.io/projected/60052a4b-2c20-4f20-b109-ca070b9e11e6-kube-api-access-wxc7x\") pod \"cert-manager-webhook-5655c58dd6-xdlw4\" (UID: \"60052a4b-2c20-4f20-b109-ca070b9e11e6\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-xdlw4" Oct 11 01:02:02 crc kubenswrapper[4743]: I1011 01:02:02.123188 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mj5m\" (UniqueName: \"kubernetes.io/projected/5796afa2-031a-4046-b0ed-d2f728e700db-kube-api-access-7mj5m\") pod \"cert-manager-cainjector-7f985d654d-gf47h\" (UID: \"5796afa2-031a-4046-b0ed-d2f728e700db\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-gf47h" Oct 11 01:02:02 crc kubenswrapper[4743]: I1011 01:02:02.123212 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wndn\" (UniqueName: \"kubernetes.io/projected/3862bd0e-7310-4227-9d4f-8eb551293343-kube-api-access-5wndn\") pod \"cert-manager-5b446d88c5-4xbkk\" (UID: \"3862bd0e-7310-4227-9d4f-8eb551293343\") " pod="cert-manager/cert-manager-5b446d88c5-4xbkk" Oct 11 01:02:02 crc kubenswrapper[4743]: I1011 01:02:02.141129 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mj5m\" (UniqueName: \"kubernetes.io/projected/5796afa2-031a-4046-b0ed-d2f728e700db-kube-api-access-7mj5m\") pod \"cert-manager-cainjector-7f985d654d-gf47h\" (UID: \"5796afa2-031a-4046-b0ed-d2f728e700db\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-gf47h" Oct 11 01:02:02 crc kubenswrapper[4743]: I1011 01:02:02.224334 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxc7x\" (UniqueName: \"kubernetes.io/projected/60052a4b-2c20-4f20-b109-ca070b9e11e6-kube-api-access-wxc7x\") pod \"cert-manager-webhook-5655c58dd6-xdlw4\" (UID: \"60052a4b-2c20-4f20-b109-ca070b9e11e6\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-xdlw4" Oct 11 01:02:02 crc kubenswrapper[4743]: I1011 
01:02:02.224420 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wndn\" (UniqueName: \"kubernetes.io/projected/3862bd0e-7310-4227-9d4f-8eb551293343-kube-api-access-5wndn\") pod \"cert-manager-5b446d88c5-4xbkk\" (UID: \"3862bd0e-7310-4227-9d4f-8eb551293343\") " pod="cert-manager/cert-manager-5b446d88c5-4xbkk" Oct 11 01:02:02 crc kubenswrapper[4743]: I1011 01:02:02.242912 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxc7x\" (UniqueName: \"kubernetes.io/projected/60052a4b-2c20-4f20-b109-ca070b9e11e6-kube-api-access-wxc7x\") pod \"cert-manager-webhook-5655c58dd6-xdlw4\" (UID: \"60052a4b-2c20-4f20-b109-ca070b9e11e6\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-xdlw4" Oct 11 01:02:02 crc kubenswrapper[4743]: I1011 01:02:02.245351 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wndn\" (UniqueName: \"kubernetes.io/projected/3862bd0e-7310-4227-9d4f-8eb551293343-kube-api-access-5wndn\") pod \"cert-manager-5b446d88c5-4xbkk\" (UID: \"3862bd0e-7310-4227-9d4f-8eb551293343\") " pod="cert-manager/cert-manager-5b446d88c5-4xbkk" Oct 11 01:02:02 crc kubenswrapper[4743]: I1011 01:02:02.310226 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-gf47h" Oct 11 01:02:02 crc kubenswrapper[4743]: I1011 01:02:02.327807 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-4xbkk" Oct 11 01:02:02 crc kubenswrapper[4743]: I1011 01:02:02.336714 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-xdlw4" Oct 11 01:02:02 crc kubenswrapper[4743]: I1011 01:02:02.568425 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-4xbkk"] Oct 11 01:02:02 crc kubenswrapper[4743]: I1011 01:02:02.589492 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-xdlw4"] Oct 11 01:02:02 crc kubenswrapper[4743]: I1011 01:02:02.629649 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-gf47h"] Oct 11 01:02:02 crc kubenswrapper[4743]: W1011 01:02:02.633079 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5796afa2_031a_4046_b0ed_d2f728e700db.slice/crio-9cb01243069d0394109e385883695fd437ec488324e44fe7d73d328ff800b37a WatchSource:0}: Error finding container 9cb01243069d0394109e385883695fd437ec488324e44fe7d73d328ff800b37a: Status 404 returned error can't find the container with id 9cb01243069d0394109e385883695fd437ec488324e44fe7d73d328ff800b37a Oct 11 01:02:02 crc kubenswrapper[4743]: I1011 01:02:02.897981 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-54bc95c9fb-bdtnb" Oct 11 01:02:03 crc kubenswrapper[4743]: I1011 01:02:03.149215 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-4xbkk" event={"ID":"3862bd0e-7310-4227-9d4f-8eb551293343","Type":"ContainerStarted","Data":"56847e51f4426dcecc2329154ba5910d7b302e431cb057aa245993ff907db0a8"} Oct 11 01:02:03 crc kubenswrapper[4743]: I1011 01:02:03.150290 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-gf47h" event={"ID":"5796afa2-031a-4046-b0ed-d2f728e700db","Type":"ContainerStarted","Data":"9cb01243069d0394109e385883695fd437ec488324e44fe7d73d328ff800b37a"} Oct 11 
01:02:03 crc kubenswrapper[4743]: I1011 01:02:03.151400 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-xdlw4" event={"ID":"60052a4b-2c20-4f20-b109-ca070b9e11e6","Type":"ContainerStarted","Data":"8968619f0cf433e2eddd2a6f0b99b6798c4dfd2dddcd85756486a737b4e92827"} Oct 11 01:02:08 crc kubenswrapper[4743]: I1011 01:02:08.186454 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-4xbkk" event={"ID":"3862bd0e-7310-4227-9d4f-8eb551293343","Type":"ContainerStarted","Data":"86c4ed7ea75d3b880f06aea4c41fbed63213bfbfd40e5a0dc22add6542b15491"} Oct 11 01:02:08 crc kubenswrapper[4743]: I1011 01:02:08.188312 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-gf47h" event={"ID":"5796afa2-031a-4046-b0ed-d2f728e700db","Type":"ContainerStarted","Data":"75e0e57835fcda38983583ab158763da9e039cc1cb033e99d99bdc6d3b8b369c"} Oct 11 01:02:08 crc kubenswrapper[4743]: I1011 01:02:08.189509 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-xdlw4" event={"ID":"60052a4b-2c20-4f20-b109-ca070b9e11e6","Type":"ContainerStarted","Data":"084f5e7fc81b70db3747f97a84ed64544c6cbbdc00e807d8d360af57fc66937c"} Oct 11 01:02:08 crc kubenswrapper[4743]: I1011 01:02:08.189623 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-xdlw4" Oct 11 01:02:08 crc kubenswrapper[4743]: I1011 01:02:08.200654 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-4xbkk" podStartSLOduration=2.517473082 podStartE2EDuration="7.200636084s" podCreationTimestamp="2025-10-11 01:02:01 +0000 UTC" firstStartedPulling="2025-10-11 01:02:02.583088028 +0000 UTC m=+617.236068425" lastFinishedPulling="2025-10-11 01:02:07.26625103 +0000 UTC m=+621.919231427" observedRunningTime="2025-10-11 01:02:08.20012571 
+0000 UTC m=+622.853106107" watchObservedRunningTime="2025-10-11 01:02:08.200636084 +0000 UTC m=+622.853616481" Oct 11 01:02:08 crc kubenswrapper[4743]: I1011 01:02:08.212925 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-xdlw4" podStartSLOduration=2.542194403 podStartE2EDuration="7.212908387s" podCreationTimestamp="2025-10-11 01:02:01 +0000 UTC" firstStartedPulling="2025-10-11 01:02:02.592466263 +0000 UTC m=+617.245446670" lastFinishedPulling="2025-10-11 01:02:07.263180257 +0000 UTC m=+621.916160654" observedRunningTime="2025-10-11 01:02:08.212195337 +0000 UTC m=+622.865175734" watchObservedRunningTime="2025-10-11 01:02:08.212908387 +0000 UTC m=+622.865888784" Oct 11 01:02:08 crc kubenswrapper[4743]: I1011 01:02:08.226969 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-gf47h" podStartSLOduration=2.577121009 podStartE2EDuration="7.226947037s" podCreationTimestamp="2025-10-11 01:02:01 +0000 UTC" firstStartedPulling="2025-10-11 01:02:02.636215049 +0000 UTC m=+617.289195446" lastFinishedPulling="2025-10-11 01:02:07.286041077 +0000 UTC m=+621.939021474" observedRunningTime="2025-10-11 01:02:08.224553573 +0000 UTC m=+622.877533960" watchObservedRunningTime="2025-10-11 01:02:08.226947037 +0000 UTC m=+622.879927444" Oct 11 01:02:12 crc kubenswrapper[4743]: I1011 01:02:12.340757 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-xdlw4" Oct 11 01:02:39 crc kubenswrapper[4743]: I1011 01:02:39.108609 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa6018n9vc"] Oct 11 01:02:39 crc kubenswrapper[4743]: I1011 01:02:39.110078 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa6018n9vc" Oct 11 01:02:39 crc kubenswrapper[4743]: I1011 01:02:39.111926 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 11 01:02:39 crc kubenswrapper[4743]: I1011 01:02:39.124833 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa6018n9vc"] Oct 11 01:02:39 crc kubenswrapper[4743]: I1011 01:02:39.129839 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rxb4\" (UniqueName: \"kubernetes.io/projected/3f25a7dd-4148-4251-9896-0d781682a3c3-kube-api-access-5rxb4\") pod \"b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa6018n9vc\" (UID: \"3f25a7dd-4148-4251-9896-0d781682a3c3\") " pod="openshift-marketplace/b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa6018n9vc" Oct 11 01:02:39 crc kubenswrapper[4743]: I1011 01:02:39.129911 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3f25a7dd-4148-4251-9896-0d781682a3c3-util\") pod \"b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa6018n9vc\" (UID: \"3f25a7dd-4148-4251-9896-0d781682a3c3\") " pod="openshift-marketplace/b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa6018n9vc" Oct 11 01:02:39 crc kubenswrapper[4743]: I1011 01:02:39.129952 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3f25a7dd-4148-4251-9896-0d781682a3c3-bundle\") pod \"b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa6018n9vc\" (UID: \"3f25a7dd-4148-4251-9896-0d781682a3c3\") " pod="openshift-marketplace/b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa6018n9vc" Oct 11 01:02:39 crc kubenswrapper[4743]: 
I1011 01:02:39.231319 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rxb4\" (UniqueName: \"kubernetes.io/projected/3f25a7dd-4148-4251-9896-0d781682a3c3-kube-api-access-5rxb4\") pod \"b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa6018n9vc\" (UID: \"3f25a7dd-4148-4251-9896-0d781682a3c3\") " pod="openshift-marketplace/b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa6018n9vc" Oct 11 01:02:39 crc kubenswrapper[4743]: I1011 01:02:39.231372 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3f25a7dd-4148-4251-9896-0d781682a3c3-util\") pod \"b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa6018n9vc\" (UID: \"3f25a7dd-4148-4251-9896-0d781682a3c3\") " pod="openshift-marketplace/b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa6018n9vc" Oct 11 01:02:39 crc kubenswrapper[4743]: I1011 01:02:39.231412 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3f25a7dd-4148-4251-9896-0d781682a3c3-bundle\") pod \"b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa6018n9vc\" (UID: \"3f25a7dd-4148-4251-9896-0d781682a3c3\") " pod="openshift-marketplace/b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa6018n9vc" Oct 11 01:02:39 crc kubenswrapper[4743]: I1011 01:02:39.231908 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3f25a7dd-4148-4251-9896-0d781682a3c3-bundle\") pod \"b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa6018n9vc\" (UID: \"3f25a7dd-4148-4251-9896-0d781682a3c3\") " pod="openshift-marketplace/b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa6018n9vc" Oct 11 01:02:39 crc kubenswrapper[4743]: I1011 01:02:39.231942 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/3f25a7dd-4148-4251-9896-0d781682a3c3-util\") pod \"b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa6018n9vc\" (UID: \"3f25a7dd-4148-4251-9896-0d781682a3c3\") " pod="openshift-marketplace/b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa6018n9vc" Oct 11 01:02:39 crc kubenswrapper[4743]: I1011 01:02:39.258744 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rxb4\" (UniqueName: \"kubernetes.io/projected/3f25a7dd-4148-4251-9896-0d781682a3c3-kube-api-access-5rxb4\") pod \"b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa6018n9vc\" (UID: \"3f25a7dd-4148-4251-9896-0d781682a3c3\") " pod="openshift-marketplace/b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa6018n9vc" Oct 11 01:02:39 crc kubenswrapper[4743]: I1011 01:02:39.304886 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37ds656x"] Oct 11 01:02:39 crc kubenswrapper[4743]: I1011 01:02:39.306546 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37ds656x" Oct 11 01:02:39 crc kubenswrapper[4743]: I1011 01:02:39.323951 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37ds656x"] Oct 11 01:02:39 crc kubenswrapper[4743]: I1011 01:02:39.332908 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtsdd\" (UniqueName: \"kubernetes.io/projected/2d977ae5-754b-436a-87b9-b0618947c353-kube-api-access-gtsdd\") pod \"0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37ds656x\" (UID: \"2d977ae5-754b-436a-87b9-b0618947c353\") " pod="openshift-marketplace/0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37ds656x" Oct 11 01:02:39 crc kubenswrapper[4743]: I1011 01:02:39.332990 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2d977ae5-754b-436a-87b9-b0618947c353-util\") pod \"0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37ds656x\" (UID: \"2d977ae5-754b-436a-87b9-b0618947c353\") " pod="openshift-marketplace/0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37ds656x" Oct 11 01:02:39 crc kubenswrapper[4743]: I1011 01:02:39.333129 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2d977ae5-754b-436a-87b9-b0618947c353-bundle\") pod \"0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37ds656x\" (UID: \"2d977ae5-754b-436a-87b9-b0618947c353\") " pod="openshift-marketplace/0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37ds656x" Oct 11 01:02:39 crc kubenswrapper[4743]: I1011 01:02:39.427661 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa6018n9vc" Oct 11 01:02:39 crc kubenswrapper[4743]: I1011 01:02:39.434430 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2d977ae5-754b-436a-87b9-b0618947c353-bundle\") pod \"0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37ds656x\" (UID: \"2d977ae5-754b-436a-87b9-b0618947c353\") " pod="openshift-marketplace/0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37ds656x" Oct 11 01:02:39 crc kubenswrapper[4743]: I1011 01:02:39.434584 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtsdd\" (UniqueName: \"kubernetes.io/projected/2d977ae5-754b-436a-87b9-b0618947c353-kube-api-access-gtsdd\") pod \"0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37ds656x\" (UID: \"2d977ae5-754b-436a-87b9-b0618947c353\") " pod="openshift-marketplace/0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37ds656x" Oct 11 01:02:39 crc kubenswrapper[4743]: I1011 01:02:39.434641 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2d977ae5-754b-436a-87b9-b0618947c353-util\") pod \"0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37ds656x\" (UID: \"2d977ae5-754b-436a-87b9-b0618947c353\") " pod="openshift-marketplace/0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37ds656x" Oct 11 01:02:39 crc kubenswrapper[4743]: I1011 01:02:39.434947 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2d977ae5-754b-436a-87b9-b0618947c353-bundle\") pod \"0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37ds656x\" (UID: \"2d977ae5-754b-436a-87b9-b0618947c353\") " pod="openshift-marketplace/0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37ds656x" Oct 11 01:02:39 crc kubenswrapper[4743]: 
I1011 01:02:39.435364 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2d977ae5-754b-436a-87b9-b0618947c353-util\") pod \"0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37ds656x\" (UID: \"2d977ae5-754b-436a-87b9-b0618947c353\") " pod="openshift-marketplace/0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37ds656x" Oct 11 01:02:39 crc kubenswrapper[4743]: I1011 01:02:39.464410 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtsdd\" (UniqueName: \"kubernetes.io/projected/2d977ae5-754b-436a-87b9-b0618947c353-kube-api-access-gtsdd\") pod \"0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37ds656x\" (UID: \"2d977ae5-754b-436a-87b9-b0618947c353\") " pod="openshift-marketplace/0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37ds656x" Oct 11 01:02:39 crc kubenswrapper[4743]: I1011 01:02:39.629176 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37ds656x" Oct 11 01:02:39 crc kubenswrapper[4743]: I1011 01:02:39.689189 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa6018n9vc"] Oct 11 01:02:39 crc kubenswrapper[4743]: I1011 01:02:39.812666 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37ds656x"] Oct 11 01:02:40 crc kubenswrapper[4743]: I1011 01:02:40.391937 4743 generic.go:334] "Generic (PLEG): container finished" podID="3f25a7dd-4148-4251-9896-0d781682a3c3" containerID="39c2c0ab2275cf61f2cccbcf283ea128322292b1c774e119640fc553a546ab32" exitCode=0 Oct 11 01:02:40 crc kubenswrapper[4743]: I1011 01:02:40.391982 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa6018n9vc" 
event={"ID":"3f25a7dd-4148-4251-9896-0d781682a3c3","Type":"ContainerDied","Data":"39c2c0ab2275cf61f2cccbcf283ea128322292b1c774e119640fc553a546ab32"} Oct 11 01:02:40 crc kubenswrapper[4743]: I1011 01:02:40.392029 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa6018n9vc" event={"ID":"3f25a7dd-4148-4251-9896-0d781682a3c3","Type":"ContainerStarted","Data":"ec29f02a806d46c7e7f24ca2b304e1dab8bebcd47ae1e6d3d3c4e7181dbcc116"} Oct 11 01:02:40 crc kubenswrapper[4743]: I1011 01:02:40.393930 4743 generic.go:334] "Generic (PLEG): container finished" podID="2d977ae5-754b-436a-87b9-b0618947c353" containerID="106a175d927bd8a6fbafd593171e3b7983f7c70b4dd1c2d2d95aef899c42eb73" exitCode=0 Oct 11 01:02:40 crc kubenswrapper[4743]: I1011 01:02:40.393976 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37ds656x" event={"ID":"2d977ae5-754b-436a-87b9-b0618947c353","Type":"ContainerDied","Data":"106a175d927bd8a6fbafd593171e3b7983f7c70b4dd1c2d2d95aef899c42eb73"} Oct 11 01:02:40 crc kubenswrapper[4743]: I1011 01:02:40.394006 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37ds656x" event={"ID":"2d977ae5-754b-436a-87b9-b0618947c353","Type":"ContainerStarted","Data":"2efd40658174b24d9b09050b052dc5ef20a42b55930fb400f09cbad2bc8aa355"} Oct 11 01:02:42 crc kubenswrapper[4743]: I1011 01:02:42.415915 4743 generic.go:334] "Generic (PLEG): container finished" podID="3f25a7dd-4148-4251-9896-0d781682a3c3" containerID="44f2f4646f9f9d95f4ad3ba211d1612f7f7ec110d22e68cecfbe5edcbbecfa60" exitCode=0 Oct 11 01:02:42 crc kubenswrapper[4743]: I1011 01:02:42.415990 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa6018n9vc" 
event={"ID":"3f25a7dd-4148-4251-9896-0d781682a3c3","Type":"ContainerDied","Data":"44f2f4646f9f9d95f4ad3ba211d1612f7f7ec110d22e68cecfbe5edcbbecfa60"} Oct 11 01:02:42 crc kubenswrapper[4743]: I1011 01:02:42.419271 4743 generic.go:334] "Generic (PLEG): container finished" podID="2d977ae5-754b-436a-87b9-b0618947c353" containerID="4ec5d529ddc96f195018829986d1cb9b0c659be304e0f2232a9cb6d1a3d5f73d" exitCode=0 Oct 11 01:02:42 crc kubenswrapper[4743]: I1011 01:02:42.419352 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37ds656x" event={"ID":"2d977ae5-754b-436a-87b9-b0618947c353","Type":"ContainerDied","Data":"4ec5d529ddc96f195018829986d1cb9b0c659be304e0f2232a9cb6d1a3d5f73d"} Oct 11 01:02:43 crc kubenswrapper[4743]: I1011 01:02:43.431415 4743 generic.go:334] "Generic (PLEG): container finished" podID="3f25a7dd-4148-4251-9896-0d781682a3c3" containerID="963b8354a0127b779370580ae9288d03d373190e3f139571172c6ff458acf041" exitCode=0 Oct 11 01:02:43 crc kubenswrapper[4743]: I1011 01:02:43.431490 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa6018n9vc" event={"ID":"3f25a7dd-4148-4251-9896-0d781682a3c3","Type":"ContainerDied","Data":"963b8354a0127b779370580ae9288d03d373190e3f139571172c6ff458acf041"} Oct 11 01:02:43 crc kubenswrapper[4743]: I1011 01:02:43.435815 4743 generic.go:334] "Generic (PLEG): container finished" podID="2d977ae5-754b-436a-87b9-b0618947c353" containerID="b43dcbc8693fc0b27976d026e82285935dd00b77b60629c454cdf7cf36a4b7d0" exitCode=0 Oct 11 01:02:43 crc kubenswrapper[4743]: I1011 01:02:43.435904 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37ds656x" event={"ID":"2d977ae5-754b-436a-87b9-b0618947c353","Type":"ContainerDied","Data":"b43dcbc8693fc0b27976d026e82285935dd00b77b60629c454cdf7cf36a4b7d0"} Oct 
11 01:02:44 crc kubenswrapper[4743]: I1011 01:02:44.689496 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37ds656x" Oct 11 01:02:44 crc kubenswrapper[4743]: I1011 01:02:44.727602 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gtsdd\" (UniqueName: \"kubernetes.io/projected/2d977ae5-754b-436a-87b9-b0618947c353-kube-api-access-gtsdd\") pod \"2d977ae5-754b-436a-87b9-b0618947c353\" (UID: \"2d977ae5-754b-436a-87b9-b0618947c353\") " Oct 11 01:02:44 crc kubenswrapper[4743]: I1011 01:02:44.727665 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2d977ae5-754b-436a-87b9-b0618947c353-util\") pod \"2d977ae5-754b-436a-87b9-b0618947c353\" (UID: \"2d977ae5-754b-436a-87b9-b0618947c353\") " Oct 11 01:02:44 crc kubenswrapper[4743]: I1011 01:02:44.727690 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2d977ae5-754b-436a-87b9-b0618947c353-bundle\") pod \"2d977ae5-754b-436a-87b9-b0618947c353\" (UID: \"2d977ae5-754b-436a-87b9-b0618947c353\") " Oct 11 01:02:44 crc kubenswrapper[4743]: I1011 01:02:44.728702 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d977ae5-754b-436a-87b9-b0618947c353-bundle" (OuterVolumeSpecName: "bundle") pod "2d977ae5-754b-436a-87b9-b0618947c353" (UID: "2d977ae5-754b-436a-87b9-b0618947c353"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:02:44 crc kubenswrapper[4743]: I1011 01:02:44.734044 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d977ae5-754b-436a-87b9-b0618947c353-kube-api-access-gtsdd" (OuterVolumeSpecName: "kube-api-access-gtsdd") pod "2d977ae5-754b-436a-87b9-b0618947c353" (UID: "2d977ae5-754b-436a-87b9-b0618947c353"). InnerVolumeSpecName "kube-api-access-gtsdd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:02:44 crc kubenswrapper[4743]: I1011 01:02:44.746223 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d977ae5-754b-436a-87b9-b0618947c353-util" (OuterVolumeSpecName: "util") pod "2d977ae5-754b-436a-87b9-b0618947c353" (UID: "2d977ae5-754b-436a-87b9-b0618947c353"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:02:44 crc kubenswrapper[4743]: I1011 01:02:44.751900 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa6018n9vc" Oct 11 01:02:44 crc kubenswrapper[4743]: I1011 01:02:44.828772 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3f25a7dd-4148-4251-9896-0d781682a3c3-util\") pod \"3f25a7dd-4148-4251-9896-0d781682a3c3\" (UID: \"3f25a7dd-4148-4251-9896-0d781682a3c3\") " Oct 11 01:02:44 crc kubenswrapper[4743]: I1011 01:02:44.828918 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3f25a7dd-4148-4251-9896-0d781682a3c3-bundle\") pod \"3f25a7dd-4148-4251-9896-0d781682a3c3\" (UID: \"3f25a7dd-4148-4251-9896-0d781682a3c3\") " Oct 11 01:02:44 crc kubenswrapper[4743]: I1011 01:02:44.829127 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rxb4\" (UniqueName: \"kubernetes.io/projected/3f25a7dd-4148-4251-9896-0d781682a3c3-kube-api-access-5rxb4\") pod \"3f25a7dd-4148-4251-9896-0d781682a3c3\" (UID: \"3f25a7dd-4148-4251-9896-0d781682a3c3\") " Oct 11 01:02:44 crc kubenswrapper[4743]: I1011 01:02:44.829458 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gtsdd\" (UniqueName: \"kubernetes.io/projected/2d977ae5-754b-436a-87b9-b0618947c353-kube-api-access-gtsdd\") on node \"crc\" DevicePath \"\"" Oct 11 01:02:44 crc kubenswrapper[4743]: I1011 01:02:44.829480 4743 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2d977ae5-754b-436a-87b9-b0618947c353-util\") on node \"crc\" DevicePath \"\"" Oct 11 01:02:44 crc kubenswrapper[4743]: I1011 01:02:44.829493 4743 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2d977ae5-754b-436a-87b9-b0618947c353-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 01:02:44 crc kubenswrapper[4743]: I1011 
01:02:44.829590 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f25a7dd-4148-4251-9896-0d781682a3c3-bundle" (OuterVolumeSpecName: "bundle") pod "3f25a7dd-4148-4251-9896-0d781682a3c3" (UID: "3f25a7dd-4148-4251-9896-0d781682a3c3"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:02:44 crc kubenswrapper[4743]: I1011 01:02:44.831985 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f25a7dd-4148-4251-9896-0d781682a3c3-kube-api-access-5rxb4" (OuterVolumeSpecName: "kube-api-access-5rxb4") pod "3f25a7dd-4148-4251-9896-0d781682a3c3" (UID: "3f25a7dd-4148-4251-9896-0d781682a3c3"). InnerVolumeSpecName "kube-api-access-5rxb4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:02:44 crc kubenswrapper[4743]: I1011 01:02:44.842275 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f25a7dd-4148-4251-9896-0d781682a3c3-util" (OuterVolumeSpecName: "util") pod "3f25a7dd-4148-4251-9896-0d781682a3c3" (UID: "3f25a7dd-4148-4251-9896-0d781682a3c3"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:02:44 crc kubenswrapper[4743]: I1011 01:02:44.930834 4743 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3f25a7dd-4148-4251-9896-0d781682a3c3-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 01:02:44 crc kubenswrapper[4743]: I1011 01:02:44.930879 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5rxb4\" (UniqueName: \"kubernetes.io/projected/3f25a7dd-4148-4251-9896-0d781682a3c3-kube-api-access-5rxb4\") on node \"crc\" DevicePath \"\"" Oct 11 01:02:44 crc kubenswrapper[4743]: I1011 01:02:44.930893 4743 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3f25a7dd-4148-4251-9896-0d781682a3c3-util\") on node \"crc\" DevicePath \"\"" Oct 11 01:02:45 crc kubenswrapper[4743]: I1011 01:02:45.450443 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa6018n9vc" event={"ID":"3f25a7dd-4148-4251-9896-0d781682a3c3","Type":"ContainerDied","Data":"ec29f02a806d46c7e7f24ca2b304e1dab8bebcd47ae1e6d3d3c4e7181dbcc116"} Oct 11 01:02:45 crc kubenswrapper[4743]: I1011 01:02:45.450474 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec29f02a806d46c7e7f24ca2b304e1dab8bebcd47ae1e6d3d3c4e7181dbcc116" Oct 11 01:02:45 crc kubenswrapper[4743]: I1011 01:02:45.450553 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa6018n9vc" Oct 11 01:02:45 crc kubenswrapper[4743]: I1011 01:02:45.452134 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37ds656x" event={"ID":"2d977ae5-754b-436a-87b9-b0618947c353","Type":"ContainerDied","Data":"2efd40658174b24d9b09050b052dc5ef20a42b55930fb400f09cbad2bc8aa355"} Oct 11 01:02:45 crc kubenswrapper[4743]: I1011 01:02:45.452156 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2efd40658174b24d9b09050b052dc5ef20a42b55930fb400f09cbad2bc8aa355" Oct 11 01:02:45 crc kubenswrapper[4743]: I1011 01:02:45.452202 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37ds656x" Oct 11 01:02:56 crc kubenswrapper[4743]: I1011 01:02:56.130108 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-7857f779b4-t484n"] Oct 11 01:02:56 crc kubenswrapper[4743]: E1011 01:02:56.130844 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f25a7dd-4148-4251-9896-0d781682a3c3" containerName="extract" Oct 11 01:02:56 crc kubenswrapper[4743]: I1011 01:02:56.130877 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f25a7dd-4148-4251-9896-0d781682a3c3" containerName="extract" Oct 11 01:02:56 crc kubenswrapper[4743]: E1011 01:02:56.130897 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f25a7dd-4148-4251-9896-0d781682a3c3" containerName="pull" Oct 11 01:02:56 crc kubenswrapper[4743]: I1011 01:02:56.130906 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f25a7dd-4148-4251-9896-0d781682a3c3" containerName="pull" Oct 11 01:02:56 crc kubenswrapper[4743]: E1011 01:02:56.130920 4743 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="2d977ae5-754b-436a-87b9-b0618947c353" containerName="extract" Oct 11 01:02:56 crc kubenswrapper[4743]: I1011 01:02:56.130928 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d977ae5-754b-436a-87b9-b0618947c353" containerName="extract" Oct 11 01:02:56 crc kubenswrapper[4743]: E1011 01:02:56.130941 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f25a7dd-4148-4251-9896-0d781682a3c3" containerName="util" Oct 11 01:02:56 crc kubenswrapper[4743]: I1011 01:02:56.130949 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f25a7dd-4148-4251-9896-0d781682a3c3" containerName="util" Oct 11 01:02:56 crc kubenswrapper[4743]: E1011 01:02:56.130961 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d977ae5-754b-436a-87b9-b0618947c353" containerName="util" Oct 11 01:02:56 crc kubenswrapper[4743]: I1011 01:02:56.130969 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d977ae5-754b-436a-87b9-b0618947c353" containerName="util" Oct 11 01:02:56 crc kubenswrapper[4743]: E1011 01:02:56.130980 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d977ae5-754b-436a-87b9-b0618947c353" containerName="pull" Oct 11 01:02:56 crc kubenswrapper[4743]: I1011 01:02:56.130989 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d977ae5-754b-436a-87b9-b0618947c353" containerName="pull" Oct 11 01:02:56 crc kubenswrapper[4743]: I1011 01:02:56.131133 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f25a7dd-4148-4251-9896-0d781682a3c3" containerName="extract" Oct 11 01:02:56 crc kubenswrapper[4743]: I1011 01:02:56.131153 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d977ae5-754b-436a-87b9-b0618947c353" containerName="extract" Oct 11 01:02:56 crc kubenswrapper[4743]: I1011 01:02:56.131796 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-7857f779b4-t484n" Oct 11 01:02:56 crc kubenswrapper[4743]: I1011 01:02:56.133885 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-service-cert" Oct 11 01:02:56 crc kubenswrapper[4743]: I1011 01:02:56.134631 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"loki-operator-manager-config" Oct 11 01:02:56 crc kubenswrapper[4743]: I1011 01:02:56.134810 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-dockercfg-gw276" Oct 11 01:02:56 crc kubenswrapper[4743]: I1011 01:02:56.145565 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"openshift-service-ca.crt" Oct 11 01:02:56 crc kubenswrapper[4743]: I1011 01:02:56.145579 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"kube-root-ca.crt" Oct 11 01:02:56 crc kubenswrapper[4743]: I1011 01:02:56.146330 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-metrics" Oct 11 01:02:56 crc kubenswrapper[4743]: I1011 01:02:56.151606 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-7857f779b4-t484n"] Oct 11 01:02:56 crc kubenswrapper[4743]: I1011 01:02:56.264223 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/d60b36a6-87ff-4325-a245-43b3dea4cfaf-manager-config\") pod \"loki-operator-controller-manager-7857f779b4-t484n\" (UID: \"d60b36a6-87ff-4325-a245-43b3dea4cfaf\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7857f779b4-t484n" Oct 11 01:02:56 crc kubenswrapper[4743]: I1011 
01:02:56.264266 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qn567\" (UniqueName: \"kubernetes.io/projected/d60b36a6-87ff-4325-a245-43b3dea4cfaf-kube-api-access-qn567\") pod \"loki-operator-controller-manager-7857f779b4-t484n\" (UID: \"d60b36a6-87ff-4325-a245-43b3dea4cfaf\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7857f779b4-t484n" Oct 11 01:02:56 crc kubenswrapper[4743]: I1011 01:02:56.264306 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d60b36a6-87ff-4325-a245-43b3dea4cfaf-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-7857f779b4-t484n\" (UID: \"d60b36a6-87ff-4325-a245-43b3dea4cfaf\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7857f779b4-t484n" Oct 11 01:02:56 crc kubenswrapper[4743]: I1011 01:02:56.264334 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d60b36a6-87ff-4325-a245-43b3dea4cfaf-webhook-cert\") pod \"loki-operator-controller-manager-7857f779b4-t484n\" (UID: \"d60b36a6-87ff-4325-a245-43b3dea4cfaf\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7857f779b4-t484n" Oct 11 01:02:56 crc kubenswrapper[4743]: I1011 01:02:56.264565 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d60b36a6-87ff-4325-a245-43b3dea4cfaf-apiservice-cert\") pod \"loki-operator-controller-manager-7857f779b4-t484n\" (UID: \"d60b36a6-87ff-4325-a245-43b3dea4cfaf\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7857f779b4-t484n" Oct 11 01:02:56 crc kubenswrapper[4743]: I1011 01:02:56.365517 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d60b36a6-87ff-4325-a245-43b3dea4cfaf-webhook-cert\") pod \"loki-operator-controller-manager-7857f779b4-t484n\" (UID: \"d60b36a6-87ff-4325-a245-43b3dea4cfaf\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7857f779b4-t484n" Oct 11 01:02:56 crc kubenswrapper[4743]: I1011 01:02:56.365806 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d60b36a6-87ff-4325-a245-43b3dea4cfaf-apiservice-cert\") pod \"loki-operator-controller-manager-7857f779b4-t484n\" (UID: \"d60b36a6-87ff-4325-a245-43b3dea4cfaf\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7857f779b4-t484n" Oct 11 01:02:56 crc kubenswrapper[4743]: I1011 01:02:56.365843 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/d60b36a6-87ff-4325-a245-43b3dea4cfaf-manager-config\") pod \"loki-operator-controller-manager-7857f779b4-t484n\" (UID: \"d60b36a6-87ff-4325-a245-43b3dea4cfaf\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7857f779b4-t484n" Oct 11 01:02:56 crc kubenswrapper[4743]: I1011 01:02:56.365880 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qn567\" (UniqueName: \"kubernetes.io/projected/d60b36a6-87ff-4325-a245-43b3dea4cfaf-kube-api-access-qn567\") pod \"loki-operator-controller-manager-7857f779b4-t484n\" (UID: \"d60b36a6-87ff-4325-a245-43b3dea4cfaf\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7857f779b4-t484n" Oct 11 01:02:56 crc kubenswrapper[4743]: I1011 01:02:56.365903 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d60b36a6-87ff-4325-a245-43b3dea4cfaf-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-7857f779b4-t484n\" (UID: 
\"d60b36a6-87ff-4325-a245-43b3dea4cfaf\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7857f779b4-t484n" Oct 11 01:02:56 crc kubenswrapper[4743]: I1011 01:02:56.367658 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/d60b36a6-87ff-4325-a245-43b3dea4cfaf-manager-config\") pod \"loki-operator-controller-manager-7857f779b4-t484n\" (UID: \"d60b36a6-87ff-4325-a245-43b3dea4cfaf\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7857f779b4-t484n" Oct 11 01:02:56 crc kubenswrapper[4743]: I1011 01:02:56.371466 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d60b36a6-87ff-4325-a245-43b3dea4cfaf-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-7857f779b4-t484n\" (UID: \"d60b36a6-87ff-4325-a245-43b3dea4cfaf\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7857f779b4-t484n" Oct 11 01:02:56 crc kubenswrapper[4743]: I1011 01:02:56.371532 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d60b36a6-87ff-4325-a245-43b3dea4cfaf-apiservice-cert\") pod \"loki-operator-controller-manager-7857f779b4-t484n\" (UID: \"d60b36a6-87ff-4325-a245-43b3dea4cfaf\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7857f779b4-t484n" Oct 11 01:02:56 crc kubenswrapper[4743]: I1011 01:02:56.371910 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d60b36a6-87ff-4325-a245-43b3dea4cfaf-webhook-cert\") pod \"loki-operator-controller-manager-7857f779b4-t484n\" (UID: \"d60b36a6-87ff-4325-a245-43b3dea4cfaf\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7857f779b4-t484n" Oct 11 01:02:56 crc kubenswrapper[4743]: I1011 01:02:56.391134 4743 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-qn567\" (UniqueName: \"kubernetes.io/projected/d60b36a6-87ff-4325-a245-43b3dea4cfaf-kube-api-access-qn567\") pod \"loki-operator-controller-manager-7857f779b4-t484n\" (UID: \"d60b36a6-87ff-4325-a245-43b3dea4cfaf\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7857f779b4-t484n" Oct 11 01:02:56 crc kubenswrapper[4743]: I1011 01:02:56.448926 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-7857f779b4-t484n" Oct 11 01:02:56 crc kubenswrapper[4743]: I1011 01:02:56.714894 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-7857f779b4-t484n"] Oct 11 01:02:57 crc kubenswrapper[4743]: I1011 01:02:57.549542 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-7857f779b4-t484n" event={"ID":"d60b36a6-87ff-4325-a245-43b3dea4cfaf","Type":"ContainerStarted","Data":"1d28b0c2f2ec1edbae1c38a1941c0140282846cc3d59712fa362d846f476c030"} Oct 11 01:02:59 crc kubenswrapper[4743]: I1011 01:02:59.297547 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/cluster-logging-operator-8958c8b87-zqct6"] Oct 11 01:02:59 crc kubenswrapper[4743]: I1011 01:02:59.298392 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/cluster-logging-operator-8958c8b87-zqct6" Oct 11 01:02:59 crc kubenswrapper[4743]: I1011 01:02:59.301363 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"openshift-service-ca.crt" Oct 11 01:02:59 crc kubenswrapper[4743]: I1011 01:02:59.301721 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"cluster-logging-operator-dockercfg-54hww" Oct 11 01:02:59 crc kubenswrapper[4743]: I1011 01:02:59.301937 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"kube-root-ca.crt" Oct 11 01:02:59 crc kubenswrapper[4743]: I1011 01:02:59.307044 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/cluster-logging-operator-8958c8b87-zqct6"] Oct 11 01:02:59 crc kubenswrapper[4743]: I1011 01:02:59.406046 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrqg8\" (UniqueName: \"kubernetes.io/projected/e4f8060c-3f3e-4e5d-85c5-3c344322869a-kube-api-access-lrqg8\") pod \"cluster-logging-operator-8958c8b87-zqct6\" (UID: \"e4f8060c-3f3e-4e5d-85c5-3c344322869a\") " pod="openshift-logging/cluster-logging-operator-8958c8b87-zqct6" Oct 11 01:02:59 crc kubenswrapper[4743]: I1011 01:02:59.506707 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrqg8\" (UniqueName: \"kubernetes.io/projected/e4f8060c-3f3e-4e5d-85c5-3c344322869a-kube-api-access-lrqg8\") pod \"cluster-logging-operator-8958c8b87-zqct6\" (UID: \"e4f8060c-3f3e-4e5d-85c5-3c344322869a\") " pod="openshift-logging/cluster-logging-operator-8958c8b87-zqct6" Oct 11 01:02:59 crc kubenswrapper[4743]: I1011 01:02:59.524682 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrqg8\" (UniqueName: \"kubernetes.io/projected/e4f8060c-3f3e-4e5d-85c5-3c344322869a-kube-api-access-lrqg8\") pod 
\"cluster-logging-operator-8958c8b87-zqct6\" (UID: \"e4f8060c-3f3e-4e5d-85c5-3c344322869a\") " pod="openshift-logging/cluster-logging-operator-8958c8b87-zqct6" Oct 11 01:02:59 crc kubenswrapper[4743]: I1011 01:02:59.653099 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/cluster-logging-operator-8958c8b87-zqct6" Oct 11 01:03:02 crc kubenswrapper[4743]: I1011 01:03:02.084792 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/cluster-logging-operator-8958c8b87-zqct6"] Oct 11 01:03:02 crc kubenswrapper[4743]: I1011 01:03:02.585841 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/cluster-logging-operator-8958c8b87-zqct6" event={"ID":"e4f8060c-3f3e-4e5d-85c5-3c344322869a","Type":"ContainerStarted","Data":"cc0cc315c1a392c85086538f1cb7fb75ad47aa6b43b66b6ae4b15cde909edb69"} Oct 11 01:03:02 crc kubenswrapper[4743]: I1011 01:03:02.587601 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-7857f779b4-t484n" event={"ID":"d60b36a6-87ff-4325-a245-43b3dea4cfaf","Type":"ContainerStarted","Data":"b1c56c197f00c0a56b8c8e96f56cc2f7d82351ef923a1ec0aa053feada82d034"} Oct 11 01:03:10 crc kubenswrapper[4743]: I1011 01:03:10.635015 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/cluster-logging-operator-8958c8b87-zqct6" event={"ID":"e4f8060c-3f3e-4e5d-85c5-3c344322869a","Type":"ContainerStarted","Data":"f6062f93ba57e88aaf5c3a7b773f52fb627d5c3d9a76c554b17c0c6074b58661"} Oct 11 01:03:10 crc kubenswrapper[4743]: I1011 01:03:10.636647 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-7857f779b4-t484n" event={"ID":"d60b36a6-87ff-4325-a245-43b3dea4cfaf","Type":"ContainerStarted","Data":"d56f33a1adb0e1c21698dff60930f1a7316c5d24203a9ef6c25a3bffe84b373e"} Oct 11 01:03:10 crc kubenswrapper[4743]: I1011 01:03:10.636879 4743 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators-redhat/loki-operator-controller-manager-7857f779b4-t484n" Oct 11 01:03:10 crc kubenswrapper[4743]: I1011 01:03:10.638795 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators-redhat/loki-operator-controller-manager-7857f779b4-t484n" Oct 11 01:03:10 crc kubenswrapper[4743]: I1011 01:03:10.654939 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/cluster-logging-operator-8958c8b87-zqct6" podStartSLOduration=3.947153144 podStartE2EDuration="11.654920572s" podCreationTimestamp="2025-10-11 01:02:59 +0000 UTC" firstStartedPulling="2025-10-11 01:03:02.104656127 +0000 UTC m=+676.757636524" lastFinishedPulling="2025-10-11 01:03:09.812423555 +0000 UTC m=+684.465403952" observedRunningTime="2025-10-11 01:03:10.649870122 +0000 UTC m=+685.302850539" watchObservedRunningTime="2025-10-11 01:03:10.654920572 +0000 UTC m=+685.307900979" Oct 11 01:03:10 crc kubenswrapper[4743]: I1011 01:03:10.676065 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators-redhat/loki-operator-controller-manager-7857f779b4-t484n" podStartSLOduration=1.58752754 podStartE2EDuration="14.676034095s" podCreationTimestamp="2025-10-11 01:02:56 +0000 UTC" firstStartedPulling="2025-10-11 01:02:56.724503476 +0000 UTC m=+671.377483873" lastFinishedPulling="2025-10-11 01:03:09.813010021 +0000 UTC m=+684.465990428" observedRunningTime="2025-10-11 01:03:10.675560863 +0000 UTC m=+685.328541280" watchObservedRunningTime="2025-10-11 01:03:10.676034095 +0000 UTC m=+685.329014562" Oct 11 01:03:14 crc kubenswrapper[4743]: I1011 01:03:14.458287 4743 patch_prober.go:28] interesting pod/machine-config-daemon-cvm72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" start-of-body= Oct 11 01:03:14 crc kubenswrapper[4743]: I1011 01:03:14.458599 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 01:03:15 crc kubenswrapper[4743]: I1011 01:03:15.757541 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["minio-dev/minio"] Oct 11 01:03:15 crc kubenswrapper[4743]: I1011 01:03:15.758236 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="minio-dev/minio" Oct 11 01:03:15 crc kubenswrapper[4743]: I1011 01:03:15.760275 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"openshift-service-ca.crt" Oct 11 01:03:15 crc kubenswrapper[4743]: I1011 01:03:15.769207 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"kube-root-ca.crt" Oct 11 01:03:15 crc kubenswrapper[4743]: I1011 01:03:15.783359 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Oct 11 01:03:15 crc kubenswrapper[4743]: I1011 01:03:15.949707 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxlz5\" (UniqueName: \"kubernetes.io/projected/90243924-956a-4163-a85b-c9cc3e02c003-kube-api-access-rxlz5\") pod \"minio\" (UID: \"90243924-956a-4163-a85b-c9cc3e02c003\") " pod="minio-dev/minio" Oct 11 01:03:15 crc kubenswrapper[4743]: I1011 01:03:15.949767 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-699a960d-4349-4221-baaa-8ed979f759bd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-699a960d-4349-4221-baaa-8ed979f759bd\") pod \"minio\" (UID: \"90243924-956a-4163-a85b-c9cc3e02c003\") " pod="minio-dev/minio" Oct 11 01:03:16 crc 
kubenswrapper[4743]: I1011 01:03:16.050486 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxlz5\" (UniqueName: \"kubernetes.io/projected/90243924-956a-4163-a85b-c9cc3e02c003-kube-api-access-rxlz5\") pod \"minio\" (UID: \"90243924-956a-4163-a85b-c9cc3e02c003\") " pod="minio-dev/minio" Oct 11 01:03:16 crc kubenswrapper[4743]: I1011 01:03:16.050533 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-699a960d-4349-4221-baaa-8ed979f759bd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-699a960d-4349-4221-baaa-8ed979f759bd\") pod \"minio\" (UID: \"90243924-956a-4163-a85b-c9cc3e02c003\") " pod="minio-dev/minio" Oct 11 01:03:16 crc kubenswrapper[4743]: I1011 01:03:16.053972 4743 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 11 01:03:16 crc kubenswrapper[4743]: I1011 01:03:16.054011 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-699a960d-4349-4221-baaa-8ed979f759bd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-699a960d-4349-4221-baaa-8ed979f759bd\") pod \"minio\" (UID: \"90243924-956a-4163-a85b-c9cc3e02c003\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/9953a5cd29cf2d478f1bfddf6aea80cc0a885830e8024caabdc0bf193611dc63/globalmount\"" pod="minio-dev/minio" Oct 11 01:03:16 crc kubenswrapper[4743]: I1011 01:03:16.068767 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxlz5\" (UniqueName: \"kubernetes.io/projected/90243924-956a-4163-a85b-c9cc3e02c003-kube-api-access-rxlz5\") pod \"minio\" (UID: \"90243924-956a-4163-a85b-c9cc3e02c003\") " pod="minio-dev/minio" Oct 11 01:03:16 crc kubenswrapper[4743]: I1011 01:03:16.086547 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"pvc-699a960d-4349-4221-baaa-8ed979f759bd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-699a960d-4349-4221-baaa-8ed979f759bd\") pod \"minio\" (UID: \"90243924-956a-4163-a85b-c9cc3e02c003\") " pod="minio-dev/minio" Oct 11 01:03:16 crc kubenswrapper[4743]: I1011 01:03:16.374506 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="minio-dev/minio" Oct 11 01:03:16 crc kubenswrapper[4743]: I1011 01:03:16.840911 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Oct 11 01:03:16 crc kubenswrapper[4743]: W1011 01:03:16.847342 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90243924_956a_4163_a85b_c9cc3e02c003.slice/crio-9602a38a8bc3fc3f88878cb2e89fc3ca8dfd34814dc5703b3141b6188fc0fe42 WatchSource:0}: Error finding container 9602a38a8bc3fc3f88878cb2e89fc3ca8dfd34814dc5703b3141b6188fc0fe42: Status 404 returned error can't find the container with id 9602a38a8bc3fc3f88878cb2e89fc3ca8dfd34814dc5703b3141b6188fc0fe42 Oct 11 01:03:17 crc kubenswrapper[4743]: I1011 01:03:17.679731 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"90243924-956a-4163-a85b-c9cc3e02c003","Type":"ContainerStarted","Data":"9602a38a8bc3fc3f88878cb2e89fc3ca8dfd34814dc5703b3141b6188fc0fe42"} Oct 11 01:03:20 crc kubenswrapper[4743]: I1011 01:03:20.722179 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"90243924-956a-4163-a85b-c9cc3e02c003","Type":"ContainerStarted","Data":"12f7aceafbe690eaaa55a9b489f2113587bd9ee21580625b680654081ddbfed8"} Oct 11 01:03:20 crc kubenswrapper[4743]: I1011 01:03:20.741027 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="minio-dev/minio" podStartSLOduration=4.82232265 podStartE2EDuration="7.7410053s" podCreationTimestamp="2025-10-11 01:03:13 +0000 UTC" firstStartedPulling="2025-10-11 01:03:16.849493259 +0000 
UTC m=+691.502473676" lastFinishedPulling="2025-10-11 01:03:19.768175889 +0000 UTC m=+694.421156326" observedRunningTime="2025-10-11 01:03:20.739764738 +0000 UTC m=+695.392745205" watchObservedRunningTime="2025-10-11 01:03:20.7410053 +0000 UTC m=+695.393985727" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.144349 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-distributor-6f5f7fff97-72rvc"] Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.145728 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-distributor-6f5f7fff97-72rvc" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.147729 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-distributor-grpc" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.150485 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-config" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.150730 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-distributor-http" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.150937 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-dockercfg-xxncw" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.151122 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-ca-bundle" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.178163 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-distributor-6f5f7fff97-72rvc"] Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.258416 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-distributor-grpc\" (UniqueName: 
\"kubernetes.io/secret/1f759884-04cd-4b18-90dd-9b4745c12ba7-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-6f5f7fff97-72rvc\" (UID: \"1f759884-04cd-4b18-90dd-9b4745c12ba7\") " pod="openshift-logging/logging-loki-distributor-6f5f7fff97-72rvc" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.258471 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/1f759884-04cd-4b18-90dd-9b4745c12ba7-logging-loki-distributor-http\") pod \"logging-loki-distributor-6f5f7fff97-72rvc\" (UID: \"1f759884-04cd-4b18-90dd-9b4745c12ba7\") " pod="openshift-logging/logging-loki-distributor-6f5f7fff97-72rvc" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.258499 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f759884-04cd-4b18-90dd-9b4745c12ba7-config\") pod \"logging-loki-distributor-6f5f7fff97-72rvc\" (UID: \"1f759884-04cd-4b18-90dd-9b4745c12ba7\") " pod="openshift-logging/logging-loki-distributor-6f5f7fff97-72rvc" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.258577 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6vxn\" (UniqueName: \"kubernetes.io/projected/1f759884-04cd-4b18-90dd-9b4745c12ba7-kube-api-access-g6vxn\") pod \"logging-loki-distributor-6f5f7fff97-72rvc\" (UID: \"1f759884-04cd-4b18-90dd-9b4745c12ba7\") " pod="openshift-logging/logging-loki-distributor-6f5f7fff97-72rvc" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.258631 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f759884-04cd-4b18-90dd-9b4745c12ba7-logging-loki-ca-bundle\") pod \"logging-loki-distributor-6f5f7fff97-72rvc\" (UID: \"1f759884-04cd-4b18-90dd-9b4745c12ba7\") 
" pod="openshift-logging/logging-loki-distributor-6f5f7fff97-72rvc" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.301548 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-querier-5d954896cf-xd7bv"] Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.303011 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-querier-5d954896cf-xd7bv" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.305116 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-querier-http" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.305330 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-s3" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.306217 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-querier-grpc" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.330198 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-querier-5d954896cf-xd7bv"] Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.359759 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t592r\" (UniqueName: \"kubernetes.io/projected/7b393f9c-b255-4cac-96f2-3d5861cc7cce-kube-api-access-t592r\") pod \"logging-loki-querier-5d954896cf-xd7bv\" (UID: \"7b393f9c-b255-4cac-96f2-3d5861cc7cce\") " pod="openshift-logging/logging-loki-querier-5d954896cf-xd7bv" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.359806 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/7b393f9c-b255-4cac-96f2-3d5861cc7cce-logging-loki-querier-http\") pod \"logging-loki-querier-5d954896cf-xd7bv\" (UID: \"7b393f9c-b255-4cac-96f2-3d5861cc7cce\") " 
pod="openshift-logging/logging-loki-querier-5d954896cf-xd7bv" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.359835 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6vxn\" (UniqueName: \"kubernetes.io/projected/1f759884-04cd-4b18-90dd-9b4745c12ba7-kube-api-access-g6vxn\") pod \"logging-loki-distributor-6f5f7fff97-72rvc\" (UID: \"1f759884-04cd-4b18-90dd-9b4745c12ba7\") " pod="openshift-logging/logging-loki-distributor-6f5f7fff97-72rvc" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.359939 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b393f9c-b255-4cac-96f2-3d5861cc7cce-logging-loki-ca-bundle\") pod \"logging-loki-querier-5d954896cf-xd7bv\" (UID: \"7b393f9c-b255-4cac-96f2-3d5861cc7cce\") " pod="openshift-logging/logging-loki-querier-5d954896cf-xd7bv" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.359996 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/7b393f9c-b255-4cac-96f2-3d5861cc7cce-logging-loki-querier-grpc\") pod \"logging-loki-querier-5d954896cf-xd7bv\" (UID: \"7b393f9c-b255-4cac-96f2-3d5861cc7cce\") " pod="openshift-logging/logging-loki-querier-5d954896cf-xd7bv" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.360065 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b393f9c-b255-4cac-96f2-3d5861cc7cce-config\") pod \"logging-loki-querier-5d954896cf-xd7bv\" (UID: \"7b393f9c-b255-4cac-96f2-3d5861cc7cce\") " pod="openshift-logging/logging-loki-querier-5d954896cf-xd7bv" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.360148 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/1f759884-04cd-4b18-90dd-9b4745c12ba7-logging-loki-ca-bundle\") pod \"logging-loki-distributor-6f5f7fff97-72rvc\" (UID: \"1f759884-04cd-4b18-90dd-9b4745c12ba7\") " pod="openshift-logging/logging-loki-distributor-6f5f7fff97-72rvc" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.360212 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/1f759884-04cd-4b18-90dd-9b4745c12ba7-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-6f5f7fff97-72rvc\" (UID: \"1f759884-04cd-4b18-90dd-9b4745c12ba7\") " pod="openshift-logging/logging-loki-distributor-6f5f7fff97-72rvc" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.360245 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/1f759884-04cd-4b18-90dd-9b4745c12ba7-logging-loki-distributor-http\") pod \"logging-loki-distributor-6f5f7fff97-72rvc\" (UID: \"1f759884-04cd-4b18-90dd-9b4745c12ba7\") " pod="openshift-logging/logging-loki-distributor-6f5f7fff97-72rvc" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.360275 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f759884-04cd-4b18-90dd-9b4745c12ba7-config\") pod \"logging-loki-distributor-6f5f7fff97-72rvc\" (UID: \"1f759884-04cd-4b18-90dd-9b4745c12ba7\") " pod="openshift-logging/logging-loki-distributor-6f5f7fff97-72rvc" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.360303 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/7b393f9c-b255-4cac-96f2-3d5861cc7cce-logging-loki-s3\") pod \"logging-loki-querier-5d954896cf-xd7bv\" (UID: \"7b393f9c-b255-4cac-96f2-3d5861cc7cce\") " 
pod="openshift-logging/logging-loki-querier-5d954896cf-xd7bv" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.361041 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f759884-04cd-4b18-90dd-9b4745c12ba7-logging-loki-ca-bundle\") pod \"logging-loki-distributor-6f5f7fff97-72rvc\" (UID: \"1f759884-04cd-4b18-90dd-9b4745c12ba7\") " pod="openshift-logging/logging-loki-distributor-6f5f7fff97-72rvc" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.361502 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f759884-04cd-4b18-90dd-9b4745c12ba7-config\") pod \"logging-loki-distributor-6f5f7fff97-72rvc\" (UID: \"1f759884-04cd-4b18-90dd-9b4745c12ba7\") " pod="openshift-logging/logging-loki-distributor-6f5f7fff97-72rvc" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.364603 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-query-frontend-6fbbbc8b7d-zzmff"] Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.365381 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-query-frontend-6fbbbc8b7d-zzmff" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.366075 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/1f759884-04cd-4b18-90dd-9b4745c12ba7-logging-loki-distributor-http\") pod \"logging-loki-distributor-6f5f7fff97-72rvc\" (UID: \"1f759884-04cd-4b18-90dd-9b4745c12ba7\") " pod="openshift-logging/logging-loki-distributor-6f5f7fff97-72rvc" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.366245 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/1f759884-04cd-4b18-90dd-9b4745c12ba7-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-6f5f7fff97-72rvc\" (UID: \"1f759884-04cd-4b18-90dd-9b4745c12ba7\") " pod="openshift-logging/logging-loki-distributor-6f5f7fff97-72rvc" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.368317 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-query-frontend-http" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.368550 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-query-frontend-grpc" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.391515 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6vxn\" (UniqueName: \"kubernetes.io/projected/1f759884-04cd-4b18-90dd-9b4745c12ba7-kube-api-access-g6vxn\") pod \"logging-loki-distributor-6f5f7fff97-72rvc\" (UID: \"1f759884-04cd-4b18-90dd-9b4745c12ba7\") " pod="openshift-logging/logging-loki-distributor-6f5f7fff97-72rvc" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.397686 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-query-frontend-6fbbbc8b7d-zzmff"] Oct 11 01:03:24 crc 
kubenswrapper[4743]: I1011 01:03:24.461838 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/58a37184-c341-454d-b33e-a2af6dc56af3-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-6fbbbc8b7d-zzmff\" (UID: \"58a37184-c341-454d-b33e-a2af6dc56af3\") " pod="openshift-logging/logging-loki-query-frontend-6fbbbc8b7d-zzmff" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.461890 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8csfq\" (UniqueName: \"kubernetes.io/projected/58a37184-c341-454d-b33e-a2af6dc56af3-kube-api-access-8csfq\") pod \"logging-loki-query-frontend-6fbbbc8b7d-zzmff\" (UID: \"58a37184-c341-454d-b33e-a2af6dc56af3\") " pod="openshift-logging/logging-loki-query-frontend-6fbbbc8b7d-zzmff" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.461915 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t592r\" (UniqueName: \"kubernetes.io/projected/7b393f9c-b255-4cac-96f2-3d5861cc7cce-kube-api-access-t592r\") pod \"logging-loki-querier-5d954896cf-xd7bv\" (UID: \"7b393f9c-b255-4cac-96f2-3d5861cc7cce\") " pod="openshift-logging/logging-loki-querier-5d954896cf-xd7bv" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.462064 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/7b393f9c-b255-4cac-96f2-3d5861cc7cce-logging-loki-querier-http\") pod \"logging-loki-querier-5d954896cf-xd7bv\" (UID: \"7b393f9c-b255-4cac-96f2-3d5861cc7cce\") " pod="openshift-logging/logging-loki-querier-5d954896cf-xd7bv" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.462142 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/7b393f9c-b255-4cac-96f2-3d5861cc7cce-logging-loki-ca-bundle\") pod \"logging-loki-querier-5d954896cf-xd7bv\" (UID: \"7b393f9c-b255-4cac-96f2-3d5861cc7cce\") " pod="openshift-logging/logging-loki-querier-5d954896cf-xd7bv" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.462170 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/7b393f9c-b255-4cac-96f2-3d5861cc7cce-logging-loki-querier-grpc\") pod \"logging-loki-querier-5d954896cf-xd7bv\" (UID: \"7b393f9c-b255-4cac-96f2-3d5861cc7cce\") " pod="openshift-logging/logging-loki-querier-5d954896cf-xd7bv" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.462204 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58a37184-c341-454d-b33e-a2af6dc56af3-config\") pod \"logging-loki-query-frontend-6fbbbc8b7d-zzmff\" (UID: \"58a37184-c341-454d-b33e-a2af6dc56af3\") " pod="openshift-logging/logging-loki-query-frontend-6fbbbc8b7d-zzmff" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.462238 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b393f9c-b255-4cac-96f2-3d5861cc7cce-config\") pod \"logging-loki-querier-5d954896cf-xd7bv\" (UID: \"7b393f9c-b255-4cac-96f2-3d5861cc7cce\") " pod="openshift-logging/logging-loki-querier-5d954896cf-xd7bv" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.462321 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/58a37184-c341-454d-b33e-a2af6dc56af3-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-6fbbbc8b7d-zzmff\" (UID: \"58a37184-c341-454d-b33e-a2af6dc56af3\") " pod="openshift-logging/logging-loki-query-frontend-6fbbbc8b7d-zzmff" Oct 11 
01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.462362 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58a37184-c341-454d-b33e-a2af6dc56af3-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-6fbbbc8b7d-zzmff\" (UID: \"58a37184-c341-454d-b33e-a2af6dc56af3\") " pod="openshift-logging/logging-loki-query-frontend-6fbbbc8b7d-zzmff" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.462404 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/7b393f9c-b255-4cac-96f2-3d5861cc7cce-logging-loki-s3\") pod \"logging-loki-querier-5d954896cf-xd7bv\" (UID: \"7b393f9c-b255-4cac-96f2-3d5861cc7cce\") " pod="openshift-logging/logging-loki-querier-5d954896cf-xd7bv" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.463103 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b393f9c-b255-4cac-96f2-3d5861cc7cce-logging-loki-ca-bundle\") pod \"logging-loki-querier-5d954896cf-xd7bv\" (UID: \"7b393f9c-b255-4cac-96f2-3d5861cc7cce\") " pod="openshift-logging/logging-loki-querier-5d954896cf-xd7bv" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.463308 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b393f9c-b255-4cac-96f2-3d5861cc7cce-config\") pod \"logging-loki-querier-5d954896cf-xd7bv\" (UID: \"7b393f9c-b255-4cac-96f2-3d5861cc7cce\") " pod="openshift-logging/logging-loki-querier-5d954896cf-xd7bv" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.464740 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/7b393f9c-b255-4cac-96f2-3d5861cc7cce-logging-loki-querier-http\") pod \"logging-loki-querier-5d954896cf-xd7bv\" 
(UID: \"7b393f9c-b255-4cac-96f2-3d5861cc7cce\") " pod="openshift-logging/logging-loki-querier-5d954896cf-xd7bv" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.468455 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-gateway-c45fcc855-ddvcq"] Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.470376 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-gateway-c45fcc855-ddvcq" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.475180 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-client-http" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.475483 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-http" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.475587 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-gateway-ca-bundle" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.475686 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-dockercfg-29jt7" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.475811 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-gateway" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.476322 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.480761 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-distributor-6f5f7fff97-72rvc" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.480844 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/7b393f9c-b255-4cac-96f2-3d5861cc7cce-logging-loki-querier-grpc\") pod \"logging-loki-querier-5d954896cf-xd7bv\" (UID: \"7b393f9c-b255-4cac-96f2-3d5861cc7cce\") " pod="openshift-logging/logging-loki-querier-5d954896cf-xd7bv" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.482539 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/7b393f9c-b255-4cac-96f2-3d5861cc7cce-logging-loki-s3\") pod \"logging-loki-querier-5d954896cf-xd7bv\" (UID: \"7b393f9c-b255-4cac-96f2-3d5861cc7cce\") " pod="openshift-logging/logging-loki-querier-5d954896cf-xd7bv" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.495900 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t592r\" (UniqueName: \"kubernetes.io/projected/7b393f9c-b255-4cac-96f2-3d5861cc7cce-kube-api-access-t592r\") pod \"logging-loki-querier-5d954896cf-xd7bv\" (UID: \"7b393f9c-b255-4cac-96f2-3d5861cc7cce\") " pod="openshift-logging/logging-loki-querier-5d954896cf-xd7bv" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.499314 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-c45fcc855-ddvcq"] Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.529431 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-gateway-c45fcc855-8mtnp"] Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.536423 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-gateway-c45fcc855-8mtnp" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.546792 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-c45fcc855-8mtnp"] Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.563613 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mltfh\" (UniqueName: \"kubernetes.io/projected/443c6346-a364-4e67-8c53-2bcd9b1f0927-kube-api-access-mltfh\") pod \"logging-loki-gateway-c45fcc855-ddvcq\" (UID: \"443c6346-a364-4e67-8c53-2bcd9b1f0927\") " pod="openshift-logging/logging-loki-gateway-c45fcc855-ddvcq" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.563663 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/58a37184-c341-454d-b33e-a2af6dc56af3-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-6fbbbc8b7d-zzmff\" (UID: \"58a37184-c341-454d-b33e-a2af6dc56af3\") " pod="openshift-logging/logging-loki-query-frontend-6fbbbc8b7d-zzmff" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.563703 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8csfq\" (UniqueName: \"kubernetes.io/projected/58a37184-c341-454d-b33e-a2af6dc56af3-kube-api-access-8csfq\") pod \"logging-loki-query-frontend-6fbbbc8b7d-zzmff\" (UID: \"58a37184-c341-454d-b33e-a2af6dc56af3\") " pod="openshift-logging/logging-loki-query-frontend-6fbbbc8b7d-zzmff" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.563725 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwjc7\" (UniqueName: \"kubernetes.io/projected/60332ecb-34a5-4628-9311-4469d823f589-kube-api-access-lwjc7\") pod \"logging-loki-gateway-c45fcc855-8mtnp\" (UID: \"60332ecb-34a5-4628-9311-4469d823f589\") 
" pod="openshift-logging/logging-loki-gateway-c45fcc855-8mtnp" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.564039 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/60332ecb-34a5-4628-9311-4469d823f589-tls-secret\") pod \"logging-loki-gateway-c45fcc855-8mtnp\" (UID: \"60332ecb-34a5-4628-9311-4469d823f589\") " pod="openshift-logging/logging-loki-gateway-c45fcc855-8mtnp" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.564066 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/60332ecb-34a5-4628-9311-4469d823f589-rbac\") pod \"logging-loki-gateway-c45fcc855-8mtnp\" (UID: \"60332ecb-34a5-4628-9311-4469d823f589\") " pod="openshift-logging/logging-loki-gateway-c45fcc855-8mtnp" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.564116 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/443c6346-a364-4e67-8c53-2bcd9b1f0927-tenants\") pod \"logging-loki-gateway-c45fcc855-ddvcq\" (UID: \"443c6346-a364-4e67-8c53-2bcd9b1f0927\") " pod="openshift-logging/logging-loki-gateway-c45fcc855-ddvcq" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.564134 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/443c6346-a364-4e67-8c53-2bcd9b1f0927-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-c45fcc855-ddvcq\" (UID: \"443c6346-a364-4e67-8c53-2bcd9b1f0927\") " pod="openshift-logging/logging-loki-gateway-c45fcc855-ddvcq" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.564201 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/60332ecb-34a5-4628-9311-4469d823f589-logging-loki-ca-bundle\") pod \"logging-loki-gateway-c45fcc855-8mtnp\" (UID: \"60332ecb-34a5-4628-9311-4469d823f589\") " pod="openshift-logging/logging-loki-gateway-c45fcc855-8mtnp" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.564271 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58a37184-c341-454d-b33e-a2af6dc56af3-config\") pod \"logging-loki-query-frontend-6fbbbc8b7d-zzmff\" (UID: \"58a37184-c341-454d-b33e-a2af6dc56af3\") " pod="openshift-logging/logging-loki-query-frontend-6fbbbc8b7d-zzmff" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.564440 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/443c6346-a364-4e67-8c53-2bcd9b1f0927-rbac\") pod \"logging-loki-gateway-c45fcc855-ddvcq\" (UID: \"443c6346-a364-4e67-8c53-2bcd9b1f0927\") " pod="openshift-logging/logging-loki-gateway-c45fcc855-ddvcq" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.564468 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/60332ecb-34a5-4628-9311-4469d823f589-tenants\") pod \"logging-loki-gateway-c45fcc855-8mtnp\" (UID: \"60332ecb-34a5-4628-9311-4469d823f589\") " pod="openshift-logging/logging-loki-gateway-c45fcc855-8mtnp" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.564484 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/60332ecb-34a5-4628-9311-4469d823f589-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-c45fcc855-8mtnp\" (UID: \"60332ecb-34a5-4628-9311-4469d823f589\") " pod="openshift-logging/logging-loki-gateway-c45fcc855-8mtnp" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 
01:03:24.564528 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/60332ecb-34a5-4628-9311-4469d823f589-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-c45fcc855-8mtnp\" (UID: \"60332ecb-34a5-4628-9311-4469d823f589\") " pod="openshift-logging/logging-loki-gateway-c45fcc855-8mtnp" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.564549 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/60332ecb-34a5-4628-9311-4469d823f589-lokistack-gateway\") pod \"logging-loki-gateway-c45fcc855-8mtnp\" (UID: \"60332ecb-34a5-4628-9311-4469d823f589\") " pod="openshift-logging/logging-loki-gateway-c45fcc855-8mtnp" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.564588 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/443c6346-a364-4e67-8c53-2bcd9b1f0927-lokistack-gateway\") pod \"logging-loki-gateway-c45fcc855-ddvcq\" (UID: \"443c6346-a364-4e67-8c53-2bcd9b1f0927\") " pod="openshift-logging/logging-loki-gateway-c45fcc855-ddvcq" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.564611 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/443c6346-a364-4e67-8c53-2bcd9b1f0927-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-c45fcc855-ddvcq\" (UID: \"443c6346-a364-4e67-8c53-2bcd9b1f0927\") " pod="openshift-logging/logging-loki-gateway-c45fcc855-ddvcq" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.564631 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-query-frontend-grpc\" (UniqueName: 
\"kubernetes.io/secret/58a37184-c341-454d-b33e-a2af6dc56af3-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-6fbbbc8b7d-zzmff\" (UID: \"58a37184-c341-454d-b33e-a2af6dc56af3\") " pod="openshift-logging/logging-loki-query-frontend-6fbbbc8b7d-zzmff" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.564666 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/443c6346-a364-4e67-8c53-2bcd9b1f0927-logging-loki-ca-bundle\") pod \"logging-loki-gateway-c45fcc855-ddvcq\" (UID: \"443c6346-a364-4e67-8c53-2bcd9b1f0927\") " pod="openshift-logging/logging-loki-gateway-c45fcc855-ddvcq" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.564692 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/443c6346-a364-4e67-8c53-2bcd9b1f0927-tls-secret\") pod \"logging-loki-gateway-c45fcc855-ddvcq\" (UID: \"443c6346-a364-4e67-8c53-2bcd9b1f0927\") " pod="openshift-logging/logging-loki-gateway-c45fcc855-ddvcq" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.564708 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58a37184-c341-454d-b33e-a2af6dc56af3-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-6fbbbc8b7d-zzmff\" (UID: \"58a37184-c341-454d-b33e-a2af6dc56af3\") " pod="openshift-logging/logging-loki-query-frontend-6fbbbc8b7d-zzmff" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.565741 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58a37184-c341-454d-b33e-a2af6dc56af3-config\") pod \"logging-loki-query-frontend-6fbbbc8b7d-zzmff\" (UID: \"58a37184-c341-454d-b33e-a2af6dc56af3\") " pod="openshift-logging/logging-loki-query-frontend-6fbbbc8b7d-zzmff" Oct 11 
01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.565976 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58a37184-c341-454d-b33e-a2af6dc56af3-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-6fbbbc8b7d-zzmff\" (UID: \"58a37184-c341-454d-b33e-a2af6dc56af3\") " pod="openshift-logging/logging-loki-query-frontend-6fbbbc8b7d-zzmff" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.567201 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/58a37184-c341-454d-b33e-a2af6dc56af3-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-6fbbbc8b7d-zzmff\" (UID: \"58a37184-c341-454d-b33e-a2af6dc56af3\") " pod="openshift-logging/logging-loki-query-frontend-6fbbbc8b7d-zzmff" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.572733 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/58a37184-c341-454d-b33e-a2af6dc56af3-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-6fbbbc8b7d-zzmff\" (UID: \"58a37184-c341-454d-b33e-a2af6dc56af3\") " pod="openshift-logging/logging-loki-query-frontend-6fbbbc8b7d-zzmff" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.577683 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8csfq\" (UniqueName: \"kubernetes.io/projected/58a37184-c341-454d-b33e-a2af6dc56af3-kube-api-access-8csfq\") pod \"logging-loki-query-frontend-6fbbbc8b7d-zzmff\" (UID: \"58a37184-c341-454d-b33e-a2af6dc56af3\") " pod="openshift-logging/logging-loki-query-frontend-6fbbbc8b7d-zzmff" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.616458 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-querier-5d954896cf-xd7bv" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.670611 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/443c6346-a364-4e67-8c53-2bcd9b1f0927-tls-secret\") pod \"logging-loki-gateway-c45fcc855-ddvcq\" (UID: \"443c6346-a364-4e67-8c53-2bcd9b1f0927\") " pod="openshift-logging/logging-loki-gateway-c45fcc855-ddvcq" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.670669 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mltfh\" (UniqueName: \"kubernetes.io/projected/443c6346-a364-4e67-8c53-2bcd9b1f0927-kube-api-access-mltfh\") pod \"logging-loki-gateway-c45fcc855-ddvcq\" (UID: \"443c6346-a364-4e67-8c53-2bcd9b1f0927\") " pod="openshift-logging/logging-loki-gateway-c45fcc855-ddvcq" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.670700 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwjc7\" (UniqueName: \"kubernetes.io/projected/60332ecb-34a5-4628-9311-4469d823f589-kube-api-access-lwjc7\") pod \"logging-loki-gateway-c45fcc855-8mtnp\" (UID: \"60332ecb-34a5-4628-9311-4469d823f589\") " pod="openshift-logging/logging-loki-gateway-c45fcc855-8mtnp" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.670724 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/60332ecb-34a5-4628-9311-4469d823f589-tls-secret\") pod \"logging-loki-gateway-c45fcc855-8mtnp\" (UID: \"60332ecb-34a5-4628-9311-4469d823f589\") " pod="openshift-logging/logging-loki-gateway-c45fcc855-8mtnp" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.670739 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/60332ecb-34a5-4628-9311-4469d823f589-rbac\") pod 
\"logging-loki-gateway-c45fcc855-8mtnp\" (UID: \"60332ecb-34a5-4628-9311-4469d823f589\") " pod="openshift-logging/logging-loki-gateway-c45fcc855-8mtnp" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.670754 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/443c6346-a364-4e67-8c53-2bcd9b1f0927-tenants\") pod \"logging-loki-gateway-c45fcc855-ddvcq\" (UID: \"443c6346-a364-4e67-8c53-2bcd9b1f0927\") " pod="openshift-logging/logging-loki-gateway-c45fcc855-ddvcq" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.670770 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/443c6346-a364-4e67-8c53-2bcd9b1f0927-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-c45fcc855-ddvcq\" (UID: \"443c6346-a364-4e67-8c53-2bcd9b1f0927\") " pod="openshift-logging/logging-loki-gateway-c45fcc855-ddvcq" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.670792 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/60332ecb-34a5-4628-9311-4469d823f589-logging-loki-ca-bundle\") pod \"logging-loki-gateway-c45fcc855-8mtnp\" (UID: \"60332ecb-34a5-4628-9311-4469d823f589\") " pod="openshift-logging/logging-loki-gateway-c45fcc855-8mtnp" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.670817 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/443c6346-a364-4e67-8c53-2bcd9b1f0927-rbac\") pod \"logging-loki-gateway-c45fcc855-ddvcq\" (UID: \"443c6346-a364-4e67-8c53-2bcd9b1f0927\") " pod="openshift-logging/logging-loki-gateway-c45fcc855-ddvcq" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.670832 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: 
\"kubernetes.io/secret/60332ecb-34a5-4628-9311-4469d823f589-tenants\") pod \"logging-loki-gateway-c45fcc855-8mtnp\" (UID: \"60332ecb-34a5-4628-9311-4469d823f589\") " pod="openshift-logging/logging-loki-gateway-c45fcc855-8mtnp" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.670846 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/60332ecb-34a5-4628-9311-4469d823f589-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-c45fcc855-8mtnp\" (UID: \"60332ecb-34a5-4628-9311-4469d823f589\") " pod="openshift-logging/logging-loki-gateway-c45fcc855-8mtnp" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.670889 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/60332ecb-34a5-4628-9311-4469d823f589-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-c45fcc855-8mtnp\" (UID: \"60332ecb-34a5-4628-9311-4469d823f589\") " pod="openshift-logging/logging-loki-gateway-c45fcc855-8mtnp" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.670911 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/60332ecb-34a5-4628-9311-4469d823f589-lokistack-gateway\") pod \"logging-loki-gateway-c45fcc855-8mtnp\" (UID: \"60332ecb-34a5-4628-9311-4469d823f589\") " pod="openshift-logging/logging-loki-gateway-c45fcc855-8mtnp" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.670930 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/443c6346-a364-4e67-8c53-2bcd9b1f0927-lokistack-gateway\") pod \"logging-loki-gateway-c45fcc855-ddvcq\" (UID: \"443c6346-a364-4e67-8c53-2bcd9b1f0927\") " pod="openshift-logging/logging-loki-gateway-c45fcc855-ddvcq" Oct 11 01:03:24 crc kubenswrapper[4743]: 
I1011 01:03:24.670948 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/443c6346-a364-4e67-8c53-2bcd9b1f0927-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-c45fcc855-ddvcq\" (UID: \"443c6346-a364-4e67-8c53-2bcd9b1f0927\") " pod="openshift-logging/logging-loki-gateway-c45fcc855-ddvcq" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.672137 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/443c6346-a364-4e67-8c53-2bcd9b1f0927-logging-loki-ca-bundle\") pod \"logging-loki-gateway-c45fcc855-ddvcq\" (UID: \"443c6346-a364-4e67-8c53-2bcd9b1f0927\") " pod="openshift-logging/logging-loki-gateway-c45fcc855-ddvcq" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.672910 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/443c6346-a364-4e67-8c53-2bcd9b1f0927-logging-loki-ca-bundle\") pod \"logging-loki-gateway-c45fcc855-ddvcq\" (UID: \"443c6346-a364-4e67-8c53-2bcd9b1f0927\") " pod="openshift-logging/logging-loki-gateway-c45fcc855-ddvcq" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.673596 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/60332ecb-34a5-4628-9311-4469d823f589-logging-loki-ca-bundle\") pod \"logging-loki-gateway-c45fcc855-8mtnp\" (UID: \"60332ecb-34a5-4628-9311-4469d823f589\") " pod="openshift-logging/logging-loki-gateway-c45fcc855-8mtnp" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.673646 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/443c6346-a364-4e67-8c53-2bcd9b1f0927-lokistack-gateway\") pod \"logging-loki-gateway-c45fcc855-ddvcq\" (UID: 
\"443c6346-a364-4e67-8c53-2bcd9b1f0927\") " pod="openshift-logging/logging-loki-gateway-c45fcc855-ddvcq" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.674209 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/60332ecb-34a5-4628-9311-4469d823f589-rbac\") pod \"logging-loki-gateway-c45fcc855-8mtnp\" (UID: \"60332ecb-34a5-4628-9311-4469d823f589\") " pod="openshift-logging/logging-loki-gateway-c45fcc855-8mtnp" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.674327 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/60332ecb-34a5-4628-9311-4469d823f589-lokistack-gateway\") pod \"logging-loki-gateway-c45fcc855-8mtnp\" (UID: \"60332ecb-34a5-4628-9311-4469d823f589\") " pod="openshift-logging/logging-loki-gateway-c45fcc855-8mtnp" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.674965 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/443c6346-a364-4e67-8c53-2bcd9b1f0927-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-c45fcc855-ddvcq\" (UID: \"443c6346-a364-4e67-8c53-2bcd9b1f0927\") " pod="openshift-logging/logging-loki-gateway-c45fcc855-ddvcq" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.675834 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/60332ecb-34a5-4628-9311-4469d823f589-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-c45fcc855-8mtnp\" (UID: \"60332ecb-34a5-4628-9311-4469d823f589\") " pod="openshift-logging/logging-loki-gateway-c45fcc855-8mtnp" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.675850 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: 
\"kubernetes.io/configmap/443c6346-a364-4e67-8c53-2bcd9b1f0927-rbac\") pod \"logging-loki-gateway-c45fcc855-ddvcq\" (UID: \"443c6346-a364-4e67-8c53-2bcd9b1f0927\") " pod="openshift-logging/logging-loki-gateway-c45fcc855-ddvcq" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.676473 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/60332ecb-34a5-4628-9311-4469d823f589-tls-secret\") pod \"logging-loki-gateway-c45fcc855-8mtnp\" (UID: \"60332ecb-34a5-4628-9311-4469d823f589\") " pod="openshift-logging/logging-loki-gateway-c45fcc855-8mtnp" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.676470 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/443c6346-a364-4e67-8c53-2bcd9b1f0927-tls-secret\") pod \"logging-loki-gateway-c45fcc855-ddvcq\" (UID: \"443c6346-a364-4e67-8c53-2bcd9b1f0927\") " pod="openshift-logging/logging-loki-gateway-c45fcc855-ddvcq" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.676782 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/443c6346-a364-4e67-8c53-2bcd9b1f0927-tenants\") pod \"logging-loki-gateway-c45fcc855-ddvcq\" (UID: \"443c6346-a364-4e67-8c53-2bcd9b1f0927\") " pod="openshift-logging/logging-loki-gateway-c45fcc855-ddvcq" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.677378 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/443c6346-a364-4e67-8c53-2bcd9b1f0927-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-c45fcc855-ddvcq\" (UID: \"443c6346-a364-4e67-8c53-2bcd9b1f0927\") " pod="openshift-logging/logging-loki-gateway-c45fcc855-ddvcq" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.678046 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/60332ecb-34a5-4628-9311-4469d823f589-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-c45fcc855-8mtnp\" (UID: \"60332ecb-34a5-4628-9311-4469d823f589\") " pod="openshift-logging/logging-loki-gateway-c45fcc855-8mtnp" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.679233 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/60332ecb-34a5-4628-9311-4469d823f589-tenants\") pod \"logging-loki-gateway-c45fcc855-8mtnp\" (UID: \"60332ecb-34a5-4628-9311-4469d823f589\") " pod="openshift-logging/logging-loki-gateway-c45fcc855-8mtnp" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.689040 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mltfh\" (UniqueName: \"kubernetes.io/projected/443c6346-a364-4e67-8c53-2bcd9b1f0927-kube-api-access-mltfh\") pod \"logging-loki-gateway-c45fcc855-ddvcq\" (UID: \"443c6346-a364-4e67-8c53-2bcd9b1f0927\") " pod="openshift-logging/logging-loki-gateway-c45fcc855-ddvcq" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.690160 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwjc7\" (UniqueName: \"kubernetes.io/projected/60332ecb-34a5-4628-9311-4469d823f589-kube-api-access-lwjc7\") pod \"logging-loki-gateway-c45fcc855-8mtnp\" (UID: \"60332ecb-34a5-4628-9311-4469d823f589\") " pod="openshift-logging/logging-loki-gateway-c45fcc855-8mtnp" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.727140 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-query-frontend-6fbbbc8b7d-zzmff" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.841296 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-gateway-c45fcc855-ddvcq" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.853185 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-gateway-c45fcc855-8mtnp" Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.965712 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-query-frontend-6fbbbc8b7d-zzmff"] Oct 11 01:03:24 crc kubenswrapper[4743]: I1011 01:03:24.971306 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-distributor-6f5f7fff97-72rvc"] Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.040088 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-querier-5d954896cf-xd7bv"] Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.346469 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-ingester-0"] Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.348247 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-ingester-0" Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.349581 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-c45fcc855-8mtnp"] Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.351674 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-ingester-grpc" Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.353108 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-ingester-http" Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.365600 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-ingester-0"] Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.383861 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf0c8290-1118-4dbf-a638-bde5c07bdaab-config\") pod \"logging-loki-ingester-0\" (UID: \"bf0c8290-1118-4dbf-a638-bde5c07bdaab\") " pod="openshift-logging/logging-loki-ingester-0" Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.384011 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-51c07f20-08af-4ca6-b9dc-7559ed2323d6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-51c07f20-08af-4ca6-b9dc-7559ed2323d6\") pod \"logging-loki-ingester-0\" (UID: \"bf0c8290-1118-4dbf-a638-bde5c07bdaab\") " pod="openshift-logging/logging-loki-ingester-0" Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.384086 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/bf0c8290-1118-4dbf-a638-bde5c07bdaab-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"bf0c8290-1118-4dbf-a638-bde5c07bdaab\") " 
pod="openshift-logging/logging-loki-ingester-0" Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.384150 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2x6v\" (UniqueName: \"kubernetes.io/projected/bf0c8290-1118-4dbf-a638-bde5c07bdaab-kube-api-access-c2x6v\") pod \"logging-loki-ingester-0\" (UID: \"bf0c8290-1118-4dbf-a638-bde5c07bdaab\") " pod="openshift-logging/logging-loki-ingester-0" Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.384243 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf0c8290-1118-4dbf-a638-bde5c07bdaab-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"bf0c8290-1118-4dbf-a638-bde5c07bdaab\") " pod="openshift-logging/logging-loki-ingester-0" Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.384314 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c5681915-9e45-4c45-b781-2ac759319748\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c5681915-9e45-4c45-b781-2ac759319748\") pod \"logging-loki-ingester-0\" (UID: \"bf0c8290-1118-4dbf-a638-bde5c07bdaab\") " pod="openshift-logging/logging-loki-ingester-0" Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.384412 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/bf0c8290-1118-4dbf-a638-bde5c07bdaab-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"bf0c8290-1118-4dbf-a638-bde5c07bdaab\") " pod="openshift-logging/logging-loki-ingester-0" Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.384480 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ingester-grpc\" (UniqueName: 
\"kubernetes.io/secret/bf0c8290-1118-4dbf-a638-bde5c07bdaab-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"bf0c8290-1118-4dbf-a638-bde5c07bdaab\") " pod="openshift-logging/logging-loki-ingester-0" Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.397954 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-compactor-0"] Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.398763 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-compactor-0" Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.400935 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-compactor-grpc" Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.401106 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-compactor-http" Oct 11 01:03:25 crc kubenswrapper[4743]: W1011 01:03:25.401215 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod443c6346_a364_4e67_8c53_2bcd9b1f0927.slice/crio-7e67fcfd3cc970577c39644a1372b1bbc51865b4da97bf14a1ad3f292130ede7 WatchSource:0}: Error finding container 7e67fcfd3cc970577c39644a1372b1bbc51865b4da97bf14a1ad3f292130ede7: Status 404 returned error can't find the container with id 7e67fcfd3cc970577c39644a1372b1bbc51865b4da97bf14a1ad3f292130ede7 Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.402359 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-c45fcc855-ddvcq"] Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.420989 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-compactor-0"] Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.439751 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"] Oct 11 
01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.440633 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-index-gateway-0" Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.443403 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-index-gateway-grpc" Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.443646 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-index-gateway-http" Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.454192 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"] Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.485250 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84dnq\" (UniqueName: \"kubernetes.io/projected/a85d8329-17fe-4d06-b45e-d410514cc210-kube-api-access-84dnq\") pod \"logging-loki-index-gateway-0\" (UID: \"a85d8329-17fe-4d06-b45e-d410514cc210\") " pod="openshift-logging/logging-loki-index-gateway-0" Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.485288 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/bf0c8290-1118-4dbf-a638-bde5c07bdaab-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"bf0c8290-1118-4dbf-a638-bde5c07bdaab\") " pod="openshift-logging/logging-loki-ingester-0" Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.485322 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2x6v\" (UniqueName: \"kubernetes.io/projected/bf0c8290-1118-4dbf-a638-bde5c07bdaab-kube-api-access-c2x6v\") pod \"logging-loki-ingester-0\" (UID: \"bf0c8290-1118-4dbf-a638-bde5c07bdaab\") " pod="openshift-logging/logging-loki-ingester-0" Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 
01:03:25.485414 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf0c8290-1118-4dbf-a638-bde5c07bdaab-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"bf0c8290-1118-4dbf-a638-bde5c07bdaab\") " pod="openshift-logging/logging-loki-ingester-0" Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.485460 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a85d8329-17fe-4d06-b45e-d410514cc210-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"a85d8329-17fe-4d06-b45e-d410514cc210\") " pod="openshift-logging/logging-loki-index-gateway-0" Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.485488 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c5681915-9e45-4c45-b781-2ac759319748\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c5681915-9e45-4c45-b781-2ac759319748\") pod \"logging-loki-ingester-0\" (UID: \"bf0c8290-1118-4dbf-a638-bde5c07bdaab\") " pod="openshift-logging/logging-loki-ingester-0" Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.485547 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/a85d8329-17fe-4d06-b45e-d410514cc210-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"a85d8329-17fe-4d06-b45e-d410514cc210\") " pod="openshift-logging/logging-loki-index-gateway-0" Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.485583 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/bf0c8290-1118-4dbf-a638-bde5c07bdaab-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"bf0c8290-1118-4dbf-a638-bde5c07bdaab\") " 
pod="openshift-logging/logging-loki-ingester-0" Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.485604 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/a85d8329-17fe-4d06-b45e-d410514cc210-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"a85d8329-17fe-4d06-b45e-d410514cc210\") " pod="openshift-logging/logging-loki-index-gateway-0" Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.485626 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a85d8329-17fe-4d06-b45e-d410514cc210-config\") pod \"logging-loki-index-gateway-0\" (UID: \"a85d8329-17fe-4d06-b45e-d410514cc210\") " pod="openshift-logging/logging-loki-index-gateway-0" Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.485660 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/bf0c8290-1118-4dbf-a638-bde5c07bdaab-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"bf0c8290-1118-4dbf-a638-bde5c07bdaab\") " pod="openshift-logging/logging-loki-ingester-0" Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.485740 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/a85d8329-17fe-4d06-b45e-d410514cc210-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"a85d8329-17fe-4d06-b45e-d410514cc210\") " pod="openshift-logging/logging-loki-index-gateway-0" Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.485771 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf0c8290-1118-4dbf-a638-bde5c07bdaab-config\") pod 
\"logging-loki-ingester-0\" (UID: \"bf0c8290-1118-4dbf-a638-bde5c07bdaab\") " pod="openshift-logging/logging-loki-ingester-0" Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.485816 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-637eb038-5939-4c26-99fe-8f93e1a4d0b7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-637eb038-5939-4c26-99fe-8f93e1a4d0b7\") pod \"logging-loki-index-gateway-0\" (UID: \"a85d8329-17fe-4d06-b45e-d410514cc210\") " pod="openshift-logging/logging-loki-index-gateway-0" Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.485841 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-51c07f20-08af-4ca6-b9dc-7559ed2323d6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-51c07f20-08af-4ca6-b9dc-7559ed2323d6\") pod \"logging-loki-ingester-0\" (UID: \"bf0c8290-1118-4dbf-a638-bde5c07bdaab\") " pod="openshift-logging/logging-loki-ingester-0" Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.486432 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf0c8290-1118-4dbf-a638-bde5c07bdaab-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"bf0c8290-1118-4dbf-a638-bde5c07bdaab\") " pod="openshift-logging/logging-loki-ingester-0" Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.487177 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf0c8290-1118-4dbf-a638-bde5c07bdaab-config\") pod \"logging-loki-ingester-0\" (UID: \"bf0c8290-1118-4dbf-a638-bde5c07bdaab\") " pod="openshift-logging/logging-loki-ingester-0" Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.488558 4743 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.488665 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-51c07f20-08af-4ca6-b9dc-7559ed2323d6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-51c07f20-08af-4ca6-b9dc-7559ed2323d6\") pod \"logging-loki-ingester-0\" (UID: \"bf0c8290-1118-4dbf-a638-bde5c07bdaab\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/3a1e4f302420aee44c1437d02ab17b14913b70c424eee97a717ceff9bba9a1ef/globalmount\"" pod="openshift-logging/logging-loki-ingester-0" Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.488569 4743 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.488897 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c5681915-9e45-4c45-b781-2ac759319748\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c5681915-9e45-4c45-b781-2ac759319748\") pod \"logging-loki-ingester-0\" (UID: \"bf0c8290-1118-4dbf-a638-bde5c07bdaab\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/94ae1ab4b1f0a8592f2546916544387768486bbefc448b814d17754c736401dc/globalmount\"" pod="openshift-logging/logging-loki-ingester-0" Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.490617 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/bf0c8290-1118-4dbf-a638-bde5c07bdaab-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"bf0c8290-1118-4dbf-a638-bde5c07bdaab\") " pod="openshift-logging/logging-loki-ingester-0" Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.490619 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ingester-grpc\" (UniqueName: 
\"kubernetes.io/secret/bf0c8290-1118-4dbf-a638-bde5c07bdaab-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"bf0c8290-1118-4dbf-a638-bde5c07bdaab\") " pod="openshift-logging/logging-loki-ingester-0" Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.492062 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/bf0c8290-1118-4dbf-a638-bde5c07bdaab-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"bf0c8290-1118-4dbf-a638-bde5c07bdaab\") " pod="openshift-logging/logging-loki-ingester-0" Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.499356 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2x6v\" (UniqueName: \"kubernetes.io/projected/bf0c8290-1118-4dbf-a638-bde5c07bdaab-kube-api-access-c2x6v\") pod \"logging-loki-ingester-0\" (UID: \"bf0c8290-1118-4dbf-a638-bde5c07bdaab\") " pod="openshift-logging/logging-loki-ingester-0" Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.512630 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-51c07f20-08af-4ca6-b9dc-7559ed2323d6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-51c07f20-08af-4ca6-b9dc-7559ed2323d6\") pod \"logging-loki-ingester-0\" (UID: \"bf0c8290-1118-4dbf-a638-bde5c07bdaab\") " pod="openshift-logging/logging-loki-ingester-0" Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.513545 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c5681915-9e45-4c45-b781-2ac759319748\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c5681915-9e45-4c45-b781-2ac759319748\") pod \"logging-loki-ingester-0\" (UID: \"bf0c8290-1118-4dbf-a638-bde5c07bdaab\") " pod="openshift-logging/logging-loki-ingester-0" Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.587437 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/a85d8329-17fe-4d06-b45e-d410514cc210-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"a85d8329-17fe-4d06-b45e-d410514cc210\") " pod="openshift-logging/logging-loki-index-gateway-0" Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.587497 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-637eb038-5939-4c26-99fe-8f93e1a4d0b7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-637eb038-5939-4c26-99fe-8f93e1a4d0b7\") pod \"logging-loki-index-gateway-0\" (UID: \"a85d8329-17fe-4d06-b45e-d410514cc210\") " pod="openshift-logging/logging-loki-index-gateway-0" Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.587529 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84dnq\" (UniqueName: \"kubernetes.io/projected/a85d8329-17fe-4d06-b45e-d410514cc210-kube-api-access-84dnq\") pod \"logging-loki-index-gateway-0\" (UID: \"a85d8329-17fe-4d06-b45e-d410514cc210\") " pod="openshift-logging/logging-loki-index-gateway-0" Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.587567 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-4b36c890-ec8f-4384-af73-bbde2f3e2d2a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4b36c890-ec8f-4384-af73-bbde2f3e2d2a\") pod \"logging-loki-compactor-0\" (UID: \"03508b98-c7f8-4ffd-9417-074307cd588e\") " pod="openshift-logging/logging-loki-compactor-0" Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.587583 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03508b98-c7f8-4ffd-9417-074307cd588e-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"03508b98-c7f8-4ffd-9417-074307cd588e\") " 
pod="openshift-logging/logging-loki-compactor-0" Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.587623 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a85d8329-17fe-4d06-b45e-d410514cc210-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"a85d8329-17fe-4d06-b45e-d410514cc210\") " pod="openshift-logging/logging-loki-index-gateway-0" Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.587651 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/a85d8329-17fe-4d06-b45e-d410514cc210-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"a85d8329-17fe-4d06-b45e-d410514cc210\") " pod="openshift-logging/logging-loki-index-gateway-0" Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.587708 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03508b98-c7f8-4ffd-9417-074307cd588e-config\") pod \"logging-loki-compactor-0\" (UID: \"03508b98-c7f8-4ffd-9417-074307cd588e\") " pod="openshift-logging/logging-loki-compactor-0" Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.587725 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a85d8329-17fe-4d06-b45e-d410514cc210-config\") pod \"logging-loki-index-gateway-0\" (UID: \"a85d8329-17fe-4d06-b45e-d410514cc210\") " pod="openshift-logging/logging-loki-index-gateway-0" Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.587739 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/a85d8329-17fe-4d06-b45e-d410514cc210-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: 
\"a85d8329-17fe-4d06-b45e-d410514cc210\") " pod="openshift-logging/logging-loki-index-gateway-0" Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.587769 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/03508b98-c7f8-4ffd-9417-074307cd588e-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"03508b98-c7f8-4ffd-9417-074307cd588e\") " pod="openshift-logging/logging-loki-compactor-0" Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.587791 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/03508b98-c7f8-4ffd-9417-074307cd588e-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"03508b98-c7f8-4ffd-9417-074307cd588e\") " pod="openshift-logging/logging-loki-compactor-0" Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.587818 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfdrl\" (UniqueName: \"kubernetes.io/projected/03508b98-c7f8-4ffd-9417-074307cd588e-kube-api-access-jfdrl\") pod \"logging-loki-compactor-0\" (UID: \"03508b98-c7f8-4ffd-9417-074307cd588e\") " pod="openshift-logging/logging-loki-compactor-0" Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.587836 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/03508b98-c7f8-4ffd-9417-074307cd588e-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"03508b98-c7f8-4ffd-9417-074307cd588e\") " pod="openshift-logging/logging-loki-compactor-0" Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.589203 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/a85d8329-17fe-4d06-b45e-d410514cc210-config\") pod \"logging-loki-index-gateway-0\" (UID: \"a85d8329-17fe-4d06-b45e-d410514cc210\") " pod="openshift-logging/logging-loki-index-gateway-0" Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.589764 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a85d8329-17fe-4d06-b45e-d410514cc210-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"a85d8329-17fe-4d06-b45e-d410514cc210\") " pod="openshift-logging/logging-loki-index-gateway-0" Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.592395 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/a85d8329-17fe-4d06-b45e-d410514cc210-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"a85d8329-17fe-4d06-b45e-d410514cc210\") " pod="openshift-logging/logging-loki-index-gateway-0" Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.592813 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/a85d8329-17fe-4d06-b45e-d410514cc210-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"a85d8329-17fe-4d06-b45e-d410514cc210\") " pod="openshift-logging/logging-loki-index-gateway-0" Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.597080 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/a85d8329-17fe-4d06-b45e-d410514cc210-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"a85d8329-17fe-4d06-b45e-d410514cc210\") " pod="openshift-logging/logging-loki-index-gateway-0" Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.601281 4743 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME 
capability not set. Skipping MountDevice... Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.601388 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-637eb038-5939-4c26-99fe-8f93e1a4d0b7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-637eb038-5939-4c26-99fe-8f93e1a4d0b7\") pod \"logging-loki-index-gateway-0\" (UID: \"a85d8329-17fe-4d06-b45e-d410514cc210\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/c0cc2154f31092277281762998427c9add28fc5cfd5ebd98143005888c246b4b/globalmount\"" pod="openshift-logging/logging-loki-index-gateway-0" Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.622916 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84dnq\" (UniqueName: \"kubernetes.io/projected/a85d8329-17fe-4d06-b45e-d410514cc210-kube-api-access-84dnq\") pod \"logging-loki-index-gateway-0\" (UID: \"a85d8329-17fe-4d06-b45e-d410514cc210\") " pod="openshift-logging/logging-loki-index-gateway-0" Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.689197 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03508b98-c7f8-4ffd-9417-074307cd588e-config\") pod \"logging-loki-compactor-0\" (UID: \"03508b98-c7f8-4ffd-9417-074307cd588e\") " pod="openshift-logging/logging-loki-compactor-0" Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.689443 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/03508b98-c7f8-4ffd-9417-074307cd588e-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"03508b98-c7f8-4ffd-9417-074307cd588e\") " pod="openshift-logging/logging-loki-compactor-0" Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.689580 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-compactor-http\" 
(UniqueName: \"kubernetes.io/secret/03508b98-c7f8-4ffd-9417-074307cd588e-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"03508b98-c7f8-4ffd-9417-074307cd588e\") " pod="openshift-logging/logging-loki-compactor-0" Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.689700 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfdrl\" (UniqueName: \"kubernetes.io/projected/03508b98-c7f8-4ffd-9417-074307cd588e-kube-api-access-jfdrl\") pod \"logging-loki-compactor-0\" (UID: \"03508b98-c7f8-4ffd-9417-074307cd588e\") " pod="openshift-logging/logging-loki-compactor-0" Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.690125 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/03508b98-c7f8-4ffd-9417-074307cd588e-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"03508b98-c7f8-4ffd-9417-074307cd588e\") " pod="openshift-logging/logging-loki-compactor-0" Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.690291 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-4b36c890-ec8f-4384-af73-bbde2f3e2d2a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4b36c890-ec8f-4384-af73-bbde2f3e2d2a\") pod \"logging-loki-compactor-0\" (UID: \"03508b98-c7f8-4ffd-9417-074307cd588e\") " pod="openshift-logging/logging-loki-compactor-0" Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.690844 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03508b98-c7f8-4ffd-9417-074307cd588e-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"03508b98-c7f8-4ffd-9417-074307cd588e\") " pod="openshift-logging/logging-loki-compactor-0" Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.690155 4743 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03508b98-c7f8-4ffd-9417-074307cd588e-config\") pod \"logging-loki-compactor-0\" (UID: \"03508b98-c7f8-4ffd-9417-074307cd588e\") " pod="openshift-logging/logging-loki-compactor-0" Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.691667 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03508b98-c7f8-4ffd-9417-074307cd588e-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"03508b98-c7f8-4ffd-9417-074307cd588e\") " pod="openshift-logging/logging-loki-compactor-0" Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.695899 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/03508b98-c7f8-4ffd-9417-074307cd588e-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"03508b98-c7f8-4ffd-9417-074307cd588e\") " pod="openshift-logging/logging-loki-compactor-0" Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.696385 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/03508b98-c7f8-4ffd-9417-074307cd588e-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"03508b98-c7f8-4ffd-9417-074307cd588e\") " pod="openshift-logging/logging-loki-compactor-0" Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.696952 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/03508b98-c7f8-4ffd-9417-074307cd588e-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"03508b98-c7f8-4ffd-9417-074307cd588e\") " pod="openshift-logging/logging-loki-compactor-0" Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.703808 4743 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice 
STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.703844 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-4b36c890-ec8f-4384-af73-bbde2f3e2d2a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4b36c890-ec8f-4384-af73-bbde2f3e2d2a\") pod \"logging-loki-compactor-0\" (UID: \"03508b98-c7f8-4ffd-9417-074307cd588e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b5cf30ec8de4e58b147fb99cdb1ef774dfe52db17c419fb0ef26164c4bac4478/globalmount\"" pod="openshift-logging/logging-loki-compactor-0" Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.725058 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfdrl\" (UniqueName: \"kubernetes.io/projected/03508b98-c7f8-4ffd-9417-074307cd588e-kube-api-access-jfdrl\") pod \"logging-loki-compactor-0\" (UID: \"03508b98-c7f8-4ffd-9417-074307cd588e\") " pod="openshift-logging/logging-loki-compactor-0" Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.732598 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-637eb038-5939-4c26-99fe-8f93e1a4d0b7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-637eb038-5939-4c26-99fe-8f93e1a4d0b7\") pod \"logging-loki-index-gateway-0\" (UID: \"a85d8329-17fe-4d06-b45e-d410514cc210\") " pod="openshift-logging/logging-loki-index-gateway-0" Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.741165 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-4b36c890-ec8f-4384-af73-bbde2f3e2d2a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4b36c890-ec8f-4384-af73-bbde2f3e2d2a\") pod \"logging-loki-compactor-0\" (UID: \"03508b98-c7f8-4ffd-9417-074307cd588e\") " pod="openshift-logging/logging-loki-compactor-0" Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.754362 4743 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-c45fcc855-ddvcq" event={"ID":"443c6346-a364-4e67-8c53-2bcd9b1f0927","Type":"ContainerStarted","Data":"7e67fcfd3cc970577c39644a1372b1bbc51865b4da97bf14a1ad3f292130ede7"} Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.755674 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-query-frontend-6fbbbc8b7d-zzmff" event={"ID":"58a37184-c341-454d-b33e-a2af6dc56af3","Type":"ContainerStarted","Data":"faa3c7de66e309d96aec366f55cf0a69e4c05e6d33bc71ea565cf877f3cd3eb7"} Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.756663 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-distributor-6f5f7fff97-72rvc" event={"ID":"1f759884-04cd-4b18-90dd-9b4745c12ba7","Type":"ContainerStarted","Data":"b984dec38b22a8a4db1fbc6ffb606d20258593351dc228d09dfde1d20afd1bd4"} Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.757584 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-querier-5d954896cf-xd7bv" event={"ID":"7b393f9c-b255-4cac-96f2-3d5861cc7cce","Type":"ContainerStarted","Data":"884465a4577847f73865746042b4179303629ebc825d499e0be4eebc4ea0e832"} Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.758598 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-c45fcc855-8mtnp" event={"ID":"60332ecb-34a5-4628-9311-4469d823f589","Type":"ContainerStarted","Data":"f42a48f81a8a91657939f110ed4593c91a1a16596234d47a21a8ed391d9a3691"} Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.768014 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-ingester-0" Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.776314 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-compactor-0" Oct 11 01:03:25 crc kubenswrapper[4743]: I1011 01:03:25.783116 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-index-gateway-0" Oct 11 01:03:26 crc kubenswrapper[4743]: I1011 01:03:26.169768 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-compactor-0"] Oct 11 01:03:26 crc kubenswrapper[4743]: I1011 01:03:26.219666 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"] Oct 11 01:03:26 crc kubenswrapper[4743]: I1011 01:03:26.231042 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-ingester-0"] Oct 11 01:03:26 crc kubenswrapper[4743]: W1011 01:03:26.237919 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbf0c8290_1118_4dbf_a638_bde5c07bdaab.slice/crio-142aee2e607e9d1a7284e48577ea2876a873dbb6b70cc15f47039ea109937b24 WatchSource:0}: Error finding container 142aee2e607e9d1a7284e48577ea2876a873dbb6b70cc15f47039ea109937b24: Status 404 returned error can't find the container with id 142aee2e607e9d1a7284e48577ea2876a873dbb6b70cc15f47039ea109937b24 Oct 11 01:03:26 crc kubenswrapper[4743]: I1011 01:03:26.766740 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-ingester-0" event={"ID":"bf0c8290-1118-4dbf-a638-bde5c07bdaab","Type":"ContainerStarted","Data":"142aee2e607e9d1a7284e48577ea2876a873dbb6b70cc15f47039ea109937b24"} Oct 11 01:03:26 crc kubenswrapper[4743]: I1011 01:03:26.768322 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-index-gateway-0" event={"ID":"a85d8329-17fe-4d06-b45e-d410514cc210","Type":"ContainerStarted","Data":"73fd1ad4ca9c52435208851d62a8753c49000152529a78710bf35d2d2f5d99c0"} Oct 11 01:03:26 crc kubenswrapper[4743]: 
I1011 01:03:26.769154 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-compactor-0" event={"ID":"03508b98-c7f8-4ffd-9417-074307cd588e","Type":"ContainerStarted","Data":"81e9a73b7d9bb058c751608e3873a76083e903c91abb567a4ae944980e35a5b1"} Oct 11 01:03:29 crc kubenswrapper[4743]: I1011 01:03:29.786901 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-ingester-0" event={"ID":"bf0c8290-1118-4dbf-a638-bde5c07bdaab","Type":"ContainerStarted","Data":"bdbf9176afbe4fda328bdca9f459f962df2cc6b5595f54e334836322ca507b97"} Oct 11 01:03:29 crc kubenswrapper[4743]: I1011 01:03:29.787486 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-ingester-0" Oct 11 01:03:29 crc kubenswrapper[4743]: I1011 01:03:29.788949 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-index-gateway-0" event={"ID":"a85d8329-17fe-4d06-b45e-d410514cc210","Type":"ContainerStarted","Data":"6a6b6272587a7ca1db600d8d68ddc993473cac29b853bd5a14a5a1b46e1f3537"} Oct 11 01:03:29 crc kubenswrapper[4743]: I1011 01:03:29.789131 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-index-gateway-0" Oct 11 01:03:29 crc kubenswrapper[4743]: I1011 01:03:29.791151 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-c45fcc855-8mtnp" event={"ID":"60332ecb-34a5-4628-9311-4469d823f589","Type":"ContainerStarted","Data":"81351c44d818aeba99beeee84f49b78f99d6fe579419bb46eb0697468ded81e5"} Oct 11 01:03:29 crc kubenswrapper[4743]: I1011 01:03:29.792959 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-c45fcc855-ddvcq" event={"ID":"443c6346-a364-4e67-8c53-2bcd9b1f0927","Type":"ContainerStarted","Data":"ca0ad7bf8ec83b3fba59128112d5d69ff088421cfc62174dad80f772dbacb8b2"} Oct 11 01:03:29 crc kubenswrapper[4743]: I1011 
01:03:29.794648 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-compactor-0" event={"ID":"03508b98-c7f8-4ffd-9417-074307cd588e","Type":"ContainerStarted","Data":"e602c3f401dcb129aa698774aa9f18b45f8dfe9b8f38dbbf9879775cdc37cbaf"} Oct 11 01:03:29 crc kubenswrapper[4743]: I1011 01:03:29.795651 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-compactor-0" Oct 11 01:03:29 crc kubenswrapper[4743]: I1011 01:03:29.797205 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-query-frontend-6fbbbc8b7d-zzmff" event={"ID":"58a37184-c341-454d-b33e-a2af6dc56af3","Type":"ContainerStarted","Data":"326ec4257e15cb17cbffbae01314ff5d3ca61a28bc90f37364514074ffad5fd2"} Oct 11 01:03:29 crc kubenswrapper[4743]: I1011 01:03:29.797712 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-query-frontend-6fbbbc8b7d-zzmff" Oct 11 01:03:29 crc kubenswrapper[4743]: I1011 01:03:29.799124 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-distributor-6f5f7fff97-72rvc" event={"ID":"1f759884-04cd-4b18-90dd-9b4745c12ba7","Type":"ContainerStarted","Data":"fd1491a03547bcadd6a4844751d2899c128988db644d975dff3c435062c3cb39"} Oct 11 01:03:29 crc kubenswrapper[4743]: I1011 01:03:29.799203 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-distributor-6f5f7fff97-72rvc" Oct 11 01:03:29 crc kubenswrapper[4743]: I1011 01:03:29.800649 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-querier-5d954896cf-xd7bv" event={"ID":"7b393f9c-b255-4cac-96f2-3d5861cc7cce","Type":"ContainerStarted","Data":"6fdb1d346542fee00e9a154ac13b54bf4fcbb6a892caff9a76cdfef569fa4aa3"} Oct 11 01:03:29 crc kubenswrapper[4743]: I1011 01:03:29.800829 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-logging/logging-loki-querier-5d954896cf-xd7bv" Oct 11 01:03:29 crc kubenswrapper[4743]: I1011 01:03:29.820280 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-ingester-0" podStartSLOduration=3.33930372 podStartE2EDuration="5.820258482s" podCreationTimestamp="2025-10-11 01:03:24 +0000 UTC" firstStartedPulling="2025-10-11 01:03:26.240028093 +0000 UTC m=+700.893008490" lastFinishedPulling="2025-10-11 01:03:28.720982855 +0000 UTC m=+703.373963252" observedRunningTime="2025-10-11 01:03:29.818683202 +0000 UTC m=+704.471663599" watchObservedRunningTime="2025-10-11 01:03:29.820258482 +0000 UTC m=+704.473238909" Oct 11 01:03:29 crc kubenswrapper[4743]: I1011 01:03:29.855828 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-query-frontend-6fbbbc8b7d-zzmff" podStartSLOduration=2.0282438799999998 podStartE2EDuration="5.855805347s" podCreationTimestamp="2025-10-11 01:03:24 +0000 UTC" firstStartedPulling="2025-10-11 01:03:24.981088565 +0000 UTC m=+699.634068962" lastFinishedPulling="2025-10-11 01:03:28.808650032 +0000 UTC m=+703.461630429" observedRunningTime="2025-10-11 01:03:29.855325665 +0000 UTC m=+704.508306082" watchObservedRunningTime="2025-10-11 01:03:29.855805347 +0000 UTC m=+704.508785784" Oct 11 01:03:29 crc kubenswrapper[4743]: I1011 01:03:29.878052 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-index-gateway-0" podStartSLOduration=3.313537166 podStartE2EDuration="5.878030479s" podCreationTimestamp="2025-10-11 01:03:24 +0000 UTC" firstStartedPulling="2025-10-11 01:03:26.221028664 +0000 UTC m=+700.874009061" lastFinishedPulling="2025-10-11 01:03:28.785521967 +0000 UTC m=+703.438502374" observedRunningTime="2025-10-11 01:03:29.872514037 +0000 UTC m=+704.525494444" watchObservedRunningTime="2025-10-11 01:03:29.878030479 +0000 UTC m=+704.531010876" Oct 11 01:03:29 crc 
kubenswrapper[4743]: I1011 01:03:29.889017 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-distributor-6f5f7fff97-72rvc" podStartSLOduration=2.114717187 podStartE2EDuration="5.888999452s" podCreationTimestamp="2025-10-11 01:03:24 +0000 UTC" firstStartedPulling="2025-10-11 01:03:25.004227851 +0000 UTC m=+699.657208248" lastFinishedPulling="2025-10-11 01:03:28.778510116 +0000 UTC m=+703.431490513" observedRunningTime="2025-10-11 01:03:29.886631381 +0000 UTC m=+704.539611838" watchObservedRunningTime="2025-10-11 01:03:29.888999452 +0000 UTC m=+704.541979849" Oct 11 01:03:29 crc kubenswrapper[4743]: I1011 01:03:29.907148 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-querier-5d954896cf-xd7bv" podStartSLOduration=2.146566247 podStartE2EDuration="5.907129459s" podCreationTimestamp="2025-10-11 01:03:24 +0000 UTC" firstStartedPulling="2025-10-11 01:03:25.049480486 +0000 UTC m=+699.702460883" lastFinishedPulling="2025-10-11 01:03:28.810043698 +0000 UTC m=+703.463024095" observedRunningTime="2025-10-11 01:03:29.904558402 +0000 UTC m=+704.557538809" watchObservedRunningTime="2025-10-11 01:03:29.907129459 +0000 UTC m=+704.560109856" Oct 11 01:03:29 crc kubenswrapper[4743]: I1011 01:03:29.927732 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-compactor-0" podStartSLOduration=3.273342582 podStartE2EDuration="5.927715558s" podCreationTimestamp="2025-10-11 01:03:24 +0000 UTC" firstStartedPulling="2025-10-11 01:03:26.18747046 +0000 UTC m=+700.840450897" lastFinishedPulling="2025-10-11 01:03:28.841843476 +0000 UTC m=+703.494823873" observedRunningTime="2025-10-11 01:03:29.92428044 +0000 UTC m=+704.577260867" watchObservedRunningTime="2025-10-11 01:03:29.927715558 +0000 UTC m=+704.580695955" Oct 11 01:03:31 crc kubenswrapper[4743]: I1011 01:03:31.816947 4743 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-logging/logging-loki-gateway-c45fcc855-8mtnp" event={"ID":"60332ecb-34a5-4628-9311-4469d823f589","Type":"ContainerStarted","Data":"22b43d95b7e31b2723c107be440b11694bf190d091b5cdecde9a62b5c82d8ef9"} Oct 11 01:03:31 crc kubenswrapper[4743]: I1011 01:03:31.817289 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-c45fcc855-8mtnp" Oct 11 01:03:31 crc kubenswrapper[4743]: I1011 01:03:31.819333 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-c45fcc855-ddvcq" event={"ID":"443c6346-a364-4e67-8c53-2bcd9b1f0927","Type":"ContainerStarted","Data":"028d6a5cfdad6f5bfbe973e5b5309e905c1b24de9607c12717d1671d556acc2e"} Oct 11 01:03:31 crc kubenswrapper[4743]: I1011 01:03:31.835853 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-c45fcc855-8mtnp" Oct 11 01:03:31 crc kubenswrapper[4743]: I1011 01:03:31.840743 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-gateway-c45fcc855-8mtnp" podStartSLOduration=1.964605302 podStartE2EDuration="7.840717001s" podCreationTimestamp="2025-10-11 01:03:24 +0000 UTC" firstStartedPulling="2025-10-11 01:03:25.353297337 +0000 UTC m=+700.006277754" lastFinishedPulling="2025-10-11 01:03:31.229409056 +0000 UTC m=+705.882389453" observedRunningTime="2025-10-11 01:03:31.835009034 +0000 UTC m=+706.487989501" watchObservedRunningTime="2025-10-11 01:03:31.840717001 +0000 UTC m=+706.493697438" Oct 11 01:03:31 crc kubenswrapper[4743]: I1011 01:03:31.879095 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-gateway-c45fcc855-ddvcq" podStartSLOduration=2.061411564 podStartE2EDuration="7.879063108s" podCreationTimestamp="2025-10-11 01:03:24 +0000 UTC" firstStartedPulling="2025-10-11 01:03:25.407950644 +0000 UTC m=+700.060931041" 
lastFinishedPulling="2025-10-11 01:03:31.225602188 +0000 UTC m=+705.878582585" observedRunningTime="2025-10-11 01:03:31.866154346 +0000 UTC m=+706.519134803" watchObservedRunningTime="2025-10-11 01:03:31.879063108 +0000 UTC m=+706.532043555" Oct 11 01:03:32 crc kubenswrapper[4743]: I1011 01:03:32.842376 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-c45fcc855-ddvcq" Oct 11 01:03:32 crc kubenswrapper[4743]: I1011 01:03:32.843091 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-c45fcc855-8mtnp" Oct 11 01:03:32 crc kubenswrapper[4743]: I1011 01:03:32.843129 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-c45fcc855-ddvcq" Oct 11 01:03:32 crc kubenswrapper[4743]: I1011 01:03:32.854151 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-c45fcc855-8mtnp" Oct 11 01:03:32 crc kubenswrapper[4743]: I1011 01:03:32.854398 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-c45fcc855-ddvcq" Oct 11 01:03:32 crc kubenswrapper[4743]: I1011 01:03:32.855830 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-c45fcc855-ddvcq" Oct 11 01:03:44 crc kubenswrapper[4743]: I1011 01:03:44.458180 4743 patch_prober.go:28] interesting pod/machine-config-daemon-cvm72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 01:03:44 crc kubenswrapper[4743]: I1011 01:03:44.458968 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" 
podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 01:03:44 crc kubenswrapper[4743]: I1011 01:03:44.492394 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-distributor-6f5f7fff97-72rvc" Oct 11 01:03:44 crc kubenswrapper[4743]: I1011 01:03:44.625746 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-querier-5d954896cf-xd7bv" Oct 11 01:03:44 crc kubenswrapper[4743]: I1011 01:03:44.744307 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-query-frontend-6fbbbc8b7d-zzmff" Oct 11 01:03:45 crc kubenswrapper[4743]: I1011 01:03:45.777963 4743 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: this instance owns no tokens Oct 11 01:03:45 crc kubenswrapper[4743]: I1011 01:03:45.778469 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="bf0c8290-1118-4dbf-a638-bde5c07bdaab" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Oct 11 01:03:45 crc kubenswrapper[4743]: I1011 01:03:45.786517 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-compactor-0" Oct 11 01:03:45 crc kubenswrapper[4743]: I1011 01:03:45.796239 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-index-gateway-0" Oct 11 01:03:55 crc kubenswrapper[4743]: I1011 01:03:55.777402 4743 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester 
namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: this instance owns no tokens Oct 11 01:03:55 crc kubenswrapper[4743]: I1011 01:03:55.778034 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="bf0c8290-1118-4dbf-a638-bde5c07bdaab" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Oct 11 01:04:05 crc kubenswrapper[4743]: I1011 01:04:05.775990 4743 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: waiting for 15s after being ready Oct 11 01:04:05 crc kubenswrapper[4743]: I1011 01:04:05.776713 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="bf0c8290-1118-4dbf-a638-bde5c07bdaab" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Oct 11 01:04:14 crc kubenswrapper[4743]: I1011 01:04:14.458633 4743 patch_prober.go:28] interesting pod/machine-config-daemon-cvm72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 01:04:14 crc kubenswrapper[4743]: I1011 01:04:14.459274 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 01:04:14 crc kubenswrapper[4743]: I1011 01:04:14.459336 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" Oct 11 01:04:14 crc kubenswrapper[4743]: I1011 01:04:14.460176 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"88127d52f7db156c5804bc403a408594bcfb43a90269eb1302483bd25dec7ebe"} pod="openshift-machine-config-operator/machine-config-daemon-cvm72" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 11 01:04:14 crc kubenswrapper[4743]: I1011 01:04:14.460264 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" containerID="cri-o://88127d52f7db156c5804bc403a408594bcfb43a90269eb1302483bd25dec7ebe" gracePeriod=600 Oct 11 01:04:15 crc kubenswrapper[4743]: I1011 01:04:15.182380 4743 generic.go:334] "Generic (PLEG): container finished" podID="add92263-e252-446b-95de-092585b4357f" containerID="88127d52f7db156c5804bc403a408594bcfb43a90269eb1302483bd25dec7ebe" exitCode=0 Oct 11 01:04:15 crc kubenswrapper[4743]: I1011 01:04:15.183049 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" event={"ID":"add92263-e252-446b-95de-092585b4357f","Type":"ContainerDied","Data":"88127d52f7db156c5804bc403a408594bcfb43a90269eb1302483bd25dec7ebe"} Oct 11 01:04:15 crc kubenswrapper[4743]: I1011 01:04:15.183180 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" event={"ID":"add92263-e252-446b-95de-092585b4357f","Type":"ContainerStarted","Data":"06d63da6139508ac6d7d3ccf51eec7dcc1dbdfea0379b704f2d1844d8e86a974"} Oct 11 01:04:15 crc kubenswrapper[4743]: I1011 01:04:15.183343 4743 scope.go:117] "RemoveContainer" 
containerID="f4a50228ff369b6861f5b6579c1f5b36360b57624267c00cfb8d313cffab1c5d" Oct 11 01:04:15 crc kubenswrapper[4743]: I1011 01:04:15.780383 4743 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: waiting for 15s after being ready Oct 11 01:04:15 crc kubenswrapper[4743]: I1011 01:04:15.780807 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="bf0c8290-1118-4dbf-a638-bde5c07bdaab" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Oct 11 01:04:24 crc kubenswrapper[4743]: I1011 01:04:24.428966 4743 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 11 01:04:25 crc kubenswrapper[4743]: I1011 01:04:25.772722 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-ingester-0" Oct 11 01:04:45 crc kubenswrapper[4743]: I1011 01:04:45.714962 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/collector-c9h4b"] Oct 11 01:04:45 crc kubenswrapper[4743]: I1011 01:04:45.716178 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/collector-c9h4b" Oct 11 01:04:45 crc kubenswrapper[4743]: I1011 01:04:45.718339 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-dockercfg-mb62f" Oct 11 01:04:45 crc kubenswrapper[4743]: I1011 01:04:45.718527 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-syslog-receiver" Oct 11 01:04:45 crc kubenswrapper[4743]: I1011 01:04:45.719963 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-config" Oct 11 01:04:45 crc kubenswrapper[4743]: I1011 01:04:45.720481 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-token" Oct 11 01:04:45 crc kubenswrapper[4743]: I1011 01:04:45.720742 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-metrics" Oct 11 01:04:45 crc kubenswrapper[4743]: I1011 01:04:45.725928 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-trustbundle" Oct 11 01:04:45 crc kubenswrapper[4743]: I1011 01:04:45.730207 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-c9h4b"] Oct 11 01:04:45 crc kubenswrapper[4743]: I1011 01:04:45.781795 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-logging/collector-c9h4b"] Oct 11 01:04:45 crc kubenswrapper[4743]: E1011 01:04:45.782221 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[collector-syslog-receiver collector-token config config-openshift-service-cacrt datadir entrypoint kube-api-access-v7jw9 metrics sa-token tmp trusted-ca], unattached volumes=[], failed to process volumes=[collector-syslog-receiver collector-token config config-openshift-service-cacrt datadir entrypoint kube-api-access-v7jw9 metrics sa-token tmp trusted-ca]: context canceled" pod="openshift-logging/collector-c9h4b" 
podUID="6082455e-5a13-4b35-b302-1c436af36e93" Oct 11 01:04:45 crc kubenswrapper[4743]: I1011 01:04:45.866431 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/6082455e-5a13-4b35-b302-1c436af36e93-entrypoint\") pod \"collector-c9h4b\" (UID: \"6082455e-5a13-4b35-b302-1c436af36e93\") " pod="openshift-logging/collector-c9h4b" Oct 11 01:04:45 crc kubenswrapper[4743]: I1011 01:04:45.866470 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6082455e-5a13-4b35-b302-1c436af36e93-trusted-ca\") pod \"collector-c9h4b\" (UID: \"6082455e-5a13-4b35-b302-1c436af36e93\") " pod="openshift-logging/collector-c9h4b" Oct 11 01:04:45 crc kubenswrapper[4743]: I1011 01:04:45.866498 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/6082455e-5a13-4b35-b302-1c436af36e93-collector-syslog-receiver\") pod \"collector-c9h4b\" (UID: \"6082455e-5a13-4b35-b302-1c436af36e93\") " pod="openshift-logging/collector-c9h4b" Oct 11 01:04:45 crc kubenswrapper[4743]: I1011 01:04:45.866526 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6082455e-5a13-4b35-b302-1c436af36e93-config\") pod \"collector-c9h4b\" (UID: \"6082455e-5a13-4b35-b302-1c436af36e93\") " pod="openshift-logging/collector-c9h4b" Oct 11 01:04:45 crc kubenswrapper[4743]: I1011 01:04:45.866544 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/6082455e-5a13-4b35-b302-1c436af36e93-sa-token\") pod \"collector-c9h4b\" (UID: \"6082455e-5a13-4b35-b302-1c436af36e93\") " pod="openshift-logging/collector-c9h4b" Oct 11 01:04:45 crc 
kubenswrapper[4743]: I1011 01:04:45.866570 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/6082455e-5a13-4b35-b302-1c436af36e93-collector-token\") pod \"collector-c9h4b\" (UID: \"6082455e-5a13-4b35-b302-1c436af36e93\") " pod="openshift-logging/collector-c9h4b" Oct 11 01:04:45 crc kubenswrapper[4743]: I1011 01:04:45.866586 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6082455e-5a13-4b35-b302-1c436af36e93-tmp\") pod \"collector-c9h4b\" (UID: \"6082455e-5a13-4b35-b302-1c436af36e93\") " pod="openshift-logging/collector-c9h4b" Oct 11 01:04:45 crc kubenswrapper[4743]: I1011 01:04:45.866602 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/6082455e-5a13-4b35-b302-1c436af36e93-config-openshift-service-cacrt\") pod \"collector-c9h4b\" (UID: \"6082455e-5a13-4b35-b302-1c436af36e93\") " pod="openshift-logging/collector-c9h4b" Oct 11 01:04:45 crc kubenswrapper[4743]: I1011 01:04:45.866615 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7jw9\" (UniqueName: \"kubernetes.io/projected/6082455e-5a13-4b35-b302-1c436af36e93-kube-api-access-v7jw9\") pod \"collector-c9h4b\" (UID: \"6082455e-5a13-4b35-b302-1c436af36e93\") " pod="openshift-logging/collector-c9h4b" Oct 11 01:04:45 crc kubenswrapper[4743]: I1011 01:04:45.866658 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/6082455e-5a13-4b35-b302-1c436af36e93-datadir\") pod \"collector-c9h4b\" (UID: \"6082455e-5a13-4b35-b302-1c436af36e93\") " pod="openshift-logging/collector-c9h4b" Oct 11 01:04:45 crc kubenswrapper[4743]: I1011 
01:04:45.866677 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/6082455e-5a13-4b35-b302-1c436af36e93-metrics\") pod \"collector-c9h4b\" (UID: \"6082455e-5a13-4b35-b302-1c436af36e93\") " pod="openshift-logging/collector-c9h4b" Oct 11 01:04:45 crc kubenswrapper[4743]: I1011 01:04:45.968142 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/6082455e-5a13-4b35-b302-1c436af36e93-datadir\") pod \"collector-c9h4b\" (UID: \"6082455e-5a13-4b35-b302-1c436af36e93\") " pod="openshift-logging/collector-c9h4b" Oct 11 01:04:45 crc kubenswrapper[4743]: I1011 01:04:45.968193 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/6082455e-5a13-4b35-b302-1c436af36e93-metrics\") pod \"collector-c9h4b\" (UID: \"6082455e-5a13-4b35-b302-1c436af36e93\") " pod="openshift-logging/collector-c9h4b" Oct 11 01:04:45 crc kubenswrapper[4743]: I1011 01:04:45.968224 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/6082455e-5a13-4b35-b302-1c436af36e93-entrypoint\") pod \"collector-c9h4b\" (UID: \"6082455e-5a13-4b35-b302-1c436af36e93\") " pod="openshift-logging/collector-c9h4b" Oct 11 01:04:45 crc kubenswrapper[4743]: I1011 01:04:45.968309 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/6082455e-5a13-4b35-b302-1c436af36e93-datadir\") pod \"collector-c9h4b\" (UID: \"6082455e-5a13-4b35-b302-1c436af36e93\") " pod="openshift-logging/collector-c9h4b" Oct 11 01:04:45 crc kubenswrapper[4743]: I1011 01:04:45.969067 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6082455e-5a13-4b35-b302-1c436af36e93-trusted-ca\") 
pod \"collector-c9h4b\" (UID: \"6082455e-5a13-4b35-b302-1c436af36e93\") " pod="openshift-logging/collector-c9h4b" Oct 11 01:04:45 crc kubenswrapper[4743]: I1011 01:04:45.969229 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/6082455e-5a13-4b35-b302-1c436af36e93-collector-syslog-receiver\") pod \"collector-c9h4b\" (UID: \"6082455e-5a13-4b35-b302-1c436af36e93\") " pod="openshift-logging/collector-c9h4b" Oct 11 01:04:45 crc kubenswrapper[4743]: I1011 01:04:45.969281 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/6082455e-5a13-4b35-b302-1c436af36e93-entrypoint\") pod \"collector-c9h4b\" (UID: \"6082455e-5a13-4b35-b302-1c436af36e93\") " pod="openshift-logging/collector-c9h4b" Oct 11 01:04:45 crc kubenswrapper[4743]: I1011 01:04:45.969299 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6082455e-5a13-4b35-b302-1c436af36e93-config\") pod \"collector-c9h4b\" (UID: \"6082455e-5a13-4b35-b302-1c436af36e93\") " pod="openshift-logging/collector-c9h4b" Oct 11 01:04:45 crc kubenswrapper[4743]: I1011 01:04:45.969358 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/6082455e-5a13-4b35-b302-1c436af36e93-sa-token\") pod \"collector-c9h4b\" (UID: \"6082455e-5a13-4b35-b302-1c436af36e93\") " pod="openshift-logging/collector-c9h4b" Oct 11 01:04:45 crc kubenswrapper[4743]: I1011 01:04:45.969380 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/6082455e-5a13-4b35-b302-1c436af36e93-collector-token\") pod \"collector-c9h4b\" (UID: \"6082455e-5a13-4b35-b302-1c436af36e93\") " pod="openshift-logging/collector-c9h4b" Oct 11 01:04:45 crc kubenswrapper[4743]: I1011 
01:04:45.969408 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6082455e-5a13-4b35-b302-1c436af36e93-tmp\") pod \"collector-c9h4b\" (UID: \"6082455e-5a13-4b35-b302-1c436af36e93\") " pod="openshift-logging/collector-c9h4b" Oct 11 01:04:45 crc kubenswrapper[4743]: I1011 01:04:45.969423 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/6082455e-5a13-4b35-b302-1c436af36e93-config-openshift-service-cacrt\") pod \"collector-c9h4b\" (UID: \"6082455e-5a13-4b35-b302-1c436af36e93\") " pod="openshift-logging/collector-c9h4b" Oct 11 01:04:45 crc kubenswrapper[4743]: I1011 01:04:45.969443 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7jw9\" (UniqueName: \"kubernetes.io/projected/6082455e-5a13-4b35-b302-1c436af36e93-kube-api-access-v7jw9\") pod \"collector-c9h4b\" (UID: \"6082455e-5a13-4b35-b302-1c436af36e93\") " pod="openshift-logging/collector-c9h4b" Oct 11 01:04:45 crc kubenswrapper[4743]: I1011 01:04:45.970065 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6082455e-5a13-4b35-b302-1c436af36e93-trusted-ca\") pod \"collector-c9h4b\" (UID: \"6082455e-5a13-4b35-b302-1c436af36e93\") " pod="openshift-logging/collector-c9h4b" Oct 11 01:04:45 crc kubenswrapper[4743]: I1011 01:04:45.970201 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6082455e-5a13-4b35-b302-1c436af36e93-config\") pod \"collector-c9h4b\" (UID: \"6082455e-5a13-4b35-b302-1c436af36e93\") " pod="openshift-logging/collector-c9h4b" Oct 11 01:04:45 crc kubenswrapper[4743]: I1011 01:04:45.970452 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-openshift-service-cacrt\" (UniqueName: 
\"kubernetes.io/configmap/6082455e-5a13-4b35-b302-1c436af36e93-config-openshift-service-cacrt\") pod \"collector-c9h4b\" (UID: \"6082455e-5a13-4b35-b302-1c436af36e93\") " pod="openshift-logging/collector-c9h4b" Oct 11 01:04:45 crc kubenswrapper[4743]: I1011 01:04:45.973967 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/6082455e-5a13-4b35-b302-1c436af36e93-metrics\") pod \"collector-c9h4b\" (UID: \"6082455e-5a13-4b35-b302-1c436af36e93\") " pod="openshift-logging/collector-c9h4b" Oct 11 01:04:45 crc kubenswrapper[4743]: I1011 01:04:45.974139 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/6082455e-5a13-4b35-b302-1c436af36e93-collector-token\") pod \"collector-c9h4b\" (UID: \"6082455e-5a13-4b35-b302-1c436af36e93\") " pod="openshift-logging/collector-c9h4b" Oct 11 01:04:45 crc kubenswrapper[4743]: I1011 01:04:45.974156 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/6082455e-5a13-4b35-b302-1c436af36e93-collector-syslog-receiver\") pod \"collector-c9h4b\" (UID: \"6082455e-5a13-4b35-b302-1c436af36e93\") " pod="openshift-logging/collector-c9h4b" Oct 11 01:04:45 crc kubenswrapper[4743]: I1011 01:04:45.982094 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6082455e-5a13-4b35-b302-1c436af36e93-tmp\") pod \"collector-c9h4b\" (UID: \"6082455e-5a13-4b35-b302-1c436af36e93\") " pod="openshift-logging/collector-c9h4b" Oct 11 01:04:45 crc kubenswrapper[4743]: I1011 01:04:45.991889 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7jw9\" (UniqueName: \"kubernetes.io/projected/6082455e-5a13-4b35-b302-1c436af36e93-kube-api-access-v7jw9\") pod \"collector-c9h4b\" (UID: \"6082455e-5a13-4b35-b302-1c436af36e93\") " 
pod="openshift-logging/collector-c9h4b" Oct 11 01:04:45 crc kubenswrapper[4743]: I1011 01:04:45.999346 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/6082455e-5a13-4b35-b302-1c436af36e93-sa-token\") pod \"collector-c9h4b\" (UID: \"6082455e-5a13-4b35-b302-1c436af36e93\") " pod="openshift-logging/collector-c9h4b" Oct 11 01:04:46 crc kubenswrapper[4743]: I1011 01:04:46.441561 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-c9h4b" Oct 11 01:04:46 crc kubenswrapper[4743]: I1011 01:04:46.454557 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-c9h4b" Oct 11 01:04:46 crc kubenswrapper[4743]: I1011 01:04:46.578424 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6082455e-5a13-4b35-b302-1c436af36e93-trusted-ca\") pod \"6082455e-5a13-4b35-b302-1c436af36e93\" (UID: \"6082455e-5a13-4b35-b302-1c436af36e93\") " Oct 11 01:04:46 crc kubenswrapper[4743]: I1011 01:04:46.578502 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/6082455e-5a13-4b35-b302-1c436af36e93-datadir\") pod \"6082455e-5a13-4b35-b302-1c436af36e93\" (UID: \"6082455e-5a13-4b35-b302-1c436af36e93\") " Oct 11 01:04:46 crc kubenswrapper[4743]: I1011 01:04:46.578553 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7jw9\" (UniqueName: \"kubernetes.io/projected/6082455e-5a13-4b35-b302-1c436af36e93-kube-api-access-v7jw9\") pod \"6082455e-5a13-4b35-b302-1c436af36e93\" (UID: \"6082455e-5a13-4b35-b302-1c436af36e93\") " Oct 11 01:04:46 crc kubenswrapper[4743]: I1011 01:04:46.578589 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6082455e-5a13-4b35-b302-1c436af36e93-config\") pod \"6082455e-5a13-4b35-b302-1c436af36e93\" (UID: \"6082455e-5a13-4b35-b302-1c436af36e93\") " Oct 11 01:04:46 crc kubenswrapper[4743]: I1011 01:04:46.578655 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/6082455e-5a13-4b35-b302-1c436af36e93-entrypoint\") pod \"6082455e-5a13-4b35-b302-1c436af36e93\" (UID: \"6082455e-5a13-4b35-b302-1c436af36e93\") " Oct 11 01:04:46 crc kubenswrapper[4743]: I1011 01:04:46.578694 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/6082455e-5a13-4b35-b302-1c436af36e93-collector-syslog-receiver\") pod \"6082455e-5a13-4b35-b302-1c436af36e93\" (UID: \"6082455e-5a13-4b35-b302-1c436af36e93\") " Oct 11 01:04:46 crc kubenswrapper[4743]: I1011 01:04:46.578687 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6082455e-5a13-4b35-b302-1c436af36e93-datadir" (OuterVolumeSpecName: "datadir") pod "6082455e-5a13-4b35-b302-1c436af36e93" (UID: "6082455e-5a13-4b35-b302-1c436af36e93"). InnerVolumeSpecName "datadir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 01:04:46 crc kubenswrapper[4743]: I1011 01:04:46.578738 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6082455e-5a13-4b35-b302-1c436af36e93-tmp\") pod \"6082455e-5a13-4b35-b302-1c436af36e93\" (UID: \"6082455e-5a13-4b35-b302-1c436af36e93\") " Oct 11 01:04:46 crc kubenswrapper[4743]: I1011 01:04:46.579263 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/6082455e-5a13-4b35-b302-1c436af36e93-collector-token\") pod \"6082455e-5a13-4b35-b302-1c436af36e93\" (UID: \"6082455e-5a13-4b35-b302-1c436af36e93\") " Oct 11 01:04:46 crc kubenswrapper[4743]: I1011 01:04:46.579322 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/6082455e-5a13-4b35-b302-1c436af36e93-sa-token\") pod \"6082455e-5a13-4b35-b302-1c436af36e93\" (UID: \"6082455e-5a13-4b35-b302-1c436af36e93\") " Oct 11 01:04:46 crc kubenswrapper[4743]: I1011 01:04:46.579335 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6082455e-5a13-4b35-b302-1c436af36e93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "6082455e-5a13-4b35-b302-1c436af36e93" (UID: "6082455e-5a13-4b35-b302-1c436af36e93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:04:46 crc kubenswrapper[4743]: I1011 01:04:46.579547 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6082455e-5a13-4b35-b302-1c436af36e93-entrypoint" (OuterVolumeSpecName: "entrypoint") pod "6082455e-5a13-4b35-b302-1c436af36e93" (UID: "6082455e-5a13-4b35-b302-1c436af36e93"). InnerVolumeSpecName "entrypoint". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:04:46 crc kubenswrapper[4743]: I1011 01:04:46.579915 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/6082455e-5a13-4b35-b302-1c436af36e93-config-openshift-service-cacrt\") pod \"6082455e-5a13-4b35-b302-1c436af36e93\" (UID: \"6082455e-5a13-4b35-b302-1c436af36e93\") " Oct 11 01:04:46 crc kubenswrapper[4743]: I1011 01:04:46.579973 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/6082455e-5a13-4b35-b302-1c436af36e93-metrics\") pod \"6082455e-5a13-4b35-b302-1c436af36e93\" (UID: \"6082455e-5a13-4b35-b302-1c436af36e93\") " Oct 11 01:04:46 crc kubenswrapper[4743]: I1011 01:04:46.580063 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6082455e-5a13-4b35-b302-1c436af36e93-config" (OuterVolumeSpecName: "config") pod "6082455e-5a13-4b35-b302-1c436af36e93" (UID: "6082455e-5a13-4b35-b302-1c436af36e93"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:04:46 crc kubenswrapper[4743]: I1011 01:04:46.580777 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6082455e-5a13-4b35-b302-1c436af36e93-config-openshift-service-cacrt" (OuterVolumeSpecName: "config-openshift-service-cacrt") pod "6082455e-5a13-4b35-b302-1c436af36e93" (UID: "6082455e-5a13-4b35-b302-1c436af36e93"). InnerVolumeSpecName "config-openshift-service-cacrt". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:04:46 crc kubenswrapper[4743]: I1011 01:04:46.581048 4743 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6082455e-5a13-4b35-b302-1c436af36e93-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 11 01:04:46 crc kubenswrapper[4743]: I1011 01:04:46.581074 4743 reconciler_common.go:293] "Volume detached for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/6082455e-5a13-4b35-b302-1c436af36e93-datadir\") on node \"crc\" DevicePath \"\"" Oct 11 01:04:46 crc kubenswrapper[4743]: I1011 01:04:46.581092 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6082455e-5a13-4b35-b302-1c436af36e93-config\") on node \"crc\" DevicePath \"\"" Oct 11 01:04:46 crc kubenswrapper[4743]: I1011 01:04:46.581108 4743 reconciler_common.go:293] "Volume detached for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/6082455e-5a13-4b35-b302-1c436af36e93-entrypoint\") on node \"crc\" DevicePath \"\"" Oct 11 01:04:46 crc kubenswrapper[4743]: I1011 01:04:46.581125 4743 reconciler_common.go:293] "Volume detached for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/6082455e-5a13-4b35-b302-1c436af36e93-config-openshift-service-cacrt\") on node \"crc\" DevicePath \"\"" Oct 11 01:04:46 crc kubenswrapper[4743]: I1011 01:04:46.583464 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6082455e-5a13-4b35-b302-1c436af36e93-tmp" (OuterVolumeSpecName: "tmp") pod "6082455e-5a13-4b35-b302-1c436af36e93" (UID: "6082455e-5a13-4b35-b302-1c436af36e93"). InnerVolumeSpecName "tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:04:46 crc kubenswrapper[4743]: I1011 01:04:46.584384 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6082455e-5a13-4b35-b302-1c436af36e93-collector-token" (OuterVolumeSpecName: "collector-token") pod "6082455e-5a13-4b35-b302-1c436af36e93" (UID: "6082455e-5a13-4b35-b302-1c436af36e93"). InnerVolumeSpecName "collector-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:04:46 crc kubenswrapper[4743]: I1011 01:04:46.585419 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6082455e-5a13-4b35-b302-1c436af36e93-kube-api-access-v7jw9" (OuterVolumeSpecName: "kube-api-access-v7jw9") pod "6082455e-5a13-4b35-b302-1c436af36e93" (UID: "6082455e-5a13-4b35-b302-1c436af36e93"). InnerVolumeSpecName "kube-api-access-v7jw9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:04:46 crc kubenswrapper[4743]: I1011 01:04:46.585766 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6082455e-5a13-4b35-b302-1c436af36e93-sa-token" (OuterVolumeSpecName: "sa-token") pod "6082455e-5a13-4b35-b302-1c436af36e93" (UID: "6082455e-5a13-4b35-b302-1c436af36e93"). InnerVolumeSpecName "sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:04:46 crc kubenswrapper[4743]: I1011 01:04:46.587397 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6082455e-5a13-4b35-b302-1c436af36e93-metrics" (OuterVolumeSpecName: "metrics") pod "6082455e-5a13-4b35-b302-1c436af36e93" (UID: "6082455e-5a13-4b35-b302-1c436af36e93"). InnerVolumeSpecName "metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:04:46 crc kubenswrapper[4743]: I1011 01:04:46.591043 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6082455e-5a13-4b35-b302-1c436af36e93-collector-syslog-receiver" (OuterVolumeSpecName: "collector-syslog-receiver") pod "6082455e-5a13-4b35-b302-1c436af36e93" (UID: "6082455e-5a13-4b35-b302-1c436af36e93"). InnerVolumeSpecName "collector-syslog-receiver". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:04:46 crc kubenswrapper[4743]: I1011 01:04:46.682527 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7jw9\" (UniqueName: \"kubernetes.io/projected/6082455e-5a13-4b35-b302-1c436af36e93-kube-api-access-v7jw9\") on node \"crc\" DevicePath \"\"" Oct 11 01:04:46 crc kubenswrapper[4743]: I1011 01:04:46.682575 4743 reconciler_common.go:293] "Volume detached for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/6082455e-5a13-4b35-b302-1c436af36e93-collector-syslog-receiver\") on node \"crc\" DevicePath \"\"" Oct 11 01:04:46 crc kubenswrapper[4743]: I1011 01:04:46.682593 4743 reconciler_common.go:293] "Volume detached for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6082455e-5a13-4b35-b302-1c436af36e93-tmp\") on node \"crc\" DevicePath \"\"" Oct 11 01:04:46 crc kubenswrapper[4743]: I1011 01:04:46.682613 4743 reconciler_common.go:293] "Volume detached for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/6082455e-5a13-4b35-b302-1c436af36e93-collector-token\") on node \"crc\" DevicePath \"\"" Oct 11 01:04:46 crc kubenswrapper[4743]: I1011 01:04:46.682631 4743 reconciler_common.go:293] "Volume detached for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/6082455e-5a13-4b35-b302-1c436af36e93-sa-token\") on node \"crc\" DevicePath \"\"" Oct 11 01:04:46 crc kubenswrapper[4743]: I1011 01:04:46.682648 4743 reconciler_common.go:293] "Volume detached for volume \"metrics\" 
(UniqueName: \"kubernetes.io/secret/6082455e-5a13-4b35-b302-1c436af36e93-metrics\") on node \"crc\" DevicePath \"\"" Oct 11 01:04:47 crc kubenswrapper[4743]: I1011 01:04:47.448706 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-c9h4b" Oct 11 01:04:47 crc kubenswrapper[4743]: I1011 01:04:47.526239 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-logging/collector-c9h4b"] Oct 11 01:04:47 crc kubenswrapper[4743]: I1011 01:04:47.544323 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-logging/collector-c9h4b"] Oct 11 01:04:47 crc kubenswrapper[4743]: I1011 01:04:47.551298 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/collector-fdslp"] Oct 11 01:04:47 crc kubenswrapper[4743]: I1011 01:04:47.552389 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-fdslp" Oct 11 01:04:47 crc kubenswrapper[4743]: I1011 01:04:47.554898 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-metrics" Oct 11 01:04:47 crc kubenswrapper[4743]: I1011 01:04:47.555084 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-syslog-receiver" Oct 11 01:04:47 crc kubenswrapper[4743]: I1011 01:04:47.555214 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-config" Oct 11 01:04:47 crc kubenswrapper[4743]: I1011 01:04:47.555391 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-token" Oct 11 01:04:47 crc kubenswrapper[4743]: I1011 01:04:47.556311 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-dockercfg-mb62f" Oct 11 01:04:47 crc kubenswrapper[4743]: I1011 01:04:47.558539 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-fdslp"] Oct 11 01:04:47 crc 
kubenswrapper[4743]: I1011 01:04:47.561098 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-trustbundle" Oct 11 01:04:47 crc kubenswrapper[4743]: I1011 01:04:47.698803 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/15f63aaf-1998-4daa-8ebf-1f9455b483e5-sa-token\") pod \"collector-fdslp\" (UID: \"15f63aaf-1998-4daa-8ebf-1f9455b483e5\") " pod="openshift-logging/collector-fdslp" Oct 11 01:04:47 crc kubenswrapper[4743]: I1011 01:04:47.699063 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/15f63aaf-1998-4daa-8ebf-1f9455b483e5-entrypoint\") pod \"collector-fdslp\" (UID: \"15f63aaf-1998-4daa-8ebf-1f9455b483e5\") " pod="openshift-logging/collector-fdslp" Oct 11 01:04:47 crc kubenswrapper[4743]: I1011 01:04:47.699135 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/15f63aaf-1998-4daa-8ebf-1f9455b483e5-config-openshift-service-cacrt\") pod \"collector-fdslp\" (UID: \"15f63aaf-1998-4daa-8ebf-1f9455b483e5\") " pod="openshift-logging/collector-fdslp" Oct 11 01:04:47 crc kubenswrapper[4743]: I1011 01:04:47.699160 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15f63aaf-1998-4daa-8ebf-1f9455b483e5-config\") pod \"collector-fdslp\" (UID: \"15f63aaf-1998-4daa-8ebf-1f9455b483e5\") " pod="openshift-logging/collector-fdslp" Oct 11 01:04:47 crc kubenswrapper[4743]: I1011 01:04:47.699188 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/15f63aaf-1998-4daa-8ebf-1f9455b483e5-datadir\") pod \"collector-fdslp\" 
(UID: \"15f63aaf-1998-4daa-8ebf-1f9455b483e5\") " pod="openshift-logging/collector-fdslp" Oct 11 01:04:47 crc kubenswrapper[4743]: I1011 01:04:47.699230 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/15f63aaf-1998-4daa-8ebf-1f9455b483e5-tmp\") pod \"collector-fdslp\" (UID: \"15f63aaf-1998-4daa-8ebf-1f9455b483e5\") " pod="openshift-logging/collector-fdslp" Oct 11 01:04:47 crc kubenswrapper[4743]: I1011 01:04:47.699271 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/15f63aaf-1998-4daa-8ebf-1f9455b483e5-trusted-ca\") pod \"collector-fdslp\" (UID: \"15f63aaf-1998-4daa-8ebf-1f9455b483e5\") " pod="openshift-logging/collector-fdslp" Oct 11 01:04:47 crc kubenswrapper[4743]: I1011 01:04:47.699317 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/15f63aaf-1998-4daa-8ebf-1f9455b483e5-metrics\") pod \"collector-fdslp\" (UID: \"15f63aaf-1998-4daa-8ebf-1f9455b483e5\") " pod="openshift-logging/collector-fdslp" Oct 11 01:04:47 crc kubenswrapper[4743]: I1011 01:04:47.699373 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/15f63aaf-1998-4daa-8ebf-1f9455b483e5-collector-token\") pod \"collector-fdslp\" (UID: \"15f63aaf-1998-4daa-8ebf-1f9455b483e5\") " pod="openshift-logging/collector-fdslp" Oct 11 01:04:47 crc kubenswrapper[4743]: I1011 01:04:47.699406 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/15f63aaf-1998-4daa-8ebf-1f9455b483e5-collector-syslog-receiver\") pod \"collector-fdslp\" (UID: \"15f63aaf-1998-4daa-8ebf-1f9455b483e5\") " 
pod="openshift-logging/collector-fdslp" Oct 11 01:04:47 crc kubenswrapper[4743]: I1011 01:04:47.699442 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bbxw\" (UniqueName: \"kubernetes.io/projected/15f63aaf-1998-4daa-8ebf-1f9455b483e5-kube-api-access-5bbxw\") pod \"collector-fdslp\" (UID: \"15f63aaf-1998-4daa-8ebf-1f9455b483e5\") " pod="openshift-logging/collector-fdslp" Oct 11 01:04:47 crc kubenswrapper[4743]: I1011 01:04:47.800513 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/15f63aaf-1998-4daa-8ebf-1f9455b483e5-trusted-ca\") pod \"collector-fdslp\" (UID: \"15f63aaf-1998-4daa-8ebf-1f9455b483e5\") " pod="openshift-logging/collector-fdslp" Oct 11 01:04:47 crc kubenswrapper[4743]: I1011 01:04:47.800625 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/15f63aaf-1998-4daa-8ebf-1f9455b483e5-metrics\") pod \"collector-fdslp\" (UID: \"15f63aaf-1998-4daa-8ebf-1f9455b483e5\") " pod="openshift-logging/collector-fdslp" Oct 11 01:04:47 crc kubenswrapper[4743]: I1011 01:04:47.800711 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/15f63aaf-1998-4daa-8ebf-1f9455b483e5-collector-token\") pod \"collector-fdslp\" (UID: \"15f63aaf-1998-4daa-8ebf-1f9455b483e5\") " pod="openshift-logging/collector-fdslp" Oct 11 01:04:47 crc kubenswrapper[4743]: I1011 01:04:47.800779 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/15f63aaf-1998-4daa-8ebf-1f9455b483e5-collector-syslog-receiver\") pod \"collector-fdslp\" (UID: \"15f63aaf-1998-4daa-8ebf-1f9455b483e5\") " pod="openshift-logging/collector-fdslp" Oct 11 01:04:47 crc kubenswrapper[4743]: I1011 01:04:47.800938 4743 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bbxw\" (UniqueName: \"kubernetes.io/projected/15f63aaf-1998-4daa-8ebf-1f9455b483e5-kube-api-access-5bbxw\") pod \"collector-fdslp\" (UID: \"15f63aaf-1998-4daa-8ebf-1f9455b483e5\") " pod="openshift-logging/collector-fdslp" Oct 11 01:04:47 crc kubenswrapper[4743]: I1011 01:04:47.801004 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/15f63aaf-1998-4daa-8ebf-1f9455b483e5-sa-token\") pod \"collector-fdslp\" (UID: \"15f63aaf-1998-4daa-8ebf-1f9455b483e5\") " pod="openshift-logging/collector-fdslp" Oct 11 01:04:47 crc kubenswrapper[4743]: I1011 01:04:47.801128 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/15f63aaf-1998-4daa-8ebf-1f9455b483e5-entrypoint\") pod \"collector-fdslp\" (UID: \"15f63aaf-1998-4daa-8ebf-1f9455b483e5\") " pod="openshift-logging/collector-fdslp" Oct 11 01:04:47 crc kubenswrapper[4743]: I1011 01:04:47.801186 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/15f63aaf-1998-4daa-8ebf-1f9455b483e5-config-openshift-service-cacrt\") pod \"collector-fdslp\" (UID: \"15f63aaf-1998-4daa-8ebf-1f9455b483e5\") " pod="openshift-logging/collector-fdslp" Oct 11 01:04:47 crc kubenswrapper[4743]: I1011 01:04:47.801216 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15f63aaf-1998-4daa-8ebf-1f9455b483e5-config\") pod \"collector-fdslp\" (UID: \"15f63aaf-1998-4daa-8ebf-1f9455b483e5\") " pod="openshift-logging/collector-fdslp" Oct 11 01:04:47 crc kubenswrapper[4743]: I1011 01:04:47.801251 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"datadir\" (UniqueName: 
\"kubernetes.io/host-path/15f63aaf-1998-4daa-8ebf-1f9455b483e5-datadir\") pod \"collector-fdslp\" (UID: \"15f63aaf-1998-4daa-8ebf-1f9455b483e5\") " pod="openshift-logging/collector-fdslp" Oct 11 01:04:47 crc kubenswrapper[4743]: I1011 01:04:47.801296 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/15f63aaf-1998-4daa-8ebf-1f9455b483e5-tmp\") pod \"collector-fdslp\" (UID: \"15f63aaf-1998-4daa-8ebf-1f9455b483e5\") " pod="openshift-logging/collector-fdslp" Oct 11 01:04:47 crc kubenswrapper[4743]: I1011 01:04:47.802135 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/15f63aaf-1998-4daa-8ebf-1f9455b483e5-config-openshift-service-cacrt\") pod \"collector-fdslp\" (UID: \"15f63aaf-1998-4daa-8ebf-1f9455b483e5\") " pod="openshift-logging/collector-fdslp" Oct 11 01:04:47 crc kubenswrapper[4743]: I1011 01:04:47.802243 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/15f63aaf-1998-4daa-8ebf-1f9455b483e5-datadir\") pod \"collector-fdslp\" (UID: \"15f63aaf-1998-4daa-8ebf-1f9455b483e5\") " pod="openshift-logging/collector-fdslp" Oct 11 01:04:47 crc kubenswrapper[4743]: I1011 01:04:47.802371 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/15f63aaf-1998-4daa-8ebf-1f9455b483e5-trusted-ca\") pod \"collector-fdslp\" (UID: \"15f63aaf-1998-4daa-8ebf-1f9455b483e5\") " pod="openshift-logging/collector-fdslp" Oct 11 01:04:47 crc kubenswrapper[4743]: I1011 01:04:47.802906 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/15f63aaf-1998-4daa-8ebf-1f9455b483e5-entrypoint\") pod \"collector-fdslp\" (UID: \"15f63aaf-1998-4daa-8ebf-1f9455b483e5\") " pod="openshift-logging/collector-fdslp" Oct 11 
01:04:47 crc kubenswrapper[4743]: I1011 01:04:47.803144 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15f63aaf-1998-4daa-8ebf-1f9455b483e5-config\") pod \"collector-fdslp\" (UID: \"15f63aaf-1998-4daa-8ebf-1f9455b483e5\") " pod="openshift-logging/collector-fdslp" Oct 11 01:04:47 crc kubenswrapper[4743]: I1011 01:04:47.806462 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/15f63aaf-1998-4daa-8ebf-1f9455b483e5-collector-syslog-receiver\") pod \"collector-fdslp\" (UID: \"15f63aaf-1998-4daa-8ebf-1f9455b483e5\") " pod="openshift-logging/collector-fdslp" Oct 11 01:04:47 crc kubenswrapper[4743]: I1011 01:04:47.806718 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/15f63aaf-1998-4daa-8ebf-1f9455b483e5-metrics\") pod \"collector-fdslp\" (UID: \"15f63aaf-1998-4daa-8ebf-1f9455b483e5\") " pod="openshift-logging/collector-fdslp" Oct 11 01:04:47 crc kubenswrapper[4743]: I1011 01:04:47.807694 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/15f63aaf-1998-4daa-8ebf-1f9455b483e5-tmp\") pod \"collector-fdslp\" (UID: \"15f63aaf-1998-4daa-8ebf-1f9455b483e5\") " pod="openshift-logging/collector-fdslp" Oct 11 01:04:47 crc kubenswrapper[4743]: I1011 01:04:47.808734 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/15f63aaf-1998-4daa-8ebf-1f9455b483e5-collector-token\") pod \"collector-fdslp\" (UID: \"15f63aaf-1998-4daa-8ebf-1f9455b483e5\") " pod="openshift-logging/collector-fdslp" Oct 11 01:04:47 crc kubenswrapper[4743]: I1011 01:04:47.830776 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sa-token\" (UniqueName: 
\"kubernetes.io/projected/15f63aaf-1998-4daa-8ebf-1f9455b483e5-sa-token\") pod \"collector-fdslp\" (UID: \"15f63aaf-1998-4daa-8ebf-1f9455b483e5\") " pod="openshift-logging/collector-fdslp" Oct 11 01:04:47 crc kubenswrapper[4743]: I1011 01:04:47.836320 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bbxw\" (UniqueName: \"kubernetes.io/projected/15f63aaf-1998-4daa-8ebf-1f9455b483e5-kube-api-access-5bbxw\") pod \"collector-fdslp\" (UID: \"15f63aaf-1998-4daa-8ebf-1f9455b483e5\") " pod="openshift-logging/collector-fdslp" Oct 11 01:04:47 crc kubenswrapper[4743]: I1011 01:04:47.875023 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-fdslp" Oct 11 01:04:48 crc kubenswrapper[4743]: I1011 01:04:48.100585 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6082455e-5a13-4b35-b302-1c436af36e93" path="/var/lib/kubelet/pods/6082455e-5a13-4b35-b302-1c436af36e93/volumes" Oct 11 01:04:48 crc kubenswrapper[4743]: I1011 01:04:48.330469 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-fdslp"] Oct 11 01:04:48 crc kubenswrapper[4743]: I1011 01:04:48.459360 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/collector-fdslp" event={"ID":"15f63aaf-1998-4daa-8ebf-1f9455b483e5","Type":"ContainerStarted","Data":"e6b539371156171c075c4816eb5e318f211a877472c6f50e595489e13a3d3d5a"} Oct 11 01:04:56 crc kubenswrapper[4743]: I1011 01:04:56.529663 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/collector-fdslp" event={"ID":"15f63aaf-1998-4daa-8ebf-1f9455b483e5","Type":"ContainerStarted","Data":"88215265c9f0645636aaaa55be091df6b9a7f1481663ff04bed0257d4ceda56e"} Oct 11 01:04:56 crc kubenswrapper[4743]: I1011 01:04:56.562396 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/collector-fdslp" podStartSLOduration=1.9732762130000001 
podStartE2EDuration="9.562365183s" podCreationTimestamp="2025-10-11 01:04:47 +0000 UTC" firstStartedPulling="2025-10-11 01:04:48.345060938 +0000 UTC m=+782.998041385" lastFinishedPulling="2025-10-11 01:04:55.934149928 +0000 UTC m=+790.587130355" observedRunningTime="2025-10-11 01:04:56.560214747 +0000 UTC m=+791.213195184" watchObservedRunningTime="2025-10-11 01:04:56.562365183 +0000 UTC m=+791.215345620" Oct 11 01:05:01 crc kubenswrapper[4743]: I1011 01:05:01.795295 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-v7zb8"] Oct 11 01:05:01 crc kubenswrapper[4743]: I1011 01:05:01.797812 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v7zb8" Oct 11 01:05:01 crc kubenswrapper[4743]: I1011 01:05:01.812178 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-v7zb8"] Oct 11 01:05:01 crc kubenswrapper[4743]: I1011 01:05:01.917358 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ecc14aad-6946-4b41-80ef-803851fbaa8e-utilities\") pod \"certified-operators-v7zb8\" (UID: \"ecc14aad-6946-4b41-80ef-803851fbaa8e\") " pod="openshift-marketplace/certified-operators-v7zb8" Oct 11 01:05:01 crc kubenswrapper[4743]: I1011 01:05:01.917449 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ecc14aad-6946-4b41-80ef-803851fbaa8e-catalog-content\") pod \"certified-operators-v7zb8\" (UID: \"ecc14aad-6946-4b41-80ef-803851fbaa8e\") " pod="openshift-marketplace/certified-operators-v7zb8" Oct 11 01:05:01 crc kubenswrapper[4743]: I1011 01:05:01.917506 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lgx8\" (UniqueName: 
\"kubernetes.io/projected/ecc14aad-6946-4b41-80ef-803851fbaa8e-kube-api-access-5lgx8\") pod \"certified-operators-v7zb8\" (UID: \"ecc14aad-6946-4b41-80ef-803851fbaa8e\") " pod="openshift-marketplace/certified-operators-v7zb8" Oct 11 01:05:02 crc kubenswrapper[4743]: I1011 01:05:02.019258 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ecc14aad-6946-4b41-80ef-803851fbaa8e-utilities\") pod \"certified-operators-v7zb8\" (UID: \"ecc14aad-6946-4b41-80ef-803851fbaa8e\") " pod="openshift-marketplace/certified-operators-v7zb8" Oct 11 01:05:02 crc kubenswrapper[4743]: I1011 01:05:02.019370 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ecc14aad-6946-4b41-80ef-803851fbaa8e-catalog-content\") pod \"certified-operators-v7zb8\" (UID: \"ecc14aad-6946-4b41-80ef-803851fbaa8e\") " pod="openshift-marketplace/certified-operators-v7zb8" Oct 11 01:05:02 crc kubenswrapper[4743]: I1011 01:05:02.019424 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lgx8\" (UniqueName: \"kubernetes.io/projected/ecc14aad-6946-4b41-80ef-803851fbaa8e-kube-api-access-5lgx8\") pod \"certified-operators-v7zb8\" (UID: \"ecc14aad-6946-4b41-80ef-803851fbaa8e\") " pod="openshift-marketplace/certified-operators-v7zb8" Oct 11 01:05:02 crc kubenswrapper[4743]: I1011 01:05:02.019831 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ecc14aad-6946-4b41-80ef-803851fbaa8e-utilities\") pod \"certified-operators-v7zb8\" (UID: \"ecc14aad-6946-4b41-80ef-803851fbaa8e\") " pod="openshift-marketplace/certified-operators-v7zb8" Oct 11 01:05:02 crc kubenswrapper[4743]: I1011 01:05:02.020083 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/ecc14aad-6946-4b41-80ef-803851fbaa8e-catalog-content\") pod \"certified-operators-v7zb8\" (UID: \"ecc14aad-6946-4b41-80ef-803851fbaa8e\") " pod="openshift-marketplace/certified-operators-v7zb8" Oct 11 01:05:02 crc kubenswrapper[4743]: I1011 01:05:02.048209 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lgx8\" (UniqueName: \"kubernetes.io/projected/ecc14aad-6946-4b41-80ef-803851fbaa8e-kube-api-access-5lgx8\") pod \"certified-operators-v7zb8\" (UID: \"ecc14aad-6946-4b41-80ef-803851fbaa8e\") " pod="openshift-marketplace/certified-operators-v7zb8" Oct 11 01:05:02 crc kubenswrapper[4743]: I1011 01:05:02.123538 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v7zb8" Oct 11 01:05:02 crc kubenswrapper[4743]: I1011 01:05:02.567655 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-v7zb8"] Oct 11 01:05:03 crc kubenswrapper[4743]: I1011 01:05:03.581193 4743 generic.go:334] "Generic (PLEG): container finished" podID="ecc14aad-6946-4b41-80ef-803851fbaa8e" containerID="3dec0e85a229e4a334c143487093a8e3e9a91875849da7825b73f7f66a6c0568" exitCode=0 Oct 11 01:05:03 crc kubenswrapper[4743]: I1011 01:05:03.581296 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v7zb8" event={"ID":"ecc14aad-6946-4b41-80ef-803851fbaa8e","Type":"ContainerDied","Data":"3dec0e85a229e4a334c143487093a8e3e9a91875849da7825b73f7f66a6c0568"} Oct 11 01:05:03 crc kubenswrapper[4743]: I1011 01:05:03.581739 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v7zb8" event={"ID":"ecc14aad-6946-4b41-80ef-803851fbaa8e","Type":"ContainerStarted","Data":"8cf19164e961452ff3dad3804396fa0c5b4eec4cda3fa4b7854240476c9a7ec8"} Oct 11 01:05:04 crc kubenswrapper[4743]: I1011 01:05:04.592838 4743 generic.go:334] "Generic (PLEG): container 
finished" podID="ecc14aad-6946-4b41-80ef-803851fbaa8e" containerID="8c8a0a01fd2d2dd666644e73129a11e5658e8dc2b9af54f16c2340b4ac3d0199" exitCode=0 Oct 11 01:05:04 crc kubenswrapper[4743]: I1011 01:05:04.592911 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v7zb8" event={"ID":"ecc14aad-6946-4b41-80ef-803851fbaa8e","Type":"ContainerDied","Data":"8c8a0a01fd2d2dd666644e73129a11e5658e8dc2b9af54f16c2340b4ac3d0199"} Oct 11 01:05:05 crc kubenswrapper[4743]: I1011 01:05:05.603024 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v7zb8" event={"ID":"ecc14aad-6946-4b41-80ef-803851fbaa8e","Type":"ContainerStarted","Data":"41992e164579dc3a51900c69a3a3d69b294f7135de3b09ec99c5363de498e4fc"} Oct 11 01:05:05 crc kubenswrapper[4743]: I1011 01:05:05.622949 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-v7zb8" podStartSLOduration=3.169270705 podStartE2EDuration="4.622926209s" podCreationTimestamp="2025-10-11 01:05:01 +0000 UTC" firstStartedPulling="2025-10-11 01:05:03.583081571 +0000 UTC m=+798.236061968" lastFinishedPulling="2025-10-11 01:05:05.036737075 +0000 UTC m=+799.689717472" observedRunningTime="2025-10-11 01:05:05.619759637 +0000 UTC m=+800.272740044" watchObservedRunningTime="2025-10-11 01:05:05.622926209 +0000 UTC m=+800.275906636" Oct 11 01:05:05 crc kubenswrapper[4743]: I1011 01:05:05.791749 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-f2hn9"] Oct 11 01:05:05 crc kubenswrapper[4743]: I1011 01:05:05.800015 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-f2hn9" Oct 11 01:05:05 crc kubenswrapper[4743]: I1011 01:05:05.803828 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-f2hn9"] Oct 11 01:05:05 crc kubenswrapper[4743]: I1011 01:05:05.972013 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bvrn\" (UniqueName: \"kubernetes.io/projected/e8b4a18c-cf5d-44ed-98aa-66a6127a89f3-kube-api-access-2bvrn\") pod \"redhat-operators-f2hn9\" (UID: \"e8b4a18c-cf5d-44ed-98aa-66a6127a89f3\") " pod="openshift-marketplace/redhat-operators-f2hn9" Oct 11 01:05:05 crc kubenswrapper[4743]: I1011 01:05:05.972065 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8b4a18c-cf5d-44ed-98aa-66a6127a89f3-catalog-content\") pod \"redhat-operators-f2hn9\" (UID: \"e8b4a18c-cf5d-44ed-98aa-66a6127a89f3\") " pod="openshift-marketplace/redhat-operators-f2hn9" Oct 11 01:05:05 crc kubenswrapper[4743]: I1011 01:05:05.972152 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8b4a18c-cf5d-44ed-98aa-66a6127a89f3-utilities\") pod \"redhat-operators-f2hn9\" (UID: \"e8b4a18c-cf5d-44ed-98aa-66a6127a89f3\") " pod="openshift-marketplace/redhat-operators-f2hn9" Oct 11 01:05:06 crc kubenswrapper[4743]: I1011 01:05:06.073437 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8b4a18c-cf5d-44ed-98aa-66a6127a89f3-utilities\") pod \"redhat-operators-f2hn9\" (UID: \"e8b4a18c-cf5d-44ed-98aa-66a6127a89f3\") " pod="openshift-marketplace/redhat-operators-f2hn9" Oct 11 01:05:06 crc kubenswrapper[4743]: I1011 01:05:06.073507 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-2bvrn\" (UniqueName: \"kubernetes.io/projected/e8b4a18c-cf5d-44ed-98aa-66a6127a89f3-kube-api-access-2bvrn\") pod \"redhat-operators-f2hn9\" (UID: \"e8b4a18c-cf5d-44ed-98aa-66a6127a89f3\") " pod="openshift-marketplace/redhat-operators-f2hn9" Oct 11 01:05:06 crc kubenswrapper[4743]: I1011 01:05:06.073538 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8b4a18c-cf5d-44ed-98aa-66a6127a89f3-catalog-content\") pod \"redhat-operators-f2hn9\" (UID: \"e8b4a18c-cf5d-44ed-98aa-66a6127a89f3\") " pod="openshift-marketplace/redhat-operators-f2hn9" Oct 11 01:05:06 crc kubenswrapper[4743]: I1011 01:05:06.073962 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8b4a18c-cf5d-44ed-98aa-66a6127a89f3-utilities\") pod \"redhat-operators-f2hn9\" (UID: \"e8b4a18c-cf5d-44ed-98aa-66a6127a89f3\") " pod="openshift-marketplace/redhat-operators-f2hn9" Oct 11 01:05:06 crc kubenswrapper[4743]: I1011 01:05:06.074004 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8b4a18c-cf5d-44ed-98aa-66a6127a89f3-catalog-content\") pod \"redhat-operators-f2hn9\" (UID: \"e8b4a18c-cf5d-44ed-98aa-66a6127a89f3\") " pod="openshift-marketplace/redhat-operators-f2hn9" Oct 11 01:05:06 crc kubenswrapper[4743]: I1011 01:05:06.091344 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bvrn\" (UniqueName: \"kubernetes.io/projected/e8b4a18c-cf5d-44ed-98aa-66a6127a89f3-kube-api-access-2bvrn\") pod \"redhat-operators-f2hn9\" (UID: \"e8b4a18c-cf5d-44ed-98aa-66a6127a89f3\") " pod="openshift-marketplace/redhat-operators-f2hn9" Oct 11 01:05:06 crc kubenswrapper[4743]: I1011 01:05:06.156615 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-f2hn9" Oct 11 01:05:06 crc kubenswrapper[4743]: I1011 01:05:06.370916 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-f2hn9"] Oct 11 01:05:06 crc kubenswrapper[4743]: I1011 01:05:06.611322 4743 generic.go:334] "Generic (PLEG): container finished" podID="e8b4a18c-cf5d-44ed-98aa-66a6127a89f3" containerID="5b2b916dfd0f156fb19a31ae577fbb10cf443b2eda26dbae9ef7a4edf2b05057" exitCode=0 Oct 11 01:05:06 crc kubenswrapper[4743]: I1011 01:05:06.611465 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f2hn9" event={"ID":"e8b4a18c-cf5d-44ed-98aa-66a6127a89f3","Type":"ContainerDied","Data":"5b2b916dfd0f156fb19a31ae577fbb10cf443b2eda26dbae9ef7a4edf2b05057"} Oct 11 01:05:06 crc kubenswrapper[4743]: I1011 01:05:06.612219 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f2hn9" event={"ID":"e8b4a18c-cf5d-44ed-98aa-66a6127a89f3","Type":"ContainerStarted","Data":"d603acb51ad39fbfecb0065fba8d7c615e9a652e1408459ad9acfd9a3de16dde"} Oct 11 01:05:07 crc kubenswrapper[4743]: I1011 01:05:07.617403 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f2hn9" event={"ID":"e8b4a18c-cf5d-44ed-98aa-66a6127a89f3","Type":"ContainerStarted","Data":"28c7af64ee50aea24c96d0606134e8912d71eae7b115c1e609fb3847afb6eea7"} Oct 11 01:05:08 crc kubenswrapper[4743]: I1011 01:05:08.627289 4743 generic.go:334] "Generic (PLEG): container finished" podID="e8b4a18c-cf5d-44ed-98aa-66a6127a89f3" containerID="28c7af64ee50aea24c96d0606134e8912d71eae7b115c1e609fb3847afb6eea7" exitCode=0 Oct 11 01:05:08 crc kubenswrapper[4743]: I1011 01:05:08.627338 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f2hn9" 
event={"ID":"e8b4a18c-cf5d-44ed-98aa-66a6127a89f3","Type":"ContainerDied","Data":"28c7af64ee50aea24c96d0606134e8912d71eae7b115c1e609fb3847afb6eea7"} Oct 11 01:05:09 crc kubenswrapper[4743]: I1011 01:05:09.635850 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f2hn9" event={"ID":"e8b4a18c-cf5d-44ed-98aa-66a6127a89f3","Type":"ContainerStarted","Data":"df467cfec853a005213e3a05e2bc45bafb7c9490a1a75cb9cdf6514240cf2813"} Oct 11 01:05:09 crc kubenswrapper[4743]: I1011 01:05:09.657827 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-f2hn9" podStartSLOduration=2.20059115 podStartE2EDuration="4.657809323s" podCreationTimestamp="2025-10-11 01:05:05 +0000 UTC" firstStartedPulling="2025-10-11 01:05:06.612881576 +0000 UTC m=+801.265861973" lastFinishedPulling="2025-10-11 01:05:09.070099739 +0000 UTC m=+803.723080146" observedRunningTime="2025-10-11 01:05:09.653992464 +0000 UTC m=+804.306972861" watchObservedRunningTime="2025-10-11 01:05:09.657809323 +0000 UTC m=+804.310789720" Oct 11 01:05:12 crc kubenswrapper[4743]: I1011 01:05:12.124636 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-v7zb8" Oct 11 01:05:12 crc kubenswrapper[4743]: I1011 01:05:12.125139 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-v7zb8" Oct 11 01:05:12 crc kubenswrapper[4743]: I1011 01:05:12.185728 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-v7zb8" Oct 11 01:05:12 crc kubenswrapper[4743]: I1011 01:05:12.699237 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-v7zb8" Oct 11 01:05:13 crc kubenswrapper[4743]: I1011 01:05:13.358043 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-v7zb8"] Oct 11 01:05:14 crc kubenswrapper[4743]: I1011 01:05:14.665069 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-v7zb8" podUID="ecc14aad-6946-4b41-80ef-803851fbaa8e" containerName="registry-server" containerID="cri-o://41992e164579dc3a51900c69a3a3d69b294f7135de3b09ec99c5363de498e4fc" gracePeriod=2 Oct 11 01:05:15 crc kubenswrapper[4743]: I1011 01:05:15.098133 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v7zb8" Oct 11 01:05:15 crc kubenswrapper[4743]: I1011 01:05:15.206089 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ecc14aad-6946-4b41-80ef-803851fbaa8e-utilities\") pod \"ecc14aad-6946-4b41-80ef-803851fbaa8e\" (UID: \"ecc14aad-6946-4b41-80ef-803851fbaa8e\") " Oct 11 01:05:15 crc kubenswrapper[4743]: I1011 01:05:15.206155 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ecc14aad-6946-4b41-80ef-803851fbaa8e-catalog-content\") pod \"ecc14aad-6946-4b41-80ef-803851fbaa8e\" (UID: \"ecc14aad-6946-4b41-80ef-803851fbaa8e\") " Oct 11 01:05:15 crc kubenswrapper[4743]: I1011 01:05:15.206278 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5lgx8\" (UniqueName: \"kubernetes.io/projected/ecc14aad-6946-4b41-80ef-803851fbaa8e-kube-api-access-5lgx8\") pod \"ecc14aad-6946-4b41-80ef-803851fbaa8e\" (UID: \"ecc14aad-6946-4b41-80ef-803851fbaa8e\") " Oct 11 01:05:15 crc kubenswrapper[4743]: I1011 01:05:15.207124 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ecc14aad-6946-4b41-80ef-803851fbaa8e-utilities" (OuterVolumeSpecName: "utilities") pod "ecc14aad-6946-4b41-80ef-803851fbaa8e" (UID: 
"ecc14aad-6946-4b41-80ef-803851fbaa8e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:05:15 crc kubenswrapper[4743]: I1011 01:05:15.207304 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ecc14aad-6946-4b41-80ef-803851fbaa8e-utilities\") on node \"crc\" DevicePath \"\"" Oct 11 01:05:15 crc kubenswrapper[4743]: I1011 01:05:15.214071 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecc14aad-6946-4b41-80ef-803851fbaa8e-kube-api-access-5lgx8" (OuterVolumeSpecName: "kube-api-access-5lgx8") pod "ecc14aad-6946-4b41-80ef-803851fbaa8e" (UID: "ecc14aad-6946-4b41-80ef-803851fbaa8e"). InnerVolumeSpecName "kube-api-access-5lgx8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:05:15 crc kubenswrapper[4743]: I1011 01:05:15.309202 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5lgx8\" (UniqueName: \"kubernetes.io/projected/ecc14aad-6946-4b41-80ef-803851fbaa8e-kube-api-access-5lgx8\") on node \"crc\" DevicePath \"\"" Oct 11 01:05:15 crc kubenswrapper[4743]: I1011 01:05:15.672495 4743 generic.go:334] "Generic (PLEG): container finished" podID="ecc14aad-6946-4b41-80ef-803851fbaa8e" containerID="41992e164579dc3a51900c69a3a3d69b294f7135de3b09ec99c5363de498e4fc" exitCode=0 Oct 11 01:05:15 crc kubenswrapper[4743]: I1011 01:05:15.672537 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v7zb8" event={"ID":"ecc14aad-6946-4b41-80ef-803851fbaa8e","Type":"ContainerDied","Data":"41992e164579dc3a51900c69a3a3d69b294f7135de3b09ec99c5363de498e4fc"} Oct 11 01:05:15 crc kubenswrapper[4743]: I1011 01:05:15.672572 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v7zb8" 
event={"ID":"ecc14aad-6946-4b41-80ef-803851fbaa8e","Type":"ContainerDied","Data":"8cf19164e961452ff3dad3804396fa0c5b4eec4cda3fa4b7854240476c9a7ec8"} Oct 11 01:05:15 crc kubenswrapper[4743]: I1011 01:05:15.672591 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v7zb8" Oct 11 01:05:15 crc kubenswrapper[4743]: I1011 01:05:15.672594 4743 scope.go:117] "RemoveContainer" containerID="41992e164579dc3a51900c69a3a3d69b294f7135de3b09ec99c5363de498e4fc" Oct 11 01:05:15 crc kubenswrapper[4743]: I1011 01:05:15.691879 4743 scope.go:117] "RemoveContainer" containerID="8c8a0a01fd2d2dd666644e73129a11e5658e8dc2b9af54f16c2340b4ac3d0199" Oct 11 01:05:15 crc kubenswrapper[4743]: I1011 01:05:15.712068 4743 scope.go:117] "RemoveContainer" containerID="3dec0e85a229e4a334c143487093a8e3e9a91875849da7825b73f7f66a6c0568" Oct 11 01:05:15 crc kubenswrapper[4743]: I1011 01:05:15.750165 4743 scope.go:117] "RemoveContainer" containerID="41992e164579dc3a51900c69a3a3d69b294f7135de3b09ec99c5363de498e4fc" Oct 11 01:05:15 crc kubenswrapper[4743]: E1011 01:05:15.750696 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41992e164579dc3a51900c69a3a3d69b294f7135de3b09ec99c5363de498e4fc\": container with ID starting with 41992e164579dc3a51900c69a3a3d69b294f7135de3b09ec99c5363de498e4fc not found: ID does not exist" containerID="41992e164579dc3a51900c69a3a3d69b294f7135de3b09ec99c5363de498e4fc" Oct 11 01:05:15 crc kubenswrapper[4743]: I1011 01:05:15.750839 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41992e164579dc3a51900c69a3a3d69b294f7135de3b09ec99c5363de498e4fc"} err="failed to get container status \"41992e164579dc3a51900c69a3a3d69b294f7135de3b09ec99c5363de498e4fc\": rpc error: code = NotFound desc = could not find container \"41992e164579dc3a51900c69a3a3d69b294f7135de3b09ec99c5363de498e4fc\": 
container with ID starting with 41992e164579dc3a51900c69a3a3d69b294f7135de3b09ec99c5363de498e4fc not found: ID does not exist" Oct 11 01:05:15 crc kubenswrapper[4743]: I1011 01:05:15.750977 4743 scope.go:117] "RemoveContainer" containerID="8c8a0a01fd2d2dd666644e73129a11e5658e8dc2b9af54f16c2340b4ac3d0199" Oct 11 01:05:15 crc kubenswrapper[4743]: E1011 01:05:15.751474 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c8a0a01fd2d2dd666644e73129a11e5658e8dc2b9af54f16c2340b4ac3d0199\": container with ID starting with 8c8a0a01fd2d2dd666644e73129a11e5658e8dc2b9af54f16c2340b4ac3d0199 not found: ID does not exist" containerID="8c8a0a01fd2d2dd666644e73129a11e5658e8dc2b9af54f16c2340b4ac3d0199" Oct 11 01:05:15 crc kubenswrapper[4743]: I1011 01:05:15.751502 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c8a0a01fd2d2dd666644e73129a11e5658e8dc2b9af54f16c2340b4ac3d0199"} err="failed to get container status \"8c8a0a01fd2d2dd666644e73129a11e5658e8dc2b9af54f16c2340b4ac3d0199\": rpc error: code = NotFound desc = could not find container \"8c8a0a01fd2d2dd666644e73129a11e5658e8dc2b9af54f16c2340b4ac3d0199\": container with ID starting with 8c8a0a01fd2d2dd666644e73129a11e5658e8dc2b9af54f16c2340b4ac3d0199 not found: ID does not exist" Oct 11 01:05:15 crc kubenswrapper[4743]: I1011 01:05:15.751516 4743 scope.go:117] "RemoveContainer" containerID="3dec0e85a229e4a334c143487093a8e3e9a91875849da7825b73f7f66a6c0568" Oct 11 01:05:15 crc kubenswrapper[4743]: E1011 01:05:15.751988 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3dec0e85a229e4a334c143487093a8e3e9a91875849da7825b73f7f66a6c0568\": container with ID starting with 3dec0e85a229e4a334c143487093a8e3e9a91875849da7825b73f7f66a6c0568 not found: ID does not exist" 
containerID="3dec0e85a229e4a334c143487093a8e3e9a91875849da7825b73f7f66a6c0568" Oct 11 01:05:15 crc kubenswrapper[4743]: I1011 01:05:15.752083 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dec0e85a229e4a334c143487093a8e3e9a91875849da7825b73f7f66a6c0568"} err="failed to get container status \"3dec0e85a229e4a334c143487093a8e3e9a91875849da7825b73f7f66a6c0568\": rpc error: code = NotFound desc = could not find container \"3dec0e85a229e4a334c143487093a8e3e9a91875849da7825b73f7f66a6c0568\": container with ID starting with 3dec0e85a229e4a334c143487093a8e3e9a91875849da7825b73f7f66a6c0568 not found: ID does not exist" Oct 11 01:05:15 crc kubenswrapper[4743]: I1011 01:05:15.769241 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ecc14aad-6946-4b41-80ef-803851fbaa8e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ecc14aad-6946-4b41-80ef-803851fbaa8e" (UID: "ecc14aad-6946-4b41-80ef-803851fbaa8e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:05:15 crc kubenswrapper[4743]: I1011 01:05:15.816502 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ecc14aad-6946-4b41-80ef-803851fbaa8e-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 11 01:05:16 crc kubenswrapper[4743]: I1011 01:05:16.009952 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-v7zb8"] Oct 11 01:05:16 crc kubenswrapper[4743]: I1011 01:05:16.018470 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-v7zb8"] Oct 11 01:05:16 crc kubenswrapper[4743]: I1011 01:05:16.105161 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecc14aad-6946-4b41-80ef-803851fbaa8e" path="/var/lib/kubelet/pods/ecc14aad-6946-4b41-80ef-803851fbaa8e/volumes" Oct 11 01:05:16 crc kubenswrapper[4743]: I1011 01:05:16.157621 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-f2hn9" Oct 11 01:05:16 crc kubenswrapper[4743]: I1011 01:05:16.157678 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-f2hn9" Oct 11 01:05:16 crc kubenswrapper[4743]: I1011 01:05:16.215687 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-f2hn9" Oct 11 01:05:16 crc kubenswrapper[4743]: I1011 01:05:16.768360 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-f2hn9" Oct 11 01:05:18 crc kubenswrapper[4743]: I1011 01:05:18.563764 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-f2hn9"] Oct 11 01:05:18 crc kubenswrapper[4743]: I1011 01:05:18.690344 4743 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-operators-f2hn9" podUID="e8b4a18c-cf5d-44ed-98aa-66a6127a89f3" containerName="registry-server" containerID="cri-o://df467cfec853a005213e3a05e2bc45bafb7c9490a1a75cb9cdf6514240cf2813" gracePeriod=2 Oct 11 01:05:19 crc kubenswrapper[4743]: I1011 01:05:19.603616 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6ntgk"] Oct 11 01:05:19 crc kubenswrapper[4743]: E1011 01:05:19.603966 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecc14aad-6946-4b41-80ef-803851fbaa8e" containerName="extract-content" Oct 11 01:05:19 crc kubenswrapper[4743]: I1011 01:05:19.603982 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecc14aad-6946-4b41-80ef-803851fbaa8e" containerName="extract-content" Oct 11 01:05:19 crc kubenswrapper[4743]: E1011 01:05:19.604001 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecc14aad-6946-4b41-80ef-803851fbaa8e" containerName="extract-utilities" Oct 11 01:05:19 crc kubenswrapper[4743]: I1011 01:05:19.604008 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecc14aad-6946-4b41-80ef-803851fbaa8e" containerName="extract-utilities" Oct 11 01:05:19 crc kubenswrapper[4743]: E1011 01:05:19.604023 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecc14aad-6946-4b41-80ef-803851fbaa8e" containerName="registry-server" Oct 11 01:05:19 crc kubenswrapper[4743]: I1011 01:05:19.604031 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecc14aad-6946-4b41-80ef-803851fbaa8e" containerName="registry-server" Oct 11 01:05:19 crc kubenswrapper[4743]: I1011 01:05:19.604197 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecc14aad-6946-4b41-80ef-803851fbaa8e" containerName="registry-server" Oct 11 01:05:19 crc kubenswrapper[4743]: I1011 01:05:19.605239 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6ntgk" Oct 11 01:05:19 crc kubenswrapper[4743]: I1011 01:05:19.607351 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 11 01:05:19 crc kubenswrapper[4743]: I1011 01:05:19.618192 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6ntgk"] Oct 11 01:05:19 crc kubenswrapper[4743]: I1011 01:05:19.701648 4743 generic.go:334] "Generic (PLEG): container finished" podID="e8b4a18c-cf5d-44ed-98aa-66a6127a89f3" containerID="df467cfec853a005213e3a05e2bc45bafb7c9490a1a75cb9cdf6514240cf2813" exitCode=0 Oct 11 01:05:19 crc kubenswrapper[4743]: I1011 01:05:19.701693 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f2hn9" event={"ID":"e8b4a18c-cf5d-44ed-98aa-66a6127a89f3","Type":"ContainerDied","Data":"df467cfec853a005213e3a05e2bc45bafb7c9490a1a75cb9cdf6514240cf2813"} Oct 11 01:05:19 crc kubenswrapper[4743]: I1011 01:05:19.771220 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8jd7\" (UniqueName: \"kubernetes.io/projected/7c5ed10c-1531-47da-8329-5589a12da9ac-kube-api-access-m8jd7\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6ntgk\" (UID: \"7c5ed10c-1531-47da-8329-5589a12da9ac\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6ntgk" Oct 11 01:05:19 crc kubenswrapper[4743]: I1011 01:05:19.771326 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7c5ed10c-1531-47da-8329-5589a12da9ac-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6ntgk\" (UID: \"7c5ed10c-1531-47da-8329-5589a12da9ac\") " 
pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6ntgk" Oct 11 01:05:19 crc kubenswrapper[4743]: I1011 01:05:19.771401 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7c5ed10c-1531-47da-8329-5589a12da9ac-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6ntgk\" (UID: \"7c5ed10c-1531-47da-8329-5589a12da9ac\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6ntgk" Oct 11 01:05:19 crc kubenswrapper[4743]: I1011 01:05:19.872910 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8jd7\" (UniqueName: \"kubernetes.io/projected/7c5ed10c-1531-47da-8329-5589a12da9ac-kube-api-access-m8jd7\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6ntgk\" (UID: \"7c5ed10c-1531-47da-8329-5589a12da9ac\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6ntgk" Oct 11 01:05:19 crc kubenswrapper[4743]: I1011 01:05:19.873260 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7c5ed10c-1531-47da-8329-5589a12da9ac-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6ntgk\" (UID: \"7c5ed10c-1531-47da-8329-5589a12da9ac\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6ntgk" Oct 11 01:05:19 crc kubenswrapper[4743]: I1011 01:05:19.873366 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7c5ed10c-1531-47da-8329-5589a12da9ac-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6ntgk\" (UID: \"7c5ed10c-1531-47da-8329-5589a12da9ac\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6ntgk" Oct 11 01:05:19 crc kubenswrapper[4743]: 
I1011 01:05:19.874006 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7c5ed10c-1531-47da-8329-5589a12da9ac-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6ntgk\" (UID: \"7c5ed10c-1531-47da-8329-5589a12da9ac\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6ntgk" Oct 11 01:05:19 crc kubenswrapper[4743]: I1011 01:05:19.874151 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7c5ed10c-1531-47da-8329-5589a12da9ac-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6ntgk\" (UID: \"7c5ed10c-1531-47da-8329-5589a12da9ac\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6ntgk" Oct 11 01:05:19 crc kubenswrapper[4743]: I1011 01:05:19.897154 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8jd7\" (UniqueName: \"kubernetes.io/projected/7c5ed10c-1531-47da-8329-5589a12da9ac-kube-api-access-m8jd7\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6ntgk\" (UID: \"7c5ed10c-1531-47da-8329-5589a12da9ac\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6ntgk" Oct 11 01:05:19 crc kubenswrapper[4743]: I1011 01:05:19.920750 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6ntgk" Oct 11 01:05:20 crc kubenswrapper[4743]: I1011 01:05:20.365529 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6ntgk"] Oct 11 01:05:20 crc kubenswrapper[4743]: W1011 01:05:20.372686 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c5ed10c_1531_47da_8329_5589a12da9ac.slice/crio-55956e4b97ed94ca051b60e2bbc09c861c72bf2fb7f6d77c7b14065896cefaa1 WatchSource:0}: Error finding container 55956e4b97ed94ca051b60e2bbc09c861c72bf2fb7f6d77c7b14065896cefaa1: Status 404 returned error can't find the container with id 55956e4b97ed94ca051b60e2bbc09c861c72bf2fb7f6d77c7b14065896cefaa1 Oct 11 01:05:20 crc kubenswrapper[4743]: I1011 01:05:20.714609 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6ntgk" event={"ID":"7c5ed10c-1531-47da-8329-5589a12da9ac","Type":"ContainerStarted","Data":"55956e4b97ed94ca051b60e2bbc09c861c72bf2fb7f6d77c7b14065896cefaa1"} Oct 11 01:05:21 crc kubenswrapper[4743]: I1011 01:05:21.131809 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-f2hn9" Oct 11 01:05:21 crc kubenswrapper[4743]: I1011 01:05:21.194179 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8b4a18c-cf5d-44ed-98aa-66a6127a89f3-catalog-content\") pod \"e8b4a18c-cf5d-44ed-98aa-66a6127a89f3\" (UID: \"e8b4a18c-cf5d-44ed-98aa-66a6127a89f3\") " Oct 11 01:05:21 crc kubenswrapper[4743]: I1011 01:05:21.194343 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2bvrn\" (UniqueName: \"kubernetes.io/projected/e8b4a18c-cf5d-44ed-98aa-66a6127a89f3-kube-api-access-2bvrn\") pod \"e8b4a18c-cf5d-44ed-98aa-66a6127a89f3\" (UID: \"e8b4a18c-cf5d-44ed-98aa-66a6127a89f3\") " Oct 11 01:05:21 crc kubenswrapper[4743]: I1011 01:05:21.194405 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8b4a18c-cf5d-44ed-98aa-66a6127a89f3-utilities\") pod \"e8b4a18c-cf5d-44ed-98aa-66a6127a89f3\" (UID: \"e8b4a18c-cf5d-44ed-98aa-66a6127a89f3\") " Oct 11 01:05:21 crc kubenswrapper[4743]: I1011 01:05:21.196036 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8b4a18c-cf5d-44ed-98aa-66a6127a89f3-utilities" (OuterVolumeSpecName: "utilities") pod "e8b4a18c-cf5d-44ed-98aa-66a6127a89f3" (UID: "e8b4a18c-cf5d-44ed-98aa-66a6127a89f3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:05:21 crc kubenswrapper[4743]: I1011 01:05:21.199868 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8b4a18c-cf5d-44ed-98aa-66a6127a89f3-kube-api-access-2bvrn" (OuterVolumeSpecName: "kube-api-access-2bvrn") pod "e8b4a18c-cf5d-44ed-98aa-66a6127a89f3" (UID: "e8b4a18c-cf5d-44ed-98aa-66a6127a89f3"). InnerVolumeSpecName "kube-api-access-2bvrn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:05:21 crc kubenswrapper[4743]: I1011 01:05:21.280886 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8b4a18c-cf5d-44ed-98aa-66a6127a89f3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e8b4a18c-cf5d-44ed-98aa-66a6127a89f3" (UID: "e8b4a18c-cf5d-44ed-98aa-66a6127a89f3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:05:21 crc kubenswrapper[4743]: I1011 01:05:21.296037 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8b4a18c-cf5d-44ed-98aa-66a6127a89f3-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 11 01:05:21 crc kubenswrapper[4743]: I1011 01:05:21.296090 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2bvrn\" (UniqueName: \"kubernetes.io/projected/e8b4a18c-cf5d-44ed-98aa-66a6127a89f3-kube-api-access-2bvrn\") on node \"crc\" DevicePath \"\"" Oct 11 01:05:21 crc kubenswrapper[4743]: I1011 01:05:21.296119 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8b4a18c-cf5d-44ed-98aa-66a6127a89f3-utilities\") on node \"crc\" DevicePath \"\"" Oct 11 01:05:21 crc kubenswrapper[4743]: I1011 01:05:21.735301 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f2hn9" event={"ID":"e8b4a18c-cf5d-44ed-98aa-66a6127a89f3","Type":"ContainerDied","Data":"d603acb51ad39fbfecb0065fba8d7c615e9a652e1408459ad9acfd9a3de16dde"} Oct 11 01:05:21 crc kubenswrapper[4743]: I1011 01:05:21.735359 4743 scope.go:117] "RemoveContainer" containerID="df467cfec853a005213e3a05e2bc45bafb7c9490a1a75cb9cdf6514240cf2813" Oct 11 01:05:21 crc kubenswrapper[4743]: I1011 01:05:21.735476 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-f2hn9" Oct 11 01:05:21 crc kubenswrapper[4743]: I1011 01:05:21.739778 4743 generic.go:334] "Generic (PLEG): container finished" podID="7c5ed10c-1531-47da-8329-5589a12da9ac" containerID="6d212f2ec142a168e4c767f1845ac582bdb6f3a11d3c109f3985bc2c273b4d41" exitCode=0 Oct 11 01:05:21 crc kubenswrapper[4743]: I1011 01:05:21.739836 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6ntgk" event={"ID":"7c5ed10c-1531-47da-8329-5589a12da9ac","Type":"ContainerDied","Data":"6d212f2ec142a168e4c767f1845ac582bdb6f3a11d3c109f3985bc2c273b4d41"} Oct 11 01:05:21 crc kubenswrapper[4743]: I1011 01:05:21.766517 4743 scope.go:117] "RemoveContainer" containerID="28c7af64ee50aea24c96d0606134e8912d71eae7b115c1e609fb3847afb6eea7" Oct 11 01:05:21 crc kubenswrapper[4743]: I1011 01:05:21.779596 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-f2hn9"] Oct 11 01:05:21 crc kubenswrapper[4743]: I1011 01:05:21.793636 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-f2hn9"] Oct 11 01:05:21 crc kubenswrapper[4743]: I1011 01:05:21.802687 4743 scope.go:117] "RemoveContainer" containerID="5b2b916dfd0f156fb19a31ae577fbb10cf443b2eda26dbae9ef7a4edf2b05057" Oct 11 01:05:22 crc kubenswrapper[4743]: I1011 01:05:22.103844 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8b4a18c-cf5d-44ed-98aa-66a6127a89f3" path="/var/lib/kubelet/pods/e8b4a18c-cf5d-44ed-98aa-66a6127a89f3/volumes" Oct 11 01:05:22 crc kubenswrapper[4743]: I1011 01:05:22.366246 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rvzdb"] Oct 11 01:05:22 crc kubenswrapper[4743]: E1011 01:05:22.366704 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8b4a18c-cf5d-44ed-98aa-66a6127a89f3" 
containerName="registry-server" Oct 11 01:05:22 crc kubenswrapper[4743]: I1011 01:05:22.366715 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8b4a18c-cf5d-44ed-98aa-66a6127a89f3" containerName="registry-server" Oct 11 01:05:22 crc kubenswrapper[4743]: E1011 01:05:22.366725 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8b4a18c-cf5d-44ed-98aa-66a6127a89f3" containerName="extract-utilities" Oct 11 01:05:22 crc kubenswrapper[4743]: I1011 01:05:22.366731 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8b4a18c-cf5d-44ed-98aa-66a6127a89f3" containerName="extract-utilities" Oct 11 01:05:22 crc kubenswrapper[4743]: E1011 01:05:22.366750 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8b4a18c-cf5d-44ed-98aa-66a6127a89f3" containerName="extract-content" Oct 11 01:05:22 crc kubenswrapper[4743]: I1011 01:05:22.366756 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8b4a18c-cf5d-44ed-98aa-66a6127a89f3" containerName="extract-content" Oct 11 01:05:22 crc kubenswrapper[4743]: I1011 01:05:22.366869 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8b4a18c-cf5d-44ed-98aa-66a6127a89f3" containerName="registry-server" Oct 11 01:05:22 crc kubenswrapper[4743]: I1011 01:05:22.367641 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rvzdb" Oct 11 01:05:22 crc kubenswrapper[4743]: I1011 01:05:22.450774 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rvzdb"] Oct 11 01:05:22 crc kubenswrapper[4743]: I1011 01:05:22.513363 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0b42080-afd5-431e-9b1f-6ba5907b7331-catalog-content\") pod \"redhat-marketplace-rvzdb\" (UID: \"a0b42080-afd5-431e-9b1f-6ba5907b7331\") " pod="openshift-marketplace/redhat-marketplace-rvzdb" Oct 11 01:05:22 crc kubenswrapper[4743]: I1011 01:05:22.513428 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcww9\" (UniqueName: \"kubernetes.io/projected/a0b42080-afd5-431e-9b1f-6ba5907b7331-kube-api-access-dcww9\") pod \"redhat-marketplace-rvzdb\" (UID: \"a0b42080-afd5-431e-9b1f-6ba5907b7331\") " pod="openshift-marketplace/redhat-marketplace-rvzdb" Oct 11 01:05:22 crc kubenswrapper[4743]: I1011 01:05:22.513523 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0b42080-afd5-431e-9b1f-6ba5907b7331-utilities\") pod \"redhat-marketplace-rvzdb\" (UID: \"a0b42080-afd5-431e-9b1f-6ba5907b7331\") " pod="openshift-marketplace/redhat-marketplace-rvzdb" Oct 11 01:05:22 crc kubenswrapper[4743]: I1011 01:05:22.615164 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcww9\" (UniqueName: \"kubernetes.io/projected/a0b42080-afd5-431e-9b1f-6ba5907b7331-kube-api-access-dcww9\") pod \"redhat-marketplace-rvzdb\" (UID: \"a0b42080-afd5-431e-9b1f-6ba5907b7331\") " pod="openshift-marketplace/redhat-marketplace-rvzdb" Oct 11 01:05:22 crc kubenswrapper[4743]: I1011 01:05:22.615253 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0b42080-afd5-431e-9b1f-6ba5907b7331-utilities\") pod \"redhat-marketplace-rvzdb\" (UID: \"a0b42080-afd5-431e-9b1f-6ba5907b7331\") " pod="openshift-marketplace/redhat-marketplace-rvzdb" Oct 11 01:05:22 crc kubenswrapper[4743]: I1011 01:05:22.615310 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0b42080-afd5-431e-9b1f-6ba5907b7331-catalog-content\") pod \"redhat-marketplace-rvzdb\" (UID: \"a0b42080-afd5-431e-9b1f-6ba5907b7331\") " pod="openshift-marketplace/redhat-marketplace-rvzdb" Oct 11 01:05:22 crc kubenswrapper[4743]: I1011 01:05:22.615784 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0b42080-afd5-431e-9b1f-6ba5907b7331-catalog-content\") pod \"redhat-marketplace-rvzdb\" (UID: \"a0b42080-afd5-431e-9b1f-6ba5907b7331\") " pod="openshift-marketplace/redhat-marketplace-rvzdb" Oct 11 01:05:22 crc kubenswrapper[4743]: I1011 01:05:22.615841 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0b42080-afd5-431e-9b1f-6ba5907b7331-utilities\") pod \"redhat-marketplace-rvzdb\" (UID: \"a0b42080-afd5-431e-9b1f-6ba5907b7331\") " pod="openshift-marketplace/redhat-marketplace-rvzdb" Oct 11 01:05:22 crc kubenswrapper[4743]: I1011 01:05:22.631244 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcww9\" (UniqueName: \"kubernetes.io/projected/a0b42080-afd5-431e-9b1f-6ba5907b7331-kube-api-access-dcww9\") pod \"redhat-marketplace-rvzdb\" (UID: \"a0b42080-afd5-431e-9b1f-6ba5907b7331\") " pod="openshift-marketplace/redhat-marketplace-rvzdb" Oct 11 01:05:22 crc kubenswrapper[4743]: I1011 01:05:22.719169 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rvzdb" Oct 11 01:05:23 crc kubenswrapper[4743]: I1011 01:05:23.156465 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rvzdb"] Oct 11 01:05:23 crc kubenswrapper[4743]: W1011 01:05:23.161896 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0b42080_afd5_431e_9b1f_6ba5907b7331.slice/crio-3773ea4e08f8735a72f141b323f4356e6ddc6fab300717f2dd54c71e1944bc67 WatchSource:0}: Error finding container 3773ea4e08f8735a72f141b323f4356e6ddc6fab300717f2dd54c71e1944bc67: Status 404 returned error can't find the container with id 3773ea4e08f8735a72f141b323f4356e6ddc6fab300717f2dd54c71e1944bc67 Oct 11 01:05:23 crc kubenswrapper[4743]: I1011 01:05:23.759467 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6ntgk" event={"ID":"7c5ed10c-1531-47da-8329-5589a12da9ac","Type":"ContainerStarted","Data":"0fca971769591759cb49a93e3e7b8919a37423e2c7ce98b51d908e8db39802ec"} Oct 11 01:05:23 crc kubenswrapper[4743]: I1011 01:05:23.761357 4743 generic.go:334] "Generic (PLEG): container finished" podID="a0b42080-afd5-431e-9b1f-6ba5907b7331" containerID="f4619419080ecf49e58a4d8fcd8fdd77831e549a0fc91b2d8ec4976d2de194e5" exitCode=0 Oct 11 01:05:23 crc kubenswrapper[4743]: I1011 01:05:23.761408 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rvzdb" event={"ID":"a0b42080-afd5-431e-9b1f-6ba5907b7331","Type":"ContainerDied","Data":"f4619419080ecf49e58a4d8fcd8fdd77831e549a0fc91b2d8ec4976d2de194e5"} Oct 11 01:05:23 crc kubenswrapper[4743]: I1011 01:05:23.761439 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rvzdb" 
event={"ID":"a0b42080-afd5-431e-9b1f-6ba5907b7331","Type":"ContainerStarted","Data":"3773ea4e08f8735a72f141b323f4356e6ddc6fab300717f2dd54c71e1944bc67"} Oct 11 01:05:24 crc kubenswrapper[4743]: I1011 01:05:24.770261 4743 generic.go:334] "Generic (PLEG): container finished" podID="7c5ed10c-1531-47da-8329-5589a12da9ac" containerID="0fca971769591759cb49a93e3e7b8919a37423e2c7ce98b51d908e8db39802ec" exitCode=0 Oct 11 01:05:24 crc kubenswrapper[4743]: I1011 01:05:24.770307 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6ntgk" event={"ID":"7c5ed10c-1531-47da-8329-5589a12da9ac","Type":"ContainerDied","Data":"0fca971769591759cb49a93e3e7b8919a37423e2c7ce98b51d908e8db39802ec"} Oct 11 01:05:24 crc kubenswrapper[4743]: I1011 01:05:24.772701 4743 generic.go:334] "Generic (PLEG): container finished" podID="a0b42080-afd5-431e-9b1f-6ba5907b7331" containerID="2a19fc01e7ada84faa1c3e4b6510519fcb2eebedacc9ccf342f91f6eafc2b82c" exitCode=0 Oct 11 01:05:24 crc kubenswrapper[4743]: I1011 01:05:24.772741 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rvzdb" event={"ID":"a0b42080-afd5-431e-9b1f-6ba5907b7331","Type":"ContainerDied","Data":"2a19fc01e7ada84faa1c3e4b6510519fcb2eebedacc9ccf342f91f6eafc2b82c"} Oct 11 01:05:25 crc kubenswrapper[4743]: I1011 01:05:25.782920 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rvzdb" event={"ID":"a0b42080-afd5-431e-9b1f-6ba5907b7331","Type":"ContainerStarted","Data":"f42da51d613ef1ee9995898cbe9f617aa8019b50dc68e381faf0c30786602dcd"} Oct 11 01:05:25 crc kubenswrapper[4743]: I1011 01:05:25.792038 4743 generic.go:334] "Generic (PLEG): container finished" podID="7c5ed10c-1531-47da-8329-5589a12da9ac" containerID="5959b8216078ccdc1ecb80ae1526be096d851822eb2a41f7d8ba97d2eb6bb13b" exitCode=0 Oct 11 01:05:25 crc kubenswrapper[4743]: I1011 01:05:25.792104 4743 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6ntgk" event={"ID":"7c5ed10c-1531-47da-8329-5589a12da9ac","Type":"ContainerDied","Data":"5959b8216078ccdc1ecb80ae1526be096d851822eb2a41f7d8ba97d2eb6bb13b"} Oct 11 01:05:25 crc kubenswrapper[4743]: I1011 01:05:25.814506 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rvzdb" podStartSLOduration=2.402477021 podStartE2EDuration="3.814474515s" podCreationTimestamp="2025-10-11 01:05:22 +0000 UTC" firstStartedPulling="2025-10-11 01:05:23.763574849 +0000 UTC m=+818.416555256" lastFinishedPulling="2025-10-11 01:05:25.175572353 +0000 UTC m=+819.828552750" observedRunningTime="2025-10-11 01:05:25.80388896 +0000 UTC m=+820.456869387" watchObservedRunningTime="2025-10-11 01:05:25.814474515 +0000 UTC m=+820.467454972" Oct 11 01:05:27 crc kubenswrapper[4743]: I1011 01:05:27.100163 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6ntgk" Oct 11 01:05:27 crc kubenswrapper[4743]: I1011 01:05:27.176942 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m8jd7\" (UniqueName: \"kubernetes.io/projected/7c5ed10c-1531-47da-8329-5589a12da9ac-kube-api-access-m8jd7\") pod \"7c5ed10c-1531-47da-8329-5589a12da9ac\" (UID: \"7c5ed10c-1531-47da-8329-5589a12da9ac\") " Oct 11 01:05:27 crc kubenswrapper[4743]: I1011 01:05:27.177050 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7c5ed10c-1531-47da-8329-5589a12da9ac-util\") pod \"7c5ed10c-1531-47da-8329-5589a12da9ac\" (UID: \"7c5ed10c-1531-47da-8329-5589a12da9ac\") " Oct 11 01:05:27 crc kubenswrapper[4743]: I1011 01:05:27.177099 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7c5ed10c-1531-47da-8329-5589a12da9ac-bundle\") pod \"7c5ed10c-1531-47da-8329-5589a12da9ac\" (UID: \"7c5ed10c-1531-47da-8329-5589a12da9ac\") " Oct 11 01:05:27 crc kubenswrapper[4743]: I1011 01:05:27.177835 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c5ed10c-1531-47da-8329-5589a12da9ac-bundle" (OuterVolumeSpecName: "bundle") pod "7c5ed10c-1531-47da-8329-5589a12da9ac" (UID: "7c5ed10c-1531-47da-8329-5589a12da9ac"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:05:27 crc kubenswrapper[4743]: I1011 01:05:27.182892 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c5ed10c-1531-47da-8329-5589a12da9ac-kube-api-access-m8jd7" (OuterVolumeSpecName: "kube-api-access-m8jd7") pod "7c5ed10c-1531-47da-8329-5589a12da9ac" (UID: "7c5ed10c-1531-47da-8329-5589a12da9ac"). InnerVolumeSpecName "kube-api-access-m8jd7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:05:27 crc kubenswrapper[4743]: I1011 01:05:27.186830 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c5ed10c-1531-47da-8329-5589a12da9ac-util" (OuterVolumeSpecName: "util") pod "7c5ed10c-1531-47da-8329-5589a12da9ac" (UID: "7c5ed10c-1531-47da-8329-5589a12da9ac"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:05:27 crc kubenswrapper[4743]: I1011 01:05:27.278151 4743 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7c5ed10c-1531-47da-8329-5589a12da9ac-util\") on node \"crc\" DevicePath \"\"" Oct 11 01:05:27 crc kubenswrapper[4743]: I1011 01:05:27.278378 4743 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7c5ed10c-1531-47da-8329-5589a12da9ac-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 01:05:27 crc kubenswrapper[4743]: I1011 01:05:27.278388 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m8jd7\" (UniqueName: \"kubernetes.io/projected/7c5ed10c-1531-47da-8329-5589a12da9ac-kube-api-access-m8jd7\") on node \"crc\" DevicePath \"\"" Oct 11 01:05:27 crc kubenswrapper[4743]: I1011 01:05:27.805938 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6ntgk" event={"ID":"7c5ed10c-1531-47da-8329-5589a12da9ac","Type":"ContainerDied","Data":"55956e4b97ed94ca051b60e2bbc09c861c72bf2fb7f6d77c7b14065896cefaa1"} Oct 11 01:05:27 crc kubenswrapper[4743]: I1011 01:05:27.805977 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="55956e4b97ed94ca051b60e2bbc09c861c72bf2fb7f6d77c7b14065896cefaa1" Oct 11 01:05:27 crc kubenswrapper[4743]: I1011 01:05:27.806036 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6ntgk" Oct 11 01:05:29 crc kubenswrapper[4743]: I1011 01:05:29.962519 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-bmzmc"] Oct 11 01:05:29 crc kubenswrapper[4743]: E1011 01:05:29.962966 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c5ed10c-1531-47da-8329-5589a12da9ac" containerName="pull" Oct 11 01:05:29 crc kubenswrapper[4743]: I1011 01:05:29.962977 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c5ed10c-1531-47da-8329-5589a12da9ac" containerName="pull" Oct 11 01:05:29 crc kubenswrapper[4743]: E1011 01:05:29.962991 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c5ed10c-1531-47da-8329-5589a12da9ac" containerName="util" Oct 11 01:05:29 crc kubenswrapper[4743]: I1011 01:05:29.962998 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c5ed10c-1531-47da-8329-5589a12da9ac" containerName="util" Oct 11 01:05:29 crc kubenswrapper[4743]: E1011 01:05:29.963008 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c5ed10c-1531-47da-8329-5589a12da9ac" containerName="extract" Oct 11 01:05:29 crc kubenswrapper[4743]: I1011 01:05:29.963014 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c5ed10c-1531-47da-8329-5589a12da9ac" containerName="extract" Oct 11 01:05:29 crc kubenswrapper[4743]: I1011 01:05:29.963117 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c5ed10c-1531-47da-8329-5589a12da9ac" containerName="extract" Oct 11 01:05:29 crc kubenswrapper[4743]: I1011 01:05:29.963543 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-bmzmc" Oct 11 01:05:29 crc kubenswrapper[4743]: I1011 01:05:29.966058 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Oct 11 01:05:29 crc kubenswrapper[4743]: I1011 01:05:29.966221 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Oct 11 01:05:29 crc kubenswrapper[4743]: I1011 01:05:29.966754 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-b64vr" Oct 11 01:05:30 crc kubenswrapper[4743]: I1011 01:05:30.013028 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-bmzmc"] Oct 11 01:05:30 crc kubenswrapper[4743]: I1011 01:05:30.024337 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kd74j\" (UniqueName: \"kubernetes.io/projected/0389ac5e-634b-4dd6-a9a8-084cb349b29e-kube-api-access-kd74j\") pod \"nmstate-operator-858ddd8f98-bmzmc\" (UID: \"0389ac5e-634b-4dd6-a9a8-084cb349b29e\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-bmzmc" Oct 11 01:05:30 crc kubenswrapper[4743]: I1011 01:05:30.125973 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kd74j\" (UniqueName: \"kubernetes.io/projected/0389ac5e-634b-4dd6-a9a8-084cb349b29e-kube-api-access-kd74j\") pod \"nmstate-operator-858ddd8f98-bmzmc\" (UID: \"0389ac5e-634b-4dd6-a9a8-084cb349b29e\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-bmzmc" Oct 11 01:05:30 crc kubenswrapper[4743]: I1011 01:05:30.148404 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kd74j\" (UniqueName: \"kubernetes.io/projected/0389ac5e-634b-4dd6-a9a8-084cb349b29e-kube-api-access-kd74j\") pod \"nmstate-operator-858ddd8f98-bmzmc\" (UID: 
\"0389ac5e-634b-4dd6-a9a8-084cb349b29e\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-bmzmc" Oct 11 01:05:30 crc kubenswrapper[4743]: I1011 01:05:30.276932 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-bmzmc" Oct 11 01:05:30 crc kubenswrapper[4743]: I1011 01:05:30.570711 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-bmzmc"] Oct 11 01:05:30 crc kubenswrapper[4743]: I1011 01:05:30.827466 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-bmzmc" event={"ID":"0389ac5e-634b-4dd6-a9a8-084cb349b29e","Type":"ContainerStarted","Data":"35986d46d2c59f84e11d9cfefd57e7ec330516539f74dc5df64534d41086130b"} Oct 11 01:05:32 crc kubenswrapper[4743]: I1011 01:05:32.170937 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mbkjs"] Oct 11 01:05:32 crc kubenswrapper[4743]: I1011 01:05:32.172612 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mbkjs" Oct 11 01:05:32 crc kubenswrapper[4743]: I1011 01:05:32.188556 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mbkjs"] Oct 11 01:05:32 crc kubenswrapper[4743]: I1011 01:05:32.257904 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j94m2\" (UniqueName: \"kubernetes.io/projected/0e2570aa-046d-43a2-bf1e-ade9d92f9036-kube-api-access-j94m2\") pod \"community-operators-mbkjs\" (UID: \"0e2570aa-046d-43a2-bf1e-ade9d92f9036\") " pod="openshift-marketplace/community-operators-mbkjs" Oct 11 01:05:32 crc kubenswrapper[4743]: I1011 01:05:32.258006 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e2570aa-046d-43a2-bf1e-ade9d92f9036-utilities\") pod \"community-operators-mbkjs\" (UID: \"0e2570aa-046d-43a2-bf1e-ade9d92f9036\") " pod="openshift-marketplace/community-operators-mbkjs" Oct 11 01:05:32 crc kubenswrapper[4743]: I1011 01:05:32.258040 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e2570aa-046d-43a2-bf1e-ade9d92f9036-catalog-content\") pod \"community-operators-mbkjs\" (UID: \"0e2570aa-046d-43a2-bf1e-ade9d92f9036\") " pod="openshift-marketplace/community-operators-mbkjs" Oct 11 01:05:32 crc kubenswrapper[4743]: I1011 01:05:32.359174 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j94m2\" (UniqueName: \"kubernetes.io/projected/0e2570aa-046d-43a2-bf1e-ade9d92f9036-kube-api-access-j94m2\") pod \"community-operators-mbkjs\" (UID: \"0e2570aa-046d-43a2-bf1e-ade9d92f9036\") " pod="openshift-marketplace/community-operators-mbkjs" Oct 11 01:05:32 crc kubenswrapper[4743]: I1011 01:05:32.359243 4743 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e2570aa-046d-43a2-bf1e-ade9d92f9036-utilities\") pod \"community-operators-mbkjs\" (UID: \"0e2570aa-046d-43a2-bf1e-ade9d92f9036\") " pod="openshift-marketplace/community-operators-mbkjs" Oct 11 01:05:32 crc kubenswrapper[4743]: I1011 01:05:32.359265 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e2570aa-046d-43a2-bf1e-ade9d92f9036-catalog-content\") pod \"community-operators-mbkjs\" (UID: \"0e2570aa-046d-43a2-bf1e-ade9d92f9036\") " pod="openshift-marketplace/community-operators-mbkjs" Oct 11 01:05:32 crc kubenswrapper[4743]: I1011 01:05:32.359687 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e2570aa-046d-43a2-bf1e-ade9d92f9036-catalog-content\") pod \"community-operators-mbkjs\" (UID: \"0e2570aa-046d-43a2-bf1e-ade9d92f9036\") " pod="openshift-marketplace/community-operators-mbkjs" Oct 11 01:05:32 crc kubenswrapper[4743]: I1011 01:05:32.359713 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e2570aa-046d-43a2-bf1e-ade9d92f9036-utilities\") pod \"community-operators-mbkjs\" (UID: \"0e2570aa-046d-43a2-bf1e-ade9d92f9036\") " pod="openshift-marketplace/community-operators-mbkjs" Oct 11 01:05:32 crc kubenswrapper[4743]: I1011 01:05:32.395757 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j94m2\" (UniqueName: \"kubernetes.io/projected/0e2570aa-046d-43a2-bf1e-ade9d92f9036-kube-api-access-j94m2\") pod \"community-operators-mbkjs\" (UID: \"0e2570aa-046d-43a2-bf1e-ade9d92f9036\") " pod="openshift-marketplace/community-operators-mbkjs" Oct 11 01:05:32 crc kubenswrapper[4743]: I1011 01:05:32.542155 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mbkjs" Oct 11 01:05:32 crc kubenswrapper[4743]: I1011 01:05:32.720096 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rvzdb" Oct 11 01:05:32 crc kubenswrapper[4743]: I1011 01:05:32.720218 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rvzdb" Oct 11 01:05:32 crc kubenswrapper[4743]: I1011 01:05:32.779339 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rvzdb" Oct 11 01:05:32 crc kubenswrapper[4743]: I1011 01:05:32.877066 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rvzdb" Oct 11 01:05:33 crc kubenswrapper[4743]: I1011 01:05:33.408314 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mbkjs"] Oct 11 01:05:33 crc kubenswrapper[4743]: I1011 01:05:33.848465 4743 generic.go:334] "Generic (PLEG): container finished" podID="0e2570aa-046d-43a2-bf1e-ade9d92f9036" containerID="5f5141e5554a87c41a736c62e41cbe615d5d50c0103fd5fcb25fc08f1be884e3" exitCode=0 Oct 11 01:05:33 crc kubenswrapper[4743]: I1011 01:05:33.848556 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mbkjs" event={"ID":"0e2570aa-046d-43a2-bf1e-ade9d92f9036","Type":"ContainerDied","Data":"5f5141e5554a87c41a736c62e41cbe615d5d50c0103fd5fcb25fc08f1be884e3"} Oct 11 01:05:33 crc kubenswrapper[4743]: I1011 01:05:33.849099 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mbkjs" event={"ID":"0e2570aa-046d-43a2-bf1e-ade9d92f9036","Type":"ContainerStarted","Data":"3ff8d1a246a3242fa0358d6237437839f7b78dbbcb8b4634b9c974bbdc1c832a"} Oct 11 01:05:33 crc kubenswrapper[4743]: I1011 01:05:33.853590 4743 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-bmzmc" event={"ID":"0389ac5e-634b-4dd6-a9a8-084cb349b29e","Type":"ContainerStarted","Data":"f2ba5b6b3cdfecf55eba3df27305f283e6fdc2882774cff9b2eeffeb32c01c38"} Oct 11 01:05:33 crc kubenswrapper[4743]: I1011 01:05:33.918007 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-858ddd8f98-bmzmc" podStartSLOduration=2.515514374 podStartE2EDuration="4.917979178s" podCreationTimestamp="2025-10-11 01:05:29 +0000 UTC" firstStartedPulling="2025-10-11 01:05:30.581921709 +0000 UTC m=+825.234902146" lastFinishedPulling="2025-10-11 01:05:32.984386563 +0000 UTC m=+827.637366950" observedRunningTime="2025-10-11 01:05:33.91537202 +0000 UTC m=+828.568352427" watchObservedRunningTime="2025-10-11 01:05:33.917979178 +0000 UTC m=+828.570959635" Oct 11 01:05:34 crc kubenswrapper[4743]: I1011 01:05:34.362323 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rvzdb"] Oct 11 01:05:34 crc kubenswrapper[4743]: I1011 01:05:34.863023 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mbkjs" event={"ID":"0e2570aa-046d-43a2-bf1e-ade9d92f9036","Type":"ContainerStarted","Data":"a008e0b41cac7c8df79f35746b8d6832130d6eda0c91dff5dd071253b40994b3"} Oct 11 01:05:35 crc kubenswrapper[4743]: I1011 01:05:35.872022 4743 generic.go:334] "Generic (PLEG): container finished" podID="0e2570aa-046d-43a2-bf1e-ade9d92f9036" containerID="a008e0b41cac7c8df79f35746b8d6832130d6eda0c91dff5dd071253b40994b3" exitCode=0 Oct 11 01:05:35 crc kubenswrapper[4743]: I1011 01:05:35.872107 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mbkjs" event={"ID":"0e2570aa-046d-43a2-bf1e-ade9d92f9036","Type":"ContainerDied","Data":"a008e0b41cac7c8df79f35746b8d6832130d6eda0c91dff5dd071253b40994b3"} Oct 11 01:05:35 crc kubenswrapper[4743]: I1011 
01:05:35.872395 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rvzdb" podUID="a0b42080-afd5-431e-9b1f-6ba5907b7331" containerName="registry-server" containerID="cri-o://f42da51d613ef1ee9995898cbe9f617aa8019b50dc68e381faf0c30786602dcd" gracePeriod=2 Oct 11 01:05:36 crc kubenswrapper[4743]: I1011 01:05:36.314453 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rvzdb" Oct 11 01:05:36 crc kubenswrapper[4743]: I1011 01:05:36.424883 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0b42080-afd5-431e-9b1f-6ba5907b7331-catalog-content\") pod \"a0b42080-afd5-431e-9b1f-6ba5907b7331\" (UID: \"a0b42080-afd5-431e-9b1f-6ba5907b7331\") " Oct 11 01:05:36 crc kubenswrapper[4743]: I1011 01:05:36.425070 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dcww9\" (UniqueName: \"kubernetes.io/projected/a0b42080-afd5-431e-9b1f-6ba5907b7331-kube-api-access-dcww9\") pod \"a0b42080-afd5-431e-9b1f-6ba5907b7331\" (UID: \"a0b42080-afd5-431e-9b1f-6ba5907b7331\") " Oct 11 01:05:36 crc kubenswrapper[4743]: I1011 01:05:36.425172 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0b42080-afd5-431e-9b1f-6ba5907b7331-utilities\") pod \"a0b42080-afd5-431e-9b1f-6ba5907b7331\" (UID: \"a0b42080-afd5-431e-9b1f-6ba5907b7331\") " Oct 11 01:05:36 crc kubenswrapper[4743]: I1011 01:05:36.426101 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0b42080-afd5-431e-9b1f-6ba5907b7331-utilities" (OuterVolumeSpecName: "utilities") pod "a0b42080-afd5-431e-9b1f-6ba5907b7331" (UID: "a0b42080-afd5-431e-9b1f-6ba5907b7331"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:05:36 crc kubenswrapper[4743]: I1011 01:05:36.431468 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0b42080-afd5-431e-9b1f-6ba5907b7331-kube-api-access-dcww9" (OuterVolumeSpecName: "kube-api-access-dcww9") pod "a0b42080-afd5-431e-9b1f-6ba5907b7331" (UID: "a0b42080-afd5-431e-9b1f-6ba5907b7331"). InnerVolumeSpecName "kube-api-access-dcww9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:05:36 crc kubenswrapper[4743]: I1011 01:05:36.440762 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0b42080-afd5-431e-9b1f-6ba5907b7331-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a0b42080-afd5-431e-9b1f-6ba5907b7331" (UID: "a0b42080-afd5-431e-9b1f-6ba5907b7331"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:05:36 crc kubenswrapper[4743]: I1011 01:05:36.526465 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dcww9\" (UniqueName: \"kubernetes.io/projected/a0b42080-afd5-431e-9b1f-6ba5907b7331-kube-api-access-dcww9\") on node \"crc\" DevicePath \"\"" Oct 11 01:05:36 crc kubenswrapper[4743]: I1011 01:05:36.526504 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0b42080-afd5-431e-9b1f-6ba5907b7331-utilities\") on node \"crc\" DevicePath \"\"" Oct 11 01:05:36 crc kubenswrapper[4743]: I1011 01:05:36.526543 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0b42080-afd5-431e-9b1f-6ba5907b7331-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 11 01:05:36 crc kubenswrapper[4743]: I1011 01:05:36.882842 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mbkjs" 
event={"ID":"0e2570aa-046d-43a2-bf1e-ade9d92f9036","Type":"ContainerStarted","Data":"2d033cbcb3ef0246a2b84658f644f879b6cb0953c7952e1449c7d49d7f240408"} Oct 11 01:05:36 crc kubenswrapper[4743]: I1011 01:05:36.886866 4743 generic.go:334] "Generic (PLEG): container finished" podID="a0b42080-afd5-431e-9b1f-6ba5907b7331" containerID="f42da51d613ef1ee9995898cbe9f617aa8019b50dc68e381faf0c30786602dcd" exitCode=0 Oct 11 01:05:36 crc kubenswrapper[4743]: I1011 01:05:36.886908 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rvzdb" event={"ID":"a0b42080-afd5-431e-9b1f-6ba5907b7331","Type":"ContainerDied","Data":"f42da51d613ef1ee9995898cbe9f617aa8019b50dc68e381faf0c30786602dcd"} Oct 11 01:05:36 crc kubenswrapper[4743]: I1011 01:05:36.886931 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rvzdb" event={"ID":"a0b42080-afd5-431e-9b1f-6ba5907b7331","Type":"ContainerDied","Data":"3773ea4e08f8735a72f141b323f4356e6ddc6fab300717f2dd54c71e1944bc67"} Oct 11 01:05:36 crc kubenswrapper[4743]: I1011 01:05:36.886951 4743 scope.go:117] "RemoveContainer" containerID="f42da51d613ef1ee9995898cbe9f617aa8019b50dc68e381faf0c30786602dcd" Oct 11 01:05:36 crc kubenswrapper[4743]: I1011 01:05:36.887040 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rvzdb" Oct 11 01:05:36 crc kubenswrapper[4743]: I1011 01:05:36.913609 4743 scope.go:117] "RemoveContainer" containerID="2a19fc01e7ada84faa1c3e4b6510519fcb2eebedacc9ccf342f91f6eafc2b82c" Oct 11 01:05:36 crc kubenswrapper[4743]: I1011 01:05:36.923937 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mbkjs" podStartSLOduration=2.430740936 podStartE2EDuration="4.923905013s" podCreationTimestamp="2025-10-11 01:05:32 +0000 UTC" firstStartedPulling="2025-10-11 01:05:33.85098316 +0000 UTC m=+828.503963587" lastFinishedPulling="2025-10-11 01:05:36.344147227 +0000 UTC m=+830.997127664" observedRunningTime="2025-10-11 01:05:36.916673656 +0000 UTC m=+831.569654063" watchObservedRunningTime="2025-10-11 01:05:36.923905013 +0000 UTC m=+831.576885450" Oct 11 01:05:36 crc kubenswrapper[4743]: I1011 01:05:36.948117 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rvzdb"] Oct 11 01:05:36 crc kubenswrapper[4743]: I1011 01:05:36.951354 4743 scope.go:117] "RemoveContainer" containerID="f4619419080ecf49e58a4d8fcd8fdd77831e549a0fc91b2d8ec4976d2de194e5" Oct 11 01:05:36 crc kubenswrapper[4743]: I1011 01:05:36.952763 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rvzdb"] Oct 11 01:05:36 crc kubenswrapper[4743]: I1011 01:05:36.992047 4743 scope.go:117] "RemoveContainer" containerID="f42da51d613ef1ee9995898cbe9f617aa8019b50dc68e381faf0c30786602dcd" Oct 11 01:05:36 crc kubenswrapper[4743]: E1011 01:05:36.992652 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f42da51d613ef1ee9995898cbe9f617aa8019b50dc68e381faf0c30786602dcd\": container with ID starting with f42da51d613ef1ee9995898cbe9f617aa8019b50dc68e381faf0c30786602dcd not found: ID does not exist" 
containerID="f42da51d613ef1ee9995898cbe9f617aa8019b50dc68e381faf0c30786602dcd" Oct 11 01:05:36 crc kubenswrapper[4743]: I1011 01:05:36.992701 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f42da51d613ef1ee9995898cbe9f617aa8019b50dc68e381faf0c30786602dcd"} err="failed to get container status \"f42da51d613ef1ee9995898cbe9f617aa8019b50dc68e381faf0c30786602dcd\": rpc error: code = NotFound desc = could not find container \"f42da51d613ef1ee9995898cbe9f617aa8019b50dc68e381faf0c30786602dcd\": container with ID starting with f42da51d613ef1ee9995898cbe9f617aa8019b50dc68e381faf0c30786602dcd not found: ID does not exist" Oct 11 01:05:36 crc kubenswrapper[4743]: I1011 01:05:36.992734 4743 scope.go:117] "RemoveContainer" containerID="2a19fc01e7ada84faa1c3e4b6510519fcb2eebedacc9ccf342f91f6eafc2b82c" Oct 11 01:05:36 crc kubenswrapper[4743]: E1011 01:05:36.996712 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a19fc01e7ada84faa1c3e4b6510519fcb2eebedacc9ccf342f91f6eafc2b82c\": container with ID starting with 2a19fc01e7ada84faa1c3e4b6510519fcb2eebedacc9ccf342f91f6eafc2b82c not found: ID does not exist" containerID="2a19fc01e7ada84faa1c3e4b6510519fcb2eebedacc9ccf342f91f6eafc2b82c" Oct 11 01:05:36 crc kubenswrapper[4743]: I1011 01:05:36.996772 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a19fc01e7ada84faa1c3e4b6510519fcb2eebedacc9ccf342f91f6eafc2b82c"} err="failed to get container status \"2a19fc01e7ada84faa1c3e4b6510519fcb2eebedacc9ccf342f91f6eafc2b82c\": rpc error: code = NotFound desc = could not find container \"2a19fc01e7ada84faa1c3e4b6510519fcb2eebedacc9ccf342f91f6eafc2b82c\": container with ID starting with 2a19fc01e7ada84faa1c3e4b6510519fcb2eebedacc9ccf342f91f6eafc2b82c not found: ID does not exist" Oct 11 01:05:36 crc kubenswrapper[4743]: I1011 01:05:36.996801 4743 scope.go:117] 
"RemoveContainer" containerID="f4619419080ecf49e58a4d8fcd8fdd77831e549a0fc91b2d8ec4976d2de194e5" Oct 11 01:05:37 crc kubenswrapper[4743]: E1011 01:05:37.000114 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4619419080ecf49e58a4d8fcd8fdd77831e549a0fc91b2d8ec4976d2de194e5\": container with ID starting with f4619419080ecf49e58a4d8fcd8fdd77831e549a0fc91b2d8ec4976d2de194e5 not found: ID does not exist" containerID="f4619419080ecf49e58a4d8fcd8fdd77831e549a0fc91b2d8ec4976d2de194e5" Oct 11 01:05:37 crc kubenswrapper[4743]: I1011 01:05:37.000150 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4619419080ecf49e58a4d8fcd8fdd77831e549a0fc91b2d8ec4976d2de194e5"} err="failed to get container status \"f4619419080ecf49e58a4d8fcd8fdd77831e549a0fc91b2d8ec4976d2de194e5\": rpc error: code = NotFound desc = could not find container \"f4619419080ecf49e58a4d8fcd8fdd77831e549a0fc91b2d8ec4976d2de194e5\": container with ID starting with f4619419080ecf49e58a4d8fcd8fdd77831e549a0fc91b2d8ec4976d2de194e5 not found: ID does not exist" Oct 11 01:05:38 crc kubenswrapper[4743]: I1011 01:05:38.098988 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0b42080-afd5-431e-9b1f-6ba5907b7331" path="/var/lib/kubelet/pods/a0b42080-afd5-431e-9b1f-6ba5907b7331/volumes" Oct 11 01:05:40 crc kubenswrapper[4743]: I1011 01:05:40.375343 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-2v928"] Oct 11 01:05:40 crc kubenswrapper[4743]: E1011 01:05:40.375900 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0b42080-afd5-431e-9b1f-6ba5907b7331" containerName="registry-server" Oct 11 01:05:40 crc kubenswrapper[4743]: I1011 01:05:40.375917 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0b42080-afd5-431e-9b1f-6ba5907b7331" containerName="registry-server" Oct 11 01:05:40 crc 
kubenswrapper[4743]: E1011 01:05:40.375934 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0b42080-afd5-431e-9b1f-6ba5907b7331" containerName="extract-content" Oct 11 01:05:40 crc kubenswrapper[4743]: I1011 01:05:40.375944 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0b42080-afd5-431e-9b1f-6ba5907b7331" containerName="extract-content" Oct 11 01:05:40 crc kubenswrapper[4743]: E1011 01:05:40.375960 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0b42080-afd5-431e-9b1f-6ba5907b7331" containerName="extract-utilities" Oct 11 01:05:40 crc kubenswrapper[4743]: I1011 01:05:40.375969 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0b42080-afd5-431e-9b1f-6ba5907b7331" containerName="extract-utilities" Oct 11 01:05:40 crc kubenswrapper[4743]: I1011 01:05:40.376099 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0b42080-afd5-431e-9b1f-6ba5907b7331" containerName="registry-server" Oct 11 01:05:40 crc kubenswrapper[4743]: I1011 01:05:40.376781 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-2v928" Oct 11 01:05:40 crc kubenswrapper[4743]: I1011 01:05:40.385910 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-nl9th"] Oct 11 01:05:40 crc kubenswrapper[4743]: I1011 01:05:40.387816 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-nl9th" Oct 11 01:05:40 crc kubenswrapper[4743]: I1011 01:05:40.399415 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Oct 11 01:05:40 crc kubenswrapper[4743]: I1011 01:05:40.399712 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-hds8c" Oct 11 01:05:40 crc kubenswrapper[4743]: I1011 01:05:40.409812 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/6d192208-f92a-4299-866d-14cf8ecffe17-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-nl9th\" (UID: \"6d192208-f92a-4299-866d-14cf8ecffe17\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-nl9th" Oct 11 01:05:40 crc kubenswrapper[4743]: I1011 01:05:40.409936 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q54gx\" (UniqueName: \"kubernetes.io/projected/c2323cd1-eebb-46d7-9393-586093c921f1-kube-api-access-q54gx\") pod \"nmstate-metrics-fdff9cb8d-2v928\" (UID: \"c2323cd1-eebb-46d7-9393-586093c921f1\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-2v928" Oct 11 01:05:40 crc kubenswrapper[4743]: I1011 01:05:40.410017 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgbg9\" (UniqueName: \"kubernetes.io/projected/6d192208-f92a-4299-866d-14cf8ecffe17-kube-api-access-bgbg9\") pod \"nmstate-webhook-6cdbc54649-nl9th\" (UID: \"6d192208-f92a-4299-866d-14cf8ecffe17\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-nl9th" Oct 11 01:05:40 crc kubenswrapper[4743]: I1011 01:05:40.422051 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-j4sk5"] Oct 11 01:05:40 crc kubenswrapper[4743]: I1011 01:05:40.422997 4743 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-j4sk5" Oct 11 01:05:40 crc kubenswrapper[4743]: I1011 01:05:40.434637 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-nl9th"] Oct 11 01:05:40 crc kubenswrapper[4743]: I1011 01:05:40.443711 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-2v928"] Oct 11 01:05:40 crc kubenswrapper[4743]: I1011 01:05:40.515458 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-gvckr"] Oct 11 01:05:40 crc kubenswrapper[4743]: I1011 01:05:40.516187 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-gvckr" Oct 11 01:05:40 crc kubenswrapper[4743]: I1011 01:05:40.524558 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Oct 11 01:05:40 crc kubenswrapper[4743]: I1011 01:05:40.524637 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-bctfq" Oct 11 01:05:40 crc kubenswrapper[4743]: I1011 01:05:40.524746 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Oct 11 01:05:40 crc kubenswrapper[4743]: I1011 01:05:40.526575 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/8e3fe60f-ace7-445d-8994-03a95ff90479-dbus-socket\") pod \"nmstate-handler-j4sk5\" (UID: \"8e3fe60f-ace7-445d-8994-03a95ff90479\") " pod="openshift-nmstate/nmstate-handler-j4sk5" Oct 11 01:05:40 crc kubenswrapper[4743]: I1011 01:05:40.526624 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/8e3fe60f-ace7-445d-8994-03a95ff90479-ovs-socket\") 
pod \"nmstate-handler-j4sk5\" (UID: \"8e3fe60f-ace7-445d-8994-03a95ff90479\") " pod="openshift-nmstate/nmstate-handler-j4sk5" Oct 11 01:05:40 crc kubenswrapper[4743]: I1011 01:05:40.526685 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kc9zl\" (UniqueName: \"kubernetes.io/projected/8e3fe60f-ace7-445d-8994-03a95ff90479-kube-api-access-kc9zl\") pod \"nmstate-handler-j4sk5\" (UID: \"8e3fe60f-ace7-445d-8994-03a95ff90479\") " pod="openshift-nmstate/nmstate-handler-j4sk5" Oct 11 01:05:40 crc kubenswrapper[4743]: I1011 01:05:40.526715 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/6d192208-f92a-4299-866d-14cf8ecffe17-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-nl9th\" (UID: \"6d192208-f92a-4299-866d-14cf8ecffe17\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-nl9th" Oct 11 01:05:40 crc kubenswrapper[4743]: I1011 01:05:40.526770 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/8e3fe60f-ace7-445d-8994-03a95ff90479-nmstate-lock\") pod \"nmstate-handler-j4sk5\" (UID: \"8e3fe60f-ace7-445d-8994-03a95ff90479\") " pod="openshift-nmstate/nmstate-handler-j4sk5" Oct 11 01:05:40 crc kubenswrapper[4743]: I1011 01:05:40.526800 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q54gx\" (UniqueName: \"kubernetes.io/projected/c2323cd1-eebb-46d7-9393-586093c921f1-kube-api-access-q54gx\") pod \"nmstate-metrics-fdff9cb8d-2v928\" (UID: \"c2323cd1-eebb-46d7-9393-586093c921f1\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-2v928" Oct 11 01:05:40 crc kubenswrapper[4743]: I1011 01:05:40.526872 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgbg9\" (UniqueName: 
\"kubernetes.io/projected/6d192208-f92a-4299-866d-14cf8ecffe17-kube-api-access-bgbg9\") pod \"nmstate-webhook-6cdbc54649-nl9th\" (UID: \"6d192208-f92a-4299-866d-14cf8ecffe17\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-nl9th" Oct 11 01:05:40 crc kubenswrapper[4743]: I1011 01:05:40.531208 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-gvckr"] Oct 11 01:05:40 crc kubenswrapper[4743]: I1011 01:05:40.542935 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/6d192208-f92a-4299-866d-14cf8ecffe17-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-nl9th\" (UID: \"6d192208-f92a-4299-866d-14cf8ecffe17\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-nl9th" Oct 11 01:05:40 crc kubenswrapper[4743]: I1011 01:05:40.546305 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgbg9\" (UniqueName: \"kubernetes.io/projected/6d192208-f92a-4299-866d-14cf8ecffe17-kube-api-access-bgbg9\") pod \"nmstate-webhook-6cdbc54649-nl9th\" (UID: \"6d192208-f92a-4299-866d-14cf8ecffe17\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-nl9th" Oct 11 01:05:40 crc kubenswrapper[4743]: I1011 01:05:40.549131 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q54gx\" (UniqueName: \"kubernetes.io/projected/c2323cd1-eebb-46d7-9393-586093c921f1-kube-api-access-q54gx\") pod \"nmstate-metrics-fdff9cb8d-2v928\" (UID: \"c2323cd1-eebb-46d7-9393-586093c921f1\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-2v928" Oct 11 01:05:40 crc kubenswrapper[4743]: I1011 01:05:40.629692 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/8e3fe60f-ace7-445d-8994-03a95ff90479-dbus-socket\") pod \"nmstate-handler-j4sk5\" (UID: \"8e3fe60f-ace7-445d-8994-03a95ff90479\") " 
pod="openshift-nmstate/nmstate-handler-j4sk5" Oct 11 01:05:40 crc kubenswrapper[4743]: I1011 01:05:40.629996 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/8e3fe60f-ace7-445d-8994-03a95ff90479-ovs-socket\") pod \"nmstate-handler-j4sk5\" (UID: \"8e3fe60f-ace7-445d-8994-03a95ff90479\") " pod="openshift-nmstate/nmstate-handler-j4sk5" Oct 11 01:05:40 crc kubenswrapper[4743]: I1011 01:05:40.630085 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/f12fff02-4cd4-437d-b704-99766f165f0e-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-gvckr\" (UID: \"f12fff02-4cd4-437d-b704-99766f165f0e\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-gvckr" Oct 11 01:05:40 crc kubenswrapper[4743]: I1011 01:05:40.630163 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9s7cf\" (UniqueName: \"kubernetes.io/projected/f12fff02-4cd4-437d-b704-99766f165f0e-kube-api-access-9s7cf\") pod \"nmstate-console-plugin-6b874cbd85-gvckr\" (UID: \"f12fff02-4cd4-437d-b704-99766f165f0e\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-gvckr" Oct 11 01:05:40 crc kubenswrapper[4743]: I1011 01:05:40.630236 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kc9zl\" (UniqueName: \"kubernetes.io/projected/8e3fe60f-ace7-445d-8994-03a95ff90479-kube-api-access-kc9zl\") pod \"nmstate-handler-j4sk5\" (UID: \"8e3fe60f-ace7-445d-8994-03a95ff90479\") " pod="openshift-nmstate/nmstate-handler-j4sk5" Oct 11 01:05:40 crc kubenswrapper[4743]: I1011 01:05:40.630332 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/8e3fe60f-ace7-445d-8994-03a95ff90479-nmstate-lock\") pod \"nmstate-handler-j4sk5\" 
(UID: \"8e3fe60f-ace7-445d-8994-03a95ff90479\") " pod="openshift-nmstate/nmstate-handler-j4sk5" Oct 11 01:05:40 crc kubenswrapper[4743]: I1011 01:05:40.630441 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/f12fff02-4cd4-437d-b704-99766f165f0e-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-gvckr\" (UID: \"f12fff02-4cd4-437d-b704-99766f165f0e\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-gvckr" Oct 11 01:05:40 crc kubenswrapper[4743]: I1011 01:05:40.630801 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/8e3fe60f-ace7-445d-8994-03a95ff90479-dbus-socket\") pod \"nmstate-handler-j4sk5\" (UID: \"8e3fe60f-ace7-445d-8994-03a95ff90479\") " pod="openshift-nmstate/nmstate-handler-j4sk5" Oct 11 01:05:40 crc kubenswrapper[4743]: I1011 01:05:40.630960 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/8e3fe60f-ace7-445d-8994-03a95ff90479-ovs-socket\") pod \"nmstate-handler-j4sk5\" (UID: \"8e3fe60f-ace7-445d-8994-03a95ff90479\") " pod="openshift-nmstate/nmstate-handler-j4sk5" Oct 11 01:05:40 crc kubenswrapper[4743]: I1011 01:05:40.631272 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/8e3fe60f-ace7-445d-8994-03a95ff90479-nmstate-lock\") pod \"nmstate-handler-j4sk5\" (UID: \"8e3fe60f-ace7-445d-8994-03a95ff90479\") " pod="openshift-nmstate/nmstate-handler-j4sk5" Oct 11 01:05:40 crc kubenswrapper[4743]: I1011 01:05:40.655243 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kc9zl\" (UniqueName: \"kubernetes.io/projected/8e3fe60f-ace7-445d-8994-03a95ff90479-kube-api-access-kc9zl\") pod \"nmstate-handler-j4sk5\" (UID: \"8e3fe60f-ace7-445d-8994-03a95ff90479\") " 
pod="openshift-nmstate/nmstate-handler-j4sk5" Oct 11 01:05:40 crc kubenswrapper[4743]: I1011 01:05:40.704104 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-7bff76c5fd-k9d6v"] Oct 11 01:05:40 crc kubenswrapper[4743]: I1011 01:05:40.705312 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7bff76c5fd-k9d6v" Oct 11 01:05:40 crc kubenswrapper[4743]: I1011 01:05:40.709195 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-2v928" Oct 11 01:05:40 crc kubenswrapper[4743]: I1011 01:05:40.729761 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-nl9th" Oct 11 01:05:40 crc kubenswrapper[4743]: I1011 01:05:40.731092 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcjbg\" (UniqueName: \"kubernetes.io/projected/4e0ffbae-47a1-49cd-93b6-46bc31e2ab74-kube-api-access-mcjbg\") pod \"console-7bff76c5fd-k9d6v\" (UID: \"4e0ffbae-47a1-49cd-93b6-46bc31e2ab74\") " pod="openshift-console/console-7bff76c5fd-k9d6v" Oct 11 01:05:40 crc kubenswrapper[4743]: I1011 01:05:40.731125 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/f12fff02-4cd4-437d-b704-99766f165f0e-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-gvckr\" (UID: \"f12fff02-4cd4-437d-b704-99766f165f0e\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-gvckr" Oct 11 01:05:40 crc kubenswrapper[4743]: I1011 01:05:40.731147 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4e0ffbae-47a1-49cd-93b6-46bc31e2ab74-console-oauth-config\") pod \"console-7bff76c5fd-k9d6v\" (UID: \"4e0ffbae-47a1-49cd-93b6-46bc31e2ab74\") " 
pod="openshift-console/console-7bff76c5fd-k9d6v" Oct 11 01:05:40 crc kubenswrapper[4743]: I1011 01:05:40.731176 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/f12fff02-4cd4-437d-b704-99766f165f0e-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-gvckr\" (UID: \"f12fff02-4cd4-437d-b704-99766f165f0e\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-gvckr" Oct 11 01:05:40 crc kubenswrapper[4743]: I1011 01:05:40.731201 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9s7cf\" (UniqueName: \"kubernetes.io/projected/f12fff02-4cd4-437d-b704-99766f165f0e-kube-api-access-9s7cf\") pod \"nmstate-console-plugin-6b874cbd85-gvckr\" (UID: \"f12fff02-4cd4-437d-b704-99766f165f0e\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-gvckr" Oct 11 01:05:40 crc kubenswrapper[4743]: I1011 01:05:40.731229 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4e0ffbae-47a1-49cd-93b6-46bc31e2ab74-service-ca\") pod \"console-7bff76c5fd-k9d6v\" (UID: \"4e0ffbae-47a1-49cd-93b6-46bc31e2ab74\") " pod="openshift-console/console-7bff76c5fd-k9d6v" Oct 11 01:05:40 crc kubenswrapper[4743]: I1011 01:05:40.731246 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4e0ffbae-47a1-49cd-93b6-46bc31e2ab74-console-serving-cert\") pod \"console-7bff76c5fd-k9d6v\" (UID: \"4e0ffbae-47a1-49cd-93b6-46bc31e2ab74\") " pod="openshift-console/console-7bff76c5fd-k9d6v" Oct 11 01:05:40 crc kubenswrapper[4743]: I1011 01:05:40.731280 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/4e0ffbae-47a1-49cd-93b6-46bc31e2ab74-console-config\") pod \"console-7bff76c5fd-k9d6v\" (UID: \"4e0ffbae-47a1-49cd-93b6-46bc31e2ab74\") " pod="openshift-console/console-7bff76c5fd-k9d6v" Oct 11 01:05:40 crc kubenswrapper[4743]: I1011 01:05:40.731296 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4e0ffbae-47a1-49cd-93b6-46bc31e2ab74-trusted-ca-bundle\") pod \"console-7bff76c5fd-k9d6v\" (UID: \"4e0ffbae-47a1-49cd-93b6-46bc31e2ab74\") " pod="openshift-console/console-7bff76c5fd-k9d6v" Oct 11 01:05:40 crc kubenswrapper[4743]: I1011 01:05:40.731328 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4e0ffbae-47a1-49cd-93b6-46bc31e2ab74-oauth-serving-cert\") pod \"console-7bff76c5fd-k9d6v\" (UID: \"4e0ffbae-47a1-49cd-93b6-46bc31e2ab74\") " pod="openshift-console/console-7bff76c5fd-k9d6v" Oct 11 01:05:40 crc kubenswrapper[4743]: I1011 01:05:40.731765 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/f12fff02-4cd4-437d-b704-99766f165f0e-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-gvckr\" (UID: \"f12fff02-4cd4-437d-b704-99766f165f0e\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-gvckr" Oct 11 01:05:40 crc kubenswrapper[4743]: I1011 01:05:40.738547 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/f12fff02-4cd4-437d-b704-99766f165f0e-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-gvckr\" (UID: \"f12fff02-4cd4-437d-b704-99766f165f0e\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-gvckr" Oct 11 01:05:40 crc kubenswrapper[4743]: I1011 01:05:40.750770 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-j4sk5" Oct 11 01:05:40 crc kubenswrapper[4743]: I1011 01:05:40.751552 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9s7cf\" (UniqueName: \"kubernetes.io/projected/f12fff02-4cd4-437d-b704-99766f165f0e-kube-api-access-9s7cf\") pod \"nmstate-console-plugin-6b874cbd85-gvckr\" (UID: \"f12fff02-4cd4-437d-b704-99766f165f0e\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-gvckr" Oct 11 01:05:40 crc kubenswrapper[4743]: I1011 01:05:40.757051 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7bff76c5fd-k9d6v"] Oct 11 01:05:40 crc kubenswrapper[4743]: I1011 01:05:40.833147 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4e0ffbae-47a1-49cd-93b6-46bc31e2ab74-service-ca\") pod \"console-7bff76c5fd-k9d6v\" (UID: \"4e0ffbae-47a1-49cd-93b6-46bc31e2ab74\") " pod="openshift-console/console-7bff76c5fd-k9d6v" Oct 11 01:05:40 crc kubenswrapper[4743]: I1011 01:05:40.833188 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4e0ffbae-47a1-49cd-93b6-46bc31e2ab74-console-serving-cert\") pod \"console-7bff76c5fd-k9d6v\" (UID: \"4e0ffbae-47a1-49cd-93b6-46bc31e2ab74\") " pod="openshift-console/console-7bff76c5fd-k9d6v" Oct 11 01:05:40 crc kubenswrapper[4743]: I1011 01:05:40.833208 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4e0ffbae-47a1-49cd-93b6-46bc31e2ab74-console-config\") pod \"console-7bff76c5fd-k9d6v\" (UID: \"4e0ffbae-47a1-49cd-93b6-46bc31e2ab74\") " pod="openshift-console/console-7bff76c5fd-k9d6v" Oct 11 01:05:40 crc kubenswrapper[4743]: I1011 01:05:40.833229 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/4e0ffbae-47a1-49cd-93b6-46bc31e2ab74-trusted-ca-bundle\") pod \"console-7bff76c5fd-k9d6v\" (UID: \"4e0ffbae-47a1-49cd-93b6-46bc31e2ab74\") " pod="openshift-console/console-7bff76c5fd-k9d6v" Oct 11 01:05:40 crc kubenswrapper[4743]: I1011 01:05:40.833268 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4e0ffbae-47a1-49cd-93b6-46bc31e2ab74-oauth-serving-cert\") pod \"console-7bff76c5fd-k9d6v\" (UID: \"4e0ffbae-47a1-49cd-93b6-46bc31e2ab74\") " pod="openshift-console/console-7bff76c5fd-k9d6v" Oct 11 01:05:40 crc kubenswrapper[4743]: I1011 01:05:40.833298 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcjbg\" (UniqueName: \"kubernetes.io/projected/4e0ffbae-47a1-49cd-93b6-46bc31e2ab74-kube-api-access-mcjbg\") pod \"console-7bff76c5fd-k9d6v\" (UID: \"4e0ffbae-47a1-49cd-93b6-46bc31e2ab74\") " pod="openshift-console/console-7bff76c5fd-k9d6v" Oct 11 01:05:40 crc kubenswrapper[4743]: I1011 01:05:40.833318 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4e0ffbae-47a1-49cd-93b6-46bc31e2ab74-console-oauth-config\") pod \"console-7bff76c5fd-k9d6v\" (UID: \"4e0ffbae-47a1-49cd-93b6-46bc31e2ab74\") " pod="openshift-console/console-7bff76c5fd-k9d6v" Oct 11 01:05:40 crc kubenswrapper[4743]: I1011 01:05:40.834651 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4e0ffbae-47a1-49cd-93b6-46bc31e2ab74-console-config\") pod \"console-7bff76c5fd-k9d6v\" (UID: \"4e0ffbae-47a1-49cd-93b6-46bc31e2ab74\") " pod="openshift-console/console-7bff76c5fd-k9d6v" Oct 11 01:05:40 crc kubenswrapper[4743]: I1011 01:05:40.834671 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/4e0ffbae-47a1-49cd-93b6-46bc31e2ab74-oauth-serving-cert\") pod \"console-7bff76c5fd-k9d6v\" (UID: \"4e0ffbae-47a1-49cd-93b6-46bc31e2ab74\") " pod="openshift-console/console-7bff76c5fd-k9d6v" Oct 11 01:05:40 crc kubenswrapper[4743]: I1011 01:05:40.835236 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4e0ffbae-47a1-49cd-93b6-46bc31e2ab74-service-ca\") pod \"console-7bff76c5fd-k9d6v\" (UID: \"4e0ffbae-47a1-49cd-93b6-46bc31e2ab74\") " pod="openshift-console/console-7bff76c5fd-k9d6v" Oct 11 01:05:40 crc kubenswrapper[4743]: I1011 01:05:40.835906 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4e0ffbae-47a1-49cd-93b6-46bc31e2ab74-trusted-ca-bundle\") pod \"console-7bff76c5fd-k9d6v\" (UID: \"4e0ffbae-47a1-49cd-93b6-46bc31e2ab74\") " pod="openshift-console/console-7bff76c5fd-k9d6v" Oct 11 01:05:40 crc kubenswrapper[4743]: I1011 01:05:40.841897 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4e0ffbae-47a1-49cd-93b6-46bc31e2ab74-console-oauth-config\") pod \"console-7bff76c5fd-k9d6v\" (UID: \"4e0ffbae-47a1-49cd-93b6-46bc31e2ab74\") " pod="openshift-console/console-7bff76c5fd-k9d6v" Oct 11 01:05:40 crc kubenswrapper[4743]: I1011 01:05:40.842447 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4e0ffbae-47a1-49cd-93b6-46bc31e2ab74-console-serving-cert\") pod \"console-7bff76c5fd-k9d6v\" (UID: \"4e0ffbae-47a1-49cd-93b6-46bc31e2ab74\") " pod="openshift-console/console-7bff76c5fd-k9d6v" Oct 11 01:05:40 crc kubenswrapper[4743]: I1011 01:05:40.842899 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-gvckr" Oct 11 01:05:40 crc kubenswrapper[4743]: I1011 01:05:40.852085 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcjbg\" (UniqueName: \"kubernetes.io/projected/4e0ffbae-47a1-49cd-93b6-46bc31e2ab74-kube-api-access-mcjbg\") pod \"console-7bff76c5fd-k9d6v\" (UID: \"4e0ffbae-47a1-49cd-93b6-46bc31e2ab74\") " pod="openshift-console/console-7bff76c5fd-k9d6v" Oct 11 01:05:40 crc kubenswrapper[4743]: I1011 01:05:40.911791 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-j4sk5" event={"ID":"8e3fe60f-ace7-445d-8994-03a95ff90479","Type":"ContainerStarted","Data":"176e425da7321b68c6ab58c3b340eac3e71cb4890d3cfc728c7bd828989e1085"} Oct 11 01:05:41 crc kubenswrapper[4743]: I1011 01:05:41.021311 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7bff76c5fd-k9d6v" Oct 11 01:05:41 crc kubenswrapper[4743]: I1011 01:05:41.096395 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-gvckr"] Oct 11 01:05:41 crc kubenswrapper[4743]: W1011 01:05:41.112015 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf12fff02_4cd4_437d_b704_99766f165f0e.slice/crio-aab15acfb88ea2ecd9d23df7e373a7879e0a7b72df274c6e253af25fd9fded4c WatchSource:0}: Error finding container aab15acfb88ea2ecd9d23df7e373a7879e0a7b72df274c6e253af25fd9fded4c: Status 404 returned error can't find the container with id aab15acfb88ea2ecd9d23df7e373a7879e0a7b72df274c6e253af25fd9fded4c Oct 11 01:05:41 crc kubenswrapper[4743]: I1011 01:05:41.131497 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-2v928"] Oct 11 01:05:41 crc kubenswrapper[4743]: W1011 01:05:41.137821 4743 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2323cd1_eebb_46d7_9393_586093c921f1.slice/crio-810163474c001dc22a2ab292aa5d0af7a6c210e9954e6e47be6edcf30de7c537 WatchSource:0}: Error finding container 810163474c001dc22a2ab292aa5d0af7a6c210e9954e6e47be6edcf30de7c537: Status 404 returned error can't find the container with id 810163474c001dc22a2ab292aa5d0af7a6c210e9954e6e47be6edcf30de7c537 Oct 11 01:05:41 crc kubenswrapper[4743]: I1011 01:05:41.173806 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-nl9th"] Oct 11 01:05:41 crc kubenswrapper[4743]: I1011 01:05:41.286364 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7bff76c5fd-k9d6v"] Oct 11 01:05:41 crc kubenswrapper[4743]: W1011 01:05:41.289013 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e0ffbae_47a1_49cd_93b6_46bc31e2ab74.slice/crio-a5bc4ecfb2ebfb4aad21b01d41c27d8cc141715c024bd41d07f8cdc8a60f0fbe WatchSource:0}: Error finding container a5bc4ecfb2ebfb4aad21b01d41c27d8cc141715c024bd41d07f8cdc8a60f0fbe: Status 404 returned error can't find the container with id a5bc4ecfb2ebfb4aad21b01d41c27d8cc141715c024bd41d07f8cdc8a60f0fbe Oct 11 01:05:41 crc kubenswrapper[4743]: I1011 01:05:41.923355 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7bff76c5fd-k9d6v" event={"ID":"4e0ffbae-47a1-49cd-93b6-46bc31e2ab74","Type":"ContainerStarted","Data":"6024f101ee89d07405f68dd24eff958f2c6395a098f015f88711b3424ab17c14"} Oct 11 01:05:41 crc kubenswrapper[4743]: I1011 01:05:41.923727 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7bff76c5fd-k9d6v" event={"ID":"4e0ffbae-47a1-49cd-93b6-46bc31e2ab74","Type":"ContainerStarted","Data":"a5bc4ecfb2ebfb4aad21b01d41c27d8cc141715c024bd41d07f8cdc8a60f0fbe"} Oct 11 01:05:41 crc kubenswrapper[4743]: I1011 
01:05:41.926071 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-2v928" event={"ID":"c2323cd1-eebb-46d7-9393-586093c921f1","Type":"ContainerStarted","Data":"810163474c001dc22a2ab292aa5d0af7a6c210e9954e6e47be6edcf30de7c537"} Oct 11 01:05:41 crc kubenswrapper[4743]: I1011 01:05:41.927676 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-gvckr" event={"ID":"f12fff02-4cd4-437d-b704-99766f165f0e","Type":"ContainerStarted","Data":"aab15acfb88ea2ecd9d23df7e373a7879e0a7b72df274c6e253af25fd9fded4c"} Oct 11 01:05:41 crc kubenswrapper[4743]: I1011 01:05:41.928631 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-nl9th" event={"ID":"6d192208-f92a-4299-866d-14cf8ecffe17","Type":"ContainerStarted","Data":"9047ecc7381d7c06a45b30ef70a907e6d3f4781d54259331c49c498a054fdcfa"} Oct 11 01:05:41 crc kubenswrapper[4743]: I1011 01:05:41.947781 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7bff76c5fd-k9d6v" podStartSLOduration=1.947756199 podStartE2EDuration="1.947756199s" podCreationTimestamp="2025-10-11 01:05:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 01:05:41.944813103 +0000 UTC m=+836.597793550" watchObservedRunningTime="2025-10-11 01:05:41.947756199 +0000 UTC m=+836.600736636" Oct 11 01:05:42 crc kubenswrapper[4743]: I1011 01:05:42.543058 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mbkjs" Oct 11 01:05:42 crc kubenswrapper[4743]: I1011 01:05:42.543101 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mbkjs" Oct 11 01:05:42 crc kubenswrapper[4743]: I1011 01:05:42.600607 4743 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/community-operators-mbkjs" Oct 11 01:05:43 crc kubenswrapper[4743]: I1011 01:05:43.013433 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mbkjs" Oct 11 01:05:44 crc kubenswrapper[4743]: I1011 01:05:44.160717 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mbkjs"] Oct 11 01:05:44 crc kubenswrapper[4743]: I1011 01:05:44.952540 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-nl9th" event={"ID":"6d192208-f92a-4299-866d-14cf8ecffe17","Type":"ContainerStarted","Data":"f4e99c99b570d97ab7b5dc5fbaa824212196dd9f15ecaa746f5f103b3c3e6df9"} Oct 11 01:05:44 crc kubenswrapper[4743]: I1011 01:05:44.953959 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-nl9th" Oct 11 01:05:44 crc kubenswrapper[4743]: I1011 01:05:44.955366 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-2v928" event={"ID":"c2323cd1-eebb-46d7-9393-586093c921f1","Type":"ContainerStarted","Data":"20675f1fff8f431fb6a67cff656cb7da0b40e89fcc6e96e1b6a508cdcd1992ff"} Oct 11 01:05:44 crc kubenswrapper[4743]: I1011 01:05:44.957610 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-j4sk5" event={"ID":"8e3fe60f-ace7-445d-8994-03a95ff90479","Type":"ContainerStarted","Data":"a23373379841c45f4d16b1ddc8b1143db683af308f7b2c233926ce0de434d06c"} Oct 11 01:05:44 crc kubenswrapper[4743]: I1011 01:05:44.958131 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-j4sk5" Oct 11 01:05:44 crc kubenswrapper[4743]: I1011 01:05:44.959622 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-gvckr" 
event={"ID":"f12fff02-4cd4-437d-b704-99766f165f0e","Type":"ContainerStarted","Data":"b179910d76a89a8ef9c9aef21cf191850e523005e83ae03856261bcddb281bb0"} Oct 11 01:05:44 crc kubenswrapper[4743]: I1011 01:05:44.959922 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mbkjs" podUID="0e2570aa-046d-43a2-bf1e-ade9d92f9036" containerName="registry-server" containerID="cri-o://2d033cbcb3ef0246a2b84658f644f879b6cb0953c7952e1449c7d49d7f240408" gracePeriod=2 Oct 11 01:05:44 crc kubenswrapper[4743]: I1011 01:05:44.991733 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-nl9th" podStartSLOduration=1.6910460010000001 podStartE2EDuration="4.991709381s" podCreationTimestamp="2025-10-11 01:05:40 +0000 UTC" firstStartedPulling="2025-10-11 01:05:41.196012701 +0000 UTC m=+835.848993098" lastFinishedPulling="2025-10-11 01:05:44.496676071 +0000 UTC m=+839.149656478" observedRunningTime="2025-10-11 01:05:44.969993348 +0000 UTC m=+839.622973815" watchObservedRunningTime="2025-10-11 01:05:44.991709381 +0000 UTC m=+839.644689798" Oct 11 01:05:45 crc kubenswrapper[4743]: I1011 01:05:45.003519 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-gvckr" podStartSLOduration=1.839235935 podStartE2EDuration="5.003496947s" podCreationTimestamp="2025-10-11 01:05:40 +0000 UTC" firstStartedPulling="2025-10-11 01:05:41.11539264 +0000 UTC m=+835.768373037" lastFinishedPulling="2025-10-11 01:05:44.279653652 +0000 UTC m=+838.932634049" observedRunningTime="2025-10-11 01:05:44.990581192 +0000 UTC m=+839.643561619" watchObservedRunningTime="2025-10-11 01:05:45.003496947 +0000 UTC m=+839.656477364" Oct 11 01:05:45 crc kubenswrapper[4743]: I1011 01:05:45.013939 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-j4sk5" 
podStartSLOduration=1.559622852 podStartE2EDuration="5.013917967s" podCreationTimestamp="2025-10-11 01:05:40 +0000 UTC" firstStartedPulling="2025-10-11 01:05:40.819923506 +0000 UTC m=+835.472903903" lastFinishedPulling="2025-10-11 01:05:44.274218621 +0000 UTC m=+838.927199018" observedRunningTime="2025-10-11 01:05:45.006084304 +0000 UTC m=+839.659064711" watchObservedRunningTime="2025-10-11 01:05:45.013917967 +0000 UTC m=+839.666898374" Oct 11 01:05:45 crc kubenswrapper[4743]: I1011 01:05:45.397925 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mbkjs" Oct 11 01:05:45 crc kubenswrapper[4743]: I1011 01:05:45.505987 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j94m2\" (UniqueName: \"kubernetes.io/projected/0e2570aa-046d-43a2-bf1e-ade9d92f9036-kube-api-access-j94m2\") pod \"0e2570aa-046d-43a2-bf1e-ade9d92f9036\" (UID: \"0e2570aa-046d-43a2-bf1e-ade9d92f9036\") " Oct 11 01:05:45 crc kubenswrapper[4743]: I1011 01:05:45.506069 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e2570aa-046d-43a2-bf1e-ade9d92f9036-catalog-content\") pod \"0e2570aa-046d-43a2-bf1e-ade9d92f9036\" (UID: \"0e2570aa-046d-43a2-bf1e-ade9d92f9036\") " Oct 11 01:05:45 crc kubenswrapper[4743]: I1011 01:05:45.506153 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e2570aa-046d-43a2-bf1e-ade9d92f9036-utilities\") pod \"0e2570aa-046d-43a2-bf1e-ade9d92f9036\" (UID: \"0e2570aa-046d-43a2-bf1e-ade9d92f9036\") " Oct 11 01:05:45 crc kubenswrapper[4743]: I1011 01:05:45.507130 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e2570aa-046d-43a2-bf1e-ade9d92f9036-utilities" (OuterVolumeSpecName: "utilities") pod "0e2570aa-046d-43a2-bf1e-ade9d92f9036" 
(UID: "0e2570aa-046d-43a2-bf1e-ade9d92f9036"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:05:45 crc kubenswrapper[4743]: I1011 01:05:45.528927 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e2570aa-046d-43a2-bf1e-ade9d92f9036-kube-api-access-j94m2" (OuterVolumeSpecName: "kube-api-access-j94m2") pod "0e2570aa-046d-43a2-bf1e-ade9d92f9036" (UID: "0e2570aa-046d-43a2-bf1e-ade9d92f9036"). InnerVolumeSpecName "kube-api-access-j94m2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:05:45 crc kubenswrapper[4743]: I1011 01:05:45.593257 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e2570aa-046d-43a2-bf1e-ade9d92f9036-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0e2570aa-046d-43a2-bf1e-ade9d92f9036" (UID: "0e2570aa-046d-43a2-bf1e-ade9d92f9036"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:05:45 crc kubenswrapper[4743]: I1011 01:05:45.607649 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e2570aa-046d-43a2-bf1e-ade9d92f9036-utilities\") on node \"crc\" DevicePath \"\"" Oct 11 01:05:45 crc kubenswrapper[4743]: I1011 01:05:45.607690 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j94m2\" (UniqueName: \"kubernetes.io/projected/0e2570aa-046d-43a2-bf1e-ade9d92f9036-kube-api-access-j94m2\") on node \"crc\" DevicePath \"\"" Oct 11 01:05:45 crc kubenswrapper[4743]: I1011 01:05:45.607704 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e2570aa-046d-43a2-bf1e-ade9d92f9036-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 11 01:05:45 crc kubenswrapper[4743]: I1011 01:05:45.968706 4743 generic.go:334] "Generic (PLEG): container finished" 
podID="0e2570aa-046d-43a2-bf1e-ade9d92f9036" containerID="2d033cbcb3ef0246a2b84658f644f879b6cb0953c7952e1449c7d49d7f240408" exitCode=0 Oct 11 01:05:45 crc kubenswrapper[4743]: I1011 01:05:45.968796 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mbkjs" event={"ID":"0e2570aa-046d-43a2-bf1e-ade9d92f9036","Type":"ContainerDied","Data":"2d033cbcb3ef0246a2b84658f644f879b6cb0953c7952e1449c7d49d7f240408"} Oct 11 01:05:45 crc kubenswrapper[4743]: I1011 01:05:45.968957 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mbkjs" Oct 11 01:05:45 crc kubenswrapper[4743]: I1011 01:05:45.969659 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mbkjs" event={"ID":"0e2570aa-046d-43a2-bf1e-ade9d92f9036","Type":"ContainerDied","Data":"3ff8d1a246a3242fa0358d6237437839f7b78dbbcb8b4634b9c974bbdc1c832a"} Oct 11 01:05:45 crc kubenswrapper[4743]: I1011 01:05:45.969713 4743 scope.go:117] "RemoveContainer" containerID="2d033cbcb3ef0246a2b84658f644f879b6cb0953c7952e1449c7d49d7f240408" Oct 11 01:05:45 crc kubenswrapper[4743]: I1011 01:05:45.996482 4743 scope.go:117] "RemoveContainer" containerID="a008e0b41cac7c8df79f35746b8d6832130d6eda0c91dff5dd071253b40994b3" Oct 11 01:05:46 crc kubenswrapper[4743]: I1011 01:05:46.009223 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mbkjs"] Oct 11 01:05:46 crc kubenswrapper[4743]: I1011 01:05:46.016115 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mbkjs"] Oct 11 01:05:46 crc kubenswrapper[4743]: I1011 01:05:46.023662 4743 scope.go:117] "RemoveContainer" containerID="5f5141e5554a87c41a736c62e41cbe615d5d50c0103fd5fcb25fc08f1be884e3" Oct 11 01:05:46 crc kubenswrapper[4743]: I1011 01:05:46.054251 4743 scope.go:117] "RemoveContainer" 
containerID="2d033cbcb3ef0246a2b84658f644f879b6cb0953c7952e1449c7d49d7f240408" Oct 11 01:05:46 crc kubenswrapper[4743]: E1011 01:05:46.054767 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d033cbcb3ef0246a2b84658f644f879b6cb0953c7952e1449c7d49d7f240408\": container with ID starting with 2d033cbcb3ef0246a2b84658f644f879b6cb0953c7952e1449c7d49d7f240408 not found: ID does not exist" containerID="2d033cbcb3ef0246a2b84658f644f879b6cb0953c7952e1449c7d49d7f240408" Oct 11 01:05:46 crc kubenswrapper[4743]: I1011 01:05:46.054816 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d033cbcb3ef0246a2b84658f644f879b6cb0953c7952e1449c7d49d7f240408"} err="failed to get container status \"2d033cbcb3ef0246a2b84658f644f879b6cb0953c7952e1449c7d49d7f240408\": rpc error: code = NotFound desc = could not find container \"2d033cbcb3ef0246a2b84658f644f879b6cb0953c7952e1449c7d49d7f240408\": container with ID starting with 2d033cbcb3ef0246a2b84658f644f879b6cb0953c7952e1449c7d49d7f240408 not found: ID does not exist" Oct 11 01:05:46 crc kubenswrapper[4743]: I1011 01:05:46.054845 4743 scope.go:117] "RemoveContainer" containerID="a008e0b41cac7c8df79f35746b8d6832130d6eda0c91dff5dd071253b40994b3" Oct 11 01:05:46 crc kubenswrapper[4743]: E1011 01:05:46.055454 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a008e0b41cac7c8df79f35746b8d6832130d6eda0c91dff5dd071253b40994b3\": container with ID starting with a008e0b41cac7c8df79f35746b8d6832130d6eda0c91dff5dd071253b40994b3 not found: ID does not exist" containerID="a008e0b41cac7c8df79f35746b8d6832130d6eda0c91dff5dd071253b40994b3" Oct 11 01:05:46 crc kubenswrapper[4743]: I1011 01:05:46.055499 4743 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a008e0b41cac7c8df79f35746b8d6832130d6eda0c91dff5dd071253b40994b3"} err="failed to get container status \"a008e0b41cac7c8df79f35746b8d6832130d6eda0c91dff5dd071253b40994b3\": rpc error: code = NotFound desc = could not find container \"a008e0b41cac7c8df79f35746b8d6832130d6eda0c91dff5dd071253b40994b3\": container with ID starting with a008e0b41cac7c8df79f35746b8d6832130d6eda0c91dff5dd071253b40994b3 not found: ID does not exist" Oct 11 01:05:46 crc kubenswrapper[4743]: I1011 01:05:46.055535 4743 scope.go:117] "RemoveContainer" containerID="5f5141e5554a87c41a736c62e41cbe615d5d50c0103fd5fcb25fc08f1be884e3" Oct 11 01:05:46 crc kubenswrapper[4743]: E1011 01:05:46.055971 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f5141e5554a87c41a736c62e41cbe615d5d50c0103fd5fcb25fc08f1be884e3\": container with ID starting with 5f5141e5554a87c41a736c62e41cbe615d5d50c0103fd5fcb25fc08f1be884e3 not found: ID does not exist" containerID="5f5141e5554a87c41a736c62e41cbe615d5d50c0103fd5fcb25fc08f1be884e3" Oct 11 01:05:46 crc kubenswrapper[4743]: I1011 01:05:46.056006 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f5141e5554a87c41a736c62e41cbe615d5d50c0103fd5fcb25fc08f1be884e3"} err="failed to get container status \"5f5141e5554a87c41a736c62e41cbe615d5d50c0103fd5fcb25fc08f1be884e3\": rpc error: code = NotFound desc = could not find container \"5f5141e5554a87c41a736c62e41cbe615d5d50c0103fd5fcb25fc08f1be884e3\": container with ID starting with 5f5141e5554a87c41a736c62e41cbe615d5d50c0103fd5fcb25fc08f1be884e3 not found: ID does not exist" Oct 11 01:05:46 crc kubenswrapper[4743]: I1011 01:05:46.121543 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e2570aa-046d-43a2-bf1e-ade9d92f9036" path="/var/lib/kubelet/pods/0e2570aa-046d-43a2-bf1e-ade9d92f9036/volumes" Oct 11 01:05:47 crc kubenswrapper[4743]: I1011 
01:05:47.989620 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-2v928" event={"ID":"c2323cd1-eebb-46d7-9393-586093c921f1","Type":"ContainerStarted","Data":"1aeecf99537d8398e74151c39e09c9386492063766f0fc912bc6057c1770db49"} Oct 11 01:05:48 crc kubenswrapper[4743]: I1011 01:05:48.036048 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-2v928" podStartSLOduration=2.19726982 podStartE2EDuration="8.036022813s" podCreationTimestamp="2025-10-11 01:05:40 +0000 UTC" firstStartedPulling="2025-10-11 01:05:41.138965431 +0000 UTC m=+835.791945828" lastFinishedPulling="2025-10-11 01:05:46.977718424 +0000 UTC m=+841.630698821" observedRunningTime="2025-10-11 01:05:48.021172597 +0000 UTC m=+842.674152994" watchObservedRunningTime="2025-10-11 01:05:48.036022813 +0000 UTC m=+842.689003230" Oct 11 01:05:50 crc kubenswrapper[4743]: I1011 01:05:50.779987 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-j4sk5" Oct 11 01:05:51 crc kubenswrapper[4743]: I1011 01:05:51.023208 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-7bff76c5fd-k9d6v" Oct 11 01:05:51 crc kubenswrapper[4743]: I1011 01:05:51.023295 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7bff76c5fd-k9d6v" Oct 11 01:05:51 crc kubenswrapper[4743]: I1011 01:05:51.028580 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7bff76c5fd-k9d6v" Oct 11 01:05:52 crc kubenswrapper[4743]: I1011 01:05:52.031644 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7bff76c5fd-k9d6v" Oct 11 01:05:52 crc kubenswrapper[4743]: I1011 01:05:52.124820 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-console/console-f9d7485db-468m5"] Oct 11 01:06:00 crc kubenswrapper[4743]: I1011 01:06:00.740032 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-nl9th" Oct 11 01:06:14 crc kubenswrapper[4743]: I1011 01:06:14.458802 4743 patch_prober.go:28] interesting pod/machine-config-daemon-cvm72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 01:06:14 crc kubenswrapper[4743]: I1011 01:06:14.459535 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 01:06:17 crc kubenswrapper[4743]: I1011 01:06:17.187894 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-468m5" podUID="b5776f9a-8455-4c34-8496-0b4c4e821135" containerName="console" containerID="cri-o://ab022353864545543708f1cd63246b729bd9128d7b912a63741aed10601ac6f9" gracePeriod=15 Oct 11 01:06:17 crc kubenswrapper[4743]: I1011 01:06:17.735136 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-468m5_b5776f9a-8455-4c34-8496-0b4c4e821135/console/0.log" Oct 11 01:06:17 crc kubenswrapper[4743]: I1011 01:06:17.735450 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-468m5" Oct 11 01:06:17 crc kubenswrapper[4743]: I1011 01:06:17.918684 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b5776f9a-8455-4c34-8496-0b4c4e821135-service-ca\") pod \"b5776f9a-8455-4c34-8496-0b4c4e821135\" (UID: \"b5776f9a-8455-4c34-8496-0b4c4e821135\") " Oct 11 01:06:17 crc kubenswrapper[4743]: I1011 01:06:17.918752 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b5776f9a-8455-4c34-8496-0b4c4e821135-console-serving-cert\") pod \"b5776f9a-8455-4c34-8496-0b4c4e821135\" (UID: \"b5776f9a-8455-4c34-8496-0b4c4e821135\") " Oct 11 01:06:17 crc kubenswrapper[4743]: I1011 01:06:17.918796 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b5776f9a-8455-4c34-8496-0b4c4e821135-trusted-ca-bundle\") pod \"b5776f9a-8455-4c34-8496-0b4c4e821135\" (UID: \"b5776f9a-8455-4c34-8496-0b4c4e821135\") " Oct 11 01:06:17 crc kubenswrapper[4743]: I1011 01:06:17.918819 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rxkv\" (UniqueName: \"kubernetes.io/projected/b5776f9a-8455-4c34-8496-0b4c4e821135-kube-api-access-6rxkv\") pod \"b5776f9a-8455-4c34-8496-0b4c4e821135\" (UID: \"b5776f9a-8455-4c34-8496-0b4c4e821135\") " Oct 11 01:06:17 crc kubenswrapper[4743]: I1011 01:06:17.918875 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b5776f9a-8455-4c34-8496-0b4c4e821135-console-config\") pod \"b5776f9a-8455-4c34-8496-0b4c4e821135\" (UID: \"b5776f9a-8455-4c34-8496-0b4c4e821135\") " Oct 11 01:06:17 crc kubenswrapper[4743]: I1011 01:06:17.918949 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b5776f9a-8455-4c34-8496-0b4c4e821135-console-oauth-config\") pod \"b5776f9a-8455-4c34-8496-0b4c4e821135\" (UID: \"b5776f9a-8455-4c34-8496-0b4c4e821135\") " Oct 11 01:06:17 crc kubenswrapper[4743]: I1011 01:06:17.918977 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b5776f9a-8455-4c34-8496-0b4c4e821135-oauth-serving-cert\") pod \"b5776f9a-8455-4c34-8496-0b4c4e821135\" (UID: \"b5776f9a-8455-4c34-8496-0b4c4e821135\") " Oct 11 01:06:17 crc kubenswrapper[4743]: I1011 01:06:17.919494 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5776f9a-8455-4c34-8496-0b4c4e821135-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "b5776f9a-8455-4c34-8496-0b4c4e821135" (UID: "b5776f9a-8455-4c34-8496-0b4c4e821135"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:06:17 crc kubenswrapper[4743]: I1011 01:06:17.919554 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5776f9a-8455-4c34-8496-0b4c4e821135-service-ca" (OuterVolumeSpecName: "service-ca") pod "b5776f9a-8455-4c34-8496-0b4c4e821135" (UID: "b5776f9a-8455-4c34-8496-0b4c4e821135"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:06:17 crc kubenswrapper[4743]: I1011 01:06:17.919576 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5776f9a-8455-4c34-8496-0b4c4e821135-console-config" (OuterVolumeSpecName: "console-config") pod "b5776f9a-8455-4c34-8496-0b4c4e821135" (UID: "b5776f9a-8455-4c34-8496-0b4c4e821135"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:06:17 crc kubenswrapper[4743]: I1011 01:06:17.919665 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5776f9a-8455-4c34-8496-0b4c4e821135-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "b5776f9a-8455-4c34-8496-0b4c4e821135" (UID: "b5776f9a-8455-4c34-8496-0b4c4e821135"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:06:17 crc kubenswrapper[4743]: I1011 01:06:17.924655 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5776f9a-8455-4c34-8496-0b4c4e821135-kube-api-access-6rxkv" (OuterVolumeSpecName: "kube-api-access-6rxkv") pod "b5776f9a-8455-4c34-8496-0b4c4e821135" (UID: "b5776f9a-8455-4c34-8496-0b4c4e821135"). InnerVolumeSpecName "kube-api-access-6rxkv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:06:17 crc kubenswrapper[4743]: I1011 01:06:17.934254 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5776f9a-8455-4c34-8496-0b4c4e821135-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "b5776f9a-8455-4c34-8496-0b4c4e821135" (UID: "b5776f9a-8455-4c34-8496-0b4c4e821135"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:06:17 crc kubenswrapper[4743]: I1011 01:06:17.934656 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5776f9a-8455-4c34-8496-0b4c4e821135-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "b5776f9a-8455-4c34-8496-0b4c4e821135" (UID: "b5776f9a-8455-4c34-8496-0b4c4e821135"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:06:18 crc kubenswrapper[4743]: I1011 01:06:18.020742 4743 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b5776f9a-8455-4c34-8496-0b4c4e821135-service-ca\") on node \"crc\" DevicePath \"\"" Oct 11 01:06:18 crc kubenswrapper[4743]: I1011 01:06:18.021106 4743 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b5776f9a-8455-4c34-8496-0b4c4e821135-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 11 01:06:18 crc kubenswrapper[4743]: I1011 01:06:18.021122 4743 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b5776f9a-8455-4c34-8496-0b4c4e821135-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 01:06:18 crc kubenswrapper[4743]: I1011 01:06:18.021134 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rxkv\" (UniqueName: \"kubernetes.io/projected/b5776f9a-8455-4c34-8496-0b4c4e821135-kube-api-access-6rxkv\") on node \"crc\" DevicePath \"\"" Oct 11 01:06:18 crc kubenswrapper[4743]: I1011 01:06:18.021145 4743 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b5776f9a-8455-4c34-8496-0b4c4e821135-console-config\") on node \"crc\" DevicePath \"\"" Oct 11 01:06:18 crc kubenswrapper[4743]: I1011 01:06:18.021156 4743 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b5776f9a-8455-4c34-8496-0b4c4e821135-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 11 01:06:18 crc kubenswrapper[4743]: I1011 01:06:18.021166 4743 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b5776f9a-8455-4c34-8496-0b4c4e821135-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 11 01:06:18 crc 
kubenswrapper[4743]: I1011 01:06:18.256921 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-468m5_b5776f9a-8455-4c34-8496-0b4c4e821135/console/0.log" Oct 11 01:06:18 crc kubenswrapper[4743]: I1011 01:06:18.256999 4743 generic.go:334] "Generic (PLEG): container finished" podID="b5776f9a-8455-4c34-8496-0b4c4e821135" containerID="ab022353864545543708f1cd63246b729bd9128d7b912a63741aed10601ac6f9" exitCode=2 Oct 11 01:06:18 crc kubenswrapper[4743]: I1011 01:06:18.257033 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-468m5" event={"ID":"b5776f9a-8455-4c34-8496-0b4c4e821135","Type":"ContainerDied","Data":"ab022353864545543708f1cd63246b729bd9128d7b912a63741aed10601ac6f9"} Oct 11 01:06:18 crc kubenswrapper[4743]: I1011 01:06:18.257062 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-468m5" event={"ID":"b5776f9a-8455-4c34-8496-0b4c4e821135","Type":"ContainerDied","Data":"d78d37fa4d7b8b9b5ecc1aeafdd5ffb9dc28daef3c828360238f9bd4bf381480"} Oct 11 01:06:18 crc kubenswrapper[4743]: I1011 01:06:18.257083 4743 scope.go:117] "RemoveContainer" containerID="ab022353864545543708f1cd63246b729bd9128d7b912a63741aed10601ac6f9" Oct 11 01:06:18 crc kubenswrapper[4743]: I1011 01:06:18.257218 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-468m5" Oct 11 01:06:18 crc kubenswrapper[4743]: I1011 01:06:18.278179 4743 scope.go:117] "RemoveContainer" containerID="ab022353864545543708f1cd63246b729bd9128d7b912a63741aed10601ac6f9" Oct 11 01:06:18 crc kubenswrapper[4743]: E1011 01:06:18.279130 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab022353864545543708f1cd63246b729bd9128d7b912a63741aed10601ac6f9\": container with ID starting with ab022353864545543708f1cd63246b729bd9128d7b912a63741aed10601ac6f9 not found: ID does not exist" containerID="ab022353864545543708f1cd63246b729bd9128d7b912a63741aed10601ac6f9" Oct 11 01:06:18 crc kubenswrapper[4743]: I1011 01:06:18.279175 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab022353864545543708f1cd63246b729bd9128d7b912a63741aed10601ac6f9"} err="failed to get container status \"ab022353864545543708f1cd63246b729bd9128d7b912a63741aed10601ac6f9\": rpc error: code = NotFound desc = could not find container \"ab022353864545543708f1cd63246b729bd9128d7b912a63741aed10601ac6f9\": container with ID starting with ab022353864545543708f1cd63246b729bd9128d7b912a63741aed10601ac6f9 not found: ID does not exist" Oct 11 01:06:18 crc kubenswrapper[4743]: I1011 01:06:18.280714 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-468m5"] Oct 11 01:06:18 crc kubenswrapper[4743]: I1011 01:06:18.286698 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-468m5"] Oct 11 01:06:18 crc kubenswrapper[4743]: I1011 01:06:18.604445 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d228vb8"] Oct 11 01:06:18 crc kubenswrapper[4743]: E1011 01:06:18.605326 4743 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b5776f9a-8455-4c34-8496-0b4c4e821135" containerName="console" Oct 11 01:06:18 crc kubenswrapper[4743]: I1011 01:06:18.605399 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5776f9a-8455-4c34-8496-0b4c4e821135" containerName="console" Oct 11 01:06:18 crc kubenswrapper[4743]: E1011 01:06:18.605498 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e2570aa-046d-43a2-bf1e-ade9d92f9036" containerName="extract-content" Oct 11 01:06:18 crc kubenswrapper[4743]: I1011 01:06:18.605561 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e2570aa-046d-43a2-bf1e-ade9d92f9036" containerName="extract-content" Oct 11 01:06:18 crc kubenswrapper[4743]: E1011 01:06:18.605622 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e2570aa-046d-43a2-bf1e-ade9d92f9036" containerName="extract-utilities" Oct 11 01:06:18 crc kubenswrapper[4743]: I1011 01:06:18.605670 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e2570aa-046d-43a2-bf1e-ade9d92f9036" containerName="extract-utilities" Oct 11 01:06:18 crc kubenswrapper[4743]: E1011 01:06:18.605723 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e2570aa-046d-43a2-bf1e-ade9d92f9036" containerName="registry-server" Oct 11 01:06:18 crc kubenswrapper[4743]: I1011 01:06:18.605774 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e2570aa-046d-43a2-bf1e-ade9d92f9036" containerName="registry-server" Oct 11 01:06:18 crc kubenswrapper[4743]: I1011 01:06:18.605944 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e2570aa-046d-43a2-bf1e-ade9d92f9036" containerName="registry-server" Oct 11 01:06:18 crc kubenswrapper[4743]: I1011 01:06:18.606016 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5776f9a-8455-4c34-8496-0b4c4e821135" containerName="console" Oct 11 01:06:18 crc kubenswrapper[4743]: I1011 01:06:18.606941 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d228vb8" Oct 11 01:06:18 crc kubenswrapper[4743]: I1011 01:06:18.611180 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 11 01:06:18 crc kubenswrapper[4743]: I1011 01:06:18.623290 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d228vb8"] Oct 11 01:06:18 crc kubenswrapper[4743]: I1011 01:06:18.732523 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bthzm\" (UniqueName: \"kubernetes.io/projected/69d0b6c4-03c5-4968-9dfb-0b6b2774954a-kube-api-access-bthzm\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d228vb8\" (UID: \"69d0b6c4-03c5-4968-9dfb-0b6b2774954a\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d228vb8" Oct 11 01:06:18 crc kubenswrapper[4743]: I1011 01:06:18.732749 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/69d0b6c4-03c5-4968-9dfb-0b6b2774954a-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d228vb8\" (UID: \"69d0b6c4-03c5-4968-9dfb-0b6b2774954a\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d228vb8" Oct 11 01:06:18 crc kubenswrapper[4743]: I1011 01:06:18.732842 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/69d0b6c4-03c5-4968-9dfb-0b6b2774954a-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d228vb8\" (UID: \"69d0b6c4-03c5-4968-9dfb-0b6b2774954a\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d228vb8" Oct 11 01:06:18 crc kubenswrapper[4743]: 
I1011 01:06:18.834184 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bthzm\" (UniqueName: \"kubernetes.io/projected/69d0b6c4-03c5-4968-9dfb-0b6b2774954a-kube-api-access-bthzm\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d228vb8\" (UID: \"69d0b6c4-03c5-4968-9dfb-0b6b2774954a\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d228vb8" Oct 11 01:06:18 crc kubenswrapper[4743]: I1011 01:06:18.834231 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/69d0b6c4-03c5-4968-9dfb-0b6b2774954a-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d228vb8\" (UID: \"69d0b6c4-03c5-4968-9dfb-0b6b2774954a\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d228vb8" Oct 11 01:06:18 crc kubenswrapper[4743]: I1011 01:06:18.834261 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/69d0b6c4-03c5-4968-9dfb-0b6b2774954a-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d228vb8\" (UID: \"69d0b6c4-03c5-4968-9dfb-0b6b2774954a\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d228vb8" Oct 11 01:06:18 crc kubenswrapper[4743]: I1011 01:06:18.834720 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/69d0b6c4-03c5-4968-9dfb-0b6b2774954a-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d228vb8\" (UID: \"69d0b6c4-03c5-4968-9dfb-0b6b2774954a\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d228vb8" Oct 11 01:06:18 crc kubenswrapper[4743]: I1011 01:06:18.834828 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/69d0b6c4-03c5-4968-9dfb-0b6b2774954a-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d228vb8\" (UID: \"69d0b6c4-03c5-4968-9dfb-0b6b2774954a\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d228vb8" Oct 11 01:06:18 crc kubenswrapper[4743]: I1011 01:06:18.852235 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bthzm\" (UniqueName: \"kubernetes.io/projected/69d0b6c4-03c5-4968-9dfb-0b6b2774954a-kube-api-access-bthzm\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d228vb8\" (UID: \"69d0b6c4-03c5-4968-9dfb-0b6b2774954a\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d228vb8" Oct 11 01:06:18 crc kubenswrapper[4743]: I1011 01:06:18.924724 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d228vb8" Oct 11 01:06:19 crc kubenswrapper[4743]: I1011 01:06:19.369586 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d228vb8"] Oct 11 01:06:20 crc kubenswrapper[4743]: I1011 01:06:20.102777 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5776f9a-8455-4c34-8496-0b4c4e821135" path="/var/lib/kubelet/pods/b5776f9a-8455-4c34-8496-0b4c4e821135/volumes" Oct 11 01:06:20 crc kubenswrapper[4743]: I1011 01:06:20.280785 4743 generic.go:334] "Generic (PLEG): container finished" podID="69d0b6c4-03c5-4968-9dfb-0b6b2774954a" containerID="6a5b48732068466a70b3b70bc106e93636e52f523e50428c483b3ba8abddaf32" exitCode=0 Oct 11 01:06:20 crc kubenswrapper[4743]: I1011 01:06:20.280838 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d228vb8" 
event={"ID":"69d0b6c4-03c5-4968-9dfb-0b6b2774954a","Type":"ContainerDied","Data":"6a5b48732068466a70b3b70bc106e93636e52f523e50428c483b3ba8abddaf32"} Oct 11 01:06:20 crc kubenswrapper[4743]: I1011 01:06:20.280924 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d228vb8" event={"ID":"69d0b6c4-03c5-4968-9dfb-0b6b2774954a","Type":"ContainerStarted","Data":"6c3cd92f2ad39319a067d9371cf98f21b15ac7d2c1d38396d4e420ec7abb3439"} Oct 11 01:06:20 crc kubenswrapper[4743]: I1011 01:06:20.283016 4743 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 11 01:06:22 crc kubenswrapper[4743]: I1011 01:06:22.302492 4743 generic.go:334] "Generic (PLEG): container finished" podID="69d0b6c4-03c5-4968-9dfb-0b6b2774954a" containerID="1bced95231348e860f544f3d81cc0ae0856b2b0715976e7d2796ed0720a8f8c1" exitCode=0 Oct 11 01:06:22 crc kubenswrapper[4743]: I1011 01:06:22.302572 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d228vb8" event={"ID":"69d0b6c4-03c5-4968-9dfb-0b6b2774954a","Type":"ContainerDied","Data":"1bced95231348e860f544f3d81cc0ae0856b2b0715976e7d2796ed0720a8f8c1"} Oct 11 01:06:23 crc kubenswrapper[4743]: I1011 01:06:23.317977 4743 generic.go:334] "Generic (PLEG): container finished" podID="69d0b6c4-03c5-4968-9dfb-0b6b2774954a" containerID="6b5ce438ee3f282a28c56d1fc294c88e1a8f478306dd2f0e7fba41689f115138" exitCode=0 Oct 11 01:06:23 crc kubenswrapper[4743]: I1011 01:06:23.318061 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d228vb8" event={"ID":"69d0b6c4-03c5-4968-9dfb-0b6b2774954a","Type":"ContainerDied","Data":"6b5ce438ee3f282a28c56d1fc294c88e1a8f478306dd2f0e7fba41689f115138"} Oct 11 01:06:24 crc kubenswrapper[4743]: I1011 01:06:24.659612 4743 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d228vb8" Oct 11 01:06:24 crc kubenswrapper[4743]: I1011 01:06:24.824810 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/69d0b6c4-03c5-4968-9dfb-0b6b2774954a-util\") pod \"69d0b6c4-03c5-4968-9dfb-0b6b2774954a\" (UID: \"69d0b6c4-03c5-4968-9dfb-0b6b2774954a\") " Oct 11 01:06:24 crc kubenswrapper[4743]: I1011 01:06:24.825103 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/69d0b6c4-03c5-4968-9dfb-0b6b2774954a-bundle\") pod \"69d0b6c4-03c5-4968-9dfb-0b6b2774954a\" (UID: \"69d0b6c4-03c5-4968-9dfb-0b6b2774954a\") " Oct 11 01:06:24 crc kubenswrapper[4743]: I1011 01:06:24.825185 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bthzm\" (UniqueName: \"kubernetes.io/projected/69d0b6c4-03c5-4968-9dfb-0b6b2774954a-kube-api-access-bthzm\") pod \"69d0b6c4-03c5-4968-9dfb-0b6b2774954a\" (UID: \"69d0b6c4-03c5-4968-9dfb-0b6b2774954a\") " Oct 11 01:06:24 crc kubenswrapper[4743]: I1011 01:06:24.827724 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69d0b6c4-03c5-4968-9dfb-0b6b2774954a-bundle" (OuterVolumeSpecName: "bundle") pod "69d0b6c4-03c5-4968-9dfb-0b6b2774954a" (UID: "69d0b6c4-03c5-4968-9dfb-0b6b2774954a"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:06:24 crc kubenswrapper[4743]: I1011 01:06:24.834096 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69d0b6c4-03c5-4968-9dfb-0b6b2774954a-kube-api-access-bthzm" (OuterVolumeSpecName: "kube-api-access-bthzm") pod "69d0b6c4-03c5-4968-9dfb-0b6b2774954a" (UID: "69d0b6c4-03c5-4968-9dfb-0b6b2774954a"). 
InnerVolumeSpecName "kube-api-access-bthzm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:06:24 crc kubenswrapper[4743]: I1011 01:06:24.842186 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69d0b6c4-03c5-4968-9dfb-0b6b2774954a-util" (OuterVolumeSpecName: "util") pod "69d0b6c4-03c5-4968-9dfb-0b6b2774954a" (UID: "69d0b6c4-03c5-4968-9dfb-0b6b2774954a"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:06:24 crc kubenswrapper[4743]: I1011 01:06:24.927147 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bthzm\" (UniqueName: \"kubernetes.io/projected/69d0b6c4-03c5-4968-9dfb-0b6b2774954a-kube-api-access-bthzm\") on node \"crc\" DevicePath \"\"" Oct 11 01:06:24 crc kubenswrapper[4743]: I1011 01:06:24.927195 4743 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/69d0b6c4-03c5-4968-9dfb-0b6b2774954a-util\") on node \"crc\" DevicePath \"\"" Oct 11 01:06:24 crc kubenswrapper[4743]: I1011 01:06:24.927215 4743 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/69d0b6c4-03c5-4968-9dfb-0b6b2774954a-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 01:06:25 crc kubenswrapper[4743]: I1011 01:06:25.333417 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d228vb8" event={"ID":"69d0b6c4-03c5-4968-9dfb-0b6b2774954a","Type":"ContainerDied","Data":"6c3cd92f2ad39319a067d9371cf98f21b15ac7d2c1d38396d4e420ec7abb3439"} Oct 11 01:06:25 crc kubenswrapper[4743]: I1011 01:06:25.333453 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c3cd92f2ad39319a067d9371cf98f21b15ac7d2c1d38396d4e420ec7abb3439" Oct 11 01:06:25 crc kubenswrapper[4743]: I1011 01:06:25.333481 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d228vb8" Oct 11 01:06:33 crc kubenswrapper[4743]: I1011 01:06:33.738300 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-7969b47488-dm7g4"] Oct 11 01:06:33 crc kubenswrapper[4743]: E1011 01:06:33.739029 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69d0b6c4-03c5-4968-9dfb-0b6b2774954a" containerName="util" Oct 11 01:06:33 crc kubenswrapper[4743]: I1011 01:06:33.739043 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="69d0b6c4-03c5-4968-9dfb-0b6b2774954a" containerName="util" Oct 11 01:06:33 crc kubenswrapper[4743]: E1011 01:06:33.739064 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69d0b6c4-03c5-4968-9dfb-0b6b2774954a" containerName="pull" Oct 11 01:06:33 crc kubenswrapper[4743]: I1011 01:06:33.739070 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="69d0b6c4-03c5-4968-9dfb-0b6b2774954a" containerName="pull" Oct 11 01:06:33 crc kubenswrapper[4743]: E1011 01:06:33.739080 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69d0b6c4-03c5-4968-9dfb-0b6b2774954a" containerName="extract" Oct 11 01:06:33 crc kubenswrapper[4743]: I1011 01:06:33.739085 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="69d0b6c4-03c5-4968-9dfb-0b6b2774954a" containerName="extract" Oct 11 01:06:33 crc kubenswrapper[4743]: I1011 01:06:33.739189 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="69d0b6c4-03c5-4968-9dfb-0b6b2774954a" containerName="extract" Oct 11 01:06:33 crc kubenswrapper[4743]: I1011 01:06:33.739631 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7969b47488-dm7g4" Oct 11 01:06:33 crc kubenswrapper[4743]: I1011 01:06:33.742054 4743 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-r699h" Oct 11 01:06:33 crc kubenswrapper[4743]: I1011 01:06:33.742151 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Oct 11 01:06:33 crc kubenswrapper[4743]: I1011 01:06:33.742163 4743 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Oct 11 01:06:33 crc kubenswrapper[4743]: I1011 01:06:33.742281 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Oct 11 01:06:33 crc kubenswrapper[4743]: I1011 01:06:33.742463 4743 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Oct 11 01:06:33 crc kubenswrapper[4743]: I1011 01:06:33.762928 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7969b47488-dm7g4"] Oct 11 01:06:33 crc kubenswrapper[4743]: I1011 01:06:33.854165 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ef8e01c5-e132-4e07-9ce3-9a5578548ad7-apiservice-cert\") pod \"metallb-operator-controller-manager-7969b47488-dm7g4\" (UID: \"ef8e01c5-e132-4e07-9ce3-9a5578548ad7\") " pod="metallb-system/metallb-operator-controller-manager-7969b47488-dm7g4" Oct 11 01:06:33 crc kubenswrapper[4743]: I1011 01:06:33.854568 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nt62d\" (UniqueName: \"kubernetes.io/projected/ef8e01c5-e132-4e07-9ce3-9a5578548ad7-kube-api-access-nt62d\") pod 
\"metallb-operator-controller-manager-7969b47488-dm7g4\" (UID: \"ef8e01c5-e132-4e07-9ce3-9a5578548ad7\") " pod="metallb-system/metallb-operator-controller-manager-7969b47488-dm7g4" Oct 11 01:06:33 crc kubenswrapper[4743]: I1011 01:06:33.854642 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef8e01c5-e132-4e07-9ce3-9a5578548ad7-webhook-cert\") pod \"metallb-operator-controller-manager-7969b47488-dm7g4\" (UID: \"ef8e01c5-e132-4e07-9ce3-9a5578548ad7\") " pod="metallb-system/metallb-operator-controller-manager-7969b47488-dm7g4" Oct 11 01:06:33 crc kubenswrapper[4743]: I1011 01:06:33.956235 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ef8e01c5-e132-4e07-9ce3-9a5578548ad7-apiservice-cert\") pod \"metallb-operator-controller-manager-7969b47488-dm7g4\" (UID: \"ef8e01c5-e132-4e07-9ce3-9a5578548ad7\") " pod="metallb-system/metallb-operator-controller-manager-7969b47488-dm7g4" Oct 11 01:06:33 crc kubenswrapper[4743]: I1011 01:06:33.956278 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nt62d\" (UniqueName: \"kubernetes.io/projected/ef8e01c5-e132-4e07-9ce3-9a5578548ad7-kube-api-access-nt62d\") pod \"metallb-operator-controller-manager-7969b47488-dm7g4\" (UID: \"ef8e01c5-e132-4e07-9ce3-9a5578548ad7\") " pod="metallb-system/metallb-operator-controller-manager-7969b47488-dm7g4" Oct 11 01:06:33 crc kubenswrapper[4743]: I1011 01:06:33.956363 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef8e01c5-e132-4e07-9ce3-9a5578548ad7-webhook-cert\") pod \"metallb-operator-controller-manager-7969b47488-dm7g4\" (UID: \"ef8e01c5-e132-4e07-9ce3-9a5578548ad7\") " pod="metallb-system/metallb-operator-controller-manager-7969b47488-dm7g4" Oct 11 01:06:33 crc 
kubenswrapper[4743]: I1011 01:06:33.961910 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef8e01c5-e132-4e07-9ce3-9a5578548ad7-webhook-cert\") pod \"metallb-operator-controller-manager-7969b47488-dm7g4\" (UID: \"ef8e01c5-e132-4e07-9ce3-9a5578548ad7\") " pod="metallb-system/metallb-operator-controller-manager-7969b47488-dm7g4" Oct 11 01:06:33 crc kubenswrapper[4743]: I1011 01:06:33.962677 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ef8e01c5-e132-4e07-9ce3-9a5578548ad7-apiservice-cert\") pod \"metallb-operator-controller-manager-7969b47488-dm7g4\" (UID: \"ef8e01c5-e132-4e07-9ce3-9a5578548ad7\") " pod="metallb-system/metallb-operator-controller-manager-7969b47488-dm7g4" Oct 11 01:06:33 crc kubenswrapper[4743]: I1011 01:06:33.981667 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-79c6c9bd96-5sx97"] Oct 11 01:06:33 crc kubenswrapper[4743]: I1011 01:06:33.990980 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-79c6c9bd96-5sx97" Oct 11 01:06:33 crc kubenswrapper[4743]: I1011 01:06:33.995629 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nt62d\" (UniqueName: \"kubernetes.io/projected/ef8e01c5-e132-4e07-9ce3-9a5578548ad7-kube-api-access-nt62d\") pod \"metallb-operator-controller-manager-7969b47488-dm7g4\" (UID: \"ef8e01c5-e132-4e07-9ce3-9a5578548ad7\") " pod="metallb-system/metallb-operator-controller-manager-7969b47488-dm7g4" Oct 11 01:06:34 crc kubenswrapper[4743]: I1011 01:06:34.006415 4743 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 11 01:06:34 crc kubenswrapper[4743]: I1011 01:06:34.006536 4743 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-qpxx7" Oct 11 01:06:34 crc kubenswrapper[4743]: I1011 01:06:34.006687 4743 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Oct 11 01:06:34 crc kubenswrapper[4743]: I1011 01:06:34.027130 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-79c6c9bd96-5sx97"] Oct 11 01:06:34 crc kubenswrapper[4743]: I1011 01:06:34.059937 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7969b47488-dm7g4" Oct 11 01:06:34 crc kubenswrapper[4743]: I1011 01:06:34.060736 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxbrd\" (UniqueName: \"kubernetes.io/projected/3cf19c00-f066-4814-9134-4a6d4aed88a7-kube-api-access-wxbrd\") pod \"metallb-operator-webhook-server-79c6c9bd96-5sx97\" (UID: \"3cf19c00-f066-4814-9134-4a6d4aed88a7\") " pod="metallb-system/metallb-operator-webhook-server-79c6c9bd96-5sx97" Oct 11 01:06:34 crc kubenswrapper[4743]: I1011 01:06:34.060801 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3cf19c00-f066-4814-9134-4a6d4aed88a7-apiservice-cert\") pod \"metallb-operator-webhook-server-79c6c9bd96-5sx97\" (UID: \"3cf19c00-f066-4814-9134-4a6d4aed88a7\") " pod="metallb-system/metallb-operator-webhook-server-79c6c9bd96-5sx97" Oct 11 01:06:34 crc kubenswrapper[4743]: I1011 01:06:34.060842 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3cf19c00-f066-4814-9134-4a6d4aed88a7-webhook-cert\") pod \"metallb-operator-webhook-server-79c6c9bd96-5sx97\" (UID: \"3cf19c00-f066-4814-9134-4a6d4aed88a7\") " pod="metallb-system/metallb-operator-webhook-server-79c6c9bd96-5sx97" Oct 11 01:06:34 crc kubenswrapper[4743]: I1011 01:06:34.166675 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxbrd\" (UniqueName: \"kubernetes.io/projected/3cf19c00-f066-4814-9134-4a6d4aed88a7-kube-api-access-wxbrd\") pod \"metallb-operator-webhook-server-79c6c9bd96-5sx97\" (UID: \"3cf19c00-f066-4814-9134-4a6d4aed88a7\") " pod="metallb-system/metallb-operator-webhook-server-79c6c9bd96-5sx97" Oct 11 01:06:34 crc kubenswrapper[4743]: I1011 01:06:34.166752 4743 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3cf19c00-f066-4814-9134-4a6d4aed88a7-apiservice-cert\") pod \"metallb-operator-webhook-server-79c6c9bd96-5sx97\" (UID: \"3cf19c00-f066-4814-9134-4a6d4aed88a7\") " pod="metallb-system/metallb-operator-webhook-server-79c6c9bd96-5sx97" Oct 11 01:06:34 crc kubenswrapper[4743]: I1011 01:06:34.166777 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3cf19c00-f066-4814-9134-4a6d4aed88a7-webhook-cert\") pod \"metallb-operator-webhook-server-79c6c9bd96-5sx97\" (UID: \"3cf19c00-f066-4814-9134-4a6d4aed88a7\") " pod="metallb-system/metallb-operator-webhook-server-79c6c9bd96-5sx97" Oct 11 01:06:34 crc kubenswrapper[4743]: I1011 01:06:34.170398 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3cf19c00-f066-4814-9134-4a6d4aed88a7-webhook-cert\") pod \"metallb-operator-webhook-server-79c6c9bd96-5sx97\" (UID: \"3cf19c00-f066-4814-9134-4a6d4aed88a7\") " pod="metallb-system/metallb-operator-webhook-server-79c6c9bd96-5sx97" Oct 11 01:06:34 crc kubenswrapper[4743]: I1011 01:06:34.172094 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3cf19c00-f066-4814-9134-4a6d4aed88a7-apiservice-cert\") pod \"metallb-operator-webhook-server-79c6c9bd96-5sx97\" (UID: \"3cf19c00-f066-4814-9134-4a6d4aed88a7\") " pod="metallb-system/metallb-operator-webhook-server-79c6c9bd96-5sx97" Oct 11 01:06:34 crc kubenswrapper[4743]: I1011 01:06:34.189579 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxbrd\" (UniqueName: \"kubernetes.io/projected/3cf19c00-f066-4814-9134-4a6d4aed88a7-kube-api-access-wxbrd\") pod \"metallb-operator-webhook-server-79c6c9bd96-5sx97\" (UID: 
\"3cf19c00-f066-4814-9134-4a6d4aed88a7\") " pod="metallb-system/metallb-operator-webhook-server-79c6c9bd96-5sx97" Oct 11 01:06:34 crc kubenswrapper[4743]: I1011 01:06:34.352911 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-79c6c9bd96-5sx97" Oct 11 01:06:34 crc kubenswrapper[4743]: I1011 01:06:34.491952 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7969b47488-dm7g4"] Oct 11 01:06:34 crc kubenswrapper[4743]: W1011 01:06:34.497828 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef8e01c5_e132_4e07_9ce3_9a5578548ad7.slice/crio-cc0d62fd3f05c33282e3a6729698cc052c350faca7738e35354ad06cca8f4f72 WatchSource:0}: Error finding container cc0d62fd3f05c33282e3a6729698cc052c350faca7738e35354ad06cca8f4f72: Status 404 returned error can't find the container with id cc0d62fd3f05c33282e3a6729698cc052c350faca7738e35354ad06cca8f4f72 Oct 11 01:06:34 crc kubenswrapper[4743]: I1011 01:06:34.773202 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-79c6c9bd96-5sx97"] Oct 11 01:06:34 crc kubenswrapper[4743]: W1011 01:06:34.777239 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3cf19c00_f066_4814_9134_4a6d4aed88a7.slice/crio-358be8ca90d01986c0262cf7091f1080c6f38abdc73cd0bd69956af2bee58565 WatchSource:0}: Error finding container 358be8ca90d01986c0262cf7091f1080c6f38abdc73cd0bd69956af2bee58565: Status 404 returned error can't find the container with id 358be8ca90d01986c0262cf7091f1080c6f38abdc73cd0bd69956af2bee58565 Oct 11 01:06:35 crc kubenswrapper[4743]: I1011 01:06:35.421476 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7969b47488-dm7g4" 
event={"ID":"ef8e01c5-e132-4e07-9ce3-9a5578548ad7","Type":"ContainerStarted","Data":"cc0d62fd3f05c33282e3a6729698cc052c350faca7738e35354ad06cca8f4f72"} Oct 11 01:06:35 crc kubenswrapper[4743]: I1011 01:06:35.424201 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-79c6c9bd96-5sx97" event={"ID":"3cf19c00-f066-4814-9134-4a6d4aed88a7","Type":"ContainerStarted","Data":"358be8ca90d01986c0262cf7091f1080c6f38abdc73cd0bd69956af2bee58565"} Oct 11 01:06:40 crc kubenswrapper[4743]: I1011 01:06:40.461918 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7969b47488-dm7g4" event={"ID":"ef8e01c5-e132-4e07-9ce3-9a5578548ad7","Type":"ContainerStarted","Data":"82b8236efaef3f75883513edb0de9dfb48666db35ca141e10829d667be65c08a"} Oct 11 01:06:40 crc kubenswrapper[4743]: I1011 01:06:40.462560 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-7969b47488-dm7g4" Oct 11 01:06:40 crc kubenswrapper[4743]: I1011 01:06:40.464251 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-79c6c9bd96-5sx97" event={"ID":"3cf19c00-f066-4814-9134-4a6d4aed88a7","Type":"ContainerStarted","Data":"ca310f6b304df7893989c3caef13842ed3d67f6adc925df136b5e5a7db0e1b25"} Oct 11 01:06:40 crc kubenswrapper[4743]: I1011 01:06:40.464388 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-79c6c9bd96-5sx97" Oct 11 01:06:40 crc kubenswrapper[4743]: I1011 01:06:40.486172 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-7969b47488-dm7g4" podStartSLOduration=2.425843847 podStartE2EDuration="7.48615135s" podCreationTimestamp="2025-10-11 01:06:33 +0000 UTC" firstStartedPulling="2025-10-11 01:06:34.500048993 +0000 UTC m=+889.153029390" 
lastFinishedPulling="2025-10-11 01:06:39.560356496 +0000 UTC m=+894.213336893" observedRunningTime="2025-10-11 01:06:40.481619759 +0000 UTC m=+895.134600166" watchObservedRunningTime="2025-10-11 01:06:40.48615135 +0000 UTC m=+895.139131757" Oct 11 01:06:40 crc kubenswrapper[4743]: I1011 01:06:40.508035 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-79c6c9bd96-5sx97" podStartSLOduration=2.7045691830000003 podStartE2EDuration="7.508016085s" podCreationTimestamp="2025-10-11 01:06:33 +0000 UTC" firstStartedPulling="2025-10-11 01:06:34.784222193 +0000 UTC m=+889.437202630" lastFinishedPulling="2025-10-11 01:06:39.587669135 +0000 UTC m=+894.240649532" observedRunningTime="2025-10-11 01:06:40.505697698 +0000 UTC m=+895.158678095" watchObservedRunningTime="2025-10-11 01:06:40.508016085 +0000 UTC m=+895.160996482" Oct 11 01:06:44 crc kubenswrapper[4743]: I1011 01:06:44.458642 4743 patch_prober.go:28] interesting pod/machine-config-daemon-cvm72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 01:06:44 crc kubenswrapper[4743]: I1011 01:06:44.460639 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 01:06:54 crc kubenswrapper[4743]: I1011 01:06:54.356532 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-79c6c9bd96-5sx97" Oct 11 01:07:14 crc kubenswrapper[4743]: I1011 01:07:14.063334 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="metallb-system/metallb-operator-controller-manager-7969b47488-dm7g4" Oct 11 01:07:14 crc kubenswrapper[4743]: I1011 01:07:14.458378 4743 patch_prober.go:28] interesting pod/machine-config-daemon-cvm72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 01:07:14 crc kubenswrapper[4743]: I1011 01:07:14.458466 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 01:07:14 crc kubenswrapper[4743]: I1011 01:07:14.458520 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" Oct 11 01:07:14 crc kubenswrapper[4743]: I1011 01:07:14.459280 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"06d63da6139508ac6d7d3ccf51eec7dcc1dbdfea0379b704f2d1844d8e86a974"} pod="openshift-machine-config-operator/machine-config-daemon-cvm72" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 11 01:07:14 crc kubenswrapper[4743]: I1011 01:07:14.459358 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" containerID="cri-o://06d63da6139508ac6d7d3ccf51eec7dcc1dbdfea0379b704f2d1844d8e86a974" gracePeriod=600 Oct 11 01:07:14 crc kubenswrapper[4743]: I1011 01:07:14.754571 4743 generic.go:334] "Generic (PLEG): container finished" 
podID="add92263-e252-446b-95de-092585b4357f" containerID="06d63da6139508ac6d7d3ccf51eec7dcc1dbdfea0379b704f2d1844d8e86a974" exitCode=0 Oct 11 01:07:14 crc kubenswrapper[4743]: I1011 01:07:14.754662 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" event={"ID":"add92263-e252-446b-95de-092585b4357f","Type":"ContainerDied","Data":"06d63da6139508ac6d7d3ccf51eec7dcc1dbdfea0379b704f2d1844d8e86a974"} Oct 11 01:07:14 crc kubenswrapper[4743]: I1011 01:07:14.754971 4743 scope.go:117] "RemoveContainer" containerID="88127d52f7db156c5804bc403a408594bcfb43a90269eb1302483bd25dec7ebe" Oct 11 01:07:14 crc kubenswrapper[4743]: I1011 01:07:14.957218 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-v524d"] Oct 11 01:07:14 crc kubenswrapper[4743]: I1011 01:07:14.960843 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-v524d" Oct 11 01:07:14 crc kubenswrapper[4743]: I1011 01:07:14.963302 4743 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Oct 11 01:07:14 crc kubenswrapper[4743]: I1011 01:07:14.963437 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Oct 11 01:07:14 crc kubenswrapper[4743]: I1011 01:07:14.963510 4743 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-6b7j7" Oct 11 01:07:14 crc kubenswrapper[4743]: I1011 01:07:14.975009 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-jct7h"] Oct 11 01:07:14 crc kubenswrapper[4743]: I1011 01:07:14.975806 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-jct7h" Oct 11 01:07:14 crc kubenswrapper[4743]: I1011 01:07:14.983229 4743 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Oct 11 01:07:14 crc kubenswrapper[4743]: I1011 01:07:14.986092 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/95e8f67c-537e-4744-a3c2-7dd93084f455-metrics-certs\") pod \"frr-k8s-v524d\" (UID: \"95e8f67c-537e-4744-a3c2-7dd93084f455\") " pod="metallb-system/frr-k8s-v524d" Oct 11 01:07:14 crc kubenswrapper[4743]: I1011 01:07:14.986173 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/95e8f67c-537e-4744-a3c2-7dd93084f455-metrics\") pod \"frr-k8s-v524d\" (UID: \"95e8f67c-537e-4744-a3c2-7dd93084f455\") " pod="metallb-system/frr-k8s-v524d" Oct 11 01:07:14 crc kubenswrapper[4743]: I1011 01:07:14.986206 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/95e8f67c-537e-4744-a3c2-7dd93084f455-reloader\") pod \"frr-k8s-v524d\" (UID: \"95e8f67c-537e-4744-a3c2-7dd93084f455\") " pod="metallb-system/frr-k8s-v524d" Oct 11 01:07:14 crc kubenswrapper[4743]: I1011 01:07:14.986228 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/95e8f67c-537e-4744-a3c2-7dd93084f455-frr-conf\") pod \"frr-k8s-v524d\" (UID: \"95e8f67c-537e-4744-a3c2-7dd93084f455\") " pod="metallb-system/frr-k8s-v524d" Oct 11 01:07:14 crc kubenswrapper[4743]: I1011 01:07:14.986256 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/2772ef34-f307-4a91-8f2f-28e3b22375a0-cert\") pod \"frr-k8s-webhook-server-64bf5d555-jct7h\" (UID: \"2772ef34-f307-4a91-8f2f-28e3b22375a0\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-jct7h" Oct 11 01:07:14 crc kubenswrapper[4743]: I1011 01:07:14.986275 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzzn4\" (UniqueName: \"kubernetes.io/projected/2772ef34-f307-4a91-8f2f-28e3b22375a0-kube-api-access-pzzn4\") pod \"frr-k8s-webhook-server-64bf5d555-jct7h\" (UID: \"2772ef34-f307-4a91-8f2f-28e3b22375a0\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-jct7h" Oct 11 01:07:14 crc kubenswrapper[4743]: I1011 01:07:14.986293 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swlqb\" (UniqueName: \"kubernetes.io/projected/95e8f67c-537e-4744-a3c2-7dd93084f455-kube-api-access-swlqb\") pod \"frr-k8s-v524d\" (UID: \"95e8f67c-537e-4744-a3c2-7dd93084f455\") " pod="metallb-system/frr-k8s-v524d" Oct 11 01:07:14 crc kubenswrapper[4743]: I1011 01:07:14.986327 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/95e8f67c-537e-4744-a3c2-7dd93084f455-frr-sockets\") pod \"frr-k8s-v524d\" (UID: \"95e8f67c-537e-4744-a3c2-7dd93084f455\") " pod="metallb-system/frr-k8s-v524d" Oct 11 01:07:14 crc kubenswrapper[4743]: I1011 01:07:14.986346 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/95e8f67c-537e-4744-a3c2-7dd93084f455-frr-startup\") pod \"frr-k8s-v524d\" (UID: \"95e8f67c-537e-4744-a3c2-7dd93084f455\") " pod="metallb-system/frr-k8s-v524d" Oct 11 01:07:14 crc kubenswrapper[4743]: I1011 01:07:14.994973 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-jct7h"] Oct 11 01:07:15 crc kubenswrapper[4743]: I1011 01:07:15.064429 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-9g4hk"] Oct 11 01:07:15 crc kubenswrapper[4743]: I1011 01:07:15.065587 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-9g4hk" Oct 11 01:07:15 crc kubenswrapper[4743]: I1011 01:07:15.068535 4743 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Oct 11 01:07:15 crc kubenswrapper[4743]: I1011 01:07:15.068550 4743 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Oct 11 01:07:15 crc kubenswrapper[4743]: I1011 01:07:15.068589 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Oct 11 01:07:15 crc kubenswrapper[4743]: I1011 01:07:15.068642 4743 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-l7st9" Oct 11 01:07:15 crc kubenswrapper[4743]: I1011 01:07:15.073843 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-68d546b9d8-bwqtl"] Oct 11 01:07:15 crc kubenswrapper[4743]: I1011 01:07:15.075035 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-68d546b9d8-bwqtl" Oct 11 01:07:15 crc kubenswrapper[4743]: I1011 01:07:15.076378 4743 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Oct 11 01:07:15 crc kubenswrapper[4743]: I1011 01:07:15.086207 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-bwqtl"] Oct 11 01:07:15 crc kubenswrapper[4743]: I1011 01:07:15.087608 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/95e8f67c-537e-4744-a3c2-7dd93084f455-reloader\") pod \"frr-k8s-v524d\" (UID: \"95e8f67c-537e-4744-a3c2-7dd93084f455\") " pod="metallb-system/frr-k8s-v524d" Oct 11 01:07:15 crc kubenswrapper[4743]: I1011 01:07:15.087647 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hr2qf\" (UniqueName: \"kubernetes.io/projected/83bddd85-204d-438d-a29f-e7fca659542a-kube-api-access-hr2qf\") pod \"speaker-9g4hk\" (UID: \"83bddd85-204d-438d-a29f-e7fca659542a\") " pod="metallb-system/speaker-9g4hk" Oct 11 01:07:15 crc kubenswrapper[4743]: I1011 01:07:15.087691 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/95e8f67c-537e-4744-a3c2-7dd93084f455-frr-conf\") pod \"frr-k8s-v524d\" (UID: \"95e8f67c-537e-4744-a3c2-7dd93084f455\") " pod="metallb-system/frr-k8s-v524d" Oct 11 01:07:15 crc kubenswrapper[4743]: I1011 01:07:15.087708 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3b7ecaea-d42f-44b0-a181-3c61cf45bde2-cert\") pod \"controller-68d546b9d8-bwqtl\" (UID: \"3b7ecaea-d42f-44b0-a181-3c61cf45bde2\") " pod="metallb-system/controller-68d546b9d8-bwqtl" Oct 11 01:07:15 crc kubenswrapper[4743]: I1011 01:07:15.087735 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/83bddd85-204d-438d-a29f-e7fca659542a-metallb-excludel2\") pod \"speaker-9g4hk\" (UID: \"83bddd85-204d-438d-a29f-e7fca659542a\") " pod="metallb-system/speaker-9g4hk" Oct 11 01:07:15 crc kubenswrapper[4743]: I1011 01:07:15.087780 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3b7ecaea-d42f-44b0-a181-3c61cf45bde2-metrics-certs\") pod \"controller-68d546b9d8-bwqtl\" (UID: \"3b7ecaea-d42f-44b0-a181-3c61cf45bde2\") " pod="metallb-system/controller-68d546b9d8-bwqtl" Oct 11 01:07:15 crc kubenswrapper[4743]: I1011 01:07:15.087797 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2772ef34-f307-4a91-8f2f-28e3b22375a0-cert\") pod \"frr-k8s-webhook-server-64bf5d555-jct7h\" (UID: \"2772ef34-f307-4a91-8f2f-28e3b22375a0\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-jct7h" Oct 11 01:07:15 crc kubenswrapper[4743]: I1011 01:07:15.087822 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzzn4\" (UniqueName: \"kubernetes.io/projected/2772ef34-f307-4a91-8f2f-28e3b22375a0-kube-api-access-pzzn4\") pod \"frr-k8s-webhook-server-64bf5d555-jct7h\" (UID: \"2772ef34-f307-4a91-8f2f-28e3b22375a0\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-jct7h" Oct 11 01:07:15 crc kubenswrapper[4743]: I1011 01:07:15.087837 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bc8nx\" (UniqueName: \"kubernetes.io/projected/3b7ecaea-d42f-44b0-a181-3c61cf45bde2-kube-api-access-bc8nx\") pod \"controller-68d546b9d8-bwqtl\" (UID: \"3b7ecaea-d42f-44b0-a181-3c61cf45bde2\") " pod="metallb-system/controller-68d546b9d8-bwqtl" Oct 11 01:07:15 crc kubenswrapper[4743]: I1011 
01:07:15.087878 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swlqb\" (UniqueName: \"kubernetes.io/projected/95e8f67c-537e-4744-a3c2-7dd93084f455-kube-api-access-swlqb\") pod \"frr-k8s-v524d\" (UID: \"95e8f67c-537e-4744-a3c2-7dd93084f455\") " pod="metallb-system/frr-k8s-v524d" Oct 11 01:07:15 crc kubenswrapper[4743]: I1011 01:07:15.087942 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/95e8f67c-537e-4744-a3c2-7dd93084f455-frr-sockets\") pod \"frr-k8s-v524d\" (UID: \"95e8f67c-537e-4744-a3c2-7dd93084f455\") " pod="metallb-system/frr-k8s-v524d" Oct 11 01:07:15 crc kubenswrapper[4743]: I1011 01:07:15.087974 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/83bddd85-204d-438d-a29f-e7fca659542a-memberlist\") pod \"speaker-9g4hk\" (UID: \"83bddd85-204d-438d-a29f-e7fca659542a\") " pod="metallb-system/speaker-9g4hk" Oct 11 01:07:15 crc kubenswrapper[4743]: I1011 01:07:15.087996 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/95e8f67c-537e-4744-a3c2-7dd93084f455-frr-startup\") pod \"frr-k8s-v524d\" (UID: \"95e8f67c-537e-4744-a3c2-7dd93084f455\") " pod="metallb-system/frr-k8s-v524d" Oct 11 01:07:15 crc kubenswrapper[4743]: I1011 01:07:15.088041 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/95e8f67c-537e-4744-a3c2-7dd93084f455-metrics-certs\") pod \"frr-k8s-v524d\" (UID: \"95e8f67c-537e-4744-a3c2-7dd93084f455\") " pod="metallb-system/frr-k8s-v524d" Oct 11 01:07:15 crc kubenswrapper[4743]: I1011 01:07:15.088075 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: 
\"kubernetes.io/empty-dir/95e8f67c-537e-4744-a3c2-7dd93084f455-metrics\") pod \"frr-k8s-v524d\" (UID: \"95e8f67c-537e-4744-a3c2-7dd93084f455\") " pod="metallb-system/frr-k8s-v524d" Oct 11 01:07:15 crc kubenswrapper[4743]: I1011 01:07:15.088104 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/83bddd85-204d-438d-a29f-e7fca659542a-metrics-certs\") pod \"speaker-9g4hk\" (UID: \"83bddd85-204d-438d-a29f-e7fca659542a\") " pod="metallb-system/speaker-9g4hk" Oct 11 01:07:15 crc kubenswrapper[4743]: I1011 01:07:15.088466 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/95e8f67c-537e-4744-a3c2-7dd93084f455-reloader\") pod \"frr-k8s-v524d\" (UID: \"95e8f67c-537e-4744-a3c2-7dd93084f455\") " pod="metallb-system/frr-k8s-v524d" Oct 11 01:07:15 crc kubenswrapper[4743]: I1011 01:07:15.088643 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/95e8f67c-537e-4744-a3c2-7dd93084f455-frr-conf\") pod \"frr-k8s-v524d\" (UID: \"95e8f67c-537e-4744-a3c2-7dd93084f455\") " pod="metallb-system/frr-k8s-v524d" Oct 11 01:07:15 crc kubenswrapper[4743]: E1011 01:07:15.088719 4743 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Oct 11 01:07:15 crc kubenswrapper[4743]: E1011 01:07:15.088757 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2772ef34-f307-4a91-8f2f-28e3b22375a0-cert podName:2772ef34-f307-4a91-8f2f-28e3b22375a0 nodeName:}" failed. No retries permitted until 2025-10-11 01:07:15.588742606 +0000 UTC m=+930.241723003 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2772ef34-f307-4a91-8f2f-28e3b22375a0-cert") pod "frr-k8s-webhook-server-64bf5d555-jct7h" (UID: "2772ef34-f307-4a91-8f2f-28e3b22375a0") : secret "frr-k8s-webhook-server-cert" not found Oct 11 01:07:15 crc kubenswrapper[4743]: I1011 01:07:15.089280 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/95e8f67c-537e-4744-a3c2-7dd93084f455-frr-sockets\") pod \"frr-k8s-v524d\" (UID: \"95e8f67c-537e-4744-a3c2-7dd93084f455\") " pod="metallb-system/frr-k8s-v524d" Oct 11 01:07:15 crc kubenswrapper[4743]: I1011 01:07:15.089907 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/95e8f67c-537e-4744-a3c2-7dd93084f455-frr-startup\") pod \"frr-k8s-v524d\" (UID: \"95e8f67c-537e-4744-a3c2-7dd93084f455\") " pod="metallb-system/frr-k8s-v524d" Oct 11 01:07:15 crc kubenswrapper[4743]: E1011 01:07:15.090037 4743 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Oct 11 01:07:15 crc kubenswrapper[4743]: E1011 01:07:15.090124 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/95e8f67c-537e-4744-a3c2-7dd93084f455-metrics-certs podName:95e8f67c-537e-4744-a3c2-7dd93084f455 nodeName:}" failed. No retries permitted until 2025-10-11 01:07:15.59011223 +0000 UTC m=+930.243092627 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/95e8f67c-537e-4744-a3c2-7dd93084f455-metrics-certs") pod "frr-k8s-v524d" (UID: "95e8f67c-537e-4744-a3c2-7dd93084f455") : secret "frr-k8s-certs-secret" not found Oct 11 01:07:15 crc kubenswrapper[4743]: I1011 01:07:15.090231 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/95e8f67c-537e-4744-a3c2-7dd93084f455-metrics\") pod \"frr-k8s-v524d\" (UID: \"95e8f67c-537e-4744-a3c2-7dd93084f455\") " pod="metallb-system/frr-k8s-v524d" Oct 11 01:07:15 crc kubenswrapper[4743]: I1011 01:07:15.126486 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swlqb\" (UniqueName: \"kubernetes.io/projected/95e8f67c-537e-4744-a3c2-7dd93084f455-kube-api-access-swlqb\") pod \"frr-k8s-v524d\" (UID: \"95e8f67c-537e-4744-a3c2-7dd93084f455\") " pod="metallb-system/frr-k8s-v524d" Oct 11 01:07:15 crc kubenswrapper[4743]: I1011 01:07:15.126761 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzzn4\" (UniqueName: \"kubernetes.io/projected/2772ef34-f307-4a91-8f2f-28e3b22375a0-kube-api-access-pzzn4\") pod \"frr-k8s-webhook-server-64bf5d555-jct7h\" (UID: \"2772ef34-f307-4a91-8f2f-28e3b22375a0\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-jct7h" Oct 11 01:07:15 crc kubenswrapper[4743]: I1011 01:07:15.188985 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3b7ecaea-d42f-44b0-a181-3c61cf45bde2-metrics-certs\") pod \"controller-68d546b9d8-bwqtl\" (UID: \"3b7ecaea-d42f-44b0-a181-3c61cf45bde2\") " pod="metallb-system/controller-68d546b9d8-bwqtl" Oct 11 01:07:15 crc kubenswrapper[4743]: I1011 01:07:15.189269 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bc8nx\" (UniqueName: 
\"kubernetes.io/projected/3b7ecaea-d42f-44b0-a181-3c61cf45bde2-kube-api-access-bc8nx\") pod \"controller-68d546b9d8-bwqtl\" (UID: \"3b7ecaea-d42f-44b0-a181-3c61cf45bde2\") " pod="metallb-system/controller-68d546b9d8-bwqtl" Oct 11 01:07:15 crc kubenswrapper[4743]: I1011 01:07:15.189342 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/83bddd85-204d-438d-a29f-e7fca659542a-memberlist\") pod \"speaker-9g4hk\" (UID: \"83bddd85-204d-438d-a29f-e7fca659542a\") " pod="metallb-system/speaker-9g4hk" Oct 11 01:07:15 crc kubenswrapper[4743]: I1011 01:07:15.189415 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/83bddd85-204d-438d-a29f-e7fca659542a-metrics-certs\") pod \"speaker-9g4hk\" (UID: \"83bddd85-204d-438d-a29f-e7fca659542a\") " pod="metallb-system/speaker-9g4hk" Oct 11 01:07:15 crc kubenswrapper[4743]: I1011 01:07:15.189448 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hr2qf\" (UniqueName: \"kubernetes.io/projected/83bddd85-204d-438d-a29f-e7fca659542a-kube-api-access-hr2qf\") pod \"speaker-9g4hk\" (UID: \"83bddd85-204d-438d-a29f-e7fca659542a\") " pod="metallb-system/speaker-9g4hk" Oct 11 01:07:15 crc kubenswrapper[4743]: I1011 01:07:15.189480 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3b7ecaea-d42f-44b0-a181-3c61cf45bde2-cert\") pod \"controller-68d546b9d8-bwqtl\" (UID: \"3b7ecaea-d42f-44b0-a181-3c61cf45bde2\") " pod="metallb-system/controller-68d546b9d8-bwqtl" Oct 11 01:07:15 crc kubenswrapper[4743]: I1011 01:07:15.189508 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/83bddd85-204d-438d-a29f-e7fca659542a-metallb-excludel2\") pod \"speaker-9g4hk\" (UID: 
\"83bddd85-204d-438d-a29f-e7fca659542a\") " pod="metallb-system/speaker-9g4hk" Oct 11 01:07:15 crc kubenswrapper[4743]: I1011 01:07:15.190256 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/83bddd85-204d-438d-a29f-e7fca659542a-metallb-excludel2\") pod \"speaker-9g4hk\" (UID: \"83bddd85-204d-438d-a29f-e7fca659542a\") " pod="metallb-system/speaker-9g4hk" Oct 11 01:07:15 crc kubenswrapper[4743]: E1011 01:07:15.190360 4743 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Oct 11 01:07:15 crc kubenswrapper[4743]: E1011 01:07:15.190416 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3b7ecaea-d42f-44b0-a181-3c61cf45bde2-metrics-certs podName:3b7ecaea-d42f-44b0-a181-3c61cf45bde2 nodeName:}" failed. No retries permitted until 2025-10-11 01:07:15.690399926 +0000 UTC m=+930.343380323 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3b7ecaea-d42f-44b0-a181-3c61cf45bde2-metrics-certs") pod "controller-68d546b9d8-bwqtl" (UID: "3b7ecaea-d42f-44b0-a181-3c61cf45bde2") : secret "controller-certs-secret" not found Oct 11 01:07:15 crc kubenswrapper[4743]: E1011 01:07:15.190701 4743 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 11 01:07:15 crc kubenswrapper[4743]: E1011 01:07:15.190730 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83bddd85-204d-438d-a29f-e7fca659542a-memberlist podName:83bddd85-204d-438d-a29f-e7fca659542a nodeName:}" failed. No retries permitted until 2025-10-11 01:07:15.690721964 +0000 UTC m=+930.343702351 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/83bddd85-204d-438d-a29f-e7fca659542a-memberlist") pod "speaker-9g4hk" (UID: "83bddd85-204d-438d-a29f-e7fca659542a") : secret "metallb-memberlist" not found Oct 11 01:07:15 crc kubenswrapper[4743]: I1011 01:07:15.192909 4743 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 11 01:07:15 crc kubenswrapper[4743]: I1011 01:07:15.205217 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3b7ecaea-d42f-44b0-a181-3c61cf45bde2-cert\") pod \"controller-68d546b9d8-bwqtl\" (UID: \"3b7ecaea-d42f-44b0-a181-3c61cf45bde2\") " pod="metallb-system/controller-68d546b9d8-bwqtl" Oct 11 01:07:15 crc kubenswrapper[4743]: I1011 01:07:15.207682 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bc8nx\" (UniqueName: \"kubernetes.io/projected/3b7ecaea-d42f-44b0-a181-3c61cf45bde2-kube-api-access-bc8nx\") pod \"controller-68d546b9d8-bwqtl\" (UID: \"3b7ecaea-d42f-44b0-a181-3c61cf45bde2\") " pod="metallb-system/controller-68d546b9d8-bwqtl" Oct 11 01:07:15 crc kubenswrapper[4743]: I1011 01:07:15.210160 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hr2qf\" (UniqueName: \"kubernetes.io/projected/83bddd85-204d-438d-a29f-e7fca659542a-kube-api-access-hr2qf\") pod \"speaker-9g4hk\" (UID: \"83bddd85-204d-438d-a29f-e7fca659542a\") " pod="metallb-system/speaker-9g4hk" Oct 11 01:07:15 crc kubenswrapper[4743]: I1011 01:07:15.212272 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/83bddd85-204d-438d-a29f-e7fca659542a-metrics-certs\") pod \"speaker-9g4hk\" (UID: \"83bddd85-204d-438d-a29f-e7fca659542a\") " pod="metallb-system/speaker-9g4hk" Oct 11 01:07:15 crc kubenswrapper[4743]: I1011 01:07:15.594580 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/95e8f67c-537e-4744-a3c2-7dd93084f455-metrics-certs\") pod \"frr-k8s-v524d\" (UID: \"95e8f67c-537e-4744-a3c2-7dd93084f455\") " pod="metallb-system/frr-k8s-v524d" Oct 11 01:07:15 crc kubenswrapper[4743]: I1011 01:07:15.594684 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2772ef34-f307-4a91-8f2f-28e3b22375a0-cert\") pod \"frr-k8s-webhook-server-64bf5d555-jct7h\" (UID: \"2772ef34-f307-4a91-8f2f-28e3b22375a0\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-jct7h" Oct 11 01:07:15 crc kubenswrapper[4743]: I1011 01:07:15.597835 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/95e8f67c-537e-4744-a3c2-7dd93084f455-metrics-certs\") pod \"frr-k8s-v524d\" (UID: \"95e8f67c-537e-4744-a3c2-7dd93084f455\") " pod="metallb-system/frr-k8s-v524d" Oct 11 01:07:15 crc kubenswrapper[4743]: I1011 01:07:15.599468 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2772ef34-f307-4a91-8f2f-28e3b22375a0-cert\") pod \"frr-k8s-webhook-server-64bf5d555-jct7h\" (UID: \"2772ef34-f307-4a91-8f2f-28e3b22375a0\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-jct7h" Oct 11 01:07:15 crc kubenswrapper[4743]: I1011 01:07:15.695605 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/83bddd85-204d-438d-a29f-e7fca659542a-memberlist\") pod \"speaker-9g4hk\" (UID: \"83bddd85-204d-438d-a29f-e7fca659542a\") " pod="metallb-system/speaker-9g4hk" Oct 11 01:07:15 crc kubenswrapper[4743]: I1011 01:07:15.695702 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3b7ecaea-d42f-44b0-a181-3c61cf45bde2-metrics-certs\") pod 
\"controller-68d546b9d8-bwqtl\" (UID: \"3b7ecaea-d42f-44b0-a181-3c61cf45bde2\") " pod="metallb-system/controller-68d546b9d8-bwqtl" Oct 11 01:07:15 crc kubenswrapper[4743]: E1011 01:07:15.695791 4743 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 11 01:07:15 crc kubenswrapper[4743]: E1011 01:07:15.695879 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83bddd85-204d-438d-a29f-e7fca659542a-memberlist podName:83bddd85-204d-438d-a29f-e7fca659542a nodeName:}" failed. No retries permitted until 2025-10-11 01:07:16.695843935 +0000 UTC m=+931.348824332 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/83bddd85-204d-438d-a29f-e7fca659542a-memberlist") pod "speaker-9g4hk" (UID: "83bddd85-204d-438d-a29f-e7fca659542a") : secret "metallb-memberlist" not found Oct 11 01:07:15 crc kubenswrapper[4743]: I1011 01:07:15.699532 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3b7ecaea-d42f-44b0-a181-3c61cf45bde2-metrics-certs\") pod \"controller-68d546b9d8-bwqtl\" (UID: \"3b7ecaea-d42f-44b0-a181-3c61cf45bde2\") " pod="metallb-system/controller-68d546b9d8-bwqtl" Oct 11 01:07:15 crc kubenswrapper[4743]: I1011 01:07:15.763623 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" event={"ID":"add92263-e252-446b-95de-092585b4357f","Type":"ContainerStarted","Data":"d141abb12335a71090b8204b0a7206f68b485cc9db85f994938ef978a23ae624"} Oct 11 01:07:15 crc kubenswrapper[4743]: I1011 01:07:15.876842 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-v524d" Oct 11 01:07:15 crc kubenswrapper[4743]: I1011 01:07:15.896633 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-jct7h" Oct 11 01:07:15 crc kubenswrapper[4743]: I1011 01:07:15.991530 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-68d546b9d8-bwqtl" Oct 11 01:07:16 crc kubenswrapper[4743]: I1011 01:07:16.343828 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-jct7h"] Oct 11 01:07:16 crc kubenswrapper[4743]: I1011 01:07:16.430372 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-bwqtl"] Oct 11 01:07:16 crc kubenswrapper[4743]: W1011 01:07:16.440360 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b7ecaea_d42f_44b0_a181_3c61cf45bde2.slice/crio-60953a0408538b23b361a05e3112d2ec6e9887425581f8194a294217a180915c WatchSource:0}: Error finding container 60953a0408538b23b361a05e3112d2ec6e9887425581f8194a294217a180915c: Status 404 returned error can't find the container with id 60953a0408538b23b361a05e3112d2ec6e9887425581f8194a294217a180915c Oct 11 01:07:16 crc kubenswrapper[4743]: I1011 01:07:16.712495 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/83bddd85-204d-438d-a29f-e7fca659542a-memberlist\") pod \"speaker-9g4hk\" (UID: \"83bddd85-204d-438d-a29f-e7fca659542a\") " pod="metallb-system/speaker-9g4hk" Oct 11 01:07:16 crc kubenswrapper[4743]: I1011 01:07:16.718939 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/83bddd85-204d-438d-a29f-e7fca659542a-memberlist\") pod \"speaker-9g4hk\" (UID: \"83bddd85-204d-438d-a29f-e7fca659542a\") " pod="metallb-system/speaker-9g4hk" Oct 11 01:07:16 crc kubenswrapper[4743]: I1011 01:07:16.770265 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/frr-k8s-webhook-server-64bf5d555-jct7h" event={"ID":"2772ef34-f307-4a91-8f2f-28e3b22375a0","Type":"ContainerStarted","Data":"f23bace4e4e97bb071dbd6b5ca1686c9d20a7d370955a685e81439c838f68c2f"} Oct 11 01:07:16 crc kubenswrapper[4743]: I1011 01:07:16.772004 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-bwqtl" event={"ID":"3b7ecaea-d42f-44b0-a181-3c61cf45bde2","Type":"ContainerStarted","Data":"b40d3fa26835ebb28f00b4bfc653119a12daaef02e2c7e73dd119665c706f6fc"} Oct 11 01:07:16 crc kubenswrapper[4743]: I1011 01:07:16.772033 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-bwqtl" event={"ID":"3b7ecaea-d42f-44b0-a181-3c61cf45bde2","Type":"ContainerStarted","Data":"60953a0408538b23b361a05e3112d2ec6e9887425581f8194a294217a180915c"} Oct 11 01:07:16 crc kubenswrapper[4743]: I1011 01:07:16.773874 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-v524d" event={"ID":"95e8f67c-537e-4744-a3c2-7dd93084f455","Type":"ContainerStarted","Data":"00992cedb18d628c5077c70dad39b39203b3ac0a1f5b7bb89475ea9f4bb34eff"} Oct 11 01:07:16 crc kubenswrapper[4743]: I1011 01:07:16.883529 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-9g4hk" Oct 11 01:07:17 crc kubenswrapper[4743]: I1011 01:07:17.792169 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-bwqtl" event={"ID":"3b7ecaea-d42f-44b0-a181-3c61cf45bde2","Type":"ContainerStarted","Data":"d0dde31016dba87b6bc8613fda74b2cdc4c9aee22e0518dd278fbb988d485e56"} Oct 11 01:07:17 crc kubenswrapper[4743]: I1011 01:07:17.793551 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-68d546b9d8-bwqtl" Oct 11 01:07:17 crc kubenswrapper[4743]: I1011 01:07:17.797593 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-9g4hk" event={"ID":"83bddd85-204d-438d-a29f-e7fca659542a","Type":"ContainerStarted","Data":"018e650423529a4d81309a74fe56e988162d38d2f39d1e80f0b43ab6efbf71fe"} Oct 11 01:07:17 crc kubenswrapper[4743]: I1011 01:07:17.797654 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-9g4hk" event={"ID":"83bddd85-204d-438d-a29f-e7fca659542a","Type":"ContainerStarted","Data":"4a50ff2a7c64199e3bee727311f28aef8073117f23006ced20ff3ed798424a93"} Oct 11 01:07:17 crc kubenswrapper[4743]: I1011 01:07:17.797665 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-9g4hk" event={"ID":"83bddd85-204d-438d-a29f-e7fca659542a","Type":"ContainerStarted","Data":"89b4595e865274394e7032d13da21f035049d3a0ba0f42ab408220534ecf2241"} Oct 11 01:07:17 crc kubenswrapper[4743]: I1011 01:07:17.797862 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-9g4hk" Oct 11 01:07:17 crc kubenswrapper[4743]: I1011 01:07:17.842495 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-68d546b9d8-bwqtl" podStartSLOduration=2.842480048 podStartE2EDuration="2.842480048s" podCreationTimestamp="2025-10-11 01:07:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 01:07:17.81436378 +0000 UTC m=+932.467344177" watchObservedRunningTime="2025-10-11 01:07:17.842480048 +0000 UTC m=+932.495460455" Oct 11 01:07:17 crc kubenswrapper[4743]: I1011 01:07:17.843542 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-9g4hk" podStartSLOduration=2.843534264 podStartE2EDuration="2.843534264s" podCreationTimestamp="2025-10-11 01:07:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 01:07:17.839247689 +0000 UTC m=+932.492228086" watchObservedRunningTime="2025-10-11 01:07:17.843534264 +0000 UTC m=+932.496514671" Oct 11 01:07:23 crc kubenswrapper[4743]: I1011 01:07:23.869333 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-jct7h" event={"ID":"2772ef34-f307-4a91-8f2f-28e3b22375a0","Type":"ContainerStarted","Data":"a444ecc7bb8dd1c7af3440dbf5094e6bdaf72f0424bf75723e8a467d478677c6"} Oct 11 01:07:23 crc kubenswrapper[4743]: I1011 01:07:23.870624 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-jct7h" Oct 11 01:07:23 crc kubenswrapper[4743]: I1011 01:07:23.871247 4743 generic.go:334] "Generic (PLEG): container finished" podID="95e8f67c-537e-4744-a3c2-7dd93084f455" containerID="548b299185184ada3a7ca9d62645e1f450f200360e62dc753b3a94115b725eb8" exitCode=0 Oct 11 01:07:23 crc kubenswrapper[4743]: I1011 01:07:23.871310 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-v524d" event={"ID":"95e8f67c-537e-4744-a3c2-7dd93084f455","Type":"ContainerDied","Data":"548b299185184ada3a7ca9d62645e1f450f200360e62dc753b3a94115b725eb8"} Oct 11 01:07:23 crc kubenswrapper[4743]: I1011 01:07:23.889344 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="metallb-system/frr-k8s-webhook-server-64bf5d555-jct7h" podStartSLOduration=2.8466059489999997 podStartE2EDuration="9.889321993s" podCreationTimestamp="2025-10-11 01:07:14 +0000 UTC" firstStartedPulling="2025-10-11 01:07:16.349016202 +0000 UTC m=+931.001996599" lastFinishedPulling="2025-10-11 01:07:23.391732226 +0000 UTC m=+938.044712643" observedRunningTime="2025-10-11 01:07:23.885079019 +0000 UTC m=+938.538059426" watchObservedRunningTime="2025-10-11 01:07:23.889321993 +0000 UTC m=+938.542302390" Oct 11 01:07:24 crc kubenswrapper[4743]: I1011 01:07:24.883884 4743 generic.go:334] "Generic (PLEG): container finished" podID="95e8f67c-537e-4744-a3c2-7dd93084f455" containerID="addd935bf4bd455c21f89c7b402947bda536d6a1f0e4fa742cea0dcdf7ef79b3" exitCode=0 Oct 11 01:07:24 crc kubenswrapper[4743]: I1011 01:07:24.884003 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-v524d" event={"ID":"95e8f67c-537e-4744-a3c2-7dd93084f455","Type":"ContainerDied","Data":"addd935bf4bd455c21f89c7b402947bda536d6a1f0e4fa742cea0dcdf7ef79b3"} Oct 11 01:07:25 crc kubenswrapper[4743]: I1011 01:07:25.897560 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-v524d" event={"ID":"95e8f67c-537e-4744-a3c2-7dd93084f455","Type":"ContainerStarted","Data":"196579f021df10d0b98bf1884ed6c75acacbc864d03b80c1a3c23a52a1bab8b2"} Oct 11 01:07:26 crc kubenswrapper[4743]: I1011 01:07:26.907700 4743 generic.go:334] "Generic (PLEG): container finished" podID="95e8f67c-537e-4744-a3c2-7dd93084f455" containerID="196579f021df10d0b98bf1884ed6c75acacbc864d03b80c1a3c23a52a1bab8b2" exitCode=0 Oct 11 01:07:26 crc kubenswrapper[4743]: I1011 01:07:26.907776 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-v524d" event={"ID":"95e8f67c-537e-4744-a3c2-7dd93084f455","Type":"ContainerDied","Data":"196579f021df10d0b98bf1884ed6c75acacbc864d03b80c1a3c23a52a1bab8b2"} Oct 11 01:07:27 crc kubenswrapper[4743]: I1011 01:07:27.916369 4743 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-v524d" event={"ID":"95e8f67c-537e-4744-a3c2-7dd93084f455","Type":"ContainerStarted","Data":"cc758f0ca48651f4a3e8aa25a3eebe38d1e593222ededacce7418ac117d8d052"} Oct 11 01:07:27 crc kubenswrapper[4743]: I1011 01:07:27.916672 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-v524d" event={"ID":"95e8f67c-537e-4744-a3c2-7dd93084f455","Type":"ContainerStarted","Data":"1876626790d4be34829e12f611b565753ab3e646989369dbc1d010adf4941a66"} Oct 11 01:07:27 crc kubenswrapper[4743]: I1011 01:07:27.916684 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-v524d" event={"ID":"95e8f67c-537e-4744-a3c2-7dd93084f455","Type":"ContainerStarted","Data":"babc27cf7e826f307566b91f6dc8f0461c86918255cf298c90ee9be7aa0b9b38"} Oct 11 01:07:27 crc kubenswrapper[4743]: I1011 01:07:27.916692 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-v524d" event={"ID":"95e8f67c-537e-4744-a3c2-7dd93084f455","Type":"ContainerStarted","Data":"15527f99b726d26978d32d2253598166e0621bcba3fba6a7dc59fbe51173a44a"} Oct 11 01:07:27 crc kubenswrapper[4743]: I1011 01:07:27.916702 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-v524d" event={"ID":"95e8f67c-537e-4744-a3c2-7dd93084f455","Type":"ContainerStarted","Data":"7afb84f29cacb50b91ae14e862ce2208c036ef73d092e25933a5b56fb16b4836"} Oct 11 01:07:28 crc kubenswrapper[4743]: I1011 01:07:28.933133 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-v524d" event={"ID":"95e8f67c-537e-4744-a3c2-7dd93084f455","Type":"ContainerStarted","Data":"ad6e8c4095c9a223ffa46fe8581ec55e30fc5893a365251d55c4c220d161c5e0"} Oct 11 01:07:28 crc kubenswrapper[4743]: I1011 01:07:28.933577 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-v524d" Oct 11 01:07:28 crc kubenswrapper[4743]: I1011 01:07:28.967763 4743 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-v524d" podStartSLOduration=7.583673974 podStartE2EDuration="14.967737778s" podCreationTimestamp="2025-10-11 01:07:14 +0000 UTC" firstStartedPulling="2025-10-11 01:07:16.03904065 +0000 UTC m=+930.692021057" lastFinishedPulling="2025-10-11 01:07:23.423104424 +0000 UTC m=+938.076084861" observedRunningTime="2025-10-11 01:07:28.959919156 +0000 UTC m=+943.612899573" watchObservedRunningTime="2025-10-11 01:07:28.967737778 +0000 UTC m=+943.620718215" Oct 11 01:07:30 crc kubenswrapper[4743]: I1011 01:07:30.877761 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-v524d" Oct 11 01:07:30 crc kubenswrapper[4743]: I1011 01:07:30.911098 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-v524d" Oct 11 01:07:35 crc kubenswrapper[4743]: I1011 01:07:35.948160 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-jct7h" Oct 11 01:07:35 crc kubenswrapper[4743]: I1011 01:07:35.996633 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-68d546b9d8-bwqtl" Oct 11 01:07:36 crc kubenswrapper[4743]: I1011 01:07:36.890842 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-9g4hk" Oct 11 01:07:40 crc kubenswrapper[4743]: I1011 01:07:40.026812 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-gd5f2"] Oct 11 01:07:40 crc kubenswrapper[4743]: I1011 01:07:40.028588 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-gd5f2" Oct 11 01:07:40 crc kubenswrapper[4743]: I1011 01:07:40.030993 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Oct 11 01:07:40 crc kubenswrapper[4743]: I1011 01:07:40.031017 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Oct 11 01:07:40 crc kubenswrapper[4743]: I1011 01:07:40.033781 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-7dfnd" Oct 11 01:07:40 crc kubenswrapper[4743]: I1011 01:07:40.080755 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-gd5f2"] Oct 11 01:07:40 crc kubenswrapper[4743]: I1011 01:07:40.212990 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ql86c\" (UniqueName: \"kubernetes.io/projected/3629019a-f2bd-4af5-b4cf-0cbe889a8f7b-kube-api-access-ql86c\") pod \"openstack-operator-index-gd5f2\" (UID: \"3629019a-f2bd-4af5-b4cf-0cbe889a8f7b\") " pod="openstack-operators/openstack-operator-index-gd5f2" Oct 11 01:07:40 crc kubenswrapper[4743]: I1011 01:07:40.315523 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ql86c\" (UniqueName: \"kubernetes.io/projected/3629019a-f2bd-4af5-b4cf-0cbe889a8f7b-kube-api-access-ql86c\") pod \"openstack-operator-index-gd5f2\" (UID: \"3629019a-f2bd-4af5-b4cf-0cbe889a8f7b\") " pod="openstack-operators/openstack-operator-index-gd5f2" Oct 11 01:07:40 crc kubenswrapper[4743]: I1011 01:07:40.333442 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ql86c\" (UniqueName: \"kubernetes.io/projected/3629019a-f2bd-4af5-b4cf-0cbe889a8f7b-kube-api-access-ql86c\") pod \"openstack-operator-index-gd5f2\" (UID: 
\"3629019a-f2bd-4af5-b4cf-0cbe889a8f7b\") " pod="openstack-operators/openstack-operator-index-gd5f2" Oct 11 01:07:40 crc kubenswrapper[4743]: I1011 01:07:40.365923 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-gd5f2" Oct 11 01:07:40 crc kubenswrapper[4743]: I1011 01:07:40.641431 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-gd5f2"] Oct 11 01:07:41 crc kubenswrapper[4743]: I1011 01:07:41.033062 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-gd5f2" event={"ID":"3629019a-f2bd-4af5-b4cf-0cbe889a8f7b","Type":"ContainerStarted","Data":"98f2712dbc56681a7316b8987e67a22fc9a7f71f6c95f615117695304468a051"} Oct 11 01:07:43 crc kubenswrapper[4743]: I1011 01:07:43.048482 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-gd5f2" event={"ID":"3629019a-f2bd-4af5-b4cf-0cbe889a8f7b","Type":"ContainerStarted","Data":"4f23d54d329c5dc437fa072703e83dfaf3efa4855b3dc9938947832b1c5a2b7b"} Oct 11 01:07:43 crc kubenswrapper[4743]: I1011 01:07:43.065604 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-gd5f2" podStartSLOduration=2.004256627 podStartE2EDuration="4.0655874s" podCreationTimestamp="2025-10-11 01:07:39 +0000 UTC" firstStartedPulling="2025-10-11 01:07:40.657189406 +0000 UTC m=+955.310169793" lastFinishedPulling="2025-10-11 01:07:42.718520179 +0000 UTC m=+957.371500566" observedRunningTime="2025-10-11 01:07:43.061637743 +0000 UTC m=+957.714618180" watchObservedRunningTime="2025-10-11 01:07:43.0655874 +0000 UTC m=+957.718567797" Oct 11 01:07:43 crc kubenswrapper[4743]: I1011 01:07:43.389598 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-gd5f2"] Oct 11 01:07:44 crc kubenswrapper[4743]: I1011 01:07:44.006406 
4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-848vd"] Oct 11 01:07:44 crc kubenswrapper[4743]: I1011 01:07:44.009101 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-848vd" Oct 11 01:07:44 crc kubenswrapper[4743]: I1011 01:07:44.032366 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-848vd"] Oct 11 01:07:44 crc kubenswrapper[4743]: I1011 01:07:44.178134 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vfvf\" (UniqueName: \"kubernetes.io/projected/bd73b279-e2a4-4500-aed1-70c73212cba1-kube-api-access-2vfvf\") pod \"openstack-operator-index-848vd\" (UID: \"bd73b279-e2a4-4500-aed1-70c73212cba1\") " pod="openstack-operators/openstack-operator-index-848vd" Oct 11 01:07:44 crc kubenswrapper[4743]: I1011 01:07:44.280007 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vfvf\" (UniqueName: \"kubernetes.io/projected/bd73b279-e2a4-4500-aed1-70c73212cba1-kube-api-access-2vfvf\") pod \"openstack-operator-index-848vd\" (UID: \"bd73b279-e2a4-4500-aed1-70c73212cba1\") " pod="openstack-operators/openstack-operator-index-848vd" Oct 11 01:07:44 crc kubenswrapper[4743]: I1011 01:07:44.313573 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vfvf\" (UniqueName: \"kubernetes.io/projected/bd73b279-e2a4-4500-aed1-70c73212cba1-kube-api-access-2vfvf\") pod \"openstack-operator-index-848vd\" (UID: \"bd73b279-e2a4-4500-aed1-70c73212cba1\") " pod="openstack-operators/openstack-operator-index-848vd" Oct 11 01:07:44 crc kubenswrapper[4743]: I1011 01:07:44.344364 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-848vd" Oct 11 01:07:44 crc kubenswrapper[4743]: I1011 01:07:44.807019 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-848vd"] Oct 11 01:07:45 crc kubenswrapper[4743]: I1011 01:07:45.066041 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-848vd" event={"ID":"bd73b279-e2a4-4500-aed1-70c73212cba1","Type":"ContainerStarted","Data":"83f755c8e1555eface11afa35c0b543dc9f7bd5538337e8e936fe296af0839cd"} Oct 11 01:07:45 crc kubenswrapper[4743]: I1011 01:07:45.066087 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-848vd" event={"ID":"bd73b279-e2a4-4500-aed1-70c73212cba1","Type":"ContainerStarted","Data":"6e2597baee8dc05a922bbf674d1238f7624e57b98abeb5348b6c9210ba5ceefb"} Oct 11 01:07:45 crc kubenswrapper[4743]: I1011 01:07:45.066205 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-gd5f2" podUID="3629019a-f2bd-4af5-b4cf-0cbe889a8f7b" containerName="registry-server" containerID="cri-o://4f23d54d329c5dc437fa072703e83dfaf3efa4855b3dc9938947832b1c5a2b7b" gracePeriod=2 Oct 11 01:07:45 crc kubenswrapper[4743]: I1011 01:07:45.080029 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-848vd" podStartSLOduration=2.024564968 podStartE2EDuration="2.080012546s" podCreationTimestamp="2025-10-11 01:07:43 +0000 UTC" firstStartedPulling="2025-10-11 01:07:44.820777527 +0000 UTC m=+959.473757924" lastFinishedPulling="2025-10-11 01:07:44.876225105 +0000 UTC m=+959.529205502" observedRunningTime="2025-10-11 01:07:45.078452328 +0000 UTC m=+959.731432735" watchObservedRunningTime="2025-10-11 01:07:45.080012546 +0000 UTC m=+959.732992933" Oct 11 01:07:45 crc kubenswrapper[4743]: I1011 01:07:45.599595 4743 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-gd5f2" Oct 11 01:07:45 crc kubenswrapper[4743]: I1011 01:07:45.802476 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ql86c\" (UniqueName: \"kubernetes.io/projected/3629019a-f2bd-4af5-b4cf-0cbe889a8f7b-kube-api-access-ql86c\") pod \"3629019a-f2bd-4af5-b4cf-0cbe889a8f7b\" (UID: \"3629019a-f2bd-4af5-b4cf-0cbe889a8f7b\") " Oct 11 01:07:45 crc kubenswrapper[4743]: I1011 01:07:45.807561 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3629019a-f2bd-4af5-b4cf-0cbe889a8f7b-kube-api-access-ql86c" (OuterVolumeSpecName: "kube-api-access-ql86c") pod "3629019a-f2bd-4af5-b4cf-0cbe889a8f7b" (UID: "3629019a-f2bd-4af5-b4cf-0cbe889a8f7b"). InnerVolumeSpecName "kube-api-access-ql86c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:07:45 crc kubenswrapper[4743]: I1011 01:07:45.885069 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-v524d" Oct 11 01:07:45 crc kubenswrapper[4743]: I1011 01:07:45.904176 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ql86c\" (UniqueName: \"kubernetes.io/projected/3629019a-f2bd-4af5-b4cf-0cbe889a8f7b-kube-api-access-ql86c\") on node \"crc\" DevicePath \"\"" Oct 11 01:07:46 crc kubenswrapper[4743]: I1011 01:07:46.074996 4743 generic.go:334] "Generic (PLEG): container finished" podID="3629019a-f2bd-4af5-b4cf-0cbe889a8f7b" containerID="4f23d54d329c5dc437fa072703e83dfaf3efa4855b3dc9938947832b1c5a2b7b" exitCode=0 Oct 11 01:07:46 crc kubenswrapper[4743]: I1011 01:07:46.075193 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-gd5f2" Oct 11 01:07:46 crc kubenswrapper[4743]: I1011 01:07:46.075164 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-gd5f2" event={"ID":"3629019a-f2bd-4af5-b4cf-0cbe889a8f7b","Type":"ContainerDied","Data":"4f23d54d329c5dc437fa072703e83dfaf3efa4855b3dc9938947832b1c5a2b7b"} Oct 11 01:07:46 crc kubenswrapper[4743]: I1011 01:07:46.075500 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-gd5f2" event={"ID":"3629019a-f2bd-4af5-b4cf-0cbe889a8f7b","Type":"ContainerDied","Data":"98f2712dbc56681a7316b8987e67a22fc9a7f71f6c95f615117695304468a051"} Oct 11 01:07:46 crc kubenswrapper[4743]: I1011 01:07:46.075522 4743 scope.go:117] "RemoveContainer" containerID="4f23d54d329c5dc437fa072703e83dfaf3efa4855b3dc9938947832b1c5a2b7b" Oct 11 01:07:46 crc kubenswrapper[4743]: I1011 01:07:46.102995 4743 scope.go:117] "RemoveContainer" containerID="4f23d54d329c5dc437fa072703e83dfaf3efa4855b3dc9938947832b1c5a2b7b" Oct 11 01:07:46 crc kubenswrapper[4743]: E1011 01:07:46.104024 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f23d54d329c5dc437fa072703e83dfaf3efa4855b3dc9938947832b1c5a2b7b\": container with ID starting with 4f23d54d329c5dc437fa072703e83dfaf3efa4855b3dc9938947832b1c5a2b7b not found: ID does not exist" containerID="4f23d54d329c5dc437fa072703e83dfaf3efa4855b3dc9938947832b1c5a2b7b" Oct 11 01:07:46 crc kubenswrapper[4743]: I1011 01:07:46.104085 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f23d54d329c5dc437fa072703e83dfaf3efa4855b3dc9938947832b1c5a2b7b"} err="failed to get container status \"4f23d54d329c5dc437fa072703e83dfaf3efa4855b3dc9938947832b1c5a2b7b\": rpc error: code = NotFound desc = could not find container 
\"4f23d54d329c5dc437fa072703e83dfaf3efa4855b3dc9938947832b1c5a2b7b\": container with ID starting with 4f23d54d329c5dc437fa072703e83dfaf3efa4855b3dc9938947832b1c5a2b7b not found: ID does not exist" Oct 11 01:07:46 crc kubenswrapper[4743]: I1011 01:07:46.115771 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-gd5f2"] Oct 11 01:07:46 crc kubenswrapper[4743]: I1011 01:07:46.121682 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-gd5f2"] Oct 11 01:07:48 crc kubenswrapper[4743]: I1011 01:07:48.104033 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3629019a-f2bd-4af5-b4cf-0cbe889a8f7b" path="/var/lib/kubelet/pods/3629019a-f2bd-4af5-b4cf-0cbe889a8f7b/volumes" Oct 11 01:07:49 crc kubenswrapper[4743]: E1011 01:07:49.965839 4743 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3629019a_f2bd_4af5_b4cf_0cbe889a8f7b.slice/crio-98f2712dbc56681a7316b8987e67a22fc9a7f71f6c95f615117695304468a051\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3629019a_f2bd_4af5_b4cf_0cbe889a8f7b.slice\": RecentStats: unable to find data in memory cache]" Oct 11 01:07:54 crc kubenswrapper[4743]: I1011 01:07:54.345483 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-848vd" Oct 11 01:07:54 crc kubenswrapper[4743]: I1011 01:07:54.346046 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-848vd" Oct 11 01:07:54 crc kubenswrapper[4743]: I1011 01:07:54.392017 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-848vd" Oct 11 01:07:55 crc kubenswrapper[4743]: I1011 
01:07:55.203476 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-848vd" Oct 11 01:08:00 crc kubenswrapper[4743]: E1011 01:08:00.177653 4743 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3629019a_f2bd_4af5_b4cf_0cbe889a8f7b.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3629019a_f2bd_4af5_b4cf_0cbe889a8f7b.slice/crio-98f2712dbc56681a7316b8987e67a22fc9a7f71f6c95f615117695304468a051\": RecentStats: unable to find data in memory cache]" Oct 11 01:08:02 crc kubenswrapper[4743]: I1011 01:08:02.158972 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/361c556ca48d23f340d126ae2019f729a5671f38f3b0288da3af0a85856dmz6"] Oct 11 01:08:02 crc kubenswrapper[4743]: E1011 01:08:02.159907 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3629019a-f2bd-4af5-b4cf-0cbe889a8f7b" containerName="registry-server" Oct 11 01:08:02 crc kubenswrapper[4743]: I1011 01:08:02.160104 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="3629019a-f2bd-4af5-b4cf-0cbe889a8f7b" containerName="registry-server" Oct 11 01:08:02 crc kubenswrapper[4743]: I1011 01:08:02.162551 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="3629019a-f2bd-4af5-b4cf-0cbe889a8f7b" containerName="registry-server" Oct 11 01:08:02 crc kubenswrapper[4743]: I1011 01:08:02.164432 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/361c556ca48d23f340d126ae2019f729a5671f38f3b0288da3af0a85856dmz6" Oct 11 01:08:02 crc kubenswrapper[4743]: I1011 01:08:02.169641 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/361c556ca48d23f340d126ae2019f729a5671f38f3b0288da3af0a85856dmz6"] Oct 11 01:08:02 crc kubenswrapper[4743]: I1011 01:08:02.169695 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-w674d" Oct 11 01:08:02 crc kubenswrapper[4743]: I1011 01:08:02.215593 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/61bd1af6-1438-46b0-8762-6ee4abb576cd-util\") pod \"361c556ca48d23f340d126ae2019f729a5671f38f3b0288da3af0a85856dmz6\" (UID: \"61bd1af6-1438-46b0-8762-6ee4abb576cd\") " pod="openstack-operators/361c556ca48d23f340d126ae2019f729a5671f38f3b0288da3af0a85856dmz6" Oct 11 01:08:02 crc kubenswrapper[4743]: I1011 01:08:02.215899 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/61bd1af6-1438-46b0-8762-6ee4abb576cd-bundle\") pod \"361c556ca48d23f340d126ae2019f729a5671f38f3b0288da3af0a85856dmz6\" (UID: \"61bd1af6-1438-46b0-8762-6ee4abb576cd\") " pod="openstack-operators/361c556ca48d23f340d126ae2019f729a5671f38f3b0288da3af0a85856dmz6" Oct 11 01:08:02 crc kubenswrapper[4743]: I1011 01:08:02.216036 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpqhb\" (UniqueName: \"kubernetes.io/projected/61bd1af6-1438-46b0-8762-6ee4abb576cd-kube-api-access-cpqhb\") pod \"361c556ca48d23f340d126ae2019f729a5671f38f3b0288da3af0a85856dmz6\" (UID: \"61bd1af6-1438-46b0-8762-6ee4abb576cd\") " pod="openstack-operators/361c556ca48d23f340d126ae2019f729a5671f38f3b0288da3af0a85856dmz6" Oct 11 01:08:02 crc kubenswrapper[4743]: I1011 
01:08:02.317538 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/61bd1af6-1438-46b0-8762-6ee4abb576cd-util\") pod \"361c556ca48d23f340d126ae2019f729a5671f38f3b0288da3af0a85856dmz6\" (UID: \"61bd1af6-1438-46b0-8762-6ee4abb576cd\") " pod="openstack-operators/361c556ca48d23f340d126ae2019f729a5671f38f3b0288da3af0a85856dmz6" Oct 11 01:08:02 crc kubenswrapper[4743]: I1011 01:08:02.317714 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/61bd1af6-1438-46b0-8762-6ee4abb576cd-bundle\") pod \"361c556ca48d23f340d126ae2019f729a5671f38f3b0288da3af0a85856dmz6\" (UID: \"61bd1af6-1438-46b0-8762-6ee4abb576cd\") " pod="openstack-operators/361c556ca48d23f340d126ae2019f729a5671f38f3b0288da3af0a85856dmz6" Oct 11 01:08:02 crc kubenswrapper[4743]: I1011 01:08:02.317785 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpqhb\" (UniqueName: \"kubernetes.io/projected/61bd1af6-1438-46b0-8762-6ee4abb576cd-kube-api-access-cpqhb\") pod \"361c556ca48d23f340d126ae2019f729a5671f38f3b0288da3af0a85856dmz6\" (UID: \"61bd1af6-1438-46b0-8762-6ee4abb576cd\") " pod="openstack-operators/361c556ca48d23f340d126ae2019f729a5671f38f3b0288da3af0a85856dmz6" Oct 11 01:08:02 crc kubenswrapper[4743]: I1011 01:08:02.318798 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/61bd1af6-1438-46b0-8762-6ee4abb576cd-util\") pod \"361c556ca48d23f340d126ae2019f729a5671f38f3b0288da3af0a85856dmz6\" (UID: \"61bd1af6-1438-46b0-8762-6ee4abb576cd\") " pod="openstack-operators/361c556ca48d23f340d126ae2019f729a5671f38f3b0288da3af0a85856dmz6" Oct 11 01:08:02 crc kubenswrapper[4743]: I1011 01:08:02.318937 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/61bd1af6-1438-46b0-8762-6ee4abb576cd-bundle\") pod \"361c556ca48d23f340d126ae2019f729a5671f38f3b0288da3af0a85856dmz6\" (UID: \"61bd1af6-1438-46b0-8762-6ee4abb576cd\") " pod="openstack-operators/361c556ca48d23f340d126ae2019f729a5671f38f3b0288da3af0a85856dmz6" Oct 11 01:08:02 crc kubenswrapper[4743]: I1011 01:08:02.344102 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpqhb\" (UniqueName: \"kubernetes.io/projected/61bd1af6-1438-46b0-8762-6ee4abb576cd-kube-api-access-cpqhb\") pod \"361c556ca48d23f340d126ae2019f729a5671f38f3b0288da3af0a85856dmz6\" (UID: \"61bd1af6-1438-46b0-8762-6ee4abb576cd\") " pod="openstack-operators/361c556ca48d23f340d126ae2019f729a5671f38f3b0288da3af0a85856dmz6" Oct 11 01:08:02 crc kubenswrapper[4743]: I1011 01:08:02.526527 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/361c556ca48d23f340d126ae2019f729a5671f38f3b0288da3af0a85856dmz6" Oct 11 01:08:02 crc kubenswrapper[4743]: I1011 01:08:02.977751 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/361c556ca48d23f340d126ae2019f729a5671f38f3b0288da3af0a85856dmz6"] Oct 11 01:08:03 crc kubenswrapper[4743]: I1011 01:08:03.241344 4743 generic.go:334] "Generic (PLEG): container finished" podID="61bd1af6-1438-46b0-8762-6ee4abb576cd" containerID="1a3925c11421b857f43b755947f6882ae138ad4478a94e2c3373b1635daf3d02" exitCode=0 Oct 11 01:08:03 crc kubenswrapper[4743]: I1011 01:08:03.241425 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/361c556ca48d23f340d126ae2019f729a5671f38f3b0288da3af0a85856dmz6" event={"ID":"61bd1af6-1438-46b0-8762-6ee4abb576cd","Type":"ContainerDied","Data":"1a3925c11421b857f43b755947f6882ae138ad4478a94e2c3373b1635daf3d02"} Oct 11 01:08:03 crc kubenswrapper[4743]: I1011 01:08:03.241663 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/361c556ca48d23f340d126ae2019f729a5671f38f3b0288da3af0a85856dmz6" event={"ID":"61bd1af6-1438-46b0-8762-6ee4abb576cd","Type":"ContainerStarted","Data":"c0076ce927c38e224d71643b94680c6760b02bd6ba416e445e78193c88434d52"} Oct 11 01:08:04 crc kubenswrapper[4743]: I1011 01:08:04.254704 4743 generic.go:334] "Generic (PLEG): container finished" podID="61bd1af6-1438-46b0-8762-6ee4abb576cd" containerID="d1ad4280f12115e23b94e05d5abb188141b64efd70627abe2f10ac37fdafbd03" exitCode=0 Oct 11 01:08:04 crc kubenswrapper[4743]: I1011 01:08:04.254767 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/361c556ca48d23f340d126ae2019f729a5671f38f3b0288da3af0a85856dmz6" event={"ID":"61bd1af6-1438-46b0-8762-6ee4abb576cd","Type":"ContainerDied","Data":"d1ad4280f12115e23b94e05d5abb188141b64efd70627abe2f10ac37fdafbd03"} Oct 11 01:08:05 crc kubenswrapper[4743]: I1011 01:08:05.264961 4743 generic.go:334] "Generic (PLEG): container finished" podID="61bd1af6-1438-46b0-8762-6ee4abb576cd" containerID="35649f6b3652097f2c1b55229059b19cfdcc6081473871878530ba2d82eba541" exitCode=0 Oct 11 01:08:05 crc kubenswrapper[4743]: I1011 01:08:05.265046 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/361c556ca48d23f340d126ae2019f729a5671f38f3b0288da3af0a85856dmz6" event={"ID":"61bd1af6-1438-46b0-8762-6ee4abb576cd","Type":"ContainerDied","Data":"35649f6b3652097f2c1b55229059b19cfdcc6081473871878530ba2d82eba541"} Oct 11 01:08:06 crc kubenswrapper[4743]: I1011 01:08:06.698515 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/361c556ca48d23f340d126ae2019f729a5671f38f3b0288da3af0a85856dmz6" Oct 11 01:08:06 crc kubenswrapper[4743]: I1011 01:08:06.795302 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cpqhb\" (UniqueName: \"kubernetes.io/projected/61bd1af6-1438-46b0-8762-6ee4abb576cd-kube-api-access-cpqhb\") pod \"61bd1af6-1438-46b0-8762-6ee4abb576cd\" (UID: \"61bd1af6-1438-46b0-8762-6ee4abb576cd\") " Oct 11 01:08:06 crc kubenswrapper[4743]: I1011 01:08:06.795472 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/61bd1af6-1438-46b0-8762-6ee4abb576cd-util\") pod \"61bd1af6-1438-46b0-8762-6ee4abb576cd\" (UID: \"61bd1af6-1438-46b0-8762-6ee4abb576cd\") " Oct 11 01:08:06 crc kubenswrapper[4743]: I1011 01:08:06.795537 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/61bd1af6-1438-46b0-8762-6ee4abb576cd-bundle\") pod \"61bd1af6-1438-46b0-8762-6ee4abb576cd\" (UID: \"61bd1af6-1438-46b0-8762-6ee4abb576cd\") " Oct 11 01:08:06 crc kubenswrapper[4743]: I1011 01:08:06.796817 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61bd1af6-1438-46b0-8762-6ee4abb576cd-bundle" (OuterVolumeSpecName: "bundle") pod "61bd1af6-1438-46b0-8762-6ee4abb576cd" (UID: "61bd1af6-1438-46b0-8762-6ee4abb576cd"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:08:06 crc kubenswrapper[4743]: I1011 01:08:06.808669 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61bd1af6-1438-46b0-8762-6ee4abb576cd-kube-api-access-cpqhb" (OuterVolumeSpecName: "kube-api-access-cpqhb") pod "61bd1af6-1438-46b0-8762-6ee4abb576cd" (UID: "61bd1af6-1438-46b0-8762-6ee4abb576cd"). InnerVolumeSpecName "kube-api-access-cpqhb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:08:06 crc kubenswrapper[4743]: I1011 01:08:06.825762 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61bd1af6-1438-46b0-8762-6ee4abb576cd-util" (OuterVolumeSpecName: "util") pod "61bd1af6-1438-46b0-8762-6ee4abb576cd" (UID: "61bd1af6-1438-46b0-8762-6ee4abb576cd"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:08:06 crc kubenswrapper[4743]: I1011 01:08:06.897668 4743 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/61bd1af6-1438-46b0-8762-6ee4abb576cd-util\") on node \"crc\" DevicePath \"\"" Oct 11 01:08:06 crc kubenswrapper[4743]: I1011 01:08:06.897704 4743 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/61bd1af6-1438-46b0-8762-6ee4abb576cd-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 01:08:06 crc kubenswrapper[4743]: I1011 01:08:06.897716 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cpqhb\" (UniqueName: \"kubernetes.io/projected/61bd1af6-1438-46b0-8762-6ee4abb576cd-kube-api-access-cpqhb\") on node \"crc\" DevicePath \"\"" Oct 11 01:08:07 crc kubenswrapper[4743]: I1011 01:08:07.284552 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/361c556ca48d23f340d126ae2019f729a5671f38f3b0288da3af0a85856dmz6" event={"ID":"61bd1af6-1438-46b0-8762-6ee4abb576cd","Type":"ContainerDied","Data":"c0076ce927c38e224d71643b94680c6760b02bd6ba416e445e78193c88434d52"} Oct 11 01:08:07 crc kubenswrapper[4743]: I1011 01:08:07.284610 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c0076ce927c38e224d71643b94680c6760b02bd6ba416e445e78193c88434d52" Oct 11 01:08:07 crc kubenswrapper[4743]: I1011 01:08:07.284627 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/361c556ca48d23f340d126ae2019f729a5671f38f3b0288da3af0a85856dmz6" Oct 11 01:08:10 crc kubenswrapper[4743]: E1011 01:08:10.373218 4743 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3629019a_f2bd_4af5_b4cf_0cbe889a8f7b.slice/crio-98f2712dbc56681a7316b8987e67a22fc9a7f71f6c95f615117695304468a051\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3629019a_f2bd_4af5_b4cf_0cbe889a8f7b.slice\": RecentStats: unable to find data in memory cache]" Oct 11 01:08:13 crc kubenswrapper[4743]: I1011 01:08:13.524562 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-79cb6b48d5-wqg8k"] Oct 11 01:08:13 crc kubenswrapper[4743]: E1011 01:08:13.525252 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61bd1af6-1438-46b0-8762-6ee4abb576cd" containerName="extract" Oct 11 01:08:13 crc kubenswrapper[4743]: I1011 01:08:13.525268 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="61bd1af6-1438-46b0-8762-6ee4abb576cd" containerName="extract" Oct 11 01:08:13 crc kubenswrapper[4743]: E1011 01:08:13.525280 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61bd1af6-1438-46b0-8762-6ee4abb576cd" containerName="pull" Oct 11 01:08:13 crc kubenswrapper[4743]: I1011 01:08:13.525288 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="61bd1af6-1438-46b0-8762-6ee4abb576cd" containerName="pull" Oct 11 01:08:13 crc kubenswrapper[4743]: E1011 01:08:13.525302 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61bd1af6-1438-46b0-8762-6ee4abb576cd" containerName="util" Oct 11 01:08:13 crc kubenswrapper[4743]: I1011 01:08:13.525311 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="61bd1af6-1438-46b0-8762-6ee4abb576cd" 
containerName="util" Oct 11 01:08:13 crc kubenswrapper[4743]: I1011 01:08:13.525497 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="61bd1af6-1438-46b0-8762-6ee4abb576cd" containerName="extract" Oct 11 01:08:13 crc kubenswrapper[4743]: I1011 01:08:13.526484 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-79cb6b48d5-wqg8k" Oct 11 01:08:13 crc kubenswrapper[4743]: I1011 01:08:13.529030 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-n4dfj" Oct 11 01:08:13 crc kubenswrapper[4743]: I1011 01:08:13.550185 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-79cb6b48d5-wqg8k"] Oct 11 01:08:13 crc kubenswrapper[4743]: I1011 01:08:13.703831 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwbcn\" (UniqueName: \"kubernetes.io/projected/9607e624-b661-41bd-bfaf-ceb7e552fbf2-kube-api-access-lwbcn\") pod \"openstack-operator-controller-operator-79cb6b48d5-wqg8k\" (UID: \"9607e624-b661-41bd-bfaf-ceb7e552fbf2\") " pod="openstack-operators/openstack-operator-controller-operator-79cb6b48d5-wqg8k" Oct 11 01:08:13 crc kubenswrapper[4743]: I1011 01:08:13.805077 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwbcn\" (UniqueName: \"kubernetes.io/projected/9607e624-b661-41bd-bfaf-ceb7e552fbf2-kube-api-access-lwbcn\") pod \"openstack-operator-controller-operator-79cb6b48d5-wqg8k\" (UID: \"9607e624-b661-41bd-bfaf-ceb7e552fbf2\") " pod="openstack-operators/openstack-operator-controller-operator-79cb6b48d5-wqg8k" Oct 11 01:08:13 crc kubenswrapper[4743]: I1011 01:08:13.824611 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwbcn\" (UniqueName: 
\"kubernetes.io/projected/9607e624-b661-41bd-bfaf-ceb7e552fbf2-kube-api-access-lwbcn\") pod \"openstack-operator-controller-operator-79cb6b48d5-wqg8k\" (UID: \"9607e624-b661-41bd-bfaf-ceb7e552fbf2\") " pod="openstack-operators/openstack-operator-controller-operator-79cb6b48d5-wqg8k" Oct 11 01:08:13 crc kubenswrapper[4743]: I1011 01:08:13.909672 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-79cb6b48d5-wqg8k" Oct 11 01:08:14 crc kubenswrapper[4743]: I1011 01:08:14.387364 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-79cb6b48d5-wqg8k"] Oct 11 01:08:15 crc kubenswrapper[4743]: I1011 01:08:15.375612 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-79cb6b48d5-wqg8k" event={"ID":"9607e624-b661-41bd-bfaf-ceb7e552fbf2","Type":"ContainerStarted","Data":"f1249dac4b18b73524f89b153565602c6b1949632b98e73e826b0b6cc2716999"} Oct 11 01:08:18 crc kubenswrapper[4743]: I1011 01:08:18.407121 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-79cb6b48d5-wqg8k" event={"ID":"9607e624-b661-41bd-bfaf-ceb7e552fbf2","Type":"ContainerStarted","Data":"5ed2398466d2e66cb0115a18b3b279b53f2df58f6e4729e40f1553f7a57cf4bb"} Oct 11 01:08:20 crc kubenswrapper[4743]: E1011 01:08:20.531230 4743 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3629019a_f2bd_4af5_b4cf_0cbe889a8f7b.slice/crio-98f2712dbc56681a7316b8987e67a22fc9a7f71f6c95f615117695304468a051\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3629019a_f2bd_4af5_b4cf_0cbe889a8f7b.slice\": RecentStats: unable to find data in memory cache]" Oct 11 01:08:21 crc 
kubenswrapper[4743]: I1011 01:08:21.431641 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-79cb6b48d5-wqg8k" event={"ID":"9607e624-b661-41bd-bfaf-ceb7e552fbf2","Type":"ContainerStarted","Data":"2ac3c708edef81c1eeb6adfec68ecb240b2920241675b96ae20e6e54ff981605"} Oct 11 01:08:21 crc kubenswrapper[4743]: I1011 01:08:21.432131 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-79cb6b48d5-wqg8k" Oct 11 01:08:21 crc kubenswrapper[4743]: I1011 01:08:21.484261 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-79cb6b48d5-wqg8k" podStartSLOduration=2.285478816 podStartE2EDuration="8.48423842s" podCreationTimestamp="2025-10-11 01:08:13 +0000 UTC" firstStartedPulling="2025-10-11 01:08:14.409190147 +0000 UTC m=+989.062170554" lastFinishedPulling="2025-10-11 01:08:20.607949761 +0000 UTC m=+995.260930158" observedRunningTime="2025-10-11 01:08:21.478181008 +0000 UTC m=+996.131161445" watchObservedRunningTime="2025-10-11 01:08:21.48423842 +0000 UTC m=+996.137218857" Oct 11 01:08:23 crc kubenswrapper[4743]: I1011 01:08:23.913843 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-79cb6b48d5-wqg8k" Oct 11 01:08:30 crc kubenswrapper[4743]: E1011 01:08:30.772993 4743 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3629019a_f2bd_4af5_b4cf_0cbe889a8f7b.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3629019a_f2bd_4af5_b4cf_0cbe889a8f7b.slice/crio-98f2712dbc56681a7316b8987e67a22fc9a7f71f6c95f615117695304468a051\": RecentStats: unable to find data in memory cache]" Oct 11 
01:08:40 crc kubenswrapper[4743]: E1011 01:08:40.963191 4743 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3629019a_f2bd_4af5_b4cf_0cbe889a8f7b.slice/crio-98f2712dbc56681a7316b8987e67a22fc9a7f71f6c95f615117695304468a051\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3629019a_f2bd_4af5_b4cf_0cbe889a8f7b.slice\": RecentStats: unable to find data in memory cache]" Oct 11 01:08:46 crc kubenswrapper[4743]: E1011 01:08:46.120132 4743 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/c0f9d294e62eeafb0fba6b51d6cfaad73e3d1fc13f26fc761cd71e46f7ed3f38/diff" to get inode usage: stat /var/lib/containers/storage/overlay/c0f9d294e62eeafb0fba6b51d6cfaad73e3d1fc13f26fc761cd71e46f7ed3f38/diff: no such file or directory, extraDiskErr: Oct 11 01:08:59 crc kubenswrapper[4743]: I1011 01:08:59.566433 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-64f84fcdbb-nfnkk"] Oct 11 01:08:59 crc kubenswrapper[4743]: I1011 01:08:59.568274 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-nfnkk" Oct 11 01:08:59 crc kubenswrapper[4743]: I1011 01:08:59.570424 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-6vhwk" Oct 11 01:08:59 crc kubenswrapper[4743]: I1011 01:08:59.598292 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-59cdc64769-lbnjc"] Oct 11 01:08:59 crc kubenswrapper[4743]: I1011 01:08:59.599688 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-lbnjc" Oct 11 01:08:59 crc kubenswrapper[4743]: I1011 01:08:59.602082 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-7njgr" Oct 11 01:08:59 crc kubenswrapper[4743]: I1011 01:08:59.606230 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-64f84fcdbb-nfnkk"] Oct 11 01:08:59 crc kubenswrapper[4743]: I1011 01:08:59.615265 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-687df44cdb-95j97"] Oct 11 01:08:59 crc kubenswrapper[4743]: I1011 01:08:59.616432 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-95j97" Oct 11 01:08:59 crc kubenswrapper[4743]: I1011 01:08:59.623392 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-485st" Oct 11 01:08:59 crc kubenswrapper[4743]: I1011 01:08:59.624896 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-59cdc64769-lbnjc"] Oct 11 01:08:59 crc kubenswrapper[4743]: I1011 01:08:59.635254 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-7bb46cd7d-d58jj"] Oct 11 01:08:59 crc kubenswrapper[4743]: I1011 01:08:59.636399 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-d58jj" Oct 11 01:08:59 crc kubenswrapper[4743]: I1011 01:08:59.638893 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-8rtjq" Oct 11 01:08:59 crc kubenswrapper[4743]: I1011 01:08:59.652170 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-6d9967f8dd-sjsvr"] Oct 11 01:08:59 crc kubenswrapper[4743]: I1011 01:08:59.653649 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-sjsvr" Oct 11 01:08:59 crc kubenswrapper[4743]: I1011 01:08:59.657042 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-4spw6" Oct 11 01:08:59 crc kubenswrapper[4743]: I1011 01:08:59.672611 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4nmq\" (UniqueName: \"kubernetes.io/projected/ed4aa42c-bd83-4fa8-99f2-5d7cde436979-kube-api-access-s4nmq\") pod \"barbican-operator-controller-manager-64f84fcdbb-nfnkk\" (UID: \"ed4aa42c-bd83-4fa8-99f2-5d7cde436979\") " pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-nfnkk" Oct 11 01:08:59 crc kubenswrapper[4743]: I1011 01:08:59.672695 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88852\" (UniqueName: \"kubernetes.io/projected/a2f6fc09-a2cb-46e0-9fe6-e5dad1025bfe-kube-api-access-88852\") pod \"heat-operator-controller-manager-6d9967f8dd-sjsvr\" (UID: \"a2f6fc09-a2cb-46e0-9fe6-e5dad1025bfe\") " pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-sjsvr" Oct 11 01:08:59 crc kubenswrapper[4743]: I1011 01:08:59.672728 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pmks\" (UniqueName: \"kubernetes.io/projected/01d65505-63e1-4355-a1dc-675d22f5bdea-kube-api-access-5pmks\") pod \"designate-operator-controller-manager-687df44cdb-95j97\" (UID: \"01d65505-63e1-4355-a1dc-675d22f5bdea\") " pod="openstack-operators/designate-operator-controller-manager-687df44cdb-95j97" Oct 11 01:08:59 crc kubenswrapper[4743]: I1011 01:08:59.672752 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqwfr\" (UniqueName: \"kubernetes.io/projected/3f7d0e6e-1b92-48de-910b-ef415fac5e7c-kube-api-access-dqwfr\") pod \"glance-operator-controller-manager-7bb46cd7d-d58jj\" (UID: \"3f7d0e6e-1b92-48de-910b-ef415fac5e7c\") " pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-d58jj" Oct 11 01:08:59 crc kubenswrapper[4743]: I1011 01:08:59.672785 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c48z5\" (UniqueName: \"kubernetes.io/projected/088785b6-72f1-472b-accb-fef0261e024b-kube-api-access-c48z5\") pod \"cinder-operator-controller-manager-59cdc64769-lbnjc\" (UID: \"088785b6-72f1-472b-accb-fef0261e024b\") " pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-lbnjc" Oct 11 01:08:59 crc kubenswrapper[4743]: I1011 01:08:59.682995 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-7bb46cd7d-d58jj"] Oct 11 01:08:59 crc kubenswrapper[4743]: I1011 01:08:59.691938 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-687df44cdb-95j97"] Oct 11 01:08:59 crc kubenswrapper[4743]: I1011 01:08:59.694932 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-6d9967f8dd-sjsvr"] Oct 11 01:08:59 crc kubenswrapper[4743]: I1011 01:08:59.715009 
4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d74794d9b-5t64t"] Oct 11 01:08:59 crc kubenswrapper[4743]: I1011 01:08:59.716510 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-5t64t" Oct 11 01:08:59 crc kubenswrapper[4743]: I1011 01:08:59.721283 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-xccsz" Oct 11 01:08:59 crc kubenswrapper[4743]: I1011 01:08:59.723549 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-585fc5b659-6pvq9"] Oct 11 01:08:59 crc kubenswrapper[4743]: I1011 01:08:59.724654 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-6pvq9" Oct 11 01:08:59 crc kubenswrapper[4743]: I1011 01:08:59.728224 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Oct 11 01:08:59 crc kubenswrapper[4743]: I1011 01:08:59.728315 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-5j9dn" Oct 11 01:08:59 crc kubenswrapper[4743]: I1011 01:08:59.738821 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d74794d9b-5t64t"] Oct 11 01:08:59 crc kubenswrapper[4743]: I1011 01:08:59.749972 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-585fc5b659-6pvq9"] Oct 11 01:08:59 crc kubenswrapper[4743]: I1011 01:08:59.770637 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-74cb5cbc49-zrk9n"] Oct 11 01:08:59 crc kubenswrapper[4743]: I1011 
01:08:59.771753 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-zrk9n" Oct 11 01:08:59 crc kubenswrapper[4743]: I1011 01:08:59.781368 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-kqxpq" Oct 11 01:08:59 crc kubenswrapper[4743]: I1011 01:08:59.782570 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4nmq\" (UniqueName: \"kubernetes.io/projected/ed4aa42c-bd83-4fa8-99f2-5d7cde436979-kube-api-access-s4nmq\") pod \"barbican-operator-controller-manager-64f84fcdbb-nfnkk\" (UID: \"ed4aa42c-bd83-4fa8-99f2-5d7cde436979\") " pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-nfnkk" Oct 11 01:08:59 crc kubenswrapper[4743]: I1011 01:08:59.782696 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88852\" (UniqueName: \"kubernetes.io/projected/a2f6fc09-a2cb-46e0-9fe6-e5dad1025bfe-kube-api-access-88852\") pod \"heat-operator-controller-manager-6d9967f8dd-sjsvr\" (UID: \"a2f6fc09-a2cb-46e0-9fe6-e5dad1025bfe\") " pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-sjsvr" Oct 11 01:08:59 crc kubenswrapper[4743]: I1011 01:08:59.782881 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pmks\" (UniqueName: \"kubernetes.io/projected/01d65505-63e1-4355-a1dc-675d22f5bdea-kube-api-access-5pmks\") pod \"designate-operator-controller-manager-687df44cdb-95j97\" (UID: \"01d65505-63e1-4355-a1dc-675d22f5bdea\") " pod="openstack-operators/designate-operator-controller-manager-687df44cdb-95j97" Oct 11 01:08:59 crc kubenswrapper[4743]: I1011 01:08:59.782911 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqwfr\" (UniqueName: 
\"kubernetes.io/projected/3f7d0e6e-1b92-48de-910b-ef415fac5e7c-kube-api-access-dqwfr\") pod \"glance-operator-controller-manager-7bb46cd7d-d58jj\" (UID: \"3f7d0e6e-1b92-48de-910b-ef415fac5e7c\") " pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-d58jj" Oct 11 01:08:59 crc kubenswrapper[4743]: I1011 01:08:59.782964 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c48z5\" (UniqueName: \"kubernetes.io/projected/088785b6-72f1-472b-accb-fef0261e024b-kube-api-access-c48z5\") pod \"cinder-operator-controller-manager-59cdc64769-lbnjc\" (UID: \"088785b6-72f1-472b-accb-fef0261e024b\") " pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-lbnjc" Oct 11 01:08:59 crc kubenswrapper[4743]: I1011 01:08:59.783073 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqpgq\" (UniqueName: \"kubernetes.io/projected/53fca326-a309-4ff5-b52f-8b547496c069-kube-api-access-sqpgq\") pod \"ironic-operator-controller-manager-74cb5cbc49-zrk9n\" (UID: \"53fca326-a309-4ff5-b52f-8b547496c069\") " pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-zrk9n" Oct 11 01:08:59 crc kubenswrapper[4743]: I1011 01:08:59.783736 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-74cb5cbc49-zrk9n"] Oct 11 01:08:59 crc kubenswrapper[4743]: I1011 01:08:59.824923 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-ddb98f99b-cbc66"] Oct 11 01:08:59 crc kubenswrapper[4743]: I1011 01:08:59.852742 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88852\" (UniqueName: \"kubernetes.io/projected/a2f6fc09-a2cb-46e0-9fe6-e5dad1025bfe-kube-api-access-88852\") pod \"heat-operator-controller-manager-6d9967f8dd-sjsvr\" (UID: \"a2f6fc09-a2cb-46e0-9fe6-e5dad1025bfe\") " 
pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-sjsvr" Oct 11 01:08:59 crc kubenswrapper[4743]: I1011 01:08:59.853412 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c48z5\" (UniqueName: \"kubernetes.io/projected/088785b6-72f1-472b-accb-fef0261e024b-kube-api-access-c48z5\") pod \"cinder-operator-controller-manager-59cdc64769-lbnjc\" (UID: \"088785b6-72f1-472b-accb-fef0261e024b\") " pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-lbnjc" Oct 11 01:08:59 crc kubenswrapper[4743]: I1011 01:08:59.882631 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-cbc66" Oct 11 01:08:59 crc kubenswrapper[4743]: I1011 01:08:59.885722 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbfv2\" (UniqueName: \"kubernetes.io/projected/f8621ecf-ab1f-40e5-9dd9-4d9d6fd8563b-kube-api-access-bbfv2\") pod \"infra-operator-controller-manager-585fc5b659-6pvq9\" (UID: \"f8621ecf-ab1f-40e5-9dd9-4d9d6fd8563b\") " pod="openstack-operators/infra-operator-controller-manager-585fc5b659-6pvq9" Oct 11 01:08:59 crc kubenswrapper[4743]: I1011 01:08:59.885767 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r92fm\" (UniqueName: \"kubernetes.io/projected/547f1b93-dffd-4c63-964a-e3ea6d29970e-kube-api-access-r92fm\") pod \"horizon-operator-controller-manager-6d74794d9b-5t64t\" (UID: \"547f1b93-dffd-4c63-964a-e3ea6d29970e\") " pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-5t64t" Oct 11 01:08:59 crc kubenswrapper[4743]: I1011 01:08:59.885804 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f8621ecf-ab1f-40e5-9dd9-4d9d6fd8563b-cert\") pod 
\"infra-operator-controller-manager-585fc5b659-6pvq9\" (UID: \"f8621ecf-ab1f-40e5-9dd9-4d9d6fd8563b\") " pod="openstack-operators/infra-operator-controller-manager-585fc5b659-6pvq9" Oct 11 01:08:59 crc kubenswrapper[4743]: I1011 01:08:59.885837 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqpgq\" (UniqueName: \"kubernetes.io/projected/53fca326-a309-4ff5-b52f-8b547496c069-kube-api-access-sqpgq\") pod \"ironic-operator-controller-manager-74cb5cbc49-zrk9n\" (UID: \"53fca326-a309-4ff5-b52f-8b547496c069\") " pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-zrk9n" Oct 11 01:08:59 crc kubenswrapper[4743]: I1011 01:08:59.885908 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkwr4\" (UniqueName: \"kubernetes.io/projected/2726f212-a3ba-48cb-a96f-8d5f117a7f5e-kube-api-access-lkwr4\") pod \"keystone-operator-controller-manager-ddb98f99b-cbc66\" (UID: \"2726f212-a3ba-48cb-a96f-8d5f117a7f5e\") " pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-cbc66" Oct 11 01:08:59 crc kubenswrapper[4743]: I1011 01:08:59.915093 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-59578bc799-vvcjm"] Oct 11 01:08:59 crc kubenswrapper[4743]: I1011 01:08:59.915812 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-lbnjc" Oct 11 01:08:59 crc kubenswrapper[4743]: I1011 01:08:59.921253 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqwfr\" (UniqueName: \"kubernetes.io/projected/3f7d0e6e-1b92-48de-910b-ef415fac5e7c-kube-api-access-dqwfr\") pod \"glance-operator-controller-manager-7bb46cd7d-d58jj\" (UID: \"3f7d0e6e-1b92-48de-910b-ef415fac5e7c\") " pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-d58jj" Oct 11 01:08:59 crc kubenswrapper[4743]: I1011 01:08:59.931173 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-6xlkl" Oct 11 01:08:59 crc kubenswrapper[4743]: I1011 01:08:59.972104 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pmks\" (UniqueName: \"kubernetes.io/projected/01d65505-63e1-4355-a1dc-675d22f5bdea-kube-api-access-5pmks\") pod \"designate-operator-controller-manager-687df44cdb-95j97\" (UID: \"01d65505-63e1-4355-a1dc-675d22f5bdea\") " pod="openstack-operators/designate-operator-controller-manager-687df44cdb-95j97" Oct 11 01:08:59 crc kubenswrapper[4743]: I1011 01:08:59.972773 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-59578bc799-vvcjm" Oct 11 01:08:59 crc kubenswrapper[4743]: I1011 01:08:59.973429 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqpgq\" (UniqueName: \"kubernetes.io/projected/53fca326-a309-4ff5-b52f-8b547496c069-kube-api-access-sqpgq\") pod \"ironic-operator-controller-manager-74cb5cbc49-zrk9n\" (UID: \"53fca326-a309-4ff5-b52f-8b547496c069\") " pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-zrk9n" Oct 11 01:08:59 crc kubenswrapper[4743]: I1011 01:08:59.973854 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-d58jj" Oct 11 01:08:59 crc kubenswrapper[4743]: I1011 01:08:59.974347 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4nmq\" (UniqueName: \"kubernetes.io/projected/ed4aa42c-bd83-4fa8-99f2-5d7cde436979-kube-api-access-s4nmq\") pod \"barbican-operator-controller-manager-64f84fcdbb-nfnkk\" (UID: \"ed4aa42c-bd83-4fa8-99f2-5d7cde436979\") " pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-nfnkk" Oct 11 01:08:59 crc kubenswrapper[4743]: I1011 01:08:59.975644 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-5777b4f897-qs4hf"] Oct 11 01:08:59 crc kubenswrapper[4743]: I1011 01:08:59.978810 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-qs4hf" Oct 11 01:08:59 crc kubenswrapper[4743]: I1011 01:08:59.982313 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-sjsvr" Oct 11 01:08:59 crc kubenswrapper[4743]: I1011 01:08:59.982794 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-ddb98f99b-cbc66"] Oct 11 01:08:59 crc kubenswrapper[4743]: I1011 01:08:59.988571 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbfv2\" (UniqueName: \"kubernetes.io/projected/f8621ecf-ab1f-40e5-9dd9-4d9d6fd8563b-kube-api-access-bbfv2\") pod \"infra-operator-controller-manager-585fc5b659-6pvq9\" (UID: \"f8621ecf-ab1f-40e5-9dd9-4d9d6fd8563b\") " pod="openstack-operators/infra-operator-controller-manager-585fc5b659-6pvq9" Oct 11 01:08:59 crc kubenswrapper[4743]: I1011 01:08:59.988614 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r92fm\" (UniqueName: \"kubernetes.io/projected/547f1b93-dffd-4c63-964a-e3ea6d29970e-kube-api-access-r92fm\") pod \"horizon-operator-controller-manager-6d74794d9b-5t64t\" (UID: \"547f1b93-dffd-4c63-964a-e3ea6d29970e\") " pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-5t64t" Oct 11 01:08:59 crc kubenswrapper[4743]: I1011 01:08:59.988646 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f8621ecf-ab1f-40e5-9dd9-4d9d6fd8563b-cert\") pod \"infra-operator-controller-manager-585fc5b659-6pvq9\" (UID: \"f8621ecf-ab1f-40e5-9dd9-4d9d6fd8563b\") " pod="openstack-operators/infra-operator-controller-manager-585fc5b659-6pvq9" Oct 11 01:08:59 crc kubenswrapper[4743]: I1011 01:08:59.988695 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkwr4\" (UniqueName: \"kubernetes.io/projected/2726f212-a3ba-48cb-a96f-8d5f117a7f5e-kube-api-access-lkwr4\") pod \"keystone-operator-controller-manager-ddb98f99b-cbc66\" (UID: 
\"2726f212-a3ba-48cb-a96f-8d5f117a7f5e\") " pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-cbc66" Oct 11 01:08:59 crc kubenswrapper[4743]: I1011 01:08:59.988884 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-f5vln" Oct 11 01:08:59 crc kubenswrapper[4743]: I1011 01:08:59.989007 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-h5h2j" Oct 11 01:08:59 crc kubenswrapper[4743]: E1011 01:08:59.989217 4743 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Oct 11 01:08:59 crc kubenswrapper[4743]: E1011 01:08:59.989263 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8621ecf-ab1f-40e5-9dd9-4d9d6fd8563b-cert podName:f8621ecf-ab1f-40e5-9dd9-4d9d6fd8563b nodeName:}" failed. No retries permitted until 2025-10-11 01:09:00.489246346 +0000 UTC m=+1035.142226743 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f8621ecf-ab1f-40e5-9dd9-4d9d6fd8563b-cert") pod "infra-operator-controller-manager-585fc5b659-6pvq9" (UID: "f8621ecf-ab1f-40e5-9dd9-4d9d6fd8563b") : secret "infra-operator-webhook-server-cert" not found Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.007983 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-5777b4f897-qs4hf"] Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.011361 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-59578bc799-vvcjm"] Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.011585 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkwr4\" (UniqueName: \"kubernetes.io/projected/2726f212-a3ba-48cb-a96f-8d5f117a7f5e-kube-api-access-lkwr4\") pod \"keystone-operator-controller-manager-ddb98f99b-cbc66\" (UID: \"2726f212-a3ba-48cb-a96f-8d5f117a7f5e\") " pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-cbc66" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.020691 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r92fm\" (UniqueName: \"kubernetes.io/projected/547f1b93-dffd-4c63-964a-e3ea6d29970e-kube-api-access-r92fm\") pod \"horizon-operator-controller-manager-6d74794d9b-5t64t\" (UID: \"547f1b93-dffd-4c63-964a-e3ea6d29970e\") " pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-5t64t" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.020765 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-797d478b46-2htrs"] Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.025206 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-2htrs" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.028783 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbfv2\" (UniqueName: \"kubernetes.io/projected/f8621ecf-ab1f-40e5-9dd9-4d9d6fd8563b-kube-api-access-bbfv2\") pod \"infra-operator-controller-manager-585fc5b659-6pvq9\" (UID: \"f8621ecf-ab1f-40e5-9dd9-4d9d6fd8563b\") " pod="openstack-operators/infra-operator-controller-manager-585fc5b659-6pvq9" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.028974 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-tjv27" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.039293 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-5t64t" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.053700 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-797d478b46-2htrs"] Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.062647 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-57bb74c7bf-d29v4"] Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.064095 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-d29v4" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.067653 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-nt865"] Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.068391 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-587qz" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.068998 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-nt865" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.070707 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-mgh54" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.075676 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-57bb74c7bf-d29v4"] Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.080925 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-869cc7797f-vt9w7"] Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.082326 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-vt9w7" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.084674 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-jjh6f" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.086026 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-nt865"] Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.090000 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znjjv\" (UniqueName: \"kubernetes.io/projected/af96a190-49a7-4179-be4f-4a636d004cd0-kube-api-access-znjjv\") pod \"manila-operator-controller-manager-59578bc799-vvcjm\" (UID: \"af96a190-49a7-4179-be4f-4a636d004cd0\") " pod="openstack-operators/manila-operator-controller-manager-59578bc799-vvcjm" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.090041 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvwd2\" (UniqueName: \"kubernetes.io/projected/be590325-60d2-4f91-9e78-a5520788cfed-kube-api-access-kvwd2\") pod \"mariadb-operator-controller-manager-5777b4f897-qs4hf\" (UID: \"be590325-60d2-4f91-9e78-a5520788cfed\") " pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-qs4hf" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.112101 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-zrk9n" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.118001 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dddnmv"] Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.122162 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-869cc7797f-vt9w7"] Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.122181 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-664664cb68-6pzz7"] Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.122632 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dddnmv" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.122982 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-664664cb68-6pzz7"] Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.123191 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dddnmv"] Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.123282 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-664664cb68-6pzz7" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.127638 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-drm6k" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.127835 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.127901 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-cxzw6"] Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.127959 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-wxbg7" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.129120 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-cxzw6" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.130163 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-hxqbj" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.164808 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-cxzw6"] Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.183927 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-679ff79844-2dvm2"] Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.190653 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-679ff79844-2dvm2" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.192071 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-9fxv5" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.192243 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-679ff79844-2dvm2"] Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.192774 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2n8mn\" (UniqueName: \"kubernetes.io/projected/a79f9419-0d04-41bb-b1ab-1615888819df-kube-api-access-2n8mn\") pod \"octavia-operator-controller-manager-6d7c7ddf95-nt865\" (UID: \"a79f9419-0d04-41bb-b1ab-1615888819df\") " pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-nt865" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.192823 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6b8570e7-9b29-4a55-95e3-a3a588ba4083-cert\") pod \"openstack-baremetal-operator-controller-manager-6cc7fb757dddnmv\" (UID: \"6b8570e7-9b29-4a55-95e3-a3a588ba4083\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dddnmv" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.192848 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2q4r7\" (UniqueName: \"kubernetes.io/projected/c39ea94c-0f30-4b04-8a87-848ee9a62740-kube-api-access-2q4r7\") pod \"ovn-operator-controller-manager-869cc7797f-vt9w7\" (UID: \"c39ea94c-0f30-4b04-8a87-848ee9a62740\") " pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-vt9w7" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.192920 4743 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4bvv\" (UniqueName: \"kubernetes.io/projected/035ca3e9-4fd6-4cb1-8e82-8ebcf39146a3-kube-api-access-q4bvv\") pod \"neutron-operator-controller-manager-797d478b46-2htrs\" (UID: \"035ca3e9-4fd6-4cb1-8e82-8ebcf39146a3\") " pod="openstack-operators/neutron-operator-controller-manager-797d478b46-2htrs" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.192943 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kkr4\" (UniqueName: \"kubernetes.io/projected/6b8570e7-9b29-4a55-95e3-a3a588ba4083-kube-api-access-8kkr4\") pod \"openstack-baremetal-operator-controller-manager-6cc7fb757dddnmv\" (UID: \"6b8570e7-9b29-4a55-95e3-a3a588ba4083\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dddnmv" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.192972 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kltrj\" (UniqueName: \"kubernetes.io/projected/4be72879-00a9-4253-9ad9-c266c32b968e-kube-api-access-kltrj\") pod \"nova-operator-controller-manager-57bb74c7bf-d29v4\" (UID: \"4be72879-00a9-4253-9ad9-c266c32b968e\") " pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-d29v4" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.193007 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtmh9\" (UniqueName: \"kubernetes.io/projected/884ddb30-3a64-4bf2-83ac-3acc83e8bd96-kube-api-access-dtmh9\") pod \"placement-operator-controller-manager-664664cb68-6pzz7\" (UID: \"884ddb30-3a64-4bf2-83ac-3acc83e8bd96\") " pod="openstack-operators/placement-operator-controller-manager-664664cb68-6pzz7" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.193025 4743 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-znjjv\" (UniqueName: \"kubernetes.io/projected/af96a190-49a7-4179-be4f-4a636d004cd0-kube-api-access-znjjv\") pod \"manila-operator-controller-manager-59578bc799-vvcjm\" (UID: \"af96a190-49a7-4179-be4f-4a636d004cd0\") " pod="openstack-operators/manila-operator-controller-manager-59578bc799-vvcjm" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.193104 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvwd2\" (UniqueName: \"kubernetes.io/projected/be590325-60d2-4f91-9e78-a5520788cfed-kube-api-access-kvwd2\") pod \"mariadb-operator-controller-manager-5777b4f897-qs4hf\" (UID: \"be590325-60d2-4f91-9e78-a5520788cfed\") " pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-qs4hf" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.196113 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-nfnkk" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.219174 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znjjv\" (UniqueName: \"kubernetes.io/projected/af96a190-49a7-4179-be4f-4a636d004cd0-kube-api-access-znjjv\") pod \"manila-operator-controller-manager-59578bc799-vvcjm\" (UID: \"af96a190-49a7-4179-be4f-4a636d004cd0\") " pod="openstack-operators/manila-operator-controller-manager-59578bc799-vvcjm" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.220142 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvwd2\" (UniqueName: \"kubernetes.io/projected/be590325-60d2-4f91-9e78-a5520788cfed-kube-api-access-kvwd2\") pod \"mariadb-operator-controller-manager-5777b4f897-qs4hf\" (UID: \"be590325-60d2-4f91-9e78-a5520788cfed\") " pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-qs4hf" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.234992 4743 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-ffcdd6c94-9qjkk"] Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.236164 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-9qjkk" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.239654 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-95j97" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.239712 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-5xrln" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.243405 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-ffcdd6c94-9qjkk"] Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.310741 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-cbc66" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.310985 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kltrj\" (UniqueName: \"kubernetes.io/projected/4be72879-00a9-4253-9ad9-c266c32b968e-kube-api-access-kltrj\") pod \"nova-operator-controller-manager-57bb74c7bf-d29v4\" (UID: \"4be72879-00a9-4253-9ad9-c266c32b968e\") " pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-d29v4" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.311063 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4wns\" (UniqueName: \"kubernetes.io/projected/36bd30aa-037d-4e7d-ae0f-fb53fe20f812-kube-api-access-j4wns\") pod \"swift-operator-controller-manager-5f4d5dfdc6-cxzw6\" (UID: \"36bd30aa-037d-4e7d-ae0f-fb53fe20f812\") " pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-cxzw6" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.311150 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtmh9\" (UniqueName: \"kubernetes.io/projected/884ddb30-3a64-4bf2-83ac-3acc83e8bd96-kube-api-access-dtmh9\") pod \"placement-operator-controller-manager-664664cb68-6pzz7\" (UID: \"884ddb30-3a64-4bf2-83ac-3acc83e8bd96\") " pod="openstack-operators/placement-operator-controller-manager-664664cb68-6pzz7" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.311205 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8g4rw\" (UniqueName: \"kubernetes.io/projected/7967fda3-d5ca-4e28-878e-e50017efc60f-kube-api-access-8g4rw\") pod \"telemetry-operator-controller-manager-679ff79844-2dvm2\" (UID: \"7967fda3-d5ca-4e28-878e-e50017efc60f\") " pod="openstack-operators/telemetry-operator-controller-manager-679ff79844-2dvm2" Oct 11 01:09:00 
crc kubenswrapper[4743]: I1011 01:09:00.311343 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2n8mn\" (UniqueName: \"kubernetes.io/projected/a79f9419-0d04-41bb-b1ab-1615888819df-kube-api-access-2n8mn\") pod \"octavia-operator-controller-manager-6d7c7ddf95-nt865\" (UID: \"a79f9419-0d04-41bb-b1ab-1615888819df\") " pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-nt865" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.311405 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6b8570e7-9b29-4a55-95e3-a3a588ba4083-cert\") pod \"openstack-baremetal-operator-controller-manager-6cc7fb757dddnmv\" (UID: \"6b8570e7-9b29-4a55-95e3-a3a588ba4083\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dddnmv" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.311451 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2q4r7\" (UniqueName: \"kubernetes.io/projected/c39ea94c-0f30-4b04-8a87-848ee9a62740-kube-api-access-2q4r7\") pod \"ovn-operator-controller-manager-869cc7797f-vt9w7\" (UID: \"c39ea94c-0f30-4b04-8a87-848ee9a62740\") " pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-vt9w7" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.311511 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4bvv\" (UniqueName: \"kubernetes.io/projected/035ca3e9-4fd6-4cb1-8e82-8ebcf39146a3-kube-api-access-q4bvv\") pod \"neutron-operator-controller-manager-797d478b46-2htrs\" (UID: \"035ca3e9-4fd6-4cb1-8e82-8ebcf39146a3\") " pod="openstack-operators/neutron-operator-controller-manager-797d478b46-2htrs" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.311550 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kkr4\" (UniqueName: 
\"kubernetes.io/projected/6b8570e7-9b29-4a55-95e3-a3a588ba4083-kube-api-access-8kkr4\") pod \"openstack-baremetal-operator-controller-manager-6cc7fb757dddnmv\" (UID: \"6b8570e7-9b29-4a55-95e3-a3a588ba4083\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dddnmv" Oct 11 01:09:00 crc kubenswrapper[4743]: E1011 01:09:00.312066 4743 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 11 01:09:00 crc kubenswrapper[4743]: E1011 01:09:00.312113 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b8570e7-9b29-4a55-95e3-a3a588ba4083-cert podName:6b8570e7-9b29-4a55-95e3-a3a588ba4083 nodeName:}" failed. No retries permitted until 2025-10-11 01:09:00.812096418 +0000 UTC m=+1035.465076815 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6b8570e7-9b29-4a55-95e3-a3a588ba4083-cert") pod "openstack-baremetal-operator-controller-manager-6cc7fb757dddnmv" (UID: "6b8570e7-9b29-4a55-95e3-a3a588ba4083") : secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.326575 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-646675d848-ctp5x"] Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.336554 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-qs4hf" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.341357 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-646675d848-ctp5x" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.349995 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-59578bc799-vvcjm" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.353089 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-xvrrj" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.380529 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kkr4\" (UniqueName: \"kubernetes.io/projected/6b8570e7-9b29-4a55-95e3-a3a588ba4083-kube-api-access-8kkr4\") pod \"openstack-baremetal-operator-controller-manager-6cc7fb757dddnmv\" (UID: \"6b8570e7-9b29-4a55-95e3-a3a588ba4083\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dddnmv" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.383406 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2n8mn\" (UniqueName: \"kubernetes.io/projected/a79f9419-0d04-41bb-b1ab-1615888819df-kube-api-access-2n8mn\") pod \"octavia-operator-controller-manager-6d7c7ddf95-nt865\" (UID: \"a79f9419-0d04-41bb-b1ab-1615888819df\") " pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-nt865" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.397998 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4bvv\" (UniqueName: \"kubernetes.io/projected/035ca3e9-4fd6-4cb1-8e82-8ebcf39146a3-kube-api-access-q4bvv\") pod \"neutron-operator-controller-manager-797d478b46-2htrs\" (UID: \"035ca3e9-4fd6-4cb1-8e82-8ebcf39146a3\") " pod="openstack-operators/neutron-operator-controller-manager-797d478b46-2htrs" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.404599 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtmh9\" (UniqueName: \"kubernetes.io/projected/884ddb30-3a64-4bf2-83ac-3acc83e8bd96-kube-api-access-dtmh9\") pod 
\"placement-operator-controller-manager-664664cb68-6pzz7\" (UID: \"884ddb30-3a64-4bf2-83ac-3acc83e8bd96\") " pod="openstack-operators/placement-operator-controller-manager-664664cb68-6pzz7" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.422514 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4wns\" (UniqueName: \"kubernetes.io/projected/36bd30aa-037d-4e7d-ae0f-fb53fe20f812-kube-api-access-j4wns\") pod \"swift-operator-controller-manager-5f4d5dfdc6-cxzw6\" (UID: \"36bd30aa-037d-4e7d-ae0f-fb53fe20f812\") " pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-cxzw6" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.422592 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4sgcb\" (UniqueName: \"kubernetes.io/projected/962796b2-2bc0-4db5-84be-df36bbc28121-kube-api-access-4sgcb\") pod \"test-operator-controller-manager-ffcdd6c94-9qjkk\" (UID: \"962796b2-2bc0-4db5-84be-df36bbc28121\") " pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-9qjkk" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.422624 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8g4rw\" (UniqueName: \"kubernetes.io/projected/7967fda3-d5ca-4e28-878e-e50017efc60f-kube-api-access-8g4rw\") pod \"telemetry-operator-controller-manager-679ff79844-2dvm2\" (UID: \"7967fda3-d5ca-4e28-878e-e50017efc60f\") " pod="openstack-operators/telemetry-operator-controller-manager-679ff79844-2dvm2" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.442558 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2q4r7\" (UniqueName: \"kubernetes.io/projected/c39ea94c-0f30-4b04-8a87-848ee9a62740-kube-api-access-2q4r7\") pod \"ovn-operator-controller-manager-869cc7797f-vt9w7\" (UID: \"c39ea94c-0f30-4b04-8a87-848ee9a62740\") " 
pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-vt9w7" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.443095 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kltrj\" (UniqueName: \"kubernetes.io/projected/4be72879-00a9-4253-9ad9-c266c32b968e-kube-api-access-kltrj\") pod \"nova-operator-controller-manager-57bb74c7bf-d29v4\" (UID: \"4be72879-00a9-4253-9ad9-c266c32b968e\") " pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-d29v4" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.460326 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4wns\" (UniqueName: \"kubernetes.io/projected/36bd30aa-037d-4e7d-ae0f-fb53fe20f812-kube-api-access-j4wns\") pod \"swift-operator-controller-manager-5f4d5dfdc6-cxzw6\" (UID: \"36bd30aa-037d-4e7d-ae0f-fb53fe20f812\") " pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-cxzw6" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.487837 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-646675d848-ctp5x"] Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.493273 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-664664cb68-6pzz7" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.494569 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8g4rw\" (UniqueName: \"kubernetes.io/projected/7967fda3-d5ca-4e28-878e-e50017efc60f-kube-api-access-8g4rw\") pod \"telemetry-operator-controller-manager-679ff79844-2dvm2\" (UID: \"7967fda3-d5ca-4e28-878e-e50017efc60f\") " pod="openstack-operators/telemetry-operator-controller-manager-679ff79844-2dvm2" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.508146 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-cxzw6" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.527369 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-679ff79844-2dvm2" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.536463 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fb9dp\" (UniqueName: \"kubernetes.io/projected/f1a6e436-6a36-45a8-a033-f8d307ba12bd-kube-api-access-fb9dp\") pod \"watcher-operator-controller-manager-646675d848-ctp5x\" (UID: \"f1a6e436-6a36-45a8-a033-f8d307ba12bd\") " pod="openstack-operators/watcher-operator-controller-manager-646675d848-ctp5x" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.536519 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4sgcb\" (UniqueName: \"kubernetes.io/projected/962796b2-2bc0-4db5-84be-df36bbc28121-kube-api-access-4sgcb\") pod \"test-operator-controller-manager-ffcdd6c94-9qjkk\" (UID: \"962796b2-2bc0-4db5-84be-df36bbc28121\") " pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-9qjkk" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.536553 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f8621ecf-ab1f-40e5-9dd9-4d9d6fd8563b-cert\") pod \"infra-operator-controller-manager-585fc5b659-6pvq9\" (UID: \"f8621ecf-ab1f-40e5-9dd9-4d9d6fd8563b\") " pod="openstack-operators/infra-operator-controller-manager-585fc5b659-6pvq9" Oct 11 01:09:00 crc kubenswrapper[4743]: E1011 01:09:00.536669 4743 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Oct 11 01:09:00 crc kubenswrapper[4743]: E1011 01:09:00.536721 4743 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/secret/f8621ecf-ab1f-40e5-9dd9-4d9d6fd8563b-cert podName:f8621ecf-ab1f-40e5-9dd9-4d9d6fd8563b nodeName:}" failed. No retries permitted until 2025-10-11 01:09:01.536706354 +0000 UTC m=+1036.189686751 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f8621ecf-ab1f-40e5-9dd9-4d9d6fd8563b-cert") pod "infra-operator-controller-manager-585fc5b659-6pvq9" (UID: "f8621ecf-ab1f-40e5-9dd9-4d9d6fd8563b") : secret "infra-operator-webhook-server-cert" not found Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.561966 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-59cdc64769-lbnjc"] Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.569471 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4sgcb\" (UniqueName: \"kubernetes.io/projected/962796b2-2bc0-4db5-84be-df36bbc28121-kube-api-access-4sgcb\") pod \"test-operator-controller-manager-ffcdd6c94-9qjkk\" (UID: \"962796b2-2bc0-4db5-84be-df36bbc28121\") " pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-9qjkk" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.583571 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6bc9d748dc-5vs6z"] Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.584932 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6bc9d748dc-5vs6z" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.589919 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6bc9d748dc-5vs6z"] Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.592248 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.592333 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-g7dhj" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.606011 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-w94wj"] Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.606960 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-w94wj" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.609232 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-g9vvh" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.623735 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-w94wj"] Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.639706 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fb9dp\" (UniqueName: \"kubernetes.io/projected/f1a6e436-6a36-45a8-a033-f8d307ba12bd-kube-api-access-fb9dp\") pod \"watcher-operator-controller-manager-646675d848-ctp5x\" (UID: \"f1a6e436-6a36-45a8-a033-f8d307ba12bd\") " pod="openstack-operators/watcher-operator-controller-manager-646675d848-ctp5x" Oct 11 01:09:00 crc 
kubenswrapper[4743]: I1011 01:09:00.647405 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-7bb46cd7d-d58jj"] Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.659146 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-2htrs" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.668257 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fb9dp\" (UniqueName: \"kubernetes.io/projected/f1a6e436-6a36-45a8-a033-f8d307ba12bd-kube-api-access-fb9dp\") pod \"watcher-operator-controller-manager-646675d848-ctp5x\" (UID: \"f1a6e436-6a36-45a8-a033-f8d307ba12bd\") " pod="openstack-operators/watcher-operator-controller-manager-646675d848-ctp5x" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.687440 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-d29v4" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.700353 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-nt865" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.717352 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-vt9w7" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.741471 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cec9bfbd-c515-4bcc-8bf7-63648ecd230b-cert\") pod \"openstack-operator-controller-manager-6bc9d748dc-5vs6z\" (UID: \"cec9bfbd-c515-4bcc-8bf7-63648ecd230b\") " pod="openstack-operators/openstack-operator-controller-manager-6bc9d748dc-5vs6z" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.741558 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7bw2\" (UniqueName: \"kubernetes.io/projected/fd811aaa-ab9b-4d34-a268-ecbfa76bf43a-kube-api-access-t7bw2\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-w94wj\" (UID: \"fd811aaa-ab9b-4d34-a268-ecbfa76bf43a\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-w94wj" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.741607 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvcdb\" (UniqueName: \"kubernetes.io/projected/cec9bfbd-c515-4bcc-8bf7-63648ecd230b-kube-api-access-cvcdb\") pod \"openstack-operator-controller-manager-6bc9d748dc-5vs6z\" (UID: \"cec9bfbd-c515-4bcc-8bf7-63648ecd230b\") " pod="openstack-operators/openstack-operator-controller-manager-6bc9d748dc-5vs6z" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.761049 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-646675d848-ctp5x" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.783826 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-lbnjc" event={"ID":"088785b6-72f1-472b-accb-fef0261e024b","Type":"ContainerStarted","Data":"dc43b0c93cffd945f776fb3648a99a883dcda5352097a2efa62accf0fa6cc786"} Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.794069 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-d58jj" event={"ID":"3f7d0e6e-1b92-48de-910b-ef415fac5e7c","Type":"ContainerStarted","Data":"5dfbbf9b620d185a3be4c77a6c7d584ed52cab55176c012752b9d8fa876b3f44"} Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.812941 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-6d9967f8dd-sjsvr"] Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.816824 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d74794d9b-5t64t"] Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.843912 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cec9bfbd-c515-4bcc-8bf7-63648ecd230b-cert\") pod \"openstack-operator-controller-manager-6bc9d748dc-5vs6z\" (UID: \"cec9bfbd-c515-4bcc-8bf7-63648ecd230b\") " pod="openstack-operators/openstack-operator-controller-manager-6bc9d748dc-5vs6z" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.844328 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7bw2\" (UniqueName: \"kubernetes.io/projected/fd811aaa-ab9b-4d34-a268-ecbfa76bf43a-kube-api-access-t7bw2\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-w94wj\" (UID: \"fd811aaa-ab9b-4d34-a268-ecbfa76bf43a\") " 
pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-w94wj" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.844365 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6b8570e7-9b29-4a55-95e3-a3a588ba4083-cert\") pod \"openstack-baremetal-operator-controller-manager-6cc7fb757dddnmv\" (UID: \"6b8570e7-9b29-4a55-95e3-a3a588ba4083\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dddnmv" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.844429 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvcdb\" (UniqueName: \"kubernetes.io/projected/cec9bfbd-c515-4bcc-8bf7-63648ecd230b-kube-api-access-cvcdb\") pod \"openstack-operator-controller-manager-6bc9d748dc-5vs6z\" (UID: \"cec9bfbd-c515-4bcc-8bf7-63648ecd230b\") " pod="openstack-operators/openstack-operator-controller-manager-6bc9d748dc-5vs6z" Oct 11 01:09:00 crc kubenswrapper[4743]: E1011 01:09:00.844934 4743 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Oct 11 01:09:00 crc kubenswrapper[4743]: E1011 01:09:00.844999 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cec9bfbd-c515-4bcc-8bf7-63648ecd230b-cert podName:cec9bfbd-c515-4bcc-8bf7-63648ecd230b nodeName:}" failed. No retries permitted until 2025-10-11 01:09:01.344983599 +0000 UTC m=+1035.997963996 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cec9bfbd-c515-4bcc-8bf7-63648ecd230b-cert") pod "openstack-operator-controller-manager-6bc9d748dc-5vs6z" (UID: "cec9bfbd-c515-4bcc-8bf7-63648ecd230b") : secret "webhook-server-cert" not found Oct 11 01:09:00 crc kubenswrapper[4743]: E1011 01:09:00.845163 4743 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 11 01:09:00 crc kubenswrapper[4743]: E1011 01:09:00.845209 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b8570e7-9b29-4a55-95e3-a3a588ba4083-cert podName:6b8570e7-9b29-4a55-95e3-a3a588ba4083 nodeName:}" failed. No retries permitted until 2025-10-11 01:09:01.845201645 +0000 UTC m=+1036.498182042 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6b8570e7-9b29-4a55-95e3-a3a588ba4083-cert") pod "openstack-baremetal-operator-controller-manager-6cc7fb757dddnmv" (UID: "6b8570e7-9b29-4a55-95e3-a3a588ba4083") : secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.863504 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7bw2\" (UniqueName: \"kubernetes.io/projected/fd811aaa-ab9b-4d34-a268-ecbfa76bf43a-kube-api-access-t7bw2\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-w94wj\" (UID: \"fd811aaa-ab9b-4d34-a268-ecbfa76bf43a\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-w94wj" Oct 11 01:09:00 crc kubenswrapper[4743]: W1011 01:09:00.866128 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod547f1b93_dffd_4c63_964a_e3ea6d29970e.slice/crio-5539655564fddc794e3d2ec4fc79e8a961ff4e9a4d5f2db0bc8b7d631b5e07f9 WatchSource:0}: Error finding container 
5539655564fddc794e3d2ec4fc79e8a961ff4e9a4d5f2db0bc8b7d631b5e07f9: Status 404 returned error can't find the container with id 5539655564fddc794e3d2ec4fc79e8a961ff4e9a4d5f2db0bc8b7d631b5e07f9 Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.868536 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-9qjkk" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.872657 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvcdb\" (UniqueName: \"kubernetes.io/projected/cec9bfbd-c515-4bcc-8bf7-63648ecd230b-kube-api-access-cvcdb\") pod \"openstack-operator-controller-manager-6bc9d748dc-5vs6z\" (UID: \"cec9bfbd-c515-4bcc-8bf7-63648ecd230b\") " pod="openstack-operators/openstack-operator-controller-manager-6bc9d748dc-5vs6z" Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.942265 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-64f84fcdbb-nfnkk"] Oct 11 01:09:00 crc kubenswrapper[4743]: I1011 01:09:00.950374 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-w94wj" Oct 11 01:09:01 crc kubenswrapper[4743]: I1011 01:09:01.075319 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-74cb5cbc49-zrk9n"] Oct 11 01:09:01 crc kubenswrapper[4743]: I1011 01:09:01.095619 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-687df44cdb-95j97"] Oct 11 01:09:01 crc kubenswrapper[4743]: I1011 01:09:01.209304 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-5777b4f897-qs4hf"] Oct 11 01:09:01 crc kubenswrapper[4743]: W1011 01:09:01.211522 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe590325_60d2_4f91_9e78_a5520788cfed.slice/crio-38015dc81609b802fcdab84cc458d2954b05476f84abbb45bbf7ba5543a1e976 WatchSource:0}: Error finding container 38015dc81609b802fcdab84cc458d2954b05476f84abbb45bbf7ba5543a1e976: Status 404 returned error can't find the container with id 38015dc81609b802fcdab84cc458d2954b05476f84abbb45bbf7ba5543a1e976 Oct 11 01:09:01 crc kubenswrapper[4743]: I1011 01:09:01.213575 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-59578bc799-vvcjm"] Oct 11 01:09:01 crc kubenswrapper[4743]: W1011 01:09:01.215297 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf96a190_49a7_4179_be4f_4a636d004cd0.slice/crio-593a4db525211c1202b0a1d5ce899c371368a22e1e9d90fbd453d06a8e864c94 WatchSource:0}: Error finding container 593a4db525211c1202b0a1d5ce899c371368a22e1e9d90fbd453d06a8e864c94: Status 404 returned error can't find the container with id 593a4db525211c1202b0a1d5ce899c371368a22e1e9d90fbd453d06a8e864c94 Oct 11 01:09:01 crc 
kubenswrapper[4743]: I1011 01:09:01.223571 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-ddb98f99b-cbc66"] Oct 11 01:09:01 crc kubenswrapper[4743]: W1011 01:09:01.226079 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2726f212_a3ba_48cb_a96f_8d5f117a7f5e.slice/crio-7b74f986f48c3a76ce72a005a24ee11ceaba776f97a44db75063a0ea3d65f398 WatchSource:0}: Error finding container 7b74f986f48c3a76ce72a005a24ee11ceaba776f97a44db75063a0ea3d65f398: Status 404 returned error can't find the container with id 7b74f986f48c3a76ce72a005a24ee11ceaba776f97a44db75063a0ea3d65f398 Oct 11 01:09:01 crc kubenswrapper[4743]: I1011 01:09:01.352484 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cec9bfbd-c515-4bcc-8bf7-63648ecd230b-cert\") pod \"openstack-operator-controller-manager-6bc9d748dc-5vs6z\" (UID: \"cec9bfbd-c515-4bcc-8bf7-63648ecd230b\") " pod="openstack-operators/openstack-operator-controller-manager-6bc9d748dc-5vs6z" Oct 11 01:09:01 crc kubenswrapper[4743]: I1011 01:09:01.358896 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cec9bfbd-c515-4bcc-8bf7-63648ecd230b-cert\") pod \"openstack-operator-controller-manager-6bc9d748dc-5vs6z\" (UID: \"cec9bfbd-c515-4bcc-8bf7-63648ecd230b\") " pod="openstack-operators/openstack-operator-controller-manager-6bc9d748dc-5vs6z" Oct 11 01:09:01 crc kubenswrapper[4743]: I1011 01:09:01.403159 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-664664cb68-6pzz7"] Oct 11 01:09:01 crc kubenswrapper[4743]: I1011 01:09:01.408415 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-797d478b46-2htrs"] Oct 11 01:09:01 crc 
kubenswrapper[4743]: W1011 01:09:01.411144 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod035ca3e9_4fd6_4cb1_8e82_8ebcf39146a3.slice/crio-d7f53ff98b93842714ce68dbe485275ca929794a91881d2ccfa46f6ac5e39d2f WatchSource:0}: Error finding container d7f53ff98b93842714ce68dbe485275ca929794a91881d2ccfa46f6ac5e39d2f: Status 404 returned error can't find the container with id d7f53ff98b93842714ce68dbe485275ca929794a91881d2ccfa46f6ac5e39d2f Oct 11 01:09:01 crc kubenswrapper[4743]: I1011 01:09:01.412836 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-679ff79844-2dvm2"] Oct 11 01:09:01 crc kubenswrapper[4743]: I1011 01:09:01.417128 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-cxzw6"] Oct 11 01:09:01 crc kubenswrapper[4743]: W1011 01:09:01.421185 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36bd30aa_037d_4e7d_ae0f_fb53fe20f812.slice/crio-a8d7d2d1839e0ec3ebe4e7a289b0def48fdc0fb2b55d0958ae9feacb612d29a3 WatchSource:0}: Error finding container a8d7d2d1839e0ec3ebe4e7a289b0def48fdc0fb2b55d0958ae9feacb612d29a3: Status 404 returned error can't find the container with id a8d7d2d1839e0ec3ebe4e7a289b0def48fdc0fb2b55d0958ae9feacb612d29a3 Oct 11 01:09:01 crc kubenswrapper[4743]: I1011 01:09:01.534019 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6bc9d748dc-5vs6z" Oct 11 01:09:01 crc kubenswrapper[4743]: I1011 01:09:01.558001 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f8621ecf-ab1f-40e5-9dd9-4d9d6fd8563b-cert\") pod \"infra-operator-controller-manager-585fc5b659-6pvq9\" (UID: \"f8621ecf-ab1f-40e5-9dd9-4d9d6fd8563b\") " pod="openstack-operators/infra-operator-controller-manager-585fc5b659-6pvq9" Oct 11 01:09:01 crc kubenswrapper[4743]: I1011 01:09:01.575787 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f8621ecf-ab1f-40e5-9dd9-4d9d6fd8563b-cert\") pod \"infra-operator-controller-manager-585fc5b659-6pvq9\" (UID: \"f8621ecf-ab1f-40e5-9dd9-4d9d6fd8563b\") " pod="openstack-operators/infra-operator-controller-manager-585fc5b659-6pvq9" Oct 11 01:09:01 crc kubenswrapper[4743]: I1011 01:09:01.594983 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-nt865"] Oct 11 01:09:01 crc kubenswrapper[4743]: I1011 01:09:01.603258 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-57bb74c7bf-d29v4"] Oct 11 01:09:01 crc kubenswrapper[4743]: E1011 01:09:01.605789 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:315e558023b41ac1aa215082096995a03810c5b42910a33b00427ffcac9c6a14,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2q4r7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
ovn-operator-controller-manager-869cc7797f-vt9w7_openstack-operators(c39ea94c-0f30-4b04-8a87-848ee9a62740): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 11 01:09:01 crc kubenswrapper[4743]: E1011 01:09:01.608381 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:09deecf840d38ff6af3c924729cf0a9444bc985848bfbe7c918019b88a6bc4d7,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2n8mn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-6d7c7ddf95-nt865_openstack-operators(a79f9419-0d04-41bb-b1ab-1615888819df): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 11 01:09:01 crc kubenswrapper[4743]: E1011 01:09:01.617773 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:98a5233f0596591acdf2c6a5838b08be108787cdb6ad1995b2b7886bac0fe6ca,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fb9dp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-646675d848-ctp5x_openstack-operators(f1a6e436-6a36-45a8-a033-f8d307ba12bd): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 11 01:09:01 crc kubenswrapper[4743]: I1011 01:09:01.621723 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-869cc7797f-vt9w7"] Oct 11 01:09:01 crc kubenswrapper[4743]: I1011 01:09:01.628160 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/test-operator-controller-manager-ffcdd6c94-9qjkk"] Oct 11 01:09:01 crc kubenswrapper[4743]: E1011 01:09:01.631070 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:7e584b1c430441c8b6591dadeff32e065de8a185ad37ef90d2e08d37e59aab4a,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4sgcb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-ffcdd6c94-9qjkk_openstack-operators(962796b2-2bc0-4db5-84be-df36bbc28121): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 11 01:09:01 crc kubenswrapper[4743]: E1011 01:09:01.631249 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m 
DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-t7bw2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-5f97d8c699-w94wj_openstack-operators(fd811aaa-ab9b-4d34-a268-ecbfa76bf43a): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 11 01:09:01 crc kubenswrapper[4743]: E1011 01:09:01.632440 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-w94wj" podUID="fd811aaa-ab9b-4d34-a268-ecbfa76bf43a" Oct 11 01:09:01 crc kubenswrapper[4743]: I1011 01:09:01.655972 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-646675d848-ctp5x"] Oct 11 01:09:01 crc kubenswrapper[4743]: I1011 01:09:01.665405 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-w94wj"] Oct 11 01:09:01 crc kubenswrapper[4743]: E1011 01:09:01.782062 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with 
ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-vt9w7" podUID="c39ea94c-0f30-4b04-8a87-848ee9a62740" Oct 11 01:09:01 crc kubenswrapper[4743]: I1011 01:09:01.808353 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-679ff79844-2dvm2" event={"ID":"7967fda3-d5ca-4e28-878e-e50017efc60f","Type":"ContainerStarted","Data":"801e29833db726ec4d2167a551c698451de2be6a54c74d124269a376602b6fc1"} Oct 11 01:09:01 crc kubenswrapper[4743]: I1011 01:09:01.809608 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-5t64t" event={"ID":"547f1b93-dffd-4c63-964a-e3ea6d29970e","Type":"ContainerStarted","Data":"5539655564fddc794e3d2ec4fc79e8a961ff4e9a4d5f2db0bc8b7d631b5e07f9"} Oct 11 01:09:01 crc kubenswrapper[4743]: I1011 01:09:01.811321 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-nt865" event={"ID":"a79f9419-0d04-41bb-b1ab-1615888819df","Type":"ContainerStarted","Data":"e02c3e9555c6e1c327ea88492dd46d1b0220eacdbc5d04e88640614947761500"} Oct 11 01:09:01 crc kubenswrapper[4743]: I1011 01:09:01.813300 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-59578bc799-vvcjm" event={"ID":"af96a190-49a7-4179-be4f-4a636d004cd0","Type":"ContainerStarted","Data":"593a4db525211c1202b0a1d5ce899c371368a22e1e9d90fbd453d06a8e864c94"} Oct 11 01:09:01 crc kubenswrapper[4743]: E1011 01:09:01.823660 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-9qjkk" podUID="962796b2-2bc0-4db5-84be-df36bbc28121" Oct 11 01:09:01 crc kubenswrapper[4743]: I1011 01:09:01.826542 4743 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-w94wj" event={"ID":"fd811aaa-ab9b-4d34-a268-ecbfa76bf43a","Type":"ContainerStarted","Data":"e22733f92b8564e7c1820cc2839624a3c7f822c63705a0206a11e32e119f6dd6"} Oct 11 01:09:01 crc kubenswrapper[4743]: E1011 01:09:01.828630 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-w94wj" podUID="fd811aaa-ab9b-4d34-a268-ecbfa76bf43a" Oct 11 01:09:01 crc kubenswrapper[4743]: I1011 01:09:01.841917 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-664664cb68-6pzz7" event={"ID":"884ddb30-3a64-4bf2-83ac-3acc83e8bd96","Type":"ContainerStarted","Data":"15c6e276015295c31d833ae416255a52bbcbe48ae53047b3969054d8887425ea"} Oct 11 01:09:01 crc kubenswrapper[4743]: I1011 01:09:01.844320 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-d29v4" event={"ID":"4be72879-00a9-4253-9ad9-c266c32b968e","Type":"ContainerStarted","Data":"1716559111b1a9e263f8d1223442ae53f00dfd9789499936f2ae7d866950f58d"} Oct 11 01:09:01 crc kubenswrapper[4743]: I1011 01:09:01.845595 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-sjsvr" event={"ID":"a2f6fc09-a2cb-46e0-9fe6-e5dad1025bfe","Type":"ContainerStarted","Data":"803c2f897a0dcf7e5ef81df52db06e4c6664bf5611f1c9a6c8348f3231f0cf97"} Oct 11 01:09:01 crc kubenswrapper[4743]: I1011 01:09:01.846657 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-6pvq9" Oct 11 01:09:01 crc kubenswrapper[4743]: I1011 01:09:01.847317 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-cxzw6" event={"ID":"36bd30aa-037d-4e7d-ae0f-fb53fe20f812","Type":"ContainerStarted","Data":"a8d7d2d1839e0ec3ebe4e7a289b0def48fdc0fb2b55d0958ae9feacb612d29a3"} Oct 11 01:09:01 crc kubenswrapper[4743]: I1011 01:09:01.849053 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-95j97" event={"ID":"01d65505-63e1-4355-a1dc-675d22f5bdea","Type":"ContainerStarted","Data":"2468fdb8cd73ec94a9e4b1b4711b939e8d9bf01f0b976d65a8d3f6f812f56919"} Oct 11 01:09:01 crc kubenswrapper[4743]: E1011 01:09:01.850195 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-646675d848-ctp5x" podUID="f1a6e436-6a36-45a8-a033-f8d307ba12bd" Oct 11 01:09:01 crc kubenswrapper[4743]: I1011 01:09:01.850431 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-nfnkk" event={"ID":"ed4aa42c-bd83-4fa8-99f2-5d7cde436979","Type":"ContainerStarted","Data":"cfbde479f200157e0940307cb4b6aec376ed206e6ec9ff90a56e56c8963ac7d3"} Oct 11 01:09:01 crc kubenswrapper[4743]: I1011 01:09:01.862226 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-qs4hf" event={"ID":"be590325-60d2-4f91-9e78-a5520788cfed","Type":"ContainerStarted","Data":"38015dc81609b802fcdab84cc458d2954b05476f84abbb45bbf7ba5543a1e976"} Oct 11 01:09:01 crc kubenswrapper[4743]: I1011 01:09:01.864242 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-9qjkk" event={"ID":"962796b2-2bc0-4db5-84be-df36bbc28121","Type":"ContainerStarted","Data":"98b0326f5bcc528281225095d64e3c3149ed65e45399f5ea37ea1ab474da6707"} Oct 11 01:09:01 crc kubenswrapper[4743]: I1011 01:09:01.875971 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6b8570e7-9b29-4a55-95e3-a3a588ba4083-cert\") pod \"openstack-baremetal-operator-controller-manager-6cc7fb757dddnmv\" (UID: \"6b8570e7-9b29-4a55-95e3-a3a588ba4083\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dddnmv" Oct 11 01:09:01 crc kubenswrapper[4743]: I1011 01:09:01.876969 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-vt9w7" event={"ID":"c39ea94c-0f30-4b04-8a87-848ee9a62740","Type":"ContainerStarted","Data":"74795a5f28780f4c984cd42556238b16c27cd03449c7a3cd0ed1a5d29c966d7f"} Oct 11 01:09:01 crc kubenswrapper[4743]: I1011 01:09:01.877013 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-vt9w7" event={"ID":"c39ea94c-0f30-4b04-8a87-848ee9a62740","Type":"ContainerStarted","Data":"c6bbda6564ec1ab42fd204d2def4f1bae2a3e3915d73c9020302b8e6c0164cab"} Oct 11 01:09:01 crc kubenswrapper[4743]: E1011 01:09:01.885669 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:7e584b1c430441c8b6591dadeff32e065de8a185ad37ef90d2e08d37e59aab4a\\\"\"" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-9qjkk" podUID="962796b2-2bc0-4db5-84be-df36bbc28121" Oct 11 01:09:01 crc kubenswrapper[4743]: E1011 01:09:01.885879 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" 
with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:315e558023b41ac1aa215082096995a03810c5b42910a33b00427ffcac9c6a14\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-vt9w7" podUID="c39ea94c-0f30-4b04-8a87-848ee9a62740" Oct 11 01:09:01 crc kubenswrapper[4743]: I1011 01:09:01.886509 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-zrk9n" event={"ID":"53fca326-a309-4ff5-b52f-8b547496c069","Type":"ContainerStarted","Data":"a087b5635dae1333cb645db4cebff4d1ba784190468ad419c7891e31410fbb8f"} Oct 11 01:09:01 crc kubenswrapper[4743]: I1011 01:09:01.887599 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-cbc66" event={"ID":"2726f212-a3ba-48cb-a96f-8d5f117a7f5e","Type":"ContainerStarted","Data":"7b74f986f48c3a76ce72a005a24ee11ceaba776f97a44db75063a0ea3d65f398"} Oct 11 01:09:01 crc kubenswrapper[4743]: I1011 01:09:01.900366 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6b8570e7-9b29-4a55-95e3-a3a588ba4083-cert\") pod \"openstack-baremetal-operator-controller-manager-6cc7fb757dddnmv\" (UID: \"6b8570e7-9b29-4a55-95e3-a3a588ba4083\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dddnmv" Oct 11 01:09:01 crc kubenswrapper[4743]: I1011 01:09:01.905173 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-646675d848-ctp5x" event={"ID":"f1a6e436-6a36-45a8-a033-f8d307ba12bd","Type":"ContainerStarted","Data":"57bd3506152e94279683c3846eb035bea119515275392c333ef39df933e55cb8"} Oct 11 01:09:01 crc kubenswrapper[4743]: E1011 01:09:01.907536 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:98a5233f0596591acdf2c6a5838b08be108787cdb6ad1995b2b7886bac0fe6ca\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-646675d848-ctp5x" podUID="f1a6e436-6a36-45a8-a033-f8d307ba12bd" Oct 11 01:09:01 crc kubenswrapper[4743]: E1011 01:09:01.908533 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-nt865" podUID="a79f9419-0d04-41bb-b1ab-1615888819df" Oct 11 01:09:01 crc kubenswrapper[4743]: I1011 01:09:01.909052 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-2htrs" event={"ID":"035ca3e9-4fd6-4cb1-8e82-8ebcf39146a3","Type":"ContainerStarted","Data":"d7f53ff98b93842714ce68dbe485275ca929794a91881d2ccfa46f6ac5e39d2f"} Oct 11 01:09:01 crc kubenswrapper[4743]: I1011 01:09:01.963517 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dddnmv" Oct 11 01:09:01 crc kubenswrapper[4743]: I1011 01:09:01.983295 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6bc9d748dc-5vs6z"] Oct 11 01:09:02 crc kubenswrapper[4743]: I1011 01:09:02.443905 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-585fc5b659-6pvq9"] Oct 11 01:09:02 crc kubenswrapper[4743]: I1011 01:09:02.525287 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dddnmv"] Oct 11 01:09:02 crc kubenswrapper[4743]: I1011 01:09:02.923269 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6bc9d748dc-5vs6z" event={"ID":"cec9bfbd-c515-4bcc-8bf7-63648ecd230b","Type":"ContainerStarted","Data":"d94802fc5991c12533e88caee38b806e1753c14470748dc6ff606137d5cab065"} Oct 11 01:09:02 crc kubenswrapper[4743]: I1011 01:09:02.923493 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-6bc9d748dc-5vs6z" Oct 11 01:09:02 crc kubenswrapper[4743]: I1011 01:09:02.923507 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6bc9d748dc-5vs6z" event={"ID":"cec9bfbd-c515-4bcc-8bf7-63648ecd230b","Type":"ContainerStarted","Data":"01733dfce76c26de070edbdd21a5a99b55b61c0ba9dcf2b135d90a56f14f65ee"} Oct 11 01:09:02 crc kubenswrapper[4743]: I1011 01:09:02.923519 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6bc9d748dc-5vs6z" event={"ID":"cec9bfbd-c515-4bcc-8bf7-63648ecd230b","Type":"ContainerStarted","Data":"75d4eb6a526188e41a357e5f9e9670fe2036aadce7d22f053564bb8d07125a59"} Oct 11 
01:09:02 crc kubenswrapper[4743]: I1011 01:09:02.928772 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-nt865" event={"ID":"a79f9419-0d04-41bb-b1ab-1615888819df","Type":"ContainerStarted","Data":"23fcb5af5762dba781bd719f13b808f12d24874e15b5eb916b6344417fbdf69e"} Oct 11 01:09:02 crc kubenswrapper[4743]: E1011 01:09:02.930928 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:09deecf840d38ff6af3c924729cf0a9444bc985848bfbe7c918019b88a6bc4d7\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-nt865" podUID="a79f9419-0d04-41bb-b1ab-1615888819df" Oct 11 01:09:02 crc kubenswrapper[4743]: I1011 01:09:02.933287 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-9qjkk" event={"ID":"962796b2-2bc0-4db5-84be-df36bbc28121","Type":"ContainerStarted","Data":"6aa4c195fbea35100d2b65991a9a40498791665fa545b3268da12164d9d7a86e"} Oct 11 01:09:02 crc kubenswrapper[4743]: E1011 01:09:02.935383 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:7e584b1c430441c8b6591dadeff32e065de8a185ad37ef90d2e08d37e59aab4a\\\"\"" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-9qjkk" podUID="962796b2-2bc0-4db5-84be-df36bbc28121" Oct 11 01:09:02 crc kubenswrapper[4743]: I1011 01:09:02.936563 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-6pvq9" event={"ID":"f8621ecf-ab1f-40e5-9dd9-4d9d6fd8563b","Type":"ContainerStarted","Data":"d170f5828b9d7dd9170405193e4c2523bd89dfa9c42a3ca07afd14e2004b15e6"} Oct 11 01:09:02 crc 
kubenswrapper[4743]: I1011 01:09:02.939455 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-646675d848-ctp5x" event={"ID":"f1a6e436-6a36-45a8-a033-f8d307ba12bd","Type":"ContainerStarted","Data":"645c985f7d7fecac5db62d099439259a5e9ccdcefc02c205d1d4523a6740bcf0"} Oct 11 01:09:02 crc kubenswrapper[4743]: E1011 01:09:02.941455 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:98a5233f0596591acdf2c6a5838b08be108787cdb6ad1995b2b7886bac0fe6ca\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-646675d848-ctp5x" podUID="f1a6e436-6a36-45a8-a033-f8d307ba12bd" Oct 11 01:09:02 crc kubenswrapper[4743]: I1011 01:09:02.943344 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dddnmv" event={"ID":"6b8570e7-9b29-4a55-95e3-a3a588ba4083","Type":"ContainerStarted","Data":"30e0a587171ab4a70853361f4f57dbe72560070c9f2ecb0455e6ef35f8e31f2b"} Oct 11 01:09:02 crc kubenswrapper[4743]: E1011 01:09:02.945073 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:315e558023b41ac1aa215082096995a03810c5b42910a33b00427ffcac9c6a14\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-vt9w7" podUID="c39ea94c-0f30-4b04-8a87-848ee9a62740" Oct 11 01:09:02 crc kubenswrapper[4743]: E1011 01:09:02.951350 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" 
pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-w94wj" podUID="fd811aaa-ab9b-4d34-a268-ecbfa76bf43a" Oct 11 01:09:02 crc kubenswrapper[4743]: I1011 01:09:02.955658 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-6bc9d748dc-5vs6z" podStartSLOduration=2.9556405850000003 podStartE2EDuration="2.955640585s" podCreationTimestamp="2025-10-11 01:09:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 01:09:02.94507221 +0000 UTC m=+1037.598052607" watchObservedRunningTime="2025-10-11 01:09:02.955640585 +0000 UTC m=+1037.608620982" Oct 11 01:09:03 crc kubenswrapper[4743]: E1011 01:09:03.954690 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:09deecf840d38ff6af3c924729cf0a9444bc985848bfbe7c918019b88a6bc4d7\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-nt865" podUID="a79f9419-0d04-41bb-b1ab-1615888819df" Oct 11 01:09:03 crc kubenswrapper[4743]: E1011 01:09:03.955081 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:98a5233f0596591acdf2c6a5838b08be108787cdb6ad1995b2b7886bac0fe6ca\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-646675d848-ctp5x" podUID="f1a6e436-6a36-45a8-a033-f8d307ba12bd" Oct 11 01:09:03 crc kubenswrapper[4743]: E1011 01:09:03.955425 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/test-operator@sha256:7e584b1c430441c8b6591dadeff32e065de8a185ad37ef90d2e08d37e59aab4a\\\"\"" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-9qjkk" podUID="962796b2-2bc0-4db5-84be-df36bbc28121" Oct 11 01:09:11 crc kubenswrapper[4743]: I1011 01:09:11.540311 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-6bc9d748dc-5vs6z" Oct 11 01:09:13 crc kubenswrapper[4743]: I1011 01:09:13.022184 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-cbc66" event={"ID":"2726f212-a3ba-48cb-a96f-8d5f117a7f5e","Type":"ContainerStarted","Data":"00f4ad3c3f3855a1a79d910998054df3964de869847a1c5790fda05a3654868e"} Oct 11 01:09:13 crc kubenswrapper[4743]: I1011 01:09:13.025762 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-cxzw6" event={"ID":"36bd30aa-037d-4e7d-ae0f-fb53fe20f812","Type":"ContainerStarted","Data":"dec5e8a74ae6101f2ed21bc4fc89874b3d3afb941debe4c949ae35ce6972d843"} Oct 11 01:09:13 crc kubenswrapper[4743]: I1011 01:09:13.027702 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-2htrs" event={"ID":"035ca3e9-4fd6-4cb1-8e82-8ebcf39146a3","Type":"ContainerStarted","Data":"514cfe5b783a16150bac96e78b0a33ba596cd728e76f7cbda04ba5e2c222b724"} Oct 11 01:09:13 crc kubenswrapper[4743]: I1011 01:09:13.029098 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-95j97" event={"ID":"01d65505-63e1-4355-a1dc-675d22f5bdea","Type":"ContainerStarted","Data":"1c78296004ecb58e8da55bfb9c81249df64ed137c96f2e66bf881f1f9575069e"} Oct 11 01:09:13 crc kubenswrapper[4743]: I1011 01:09:13.038281 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/placement-operator-controller-manager-664664cb68-6pzz7" event={"ID":"884ddb30-3a64-4bf2-83ac-3acc83e8bd96","Type":"ContainerStarted","Data":"e7a8da88d7e98984a1b7e813eb27fc6810c85ce88cd616d45337e2853b1df00a"} Oct 11 01:09:13 crc kubenswrapper[4743]: I1011 01:09:13.058287 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dddnmv" event={"ID":"6b8570e7-9b29-4a55-95e3-a3a588ba4083","Type":"ContainerStarted","Data":"58f1cfe246de6e86b0f420557a5bc61b5bc80be4d385057bc872a2658cd2ee2e"} Oct 11 01:09:13 crc kubenswrapper[4743]: I1011 01:09:13.066567 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-nfnkk" event={"ID":"ed4aa42c-bd83-4fa8-99f2-5d7cde436979","Type":"ContainerStarted","Data":"f02db5c01add858ed7ba9932e63aa528e1b90b1d77ffad9d748edabbf224c9e5"} Oct 11 01:09:13 crc kubenswrapper[4743]: I1011 01:09:13.070835 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-lbnjc" event={"ID":"088785b6-72f1-472b-accb-fef0261e024b","Type":"ContainerStarted","Data":"3160555609c5fd9d894deebb039baca37056372fbe9112acb8289bc7d0fdebda"} Oct 11 01:09:13 crc kubenswrapper[4743]: I1011 01:09:13.070893 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-lbnjc" event={"ID":"088785b6-72f1-472b-accb-fef0261e024b","Type":"ContainerStarted","Data":"ecf9b369b0e2ab218c4630dd460fcee0675cc96144d4b5d6a01ef81c2615c659"} Oct 11 01:09:13 crc kubenswrapper[4743]: I1011 01:09:13.070909 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-lbnjc" Oct 11 01:09:13 crc kubenswrapper[4743]: I1011 01:09:13.084445 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/manila-operator-controller-manager-59578bc799-vvcjm" event={"ID":"af96a190-49a7-4179-be4f-4a636d004cd0","Type":"ContainerStarted","Data":"657d8fc61b6673cf74fc106154a2175072a1c6dd423de44dc6653382883fbb84"} Oct 11 01:09:13 crc kubenswrapper[4743]: I1011 01:09:13.105140 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-679ff79844-2dvm2" event={"ID":"7967fda3-d5ca-4e28-878e-e50017efc60f","Type":"ContainerStarted","Data":"74206c3998ef1786e603f66a1a39551ee6d7bd107958daf310c8c547199ee3cc"} Oct 11 01:09:13 crc kubenswrapper[4743]: I1011 01:09:13.105440 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-lbnjc" podStartSLOduration=2.765797847 podStartE2EDuration="14.105419086s" podCreationTimestamp="2025-10-11 01:08:59 +0000 UTC" firstStartedPulling="2025-10-11 01:09:00.395842209 +0000 UTC m=+1035.048822606" lastFinishedPulling="2025-10-11 01:09:11.735463428 +0000 UTC m=+1046.388443845" observedRunningTime="2025-10-11 01:09:13.095784594 +0000 UTC m=+1047.748764991" watchObservedRunningTime="2025-10-11 01:09:13.105419086 +0000 UTC m=+1047.758399493" Oct 11 01:09:13 crc kubenswrapper[4743]: I1011 01:09:13.111068 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-d29v4" event={"ID":"4be72879-00a9-4253-9ad9-c266c32b968e","Type":"ContainerStarted","Data":"83d07c43d2deee5979056657b2f445eb9fc62aeffb861df736864acc49ec8ee7"} Oct 11 01:09:13 crc kubenswrapper[4743]: I1011 01:09:13.130827 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-sjsvr" event={"ID":"a2f6fc09-a2cb-46e0-9fe6-e5dad1025bfe","Type":"ContainerStarted","Data":"bc38689a12536824c61cb461b1d9e59d314292f17feb7fee76384e427627a964"} Oct 11 01:09:13 crc kubenswrapper[4743]: I1011 01:09:13.159697 
4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-d58jj" event={"ID":"3f7d0e6e-1b92-48de-910b-ef415fac5e7c","Type":"ContainerStarted","Data":"47463273b0c53cbcb81491168694084f5f4d65e2df185c96678587d5ce6060cd"} Oct 11 01:09:13 crc kubenswrapper[4743]: I1011 01:09:13.169249 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-5t64t" event={"ID":"547f1b93-dffd-4c63-964a-e3ea6d29970e","Type":"ContainerStarted","Data":"49b96e5bb24a0db60d598c1e66a932f1ed732d935693f20f0e2eb20bf047aba1"} Oct 11 01:09:13 crc kubenswrapper[4743]: I1011 01:09:13.171009 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-zrk9n" event={"ID":"53fca326-a309-4ff5-b52f-8b547496c069","Type":"ContainerStarted","Data":"3b8f1c638dc87cf21554292cc858dfbe6c649a336d278bef4e1b95c425d4cb26"} Oct 11 01:09:13 crc kubenswrapper[4743]: I1011 01:09:13.172679 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-qs4hf" event={"ID":"be590325-60d2-4f91-9e78-a5520788cfed","Type":"ContainerStarted","Data":"b3a763a9fc5853672e6330d27ffc0c6dfe1ca8b62a50ece025bd8ae27538e917"} Oct 11 01:09:13 crc kubenswrapper[4743]: I1011 01:09:13.175438 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-6pvq9" event={"ID":"f8621ecf-ab1f-40e5-9dd9-4d9d6fd8563b","Type":"ContainerStarted","Data":"a703c8c82a87e204fe3a168755586b51fe1743dea2d753e9ec9258a0b9906851"} Oct 11 01:09:14 crc kubenswrapper[4743]: I1011 01:09:14.184889 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-cxzw6" 
event={"ID":"36bd30aa-037d-4e7d-ae0f-fb53fe20f812","Type":"ContainerStarted","Data":"964ee25c7284d63ac57b6c08f0895d6a90a84e83fa5ee52690ae79ad7e291da4"} Oct 11 01:09:14 crc kubenswrapper[4743]: I1011 01:09:14.185193 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-cxzw6" Oct 11 01:09:14 crc kubenswrapper[4743]: I1011 01:09:14.186234 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-679ff79844-2dvm2" event={"ID":"7967fda3-d5ca-4e28-878e-e50017efc60f","Type":"ContainerStarted","Data":"9694ffd9d43a9eeddf8ff96d64d167a7ce446f4e1f0fff4f7cfee22dabb2b746"} Oct 11 01:09:14 crc kubenswrapper[4743]: I1011 01:09:14.186295 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-679ff79844-2dvm2" Oct 11 01:09:14 crc kubenswrapper[4743]: I1011 01:09:14.188399 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-95j97" event={"ID":"01d65505-63e1-4355-a1dc-675d22f5bdea","Type":"ContainerStarted","Data":"1134aec51776434d51f3dc15352fee76c9aafea3410113fd3a11383f5657eb10"} Oct 11 01:09:14 crc kubenswrapper[4743]: I1011 01:09:14.188493 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-95j97" Oct 11 01:09:14 crc kubenswrapper[4743]: I1011 01:09:14.190682 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-sjsvr" event={"ID":"a2f6fc09-a2cb-46e0-9fe6-e5dad1025bfe","Type":"ContainerStarted","Data":"cf577c4b8902a3ca42ae6e063d5c6a6b35871097531aaeba682d137caec0a81e"} Oct 11 01:09:14 crc kubenswrapper[4743]: I1011 01:09:14.190797 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-sjsvr" Oct 11 01:09:14 crc kubenswrapper[4743]: I1011 01:09:14.192362 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-59578bc799-vvcjm" event={"ID":"af96a190-49a7-4179-be4f-4a636d004cd0","Type":"ContainerStarted","Data":"c6d482c20ba642e1a4818148a9bd66b7f20863a45965ea86285bc764e6b55ba2"} Oct 11 01:09:14 crc kubenswrapper[4743]: I1011 01:09:14.192484 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-59578bc799-vvcjm" Oct 11 01:09:14 crc kubenswrapper[4743]: I1011 01:09:14.193845 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-d58jj" event={"ID":"3f7d0e6e-1b92-48de-910b-ef415fac5e7c","Type":"ContainerStarted","Data":"fca14642991c0d28cc02239b5d6ffa836f52f0b3bd989d69c1b35c524c685b7a"} Oct 11 01:09:14 crc kubenswrapper[4743]: I1011 01:09:14.193949 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-d58jj" Oct 11 01:09:14 crc kubenswrapper[4743]: I1011 01:09:14.196764 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-qs4hf" event={"ID":"be590325-60d2-4f91-9e78-a5520788cfed","Type":"ContainerStarted","Data":"4db2a4f011918a407a344493d76f8f8a8567bbfb130e1735ed29f8c59e337833"} Oct 11 01:09:14 crc kubenswrapper[4743]: I1011 01:09:14.196851 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-qs4hf" Oct 11 01:09:14 crc kubenswrapper[4743]: I1011 01:09:14.199145 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-cbc66" 
event={"ID":"2726f212-a3ba-48cb-a96f-8d5f117a7f5e","Type":"ContainerStarted","Data":"193df1493fbc25739b600fa65665d950261d32ec6f46fcd731972e14ecf0c68c"} Oct 11 01:09:14 crc kubenswrapper[4743]: I1011 01:09:14.199261 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-cbc66" Oct 11 01:09:14 crc kubenswrapper[4743]: I1011 01:09:14.199707 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-cxzw6" podStartSLOduration=4.877367415 podStartE2EDuration="15.199696156s" podCreationTimestamp="2025-10-11 01:08:59 +0000 UTC" firstStartedPulling="2025-10-11 01:09:01.42680702 +0000 UTC m=+1036.079787417" lastFinishedPulling="2025-10-11 01:09:11.749135751 +0000 UTC m=+1046.402116158" observedRunningTime="2025-10-11 01:09:14.199301047 +0000 UTC m=+1048.852281444" watchObservedRunningTime="2025-10-11 01:09:14.199696156 +0000 UTC m=+1048.852676553" Oct 11 01:09:14 crc kubenswrapper[4743]: I1011 01:09:14.201372 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dddnmv" event={"ID":"6b8570e7-9b29-4a55-95e3-a3a588ba4083","Type":"ContainerStarted","Data":"0eab8786056f0675174404a8501e98d39348dad177df1eb55780b123c989de30"} Oct 11 01:09:14 crc kubenswrapper[4743]: I1011 01:09:14.202099 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dddnmv" Oct 11 01:09:14 crc kubenswrapper[4743]: I1011 01:09:14.203733 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-5t64t" event={"ID":"547f1b93-dffd-4c63-964a-e3ea6d29970e","Type":"ContainerStarted","Data":"ec4e0386c663eae800b0ce991c2660f945aad498b2b5e8aefcd369962f681d77"} Oct 11 01:09:14 crc kubenswrapper[4743]: I1011 
01:09:14.204063 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-5t64t" Oct 11 01:09:14 crc kubenswrapper[4743]: I1011 01:09:14.205611 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-nfnkk" event={"ID":"ed4aa42c-bd83-4fa8-99f2-5d7cde436979","Type":"ContainerStarted","Data":"56ef9552165a8c4dbe1d934447a57909a9f787a39d0731e53294f0925f0b6bb8"} Oct 11 01:09:14 crc kubenswrapper[4743]: I1011 01:09:14.206373 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-nfnkk" Oct 11 01:09:14 crc kubenswrapper[4743]: I1011 01:09:14.207590 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-d29v4" event={"ID":"4be72879-00a9-4253-9ad9-c266c32b968e","Type":"ContainerStarted","Data":"7dcfe4da737dfe58157444bb946f48ab31169167a940139663816e3045654c83"} Oct 11 01:09:14 crc kubenswrapper[4743]: I1011 01:09:14.207933 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-d29v4" Oct 11 01:09:14 crc kubenswrapper[4743]: I1011 01:09:14.209745 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-6pvq9" event={"ID":"f8621ecf-ab1f-40e5-9dd9-4d9d6fd8563b","Type":"ContainerStarted","Data":"58498d4126e7004d6fd720abce6a04ca1e4df48d6cd3bf47bf42561e1c4671e8"} Oct 11 01:09:14 crc kubenswrapper[4743]: I1011 01:09:14.209835 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-6pvq9" Oct 11 01:09:14 crc kubenswrapper[4743]: I1011 01:09:14.211507 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/placement-operator-controller-manager-664664cb68-6pzz7" event={"ID":"884ddb30-3a64-4bf2-83ac-3acc83e8bd96","Type":"ContainerStarted","Data":"7e0a9638d866365450b2b684c96f629c7356eb5103a54115d1edbf533861406f"} Oct 11 01:09:14 crc kubenswrapper[4743]: I1011 01:09:14.211657 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-664664cb68-6pzz7" Oct 11 01:09:14 crc kubenswrapper[4743]: I1011 01:09:14.212683 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-zrk9n" event={"ID":"53fca326-a309-4ff5-b52f-8b547496c069","Type":"ContainerStarted","Data":"70f3e997f82f50fcaf91603651a8ed0de9a42e0b94a7e3b4caab0209ee4334b2"} Oct 11 01:09:14 crc kubenswrapper[4743]: I1011 01:09:14.212817 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-zrk9n" Oct 11 01:09:14 crc kubenswrapper[4743]: I1011 01:09:14.214511 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-2htrs" event={"ID":"035ca3e9-4fd6-4cb1-8e82-8ebcf39146a3","Type":"ContainerStarted","Data":"3ab12c913eae96f8fa8b59fefefe5c32ab149c815c2ac4d54e6d1c85401ba245"} Oct 11 01:09:14 crc kubenswrapper[4743]: I1011 01:09:14.214547 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-2htrs" Oct 11 01:09:14 crc kubenswrapper[4743]: I1011 01:09:14.216344 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-cbc66" podStartSLOduration=4.667527739 podStartE2EDuration="15.216334864s" podCreationTimestamp="2025-10-11 01:08:59 +0000 UTC" firstStartedPulling="2025-10-11 01:09:01.230164955 +0000 UTC m=+1035.883145352" 
lastFinishedPulling="2025-10-11 01:09:11.77897207 +0000 UTC m=+1046.431952477" observedRunningTime="2025-10-11 01:09:14.213642886 +0000 UTC m=+1048.866623283" watchObservedRunningTime="2025-10-11 01:09:14.216334864 +0000 UTC m=+1048.869315261" Oct 11 01:09:14 crc kubenswrapper[4743]: I1011 01:09:14.234453 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-d58jj" podStartSLOduration=4.112233825 podStartE2EDuration="15.234436928s" podCreationTimestamp="2025-10-11 01:08:59 +0000 UTC" firstStartedPulling="2025-10-11 01:09:00.656758156 +0000 UTC m=+1035.309738553" lastFinishedPulling="2025-10-11 01:09:11.778961219 +0000 UTC m=+1046.431941656" observedRunningTime="2025-10-11 01:09:14.22694601 +0000 UTC m=+1048.879926407" watchObservedRunningTime="2025-10-11 01:09:14.234436928 +0000 UTC m=+1048.887417325" Oct 11 01:09:14 crc kubenswrapper[4743]: I1011 01:09:14.242511 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-59578bc799-vvcjm" podStartSLOduration=4.628789707 podStartE2EDuration="15.24249353s" podCreationTimestamp="2025-10-11 01:08:59 +0000 UTC" firstStartedPulling="2025-10-11 01:09:01.218828741 +0000 UTC m=+1035.871809138" lastFinishedPulling="2025-10-11 01:09:11.832532564 +0000 UTC m=+1046.485512961" observedRunningTime="2025-10-11 01:09:14.238683565 +0000 UTC m=+1048.891663962" watchObservedRunningTime="2025-10-11 01:09:14.24249353 +0000 UTC m=+1048.895473927" Oct 11 01:09:14 crc kubenswrapper[4743]: I1011 01:09:14.254174 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-qs4hf" podStartSLOduration=4.646517362 podStartE2EDuration="15.254158693s" podCreationTimestamp="2025-10-11 01:08:59 +0000 UTC" firstStartedPulling="2025-10-11 01:09:01.214794039 +0000 UTC m=+1035.867774436" 
lastFinishedPulling="2025-10-11 01:09:11.82243537 +0000 UTC m=+1046.475415767" observedRunningTime="2025-10-11 01:09:14.251068466 +0000 UTC m=+1048.904048863" watchObservedRunningTime="2025-10-11 01:09:14.254158693 +0000 UTC m=+1048.907139090" Oct 11 01:09:14 crc kubenswrapper[4743]: I1011 01:09:14.271884 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-sjsvr" podStartSLOduration=4.2949365 podStartE2EDuration="15.271867758s" podCreationTimestamp="2025-10-11 01:08:59 +0000 UTC" firstStartedPulling="2025-10-11 01:09:00.851302828 +0000 UTC m=+1035.504283225" lastFinishedPulling="2025-10-11 01:09:11.828234086 +0000 UTC m=+1046.481214483" observedRunningTime="2025-10-11 01:09:14.26599681 +0000 UTC m=+1048.918977207" watchObservedRunningTime="2025-10-11 01:09:14.271867758 +0000 UTC m=+1048.924848165" Oct 11 01:09:14 crc kubenswrapper[4743]: I1011 01:09:14.285186 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-679ff79844-2dvm2" podStartSLOduration=4.962161083 podStartE2EDuration="15.285172161s" podCreationTimestamp="2025-10-11 01:08:59 +0000 UTC" firstStartedPulling="2025-10-11 01:09:01.421606539 +0000 UTC m=+1036.074586936" lastFinishedPulling="2025-10-11 01:09:11.744617607 +0000 UTC m=+1046.397598014" observedRunningTime="2025-10-11 01:09:14.282866034 +0000 UTC m=+1048.935846431" watchObservedRunningTime="2025-10-11 01:09:14.285172161 +0000 UTC m=+1048.938152558" Oct 11 01:09:14 crc kubenswrapper[4743]: I1011 01:09:14.322121 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-95j97" podStartSLOduration=4.614025217 podStartE2EDuration="15.322103088s" podCreationTimestamp="2025-10-11 01:08:59 +0000 UTC" firstStartedPulling="2025-10-11 01:09:01.119701193 +0000 UTC m=+1035.772681590" 
lastFinishedPulling="2025-10-11 01:09:11.827779044 +0000 UTC m=+1046.480759461" observedRunningTime="2025-10-11 01:09:14.302005814 +0000 UTC m=+1048.954986211" watchObservedRunningTime="2025-10-11 01:09:14.322103088 +0000 UTC m=+1048.975083485" Oct 11 01:09:14 crc kubenswrapper[4743]: I1011 01:09:14.337125 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-6pvq9" podStartSLOduration=5.954374652 podStartE2EDuration="15.337114985s" podCreationTimestamp="2025-10-11 01:08:59 +0000 UTC" firstStartedPulling="2025-10-11 01:09:02.450010886 +0000 UTC m=+1037.102991283" lastFinishedPulling="2025-10-11 01:09:11.832751199 +0000 UTC m=+1046.485731616" observedRunningTime="2025-10-11 01:09:14.336221542 +0000 UTC m=+1048.989201939" watchObservedRunningTime="2025-10-11 01:09:14.337114985 +0000 UTC m=+1048.990095382" Oct 11 01:09:14 crc kubenswrapper[4743]: I1011 01:09:14.339004 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dddnmv" podStartSLOduration=6.049847438 podStartE2EDuration="15.338998692s" podCreationTimestamp="2025-10-11 01:08:59 +0000 UTC" firstStartedPulling="2025-10-11 01:09:02.542076577 +0000 UTC m=+1037.195056964" lastFinishedPulling="2025-10-11 01:09:11.831227821 +0000 UTC m=+1046.484208218" observedRunningTime="2025-10-11 01:09:14.323498403 +0000 UTC m=+1048.976478800" watchObservedRunningTime="2025-10-11 01:09:14.338998692 +0000 UTC m=+1048.991979089" Oct 11 01:09:14 crc kubenswrapper[4743]: I1011 01:09:14.360805 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-nfnkk" podStartSLOduration=4.553705534 podStartE2EDuration="15.360788289s" podCreationTimestamp="2025-10-11 01:08:59 +0000 UTC" firstStartedPulling="2025-10-11 01:09:01.027557231 +0000 UTC m=+1035.680537628" 
lastFinishedPulling="2025-10-11 01:09:11.834639986 +0000 UTC m=+1046.487620383" observedRunningTime="2025-10-11 01:09:14.357540277 +0000 UTC m=+1049.010520674" watchObservedRunningTime="2025-10-11 01:09:14.360788289 +0000 UTC m=+1049.013768696" Oct 11 01:09:14 crc kubenswrapper[4743]: I1011 01:09:14.403159 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-2htrs" podStartSLOduration=5.022853657 podStartE2EDuration="15.403142452s" podCreationTimestamp="2025-10-11 01:08:59 +0000 UTC" firstStartedPulling="2025-10-11 01:09:01.414934612 +0000 UTC m=+1036.067915009" lastFinishedPulling="2025-10-11 01:09:11.795223397 +0000 UTC m=+1046.448203804" observedRunningTime="2025-10-11 01:09:14.385896109 +0000 UTC m=+1049.038876506" watchObservedRunningTime="2025-10-11 01:09:14.403142452 +0000 UTC m=+1049.056122849" Oct 11 01:09:14 crc kubenswrapper[4743]: I1011 01:09:14.404114 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-5t64t" podStartSLOduration=4.511029062 podStartE2EDuration="15.404110056s" podCreationTimestamp="2025-10-11 01:08:59 +0000 UTC" firstStartedPulling="2025-10-11 01:09:00.886161172 +0000 UTC m=+1035.539141569" lastFinishedPulling="2025-10-11 01:09:11.779242156 +0000 UTC m=+1046.432222563" observedRunningTime="2025-10-11 01:09:14.399032159 +0000 UTC m=+1049.052012566" watchObservedRunningTime="2025-10-11 01:09:14.404110056 +0000 UTC m=+1049.057090453" Oct 11 01:09:14 crc kubenswrapper[4743]: I1011 01:09:14.420292 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-664664cb68-6pzz7" podStartSLOduration=5.088657058 podStartE2EDuration="15.420277562s" podCreationTimestamp="2025-10-11 01:08:59 +0000 UTC" firstStartedPulling="2025-10-11 01:09:01.413033584 +0000 UTC m=+1036.066013981" 
lastFinishedPulling="2025-10-11 01:09:11.744654078 +0000 UTC m=+1046.397634485" observedRunningTime="2025-10-11 01:09:14.414301502 +0000 UTC m=+1049.067281899" watchObservedRunningTime="2025-10-11 01:09:14.420277562 +0000 UTC m=+1049.073257959" Oct 11 01:09:14 crc kubenswrapper[4743]: I1011 01:09:14.434530 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-d29v4" podStartSLOduration=5.202999007 podStartE2EDuration="15.434512729s" podCreationTimestamp="2025-10-11 01:08:59 +0000 UTC" firstStartedPulling="2025-10-11 01:09:01.598365455 +0000 UTC m=+1036.251345842" lastFinishedPulling="2025-10-11 01:09:11.829879167 +0000 UTC m=+1046.482859564" observedRunningTime="2025-10-11 01:09:14.433185846 +0000 UTC m=+1049.086166253" watchObservedRunningTime="2025-10-11 01:09:14.434512729 +0000 UTC m=+1049.087493126" Oct 11 01:09:14 crc kubenswrapper[4743]: I1011 01:09:14.459519 4743 patch_prober.go:28] interesting pod/machine-config-daemon-cvm72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 01:09:14 crc kubenswrapper[4743]: I1011 01:09:14.459594 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 01:09:14 crc kubenswrapper[4743]: I1011 01:09:14.480645 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-zrk9n" podStartSLOduration=4.787413407 podStartE2EDuration="15.480626206s" podCreationTimestamp="2025-10-11 01:08:59 +0000 UTC" 
firstStartedPulling="2025-10-11 01:09:01.085087924 +0000 UTC m=+1035.738068321" lastFinishedPulling="2025-10-11 01:09:11.778300713 +0000 UTC m=+1046.431281120" observedRunningTime="2025-10-11 01:09:14.449060944 +0000 UTC m=+1049.102041341" watchObservedRunningTime="2025-10-11 01:09:14.480626206 +0000 UTC m=+1049.133606603" Oct 11 01:09:19 crc kubenswrapper[4743]: I1011 01:09:19.920755 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-lbnjc" Oct 11 01:09:19 crc kubenswrapper[4743]: I1011 01:09:19.979356 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-d58jj" Oct 11 01:09:19 crc kubenswrapper[4743]: I1011 01:09:19.985775 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-sjsvr" Oct 11 01:09:20 crc kubenswrapper[4743]: I1011 01:09:20.042608 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-5t64t" Oct 11 01:09:20 crc kubenswrapper[4743]: I1011 01:09:20.114628 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-zrk9n" Oct 11 01:09:20 crc kubenswrapper[4743]: I1011 01:09:20.199178 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-nfnkk" Oct 11 01:09:20 crc kubenswrapper[4743]: I1011 01:09:20.246584 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-95j97" Oct 11 01:09:20 crc kubenswrapper[4743]: I1011 01:09:20.314105 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-cbc66" Oct 11 01:09:20 crc kubenswrapper[4743]: I1011 01:09:20.339143 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-qs4hf" Oct 11 01:09:20 crc kubenswrapper[4743]: I1011 01:09:20.354661 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-59578bc799-vvcjm" Oct 11 01:09:20 crc kubenswrapper[4743]: I1011 01:09:20.495763 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-664664cb68-6pzz7" Oct 11 01:09:20 crc kubenswrapper[4743]: I1011 01:09:20.511826 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-cxzw6" Oct 11 01:09:20 crc kubenswrapper[4743]: I1011 01:09:20.534228 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-679ff79844-2dvm2" Oct 11 01:09:20 crc kubenswrapper[4743]: I1011 01:09:20.663619 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-2htrs" Oct 11 01:09:20 crc kubenswrapper[4743]: I1011 01:09:20.691091 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-d29v4" Oct 11 01:09:21 crc kubenswrapper[4743]: I1011 01:09:21.853241 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-6pvq9" Oct 11 01:09:21 crc kubenswrapper[4743]: I1011 01:09:21.971925 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dddnmv" Oct 11 01:09:24 crc kubenswrapper[4743]: I1011 01:09:24.322407 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-w94wj" event={"ID":"fd811aaa-ab9b-4d34-a268-ecbfa76bf43a","Type":"ContainerStarted","Data":"54782f34c3a828b409feb5bc8f32c33d219700b3ebbdd7fa3d53bf9fca235c70"} Oct 11 01:09:24 crc kubenswrapper[4743]: I1011 01:09:24.325288 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-vt9w7" event={"ID":"c39ea94c-0f30-4b04-8a87-848ee9a62740","Type":"ContainerStarted","Data":"2ae4baec005d88bb45b2dd063dad28e308bdd0107f139c9744c0edf5e997228c"} Oct 11 01:09:24 crc kubenswrapper[4743]: I1011 01:09:24.325438 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-vt9w7" Oct 11 01:09:24 crc kubenswrapper[4743]: I1011 01:09:24.354388 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-w94wj" podStartSLOduration=2.362729576 podStartE2EDuration="24.35435746s" podCreationTimestamp="2025-10-11 01:09:00 +0000 UTC" firstStartedPulling="2025-10-11 01:09:01.631170908 +0000 UTC m=+1036.284151305" lastFinishedPulling="2025-10-11 01:09:23.622798792 +0000 UTC m=+1058.275779189" observedRunningTime="2025-10-11 01:09:24.345764214 +0000 UTC m=+1058.998744681" watchObservedRunningTime="2025-10-11 01:09:24.35435746 +0000 UTC m=+1059.007337897" Oct 11 01:09:24 crc kubenswrapper[4743]: I1011 01:09:24.369176 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-vt9w7" podStartSLOduration=3.3180077629999998 podStartE2EDuration="25.369150131s" podCreationTimestamp="2025-10-11 01:08:59 +0000 UTC" 
firstStartedPulling="2025-10-11 01:09:01.605679218 +0000 UTC m=+1036.258659605" lastFinishedPulling="2025-10-11 01:09:23.656821576 +0000 UTC m=+1058.309801973" observedRunningTime="2025-10-11 01:09:24.363626523 +0000 UTC m=+1059.016606920" watchObservedRunningTime="2025-10-11 01:09:24.369150131 +0000 UTC m=+1059.022130528" Oct 11 01:09:27 crc kubenswrapper[4743]: I1011 01:09:27.354704 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-nt865" event={"ID":"a79f9419-0d04-41bb-b1ab-1615888819df","Type":"ContainerStarted","Data":"22f9e7a16170e8b89b1727d1cc9f1321bdfc40abb8949e32fb25eaffcf146c5c"} Oct 11 01:09:27 crc kubenswrapper[4743]: I1011 01:09:27.356533 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-9qjkk" event={"ID":"962796b2-2bc0-4db5-84be-df36bbc28121","Type":"ContainerStarted","Data":"496e3955d1862d00dd93c067ef2493906b2431912d1e2fdba87050ff93333917"} Oct 11 01:09:27 crc kubenswrapper[4743]: I1011 01:09:27.356585 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-nt865" Oct 11 01:09:27 crc kubenswrapper[4743]: I1011 01:09:27.356888 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-9qjkk" Oct 11 01:09:27 crc kubenswrapper[4743]: I1011 01:09:27.358914 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-646675d848-ctp5x" event={"ID":"f1a6e436-6a36-45a8-a033-f8d307ba12bd","Type":"ContainerStarted","Data":"313ec97b82d47579125ffb0a0c9f8f0f4cb424d1935bc62f9ffca85856ef0eae"} Oct 11 01:09:27 crc kubenswrapper[4743]: I1011 01:09:27.359142 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-646675d848-ctp5x" 
Oct 11 01:09:27 crc kubenswrapper[4743]: I1011 01:09:27.386132 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-nt865" podStartSLOduration=3.797973599 podStartE2EDuration="28.38611148s" podCreationTimestamp="2025-10-11 01:08:59 +0000 UTC" firstStartedPulling="2025-10-11 01:09:01.608265073 +0000 UTC m=+1036.261245470" lastFinishedPulling="2025-10-11 01:09:26.196402944 +0000 UTC m=+1060.849383351" observedRunningTime="2025-10-11 01:09:27.38335822 +0000 UTC m=+1062.036338677" watchObservedRunningTime="2025-10-11 01:09:27.38611148 +0000 UTC m=+1062.039091887" Oct 11 01:09:27 crc kubenswrapper[4743]: I1011 01:09:27.400980 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-9qjkk" podStartSLOduration=2.830882494 podStartE2EDuration="27.400951392s" podCreationTimestamp="2025-10-11 01:09:00 +0000 UTC" firstStartedPulling="2025-10-11 01:09:01.630905621 +0000 UTC m=+1036.283886018" lastFinishedPulling="2025-10-11 01:09:26.200974509 +0000 UTC m=+1060.853954916" observedRunningTime="2025-10-11 01:09:27.399436104 +0000 UTC m=+1062.052416501" watchObservedRunningTime="2025-10-11 01:09:27.400951392 +0000 UTC m=+1062.053931809" Oct 11 01:09:27 crc kubenswrapper[4743]: I1011 01:09:27.419916 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-646675d848-ctp5x" podStartSLOduration=2.845288206 podStartE2EDuration="27.419887807s" podCreationTimestamp="2025-10-11 01:09:00 +0000 UTC" firstStartedPulling="2025-10-11 01:09:01.61767884 +0000 UTC m=+1036.270659237" lastFinishedPulling="2025-10-11 01:09:26.192278441 +0000 UTC m=+1060.845258838" observedRunningTime="2025-10-11 01:09:27.417087617 +0000 UTC m=+1062.070068044" watchObservedRunningTime="2025-10-11 01:09:27.419887807 +0000 UTC m=+1062.072868224" Oct 11 01:09:30 crc 
kubenswrapper[4743]: I1011 01:09:30.721990 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-vt9w7" Oct 11 01:09:40 crc kubenswrapper[4743]: I1011 01:09:40.705417 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-nt865" Oct 11 01:09:40 crc kubenswrapper[4743]: I1011 01:09:40.784975 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-646675d848-ctp5x" Oct 11 01:09:40 crc kubenswrapper[4743]: I1011 01:09:40.873535 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-9qjkk" Oct 11 01:09:44 crc kubenswrapper[4743]: I1011 01:09:44.457921 4743 patch_prober.go:28] interesting pod/machine-config-daemon-cvm72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 01:09:44 crc kubenswrapper[4743]: I1011 01:09:44.458306 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 01:10:03 crc kubenswrapper[4743]: I1011 01:10:03.487069 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-bcz4v"] Oct 11 01:10:03 crc kubenswrapper[4743]: I1011 01:10:03.488827 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-bcz4v" Oct 11 01:10:03 crc kubenswrapper[4743]: I1011 01:10:03.491333 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Oct 11 01:10:03 crc kubenswrapper[4743]: I1011 01:10:03.491568 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-5jjkc" Oct 11 01:10:03 crc kubenswrapper[4743]: I1011 01:10:03.491734 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Oct 11 01:10:03 crc kubenswrapper[4743]: I1011 01:10:03.507786 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Oct 11 01:10:03 crc kubenswrapper[4743]: I1011 01:10:03.511783 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-bcz4v"] Oct 11 01:10:03 crc kubenswrapper[4743]: I1011 01:10:03.549977 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bx5gc\" (UniqueName: \"kubernetes.io/projected/3ec33fb3-c2bd-4887-ad14-7aadfa9faf3c-kube-api-access-bx5gc\") pod \"dnsmasq-dns-675f4bcbfc-bcz4v\" (UID: \"3ec33fb3-c2bd-4887-ad14-7aadfa9faf3c\") " pod="openstack/dnsmasq-dns-675f4bcbfc-bcz4v" Oct 11 01:10:03 crc kubenswrapper[4743]: I1011 01:10:03.550341 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ec33fb3-c2bd-4887-ad14-7aadfa9faf3c-config\") pod \"dnsmasq-dns-675f4bcbfc-bcz4v\" (UID: \"3ec33fb3-c2bd-4887-ad14-7aadfa9faf3c\") " pod="openstack/dnsmasq-dns-675f4bcbfc-bcz4v" Oct 11 01:10:03 crc kubenswrapper[4743]: I1011 01:10:03.593483 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-qgj2b"] Oct 11 01:10:03 crc kubenswrapper[4743]: I1011 01:10:03.594687 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-qgj2b" Oct 11 01:10:03 crc kubenswrapper[4743]: I1011 01:10:03.596435 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Oct 11 01:10:03 crc kubenswrapper[4743]: I1011 01:10:03.602242 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-qgj2b"] Oct 11 01:10:03 crc kubenswrapper[4743]: I1011 01:10:03.651882 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06e0f9f3-e27f-485a-b45b-32a8a41c4c1f-config\") pod \"dnsmasq-dns-78dd6ddcc-qgj2b\" (UID: \"06e0f9f3-e27f-485a-b45b-32a8a41c4c1f\") " pod="openstack/dnsmasq-dns-78dd6ddcc-qgj2b" Oct 11 01:10:03 crc kubenswrapper[4743]: I1011 01:10:03.651945 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mwx2\" (UniqueName: \"kubernetes.io/projected/06e0f9f3-e27f-485a-b45b-32a8a41c4c1f-kube-api-access-9mwx2\") pod \"dnsmasq-dns-78dd6ddcc-qgj2b\" (UID: \"06e0f9f3-e27f-485a-b45b-32a8a41c4c1f\") " pod="openstack/dnsmasq-dns-78dd6ddcc-qgj2b" Oct 11 01:10:03 crc kubenswrapper[4743]: I1011 01:10:03.651982 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06e0f9f3-e27f-485a-b45b-32a8a41c4c1f-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-qgj2b\" (UID: \"06e0f9f3-e27f-485a-b45b-32a8a41c4c1f\") " pod="openstack/dnsmasq-dns-78dd6ddcc-qgj2b" Oct 11 01:10:03 crc kubenswrapper[4743]: I1011 01:10:03.652015 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ec33fb3-c2bd-4887-ad14-7aadfa9faf3c-config\") pod \"dnsmasq-dns-675f4bcbfc-bcz4v\" (UID: \"3ec33fb3-c2bd-4887-ad14-7aadfa9faf3c\") " pod="openstack/dnsmasq-dns-675f4bcbfc-bcz4v" Oct 11 01:10:03 crc 
kubenswrapper[4743]: I1011 01:10:03.652056 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bx5gc\" (UniqueName: \"kubernetes.io/projected/3ec33fb3-c2bd-4887-ad14-7aadfa9faf3c-kube-api-access-bx5gc\") pod \"dnsmasq-dns-675f4bcbfc-bcz4v\" (UID: \"3ec33fb3-c2bd-4887-ad14-7aadfa9faf3c\") " pod="openstack/dnsmasq-dns-675f4bcbfc-bcz4v" Oct 11 01:10:03 crc kubenswrapper[4743]: I1011 01:10:03.653132 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ec33fb3-c2bd-4887-ad14-7aadfa9faf3c-config\") pod \"dnsmasq-dns-675f4bcbfc-bcz4v\" (UID: \"3ec33fb3-c2bd-4887-ad14-7aadfa9faf3c\") " pod="openstack/dnsmasq-dns-675f4bcbfc-bcz4v" Oct 11 01:10:03 crc kubenswrapper[4743]: I1011 01:10:03.667708 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bx5gc\" (UniqueName: \"kubernetes.io/projected/3ec33fb3-c2bd-4887-ad14-7aadfa9faf3c-kube-api-access-bx5gc\") pod \"dnsmasq-dns-675f4bcbfc-bcz4v\" (UID: \"3ec33fb3-c2bd-4887-ad14-7aadfa9faf3c\") " pod="openstack/dnsmasq-dns-675f4bcbfc-bcz4v" Oct 11 01:10:03 crc kubenswrapper[4743]: I1011 01:10:03.753196 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06e0f9f3-e27f-485a-b45b-32a8a41c4c1f-config\") pod \"dnsmasq-dns-78dd6ddcc-qgj2b\" (UID: \"06e0f9f3-e27f-485a-b45b-32a8a41c4c1f\") " pod="openstack/dnsmasq-dns-78dd6ddcc-qgj2b" Oct 11 01:10:03 crc kubenswrapper[4743]: I1011 01:10:03.753250 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mwx2\" (UniqueName: \"kubernetes.io/projected/06e0f9f3-e27f-485a-b45b-32a8a41c4c1f-kube-api-access-9mwx2\") pod \"dnsmasq-dns-78dd6ddcc-qgj2b\" (UID: \"06e0f9f3-e27f-485a-b45b-32a8a41c4c1f\") " pod="openstack/dnsmasq-dns-78dd6ddcc-qgj2b" Oct 11 01:10:03 crc kubenswrapper[4743]: I1011 01:10:03.753294 4743 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06e0f9f3-e27f-485a-b45b-32a8a41c4c1f-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-qgj2b\" (UID: \"06e0f9f3-e27f-485a-b45b-32a8a41c4c1f\") " pod="openstack/dnsmasq-dns-78dd6ddcc-qgj2b" Oct 11 01:10:03 crc kubenswrapper[4743]: I1011 01:10:03.754047 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06e0f9f3-e27f-485a-b45b-32a8a41c4c1f-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-qgj2b\" (UID: \"06e0f9f3-e27f-485a-b45b-32a8a41c4c1f\") " pod="openstack/dnsmasq-dns-78dd6ddcc-qgj2b" Oct 11 01:10:03 crc kubenswrapper[4743]: I1011 01:10:03.754570 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06e0f9f3-e27f-485a-b45b-32a8a41c4c1f-config\") pod \"dnsmasq-dns-78dd6ddcc-qgj2b\" (UID: \"06e0f9f3-e27f-485a-b45b-32a8a41c4c1f\") " pod="openstack/dnsmasq-dns-78dd6ddcc-qgj2b" Oct 11 01:10:03 crc kubenswrapper[4743]: I1011 01:10:03.769206 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mwx2\" (UniqueName: \"kubernetes.io/projected/06e0f9f3-e27f-485a-b45b-32a8a41c4c1f-kube-api-access-9mwx2\") pod \"dnsmasq-dns-78dd6ddcc-qgj2b\" (UID: \"06e0f9f3-e27f-485a-b45b-32a8a41c4c1f\") " pod="openstack/dnsmasq-dns-78dd6ddcc-qgj2b" Oct 11 01:10:03 crc kubenswrapper[4743]: I1011 01:10:03.810587 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-bcz4v" Oct 11 01:10:03 crc kubenswrapper[4743]: I1011 01:10:03.915636 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-qgj2b" Oct 11 01:10:04 crc kubenswrapper[4743]: I1011 01:10:04.273023 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-bcz4v"] Oct 11 01:10:04 crc kubenswrapper[4743]: I1011 01:10:04.383303 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-qgj2b"] Oct 11 01:10:04 crc kubenswrapper[4743]: I1011 01:10:04.735870 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-bcz4v" event={"ID":"3ec33fb3-c2bd-4887-ad14-7aadfa9faf3c","Type":"ContainerStarted","Data":"b9eedfd6c8ed3a523993b8b6bc407b0711f2217857934d4c20abdaa392971355"} Oct 11 01:10:04 crc kubenswrapper[4743]: I1011 01:10:04.736972 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-qgj2b" event={"ID":"06e0f9f3-e27f-485a-b45b-32a8a41c4c1f","Type":"ContainerStarted","Data":"e3227cf1efa5c0c451f066cd525a6ffe68c189bdb9ef02174800382499c4748d"} Oct 11 01:10:06 crc kubenswrapper[4743]: I1011 01:10:06.639356 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-bcz4v"] Oct 11 01:10:06 crc kubenswrapper[4743]: I1011 01:10:06.666270 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-9hdp4"] Oct 11 01:10:06 crc kubenswrapper[4743]: I1011 01:10:06.667546 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-9hdp4" Oct 11 01:10:06 crc kubenswrapper[4743]: I1011 01:10:06.687556 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-9hdp4"] Oct 11 01:10:06 crc kubenswrapper[4743]: I1011 01:10:06.809672 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqgvr\" (UniqueName: \"kubernetes.io/projected/719b7532-9517-409c-8870-62a1c92dd38c-kube-api-access-jqgvr\") pod \"dnsmasq-dns-666b6646f7-9hdp4\" (UID: \"719b7532-9517-409c-8870-62a1c92dd38c\") " pod="openstack/dnsmasq-dns-666b6646f7-9hdp4" Oct 11 01:10:06 crc kubenswrapper[4743]: I1011 01:10:06.809716 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/719b7532-9517-409c-8870-62a1c92dd38c-config\") pod \"dnsmasq-dns-666b6646f7-9hdp4\" (UID: \"719b7532-9517-409c-8870-62a1c92dd38c\") " pod="openstack/dnsmasq-dns-666b6646f7-9hdp4" Oct 11 01:10:06 crc kubenswrapper[4743]: I1011 01:10:06.809754 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/719b7532-9517-409c-8870-62a1c92dd38c-dns-svc\") pod \"dnsmasq-dns-666b6646f7-9hdp4\" (UID: \"719b7532-9517-409c-8870-62a1c92dd38c\") " pod="openstack/dnsmasq-dns-666b6646f7-9hdp4" Oct 11 01:10:06 crc kubenswrapper[4743]: I1011 01:10:06.900068 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-qgj2b"] Oct 11 01:10:06 crc kubenswrapper[4743]: I1011 01:10:06.912630 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/719b7532-9517-409c-8870-62a1c92dd38c-dns-svc\") pod \"dnsmasq-dns-666b6646f7-9hdp4\" (UID: \"719b7532-9517-409c-8870-62a1c92dd38c\") " pod="openstack/dnsmasq-dns-666b6646f7-9hdp4" Oct 11 01:10:06 crc 
kubenswrapper[4743]: I1011 01:10:06.912781 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqgvr\" (UniqueName: \"kubernetes.io/projected/719b7532-9517-409c-8870-62a1c92dd38c-kube-api-access-jqgvr\") pod \"dnsmasq-dns-666b6646f7-9hdp4\" (UID: \"719b7532-9517-409c-8870-62a1c92dd38c\") " pod="openstack/dnsmasq-dns-666b6646f7-9hdp4" Oct 11 01:10:06 crc kubenswrapper[4743]: I1011 01:10:06.912800 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/719b7532-9517-409c-8870-62a1c92dd38c-config\") pod \"dnsmasq-dns-666b6646f7-9hdp4\" (UID: \"719b7532-9517-409c-8870-62a1c92dd38c\") " pod="openstack/dnsmasq-dns-666b6646f7-9hdp4" Oct 11 01:10:06 crc kubenswrapper[4743]: I1011 01:10:06.913913 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/719b7532-9517-409c-8870-62a1c92dd38c-config\") pod \"dnsmasq-dns-666b6646f7-9hdp4\" (UID: \"719b7532-9517-409c-8870-62a1c92dd38c\") " pod="openstack/dnsmasq-dns-666b6646f7-9hdp4" Oct 11 01:10:06 crc kubenswrapper[4743]: I1011 01:10:06.914476 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/719b7532-9517-409c-8870-62a1c92dd38c-dns-svc\") pod \"dnsmasq-dns-666b6646f7-9hdp4\" (UID: \"719b7532-9517-409c-8870-62a1c92dd38c\") " pod="openstack/dnsmasq-dns-666b6646f7-9hdp4" Oct 11 01:10:06 crc kubenswrapper[4743]: I1011 01:10:06.927366 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-xfzvh"] Oct 11 01:10:06 crc kubenswrapper[4743]: I1011 01:10:06.930134 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-xfzvh" Oct 11 01:10:06 crc kubenswrapper[4743]: I1011 01:10:06.939681 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-xfzvh"] Oct 11 01:10:06 crc kubenswrapper[4743]: I1011 01:10:06.960753 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqgvr\" (UniqueName: \"kubernetes.io/projected/719b7532-9517-409c-8870-62a1c92dd38c-kube-api-access-jqgvr\") pod \"dnsmasq-dns-666b6646f7-9hdp4\" (UID: \"719b7532-9517-409c-8870-62a1c92dd38c\") " pod="openstack/dnsmasq-dns-666b6646f7-9hdp4" Oct 11 01:10:06 crc kubenswrapper[4743]: I1011 01:10:06.988833 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-9hdp4" Oct 11 01:10:07 crc kubenswrapper[4743]: I1011 01:10:07.014547 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5052563c-4805-4a28-8aec-0fccc620857e-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-xfzvh\" (UID: \"5052563c-4805-4a28-8aec-0fccc620857e\") " pod="openstack/dnsmasq-dns-57d769cc4f-xfzvh" Oct 11 01:10:07 crc kubenswrapper[4743]: I1011 01:10:07.014673 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5052563c-4805-4a28-8aec-0fccc620857e-config\") pod \"dnsmasq-dns-57d769cc4f-xfzvh\" (UID: \"5052563c-4805-4a28-8aec-0fccc620857e\") " pod="openstack/dnsmasq-dns-57d769cc4f-xfzvh" Oct 11 01:10:07 crc kubenswrapper[4743]: I1011 01:10:07.014706 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8qbm\" (UniqueName: \"kubernetes.io/projected/5052563c-4805-4a28-8aec-0fccc620857e-kube-api-access-d8qbm\") pod \"dnsmasq-dns-57d769cc4f-xfzvh\" (UID: \"5052563c-4805-4a28-8aec-0fccc620857e\") " 
pod="openstack/dnsmasq-dns-57d769cc4f-xfzvh" Oct 11 01:10:07 crc kubenswrapper[4743]: I1011 01:10:07.116119 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5052563c-4805-4a28-8aec-0fccc620857e-config\") pod \"dnsmasq-dns-57d769cc4f-xfzvh\" (UID: \"5052563c-4805-4a28-8aec-0fccc620857e\") " pod="openstack/dnsmasq-dns-57d769cc4f-xfzvh" Oct 11 01:10:07 crc kubenswrapper[4743]: I1011 01:10:07.116174 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8qbm\" (UniqueName: \"kubernetes.io/projected/5052563c-4805-4a28-8aec-0fccc620857e-kube-api-access-d8qbm\") pod \"dnsmasq-dns-57d769cc4f-xfzvh\" (UID: \"5052563c-4805-4a28-8aec-0fccc620857e\") " pod="openstack/dnsmasq-dns-57d769cc4f-xfzvh" Oct 11 01:10:07 crc kubenswrapper[4743]: I1011 01:10:07.116223 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5052563c-4805-4a28-8aec-0fccc620857e-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-xfzvh\" (UID: \"5052563c-4805-4a28-8aec-0fccc620857e\") " pod="openstack/dnsmasq-dns-57d769cc4f-xfzvh" Oct 11 01:10:07 crc kubenswrapper[4743]: I1011 01:10:07.117285 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5052563c-4805-4a28-8aec-0fccc620857e-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-xfzvh\" (UID: \"5052563c-4805-4a28-8aec-0fccc620857e\") " pod="openstack/dnsmasq-dns-57d769cc4f-xfzvh" Oct 11 01:10:07 crc kubenswrapper[4743]: I1011 01:10:07.117803 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5052563c-4805-4a28-8aec-0fccc620857e-config\") pod \"dnsmasq-dns-57d769cc4f-xfzvh\" (UID: \"5052563c-4805-4a28-8aec-0fccc620857e\") " pod="openstack/dnsmasq-dns-57d769cc4f-xfzvh" Oct 11 01:10:07 crc kubenswrapper[4743]: I1011 01:10:07.138555 4743 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8qbm\" (UniqueName: \"kubernetes.io/projected/5052563c-4805-4a28-8aec-0fccc620857e-kube-api-access-d8qbm\") pod \"dnsmasq-dns-57d769cc4f-xfzvh\" (UID: \"5052563c-4805-4a28-8aec-0fccc620857e\") " pod="openstack/dnsmasq-dns-57d769cc4f-xfzvh" Oct 11 01:10:07 crc kubenswrapper[4743]: I1011 01:10:07.279686 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-xfzvh" Oct 11 01:10:07 crc kubenswrapper[4743]: I1011 01:10:07.522655 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-9hdp4"] Oct 11 01:10:07 crc kubenswrapper[4743]: W1011 01:10:07.533038 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod719b7532_9517_409c_8870_62a1c92dd38c.slice/crio-026fbccac4f567952edb1430cf2884f8a4da4d49d4d28b7ed5320888b90310e4 WatchSource:0}: Error finding container 026fbccac4f567952edb1430cf2884f8a4da4d49d4d28b7ed5320888b90310e4: Status 404 returned error can't find the container with id 026fbccac4f567952edb1430cf2884f8a4da4d49d4d28b7ed5320888b90310e4 Oct 11 01:10:07 crc kubenswrapper[4743]: I1011 01:10:07.717206 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-xfzvh"] Oct 11 01:10:07 crc kubenswrapper[4743]: W1011 01:10:07.726072 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5052563c_4805_4a28_8aec_0fccc620857e.slice/crio-3dab0e2f837f1f65cc09aad53bc4cf8fb95632c60cc835f8142635d86949d13d WatchSource:0}: Error finding container 3dab0e2f837f1f65cc09aad53bc4cf8fb95632c60cc835f8142635d86949d13d: Status 404 returned error can't find the container with id 3dab0e2f837f1f65cc09aad53bc4cf8fb95632c60cc835f8142635d86949d13d Oct 11 01:10:07 crc kubenswrapper[4743]: I1011 01:10:07.774936 4743 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-xfzvh" event={"ID":"5052563c-4805-4a28-8aec-0fccc620857e","Type":"ContainerStarted","Data":"3dab0e2f837f1f65cc09aad53bc4cf8fb95632c60cc835f8142635d86949d13d"} Oct 11 01:10:07 crc kubenswrapper[4743]: I1011 01:10:07.777510 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-9hdp4" event={"ID":"719b7532-9517-409c-8870-62a1c92dd38c","Type":"ContainerStarted","Data":"026fbccac4f567952edb1430cf2884f8a4da4d49d4d28b7ed5320888b90310e4"} Oct 11 01:10:07 crc kubenswrapper[4743]: I1011 01:10:07.789012 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 11 01:10:07 crc kubenswrapper[4743]: I1011 01:10:07.790377 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 11 01:10:07 crc kubenswrapper[4743]: I1011 01:10:07.793778 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 11 01:10:07 crc kubenswrapper[4743]: I1011 01:10:07.794551 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 11 01:10:07 crc kubenswrapper[4743]: I1011 01:10:07.794671 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 11 01:10:07 crc kubenswrapper[4743]: I1011 01:10:07.794873 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 11 01:10:07 crc kubenswrapper[4743]: I1011 01:10:07.794998 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 11 01:10:07 crc kubenswrapper[4743]: I1011 01:10:07.795166 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-qgqm2" Oct 11 01:10:07 crc kubenswrapper[4743]: I1011 01:10:07.798375 4743 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Oct 11 01:10:07 crc kubenswrapper[4743]: I1011 01:10:07.825315 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Oct 11 01:10:07 crc kubenswrapper[4743]: I1011 01:10:07.929892 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/73d7bbf0-dc76-4572-857e-fd0fb59d95cc-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"73d7bbf0-dc76-4572-857e-fd0fb59d95cc\") " pod="openstack/rabbitmq-server-0"
Oct 11 01:10:07 crc kubenswrapper[4743]: I1011 01:10:07.929975 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/73d7bbf0-dc76-4572-857e-fd0fb59d95cc-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"73d7bbf0-dc76-4572-857e-fd0fb59d95cc\") " pod="openstack/rabbitmq-server-0"
Oct 11 01:10:07 crc kubenswrapper[4743]: I1011 01:10:07.930046 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mklbt\" (UniqueName: \"kubernetes.io/projected/73d7bbf0-dc76-4572-857e-fd0fb59d95cc-kube-api-access-mklbt\") pod \"rabbitmq-server-0\" (UID: \"73d7bbf0-dc76-4572-857e-fd0fb59d95cc\") " pod="openstack/rabbitmq-server-0"
Oct 11 01:10:07 crc kubenswrapper[4743]: I1011 01:10:07.930119 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/73d7bbf0-dc76-4572-857e-fd0fb59d95cc-server-conf\") pod \"rabbitmq-server-0\" (UID: \"73d7bbf0-dc76-4572-857e-fd0fb59d95cc\") " pod="openstack/rabbitmq-server-0"
Oct 11 01:10:07 crc kubenswrapper[4743]: I1011 01:10:07.930175 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/73d7bbf0-dc76-4572-857e-fd0fb59d95cc-pod-info\") pod \"rabbitmq-server-0\" (UID: \"73d7bbf0-dc76-4572-857e-fd0fb59d95cc\") " pod="openstack/rabbitmq-server-0"
Oct 11 01:10:07 crc kubenswrapper[4743]: I1011 01:10:07.930203 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/73d7bbf0-dc76-4572-857e-fd0fb59d95cc-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"73d7bbf0-dc76-4572-857e-fd0fb59d95cc\") " pod="openstack/rabbitmq-server-0"
Oct 11 01:10:07 crc kubenswrapper[4743]: I1011 01:10:07.930282 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/73d7bbf0-dc76-4572-857e-fd0fb59d95cc-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"73d7bbf0-dc76-4572-857e-fd0fb59d95cc\") " pod="openstack/rabbitmq-server-0"
Oct 11 01:10:07 crc kubenswrapper[4743]: I1011 01:10:07.930323 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"73d7bbf0-dc76-4572-857e-fd0fb59d95cc\") " pod="openstack/rabbitmq-server-0"
Oct 11 01:10:07 crc kubenswrapper[4743]: I1011 01:10:07.930350 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/73d7bbf0-dc76-4572-857e-fd0fb59d95cc-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"73d7bbf0-dc76-4572-857e-fd0fb59d95cc\") " pod="openstack/rabbitmq-server-0"
Oct 11 01:10:07 crc kubenswrapper[4743]: I1011 01:10:07.930402 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/73d7bbf0-dc76-4572-857e-fd0fb59d95cc-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"73d7bbf0-dc76-4572-857e-fd0fb59d95cc\") " pod="openstack/rabbitmq-server-0"
Oct 11 01:10:07 crc kubenswrapper[4743]: I1011 01:10:07.930437 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/73d7bbf0-dc76-4572-857e-fd0fb59d95cc-config-data\") pod \"rabbitmq-server-0\" (UID: \"73d7bbf0-dc76-4572-857e-fd0fb59d95cc\") " pod="openstack/rabbitmq-server-0"
Oct 11 01:10:08 crc kubenswrapper[4743]: I1011 01:10:08.031710 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/73d7bbf0-dc76-4572-857e-fd0fb59d95cc-config-data\") pod \"rabbitmq-server-0\" (UID: \"73d7bbf0-dc76-4572-857e-fd0fb59d95cc\") " pod="openstack/rabbitmq-server-0"
Oct 11 01:10:08 crc kubenswrapper[4743]: I1011 01:10:08.032005 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/73d7bbf0-dc76-4572-857e-fd0fb59d95cc-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"73d7bbf0-dc76-4572-857e-fd0fb59d95cc\") " pod="openstack/rabbitmq-server-0"
Oct 11 01:10:08 crc kubenswrapper[4743]: I1011 01:10:08.032043 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/73d7bbf0-dc76-4572-857e-fd0fb59d95cc-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"73d7bbf0-dc76-4572-857e-fd0fb59d95cc\") " pod="openstack/rabbitmq-server-0"
Oct 11 01:10:08 crc kubenswrapper[4743]: I1011 01:10:08.032072 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mklbt\" (UniqueName: \"kubernetes.io/projected/73d7bbf0-dc76-4572-857e-fd0fb59d95cc-kube-api-access-mklbt\") pod \"rabbitmq-server-0\" (UID: \"73d7bbf0-dc76-4572-857e-fd0fb59d95cc\") " pod="openstack/rabbitmq-server-0"
Oct 11 01:10:08 crc kubenswrapper[4743]: I1011 01:10:08.032092 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/73d7bbf0-dc76-4572-857e-fd0fb59d95cc-server-conf\") pod \"rabbitmq-server-0\" (UID: \"73d7bbf0-dc76-4572-857e-fd0fb59d95cc\") " pod="openstack/rabbitmq-server-0"
Oct 11 01:10:08 crc kubenswrapper[4743]: I1011 01:10:08.032125 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/73d7bbf0-dc76-4572-857e-fd0fb59d95cc-pod-info\") pod \"rabbitmq-server-0\" (UID: \"73d7bbf0-dc76-4572-857e-fd0fb59d95cc\") " pod="openstack/rabbitmq-server-0"
Oct 11 01:10:08 crc kubenswrapper[4743]: I1011 01:10:08.032148 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/73d7bbf0-dc76-4572-857e-fd0fb59d95cc-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"73d7bbf0-dc76-4572-857e-fd0fb59d95cc\") " pod="openstack/rabbitmq-server-0"
Oct 11 01:10:08 crc kubenswrapper[4743]: I1011 01:10:08.032180 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/73d7bbf0-dc76-4572-857e-fd0fb59d95cc-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"73d7bbf0-dc76-4572-857e-fd0fb59d95cc\") " pod="openstack/rabbitmq-server-0"
Oct 11 01:10:08 crc kubenswrapper[4743]: I1011 01:10:08.032197 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"73d7bbf0-dc76-4572-857e-fd0fb59d95cc\") " pod="openstack/rabbitmq-server-0"
Oct 11 01:10:08 crc kubenswrapper[4743]: I1011 01:10:08.032219 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/73d7bbf0-dc76-4572-857e-fd0fb59d95cc-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"73d7bbf0-dc76-4572-857e-fd0fb59d95cc\") " pod="openstack/rabbitmq-server-0"
Oct 11 01:10:08 crc kubenswrapper[4743]: I1011 01:10:08.032236 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/73d7bbf0-dc76-4572-857e-fd0fb59d95cc-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"73d7bbf0-dc76-4572-857e-fd0fb59d95cc\") " pod="openstack/rabbitmq-server-0"
Oct 11 01:10:08 crc kubenswrapper[4743]: I1011 01:10:08.032600 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/73d7bbf0-dc76-4572-857e-fd0fb59d95cc-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"73d7bbf0-dc76-4572-857e-fd0fb59d95cc\") " pod="openstack/rabbitmq-server-0"
Oct 11 01:10:08 crc kubenswrapper[4743]: I1011 01:10:08.032669 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/73d7bbf0-dc76-4572-857e-fd0fb59d95cc-config-data\") pod \"rabbitmq-server-0\" (UID: \"73d7bbf0-dc76-4572-857e-fd0fb59d95cc\") " pod="openstack/rabbitmq-server-0"
Oct 11 01:10:08 crc kubenswrapper[4743]: I1011 01:10:08.032974 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"73d7bbf0-dc76-4572-857e-fd0fb59d95cc\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0"
Oct 11 01:10:08 crc kubenswrapper[4743]: I1011 01:10:08.033733 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/73d7bbf0-dc76-4572-857e-fd0fb59d95cc-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"73d7bbf0-dc76-4572-857e-fd0fb59d95cc\") " pod="openstack/rabbitmq-server-0"
Oct 11 01:10:08 crc kubenswrapper[4743]: I1011 01:10:08.035257 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/73d7bbf0-dc76-4572-857e-fd0fb59d95cc-server-conf\") pod \"rabbitmq-server-0\" (UID: \"73d7bbf0-dc76-4572-857e-fd0fb59d95cc\") " pod="openstack/rabbitmq-server-0"
Oct 11 01:10:08 crc kubenswrapper[4743]: I1011 01:10:08.035972 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/73d7bbf0-dc76-4572-857e-fd0fb59d95cc-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"73d7bbf0-dc76-4572-857e-fd0fb59d95cc\") " pod="openstack/rabbitmq-server-0"
Oct 11 01:10:08 crc kubenswrapper[4743]: I1011 01:10:08.038435 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/73d7bbf0-dc76-4572-857e-fd0fb59d95cc-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"73d7bbf0-dc76-4572-857e-fd0fb59d95cc\") " pod="openstack/rabbitmq-server-0"
Oct 11 01:10:08 crc kubenswrapper[4743]: I1011 01:10:08.041715 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/73d7bbf0-dc76-4572-857e-fd0fb59d95cc-pod-info\") pod \"rabbitmq-server-0\" (UID: \"73d7bbf0-dc76-4572-857e-fd0fb59d95cc\") " pod="openstack/rabbitmq-server-0"
Oct 11 01:10:08 crc kubenswrapper[4743]: I1011 01:10:08.048380 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/73d7bbf0-dc76-4572-857e-fd0fb59d95cc-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"73d7bbf0-dc76-4572-857e-fd0fb59d95cc\") " pod="openstack/rabbitmq-server-0"
Oct 11 01:10:08 crc kubenswrapper[4743]: I1011 01:10:08.052115 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Oct 11 01:10:08 crc kubenswrapper[4743]: I1011 01:10:08.052129 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/73d7bbf0-dc76-4572-857e-fd0fb59d95cc-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"73d7bbf0-dc76-4572-857e-fd0fb59d95cc\") " pod="openstack/rabbitmq-server-0"
Oct 11 01:10:08 crc kubenswrapper[4743]: I1011 01:10:08.054733 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Oct 11 01:10:08 crc kubenswrapper[4743]: I1011 01:10:08.058277 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mklbt\" (UniqueName: \"kubernetes.io/projected/73d7bbf0-dc76-4572-857e-fd0fb59d95cc-kube-api-access-mklbt\") pod \"rabbitmq-server-0\" (UID: \"73d7bbf0-dc76-4572-857e-fd0fb59d95cc\") " pod="openstack/rabbitmq-server-0"
Oct 11 01:10:08 crc kubenswrapper[4743]: I1011 01:10:08.058612 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Oct 11 01:10:08 crc kubenswrapper[4743]: I1011 01:10:08.058802 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Oct 11 01:10:08 crc kubenswrapper[4743]: I1011 01:10:08.060048 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data"
Oct 11 01:10:08 crc kubenswrapper[4743]: I1011 01:10:08.061045 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Oct 11 01:10:08 crc kubenswrapper[4743]: I1011 01:10:08.061065 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-5sk4m"
Oct 11 01:10:08 crc kubenswrapper[4743]: I1011 01:10:08.061798 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc"
Oct 11 01:10:08 crc kubenswrapper[4743]: I1011 01:10:08.062068 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Oct 11 01:10:08 crc kubenswrapper[4743]: I1011 01:10:08.063118 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"73d7bbf0-dc76-4572-857e-fd0fb59d95cc\") " pod="openstack/rabbitmq-server-0"
Oct 11 01:10:08 crc kubenswrapper[4743]: I1011 01:10:08.074211 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Oct 11 01:10:08 crc kubenswrapper[4743]: I1011 01:10:08.126148 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Oct 11 01:10:08 crc kubenswrapper[4743]: I1011 01:10:08.134708 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9f596550-b88a-49d7-9cff-cbc2d4149a2e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"9f596550-b88a-49d7-9cff-cbc2d4149a2e\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 11 01:10:08 crc kubenswrapper[4743]: I1011 01:10:08.134797 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9f596550-b88a-49d7-9cff-cbc2d4149a2e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"9f596550-b88a-49d7-9cff-cbc2d4149a2e\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 11 01:10:08 crc kubenswrapper[4743]: I1011 01:10:08.134890 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9f596550-b88a-49d7-9cff-cbc2d4149a2e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"9f596550-b88a-49d7-9cff-cbc2d4149a2e\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 11 01:10:08 crc kubenswrapper[4743]: I1011 01:10:08.134970 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9f596550-b88a-49d7-9cff-cbc2d4149a2e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9f596550-b88a-49d7-9cff-cbc2d4149a2e\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 11 01:10:08 crc kubenswrapper[4743]: I1011 01:10:08.135025 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9f596550-b88a-49d7-9cff-cbc2d4149a2e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"9f596550-b88a-49d7-9cff-cbc2d4149a2e\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 11 01:10:08 crc kubenswrapper[4743]: I1011 01:10:08.135058 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9f596550-b88a-49d7-9cff-cbc2d4149a2e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9f596550-b88a-49d7-9cff-cbc2d4149a2e\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 11 01:10:08 crc kubenswrapper[4743]: I1011 01:10:08.135084 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8px2z\" (UniqueName: \"kubernetes.io/projected/9f596550-b88a-49d7-9cff-cbc2d4149a2e-kube-api-access-8px2z\") pod \"rabbitmq-cell1-server-0\" (UID: \"9f596550-b88a-49d7-9cff-cbc2d4149a2e\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 11 01:10:08 crc kubenswrapper[4743]: I1011 01:10:08.135238 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9f596550-b88a-49d7-9cff-cbc2d4149a2e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"9f596550-b88a-49d7-9cff-cbc2d4149a2e\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 11 01:10:08 crc kubenswrapper[4743]: I1011 01:10:08.135305 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"9f596550-b88a-49d7-9cff-cbc2d4149a2e\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 11 01:10:08 crc kubenswrapper[4743]: I1011 01:10:08.135344 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9f596550-b88a-49d7-9cff-cbc2d4149a2e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"9f596550-b88a-49d7-9cff-cbc2d4149a2e\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 11 01:10:08 crc kubenswrapper[4743]: I1011 01:10:08.135408 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9f596550-b88a-49d7-9cff-cbc2d4149a2e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"9f596550-b88a-49d7-9cff-cbc2d4149a2e\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 11 01:10:08 crc kubenswrapper[4743]: I1011 01:10:08.238735 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9f596550-b88a-49d7-9cff-cbc2d4149a2e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"9f596550-b88a-49d7-9cff-cbc2d4149a2e\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 11 01:10:08 crc kubenswrapper[4743]: I1011 01:10:08.238802 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"9f596550-b88a-49d7-9cff-cbc2d4149a2e\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 11 01:10:08 crc kubenswrapper[4743]: I1011 01:10:08.238829 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9f596550-b88a-49d7-9cff-cbc2d4149a2e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"9f596550-b88a-49d7-9cff-cbc2d4149a2e\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 11 01:10:08 crc kubenswrapper[4743]: I1011 01:10:08.238847 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9f596550-b88a-49d7-9cff-cbc2d4149a2e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"9f596550-b88a-49d7-9cff-cbc2d4149a2e\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 11 01:10:08 crc kubenswrapper[4743]: I1011 01:10:08.238913 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9f596550-b88a-49d7-9cff-cbc2d4149a2e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"9f596550-b88a-49d7-9cff-cbc2d4149a2e\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 11 01:10:08 crc kubenswrapper[4743]: I1011 01:10:08.238954 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9f596550-b88a-49d7-9cff-cbc2d4149a2e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"9f596550-b88a-49d7-9cff-cbc2d4149a2e\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 11 01:10:08 crc kubenswrapper[4743]: I1011 01:10:08.239013 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9f596550-b88a-49d7-9cff-cbc2d4149a2e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"9f596550-b88a-49d7-9cff-cbc2d4149a2e\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 11 01:10:08 crc kubenswrapper[4743]: I1011 01:10:08.239037 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9f596550-b88a-49d7-9cff-cbc2d4149a2e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9f596550-b88a-49d7-9cff-cbc2d4149a2e\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 11 01:10:08 crc kubenswrapper[4743]: I1011 01:10:08.239062 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9f596550-b88a-49d7-9cff-cbc2d4149a2e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"9f596550-b88a-49d7-9cff-cbc2d4149a2e\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 11 01:10:08 crc kubenswrapper[4743]: I1011 01:10:08.239089 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9f596550-b88a-49d7-9cff-cbc2d4149a2e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9f596550-b88a-49d7-9cff-cbc2d4149a2e\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 11 01:10:08 crc kubenswrapper[4743]: I1011 01:10:08.239105 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8px2z\" (UniqueName: \"kubernetes.io/projected/9f596550-b88a-49d7-9cff-cbc2d4149a2e-kube-api-access-8px2z\") pod \"rabbitmq-cell1-server-0\" (UID: \"9f596550-b88a-49d7-9cff-cbc2d4149a2e\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 11 01:10:08 crc kubenswrapper[4743]: I1011 01:10:08.239738 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9f596550-b88a-49d7-9cff-cbc2d4149a2e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"9f596550-b88a-49d7-9cff-cbc2d4149a2e\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 11 01:10:08 crc kubenswrapper[4743]: I1011 01:10:08.239789 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9f596550-b88a-49d7-9cff-cbc2d4149a2e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"9f596550-b88a-49d7-9cff-cbc2d4149a2e\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 11 01:10:08 crc kubenswrapper[4743]: I1011 01:10:08.239844 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"9f596550-b88a-49d7-9cff-cbc2d4149a2e\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/rabbitmq-cell1-server-0"
Oct 11 01:10:08 crc kubenswrapper[4743]: I1011 01:10:08.240511 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9f596550-b88a-49d7-9cff-cbc2d4149a2e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9f596550-b88a-49d7-9cff-cbc2d4149a2e\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 11 01:10:08 crc kubenswrapper[4743]: I1011 01:10:08.243673 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9f596550-b88a-49d7-9cff-cbc2d4149a2e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"9f596550-b88a-49d7-9cff-cbc2d4149a2e\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 11 01:10:08 crc kubenswrapper[4743]: I1011 01:10:08.243845 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9f596550-b88a-49d7-9cff-cbc2d4149a2e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"9f596550-b88a-49d7-9cff-cbc2d4149a2e\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 11 01:10:08 crc kubenswrapper[4743]: I1011 01:10:08.244841 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9f596550-b88a-49d7-9cff-cbc2d4149a2e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9f596550-b88a-49d7-9cff-cbc2d4149a2e\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 11 01:10:08 crc kubenswrapper[4743]: I1011 01:10:08.246533 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9f596550-b88a-49d7-9cff-cbc2d4149a2e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"9f596550-b88a-49d7-9cff-cbc2d4149a2e\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 11 01:10:08 crc kubenswrapper[4743]: I1011 01:10:08.246774 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9f596550-b88a-49d7-9cff-cbc2d4149a2e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"9f596550-b88a-49d7-9cff-cbc2d4149a2e\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 11 01:10:08 crc kubenswrapper[4743]: I1011 01:10:08.251375 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9f596550-b88a-49d7-9cff-cbc2d4149a2e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"9f596550-b88a-49d7-9cff-cbc2d4149a2e\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 11 01:10:08 crc kubenswrapper[4743]: I1011 01:10:08.256658 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8px2z\" (UniqueName: \"kubernetes.io/projected/9f596550-b88a-49d7-9cff-cbc2d4149a2e-kube-api-access-8px2z\") pod \"rabbitmq-cell1-server-0\" (UID: \"9f596550-b88a-49d7-9cff-cbc2d4149a2e\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 11 01:10:08 crc kubenswrapper[4743]: I1011 01:10:08.277965 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"9f596550-b88a-49d7-9cff-cbc2d4149a2e\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 11 01:10:08 crc kubenswrapper[4743]: I1011 01:10:08.437723 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Oct 11 01:10:09 crc kubenswrapper[4743]: I1011 01:10:09.493871 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"]
Oct 11 01:10:09 crc kubenswrapper[4743]: I1011 01:10:09.495902 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Oct 11 01:10:09 crc kubenswrapper[4743]: I1011 01:10:09.497983 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Oct 11 01:10:09 crc kubenswrapper[4743]: I1011 01:10:09.498381 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc"
Oct 11 01:10:09 crc kubenswrapper[4743]: I1011 01:10:09.499199 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data"
Oct 11 01:10:09 crc kubenswrapper[4743]: I1011 01:10:09.499342 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-rff99"
Oct 11 01:10:09 crc kubenswrapper[4743]: I1011 01:10:09.499475 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts"
Oct 11 01:10:09 crc kubenswrapper[4743]: I1011 01:10:09.503789 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Oct 11 01:10:09 crc kubenswrapper[4743]: I1011 01:10:09.507228 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle"
Oct 11 01:10:09 crc kubenswrapper[4743]: I1011 01:10:09.563187 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32c52bf9-36b5-4a75-8991-e76f4dd87fb3-operator-scripts\") pod \"openstack-galera-0\" (UID: \"32c52bf9-36b5-4a75-8991-e76f4dd87fb3\") " pod="openstack/openstack-galera-0"
Oct 11 01:10:09 crc kubenswrapper[4743]: I1011 01:10:09.563509 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkwnr\" (UniqueName: \"kubernetes.io/projected/32c52bf9-36b5-4a75-8991-e76f4dd87fb3-kube-api-access-tkwnr\") pod \"openstack-galera-0\" (UID: \"32c52bf9-36b5-4a75-8991-e76f4dd87fb3\") " pod="openstack/openstack-galera-0"
Oct 11 01:10:09 crc kubenswrapper[4743]: I1011 01:10:09.563653 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/32c52bf9-36b5-4a75-8991-e76f4dd87fb3-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"32c52bf9-36b5-4a75-8991-e76f4dd87fb3\") " pod="openstack/openstack-galera-0"
Oct 11 01:10:09 crc kubenswrapper[4743]: I1011 01:10:09.563837 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/32c52bf9-36b5-4a75-8991-e76f4dd87fb3-config-data-default\") pod \"openstack-galera-0\" (UID: \"32c52bf9-36b5-4a75-8991-e76f4dd87fb3\") " pod="openstack/openstack-galera-0"
Oct 11 01:10:09 crc kubenswrapper[4743]: I1011 01:10:09.564028 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"32c52bf9-36b5-4a75-8991-e76f4dd87fb3\") " pod="openstack/openstack-galera-0"
Oct 11 01:10:09 crc kubenswrapper[4743]: I1011 01:10:09.564157 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/32c52bf9-36b5-4a75-8991-e76f4dd87fb3-kolla-config\") pod \"openstack-galera-0\" (UID: \"32c52bf9-36b5-4a75-8991-e76f4dd87fb3\") " pod="openstack/openstack-galera-0"
Oct 11 01:10:09 crc kubenswrapper[4743]: I1011 01:10:09.564300 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/32c52bf9-36b5-4a75-8991-e76f4dd87fb3-config-data-generated\") pod \"openstack-galera-0\" (UID: \"32c52bf9-36b5-4a75-8991-e76f4dd87fb3\") " pod="openstack/openstack-galera-0"
Oct 11 01:10:09 crc kubenswrapper[4743]: I1011 01:10:09.564465 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32c52bf9-36b5-4a75-8991-e76f4dd87fb3-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"32c52bf9-36b5-4a75-8991-e76f4dd87fb3\") " pod="openstack/openstack-galera-0"
Oct 11 01:10:09 crc kubenswrapper[4743]: I1011 01:10:09.564778 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/32c52bf9-36b5-4a75-8991-e76f4dd87fb3-secrets\") pod \"openstack-galera-0\" (UID: \"32c52bf9-36b5-4a75-8991-e76f4dd87fb3\") " pod="openstack/openstack-galera-0"
Oct 11 01:10:09 crc kubenswrapper[4743]: I1011 01:10:09.666428 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"32c52bf9-36b5-4a75-8991-e76f4dd87fb3\") " pod="openstack/openstack-galera-0"
Oct 11 01:10:09 crc kubenswrapper[4743]: I1011 01:10:09.666493 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/32c52bf9-36b5-4a75-8991-e76f4dd87fb3-kolla-config\") pod \"openstack-galera-0\" (UID: \"32c52bf9-36b5-4a75-8991-e76f4dd87fb3\") " pod="openstack/openstack-galera-0"
Oct 11 01:10:09 crc kubenswrapper[4743]: I1011 01:10:09.666542 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/32c52bf9-36b5-4a75-8991-e76f4dd87fb3-config-data-generated\") pod \"openstack-galera-0\" (UID: \"32c52bf9-36b5-4a75-8991-e76f4dd87fb3\") " pod="openstack/openstack-galera-0"
Oct 11 01:10:09 crc kubenswrapper[4743]: I1011 01:10:09.666583 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32c52bf9-36b5-4a75-8991-e76f4dd87fb3-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"32c52bf9-36b5-4a75-8991-e76f4dd87fb3\") " pod="openstack/openstack-galera-0"
Oct 11 01:10:09 crc kubenswrapper[4743]: I1011 01:10:09.666639 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/32c52bf9-36b5-4a75-8991-e76f4dd87fb3-secrets\") pod \"openstack-galera-0\" (UID: \"32c52bf9-36b5-4a75-8991-e76f4dd87fb3\") " pod="openstack/openstack-galera-0"
Oct 11 01:10:09 crc kubenswrapper[4743]: I1011 01:10:09.666689 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32c52bf9-36b5-4a75-8991-e76f4dd87fb3-operator-scripts\") pod \"openstack-galera-0\" (UID: \"32c52bf9-36b5-4a75-8991-e76f4dd87fb3\") " pod="openstack/openstack-galera-0"
Oct 11 01:10:09 crc kubenswrapper[4743]: I1011 01:10:09.666710 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkwnr\" (UniqueName: \"kubernetes.io/projected/32c52bf9-36b5-4a75-8991-e76f4dd87fb3-kube-api-access-tkwnr\") pod \"openstack-galera-0\" (UID: \"32c52bf9-36b5-4a75-8991-e76f4dd87fb3\") " pod="openstack/openstack-galera-0"
Oct 11 01:10:09 crc kubenswrapper[4743]: I1011 01:10:09.666730 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/32c52bf9-36b5-4a75-8991-e76f4dd87fb3-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"32c52bf9-36b5-4a75-8991-e76f4dd87fb3\") " pod="openstack/openstack-galera-0"
Oct 11 01:10:09 crc kubenswrapper[4743]: I1011 01:10:09.666754 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/32c52bf9-36b5-4a75-8991-e76f4dd87fb3-config-data-default\") pod \"openstack-galera-0\" (UID: \"32c52bf9-36b5-4a75-8991-e76f4dd87fb3\") " pod="openstack/openstack-galera-0"
Oct 11 01:10:09 crc kubenswrapper[4743]: I1011 01:10:09.667936 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/32c52bf9-36b5-4a75-8991-e76f4dd87fb3-config-data-default\") pod \"openstack-galera-0\" (UID: \"32c52bf9-36b5-4a75-8991-e76f4dd87fb3\") " pod="openstack/openstack-galera-0"
Oct 11 01:10:09 crc kubenswrapper[4743]: I1011 01:10:09.668900 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"32c52bf9-36b5-4a75-8991-e76f4dd87fb3\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/openstack-galera-0"
Oct 11 01:10:09 crc kubenswrapper[4743]: I1011 01:10:09.669098 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/32c52bf9-36b5-4a75-8991-e76f4dd87fb3-config-data-generated\") pod \"openstack-galera-0\" (UID: \"32c52bf9-36b5-4a75-8991-e76f4dd87fb3\") " pod="openstack/openstack-galera-0"
Oct 11 01:10:09 crc kubenswrapper[4743]: I1011 01:10:09.669623 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/32c52bf9-36b5-4a75-8991-e76f4dd87fb3-kolla-config\") pod \"openstack-galera-0\" (UID: \"32c52bf9-36b5-4a75-8991-e76f4dd87fb3\") " pod="openstack/openstack-galera-0"
Oct 11 01:10:09 crc kubenswrapper[4743]: I1011 01:10:09.671322 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32c52bf9-36b5-4a75-8991-e76f4dd87fb3-operator-scripts\") pod \"openstack-galera-0\" (UID: \"32c52bf9-36b5-4a75-8991-e76f4dd87fb3\") " pod="openstack/openstack-galera-0"
Oct 11 01:10:09 crc kubenswrapper[4743]: I1011 01:10:09.683321 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32c52bf9-36b5-4a75-8991-e76f4dd87fb3-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"32c52bf9-36b5-4a75-8991-e76f4dd87fb3\") " pod="openstack/openstack-galera-0"
Oct 11 01:10:09 crc kubenswrapper[4743]: I1011 01:10:09.683810 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/32c52bf9-36b5-4a75-8991-e76f4dd87fb3-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"32c52bf9-36b5-4a75-8991-e76f4dd87fb3\") " pod="openstack/openstack-galera-0"
Oct 11 01:10:09 crc kubenswrapper[4743]: I1011 01:10:09.695818 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/32c52bf9-36b5-4a75-8991-e76f4dd87fb3-secrets\") pod \"openstack-galera-0\" (UID: \"32c52bf9-36b5-4a75-8991-e76f4dd87fb3\") " pod="openstack/openstack-galera-0"
Oct 11 01:10:09 crc kubenswrapper[4743]: I1011 01:10:09.696250 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkwnr\" (UniqueName: \"kubernetes.io/projected/32c52bf9-36b5-4a75-8991-e76f4dd87fb3-kube-api-access-tkwnr\") pod \"openstack-galera-0\" (UID: \"32c52bf9-36b5-4a75-8991-e76f4dd87fb3\") " pod="openstack/openstack-galera-0"
Oct 11 01:10:09 crc kubenswrapper[4743]: I1011 01:10:09.707836 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"32c52bf9-36b5-4a75-8991-e76f4dd87fb3\") " pod="openstack/openstack-galera-0"
Oct 11 01:10:09 crc kubenswrapper[4743]: I1011 01:10:09.830977 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Oct 11 01:10:10 crc kubenswrapper[4743]: I1011 01:10:10.995915 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"]
Oct 11 01:10:10 crc kubenswrapper[4743]: I1011 01:10:10.998020 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Oct 11 01:10:11 crc kubenswrapper[4743]: I1011 01:10:11.006131 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-dl5d8"
Oct 11 01:10:11 crc kubenswrapper[4743]: I1011 01:10:11.006384 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc"
Oct 11 01:10:11 crc kubenswrapper[4743]: I1011 01:10:11.007401 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data"
Oct 11 01:10:11 crc kubenswrapper[4743]: I1011 01:10:11.007553 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts"
Oct 11 01:10:11 crc kubenswrapper[4743]: I1011 01:10:11.012764 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Oct 11 01:10:11 crc kubenswrapper[4743]: I1011 01:10:11.033639 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lp9kg\" (UniqueName: \"kubernetes.io/projected/f604069e-dff8-4f02-a5e8-d3ba38d87625-kube-api-access-lp9kg\") pod \"openstack-cell1-galera-0\" (UID: \"f604069e-dff8-4f02-a5e8-d3ba38d87625\") " pod="openstack/openstack-cell1-galera-0"
Oct 11 01:10:11 crc kubenswrapper[4743]: I1011 01:10:11.033688 4743 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f604069e-dff8-4f02-a5e8-d3ba38d87625-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"f604069e-dff8-4f02-a5e8-d3ba38d87625\") " pod="openstack/openstack-cell1-galera-0" Oct 11 01:10:11 crc kubenswrapper[4743]: I1011 01:10:11.033738 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f604069e-dff8-4f02-a5e8-d3ba38d87625-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"f604069e-dff8-4f02-a5e8-d3ba38d87625\") " pod="openstack/openstack-cell1-galera-0" Oct 11 01:10:11 crc kubenswrapper[4743]: I1011 01:10:11.033929 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f604069e-dff8-4f02-a5e8-d3ba38d87625-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"f604069e-dff8-4f02-a5e8-d3ba38d87625\") " pod="openstack/openstack-cell1-galera-0" Oct 11 01:10:11 crc kubenswrapper[4743]: I1011 01:10:11.034060 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: \"f604069e-dff8-4f02-a5e8-d3ba38d87625\") " pod="openstack/openstack-cell1-galera-0" Oct 11 01:10:11 crc kubenswrapper[4743]: I1011 01:10:11.034112 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f604069e-dff8-4f02-a5e8-d3ba38d87625-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"f604069e-dff8-4f02-a5e8-d3ba38d87625\") " pod="openstack/openstack-cell1-galera-0" Oct 11 01:10:11 crc kubenswrapper[4743]: I1011 01:10:11.034278 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f604069e-dff8-4f02-a5e8-d3ba38d87625-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"f604069e-dff8-4f02-a5e8-d3ba38d87625\") " pod="openstack/openstack-cell1-galera-0" Oct 11 01:10:11 crc kubenswrapper[4743]: I1011 01:10:11.034376 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/f604069e-dff8-4f02-a5e8-d3ba38d87625-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"f604069e-dff8-4f02-a5e8-d3ba38d87625\") " pod="openstack/openstack-cell1-galera-0" Oct 11 01:10:11 crc kubenswrapper[4743]: I1011 01:10:11.034477 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f604069e-dff8-4f02-a5e8-d3ba38d87625-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"f604069e-dff8-4f02-a5e8-d3ba38d87625\") " pod="openstack/openstack-cell1-galera-0" Oct 11 01:10:11 crc kubenswrapper[4743]: I1011 01:10:11.136773 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f604069e-dff8-4f02-a5e8-d3ba38d87625-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"f604069e-dff8-4f02-a5e8-d3ba38d87625\") " pod="openstack/openstack-cell1-galera-0" Oct 11 01:10:11 crc kubenswrapper[4743]: I1011 01:10:11.136883 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lp9kg\" (UniqueName: \"kubernetes.io/projected/f604069e-dff8-4f02-a5e8-d3ba38d87625-kube-api-access-lp9kg\") pod \"openstack-cell1-galera-0\" (UID: \"f604069e-dff8-4f02-a5e8-d3ba38d87625\") " pod="openstack/openstack-cell1-galera-0" Oct 11 01:10:11 crc kubenswrapper[4743]: I1011 01:10:11.136921 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f604069e-dff8-4f02-a5e8-d3ba38d87625-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"f604069e-dff8-4f02-a5e8-d3ba38d87625\") " pod="openstack/openstack-cell1-galera-0" Oct 11 01:10:11 crc kubenswrapper[4743]: I1011 01:10:11.136975 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f604069e-dff8-4f02-a5e8-d3ba38d87625-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"f604069e-dff8-4f02-a5e8-d3ba38d87625\") " pod="openstack/openstack-cell1-galera-0" Oct 11 01:10:11 crc kubenswrapper[4743]: I1011 01:10:11.137001 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f604069e-dff8-4f02-a5e8-d3ba38d87625-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"f604069e-dff8-4f02-a5e8-d3ba38d87625\") " pod="openstack/openstack-cell1-galera-0" Oct 11 01:10:11 crc kubenswrapper[4743]: I1011 01:10:11.137040 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: \"f604069e-dff8-4f02-a5e8-d3ba38d87625\") " pod="openstack/openstack-cell1-galera-0" Oct 11 01:10:11 crc kubenswrapper[4743]: I1011 01:10:11.137066 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f604069e-dff8-4f02-a5e8-d3ba38d87625-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"f604069e-dff8-4f02-a5e8-d3ba38d87625\") " pod="openstack/openstack-cell1-galera-0" Oct 11 01:10:11 crc kubenswrapper[4743]: I1011 01:10:11.137171 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f604069e-dff8-4f02-a5e8-d3ba38d87625-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"f604069e-dff8-4f02-a5e8-d3ba38d87625\") " pod="openstack/openstack-cell1-galera-0" Oct 11 01:10:11 crc kubenswrapper[4743]: I1011 01:10:11.137211 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/f604069e-dff8-4f02-a5e8-d3ba38d87625-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"f604069e-dff8-4f02-a5e8-d3ba38d87625\") " pod="openstack/openstack-cell1-galera-0" Oct 11 01:10:11 crc kubenswrapper[4743]: I1011 01:10:11.137240 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: \"f604069e-dff8-4f02-a5e8-d3ba38d87625\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/openstack-cell1-galera-0" Oct 11 01:10:11 crc kubenswrapper[4743]: I1011 01:10:11.138097 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f604069e-dff8-4f02-a5e8-d3ba38d87625-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"f604069e-dff8-4f02-a5e8-d3ba38d87625\") " pod="openstack/openstack-cell1-galera-0" Oct 11 01:10:11 crc kubenswrapper[4743]: I1011 01:10:11.138822 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f604069e-dff8-4f02-a5e8-d3ba38d87625-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"f604069e-dff8-4f02-a5e8-d3ba38d87625\") " pod="openstack/openstack-cell1-galera-0" Oct 11 01:10:11 crc kubenswrapper[4743]: I1011 01:10:11.138998 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f604069e-dff8-4f02-a5e8-d3ba38d87625-config-data-default\") pod 
\"openstack-cell1-galera-0\" (UID: \"f604069e-dff8-4f02-a5e8-d3ba38d87625\") " pod="openstack/openstack-cell1-galera-0" Oct 11 01:10:11 crc kubenswrapper[4743]: I1011 01:10:11.145549 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/f604069e-dff8-4f02-a5e8-d3ba38d87625-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"f604069e-dff8-4f02-a5e8-d3ba38d87625\") " pod="openstack/openstack-cell1-galera-0" Oct 11 01:10:11 crc kubenswrapper[4743]: I1011 01:10:11.146452 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f604069e-dff8-4f02-a5e8-d3ba38d87625-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"f604069e-dff8-4f02-a5e8-d3ba38d87625\") " pod="openstack/openstack-cell1-galera-0" Oct 11 01:10:11 crc kubenswrapper[4743]: I1011 01:10:11.154263 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f604069e-dff8-4f02-a5e8-d3ba38d87625-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"f604069e-dff8-4f02-a5e8-d3ba38d87625\") " pod="openstack/openstack-cell1-galera-0" Oct 11 01:10:11 crc kubenswrapper[4743]: I1011 01:10:11.160435 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f604069e-dff8-4f02-a5e8-d3ba38d87625-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"f604069e-dff8-4f02-a5e8-d3ba38d87625\") " pod="openstack/openstack-cell1-galera-0" Oct 11 01:10:11 crc kubenswrapper[4743]: I1011 01:10:11.161127 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: \"f604069e-dff8-4f02-a5e8-d3ba38d87625\") " pod="openstack/openstack-cell1-galera-0" Oct 11 01:10:11 crc 
kubenswrapper[4743]: I1011 01:10:11.174777 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lp9kg\" (UniqueName: \"kubernetes.io/projected/f604069e-dff8-4f02-a5e8-d3ba38d87625-kube-api-access-lp9kg\") pod \"openstack-cell1-galera-0\" (UID: \"f604069e-dff8-4f02-a5e8-d3ba38d87625\") " pod="openstack/openstack-cell1-galera-0" Oct 11 01:10:11 crc kubenswrapper[4743]: I1011 01:10:11.319756 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Oct 11 01:10:11 crc kubenswrapper[4743]: I1011 01:10:11.320737 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 11 01:10:11 crc kubenswrapper[4743]: I1011 01:10:11.324197 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Oct 11 01:10:11 crc kubenswrapper[4743]: I1011 01:10:11.324472 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-q7bds" Oct 11 01:10:11 crc kubenswrapper[4743]: I1011 01:10:11.324775 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Oct 11 01:10:11 crc kubenswrapper[4743]: I1011 01:10:11.325917 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 11 01:10:11 crc kubenswrapper[4743]: I1011 01:10:11.340488 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1159224-8c5f-43ae-8aa3-ca628c69914e-combined-ca-bundle\") pod \"memcached-0\" (UID: \"f1159224-8c5f-43ae-8aa3-ca628c69914e\") " pod="openstack/memcached-0" Oct 11 01:10:11 crc kubenswrapper[4743]: I1011 01:10:11.340662 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f1159224-8c5f-43ae-8aa3-ca628c69914e-config-data\") pod \"memcached-0\" (UID: \"f1159224-8c5f-43ae-8aa3-ca628c69914e\") " pod="openstack/memcached-0" Oct 11 01:10:11 crc kubenswrapper[4743]: I1011 01:10:11.340706 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1159224-8c5f-43ae-8aa3-ca628c69914e-memcached-tls-certs\") pod \"memcached-0\" (UID: \"f1159224-8c5f-43ae-8aa3-ca628c69914e\") " pod="openstack/memcached-0" Oct 11 01:10:11 crc kubenswrapper[4743]: I1011 01:10:11.340845 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f1159224-8c5f-43ae-8aa3-ca628c69914e-kolla-config\") pod \"memcached-0\" (UID: \"f1159224-8c5f-43ae-8aa3-ca628c69914e\") " pod="openstack/memcached-0" Oct 11 01:10:11 crc kubenswrapper[4743]: I1011 01:10:11.340994 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 11 01:10:11 crc kubenswrapper[4743]: I1011 01:10:11.341007 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jbj8\" (UniqueName: 
\"kubernetes.io/projected/f1159224-8c5f-43ae-8aa3-ca628c69914e-kube-api-access-6jbj8\") pod \"memcached-0\" (UID: \"f1159224-8c5f-43ae-8aa3-ca628c69914e\") " pod="openstack/memcached-0" Oct 11 01:10:11 crc kubenswrapper[4743]: I1011 01:10:11.442330 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f1159224-8c5f-43ae-8aa3-ca628c69914e-config-data\") pod \"memcached-0\" (UID: \"f1159224-8c5f-43ae-8aa3-ca628c69914e\") " pod="openstack/memcached-0" Oct 11 01:10:11 crc kubenswrapper[4743]: I1011 01:10:11.442386 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1159224-8c5f-43ae-8aa3-ca628c69914e-memcached-tls-certs\") pod \"memcached-0\" (UID: \"f1159224-8c5f-43ae-8aa3-ca628c69914e\") " pod="openstack/memcached-0" Oct 11 01:10:11 crc kubenswrapper[4743]: I1011 01:10:11.442410 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f1159224-8c5f-43ae-8aa3-ca628c69914e-kolla-config\") pod \"memcached-0\" (UID: \"f1159224-8c5f-43ae-8aa3-ca628c69914e\") " pod="openstack/memcached-0" Oct 11 01:10:11 crc kubenswrapper[4743]: I1011 01:10:11.442445 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jbj8\" (UniqueName: \"kubernetes.io/projected/f1159224-8c5f-43ae-8aa3-ca628c69914e-kube-api-access-6jbj8\") pod \"memcached-0\" (UID: \"f1159224-8c5f-43ae-8aa3-ca628c69914e\") " pod="openstack/memcached-0" Oct 11 01:10:11 crc kubenswrapper[4743]: I1011 01:10:11.442495 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1159224-8c5f-43ae-8aa3-ca628c69914e-combined-ca-bundle\") pod \"memcached-0\" (UID: \"f1159224-8c5f-43ae-8aa3-ca628c69914e\") " pod="openstack/memcached-0" Oct 11 01:10:11 crc 
kubenswrapper[4743]: I1011 01:10:11.443222 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f1159224-8c5f-43ae-8aa3-ca628c69914e-config-data\") pod \"memcached-0\" (UID: \"f1159224-8c5f-43ae-8aa3-ca628c69914e\") " pod="openstack/memcached-0" Oct 11 01:10:11 crc kubenswrapper[4743]: I1011 01:10:11.443311 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f1159224-8c5f-43ae-8aa3-ca628c69914e-kolla-config\") pod \"memcached-0\" (UID: \"f1159224-8c5f-43ae-8aa3-ca628c69914e\") " pod="openstack/memcached-0" Oct 11 01:10:11 crc kubenswrapper[4743]: I1011 01:10:11.452552 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1159224-8c5f-43ae-8aa3-ca628c69914e-combined-ca-bundle\") pod \"memcached-0\" (UID: \"f1159224-8c5f-43ae-8aa3-ca628c69914e\") " pod="openstack/memcached-0" Oct 11 01:10:11 crc kubenswrapper[4743]: I1011 01:10:11.459984 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1159224-8c5f-43ae-8aa3-ca628c69914e-memcached-tls-certs\") pod \"memcached-0\" (UID: \"f1159224-8c5f-43ae-8aa3-ca628c69914e\") " pod="openstack/memcached-0" Oct 11 01:10:11 crc kubenswrapper[4743]: I1011 01:10:11.471385 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jbj8\" (UniqueName: \"kubernetes.io/projected/f1159224-8c5f-43ae-8aa3-ca628c69914e-kube-api-access-6jbj8\") pod \"memcached-0\" (UID: \"f1159224-8c5f-43ae-8aa3-ca628c69914e\") " pod="openstack/memcached-0" Oct 11 01:10:11 crc kubenswrapper[4743]: I1011 01:10:11.640713 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Oct 11 01:10:14 crc kubenswrapper[4743]: I1011 01:10:14.458237 4743 patch_prober.go:28] interesting pod/machine-config-daemon-cvm72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 01:10:14 crc kubenswrapper[4743]: I1011 01:10:14.458766 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 01:10:14 crc kubenswrapper[4743]: I1011 01:10:14.458812 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" Oct 11 01:10:14 crc kubenswrapper[4743]: I1011 01:10:14.459415 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d141abb12335a71090b8204b0a7206f68b485cc9db85f994938ef978a23ae624"} pod="openshift-machine-config-operator/machine-config-daemon-cvm72" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 11 01:10:14 crc kubenswrapper[4743]: I1011 01:10:14.459471 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" containerID="cri-o://d141abb12335a71090b8204b0a7206f68b485cc9db85f994938ef978a23ae624" gracePeriod=600 Oct 11 01:10:14 crc kubenswrapper[4743]: I1011 01:10:14.638382 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 11 
01:10:14 crc kubenswrapper[4743]: I1011 01:10:14.639677 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 11 01:10:14 crc kubenswrapper[4743]: I1011 01:10:14.645232 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-6bjrg" Oct 11 01:10:14 crc kubenswrapper[4743]: I1011 01:10:14.658296 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 11 01:10:14 crc kubenswrapper[4743]: I1011 01:10:14.705825 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45tn7\" (UniqueName: \"kubernetes.io/projected/5cbb33ea-578f-4987-94cf-d6bf069a2953-kube-api-access-45tn7\") pod \"kube-state-metrics-0\" (UID: \"5cbb33ea-578f-4987-94cf-d6bf069a2953\") " pod="openstack/kube-state-metrics-0" Oct 11 01:10:14 crc kubenswrapper[4743]: I1011 01:10:14.807887 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45tn7\" (UniqueName: \"kubernetes.io/projected/5cbb33ea-578f-4987-94cf-d6bf069a2953-kube-api-access-45tn7\") pod \"kube-state-metrics-0\" (UID: \"5cbb33ea-578f-4987-94cf-d6bf069a2953\") " pod="openstack/kube-state-metrics-0" Oct 11 01:10:14 crc kubenswrapper[4743]: I1011 01:10:14.848219 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45tn7\" (UniqueName: \"kubernetes.io/projected/5cbb33ea-578f-4987-94cf-d6bf069a2953-kube-api-access-45tn7\") pod \"kube-state-metrics-0\" (UID: \"5cbb33ea-578f-4987-94cf-d6bf069a2953\") " pod="openstack/kube-state-metrics-0" Oct 11 01:10:14 crc kubenswrapper[4743]: I1011 01:10:14.891567 4743 generic.go:334] "Generic (PLEG): container finished" podID="add92263-e252-446b-95de-092585b4357f" containerID="d141abb12335a71090b8204b0a7206f68b485cc9db85f994938ef978a23ae624" exitCode=0 Oct 11 01:10:14 crc kubenswrapper[4743]: I1011 01:10:14.891602 
4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" event={"ID":"add92263-e252-446b-95de-092585b4357f","Type":"ContainerDied","Data":"d141abb12335a71090b8204b0a7206f68b485cc9db85f994938ef978a23ae624"} Oct 11 01:10:14 crc kubenswrapper[4743]: I1011 01:10:14.891635 4743 scope.go:117] "RemoveContainer" containerID="06d63da6139508ac6d7d3ccf51eec7dcc1dbdfea0379b704f2d1844d8e86a974" Oct 11 01:10:14 crc kubenswrapper[4743]: I1011 01:10:14.982726 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 11 01:10:15 crc kubenswrapper[4743]: I1011 01:10:15.223246 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-ui-dashboards-6584dc9448-tzxl5"] Oct 11 01:10:15 crc kubenswrapper[4743]: I1011 01:10:15.224976 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-ui-dashboards-6584dc9448-tzxl5" Oct 11 01:10:15 crc kubenswrapper[4743]: I1011 01:10:15.227419 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-ui-dashboards" Oct 11 01:10:15 crc kubenswrapper[4743]: I1011 01:10:15.227816 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-ui-dashboards-sa-dockercfg-sx7pb" Oct 11 01:10:15 crc kubenswrapper[4743]: I1011 01:10:15.230315 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-ui-dashboards-6584dc9448-tzxl5"] Oct 11 01:10:15 crc kubenswrapper[4743]: I1011 01:10:15.316059 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4bbc3ff-d45e-46ac-a8fc-6b75e1f5d342-serving-cert\") pod \"observability-ui-dashboards-6584dc9448-tzxl5\" (UID: \"c4bbc3ff-d45e-46ac-a8fc-6b75e1f5d342\") " 
pod="openshift-operators/observability-ui-dashboards-6584dc9448-tzxl5" Oct 11 01:10:15 crc kubenswrapper[4743]: I1011 01:10:15.316102 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tn76x\" (UniqueName: \"kubernetes.io/projected/c4bbc3ff-d45e-46ac-a8fc-6b75e1f5d342-kube-api-access-tn76x\") pod \"observability-ui-dashboards-6584dc9448-tzxl5\" (UID: \"c4bbc3ff-d45e-46ac-a8fc-6b75e1f5d342\") " pod="openshift-operators/observability-ui-dashboards-6584dc9448-tzxl5" Oct 11 01:10:15 crc kubenswrapper[4743]: I1011 01:10:15.417409 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4bbc3ff-d45e-46ac-a8fc-6b75e1f5d342-serving-cert\") pod \"observability-ui-dashboards-6584dc9448-tzxl5\" (UID: \"c4bbc3ff-d45e-46ac-a8fc-6b75e1f5d342\") " pod="openshift-operators/observability-ui-dashboards-6584dc9448-tzxl5" Oct 11 01:10:15 crc kubenswrapper[4743]: I1011 01:10:15.417446 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tn76x\" (UniqueName: \"kubernetes.io/projected/c4bbc3ff-d45e-46ac-a8fc-6b75e1f5d342-kube-api-access-tn76x\") pod \"observability-ui-dashboards-6584dc9448-tzxl5\" (UID: \"c4bbc3ff-d45e-46ac-a8fc-6b75e1f5d342\") " pod="openshift-operators/observability-ui-dashboards-6584dc9448-tzxl5" Oct 11 01:10:15 crc kubenswrapper[4743]: E1011 01:10:15.417579 4743 secret.go:188] Couldn't get secret openshift-operators/observability-ui-dashboards: secret "observability-ui-dashboards" not found Oct 11 01:10:15 crc kubenswrapper[4743]: E1011 01:10:15.417658 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c4bbc3ff-d45e-46ac-a8fc-6b75e1f5d342-serving-cert podName:c4bbc3ff-d45e-46ac-a8fc-6b75e1f5d342 nodeName:}" failed. No retries permitted until 2025-10-11 01:10:15.917636404 +0000 UTC m=+1110.570616801 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/c4bbc3ff-d45e-46ac-a8fc-6b75e1f5d342-serving-cert") pod "observability-ui-dashboards-6584dc9448-tzxl5" (UID: "c4bbc3ff-d45e-46ac-a8fc-6b75e1f5d342") : secret "observability-ui-dashboards" not found Oct 11 01:10:15 crc kubenswrapper[4743]: I1011 01:10:15.442107 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tn76x\" (UniqueName: \"kubernetes.io/projected/c4bbc3ff-d45e-46ac-a8fc-6b75e1f5d342-kube-api-access-tn76x\") pod \"observability-ui-dashboards-6584dc9448-tzxl5\" (UID: \"c4bbc3ff-d45e-46ac-a8fc-6b75e1f5d342\") " pod="openshift-operators/observability-ui-dashboards-6584dc9448-tzxl5" Oct 11 01:10:15 crc kubenswrapper[4743]: I1011 01:10:15.687033 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-748c5b5875-pmrqh"] Oct 11 01:10:15 crc kubenswrapper[4743]: I1011 01:10:15.688092 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-748c5b5875-pmrqh" Oct 11 01:10:15 crc kubenswrapper[4743]: I1011 01:10:15.703674 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-748c5b5875-pmrqh"] Oct 11 01:10:15 crc kubenswrapper[4743]: I1011 01:10:15.721731 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b68870e2-fd28-4422-a232-b673325abeec-console-serving-cert\") pod \"console-748c5b5875-pmrqh\" (UID: \"b68870e2-fd28-4422-a232-b673325abeec\") " pod="openshift-console/console-748c5b5875-pmrqh" Oct 11 01:10:15 crc kubenswrapper[4743]: I1011 01:10:15.721771 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b68870e2-fd28-4422-a232-b673325abeec-service-ca\") pod \"console-748c5b5875-pmrqh\" (UID: \"b68870e2-fd28-4422-a232-b673325abeec\") " pod="openshift-console/console-748c5b5875-pmrqh" Oct 11 01:10:15 crc kubenswrapper[4743]: I1011 01:10:15.721827 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b68870e2-fd28-4422-a232-b673325abeec-console-oauth-config\") pod \"console-748c5b5875-pmrqh\" (UID: \"b68870e2-fd28-4422-a232-b673325abeec\") " pod="openshift-console/console-748c5b5875-pmrqh" Oct 11 01:10:15 crc kubenswrapper[4743]: I1011 01:10:15.721846 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b68870e2-fd28-4422-a232-b673325abeec-trusted-ca-bundle\") pod \"console-748c5b5875-pmrqh\" (UID: \"b68870e2-fd28-4422-a232-b673325abeec\") " pod="openshift-console/console-748c5b5875-pmrqh" Oct 11 01:10:15 crc kubenswrapper[4743]: I1011 01:10:15.721894 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b68870e2-fd28-4422-a232-b673325abeec-oauth-serving-cert\") pod \"console-748c5b5875-pmrqh\" (UID: \"b68870e2-fd28-4422-a232-b673325abeec\") " pod="openshift-console/console-748c5b5875-pmrqh" Oct 11 01:10:15 crc kubenswrapper[4743]: I1011 01:10:15.721914 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b68870e2-fd28-4422-a232-b673325abeec-console-config\") pod \"console-748c5b5875-pmrqh\" (UID: \"b68870e2-fd28-4422-a232-b673325abeec\") " pod="openshift-console/console-748c5b5875-pmrqh" Oct 11 01:10:15 crc kubenswrapper[4743]: I1011 01:10:15.721929 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dw4dk\" (UniqueName: \"kubernetes.io/projected/b68870e2-fd28-4422-a232-b673325abeec-kube-api-access-dw4dk\") pod \"console-748c5b5875-pmrqh\" (UID: \"b68870e2-fd28-4422-a232-b673325abeec\") " pod="openshift-console/console-748c5b5875-pmrqh" Oct 11 01:10:15 crc kubenswrapper[4743]: I1011 01:10:15.823139 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b68870e2-fd28-4422-a232-b673325abeec-console-oauth-config\") pod \"console-748c5b5875-pmrqh\" (UID: \"b68870e2-fd28-4422-a232-b673325abeec\") " pod="openshift-console/console-748c5b5875-pmrqh" Oct 11 01:10:15 crc kubenswrapper[4743]: I1011 01:10:15.823183 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b68870e2-fd28-4422-a232-b673325abeec-trusted-ca-bundle\") pod \"console-748c5b5875-pmrqh\" (UID: \"b68870e2-fd28-4422-a232-b673325abeec\") " pod="openshift-console/console-748c5b5875-pmrqh" Oct 11 01:10:15 crc kubenswrapper[4743]: I1011 
01:10:15.823221 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b68870e2-fd28-4422-a232-b673325abeec-oauth-serving-cert\") pod \"console-748c5b5875-pmrqh\" (UID: \"b68870e2-fd28-4422-a232-b673325abeec\") " pod="openshift-console/console-748c5b5875-pmrqh" Oct 11 01:10:15 crc kubenswrapper[4743]: I1011 01:10:15.823242 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dw4dk\" (UniqueName: \"kubernetes.io/projected/b68870e2-fd28-4422-a232-b673325abeec-kube-api-access-dw4dk\") pod \"console-748c5b5875-pmrqh\" (UID: \"b68870e2-fd28-4422-a232-b673325abeec\") " pod="openshift-console/console-748c5b5875-pmrqh" Oct 11 01:10:15 crc kubenswrapper[4743]: I1011 01:10:15.823259 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b68870e2-fd28-4422-a232-b673325abeec-console-config\") pod \"console-748c5b5875-pmrqh\" (UID: \"b68870e2-fd28-4422-a232-b673325abeec\") " pod="openshift-console/console-748c5b5875-pmrqh" Oct 11 01:10:15 crc kubenswrapper[4743]: I1011 01:10:15.823346 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b68870e2-fd28-4422-a232-b673325abeec-console-serving-cert\") pod \"console-748c5b5875-pmrqh\" (UID: \"b68870e2-fd28-4422-a232-b673325abeec\") " pod="openshift-console/console-748c5b5875-pmrqh" Oct 11 01:10:15 crc kubenswrapper[4743]: I1011 01:10:15.823363 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b68870e2-fd28-4422-a232-b673325abeec-service-ca\") pod \"console-748c5b5875-pmrqh\" (UID: \"b68870e2-fd28-4422-a232-b673325abeec\") " pod="openshift-console/console-748c5b5875-pmrqh" Oct 11 01:10:15 crc kubenswrapper[4743]: I1011 01:10:15.824228 4743 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b68870e2-fd28-4422-a232-b673325abeec-oauth-serving-cert\") pod \"console-748c5b5875-pmrqh\" (UID: \"b68870e2-fd28-4422-a232-b673325abeec\") " pod="openshift-console/console-748c5b5875-pmrqh" Oct 11 01:10:15 crc kubenswrapper[4743]: I1011 01:10:15.824295 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b68870e2-fd28-4422-a232-b673325abeec-service-ca\") pod \"console-748c5b5875-pmrqh\" (UID: \"b68870e2-fd28-4422-a232-b673325abeec\") " pod="openshift-console/console-748c5b5875-pmrqh" Oct 11 01:10:15 crc kubenswrapper[4743]: I1011 01:10:15.824418 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b68870e2-fd28-4422-a232-b673325abeec-trusted-ca-bundle\") pod \"console-748c5b5875-pmrqh\" (UID: \"b68870e2-fd28-4422-a232-b673325abeec\") " pod="openshift-console/console-748c5b5875-pmrqh" Oct 11 01:10:15 crc kubenswrapper[4743]: I1011 01:10:15.824487 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b68870e2-fd28-4422-a232-b673325abeec-console-config\") pod \"console-748c5b5875-pmrqh\" (UID: \"b68870e2-fd28-4422-a232-b673325abeec\") " pod="openshift-console/console-748c5b5875-pmrqh" Oct 11 01:10:15 crc kubenswrapper[4743]: I1011 01:10:15.830410 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b68870e2-fd28-4422-a232-b673325abeec-console-oauth-config\") pod \"console-748c5b5875-pmrqh\" (UID: \"b68870e2-fd28-4422-a232-b673325abeec\") " pod="openshift-console/console-748c5b5875-pmrqh" Oct 11 01:10:15 crc kubenswrapper[4743]: I1011 01:10:15.832522 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b68870e2-fd28-4422-a232-b673325abeec-console-serving-cert\") pod \"console-748c5b5875-pmrqh\" (UID: \"b68870e2-fd28-4422-a232-b673325abeec\") " pod="openshift-console/console-748c5b5875-pmrqh" Oct 11 01:10:15 crc kubenswrapper[4743]: I1011 01:10:15.858608 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dw4dk\" (UniqueName: \"kubernetes.io/projected/b68870e2-fd28-4422-a232-b673325abeec-kube-api-access-dw4dk\") pod \"console-748c5b5875-pmrqh\" (UID: \"b68870e2-fd28-4422-a232-b673325abeec\") " pod="openshift-console/console-748c5b5875-pmrqh" Oct 11 01:10:15 crc kubenswrapper[4743]: I1011 01:10:15.905261 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 11 01:10:15 crc kubenswrapper[4743]: I1011 01:10:15.907143 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 11 01:10:15 crc kubenswrapper[4743]: I1011 01:10:15.910622 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Oct 11 01:10:15 crc kubenswrapper[4743]: I1011 01:10:15.910633 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Oct 11 01:10:15 crc kubenswrapper[4743]: I1011 01:10:15.915995 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-tkdpc" Oct 11 01:10:15 crc kubenswrapper[4743]: I1011 01:10:15.916203 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Oct 11 01:10:15 crc kubenswrapper[4743]: I1011 01:10:15.918221 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Oct 11 01:10:15 crc kubenswrapper[4743]: I1011 01:10:15.923226 4743 
reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Oct 11 01:10:15 crc kubenswrapper[4743]: I1011 01:10:15.925661 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4bbc3ff-d45e-46ac-a8fc-6b75e1f5d342-serving-cert\") pod \"observability-ui-dashboards-6584dc9448-tzxl5\" (UID: \"c4bbc3ff-d45e-46ac-a8fc-6b75e1f5d342\") " pod="openshift-operators/observability-ui-dashboards-6584dc9448-tzxl5" Oct 11 01:10:15 crc kubenswrapper[4743]: I1011 01:10:15.952839 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 11 01:10:15 crc kubenswrapper[4743]: I1011 01:10:15.959802 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4bbc3ff-d45e-46ac-a8fc-6b75e1f5d342-serving-cert\") pod \"observability-ui-dashboards-6584dc9448-tzxl5\" (UID: \"c4bbc3ff-d45e-46ac-a8fc-6b75e1f5d342\") " pod="openshift-operators/observability-ui-dashboards-6584dc9448-tzxl5" Oct 11 01:10:16 crc kubenswrapper[4743]: I1011 01:10:16.026763 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e1a1779f-127f-4ea2-a937-b97f329e3878-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"e1a1779f-127f-4ea2-a937-b97f329e3878\") " pod="openstack/prometheus-metric-storage-0" Oct 11 01:10:16 crc kubenswrapper[4743]: I1011 01:10:16.026811 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e1a1779f-127f-4ea2-a937-b97f329e3878-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"e1a1779f-127f-4ea2-a937-b97f329e3878\") " pod="openstack/prometheus-metric-storage-0" Oct 11 01:10:16 crc kubenswrapper[4743]: I1011 01:10:16.026838 4743 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e1a1779f-127f-4ea2-a937-b97f329e3878-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"e1a1779f-127f-4ea2-a937-b97f329e3878\") " pod="openstack/prometheus-metric-storage-0" Oct 11 01:10:16 crc kubenswrapper[4743]: I1011 01:10:16.026933 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e1a1779f-127f-4ea2-a937-b97f329e3878-config\") pod \"prometheus-metric-storage-0\" (UID: \"e1a1779f-127f-4ea2-a937-b97f329e3878\") " pod="openstack/prometheus-metric-storage-0" Oct 11 01:10:16 crc kubenswrapper[4743]: I1011 01:10:16.026965 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gsk9\" (UniqueName: \"kubernetes.io/projected/e1a1779f-127f-4ea2-a937-b97f329e3878-kube-api-access-6gsk9\") pod \"prometheus-metric-storage-0\" (UID: \"e1a1779f-127f-4ea2-a937-b97f329e3878\") " pod="openstack/prometheus-metric-storage-0" Oct 11 01:10:16 crc kubenswrapper[4743]: I1011 01:10:16.027105 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-5f28cf97-999f-4f71-812b-e7f9e9accdc6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5f28cf97-999f-4f71-812b-e7f9e9accdc6\") pod \"prometheus-metric-storage-0\" (UID: \"e1a1779f-127f-4ea2-a937-b97f329e3878\") " pod="openstack/prometheus-metric-storage-0" Oct 11 01:10:16 crc kubenswrapper[4743]: I1011 01:10:16.027150 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e1a1779f-127f-4ea2-a937-b97f329e3878-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: 
\"e1a1779f-127f-4ea2-a937-b97f329e3878\") " pod="openstack/prometheus-metric-storage-0" Oct 11 01:10:16 crc kubenswrapper[4743]: I1011 01:10:16.027178 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e1a1779f-127f-4ea2-a937-b97f329e3878-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"e1a1779f-127f-4ea2-a937-b97f329e3878\") " pod="openstack/prometheus-metric-storage-0" Oct 11 01:10:16 crc kubenswrapper[4743]: I1011 01:10:16.044389 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-748c5b5875-pmrqh" Oct 11 01:10:16 crc kubenswrapper[4743]: I1011 01:10:16.131942 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-5f28cf97-999f-4f71-812b-e7f9e9accdc6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5f28cf97-999f-4f71-812b-e7f9e9accdc6\") pod \"prometheus-metric-storage-0\" (UID: \"e1a1779f-127f-4ea2-a937-b97f329e3878\") " pod="openstack/prometheus-metric-storage-0" Oct 11 01:10:16 crc kubenswrapper[4743]: I1011 01:10:16.131997 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e1a1779f-127f-4ea2-a937-b97f329e3878-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"e1a1779f-127f-4ea2-a937-b97f329e3878\") " pod="openstack/prometheus-metric-storage-0" Oct 11 01:10:16 crc kubenswrapper[4743]: I1011 01:10:16.132019 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e1a1779f-127f-4ea2-a937-b97f329e3878-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"e1a1779f-127f-4ea2-a937-b97f329e3878\") " pod="openstack/prometheus-metric-storage-0" Oct 11 01:10:16 crc kubenswrapper[4743]: I1011 01:10:16.132049 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e1a1779f-127f-4ea2-a937-b97f329e3878-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"e1a1779f-127f-4ea2-a937-b97f329e3878\") " pod="openstack/prometheus-metric-storage-0" Oct 11 01:10:16 crc kubenswrapper[4743]: I1011 01:10:16.132066 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e1a1779f-127f-4ea2-a937-b97f329e3878-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"e1a1779f-127f-4ea2-a937-b97f329e3878\") " pod="openstack/prometheus-metric-storage-0" Oct 11 01:10:16 crc kubenswrapper[4743]: I1011 01:10:16.132087 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e1a1779f-127f-4ea2-a937-b97f329e3878-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"e1a1779f-127f-4ea2-a937-b97f329e3878\") " pod="openstack/prometheus-metric-storage-0" Oct 11 01:10:16 crc kubenswrapper[4743]: I1011 01:10:16.132108 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e1a1779f-127f-4ea2-a937-b97f329e3878-config\") pod \"prometheus-metric-storage-0\" (UID: \"e1a1779f-127f-4ea2-a937-b97f329e3878\") " pod="openstack/prometheus-metric-storage-0" Oct 11 01:10:16 crc kubenswrapper[4743]: I1011 01:10:16.132129 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gsk9\" (UniqueName: \"kubernetes.io/projected/e1a1779f-127f-4ea2-a937-b97f329e3878-kube-api-access-6gsk9\") pod \"prometheus-metric-storage-0\" (UID: \"e1a1779f-127f-4ea2-a937-b97f329e3878\") " pod="openstack/prometheus-metric-storage-0" Oct 11 01:10:16 crc kubenswrapper[4743]: I1011 01:10:16.135605 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e1a1779f-127f-4ea2-a937-b97f329e3878-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"e1a1779f-127f-4ea2-a937-b97f329e3878\") " pod="openstack/prometheus-metric-storage-0" Oct 11 01:10:16 crc kubenswrapper[4743]: I1011 01:10:16.142326 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e1a1779f-127f-4ea2-a937-b97f329e3878-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"e1a1779f-127f-4ea2-a937-b97f329e3878\") " pod="openstack/prometheus-metric-storage-0" Oct 11 01:10:16 crc kubenswrapper[4743]: I1011 01:10:16.143048 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e1a1779f-127f-4ea2-a937-b97f329e3878-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"e1a1779f-127f-4ea2-a937-b97f329e3878\") " pod="openstack/prometheus-metric-storage-0" Oct 11 01:10:16 crc kubenswrapper[4743]: I1011 01:10:16.143821 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e1a1779f-127f-4ea2-a937-b97f329e3878-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"e1a1779f-127f-4ea2-a937-b97f329e3878\") " pod="openstack/prometheus-metric-storage-0" Oct 11 01:10:16 crc kubenswrapper[4743]: I1011 01:10:16.144408 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e1a1779f-127f-4ea2-a937-b97f329e3878-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"e1a1779f-127f-4ea2-a937-b97f329e3878\") " pod="openstack/prometheus-metric-storage-0" Oct 11 01:10:16 crc kubenswrapper[4743]: I1011 01:10:16.145762 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-ui-dashboards-6584dc9448-tzxl5" Oct 11 01:10:16 crc kubenswrapper[4743]: I1011 01:10:16.151263 4743 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 11 01:10:16 crc kubenswrapper[4743]: I1011 01:10:16.151307 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gsk9\" (UniqueName: \"kubernetes.io/projected/e1a1779f-127f-4ea2-a937-b97f329e3878-kube-api-access-6gsk9\") pod \"prometheus-metric-storage-0\" (UID: \"e1a1779f-127f-4ea2-a937-b97f329e3878\") " pod="openstack/prometheus-metric-storage-0" Oct 11 01:10:16 crc kubenswrapper[4743]: I1011 01:10:16.151314 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-5f28cf97-999f-4f71-812b-e7f9e9accdc6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5f28cf97-999f-4f71-812b-e7f9e9accdc6\") pod \"prometheus-metric-storage-0\" (UID: \"e1a1779f-127f-4ea2-a937-b97f329e3878\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/20eacae98d4e3ade30240db8ae2c9a452ab5c4cf715521e04f6c7bc8a9fb59e6/globalmount\"" pod="openstack/prometheus-metric-storage-0" Oct 11 01:10:16 crc kubenswrapper[4743]: I1011 01:10:16.191342 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e1a1779f-127f-4ea2-a937-b97f329e3878-config\") pod \"prometheus-metric-storage-0\" (UID: \"e1a1779f-127f-4ea2-a937-b97f329e3878\") " pod="openstack/prometheus-metric-storage-0" Oct 11 01:10:16 crc kubenswrapper[4743]: I1011 01:10:16.217451 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-5f28cf97-999f-4f71-812b-e7f9e9accdc6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5f28cf97-999f-4f71-812b-e7f9e9accdc6\") pod \"prometheus-metric-storage-0\" (UID: 
\"e1a1779f-127f-4ea2-a937-b97f329e3878\") " pod="openstack/prometheus-metric-storage-0" Oct 11 01:10:16 crc kubenswrapper[4743]: I1011 01:10:16.307480 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 11 01:10:17 crc kubenswrapper[4743]: I1011 01:10:17.474895 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-mwtxs"] Oct 11 01:10:17 crc kubenswrapper[4743]: I1011 01:10:17.476302 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-mwtxs" Oct 11 01:10:17 crc kubenswrapper[4743]: I1011 01:10:17.479488 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Oct 11 01:10:17 crc kubenswrapper[4743]: I1011 01:10:17.479602 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-5xxql" Oct 11 01:10:17 crc kubenswrapper[4743]: I1011 01:10:17.479603 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Oct 11 01:10:17 crc kubenswrapper[4743]: I1011 01:10:17.483786 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-mwtxs"] Oct 11 01:10:17 crc kubenswrapper[4743]: I1011 01:10:17.543573 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-6g6xb"] Oct 11 01:10:17 crc kubenswrapper[4743]: I1011 01:10:17.545579 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-6g6xb" Oct 11 01:10:17 crc kubenswrapper[4743]: I1011 01:10:17.560167 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ab33e99-afb5-4b67-89bd-a2eb540bf194-ovn-controller-tls-certs\") pod \"ovn-controller-mwtxs\" (UID: \"1ab33e99-afb5-4b67-89bd-a2eb540bf194\") " pod="openstack/ovn-controller-mwtxs" Oct 11 01:10:17 crc kubenswrapper[4743]: I1011 01:10:17.560202 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ab33e99-afb5-4b67-89bd-a2eb540bf194-combined-ca-bundle\") pod \"ovn-controller-mwtxs\" (UID: \"1ab33e99-afb5-4b67-89bd-a2eb540bf194\") " pod="openstack/ovn-controller-mwtxs" Oct 11 01:10:17 crc kubenswrapper[4743]: I1011 01:10:17.560346 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1ab33e99-afb5-4b67-89bd-a2eb540bf194-var-log-ovn\") pod \"ovn-controller-mwtxs\" (UID: \"1ab33e99-afb5-4b67-89bd-a2eb540bf194\") " pod="openstack/ovn-controller-mwtxs" Oct 11 01:10:17 crc kubenswrapper[4743]: I1011 01:10:17.560379 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1ab33e99-afb5-4b67-89bd-a2eb540bf194-var-run\") pod \"ovn-controller-mwtxs\" (UID: \"1ab33e99-afb5-4b67-89bd-a2eb540bf194\") " pod="openstack/ovn-controller-mwtxs" Oct 11 01:10:17 crc kubenswrapper[4743]: I1011 01:10:17.560407 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1ab33e99-afb5-4b67-89bd-a2eb540bf194-var-run-ovn\") pod \"ovn-controller-mwtxs\" (UID: \"1ab33e99-afb5-4b67-89bd-a2eb540bf194\") " 
pod="openstack/ovn-controller-mwtxs" Oct 11 01:10:17 crc kubenswrapper[4743]: I1011 01:10:17.560423 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1ab33e99-afb5-4b67-89bd-a2eb540bf194-scripts\") pod \"ovn-controller-mwtxs\" (UID: \"1ab33e99-afb5-4b67-89bd-a2eb540bf194\") " pod="openstack/ovn-controller-mwtxs" Oct 11 01:10:17 crc kubenswrapper[4743]: I1011 01:10:17.560462 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dm8cz\" (UniqueName: \"kubernetes.io/projected/1ab33e99-afb5-4b67-89bd-a2eb540bf194-kube-api-access-dm8cz\") pod \"ovn-controller-mwtxs\" (UID: \"1ab33e99-afb5-4b67-89bd-a2eb540bf194\") " pod="openstack/ovn-controller-mwtxs" Oct 11 01:10:17 crc kubenswrapper[4743]: I1011 01:10:17.565675 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-6g6xb"] Oct 11 01:10:17 crc kubenswrapper[4743]: I1011 01:10:17.661692 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ba2a34bc-8581-47ba-a9a9-5ba1ce8b6387-scripts\") pod \"ovn-controller-ovs-6g6xb\" (UID: \"ba2a34bc-8581-47ba-a9a9-5ba1ce8b6387\") " pod="openstack/ovn-controller-ovs-6g6xb" Oct 11 01:10:17 crc kubenswrapper[4743]: I1011 01:10:17.661816 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1ab33e99-afb5-4b67-89bd-a2eb540bf194-var-log-ovn\") pod \"ovn-controller-mwtxs\" (UID: \"1ab33e99-afb5-4b67-89bd-a2eb540bf194\") " pod="openstack/ovn-controller-mwtxs" Oct 11 01:10:17 crc kubenswrapper[4743]: I1011 01:10:17.661886 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1ab33e99-afb5-4b67-89bd-a2eb540bf194-var-run\") pod 
\"ovn-controller-mwtxs\" (UID: \"1ab33e99-afb5-4b67-89bd-a2eb540bf194\") " pod="openstack/ovn-controller-mwtxs" Oct 11 01:10:17 crc kubenswrapper[4743]: I1011 01:10:17.661935 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dm8cz\" (UniqueName: \"kubernetes.io/projected/1ab33e99-afb5-4b67-89bd-a2eb540bf194-kube-api-access-dm8cz\") pod \"ovn-controller-mwtxs\" (UID: \"1ab33e99-afb5-4b67-89bd-a2eb540bf194\") " pod="openstack/ovn-controller-mwtxs" Oct 11 01:10:17 crc kubenswrapper[4743]: I1011 01:10:17.661961 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1ab33e99-afb5-4b67-89bd-a2eb540bf194-var-run-ovn\") pod \"ovn-controller-mwtxs\" (UID: \"1ab33e99-afb5-4b67-89bd-a2eb540bf194\") " pod="openstack/ovn-controller-mwtxs" Oct 11 01:10:17 crc kubenswrapper[4743]: I1011 01:10:17.661983 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1ab33e99-afb5-4b67-89bd-a2eb540bf194-scripts\") pod \"ovn-controller-mwtxs\" (UID: \"1ab33e99-afb5-4b67-89bd-a2eb540bf194\") " pod="openstack/ovn-controller-mwtxs" Oct 11 01:10:17 crc kubenswrapper[4743]: I1011 01:10:17.662042 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/ba2a34bc-8581-47ba-a9a9-5ba1ce8b6387-var-log\") pod \"ovn-controller-ovs-6g6xb\" (UID: \"ba2a34bc-8581-47ba-a9a9-5ba1ce8b6387\") " pod="openstack/ovn-controller-ovs-6g6xb" Oct 11 01:10:17 crc kubenswrapper[4743]: I1011 01:10:17.662098 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ba2a34bc-8581-47ba-a9a9-5ba1ce8b6387-var-run\") pod \"ovn-controller-ovs-6g6xb\" (UID: \"ba2a34bc-8581-47ba-a9a9-5ba1ce8b6387\") " pod="openstack/ovn-controller-ovs-6g6xb" 
Oct 11 01:10:17 crc kubenswrapper[4743]: I1011 01:10:17.662126 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/ba2a34bc-8581-47ba-a9a9-5ba1ce8b6387-var-lib\") pod \"ovn-controller-ovs-6g6xb\" (UID: \"ba2a34bc-8581-47ba-a9a9-5ba1ce8b6387\") " pod="openstack/ovn-controller-ovs-6g6xb" Oct 11 01:10:17 crc kubenswrapper[4743]: I1011 01:10:17.662195 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/ba2a34bc-8581-47ba-a9a9-5ba1ce8b6387-etc-ovs\") pod \"ovn-controller-ovs-6g6xb\" (UID: \"ba2a34bc-8581-47ba-a9a9-5ba1ce8b6387\") " pod="openstack/ovn-controller-ovs-6g6xb" Oct 11 01:10:17 crc kubenswrapper[4743]: I1011 01:10:17.662236 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwxlr\" (UniqueName: \"kubernetes.io/projected/ba2a34bc-8581-47ba-a9a9-5ba1ce8b6387-kube-api-access-vwxlr\") pod \"ovn-controller-ovs-6g6xb\" (UID: \"ba2a34bc-8581-47ba-a9a9-5ba1ce8b6387\") " pod="openstack/ovn-controller-ovs-6g6xb" Oct 11 01:10:17 crc kubenswrapper[4743]: I1011 01:10:17.662306 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ab33e99-afb5-4b67-89bd-a2eb540bf194-ovn-controller-tls-certs\") pod \"ovn-controller-mwtxs\" (UID: \"1ab33e99-afb5-4b67-89bd-a2eb540bf194\") " pod="openstack/ovn-controller-mwtxs" Oct 11 01:10:17 crc kubenswrapper[4743]: I1011 01:10:17.662348 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ab33e99-afb5-4b67-89bd-a2eb540bf194-combined-ca-bundle\") pod \"ovn-controller-mwtxs\" (UID: \"1ab33e99-afb5-4b67-89bd-a2eb540bf194\") " pod="openstack/ovn-controller-mwtxs" Oct 11 01:10:17 crc kubenswrapper[4743]: 
I1011 01:10:17.662619 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1ab33e99-afb5-4b67-89bd-a2eb540bf194-var-run\") pod \"ovn-controller-mwtxs\" (UID: \"1ab33e99-afb5-4b67-89bd-a2eb540bf194\") " pod="openstack/ovn-controller-mwtxs" Oct 11 01:10:17 crc kubenswrapper[4743]: I1011 01:10:17.663044 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1ab33e99-afb5-4b67-89bd-a2eb540bf194-var-log-ovn\") pod \"ovn-controller-mwtxs\" (UID: \"1ab33e99-afb5-4b67-89bd-a2eb540bf194\") " pod="openstack/ovn-controller-mwtxs" Oct 11 01:10:17 crc kubenswrapper[4743]: I1011 01:10:17.663042 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1ab33e99-afb5-4b67-89bd-a2eb540bf194-var-run-ovn\") pod \"ovn-controller-mwtxs\" (UID: \"1ab33e99-afb5-4b67-89bd-a2eb540bf194\") " pod="openstack/ovn-controller-mwtxs" Oct 11 01:10:17 crc kubenswrapper[4743]: I1011 01:10:17.664843 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1ab33e99-afb5-4b67-89bd-a2eb540bf194-scripts\") pod \"ovn-controller-mwtxs\" (UID: \"1ab33e99-afb5-4b67-89bd-a2eb540bf194\") " pod="openstack/ovn-controller-mwtxs" Oct 11 01:10:17 crc kubenswrapper[4743]: I1011 01:10:17.668806 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ab33e99-afb5-4b67-89bd-a2eb540bf194-combined-ca-bundle\") pod \"ovn-controller-mwtxs\" (UID: \"1ab33e99-afb5-4b67-89bd-a2eb540bf194\") " pod="openstack/ovn-controller-mwtxs" Oct 11 01:10:17 crc kubenswrapper[4743]: I1011 01:10:17.676290 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/1ab33e99-afb5-4b67-89bd-a2eb540bf194-ovn-controller-tls-certs\") pod \"ovn-controller-mwtxs\" (UID: \"1ab33e99-afb5-4b67-89bd-a2eb540bf194\") " pod="openstack/ovn-controller-mwtxs" Oct 11 01:10:17 crc kubenswrapper[4743]: I1011 01:10:17.677874 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dm8cz\" (UniqueName: \"kubernetes.io/projected/1ab33e99-afb5-4b67-89bd-a2eb540bf194-kube-api-access-dm8cz\") pod \"ovn-controller-mwtxs\" (UID: \"1ab33e99-afb5-4b67-89bd-a2eb540bf194\") " pod="openstack/ovn-controller-mwtxs" Oct 11 01:10:17 crc kubenswrapper[4743]: I1011 01:10:17.764029 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ba2a34bc-8581-47ba-a9a9-5ba1ce8b6387-scripts\") pod \"ovn-controller-ovs-6g6xb\" (UID: \"ba2a34bc-8581-47ba-a9a9-5ba1ce8b6387\") " pod="openstack/ovn-controller-ovs-6g6xb" Oct 11 01:10:17 crc kubenswrapper[4743]: I1011 01:10:17.764098 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/ba2a34bc-8581-47ba-a9a9-5ba1ce8b6387-var-log\") pod \"ovn-controller-ovs-6g6xb\" (UID: \"ba2a34bc-8581-47ba-a9a9-5ba1ce8b6387\") " pod="openstack/ovn-controller-ovs-6g6xb" Oct 11 01:10:17 crc kubenswrapper[4743]: I1011 01:10:17.764132 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ba2a34bc-8581-47ba-a9a9-5ba1ce8b6387-var-run\") pod \"ovn-controller-ovs-6g6xb\" (UID: \"ba2a34bc-8581-47ba-a9a9-5ba1ce8b6387\") " pod="openstack/ovn-controller-ovs-6g6xb" Oct 11 01:10:17 crc kubenswrapper[4743]: I1011 01:10:17.764156 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/ba2a34bc-8581-47ba-a9a9-5ba1ce8b6387-var-lib\") pod \"ovn-controller-ovs-6g6xb\" (UID: 
\"ba2a34bc-8581-47ba-a9a9-5ba1ce8b6387\") " pod="openstack/ovn-controller-ovs-6g6xb" Oct 11 01:10:17 crc kubenswrapper[4743]: I1011 01:10:17.764188 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/ba2a34bc-8581-47ba-a9a9-5ba1ce8b6387-etc-ovs\") pod \"ovn-controller-ovs-6g6xb\" (UID: \"ba2a34bc-8581-47ba-a9a9-5ba1ce8b6387\") " pod="openstack/ovn-controller-ovs-6g6xb" Oct 11 01:10:17 crc kubenswrapper[4743]: I1011 01:10:17.764233 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwxlr\" (UniqueName: \"kubernetes.io/projected/ba2a34bc-8581-47ba-a9a9-5ba1ce8b6387-kube-api-access-vwxlr\") pod \"ovn-controller-ovs-6g6xb\" (UID: \"ba2a34bc-8581-47ba-a9a9-5ba1ce8b6387\") " pod="openstack/ovn-controller-ovs-6g6xb" Oct 11 01:10:17 crc kubenswrapper[4743]: I1011 01:10:17.764299 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ba2a34bc-8581-47ba-a9a9-5ba1ce8b6387-var-run\") pod \"ovn-controller-ovs-6g6xb\" (UID: \"ba2a34bc-8581-47ba-a9a9-5ba1ce8b6387\") " pod="openstack/ovn-controller-ovs-6g6xb" Oct 11 01:10:17 crc kubenswrapper[4743]: I1011 01:10:17.764405 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/ba2a34bc-8581-47ba-a9a9-5ba1ce8b6387-var-lib\") pod \"ovn-controller-ovs-6g6xb\" (UID: \"ba2a34bc-8581-47ba-a9a9-5ba1ce8b6387\") " pod="openstack/ovn-controller-ovs-6g6xb" Oct 11 01:10:17 crc kubenswrapper[4743]: I1011 01:10:17.764493 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/ba2a34bc-8581-47ba-a9a9-5ba1ce8b6387-etc-ovs\") pod \"ovn-controller-ovs-6g6xb\" (UID: \"ba2a34bc-8581-47ba-a9a9-5ba1ce8b6387\") " pod="openstack/ovn-controller-ovs-6g6xb" Oct 11 01:10:17 crc kubenswrapper[4743]: I1011 01:10:17.764538 4743 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/ba2a34bc-8581-47ba-a9a9-5ba1ce8b6387-var-log\") pod \"ovn-controller-ovs-6g6xb\" (UID: \"ba2a34bc-8581-47ba-a9a9-5ba1ce8b6387\") " pod="openstack/ovn-controller-ovs-6g6xb" Oct 11 01:10:17 crc kubenswrapper[4743]: I1011 01:10:17.766077 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ba2a34bc-8581-47ba-a9a9-5ba1ce8b6387-scripts\") pod \"ovn-controller-ovs-6g6xb\" (UID: \"ba2a34bc-8581-47ba-a9a9-5ba1ce8b6387\") " pod="openstack/ovn-controller-ovs-6g6xb" Oct 11 01:10:17 crc kubenswrapper[4743]: I1011 01:10:17.778632 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwxlr\" (UniqueName: \"kubernetes.io/projected/ba2a34bc-8581-47ba-a9a9-5ba1ce8b6387-kube-api-access-vwxlr\") pod \"ovn-controller-ovs-6g6xb\" (UID: \"ba2a34bc-8581-47ba-a9a9-5ba1ce8b6387\") " pod="openstack/ovn-controller-ovs-6g6xb" Oct 11 01:10:17 crc kubenswrapper[4743]: I1011 01:10:17.799052 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-mwtxs" Oct 11 01:10:17 crc kubenswrapper[4743]: I1011 01:10:17.867834 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-6g6xb" Oct 11 01:10:21 crc kubenswrapper[4743]: I1011 01:10:21.574123 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 11 01:10:21 crc kubenswrapper[4743]: I1011 01:10:21.576280 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 11 01:10:21 crc kubenswrapper[4743]: I1011 01:10:21.585300 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Oct 11 01:10:21 crc kubenswrapper[4743]: I1011 01:10:21.585704 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Oct 11 01:10:21 crc kubenswrapper[4743]: I1011 01:10:21.586840 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Oct 11 01:10:21 crc kubenswrapper[4743]: I1011 01:10:21.587153 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-w9p2t" Oct 11 01:10:21 crc kubenswrapper[4743]: I1011 01:10:21.587292 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Oct 11 01:10:21 crc kubenswrapper[4743]: I1011 01:10:21.594625 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 11 01:10:21 crc kubenswrapper[4743]: I1011 01:10:21.750321 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzlgt\" (UniqueName: \"kubernetes.io/projected/7c8284ba-a2b2-4f9f-a692-b372e8294d6b-kube-api-access-vzlgt\") pod \"ovsdbserver-nb-0\" (UID: \"7c8284ba-a2b2-4f9f-a692-b372e8294d6b\") " pod="openstack/ovsdbserver-nb-0" Oct 11 01:10:21 crc kubenswrapper[4743]: I1011 01:10:21.750370 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c8284ba-a2b2-4f9f-a692-b372e8294d6b-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"7c8284ba-a2b2-4f9f-a692-b372e8294d6b\") " pod="openstack/ovsdbserver-nb-0" Oct 11 01:10:21 crc kubenswrapper[4743]: I1011 01:10:21.750531 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"7c8284ba-a2b2-4f9f-a692-b372e8294d6b\") " pod="openstack/ovsdbserver-nb-0" Oct 11 01:10:21 crc kubenswrapper[4743]: I1011 01:10:21.750574 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7c8284ba-a2b2-4f9f-a692-b372e8294d6b-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"7c8284ba-a2b2-4f9f-a692-b372e8294d6b\") " pod="openstack/ovsdbserver-nb-0" Oct 11 01:10:21 crc kubenswrapper[4743]: I1011 01:10:21.750624 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c8284ba-a2b2-4f9f-a692-b372e8294d6b-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"7c8284ba-a2b2-4f9f-a692-b372e8294d6b\") " pod="openstack/ovsdbserver-nb-0" Oct 11 01:10:21 crc kubenswrapper[4743]: I1011 01:10:21.750653 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7c8284ba-a2b2-4f9f-a692-b372e8294d6b-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"7c8284ba-a2b2-4f9f-a692-b372e8294d6b\") " pod="openstack/ovsdbserver-nb-0" Oct 11 01:10:21 crc kubenswrapper[4743]: I1011 01:10:21.750772 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c8284ba-a2b2-4f9f-a692-b372e8294d6b-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"7c8284ba-a2b2-4f9f-a692-b372e8294d6b\") " pod="openstack/ovsdbserver-nb-0" Oct 11 01:10:21 crc kubenswrapper[4743]: I1011 01:10:21.750832 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7c8284ba-a2b2-4f9f-a692-b372e8294d6b-config\") pod \"ovsdbserver-nb-0\" (UID: \"7c8284ba-a2b2-4f9f-a692-b372e8294d6b\") " pod="openstack/ovsdbserver-nb-0" Oct 11 01:10:21 crc kubenswrapper[4743]: E1011 01:10:21.841056 4743 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 11 01:10:21 crc kubenswrapper[4743]: E1011 01:10:21.841274 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bx5gc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:
&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-bcz4v_openstack(3ec33fb3-c2bd-4887-ad14-7aadfa9faf3c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 11 01:10:21 crc kubenswrapper[4743]: E1011 01:10:21.842617 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-bcz4v" podUID="3ec33fb3-c2bd-4887-ad14-7aadfa9faf3c" Oct 11 01:10:21 crc kubenswrapper[4743]: I1011 01:10:21.846487 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 11 01:10:21 crc kubenswrapper[4743]: I1011 01:10:21.848901 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 11 01:10:21 crc kubenswrapper[4743]: I1011 01:10:21.851661 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Oct 11 01:10:21 crc kubenswrapper[4743]: I1011 01:10:21.851904 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Oct 11 01:10:21 crc kubenswrapper[4743]: I1011 01:10:21.852134 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-z9kwm" Oct 11 01:10:21 crc kubenswrapper[4743]: I1011 01:10:21.852285 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Oct 11 01:10:21 crc kubenswrapper[4743]: I1011 01:10:21.854155 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c8284ba-a2b2-4f9f-a692-b372e8294d6b-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"7c8284ba-a2b2-4f9f-a692-b372e8294d6b\") " pod="openstack/ovsdbserver-nb-0" Oct 11 01:10:21 crc kubenswrapper[4743]: I1011 01:10:21.854196 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7c8284ba-a2b2-4f9f-a692-b372e8294d6b-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"7c8284ba-a2b2-4f9f-a692-b372e8294d6b\") " pod="openstack/ovsdbserver-nb-0" Oct 11 01:10:21 crc kubenswrapper[4743]: I1011 01:10:21.854241 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c8284ba-a2b2-4f9f-a692-b372e8294d6b-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"7c8284ba-a2b2-4f9f-a692-b372e8294d6b\") " pod="openstack/ovsdbserver-nb-0" Oct 11 01:10:21 crc kubenswrapper[4743]: I1011 01:10:21.854260 4743 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c8284ba-a2b2-4f9f-a692-b372e8294d6b-config\") pod \"ovsdbserver-nb-0\" (UID: \"7c8284ba-a2b2-4f9f-a692-b372e8294d6b\") " pod="openstack/ovsdbserver-nb-0" Oct 11 01:10:21 crc kubenswrapper[4743]: I1011 01:10:21.854302 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzlgt\" (UniqueName: \"kubernetes.io/projected/7c8284ba-a2b2-4f9f-a692-b372e8294d6b-kube-api-access-vzlgt\") pod \"ovsdbserver-nb-0\" (UID: \"7c8284ba-a2b2-4f9f-a692-b372e8294d6b\") " pod="openstack/ovsdbserver-nb-0" Oct 11 01:10:21 crc kubenswrapper[4743]: I1011 01:10:21.854317 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c8284ba-a2b2-4f9f-a692-b372e8294d6b-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"7c8284ba-a2b2-4f9f-a692-b372e8294d6b\") " pod="openstack/ovsdbserver-nb-0" Oct 11 01:10:21 crc kubenswrapper[4743]: I1011 01:10:21.854369 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"7c8284ba-a2b2-4f9f-a692-b372e8294d6b\") " pod="openstack/ovsdbserver-nb-0" Oct 11 01:10:21 crc kubenswrapper[4743]: I1011 01:10:21.854394 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7c8284ba-a2b2-4f9f-a692-b372e8294d6b-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"7c8284ba-a2b2-4f9f-a692-b372e8294d6b\") " pod="openstack/ovsdbserver-nb-0" Oct 11 01:10:21 crc kubenswrapper[4743]: I1011 01:10:21.857276 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"7c8284ba-a2b2-4f9f-a692-b372e8294d6b\") device mount path 
\"/mnt/openstack/pv12\"" pod="openstack/ovsdbserver-nb-0" Oct 11 01:10:21 crc kubenswrapper[4743]: I1011 01:10:21.857658 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c8284ba-a2b2-4f9f-a692-b372e8294d6b-config\") pod \"ovsdbserver-nb-0\" (UID: \"7c8284ba-a2b2-4f9f-a692-b372e8294d6b\") " pod="openstack/ovsdbserver-nb-0" Oct 11 01:10:21 crc kubenswrapper[4743]: I1011 01:10:21.857961 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7c8284ba-a2b2-4f9f-a692-b372e8294d6b-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"7c8284ba-a2b2-4f9f-a692-b372e8294d6b\") " pod="openstack/ovsdbserver-nb-0" Oct 11 01:10:21 crc kubenswrapper[4743]: I1011 01:10:21.861515 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7c8284ba-a2b2-4f9f-a692-b372e8294d6b-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"7c8284ba-a2b2-4f9f-a692-b372e8294d6b\") " pod="openstack/ovsdbserver-nb-0" Oct 11 01:10:21 crc kubenswrapper[4743]: I1011 01:10:21.863571 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 11 01:10:21 crc kubenswrapper[4743]: I1011 01:10:21.867841 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c8284ba-a2b2-4f9f-a692-b372e8294d6b-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"7c8284ba-a2b2-4f9f-a692-b372e8294d6b\") " pod="openstack/ovsdbserver-nb-0" Oct 11 01:10:21 crc kubenswrapper[4743]: I1011 01:10:21.872226 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c8284ba-a2b2-4f9f-a692-b372e8294d6b-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"7c8284ba-a2b2-4f9f-a692-b372e8294d6b\") " pod="openstack/ovsdbserver-nb-0" Oct 11 
01:10:21 crc kubenswrapper[4743]: I1011 01:10:21.873658 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c8284ba-a2b2-4f9f-a692-b372e8294d6b-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"7c8284ba-a2b2-4f9f-a692-b372e8294d6b\") " pod="openstack/ovsdbserver-nb-0" Oct 11 01:10:21 crc kubenswrapper[4743]: I1011 01:10:21.885815 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzlgt\" (UniqueName: \"kubernetes.io/projected/7c8284ba-a2b2-4f9f-a692-b372e8294d6b-kube-api-access-vzlgt\") pod \"ovsdbserver-nb-0\" (UID: \"7c8284ba-a2b2-4f9f-a692-b372e8294d6b\") " pod="openstack/ovsdbserver-nb-0" Oct 11 01:10:21 crc kubenswrapper[4743]: E1011 01:10:21.890089 4743 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 11 01:10:21 crc kubenswrapper[4743]: E1011 01:10:21.890269 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9mwx2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-qgj2b_openstack(06e0f9f3-e27f-485a-b45b-32a8a41c4c1f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 11 01:10:21 crc kubenswrapper[4743]: E1011 01:10:21.894497 4743 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-qgj2b" podUID="06e0f9f3-e27f-485a-b45b-32a8a41c4c1f" Oct 11 01:10:21 crc kubenswrapper[4743]: I1011 01:10:21.899050 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"7c8284ba-a2b2-4f9f-a692-b372e8294d6b\") " pod="openstack/ovsdbserver-nb-0" Oct 11 01:10:21 crc kubenswrapper[4743]: I1011 01:10:21.956088 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/74c19249-ff95-4e49-96bb-1135e7aa1b08-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"74c19249-ff95-4e49-96bb-1135e7aa1b08\") " pod="openstack/ovsdbserver-sb-0" Oct 11 01:10:21 crc kubenswrapper[4743]: I1011 01:10:21.956166 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/74c19249-ff95-4e49-96bb-1135e7aa1b08-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"74c19249-ff95-4e49-96bb-1135e7aa1b08\") " pod="openstack/ovsdbserver-sb-0" Oct 11 01:10:21 crc kubenswrapper[4743]: I1011 01:10:21.956204 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vb2k\" (UniqueName: \"kubernetes.io/projected/74c19249-ff95-4e49-96bb-1135e7aa1b08-kube-api-access-4vb2k\") pod \"ovsdbserver-sb-0\" (UID: \"74c19249-ff95-4e49-96bb-1135e7aa1b08\") " pod="openstack/ovsdbserver-sb-0" Oct 11 01:10:21 crc kubenswrapper[4743]: I1011 01:10:21.956232 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/74c19249-ff95-4e49-96bb-1135e7aa1b08-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"74c19249-ff95-4e49-96bb-1135e7aa1b08\") " pod="openstack/ovsdbserver-sb-0" Oct 11 01:10:21 crc kubenswrapper[4743]: I1011 01:10:21.956270 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/74c19249-ff95-4e49-96bb-1135e7aa1b08-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"74c19249-ff95-4e49-96bb-1135e7aa1b08\") " pod="openstack/ovsdbserver-sb-0" Oct 11 01:10:21 crc kubenswrapper[4743]: I1011 01:10:21.956290 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"74c19249-ff95-4e49-96bb-1135e7aa1b08\") " pod="openstack/ovsdbserver-sb-0" Oct 11 01:10:21 crc kubenswrapper[4743]: I1011 01:10:21.956322 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74c19249-ff95-4e49-96bb-1135e7aa1b08-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"74c19249-ff95-4e49-96bb-1135e7aa1b08\") " pod="openstack/ovsdbserver-sb-0" Oct 11 01:10:21 crc kubenswrapper[4743]: I1011 01:10:21.956346 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74c19249-ff95-4e49-96bb-1135e7aa1b08-config\") pod \"ovsdbserver-sb-0\" (UID: \"74c19249-ff95-4e49-96bb-1135e7aa1b08\") " pod="openstack/ovsdbserver-sb-0" Oct 11 01:10:22 crc kubenswrapper[4743]: I1011 01:10:22.058000 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/74c19249-ff95-4e49-96bb-1135e7aa1b08-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: 
\"74c19249-ff95-4e49-96bb-1135e7aa1b08\") " pod="openstack/ovsdbserver-sb-0" Oct 11 01:10:22 crc kubenswrapper[4743]: I1011 01:10:22.058065 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/74c19249-ff95-4e49-96bb-1135e7aa1b08-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"74c19249-ff95-4e49-96bb-1135e7aa1b08\") " pod="openstack/ovsdbserver-sb-0" Oct 11 01:10:22 crc kubenswrapper[4743]: I1011 01:10:22.058090 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"74c19249-ff95-4e49-96bb-1135e7aa1b08\") " pod="openstack/ovsdbserver-sb-0" Oct 11 01:10:22 crc kubenswrapper[4743]: I1011 01:10:22.058120 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74c19249-ff95-4e49-96bb-1135e7aa1b08-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"74c19249-ff95-4e49-96bb-1135e7aa1b08\") " pod="openstack/ovsdbserver-sb-0" Oct 11 01:10:22 crc kubenswrapper[4743]: I1011 01:10:22.058145 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74c19249-ff95-4e49-96bb-1135e7aa1b08-config\") pod \"ovsdbserver-sb-0\" (UID: \"74c19249-ff95-4e49-96bb-1135e7aa1b08\") " pod="openstack/ovsdbserver-sb-0" Oct 11 01:10:22 crc kubenswrapper[4743]: I1011 01:10:22.058184 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/74c19249-ff95-4e49-96bb-1135e7aa1b08-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"74c19249-ff95-4e49-96bb-1135e7aa1b08\") " pod="openstack/ovsdbserver-sb-0" Oct 11 01:10:22 crc kubenswrapper[4743]: I1011 01:10:22.058237 4743 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/74c19249-ff95-4e49-96bb-1135e7aa1b08-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"74c19249-ff95-4e49-96bb-1135e7aa1b08\") " pod="openstack/ovsdbserver-sb-0" Oct 11 01:10:22 crc kubenswrapper[4743]: I1011 01:10:22.058272 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vb2k\" (UniqueName: \"kubernetes.io/projected/74c19249-ff95-4e49-96bb-1135e7aa1b08-kube-api-access-4vb2k\") pod \"ovsdbserver-sb-0\" (UID: \"74c19249-ff95-4e49-96bb-1135e7aa1b08\") " pod="openstack/ovsdbserver-sb-0" Oct 11 01:10:22 crc kubenswrapper[4743]: I1011 01:10:22.059325 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"74c19249-ff95-4e49-96bb-1135e7aa1b08\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/ovsdbserver-sb-0" Oct 11 01:10:22 crc kubenswrapper[4743]: I1011 01:10:22.059499 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/74c19249-ff95-4e49-96bb-1135e7aa1b08-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"74c19249-ff95-4e49-96bb-1135e7aa1b08\") " pod="openstack/ovsdbserver-sb-0" Oct 11 01:10:22 crc kubenswrapper[4743]: I1011 01:10:22.059832 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74c19249-ff95-4e49-96bb-1135e7aa1b08-config\") pod \"ovsdbserver-sb-0\" (UID: \"74c19249-ff95-4e49-96bb-1135e7aa1b08\") " pod="openstack/ovsdbserver-sb-0" Oct 11 01:10:22 crc kubenswrapper[4743]: I1011 01:10:22.060498 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/74c19249-ff95-4e49-96bb-1135e7aa1b08-scripts\") pod \"ovsdbserver-sb-0\" (UID: 
\"74c19249-ff95-4e49-96bb-1135e7aa1b08\") " pod="openstack/ovsdbserver-sb-0" Oct 11 01:10:22 crc kubenswrapper[4743]: I1011 01:10:22.065007 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/74c19249-ff95-4e49-96bb-1135e7aa1b08-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"74c19249-ff95-4e49-96bb-1135e7aa1b08\") " pod="openstack/ovsdbserver-sb-0" Oct 11 01:10:22 crc kubenswrapper[4743]: I1011 01:10:22.072114 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/74c19249-ff95-4e49-96bb-1135e7aa1b08-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"74c19249-ff95-4e49-96bb-1135e7aa1b08\") " pod="openstack/ovsdbserver-sb-0" Oct 11 01:10:22 crc kubenswrapper[4743]: I1011 01:10:22.073577 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74c19249-ff95-4e49-96bb-1135e7aa1b08-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"74c19249-ff95-4e49-96bb-1135e7aa1b08\") " pod="openstack/ovsdbserver-sb-0" Oct 11 01:10:22 crc kubenswrapper[4743]: I1011 01:10:22.102850 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vb2k\" (UniqueName: \"kubernetes.io/projected/74c19249-ff95-4e49-96bb-1135e7aa1b08-kube-api-access-4vb2k\") pod \"ovsdbserver-sb-0\" (UID: \"74c19249-ff95-4e49-96bb-1135e7aa1b08\") " pod="openstack/ovsdbserver-sb-0" Oct 11 01:10:22 crc kubenswrapper[4743]: I1011 01:10:22.128559 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"74c19249-ff95-4e49-96bb-1135e7aa1b08\") " pod="openstack/ovsdbserver-sb-0" Oct 11 01:10:22 crc kubenswrapper[4743]: I1011 01:10:22.194870 4743 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 11 01:10:22 crc kubenswrapper[4743]: I1011 01:10:22.344592 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 11 01:10:22 crc kubenswrapper[4743]: I1011 01:10:22.592348 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-qgj2b" Oct 11 01:10:22 crc kubenswrapper[4743]: I1011 01:10:22.606732 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-bcz4v" Oct 11 01:10:22 crc kubenswrapper[4743]: W1011 01:10:22.665420 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1159224_8c5f_43ae_8aa3_ca628c69914e.slice/crio-207875bfefa7a117163c82d9ac25fbebf6f11fc73f9ef2a1de4acc03e26f34da WatchSource:0}: Error finding container 207875bfefa7a117163c82d9ac25fbebf6f11fc73f9ef2a1de4acc03e26f34da: Status 404 returned error can't find the container with id 207875bfefa7a117163c82d9ac25fbebf6f11fc73f9ef2a1de4acc03e26f34da Oct 11 01:10:22 crc kubenswrapper[4743]: I1011 01:10:22.666369 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 11 01:10:22 crc kubenswrapper[4743]: I1011 01:10:22.674995 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9mwx2\" (UniqueName: \"kubernetes.io/projected/06e0f9f3-e27f-485a-b45b-32a8a41c4c1f-kube-api-access-9mwx2\") pod \"06e0f9f3-e27f-485a-b45b-32a8a41c4c1f\" (UID: \"06e0f9f3-e27f-485a-b45b-32a8a41c4c1f\") " Oct 11 01:10:22 crc kubenswrapper[4743]: I1011 01:10:22.675051 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06e0f9f3-e27f-485a-b45b-32a8a41c4c1f-config\") pod \"06e0f9f3-e27f-485a-b45b-32a8a41c4c1f\" (UID: 
\"06e0f9f3-e27f-485a-b45b-32a8a41c4c1f\") " Oct 11 01:10:22 crc kubenswrapper[4743]: I1011 01:10:22.675082 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bx5gc\" (UniqueName: \"kubernetes.io/projected/3ec33fb3-c2bd-4887-ad14-7aadfa9faf3c-kube-api-access-bx5gc\") pod \"3ec33fb3-c2bd-4887-ad14-7aadfa9faf3c\" (UID: \"3ec33fb3-c2bd-4887-ad14-7aadfa9faf3c\") " Oct 11 01:10:22 crc kubenswrapper[4743]: I1011 01:10:22.675225 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ec33fb3-c2bd-4887-ad14-7aadfa9faf3c-config\") pod \"3ec33fb3-c2bd-4887-ad14-7aadfa9faf3c\" (UID: \"3ec33fb3-c2bd-4887-ad14-7aadfa9faf3c\") " Oct 11 01:10:22 crc kubenswrapper[4743]: I1011 01:10:22.675316 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06e0f9f3-e27f-485a-b45b-32a8a41c4c1f-dns-svc\") pod \"06e0f9f3-e27f-485a-b45b-32a8a41c4c1f\" (UID: \"06e0f9f3-e27f-485a-b45b-32a8a41c4c1f\") " Oct 11 01:10:22 crc kubenswrapper[4743]: I1011 01:10:22.675521 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06e0f9f3-e27f-485a-b45b-32a8a41c4c1f-config" (OuterVolumeSpecName: "config") pod "06e0f9f3-e27f-485a-b45b-32a8a41c4c1f" (UID: "06e0f9f3-e27f-485a-b45b-32a8a41c4c1f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:10:22 crc kubenswrapper[4743]: I1011 01:10:22.675894 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06e0f9f3-e27f-485a-b45b-32a8a41c4c1f-config\") on node \"crc\" DevicePath \"\"" Oct 11 01:10:22 crc kubenswrapper[4743]: I1011 01:10:22.675940 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06e0f9f3-e27f-485a-b45b-32a8a41c4c1f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "06e0f9f3-e27f-485a-b45b-32a8a41c4c1f" (UID: "06e0f9f3-e27f-485a-b45b-32a8a41c4c1f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:10:22 crc kubenswrapper[4743]: I1011 01:10:22.676223 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ec33fb3-c2bd-4887-ad14-7aadfa9faf3c-config" (OuterVolumeSpecName: "config") pod "3ec33fb3-c2bd-4887-ad14-7aadfa9faf3c" (UID: "3ec33fb3-c2bd-4887-ad14-7aadfa9faf3c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:10:22 crc kubenswrapper[4743]: I1011 01:10:22.680237 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ec33fb3-c2bd-4887-ad14-7aadfa9faf3c-kube-api-access-bx5gc" (OuterVolumeSpecName: "kube-api-access-bx5gc") pod "3ec33fb3-c2bd-4887-ad14-7aadfa9faf3c" (UID: "3ec33fb3-c2bd-4887-ad14-7aadfa9faf3c"). InnerVolumeSpecName "kube-api-access-bx5gc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:10:22 crc kubenswrapper[4743]: I1011 01:10:22.680664 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06e0f9f3-e27f-485a-b45b-32a8a41c4c1f-kube-api-access-9mwx2" (OuterVolumeSpecName: "kube-api-access-9mwx2") pod "06e0f9f3-e27f-485a-b45b-32a8a41c4c1f" (UID: "06e0f9f3-e27f-485a-b45b-32a8a41c4c1f"). 
InnerVolumeSpecName "kube-api-access-9mwx2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:10:22 crc kubenswrapper[4743]: I1011 01:10:22.777204 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ec33fb3-c2bd-4887-ad14-7aadfa9faf3c-config\") on node \"crc\" DevicePath \"\"" Oct 11 01:10:22 crc kubenswrapper[4743]: I1011 01:10:22.777442 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06e0f9f3-e27f-485a-b45b-32a8a41c4c1f-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 11 01:10:22 crc kubenswrapper[4743]: I1011 01:10:22.777451 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9mwx2\" (UniqueName: \"kubernetes.io/projected/06e0f9f3-e27f-485a-b45b-32a8a41c4c1f-kube-api-access-9mwx2\") on node \"crc\" DevicePath \"\"" Oct 11 01:10:22 crc kubenswrapper[4743]: I1011 01:10:22.777462 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bx5gc\" (UniqueName: \"kubernetes.io/projected/3ec33fb3-c2bd-4887-ad14-7aadfa9faf3c-kube-api-access-bx5gc\") on node \"crc\" DevicePath \"\"" Oct 11 01:10:22 crc kubenswrapper[4743]: I1011 01:10:22.981082 4743 generic.go:334] "Generic (PLEG): container finished" podID="719b7532-9517-409c-8870-62a1c92dd38c" containerID="2ea763fbffdbc80c516607df229d6e571a0581a21485e5651339b770a567e435" exitCode=0 Oct 11 01:10:22 crc kubenswrapper[4743]: I1011 01:10:22.981153 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-9hdp4" event={"ID":"719b7532-9517-409c-8870-62a1c92dd38c","Type":"ContainerDied","Data":"2ea763fbffdbc80c516607df229d6e571a0581a21485e5651339b770a567e435"} Oct 11 01:10:23 crc kubenswrapper[4743]: I1011 01:10:23.014320 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" 
event={"ID":"add92263-e252-446b-95de-092585b4357f","Type":"ContainerStarted","Data":"bdc2fd3e645a7f36140a058209779fdbf1154f0849a37453796b08adc03a7cc1"} Oct 11 01:10:23 crc kubenswrapper[4743]: I1011 01:10:23.016605 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"f1159224-8c5f-43ae-8aa3-ca628c69914e","Type":"ContainerStarted","Data":"207875bfefa7a117163c82d9ac25fbebf6f11fc73f9ef2a1de4acc03e26f34da"} Oct 11 01:10:23 crc kubenswrapper[4743]: I1011 01:10:23.018471 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-bcz4v" Oct 11 01:10:23 crc kubenswrapper[4743]: I1011 01:10:23.018472 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-bcz4v" event={"ID":"3ec33fb3-c2bd-4887-ad14-7aadfa9faf3c","Type":"ContainerDied","Data":"b9eedfd6c8ed3a523993b8b6bc407b0711f2217857934d4c20abdaa392971355"} Oct 11 01:10:23 crc kubenswrapper[4743]: I1011 01:10:23.024553 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-qgj2b" event={"ID":"06e0f9f3-e27f-485a-b45b-32a8a41c4c1f","Type":"ContainerDied","Data":"e3227cf1efa5c0c451f066cd525a6ffe68c189bdb9ef02174800382499c4748d"} Oct 11 01:10:23 crc kubenswrapper[4743]: I1011 01:10:23.024876 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-qgj2b" Oct 11 01:10:23 crc kubenswrapper[4743]: I1011 01:10:23.073200 4743 generic.go:334] "Generic (PLEG): container finished" podID="5052563c-4805-4a28-8aec-0fccc620857e" containerID="b5674495de013435f6d4143555e6d0f21c6e2bd5bb78edf50529a27e3c8fe754" exitCode=0 Oct 11 01:10:23 crc kubenswrapper[4743]: I1011 01:10:23.073248 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-xfzvh" event={"ID":"5052563c-4805-4a28-8aec-0fccc620857e","Type":"ContainerDied","Data":"b5674495de013435f6d4143555e6d0f21c6e2bd5bb78edf50529a27e3c8fe754"} Oct 11 01:10:23 crc kubenswrapper[4743]: I1011 01:10:23.164497 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-bcz4v"] Oct 11 01:10:23 crc kubenswrapper[4743]: I1011 01:10:23.171400 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-bcz4v"] Oct 11 01:10:23 crc kubenswrapper[4743]: I1011 01:10:23.337706 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-qgj2b"] Oct 11 01:10:23 crc kubenswrapper[4743]: I1011 01:10:23.394832 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-qgj2b"] Oct 11 01:10:23 crc kubenswrapper[4743]: E1011 01:10:23.396078 4743 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ec33fb3_c2bd_4887_ad14_7aadfa9faf3c.slice/crio-b9eedfd6c8ed3a523993b8b6bc407b0711f2217857934d4c20abdaa392971355\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ec33fb3_c2bd_4887_ad14_7aadfa9faf3c.slice\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod06e0f9f3_e27f_485a_b45b_32a8a41c4c1f.slice/crio-e3227cf1efa5c0c451f066cd525a6ffe68c189bdb9ef02174800382499c4748d\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod06e0f9f3_e27f_485a_b45b_32a8a41c4c1f.slice\": RecentStats: unable to find data in memory cache]" Oct 11 01:10:23 crc kubenswrapper[4743]: I1011 01:10:23.421917 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 11 01:10:23 crc kubenswrapper[4743]: I1011 01:10:23.422216 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 11 01:10:23 crc kubenswrapper[4743]: W1011 01:10:23.474194 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1a1779f_127f_4ea2_a937_b97f329e3878.slice/crio-3878092d2f66d34b4df8c35f9e72877aa38718294bd86162acfee83fd2e0420a WatchSource:0}: Error finding container 3878092d2f66d34b4df8c35f9e72877aa38718294bd86162acfee83fd2e0420a: Status 404 returned error can't find the container with id 3878092d2f66d34b4df8c35f9e72877aa38718294bd86162acfee83fd2e0420a Oct 11 01:10:23 crc kubenswrapper[4743]: I1011 01:10:23.499485 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 11 01:10:23 crc kubenswrapper[4743]: W1011 01:10:23.507364 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5cbb33ea_578f_4987_94cf_d6bf069a2953.slice/crio-9291f4f91ed2ee5c893ac2b941c2e074511507b55442ae5f2bebd97bc81a9c4d WatchSource:0}: Error finding container 9291f4f91ed2ee5c893ac2b941c2e074511507b55442ae5f2bebd97bc81a9c4d: Status 404 returned error can't find the container with id 9291f4f91ed2ee5c893ac2b941c2e074511507b55442ae5f2bebd97bc81a9c4d Oct 11 01:10:23 crc kubenswrapper[4743]: 
W1011 01:10:23.512871 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc4bbc3ff_d45e_46ac_a8fc_6b75e1f5d342.slice/crio-2161e7184a4fc1912e3d4ba8a1b9cffadebedf3602027e8210a36471b58fa19d WatchSource:0}: Error finding container 2161e7184a4fc1912e3d4ba8a1b9cffadebedf3602027e8210a36471b58fa19d: Status 404 returned error can't find the container with id 2161e7184a4fc1912e3d4ba8a1b9cffadebedf3602027e8210a36471b58fa19d Oct 11 01:10:23 crc kubenswrapper[4743]: I1011 01:10:23.515742 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-ui-dashboards-6584dc9448-tzxl5"] Oct 11 01:10:23 crc kubenswrapper[4743]: I1011 01:10:23.539015 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 11 01:10:23 crc kubenswrapper[4743]: W1011 01:10:23.573556 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f596550_b88a_49d7_9cff_cbc2d4149a2e.slice/crio-ea76cbc044cc60d25786766100c9f98b838a44f7b163f230250f6c671a788183 WatchSource:0}: Error finding container ea76cbc044cc60d25786766100c9f98b838a44f7b163f230250f6c671a788183: Status 404 returned error can't find the container with id ea76cbc044cc60d25786766100c9f98b838a44f7b163f230250f6c671a788183 Oct 11 01:10:23 crc kubenswrapper[4743]: I1011 01:10:23.584544 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-mwtxs"] Oct 11 01:10:23 crc kubenswrapper[4743]: I1011 01:10:23.593117 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-748c5b5875-pmrqh"] Oct 11 01:10:23 crc kubenswrapper[4743]: I1011 01:10:23.599368 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 11 01:10:23 crc kubenswrapper[4743]: I1011 01:10:23.610990 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/rabbitmq-cell1-server-0"] Oct 11 01:10:23 crc kubenswrapper[4743]: I1011 01:10:23.614519 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-6g6xb"] Oct 11 01:10:24 crc kubenswrapper[4743]: I1011 01:10:24.084973 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-6g6xb" event={"ID":"ba2a34bc-8581-47ba-a9a9-5ba1ce8b6387","Type":"ContainerStarted","Data":"ec828e87b3309cb262b83aa2e768ffc019ddd210bcd0fd88d84d7f4d8848b168"} Oct 11 01:10:24 crc kubenswrapper[4743]: I1011 01:10:24.090761 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-xfzvh" event={"ID":"5052563c-4805-4a28-8aec-0fccc620857e","Type":"ContainerStarted","Data":"8312b49fb0e108d182a24807f43d44a0671d33db69ae4c00e8bf979eb2f3e7c5"} Oct 11 01:10:24 crc kubenswrapper[4743]: I1011 01:10:24.092155 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-xfzvh" Oct 11 01:10:24 crc kubenswrapper[4743]: I1011 01:10:24.098022 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-9hdp4" event={"ID":"719b7532-9517-409c-8870-62a1c92dd38c","Type":"ContainerStarted","Data":"734d70cf7d56750c6508b134074db339f927964600711ad6a1467ba67149b540"} Oct 11 01:10:24 crc kubenswrapper[4743]: I1011 01:10:24.098608 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-9hdp4" Oct 11 01:10:24 crc kubenswrapper[4743]: I1011 01:10:24.100778 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"73d7bbf0-dc76-4572-857e-fd0fb59d95cc","Type":"ContainerStarted","Data":"bc12e6fe570e1b9fed4f696e94239d3b186f62e7adf30d9c2e4c008c659037a6"} Oct 11 01:10:24 crc kubenswrapper[4743]: I1011 01:10:24.129231 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06e0f9f3-e27f-485a-b45b-32a8a41c4c1f" 
path="/var/lib/kubelet/pods/06e0f9f3-e27f-485a-b45b-32a8a41c4c1f/volumes" Oct 11 01:10:24 crc kubenswrapper[4743]: I1011 01:10:24.129791 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ec33fb3-c2bd-4887-ad14-7aadfa9faf3c" path="/var/lib/kubelet/pods/3ec33fb3-c2bd-4887-ad14-7aadfa9faf3c/volumes" Oct 11 01:10:24 crc kubenswrapper[4743]: I1011 01:10:24.130144 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9f596550-b88a-49d7-9cff-cbc2d4149a2e","Type":"ContainerStarted","Data":"ea76cbc044cc60d25786766100c9f98b838a44f7b163f230250f6c671a788183"} Oct 11 01:10:24 crc kubenswrapper[4743]: I1011 01:10:24.130179 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 11 01:10:24 crc kubenswrapper[4743]: I1011 01:10:24.130197 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-748c5b5875-pmrqh" event={"ID":"b68870e2-fd28-4422-a232-b673325abeec","Type":"ContainerStarted","Data":"636d758a2d386ec58f4687e665eb6a4eb629a4199c1828f53803c34b15f01261"} Oct 11 01:10:24 crc kubenswrapper[4743]: I1011 01:10:24.130210 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-748c5b5875-pmrqh" event={"ID":"b68870e2-fd28-4422-a232-b673325abeec","Type":"ContainerStarted","Data":"b8463ad514cf44dead8b1039d95d01802205fa0f1d5120a5ac2e6535d448ce41"} Oct 11 01:10:24 crc kubenswrapper[4743]: I1011 01:10:24.130222 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e1a1779f-127f-4ea2-a937-b97f329e3878","Type":"ContainerStarted","Data":"3878092d2f66d34b4df8c35f9e72877aa38718294bd86162acfee83fd2e0420a"} Oct 11 01:10:24 crc kubenswrapper[4743]: I1011 01:10:24.130340 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"f604069e-dff8-4f02-a5e8-d3ba38d87625","Type":"ContainerStarted","Data":"3691201b1b942c61d743ef86c142e98530c4f9cf3a4359698d3fd38ac8f93d30"} Oct 11 01:10:24 crc kubenswrapper[4743]: I1011 01:10:24.130434 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-xfzvh" podStartSLOduration=3.788565068 podStartE2EDuration="18.130402814s" podCreationTimestamp="2025-10-11 01:10:06 +0000 UTC" firstStartedPulling="2025-10-11 01:10:07.727920505 +0000 UTC m=+1102.380900902" lastFinishedPulling="2025-10-11 01:10:22.069758251 +0000 UTC m=+1116.722738648" observedRunningTime="2025-10-11 01:10:24.106305332 +0000 UTC m=+1118.759285729" watchObservedRunningTime="2025-10-11 01:10:24.130402814 +0000 UTC m=+1118.783383211" Oct 11 01:10:24 crc kubenswrapper[4743]: I1011 01:10:24.132797 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5cbb33ea-578f-4987-94cf-d6bf069a2953","Type":"ContainerStarted","Data":"9291f4f91ed2ee5c893ac2b941c2e074511507b55442ae5f2bebd97bc81a9c4d"} Oct 11 01:10:24 crc kubenswrapper[4743]: I1011 01:10:24.135032 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-mwtxs" event={"ID":"1ab33e99-afb5-4b67-89bd-a2eb540bf194","Type":"ContainerStarted","Data":"7c8bc3d5918b0fb3d25d8fae269675334b07cfb2687c03c6d4452fb16511f80f"} Oct 11 01:10:24 crc kubenswrapper[4743]: I1011 01:10:24.135892 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-9hdp4" podStartSLOduration=3.564660308 podStartE2EDuration="18.135879503s" podCreationTimestamp="2025-10-11 01:10:06 +0000 UTC" firstStartedPulling="2025-10-11 01:10:07.551409737 +0000 UTC m=+1102.204390134" lastFinishedPulling="2025-10-11 01:10:22.122628932 +0000 UTC m=+1116.775609329" observedRunningTime="2025-10-11 01:10:24.128347281 +0000 UTC m=+1118.781327698" watchObservedRunningTime="2025-10-11 01:10:24.135879503 +0000 
UTC m=+1118.788859900" Oct 11 01:10:24 crc kubenswrapper[4743]: I1011 01:10:24.137931 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-ui-dashboards-6584dc9448-tzxl5" event={"ID":"c4bbc3ff-d45e-46ac-a8fc-6b75e1f5d342","Type":"ContainerStarted","Data":"2161e7184a4fc1912e3d4ba8a1b9cffadebedf3602027e8210a36471b58fa19d"} Oct 11 01:10:24 crc kubenswrapper[4743]: I1011 01:10:24.145809 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"32c52bf9-36b5-4a75-8991-e76f4dd87fb3","Type":"ContainerStarted","Data":"39dd6775f7e265e0f14b18c3971a9783406cd03a5684e7c2b22114d7dd72ffd1"} Oct 11 01:10:24 crc kubenswrapper[4743]: W1011 01:10:24.157740 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod74c19249_ff95_4e49_96bb_1135e7aa1b08.slice/crio-5d5a93274e8de128f10b27a9cb95a085b27d3ffb4fec965a7afa0d476ca7b0a2 WatchSource:0}: Error finding container 5d5a93274e8de128f10b27a9cb95a085b27d3ffb4fec965a7afa0d476ca7b0a2: Status 404 returned error can't find the container with id 5d5a93274e8de128f10b27a9cb95a085b27d3ffb4fec965a7afa0d476ca7b0a2 Oct 11 01:10:24 crc kubenswrapper[4743]: I1011 01:10:24.161775 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-748c5b5875-pmrqh" podStartSLOduration=9.161757079 podStartE2EDuration="9.161757079s" podCreationTimestamp="2025-10-11 01:10:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 01:10:24.153305485 +0000 UTC m=+1118.806285902" watchObservedRunningTime="2025-10-11 01:10:24.161757079 +0000 UTC m=+1118.814737476" Oct 11 01:10:24 crc kubenswrapper[4743]: I1011 01:10:24.527435 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 11 01:10:25 crc kubenswrapper[4743]: I1011 01:10:25.157673 
4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"74c19249-ff95-4e49-96bb-1135e7aa1b08","Type":"ContainerStarted","Data":"5d5a93274e8de128f10b27a9cb95a085b27d3ffb4fec965a7afa0d476ca7b0a2"} Oct 11 01:10:26 crc kubenswrapper[4743]: I1011 01:10:26.045556 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-748c5b5875-pmrqh" Oct 11 01:10:26 crc kubenswrapper[4743]: I1011 01:10:26.045826 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-748c5b5875-pmrqh" Oct 11 01:10:26 crc kubenswrapper[4743]: I1011 01:10:26.050227 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-748c5b5875-pmrqh" Oct 11 01:10:26 crc kubenswrapper[4743]: I1011 01:10:26.166866 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"7c8284ba-a2b2-4f9f-a692-b372e8294d6b","Type":"ContainerStarted","Data":"3f01d315147a063ea9f2a1aac13a30e57cacba8f6331a8f455418754752f8bc0"} Oct 11 01:10:26 crc kubenswrapper[4743]: I1011 01:10:26.171046 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-748c5b5875-pmrqh" Oct 11 01:10:26 crc kubenswrapper[4743]: I1011 01:10:26.379906 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7bff76c5fd-k9d6v"] Oct 11 01:10:31 crc kubenswrapper[4743]: I1011 01:10:31.991108 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-666b6646f7-9hdp4" Oct 11 01:10:32 crc kubenswrapper[4743]: I1011 01:10:32.280993 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-xfzvh" Oct 11 01:10:32 crc kubenswrapper[4743]: I1011 01:10:32.345636 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-9hdp4"] Oct 11 01:10:32 crc 
kubenswrapper[4743]: I1011 01:10:32.345904 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-9hdp4" podUID="719b7532-9517-409c-8870-62a1c92dd38c" containerName="dnsmasq-dns" containerID="cri-o://734d70cf7d56750c6508b134074db339f927964600711ad6a1467ba67149b540" gracePeriod=10 Oct 11 01:10:33 crc kubenswrapper[4743]: I1011 01:10:33.249161 4743 generic.go:334] "Generic (PLEG): container finished" podID="719b7532-9517-409c-8870-62a1c92dd38c" containerID="734d70cf7d56750c6508b134074db339f927964600711ad6a1467ba67149b540" exitCode=0 Oct 11 01:10:33 crc kubenswrapper[4743]: I1011 01:10:33.249245 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-9hdp4" event={"ID":"719b7532-9517-409c-8870-62a1c92dd38c","Type":"ContainerDied","Data":"734d70cf7d56750c6508b134074db339f927964600711ad6a1467ba67149b540"} Oct 11 01:10:33 crc kubenswrapper[4743]: I1011 01:10:33.980014 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-9hdp4" Oct 11 01:10:34 crc kubenswrapper[4743]: I1011 01:10:34.119445 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/719b7532-9517-409c-8870-62a1c92dd38c-config\") pod \"719b7532-9517-409c-8870-62a1c92dd38c\" (UID: \"719b7532-9517-409c-8870-62a1c92dd38c\") " Oct 11 01:10:34 crc kubenswrapper[4743]: I1011 01:10:34.119510 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqgvr\" (UniqueName: \"kubernetes.io/projected/719b7532-9517-409c-8870-62a1c92dd38c-kube-api-access-jqgvr\") pod \"719b7532-9517-409c-8870-62a1c92dd38c\" (UID: \"719b7532-9517-409c-8870-62a1c92dd38c\") " Oct 11 01:10:34 crc kubenswrapper[4743]: I1011 01:10:34.119761 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/719b7532-9517-409c-8870-62a1c92dd38c-dns-svc\") pod \"719b7532-9517-409c-8870-62a1c92dd38c\" (UID: \"719b7532-9517-409c-8870-62a1c92dd38c\") " Oct 11 01:10:34 crc kubenswrapper[4743]: I1011 01:10:34.131662 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/719b7532-9517-409c-8870-62a1c92dd38c-kube-api-access-jqgvr" (OuterVolumeSpecName: "kube-api-access-jqgvr") pod "719b7532-9517-409c-8870-62a1c92dd38c" (UID: "719b7532-9517-409c-8870-62a1c92dd38c"). InnerVolumeSpecName "kube-api-access-jqgvr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:10:34 crc kubenswrapper[4743]: I1011 01:10:34.184373 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/719b7532-9517-409c-8870-62a1c92dd38c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "719b7532-9517-409c-8870-62a1c92dd38c" (UID: "719b7532-9517-409c-8870-62a1c92dd38c"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:10:34 crc kubenswrapper[4743]: I1011 01:10:34.185308 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/719b7532-9517-409c-8870-62a1c92dd38c-config" (OuterVolumeSpecName: "config") pod "719b7532-9517-409c-8870-62a1c92dd38c" (UID: "719b7532-9517-409c-8870-62a1c92dd38c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:10:34 crc kubenswrapper[4743]: I1011 01:10:34.231890 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/719b7532-9517-409c-8870-62a1c92dd38c-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 11 01:10:34 crc kubenswrapper[4743]: I1011 01:10:34.231922 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/719b7532-9517-409c-8870-62a1c92dd38c-config\") on node \"crc\" DevicePath \"\"" Oct 11 01:10:34 crc kubenswrapper[4743]: I1011 01:10:34.231932 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jqgvr\" (UniqueName: \"kubernetes.io/projected/719b7532-9517-409c-8870-62a1c92dd38c-kube-api-access-jqgvr\") on node \"crc\" DevicePath \"\"" Oct 11 01:10:34 crc kubenswrapper[4743]: I1011 01:10:34.259731 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-9hdp4" event={"ID":"719b7532-9517-409c-8870-62a1c92dd38c","Type":"ContainerDied","Data":"026fbccac4f567952edb1430cf2884f8a4da4d49d4d28b7ed5320888b90310e4"} Oct 11 01:10:34 crc kubenswrapper[4743]: I1011 01:10:34.259785 4743 scope.go:117] "RemoveContainer" containerID="734d70cf7d56750c6508b134074db339f927964600711ad6a1467ba67149b540" Oct 11 01:10:34 crc kubenswrapper[4743]: I1011 01:10:34.259933 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-9hdp4" Oct 11 01:10:34 crc kubenswrapper[4743]: I1011 01:10:34.262683 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"f1159224-8c5f-43ae-8aa3-ca628c69914e","Type":"ContainerStarted","Data":"a1f4a43187423dff1cfb28089e857c160e08daf12a1eb0e00a2d498e6cd7d882"} Oct 11 01:10:34 crc kubenswrapper[4743]: I1011 01:10:34.263061 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Oct 11 01:10:34 crc kubenswrapper[4743]: I1011 01:10:34.284912 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=14.233992149 podStartE2EDuration="23.284894307s" podCreationTimestamp="2025-10-11 01:10:11 +0000 UTC" firstStartedPulling="2025-10-11 01:10:22.66892299 +0000 UTC m=+1117.321903387" lastFinishedPulling="2025-10-11 01:10:31.719825108 +0000 UTC m=+1126.372805545" observedRunningTime="2025-10-11 01:10:34.274933464 +0000 UTC m=+1128.927913871" watchObservedRunningTime="2025-10-11 01:10:34.284894307 +0000 UTC m=+1128.937874704" Oct 11 01:10:34 crc kubenswrapper[4743]: I1011 01:10:34.311533 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-9hdp4"] Oct 11 01:10:34 crc kubenswrapper[4743]: I1011 01:10:34.317895 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-9hdp4"] Oct 11 01:10:34 crc kubenswrapper[4743]: I1011 01:10:34.585056 4743 scope.go:117] "RemoveContainer" containerID="2ea763fbffdbc80c516607df229d6e571a0581a21485e5651339b770a567e435" Oct 11 01:10:35 crc kubenswrapper[4743]: I1011 01:10:35.278314 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"7c8284ba-a2b2-4f9f-a692-b372e8294d6b","Type":"ContainerStarted","Data":"0ab7f61cf48ae2196002e61b81d5c61714e5b9cf4dd3359fefe5de2048058c4d"} Oct 11 01:10:35 crc kubenswrapper[4743]: I1011 
01:10:35.280961 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f604069e-dff8-4f02-a5e8-d3ba38d87625","Type":"ContainerStarted","Data":"a3f9f16bff8341f0545e438b874414ad7f09f80e91f9b5cb18eb3ce582635a87"} Oct 11 01:10:36 crc kubenswrapper[4743]: I1011 01:10:36.103413 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="719b7532-9517-409c-8870-62a1c92dd38c" path="/var/lib/kubelet/pods/719b7532-9517-409c-8870-62a1c92dd38c/volumes" Oct 11 01:10:36 crc kubenswrapper[4743]: I1011 01:10:36.293948 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-mwtxs" event={"ID":"1ab33e99-afb5-4b67-89bd-a2eb540bf194","Type":"ContainerStarted","Data":"e581e92624315b6828a6ca4435879fa50787dd5692817e385fcb0755d8771daf"} Oct 11 01:10:36 crc kubenswrapper[4743]: I1011 01:10:36.294242 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-mwtxs" Oct 11 01:10:36 crc kubenswrapper[4743]: I1011 01:10:36.295293 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"74c19249-ff95-4e49-96bb-1135e7aa1b08","Type":"ContainerStarted","Data":"81a770832e9ce80a038339409afdb026e767eee6b946e6df5136c08dca930263"} Oct 11 01:10:36 crc kubenswrapper[4743]: I1011 01:10:36.296959 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-ui-dashboards-6584dc9448-tzxl5" event={"ID":"c4bbc3ff-d45e-46ac-a8fc-6b75e1f5d342","Type":"ContainerStarted","Data":"96e1817569768391a1e9a2eca17e446044ad1edd6e8616192da1f41cbfac4ff1"} Oct 11 01:10:36 crc kubenswrapper[4743]: I1011 01:10:36.299722 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"32c52bf9-36b5-4a75-8991-e76f4dd87fb3","Type":"ContainerStarted","Data":"ab56204985fd2151dcac627a23c735b4078585e8fb746e7939ea8b7baff04a9b"} Oct 11 01:10:36 crc kubenswrapper[4743]: I1011 
01:10:36.301551 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9f596550-b88a-49d7-9cff-cbc2d4149a2e","Type":"ContainerStarted","Data":"71bd50ca24278810cfdfc1d2d9fe23b63e4ca59dbc1c1013a7228649e54927a6"} Oct 11 01:10:36 crc kubenswrapper[4743]: I1011 01:10:36.304133 4743 generic.go:334] "Generic (PLEG): container finished" podID="ba2a34bc-8581-47ba-a9a9-5ba1ce8b6387" containerID="315c8f96e57047779bc9ba1729d7b3c069a8d7f440f95358813b3bb2344c909e" exitCode=0 Oct 11 01:10:36 crc kubenswrapper[4743]: I1011 01:10:36.304177 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-6g6xb" event={"ID":"ba2a34bc-8581-47ba-a9a9-5ba1ce8b6387","Type":"ContainerDied","Data":"315c8f96e57047779bc9ba1729d7b3c069a8d7f440f95358813b3bb2344c909e"} Oct 11 01:10:36 crc kubenswrapper[4743]: I1011 01:10:36.307324 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5cbb33ea-578f-4987-94cf-d6bf069a2953","Type":"ContainerStarted","Data":"45f7e650696ec5856a270b233a567caa087f6fd7ff31b79a6e5744eae8f8752c"} Oct 11 01:10:36 crc kubenswrapper[4743]: I1011 01:10:36.307362 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 11 01:10:36 crc kubenswrapper[4743]: I1011 01:10:36.313461 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-mwtxs" podStartSLOduration=9.378980214 podStartE2EDuration="19.313445307s" podCreationTimestamp="2025-10-11 01:10:17 +0000 UTC" firstStartedPulling="2025-10-11 01:10:23.560556538 +0000 UTC m=+1118.213536935" lastFinishedPulling="2025-10-11 01:10:33.495021631 +0000 UTC m=+1128.148002028" observedRunningTime="2025-10-11 01:10:36.311446036 +0000 UTC m=+1130.964426453" watchObservedRunningTime="2025-10-11 01:10:36.313445307 +0000 UTC m=+1130.966425704" Oct 11 01:10:36 crc kubenswrapper[4743]: I1011 01:10:36.346633 4743 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-ui-dashboards-6584dc9448-tzxl5" podStartSLOduration=10.931640516 podStartE2EDuration="21.346619798s" podCreationTimestamp="2025-10-11 01:10:15 +0000 UTC" firstStartedPulling="2025-10-11 01:10:23.547514287 +0000 UTC m=+1118.200494684" lastFinishedPulling="2025-10-11 01:10:33.962493569 +0000 UTC m=+1128.615473966" observedRunningTime="2025-10-11 01:10:36.344567686 +0000 UTC m=+1130.997548093" watchObservedRunningTime="2025-10-11 01:10:36.346619798 +0000 UTC m=+1130.999600195" Oct 11 01:10:36 crc kubenswrapper[4743]: I1011 01:10:36.362197 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=11.163679886 podStartE2EDuration="22.362176333s" podCreationTimestamp="2025-10-11 01:10:14 +0000 UTC" firstStartedPulling="2025-10-11 01:10:23.51529868 +0000 UTC m=+1118.168279077" lastFinishedPulling="2025-10-11 01:10:34.713795127 +0000 UTC m=+1129.366775524" observedRunningTime="2025-10-11 01:10:36.356305844 +0000 UTC m=+1131.009286261" watchObservedRunningTime="2025-10-11 01:10:36.362176333 +0000 UTC m=+1131.015156730" Oct 11 01:10:37 crc kubenswrapper[4743]: I1011 01:10:37.315114 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"73d7bbf0-dc76-4572-857e-fd0fb59d95cc","Type":"ContainerStarted","Data":"481a1258a6422d0adb2d8063cebd0064a3027d185956e75d1ac262e1e7ae0dbb"} Oct 11 01:10:37 crc kubenswrapper[4743]: I1011 01:10:37.318623 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-6g6xb" event={"ID":"ba2a34bc-8581-47ba-a9a9-5ba1ce8b6387","Type":"ContainerStarted","Data":"c20de8809a4c8a420c51bf984d8d7e571157921f52756b37c72515903bbb1d32"} Oct 11 01:10:38 crc kubenswrapper[4743]: I1011 01:10:38.329372 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-6g6xb" 
event={"ID":"ba2a34bc-8581-47ba-a9a9-5ba1ce8b6387","Type":"ContainerStarted","Data":"7ce91b3da1855cc6ee84c1e0cec2022cff901bc983579d3c4da72a095f0f5591"} Oct 11 01:10:38 crc kubenswrapper[4743]: I1011 01:10:38.330702 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-6g6xb" Oct 11 01:10:38 crc kubenswrapper[4743]: I1011 01:10:38.330727 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-6g6xb" Oct 11 01:10:38 crc kubenswrapper[4743]: I1011 01:10:38.333170 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e1a1779f-127f-4ea2-a937-b97f329e3878","Type":"ContainerStarted","Data":"3ca7a025605c798f9dce3f44e26b3545c887fbf30de417079fadcd6d371106ca"} Oct 11 01:10:38 crc kubenswrapper[4743]: I1011 01:10:38.352293 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-6g6xb" podStartSLOduration=11.489648626 podStartE2EDuration="21.352278686s" podCreationTimestamp="2025-10-11 01:10:17 +0000 UTC" firstStartedPulling="2025-10-11 01:10:23.633403676 +0000 UTC m=+1118.286384073" lastFinishedPulling="2025-10-11 01:10:33.496033736 +0000 UTC m=+1128.149014133" observedRunningTime="2025-10-11 01:10:38.345254308 +0000 UTC m=+1132.998234705" watchObservedRunningTime="2025-10-11 01:10:38.352278686 +0000 UTC m=+1133.005259083" Oct 11 01:10:40 crc kubenswrapper[4743]: I1011 01:10:40.349155 4743 generic.go:334] "Generic (PLEG): container finished" podID="f604069e-dff8-4f02-a5e8-d3ba38d87625" containerID="a3f9f16bff8341f0545e438b874414ad7f09f80e91f9b5cb18eb3ce582635a87" exitCode=0 Oct 11 01:10:40 crc kubenswrapper[4743]: I1011 01:10:40.349677 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f604069e-dff8-4f02-a5e8-d3ba38d87625","Type":"ContainerDied","Data":"a3f9f16bff8341f0545e438b874414ad7f09f80e91f9b5cb18eb3ce582635a87"} 
Oct 11 01:10:40 crc kubenswrapper[4743]: I1011 01:10:40.354115 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"74c19249-ff95-4e49-96bb-1135e7aa1b08","Type":"ContainerStarted","Data":"8d8acc57a6c7df5a3081f74fd6f7e5cd25f7a0f7d91dcee9833661fae37c4880"} Oct 11 01:10:40 crc kubenswrapper[4743]: I1011 01:10:40.358796 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"7c8284ba-a2b2-4f9f-a692-b372e8294d6b","Type":"ContainerStarted","Data":"162b2d5b71d8d285a7f03f4985f51d96c06f37ddae6e97be915ceeec606414b9"} Oct 11 01:10:40 crc kubenswrapper[4743]: I1011 01:10:40.469789 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=6.000730729 podStartE2EDuration="20.469768522s" podCreationTimestamp="2025-10-11 01:10:20 +0000 UTC" firstStartedPulling="2025-10-11 01:10:25.371520588 +0000 UTC m=+1120.024500985" lastFinishedPulling="2025-10-11 01:10:39.840558371 +0000 UTC m=+1134.493538778" observedRunningTime="2025-10-11 01:10:40.446196454 +0000 UTC m=+1135.099176851" watchObservedRunningTime="2025-10-11 01:10:40.469768522 +0000 UTC m=+1135.122748919" Oct 11 01:10:40 crc kubenswrapper[4743]: I1011 01:10:40.495814 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=4.810003304 podStartE2EDuration="20.495795133s" podCreationTimestamp="2025-10-11 01:10:20 +0000 UTC" firstStartedPulling="2025-10-11 01:10:24.160768064 +0000 UTC m=+1118.813748461" lastFinishedPulling="2025-10-11 01:10:39.846559853 +0000 UTC m=+1134.499540290" observedRunningTime="2025-10-11 01:10:40.493104894 +0000 UTC m=+1135.146085291" watchObservedRunningTime="2025-10-11 01:10:40.495795133 +0000 UTC m=+1135.148775530" Oct 11 01:10:41 crc kubenswrapper[4743]: I1011 01:10:41.370380 4743 generic.go:334] "Generic (PLEG): container finished" 
podID="32c52bf9-36b5-4a75-8991-e76f4dd87fb3" containerID="ab56204985fd2151dcac627a23c735b4078585e8fb746e7939ea8b7baff04a9b" exitCode=0 Oct 11 01:10:41 crc kubenswrapper[4743]: I1011 01:10:41.370465 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"32c52bf9-36b5-4a75-8991-e76f4dd87fb3","Type":"ContainerDied","Data":"ab56204985fd2151dcac627a23c735b4078585e8fb746e7939ea8b7baff04a9b"} Oct 11 01:10:41 crc kubenswrapper[4743]: I1011 01:10:41.373901 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f604069e-dff8-4f02-a5e8-d3ba38d87625","Type":"ContainerStarted","Data":"28ac79ddb2ca31ead9bacaf682333c92773d598a8260f11591048cba96c7949b"} Oct 11 01:10:41 crc kubenswrapper[4743]: I1011 01:10:41.441003 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=22.496120033 podStartE2EDuration="32.440930848s" podCreationTimestamp="2025-10-11 01:10:09 +0000 UTC" firstStartedPulling="2025-10-11 01:10:23.435004103 +0000 UTC m=+1118.087984500" lastFinishedPulling="2025-10-11 01:10:33.379814908 +0000 UTC m=+1128.032795315" observedRunningTime="2025-10-11 01:10:41.427171069 +0000 UTC m=+1136.080151476" watchObservedRunningTime="2025-10-11 01:10:41.440930848 +0000 UTC m=+1136.093911285" Oct 11 01:10:41 crc kubenswrapper[4743]: I1011 01:10:41.642340 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Oct 11 01:10:42 crc kubenswrapper[4743]: I1011 01:10:42.196466 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Oct 11 01:10:42 crc kubenswrapper[4743]: I1011 01:10:42.345618 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Oct 11 01:10:42 crc kubenswrapper[4743]: I1011 01:10:42.388031 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/openstack-galera-0" event={"ID":"32c52bf9-36b5-4a75-8991-e76f4dd87fb3","Type":"ContainerStarted","Data":"e371de680ba52a633b108ece73d8a463dfdf0016102c005eee017ddf9c8c82ca"} Oct 11 01:10:42 crc kubenswrapper[4743]: I1011 01:10:42.419022 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=24.076843545 podStartE2EDuration="34.419005199s" podCreationTimestamp="2025-10-11 01:10:08 +0000 UTC" firstStartedPulling="2025-10-11 01:10:23.554803972 +0000 UTC m=+1118.207784369" lastFinishedPulling="2025-10-11 01:10:33.896965626 +0000 UTC m=+1128.549946023" observedRunningTime="2025-10-11 01:10:42.410662597 +0000 UTC m=+1137.063643004" watchObservedRunningTime="2025-10-11 01:10:42.419005199 +0000 UTC m=+1137.071985596" Oct 11 01:10:43 crc kubenswrapper[4743]: I1011 01:10:43.196273 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Oct 11 01:10:43 crc kubenswrapper[4743]: I1011 01:10:43.244792 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Oct 11 01:10:43 crc kubenswrapper[4743]: I1011 01:10:43.345529 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Oct 11 01:10:43 crc kubenswrapper[4743]: I1011 01:10:43.387441 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Oct 11 01:10:43 crc kubenswrapper[4743]: I1011 01:10:43.433506 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Oct 11 01:10:43 crc kubenswrapper[4743]: I1011 01:10:43.437565 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Oct 11 01:10:43 crc kubenswrapper[4743]: I1011 01:10:43.601576 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-hwcsx"] Oct 11 
01:10:43 crc kubenswrapper[4743]: E1011 01:10:43.601921 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="719b7532-9517-409c-8870-62a1c92dd38c" containerName="init" Oct 11 01:10:43 crc kubenswrapper[4743]: I1011 01:10:43.601936 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="719b7532-9517-409c-8870-62a1c92dd38c" containerName="init" Oct 11 01:10:43 crc kubenswrapper[4743]: E1011 01:10:43.601948 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="719b7532-9517-409c-8870-62a1c92dd38c" containerName="dnsmasq-dns" Oct 11 01:10:43 crc kubenswrapper[4743]: I1011 01:10:43.601955 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="719b7532-9517-409c-8870-62a1c92dd38c" containerName="dnsmasq-dns" Oct 11 01:10:43 crc kubenswrapper[4743]: I1011 01:10:43.602122 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="719b7532-9517-409c-8870-62a1c92dd38c" containerName="dnsmasq-dns" Oct 11 01:10:43 crc kubenswrapper[4743]: I1011 01:10:43.603043 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-hwcsx" Oct 11 01:10:43 crc kubenswrapper[4743]: I1011 01:10:43.605176 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Oct 11 01:10:43 crc kubenswrapper[4743]: I1011 01:10:43.657916 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-hwcsx"] Oct 11 01:10:43 crc kubenswrapper[4743]: I1011 01:10:43.702027 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-fkcc9"] Oct 11 01:10:43 crc kubenswrapper[4743]: I1011 01:10:43.703461 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-fkcc9" Oct 11 01:10:43 crc kubenswrapper[4743]: I1011 01:10:43.705166 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44kcn\" (UniqueName: \"kubernetes.io/projected/09e6a1ce-a4c0-46e2-9bef-0e8a996b069d-kube-api-access-44kcn\") pod \"dnsmasq-dns-5bf47b49b7-hwcsx\" (UID: \"09e6a1ce-a4c0-46e2-9bef-0e8a996b069d\") " pod="openstack/dnsmasq-dns-5bf47b49b7-hwcsx" Oct 11 01:10:43 crc kubenswrapper[4743]: I1011 01:10:43.705296 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09e6a1ce-a4c0-46e2-9bef-0e8a996b069d-config\") pod \"dnsmasq-dns-5bf47b49b7-hwcsx\" (UID: \"09e6a1ce-a4c0-46e2-9bef-0e8a996b069d\") " pod="openstack/dnsmasq-dns-5bf47b49b7-hwcsx" Oct 11 01:10:43 crc kubenswrapper[4743]: I1011 01:10:43.705358 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/09e6a1ce-a4c0-46e2-9bef-0e8a996b069d-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-hwcsx\" (UID: \"09e6a1ce-a4c0-46e2-9bef-0e8a996b069d\") " pod="openstack/dnsmasq-dns-5bf47b49b7-hwcsx" Oct 11 01:10:43 crc kubenswrapper[4743]: I1011 01:10:43.705382 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/09e6a1ce-a4c0-46e2-9bef-0e8a996b069d-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-hwcsx\" (UID: \"09e6a1ce-a4c0-46e2-9bef-0e8a996b069d\") " pod="openstack/dnsmasq-dns-5bf47b49b7-hwcsx" Oct 11 01:10:43 crc kubenswrapper[4743]: I1011 01:10:43.705764 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Oct 11 01:10:43 crc kubenswrapper[4743]: I1011 01:10:43.711207 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/ovn-controller-metrics-fkcc9"] Oct 11 01:10:43 crc kubenswrapper[4743]: I1011 01:10:43.765035 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-hwcsx"] Oct 11 01:10:43 crc kubenswrapper[4743]: E1011 01:10:43.765741 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc kube-api-access-44kcn ovsdbserver-nb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-5bf47b49b7-hwcsx" podUID="09e6a1ce-a4c0-46e2-9bef-0e8a996b069d" Oct 11 01:10:43 crc kubenswrapper[4743]: I1011 01:10:43.794988 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8554648995-mhqx7"] Oct 11 01:10:43 crc kubenswrapper[4743]: I1011 01:10:43.796476 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-mhqx7" Oct 11 01:10:43 crc kubenswrapper[4743]: I1011 01:10:43.802968 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Oct 11 01:10:43 crc kubenswrapper[4743]: I1011 01:10:43.803682 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Oct 11 01:10:43 crc kubenswrapper[4743]: I1011 01:10:43.807665 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/145bdcc2-f62c-4dbf-bdb1-6ced4d65ba3a-ovn-rundir\") pod \"ovn-controller-metrics-fkcc9\" (UID: \"145bdcc2-f62c-4dbf-bdb1-6ced4d65ba3a\") " pod="openstack/ovn-controller-metrics-fkcc9" Oct 11 01:10:43 crc kubenswrapper[4743]: I1011 01:10:43.807715 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/145bdcc2-f62c-4dbf-bdb1-6ced4d65ba3a-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-fkcc9\" (UID: \"145bdcc2-f62c-4dbf-bdb1-6ced4d65ba3a\") 
" pod="openstack/ovn-controller-metrics-fkcc9" Oct 11 01:10:43 crc kubenswrapper[4743]: I1011 01:10:43.807751 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cwdf\" (UniqueName: \"kubernetes.io/projected/145bdcc2-f62c-4dbf-bdb1-6ced4d65ba3a-kube-api-access-6cwdf\") pod \"ovn-controller-metrics-fkcc9\" (UID: \"145bdcc2-f62c-4dbf-bdb1-6ced4d65ba3a\") " pod="openstack/ovn-controller-metrics-fkcc9" Oct 11 01:10:43 crc kubenswrapper[4743]: I1011 01:10:43.807783 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09e6a1ce-a4c0-46e2-9bef-0e8a996b069d-config\") pod \"dnsmasq-dns-5bf47b49b7-hwcsx\" (UID: \"09e6a1ce-a4c0-46e2-9bef-0e8a996b069d\") " pod="openstack/dnsmasq-dns-5bf47b49b7-hwcsx" Oct 11 01:10:43 crc kubenswrapper[4743]: I1011 01:10:43.807800 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/145bdcc2-f62c-4dbf-bdb1-6ced4d65ba3a-combined-ca-bundle\") pod \"ovn-controller-metrics-fkcc9\" (UID: \"145bdcc2-f62c-4dbf-bdb1-6ced4d65ba3a\") " pod="openstack/ovn-controller-metrics-fkcc9" Oct 11 01:10:43 crc kubenswrapper[4743]: I1011 01:10:43.807840 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/09e6a1ce-a4c0-46e2-9bef-0e8a996b069d-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-hwcsx\" (UID: \"09e6a1ce-a4c0-46e2-9bef-0e8a996b069d\") " pod="openstack/dnsmasq-dns-5bf47b49b7-hwcsx" Oct 11 01:10:43 crc kubenswrapper[4743]: I1011 01:10:43.807880 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/09e6a1ce-a4c0-46e2-9bef-0e8a996b069d-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-hwcsx\" (UID: \"09e6a1ce-a4c0-46e2-9bef-0e8a996b069d\") " 
pod="openstack/dnsmasq-dns-5bf47b49b7-hwcsx" Oct 11 01:10:43 crc kubenswrapper[4743]: I1011 01:10:43.807900 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/145bdcc2-f62c-4dbf-bdb1-6ced4d65ba3a-config\") pod \"ovn-controller-metrics-fkcc9\" (UID: \"145bdcc2-f62c-4dbf-bdb1-6ced4d65ba3a\") " pod="openstack/ovn-controller-metrics-fkcc9" Oct 11 01:10:43 crc kubenswrapper[4743]: I1011 01:10:43.807960 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44kcn\" (UniqueName: \"kubernetes.io/projected/09e6a1ce-a4c0-46e2-9bef-0e8a996b069d-kube-api-access-44kcn\") pod \"dnsmasq-dns-5bf47b49b7-hwcsx\" (UID: \"09e6a1ce-a4c0-46e2-9bef-0e8a996b069d\") " pod="openstack/dnsmasq-dns-5bf47b49b7-hwcsx" Oct 11 01:10:43 crc kubenswrapper[4743]: I1011 01:10:43.807978 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/145bdcc2-f62c-4dbf-bdb1-6ced4d65ba3a-ovs-rundir\") pod \"ovn-controller-metrics-fkcc9\" (UID: \"145bdcc2-f62c-4dbf-bdb1-6ced4d65ba3a\") " pod="openstack/ovn-controller-metrics-fkcc9" Oct 11 01:10:43 crc kubenswrapper[4743]: I1011 01:10:43.809436 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09e6a1ce-a4c0-46e2-9bef-0e8a996b069d-config\") pod \"dnsmasq-dns-5bf47b49b7-hwcsx\" (UID: \"09e6a1ce-a4c0-46e2-9bef-0e8a996b069d\") " pod="openstack/dnsmasq-dns-5bf47b49b7-hwcsx" Oct 11 01:10:43 crc kubenswrapper[4743]: I1011 01:10:43.809988 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/09e6a1ce-a4c0-46e2-9bef-0e8a996b069d-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-hwcsx\" (UID: \"09e6a1ce-a4c0-46e2-9bef-0e8a996b069d\") " pod="openstack/dnsmasq-dns-5bf47b49b7-hwcsx" Oct 11 
01:10:43 crc kubenswrapper[4743]: I1011 01:10:43.810479 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/09e6a1ce-a4c0-46e2-9bef-0e8a996b069d-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-hwcsx\" (UID: \"09e6a1ce-a4c0-46e2-9bef-0e8a996b069d\") " pod="openstack/dnsmasq-dns-5bf47b49b7-hwcsx" Oct 11 01:10:43 crc kubenswrapper[4743]: I1011 01:10:43.812499 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 11 01:10:43 crc kubenswrapper[4743]: I1011 01:10:43.816194 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Oct 11 01:10:43 crc kubenswrapper[4743]: I1011 01:10:43.816207 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Oct 11 01:10:43 crc kubenswrapper[4743]: I1011 01:10:43.816387 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Oct 11 01:10:43 crc kubenswrapper[4743]: I1011 01:10:43.816439 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-vq84d" Oct 11 01:10:43 crc kubenswrapper[4743]: I1011 01:10:43.823234 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-mhqx7"] Oct 11 01:10:43 crc kubenswrapper[4743]: I1011 01:10:43.838932 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44kcn\" (UniqueName: \"kubernetes.io/projected/09e6a1ce-a4c0-46e2-9bef-0e8a996b069d-kube-api-access-44kcn\") pod \"dnsmasq-dns-5bf47b49b7-hwcsx\" (UID: \"09e6a1ce-a4c0-46e2-9bef-0e8a996b069d\") " pod="openstack/dnsmasq-dns-5bf47b49b7-hwcsx" Oct 11 01:10:43 crc kubenswrapper[4743]: I1011 01:10:43.839021 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 11 01:10:43 crc kubenswrapper[4743]: I1011 01:10:43.909438 4743 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0d34a438-58ab-4f96-be16-b509c19cfd06-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-mhqx7\" (UID: \"0d34a438-58ab-4f96-be16-b509c19cfd06\") " pod="openstack/dnsmasq-dns-8554648995-mhqx7" Oct 11 01:10:43 crc kubenswrapper[4743]: I1011 01:10:43.909486 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cwdf\" (UniqueName: \"kubernetes.io/projected/145bdcc2-f62c-4dbf-bdb1-6ced4d65ba3a-kube-api-access-6cwdf\") pod \"ovn-controller-metrics-fkcc9\" (UID: \"145bdcc2-f62c-4dbf-bdb1-6ced4d65ba3a\") " pod="openstack/ovn-controller-metrics-fkcc9" Oct 11 01:10:43 crc kubenswrapper[4743]: I1011 01:10:43.909510 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/aae1bfd4-4ee9-4b40-bc1b-241275ef8097-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"aae1bfd4-4ee9-4b40-bc1b-241275ef8097\") " pod="openstack/ovn-northd-0" Oct 11 01:10:43 crc kubenswrapper[4743]: I1011 01:10:43.909528 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gn9h8\" (UniqueName: \"kubernetes.io/projected/aae1bfd4-4ee9-4b40-bc1b-241275ef8097-kube-api-access-gn9h8\") pod \"ovn-northd-0\" (UID: \"aae1bfd4-4ee9-4b40-bc1b-241275ef8097\") " pod="openstack/ovn-northd-0" Oct 11 01:10:43 crc kubenswrapper[4743]: I1011 01:10:43.909553 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pccfh\" (UniqueName: \"kubernetes.io/projected/0d34a438-58ab-4f96-be16-b509c19cfd06-kube-api-access-pccfh\") pod \"dnsmasq-dns-8554648995-mhqx7\" (UID: \"0d34a438-58ab-4f96-be16-b509c19cfd06\") " pod="openstack/dnsmasq-dns-8554648995-mhqx7" Oct 11 01:10:43 crc kubenswrapper[4743]: I1011 
01:10:43.909577 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/145bdcc2-f62c-4dbf-bdb1-6ced4d65ba3a-combined-ca-bundle\") pod \"ovn-controller-metrics-fkcc9\" (UID: \"145bdcc2-f62c-4dbf-bdb1-6ced4d65ba3a\") " pod="openstack/ovn-controller-metrics-fkcc9" Oct 11 01:10:43 crc kubenswrapper[4743]: I1011 01:10:43.909626 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0d34a438-58ab-4f96-be16-b509c19cfd06-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-mhqx7\" (UID: \"0d34a438-58ab-4f96-be16-b509c19cfd06\") " pod="openstack/dnsmasq-dns-8554648995-mhqx7" Oct 11 01:10:43 crc kubenswrapper[4743]: I1011 01:10:43.909649 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aae1bfd4-4ee9-4b40-bc1b-241275ef8097-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"aae1bfd4-4ee9-4b40-bc1b-241275ef8097\") " pod="openstack/ovn-northd-0" Oct 11 01:10:43 crc kubenswrapper[4743]: I1011 01:10:43.909712 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0d34a438-58ab-4f96-be16-b509c19cfd06-dns-svc\") pod \"dnsmasq-dns-8554648995-mhqx7\" (UID: \"0d34a438-58ab-4f96-be16-b509c19cfd06\") " pod="openstack/dnsmasq-dns-8554648995-mhqx7" Oct 11 01:10:43 crc kubenswrapper[4743]: I1011 01:10:43.909780 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aae1bfd4-4ee9-4b40-bc1b-241275ef8097-config\") pod \"ovn-northd-0\" (UID: \"aae1bfd4-4ee9-4b40-bc1b-241275ef8097\") " pod="openstack/ovn-northd-0" Oct 11 01:10:43 crc kubenswrapper[4743]: I1011 01:10:43.909832 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aae1bfd4-4ee9-4b40-bc1b-241275ef8097-scripts\") pod \"ovn-northd-0\" (UID: \"aae1bfd4-4ee9-4b40-bc1b-241275ef8097\") " pod="openstack/ovn-northd-0" Oct 11 01:10:43 crc kubenswrapper[4743]: I1011 01:10:43.909889 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/145bdcc2-f62c-4dbf-bdb1-6ced4d65ba3a-config\") pod \"ovn-controller-metrics-fkcc9\" (UID: \"145bdcc2-f62c-4dbf-bdb1-6ced4d65ba3a\") " pod="openstack/ovn-controller-metrics-fkcc9" Oct 11 01:10:43 crc kubenswrapper[4743]: I1011 01:10:43.909921 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/aae1bfd4-4ee9-4b40-bc1b-241275ef8097-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"aae1bfd4-4ee9-4b40-bc1b-241275ef8097\") " pod="openstack/ovn-northd-0" Oct 11 01:10:43 crc kubenswrapper[4743]: I1011 01:10:43.910022 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/145bdcc2-f62c-4dbf-bdb1-6ced4d65ba3a-ovs-rundir\") pod \"ovn-controller-metrics-fkcc9\" (UID: \"145bdcc2-f62c-4dbf-bdb1-6ced4d65ba3a\") " pod="openstack/ovn-controller-metrics-fkcc9" Oct 11 01:10:43 crc kubenswrapper[4743]: I1011 01:10:43.910044 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/aae1bfd4-4ee9-4b40-bc1b-241275ef8097-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"aae1bfd4-4ee9-4b40-bc1b-241275ef8097\") " pod="openstack/ovn-northd-0" Oct 11 01:10:43 crc kubenswrapper[4743]: I1011 01:10:43.910064 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: 
\"kubernetes.io/host-path/145bdcc2-f62c-4dbf-bdb1-6ced4d65ba3a-ovn-rundir\") pod \"ovn-controller-metrics-fkcc9\" (UID: \"145bdcc2-f62c-4dbf-bdb1-6ced4d65ba3a\") " pod="openstack/ovn-controller-metrics-fkcc9" Oct 11 01:10:43 crc kubenswrapper[4743]: I1011 01:10:43.910373 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/145bdcc2-f62c-4dbf-bdb1-6ced4d65ba3a-ovs-rundir\") pod \"ovn-controller-metrics-fkcc9\" (UID: \"145bdcc2-f62c-4dbf-bdb1-6ced4d65ba3a\") " pod="openstack/ovn-controller-metrics-fkcc9" Oct 11 01:10:43 crc kubenswrapper[4743]: I1011 01:10:43.910083 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d34a438-58ab-4f96-be16-b509c19cfd06-config\") pod \"dnsmasq-dns-8554648995-mhqx7\" (UID: \"0d34a438-58ab-4f96-be16-b509c19cfd06\") " pod="openstack/dnsmasq-dns-8554648995-mhqx7" Oct 11 01:10:43 crc kubenswrapper[4743]: I1011 01:10:43.910507 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/145bdcc2-f62c-4dbf-bdb1-6ced4d65ba3a-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-fkcc9\" (UID: \"145bdcc2-f62c-4dbf-bdb1-6ced4d65ba3a\") " pod="openstack/ovn-controller-metrics-fkcc9" Oct 11 01:10:43 crc kubenswrapper[4743]: I1011 01:10:43.910494 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/145bdcc2-f62c-4dbf-bdb1-6ced4d65ba3a-ovn-rundir\") pod \"ovn-controller-metrics-fkcc9\" (UID: \"145bdcc2-f62c-4dbf-bdb1-6ced4d65ba3a\") " pod="openstack/ovn-controller-metrics-fkcc9" Oct 11 01:10:43 crc kubenswrapper[4743]: I1011 01:10:43.910541 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/145bdcc2-f62c-4dbf-bdb1-6ced4d65ba3a-config\") pod 
\"ovn-controller-metrics-fkcc9\" (UID: \"145bdcc2-f62c-4dbf-bdb1-6ced4d65ba3a\") " pod="openstack/ovn-controller-metrics-fkcc9" Oct 11 01:10:43 crc kubenswrapper[4743]: I1011 01:10:43.912985 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/145bdcc2-f62c-4dbf-bdb1-6ced4d65ba3a-combined-ca-bundle\") pod \"ovn-controller-metrics-fkcc9\" (UID: \"145bdcc2-f62c-4dbf-bdb1-6ced4d65ba3a\") " pod="openstack/ovn-controller-metrics-fkcc9" Oct 11 01:10:43 crc kubenswrapper[4743]: I1011 01:10:43.913297 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/145bdcc2-f62c-4dbf-bdb1-6ced4d65ba3a-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-fkcc9\" (UID: \"145bdcc2-f62c-4dbf-bdb1-6ced4d65ba3a\") " pod="openstack/ovn-controller-metrics-fkcc9" Oct 11 01:10:43 crc kubenswrapper[4743]: I1011 01:10:43.927075 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cwdf\" (UniqueName: \"kubernetes.io/projected/145bdcc2-f62c-4dbf-bdb1-6ced4d65ba3a-kube-api-access-6cwdf\") pod \"ovn-controller-metrics-fkcc9\" (UID: \"145bdcc2-f62c-4dbf-bdb1-6ced4d65ba3a\") " pod="openstack/ovn-controller-metrics-fkcc9" Oct 11 01:10:43 crc kubenswrapper[4743]: E1011 01:10:43.986243 4743 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1a1779f_127f_4ea2_a937_b97f329e3878.slice/crio-conmon-3ca7a025605c798f9dce3f44e26b3545c887fbf30de417079fadcd6d371106ca.scope\": RecentStats: unable to find data in memory cache]" Oct 11 01:10:44 crc kubenswrapper[4743]: I1011 01:10:44.012363 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aae1bfd4-4ee9-4b40-bc1b-241275ef8097-config\") pod \"ovn-northd-0\" (UID: 
\"aae1bfd4-4ee9-4b40-bc1b-241275ef8097\") " pod="openstack/ovn-northd-0" Oct 11 01:10:44 crc kubenswrapper[4743]: I1011 01:10:44.012413 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aae1bfd4-4ee9-4b40-bc1b-241275ef8097-scripts\") pod \"ovn-northd-0\" (UID: \"aae1bfd4-4ee9-4b40-bc1b-241275ef8097\") " pod="openstack/ovn-northd-0" Oct 11 01:10:44 crc kubenswrapper[4743]: I1011 01:10:44.012449 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/aae1bfd4-4ee9-4b40-bc1b-241275ef8097-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"aae1bfd4-4ee9-4b40-bc1b-241275ef8097\") " pod="openstack/ovn-northd-0" Oct 11 01:10:44 crc kubenswrapper[4743]: I1011 01:10:44.012504 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/aae1bfd4-4ee9-4b40-bc1b-241275ef8097-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"aae1bfd4-4ee9-4b40-bc1b-241275ef8097\") " pod="openstack/ovn-northd-0" Oct 11 01:10:44 crc kubenswrapper[4743]: I1011 01:10:44.012531 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d34a438-58ab-4f96-be16-b509c19cfd06-config\") pod \"dnsmasq-dns-8554648995-mhqx7\" (UID: \"0d34a438-58ab-4f96-be16-b509c19cfd06\") " pod="openstack/dnsmasq-dns-8554648995-mhqx7" Oct 11 01:10:44 crc kubenswrapper[4743]: I1011 01:10:44.012571 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0d34a438-58ab-4f96-be16-b509c19cfd06-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-mhqx7\" (UID: \"0d34a438-58ab-4f96-be16-b509c19cfd06\") " pod="openstack/dnsmasq-dns-8554648995-mhqx7" Oct 11 01:10:44 crc kubenswrapper[4743]: I1011 01:10:44.012591 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/aae1bfd4-4ee9-4b40-bc1b-241275ef8097-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"aae1bfd4-4ee9-4b40-bc1b-241275ef8097\") " pod="openstack/ovn-northd-0" Oct 11 01:10:44 crc kubenswrapper[4743]: I1011 01:10:44.012605 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gn9h8\" (UniqueName: \"kubernetes.io/projected/aae1bfd4-4ee9-4b40-bc1b-241275ef8097-kube-api-access-gn9h8\") pod \"ovn-northd-0\" (UID: \"aae1bfd4-4ee9-4b40-bc1b-241275ef8097\") " pod="openstack/ovn-northd-0" Oct 11 01:10:44 crc kubenswrapper[4743]: I1011 01:10:44.012626 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pccfh\" (UniqueName: \"kubernetes.io/projected/0d34a438-58ab-4f96-be16-b509c19cfd06-kube-api-access-pccfh\") pod \"dnsmasq-dns-8554648995-mhqx7\" (UID: \"0d34a438-58ab-4f96-be16-b509c19cfd06\") " pod="openstack/dnsmasq-dns-8554648995-mhqx7" Oct 11 01:10:44 crc kubenswrapper[4743]: I1011 01:10:44.012654 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0d34a438-58ab-4f96-be16-b509c19cfd06-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-mhqx7\" (UID: \"0d34a438-58ab-4f96-be16-b509c19cfd06\") " pod="openstack/dnsmasq-dns-8554648995-mhqx7" Oct 11 01:10:44 crc kubenswrapper[4743]: I1011 01:10:44.012672 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aae1bfd4-4ee9-4b40-bc1b-241275ef8097-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"aae1bfd4-4ee9-4b40-bc1b-241275ef8097\") " pod="openstack/ovn-northd-0" Oct 11 01:10:44 crc kubenswrapper[4743]: I1011 01:10:44.012690 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/0d34a438-58ab-4f96-be16-b509c19cfd06-dns-svc\") pod \"dnsmasq-dns-8554648995-mhqx7\" (UID: \"0d34a438-58ab-4f96-be16-b509c19cfd06\") " pod="openstack/dnsmasq-dns-8554648995-mhqx7" Oct 11 01:10:44 crc kubenswrapper[4743]: I1011 01:10:44.013220 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aae1bfd4-4ee9-4b40-bc1b-241275ef8097-config\") pod \"ovn-northd-0\" (UID: \"aae1bfd4-4ee9-4b40-bc1b-241275ef8097\") " pod="openstack/ovn-northd-0" Oct 11 01:10:44 crc kubenswrapper[4743]: I1011 01:10:44.013502 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0d34a438-58ab-4f96-be16-b509c19cfd06-dns-svc\") pod \"dnsmasq-dns-8554648995-mhqx7\" (UID: \"0d34a438-58ab-4f96-be16-b509c19cfd06\") " pod="openstack/dnsmasq-dns-8554648995-mhqx7" Oct 11 01:10:44 crc kubenswrapper[4743]: I1011 01:10:44.013910 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/aae1bfd4-4ee9-4b40-bc1b-241275ef8097-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"aae1bfd4-4ee9-4b40-bc1b-241275ef8097\") " pod="openstack/ovn-northd-0" Oct 11 01:10:44 crc kubenswrapper[4743]: I1011 01:10:44.014154 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0d34a438-58ab-4f96-be16-b509c19cfd06-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-mhqx7\" (UID: \"0d34a438-58ab-4f96-be16-b509c19cfd06\") " pod="openstack/dnsmasq-dns-8554648995-mhqx7" Oct 11 01:10:44 crc kubenswrapper[4743]: I1011 01:10:44.014299 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d34a438-58ab-4f96-be16-b509c19cfd06-config\") pod \"dnsmasq-dns-8554648995-mhqx7\" (UID: \"0d34a438-58ab-4f96-be16-b509c19cfd06\") " pod="openstack/dnsmasq-dns-8554648995-mhqx7" Oct 11 
01:10:44 crc kubenswrapper[4743]: I1011 01:10:44.014325 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0d34a438-58ab-4f96-be16-b509c19cfd06-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-mhqx7\" (UID: \"0d34a438-58ab-4f96-be16-b509c19cfd06\") " pod="openstack/dnsmasq-dns-8554648995-mhqx7" Oct 11 01:10:44 crc kubenswrapper[4743]: I1011 01:10:44.014987 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aae1bfd4-4ee9-4b40-bc1b-241275ef8097-scripts\") pod \"ovn-northd-0\" (UID: \"aae1bfd4-4ee9-4b40-bc1b-241275ef8097\") " pod="openstack/ovn-northd-0" Oct 11 01:10:44 crc kubenswrapper[4743]: I1011 01:10:44.016644 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/aae1bfd4-4ee9-4b40-bc1b-241275ef8097-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"aae1bfd4-4ee9-4b40-bc1b-241275ef8097\") " pod="openstack/ovn-northd-0" Oct 11 01:10:44 crc kubenswrapper[4743]: I1011 01:10:44.017217 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/aae1bfd4-4ee9-4b40-bc1b-241275ef8097-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"aae1bfd4-4ee9-4b40-bc1b-241275ef8097\") " pod="openstack/ovn-northd-0" Oct 11 01:10:44 crc kubenswrapper[4743]: I1011 01:10:44.030471 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aae1bfd4-4ee9-4b40-bc1b-241275ef8097-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"aae1bfd4-4ee9-4b40-bc1b-241275ef8097\") " pod="openstack/ovn-northd-0" Oct 11 01:10:44 crc kubenswrapper[4743]: I1011 01:10:44.032198 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pccfh\" (UniqueName: 
\"kubernetes.io/projected/0d34a438-58ab-4f96-be16-b509c19cfd06-kube-api-access-pccfh\") pod \"dnsmasq-dns-8554648995-mhqx7\" (UID: \"0d34a438-58ab-4f96-be16-b509c19cfd06\") " pod="openstack/dnsmasq-dns-8554648995-mhqx7" Oct 11 01:10:44 crc kubenswrapper[4743]: I1011 01:10:44.032290 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-fkcc9" Oct 11 01:10:44 crc kubenswrapper[4743]: I1011 01:10:44.033797 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gn9h8\" (UniqueName: \"kubernetes.io/projected/aae1bfd4-4ee9-4b40-bc1b-241275ef8097-kube-api-access-gn9h8\") pod \"ovn-northd-0\" (UID: \"aae1bfd4-4ee9-4b40-bc1b-241275ef8097\") " pod="openstack/ovn-northd-0" Oct 11 01:10:44 crc kubenswrapper[4743]: I1011 01:10:44.136684 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-mhqx7" Oct 11 01:10:44 crc kubenswrapper[4743]: I1011 01:10:44.274570 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 11 01:10:44 crc kubenswrapper[4743]: I1011 01:10:44.423047 4743 generic.go:334] "Generic (PLEG): container finished" podID="e1a1779f-127f-4ea2-a937-b97f329e3878" containerID="3ca7a025605c798f9dce3f44e26b3545c887fbf30de417079fadcd6d371106ca" exitCode=0 Oct 11 01:10:44 crc kubenswrapper[4743]: I1011 01:10:44.423132 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e1a1779f-127f-4ea2-a937-b97f329e3878","Type":"ContainerDied","Data":"3ca7a025605c798f9dce3f44e26b3545c887fbf30de417079fadcd6d371106ca"} Oct 11 01:10:44 crc kubenswrapper[4743]: I1011 01:10:44.423635 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-hwcsx" Oct 11 01:10:44 crc kubenswrapper[4743]: I1011 01:10:44.439505 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-hwcsx" Oct 11 01:10:44 crc kubenswrapper[4743]: I1011 01:10:44.494496 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-fkcc9"] Oct 11 01:10:44 crc kubenswrapper[4743]: W1011 01:10:44.498785 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod145bdcc2_f62c_4dbf_bdb1_6ced4d65ba3a.slice/crio-61f7e4c4eb53681e4a6fd6daca4adb5a4164501145524f32e58ef6079150ebdf WatchSource:0}: Error finding container 61f7e4c4eb53681e4a6fd6daca4adb5a4164501145524f32e58ef6079150ebdf: Status 404 returned error can't find the container with id 61f7e4c4eb53681e4a6fd6daca4adb5a4164501145524f32e58ef6079150ebdf Oct 11 01:10:44 crc kubenswrapper[4743]: I1011 01:10:44.527255 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/09e6a1ce-a4c0-46e2-9bef-0e8a996b069d-ovsdbserver-nb\") pod \"09e6a1ce-a4c0-46e2-9bef-0e8a996b069d\" (UID: \"09e6a1ce-a4c0-46e2-9bef-0e8a996b069d\") " Oct 11 01:10:44 crc kubenswrapper[4743]: I1011 01:10:44.527523 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44kcn\" (UniqueName: \"kubernetes.io/projected/09e6a1ce-a4c0-46e2-9bef-0e8a996b069d-kube-api-access-44kcn\") pod \"09e6a1ce-a4c0-46e2-9bef-0e8a996b069d\" (UID: \"09e6a1ce-a4c0-46e2-9bef-0e8a996b069d\") " Oct 11 01:10:44 crc kubenswrapper[4743]: I1011 01:10:44.527580 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09e6a1ce-a4c0-46e2-9bef-0e8a996b069d-config\") pod \"09e6a1ce-a4c0-46e2-9bef-0e8a996b069d\" (UID: \"09e6a1ce-a4c0-46e2-9bef-0e8a996b069d\") " Oct 11 01:10:44 crc kubenswrapper[4743]: I1011 01:10:44.527660 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/09e6a1ce-a4c0-46e2-9bef-0e8a996b069d-dns-svc\") pod \"09e6a1ce-a4c0-46e2-9bef-0e8a996b069d\" (UID: \"09e6a1ce-a4c0-46e2-9bef-0e8a996b069d\") " Oct 11 01:10:44 crc kubenswrapper[4743]: I1011 01:10:44.528932 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09e6a1ce-a4c0-46e2-9bef-0e8a996b069d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "09e6a1ce-a4c0-46e2-9bef-0e8a996b069d" (UID: "09e6a1ce-a4c0-46e2-9bef-0e8a996b069d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:10:44 crc kubenswrapper[4743]: I1011 01:10:44.529170 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09e6a1ce-a4c0-46e2-9bef-0e8a996b069d-config" (OuterVolumeSpecName: "config") pod "09e6a1ce-a4c0-46e2-9bef-0e8a996b069d" (UID: "09e6a1ce-a4c0-46e2-9bef-0e8a996b069d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:10:44 crc kubenswrapper[4743]: I1011 01:10:44.529282 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09e6a1ce-a4c0-46e2-9bef-0e8a996b069d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "09e6a1ce-a4c0-46e2-9bef-0e8a996b069d" (UID: "09e6a1ce-a4c0-46e2-9bef-0e8a996b069d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:10:44 crc kubenswrapper[4743]: I1011 01:10:44.531994 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09e6a1ce-a4c0-46e2-9bef-0e8a996b069d-kube-api-access-44kcn" (OuterVolumeSpecName: "kube-api-access-44kcn") pod "09e6a1ce-a4c0-46e2-9bef-0e8a996b069d" (UID: "09e6a1ce-a4c0-46e2-9bef-0e8a996b069d"). InnerVolumeSpecName "kube-api-access-44kcn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:10:44 crc kubenswrapper[4743]: I1011 01:10:44.631289 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/09e6a1ce-a4c0-46e2-9bef-0e8a996b069d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 11 01:10:44 crc kubenswrapper[4743]: I1011 01:10:44.631319 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44kcn\" (UniqueName: \"kubernetes.io/projected/09e6a1ce-a4c0-46e2-9bef-0e8a996b069d-kube-api-access-44kcn\") on node \"crc\" DevicePath \"\"" Oct 11 01:10:44 crc kubenswrapper[4743]: I1011 01:10:44.631330 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09e6a1ce-a4c0-46e2-9bef-0e8a996b069d-config\") on node \"crc\" DevicePath \"\"" Oct 11 01:10:44 crc kubenswrapper[4743]: I1011 01:10:44.631339 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/09e6a1ce-a4c0-46e2-9bef-0e8a996b069d-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 11 01:10:44 crc kubenswrapper[4743]: I1011 01:10:44.656634 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-mhqx7"] Oct 11 01:10:44 crc kubenswrapper[4743]: I1011 01:10:44.809584 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 11 01:10:44 crc kubenswrapper[4743]: W1011 01:10:44.816323 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaae1bfd4_4ee9_4b40_bc1b_241275ef8097.slice/crio-b384508d3e8623c21345899e7ae8478fb3ceb4ed9cfc0133e7efbc977374e2e9 WatchSource:0}: Error finding container b384508d3e8623c21345899e7ae8478fb3ceb4ed9cfc0133e7efbc977374e2e9: Status 404 returned error can't find the container with id b384508d3e8623c21345899e7ae8478fb3ceb4ed9cfc0133e7efbc977374e2e9 Oct 11 01:10:44 crc 
kubenswrapper[4743]: I1011 01:10:44.934662 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-mhqx7"] Oct 11 01:10:44 crc kubenswrapper[4743]: I1011 01:10:44.974233 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-zsfms"] Oct 11 01:10:44 crc kubenswrapper[4743]: I1011 01:10:44.979401 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-zsfms" Oct 11 01:10:45 crc kubenswrapper[4743]: I1011 01:10:45.019422 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-zsfms"] Oct 11 01:10:45 crc kubenswrapper[4743]: I1011 01:10:45.025613 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 11 01:10:45 crc kubenswrapper[4743]: I1011 01:10:45.038527 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tg42l\" (UniqueName: \"kubernetes.io/projected/6d320933-71b0-4dd8-abe3-a36a0ab9aa79-kube-api-access-tg42l\") pod \"dnsmasq-dns-b8fbc5445-zsfms\" (UID: \"6d320933-71b0-4dd8-abe3-a36a0ab9aa79\") " pod="openstack/dnsmasq-dns-b8fbc5445-zsfms" Oct 11 01:10:45 crc kubenswrapper[4743]: I1011 01:10:45.038584 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6d320933-71b0-4dd8-abe3-a36a0ab9aa79-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-zsfms\" (UID: \"6d320933-71b0-4dd8-abe3-a36a0ab9aa79\") " pod="openstack/dnsmasq-dns-b8fbc5445-zsfms" Oct 11 01:10:45 crc kubenswrapper[4743]: I1011 01:10:45.038680 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d320933-71b0-4dd8-abe3-a36a0ab9aa79-config\") pod \"dnsmasq-dns-b8fbc5445-zsfms\" (UID: \"6d320933-71b0-4dd8-abe3-a36a0ab9aa79\") " 
pod="openstack/dnsmasq-dns-b8fbc5445-zsfms" Oct 11 01:10:45 crc kubenswrapper[4743]: I1011 01:10:45.038740 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6d320933-71b0-4dd8-abe3-a36a0ab9aa79-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-zsfms\" (UID: \"6d320933-71b0-4dd8-abe3-a36a0ab9aa79\") " pod="openstack/dnsmasq-dns-b8fbc5445-zsfms" Oct 11 01:10:45 crc kubenswrapper[4743]: I1011 01:10:45.038782 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6d320933-71b0-4dd8-abe3-a36a0ab9aa79-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-zsfms\" (UID: \"6d320933-71b0-4dd8-abe3-a36a0ab9aa79\") " pod="openstack/dnsmasq-dns-b8fbc5445-zsfms" Oct 11 01:10:45 crc kubenswrapper[4743]: I1011 01:10:45.140810 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6d320933-71b0-4dd8-abe3-a36a0ab9aa79-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-zsfms\" (UID: \"6d320933-71b0-4dd8-abe3-a36a0ab9aa79\") " pod="openstack/dnsmasq-dns-b8fbc5445-zsfms" Oct 11 01:10:45 crc kubenswrapper[4743]: I1011 01:10:45.141427 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d320933-71b0-4dd8-abe3-a36a0ab9aa79-config\") pod \"dnsmasq-dns-b8fbc5445-zsfms\" (UID: \"6d320933-71b0-4dd8-abe3-a36a0ab9aa79\") " pod="openstack/dnsmasq-dns-b8fbc5445-zsfms" Oct 11 01:10:45 crc kubenswrapper[4743]: I1011 01:10:45.141551 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6d320933-71b0-4dd8-abe3-a36a0ab9aa79-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-zsfms\" (UID: \"6d320933-71b0-4dd8-abe3-a36a0ab9aa79\") " pod="openstack/dnsmasq-dns-b8fbc5445-zsfms" Oct 11 01:10:45 crc 
kubenswrapper[4743]: I1011 01:10:45.141651 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6d320933-71b0-4dd8-abe3-a36a0ab9aa79-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-zsfms\" (UID: \"6d320933-71b0-4dd8-abe3-a36a0ab9aa79\") " pod="openstack/dnsmasq-dns-b8fbc5445-zsfms" Oct 11 01:10:45 crc kubenswrapper[4743]: I1011 01:10:45.141743 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tg42l\" (UniqueName: \"kubernetes.io/projected/6d320933-71b0-4dd8-abe3-a36a0ab9aa79-kube-api-access-tg42l\") pod \"dnsmasq-dns-b8fbc5445-zsfms\" (UID: \"6d320933-71b0-4dd8-abe3-a36a0ab9aa79\") " pod="openstack/dnsmasq-dns-b8fbc5445-zsfms" Oct 11 01:10:45 crc kubenswrapper[4743]: I1011 01:10:45.142837 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d320933-71b0-4dd8-abe3-a36a0ab9aa79-config\") pod \"dnsmasq-dns-b8fbc5445-zsfms\" (UID: \"6d320933-71b0-4dd8-abe3-a36a0ab9aa79\") " pod="openstack/dnsmasq-dns-b8fbc5445-zsfms" Oct 11 01:10:45 crc kubenswrapper[4743]: I1011 01:10:45.142883 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6d320933-71b0-4dd8-abe3-a36a0ab9aa79-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-zsfms\" (UID: \"6d320933-71b0-4dd8-abe3-a36a0ab9aa79\") " pod="openstack/dnsmasq-dns-b8fbc5445-zsfms" Oct 11 01:10:45 crc kubenswrapper[4743]: I1011 01:10:45.142936 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6d320933-71b0-4dd8-abe3-a36a0ab9aa79-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-zsfms\" (UID: \"6d320933-71b0-4dd8-abe3-a36a0ab9aa79\") " pod="openstack/dnsmasq-dns-b8fbc5445-zsfms" Oct 11 01:10:45 crc kubenswrapper[4743]: I1011 01:10:45.143799 4743 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6d320933-71b0-4dd8-abe3-a36a0ab9aa79-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-zsfms\" (UID: \"6d320933-71b0-4dd8-abe3-a36a0ab9aa79\") " pod="openstack/dnsmasq-dns-b8fbc5445-zsfms" Oct 11 01:10:45 crc kubenswrapper[4743]: I1011 01:10:45.187703 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tg42l\" (UniqueName: \"kubernetes.io/projected/6d320933-71b0-4dd8-abe3-a36a0ab9aa79-kube-api-access-tg42l\") pod \"dnsmasq-dns-b8fbc5445-zsfms\" (UID: \"6d320933-71b0-4dd8-abe3-a36a0ab9aa79\") " pod="openstack/dnsmasq-dns-b8fbc5445-zsfms" Oct 11 01:10:45 crc kubenswrapper[4743]: I1011 01:10:45.309620 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-zsfms" Oct 11 01:10:45 crc kubenswrapper[4743]: I1011 01:10:45.432889 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"aae1bfd4-4ee9-4b40-bc1b-241275ef8097","Type":"ContainerStarted","Data":"b384508d3e8623c21345899e7ae8478fb3ceb4ed9cfc0133e7efbc977374e2e9"} Oct 11 01:10:45 crc kubenswrapper[4743]: I1011 01:10:45.435637 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-fkcc9" event={"ID":"145bdcc2-f62c-4dbf-bdb1-6ced4d65ba3a","Type":"ContainerStarted","Data":"9e0015b7cb0da883cfdd25e48440dcde8ab627a4ce7111d843c7b7701bc211bb"} Oct 11 01:10:45 crc kubenswrapper[4743]: I1011 01:10:45.435675 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-fkcc9" event={"ID":"145bdcc2-f62c-4dbf-bdb1-6ced4d65ba3a","Type":"ContainerStarted","Data":"61f7e4c4eb53681e4a6fd6daca4adb5a4164501145524f32e58ef6079150ebdf"} Oct 11 01:10:45 crc kubenswrapper[4743]: I1011 01:10:45.445425 4743 generic.go:334] "Generic (PLEG): container finished" podID="0d34a438-58ab-4f96-be16-b509c19cfd06" 
containerID="e286f4ae34f380f5bc0f550d5d9b10179383f9e41f9998a20e8f4dac025d1cf3" exitCode=0 Oct 11 01:10:45 crc kubenswrapper[4743]: I1011 01:10:45.445886 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-mhqx7" event={"ID":"0d34a438-58ab-4f96-be16-b509c19cfd06","Type":"ContainerDied","Data":"e286f4ae34f380f5bc0f550d5d9b10179383f9e41f9998a20e8f4dac025d1cf3"} Oct 11 01:10:45 crc kubenswrapper[4743]: I1011 01:10:45.445911 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-mhqx7" event={"ID":"0d34a438-58ab-4f96-be16-b509c19cfd06","Type":"ContainerStarted","Data":"a5ba665e164a91ad2935e0f9c16be9b0a6aff76202a2dbec8f083f85cc47590a"} Oct 11 01:10:45 crc kubenswrapper[4743]: I1011 01:10:45.445946 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-hwcsx" Oct 11 01:10:45 crc kubenswrapper[4743]: I1011 01:10:45.473243 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-fkcc9" podStartSLOduration=2.473227206 podStartE2EDuration="2.473227206s" podCreationTimestamp="2025-10-11 01:10:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 01:10:45.469660266 +0000 UTC m=+1140.122640663" watchObservedRunningTime="2025-10-11 01:10:45.473227206 +0000 UTC m=+1140.126207603" Oct 11 01:10:45 crc kubenswrapper[4743]: I1011 01:10:45.588576 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-hwcsx"] Oct 11 01:10:45 crc kubenswrapper[4743]: I1011 01:10:45.605144 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-hwcsx"] Oct 11 01:10:45 crc kubenswrapper[4743]: E1011 01:10:45.709206 4743 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Oct 11 01:10:45 crc kubenswrapper[4743]: rpc error: code = 
Unknown desc = container create failed: mount `/var/lib/kubelet/pods/0d34a438-58ab-4f96-be16-b509c19cfd06/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Oct 11 01:10:45 crc kubenswrapper[4743]: > podSandboxID="a5ba665e164a91ad2935e0f9c16be9b0a6aff76202a2dbec8f083f85cc47590a" Oct 11 01:10:45 crc kubenswrapper[4743]: E1011 01:10:45.709368 4743 kuberuntime_manager.go:1274] "Unhandled Error" err=< Oct 11 01:10:45 crc kubenswrapper[4743]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n654h99h64ch5dbh6dh555h587h64bh5cfh647h5fdh57ch679h9h597h5f5hbch59bh54fh575h566h667h586h5f5h65ch5bch57h68h65ch58bh694h5cfq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-nb,SubPath:ovsdbserver-nb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-sb,SubPath:ovsdbs
erver-sb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pccfh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-8554648995-mhqx7_openstack(0d34a438-58ab-4f96-be16-b509c19cfd06): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/0d34a438-58ab-4f96-be16-b509c19cfd06/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Oct 11 01:10:45 crc kubenswrapper[4743]: > logger="UnhandledError" Oct 11 01:10:45 crc kubenswrapper[4743]: E1011 01:10:45.710485 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container 
create failed: mount `/var/lib/kubelet/pods/0d34a438-58ab-4f96-be16-b509c19cfd06/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-8554648995-mhqx7" podUID="0d34a438-58ab-4f96-be16-b509c19cfd06" Oct 11 01:10:45 crc kubenswrapper[4743]: I1011 01:10:45.931553 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-zsfms"] Oct 11 01:10:46 crc kubenswrapper[4743]: I1011 01:10:46.125982 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09e6a1ce-a4c0-46e2-9bef-0e8a996b069d" path="/var/lib/kubelet/pods/09e6a1ce-a4c0-46e2-9bef-0e8a996b069d/volumes" Oct 11 01:10:46 crc kubenswrapper[4743]: I1011 01:10:46.133981 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Oct 11 01:10:46 crc kubenswrapper[4743]: I1011 01:10:46.153502 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Oct 11 01:10:46 crc kubenswrapper[4743]: I1011 01:10:46.157055 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Oct 11 01:10:46 crc kubenswrapper[4743]: I1011 01:10:46.158105 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Oct 11 01:10:46 crc kubenswrapper[4743]: I1011 01:10:46.159035 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Oct 11 01:10:46 crc kubenswrapper[4743]: I1011 01:10:46.159269 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-2zgrl" Oct 11 01:10:46 crc kubenswrapper[4743]: I1011 01:10:46.185828 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 11 01:10:46 crc kubenswrapper[4743]: I1011 01:10:46.264216 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"082aa898-adc9-4e0d-a5e3-329d36f391aa\") " pod="openstack/swift-storage-0" Oct 11 01:10:46 crc kubenswrapper[4743]: I1011 01:10:46.264367 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/082aa898-adc9-4e0d-a5e3-329d36f391aa-lock\") pod \"swift-storage-0\" (UID: \"082aa898-adc9-4e0d-a5e3-329d36f391aa\") " pod="openstack/swift-storage-0" Oct 11 01:10:46 crc kubenswrapper[4743]: I1011 01:10:46.264393 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nph8\" (UniqueName: \"kubernetes.io/projected/082aa898-adc9-4e0d-a5e3-329d36f391aa-kube-api-access-6nph8\") pod \"swift-storage-0\" (UID: \"082aa898-adc9-4e0d-a5e3-329d36f391aa\") " pod="openstack/swift-storage-0" Oct 11 01:10:46 crc kubenswrapper[4743]: I1011 01:10:46.264424 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/082aa898-adc9-4e0d-a5e3-329d36f391aa-cache\") pod \"swift-storage-0\" (UID: \"082aa898-adc9-4e0d-a5e3-329d36f391aa\") " pod="openstack/swift-storage-0" Oct 11 01:10:46 crc kubenswrapper[4743]: I1011 01:10:46.265201 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/082aa898-adc9-4e0d-a5e3-329d36f391aa-etc-swift\") pod \"swift-storage-0\" (UID: \"082aa898-adc9-4e0d-a5e3-329d36f391aa\") " pod="openstack/swift-storage-0" Oct 11 01:10:46 crc kubenswrapper[4743]: I1011 01:10:46.368953 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/082aa898-adc9-4e0d-a5e3-329d36f391aa-lock\") pod \"swift-storage-0\" (UID: \"082aa898-adc9-4e0d-a5e3-329d36f391aa\") " 
pod="openstack/swift-storage-0" Oct 11 01:10:46 crc kubenswrapper[4743]: I1011 01:10:46.369018 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nph8\" (UniqueName: \"kubernetes.io/projected/082aa898-adc9-4e0d-a5e3-329d36f391aa-kube-api-access-6nph8\") pod \"swift-storage-0\" (UID: \"082aa898-adc9-4e0d-a5e3-329d36f391aa\") " pod="openstack/swift-storage-0" Oct 11 01:10:46 crc kubenswrapper[4743]: I1011 01:10:46.369046 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/082aa898-adc9-4e0d-a5e3-329d36f391aa-cache\") pod \"swift-storage-0\" (UID: \"082aa898-adc9-4e0d-a5e3-329d36f391aa\") " pod="openstack/swift-storage-0" Oct 11 01:10:46 crc kubenswrapper[4743]: I1011 01:10:46.369069 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/082aa898-adc9-4e0d-a5e3-329d36f391aa-etc-swift\") pod \"swift-storage-0\" (UID: \"082aa898-adc9-4e0d-a5e3-329d36f391aa\") " pod="openstack/swift-storage-0" Oct 11 01:10:46 crc kubenswrapper[4743]: I1011 01:10:46.369138 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"082aa898-adc9-4e0d-a5e3-329d36f391aa\") " pod="openstack/swift-storage-0" Oct 11 01:10:46 crc kubenswrapper[4743]: I1011 01:10:46.369501 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/082aa898-adc9-4e0d-a5e3-329d36f391aa-lock\") pod \"swift-storage-0\" (UID: \"082aa898-adc9-4e0d-a5e3-329d36f391aa\") " pod="openstack/swift-storage-0" Oct 11 01:10:46 crc kubenswrapper[4743]: I1011 01:10:46.369534 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"082aa898-adc9-4e0d-a5e3-329d36f391aa\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/swift-storage-0" Oct 11 01:10:46 crc kubenswrapper[4743]: E1011 01:10:46.370827 4743 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 11 01:10:46 crc kubenswrapper[4743]: E1011 01:10:46.370849 4743 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 11 01:10:46 crc kubenswrapper[4743]: E1011 01:10:46.370905 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/082aa898-adc9-4e0d-a5e3-329d36f391aa-etc-swift podName:082aa898-adc9-4e0d-a5e3-329d36f391aa nodeName:}" failed. No retries permitted until 2025-10-11 01:10:46.870889388 +0000 UTC m=+1141.523869785 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/082aa898-adc9-4e0d-a5e3-329d36f391aa-etc-swift") pod "swift-storage-0" (UID: "082aa898-adc9-4e0d-a5e3-329d36f391aa") : configmap "swift-ring-files" not found Oct 11 01:10:46 crc kubenswrapper[4743]: I1011 01:10:46.371065 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/082aa898-adc9-4e0d-a5e3-329d36f391aa-cache\") pod \"swift-storage-0\" (UID: \"082aa898-adc9-4e0d-a5e3-329d36f391aa\") " pod="openstack/swift-storage-0" Oct 11 01:10:46 crc kubenswrapper[4743]: I1011 01:10:46.391895 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nph8\" (UniqueName: \"kubernetes.io/projected/082aa898-adc9-4e0d-a5e3-329d36f391aa-kube-api-access-6nph8\") pod \"swift-storage-0\" (UID: \"082aa898-adc9-4e0d-a5e3-329d36f391aa\") " pod="openstack/swift-storage-0" Oct 11 01:10:46 crc kubenswrapper[4743]: I1011 01:10:46.393048 4743 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"082aa898-adc9-4e0d-a5e3-329d36f391aa\") " pod="openstack/swift-storage-0" Oct 11 01:10:46 crc kubenswrapper[4743]: I1011 01:10:46.455815 4743 generic.go:334] "Generic (PLEG): container finished" podID="6d320933-71b0-4dd8-abe3-a36a0ab9aa79" containerID="4fa13e948631dee90c3ba0e53d0f99c64ccb6434dae2d7716fe3d6995d728301" exitCode=0 Oct 11 01:10:46 crc kubenswrapper[4743]: I1011 01:10:46.455901 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-zsfms" event={"ID":"6d320933-71b0-4dd8-abe3-a36a0ab9aa79","Type":"ContainerDied","Data":"4fa13e948631dee90c3ba0e53d0f99c64ccb6434dae2d7716fe3d6995d728301"} Oct 11 01:10:46 crc kubenswrapper[4743]: I1011 01:10:46.455973 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-zsfms" event={"ID":"6d320933-71b0-4dd8-abe3-a36a0ab9aa79","Type":"ContainerStarted","Data":"f5f7f6a2692056529c4fa8cb328abf623877410cc03169a23f53618d967ac8e5"} Oct 11 01:10:46 crc kubenswrapper[4743]: I1011 01:10:46.796349 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-8qvc2"] Oct 11 01:10:46 crc kubenswrapper[4743]: I1011 01:10:46.798079 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-8qvc2" Oct 11 01:10:46 crc kubenswrapper[4743]: I1011 01:10:46.803984 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Oct 11 01:10:46 crc kubenswrapper[4743]: I1011 01:10:46.804230 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 11 01:10:46 crc kubenswrapper[4743]: I1011 01:10:46.804384 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Oct 11 01:10:46 crc kubenswrapper[4743]: I1011 01:10:46.812385 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-8qvc2"] Oct 11 01:10:46 crc kubenswrapper[4743]: I1011 01:10:46.889243 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mrvc\" (UniqueName: \"kubernetes.io/projected/5718aabd-82b4-4079-96f4-d241fb2c8efc-kube-api-access-4mrvc\") pod \"swift-ring-rebalance-8qvc2\" (UID: \"5718aabd-82b4-4079-96f4-d241fb2c8efc\") " pod="openstack/swift-ring-rebalance-8qvc2" Oct 11 01:10:46 crc kubenswrapper[4743]: I1011 01:10:46.889304 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5718aabd-82b4-4079-96f4-d241fb2c8efc-swiftconf\") pod \"swift-ring-rebalance-8qvc2\" (UID: \"5718aabd-82b4-4079-96f4-d241fb2c8efc\") " pod="openstack/swift-ring-rebalance-8qvc2" Oct 11 01:10:46 crc kubenswrapper[4743]: I1011 01:10:46.889405 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5718aabd-82b4-4079-96f4-d241fb2c8efc-combined-ca-bundle\") pod \"swift-ring-rebalance-8qvc2\" (UID: \"5718aabd-82b4-4079-96f4-d241fb2c8efc\") " pod="openstack/swift-ring-rebalance-8qvc2" Oct 11 01:10:46 crc kubenswrapper[4743]: I1011 
01:10:46.889445 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5718aabd-82b4-4079-96f4-d241fb2c8efc-ring-data-devices\") pod \"swift-ring-rebalance-8qvc2\" (UID: \"5718aabd-82b4-4079-96f4-d241fb2c8efc\") " pod="openstack/swift-ring-rebalance-8qvc2" Oct 11 01:10:46 crc kubenswrapper[4743]: I1011 01:10:46.889475 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5718aabd-82b4-4079-96f4-d241fb2c8efc-etc-swift\") pod \"swift-ring-rebalance-8qvc2\" (UID: \"5718aabd-82b4-4079-96f4-d241fb2c8efc\") " pod="openstack/swift-ring-rebalance-8qvc2" Oct 11 01:10:46 crc kubenswrapper[4743]: I1011 01:10:46.889505 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/082aa898-adc9-4e0d-a5e3-329d36f391aa-etc-swift\") pod \"swift-storage-0\" (UID: \"082aa898-adc9-4e0d-a5e3-329d36f391aa\") " pod="openstack/swift-storage-0" Oct 11 01:10:46 crc kubenswrapper[4743]: I1011 01:10:46.889544 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5718aabd-82b4-4079-96f4-d241fb2c8efc-dispersionconf\") pod \"swift-ring-rebalance-8qvc2\" (UID: \"5718aabd-82b4-4079-96f4-d241fb2c8efc\") " pod="openstack/swift-ring-rebalance-8qvc2" Oct 11 01:10:46 crc kubenswrapper[4743]: I1011 01:10:46.889572 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5718aabd-82b4-4079-96f4-d241fb2c8efc-scripts\") pod \"swift-ring-rebalance-8qvc2\" (UID: \"5718aabd-82b4-4079-96f4-d241fb2c8efc\") " pod="openstack/swift-ring-rebalance-8qvc2" Oct 11 01:10:46 crc kubenswrapper[4743]: E1011 01:10:46.889745 4743 projected.go:288] Couldn't get 
configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 11 01:10:46 crc kubenswrapper[4743]: E1011 01:10:46.889758 4743 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 11 01:10:46 crc kubenswrapper[4743]: E1011 01:10:46.889794 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/082aa898-adc9-4e0d-a5e3-329d36f391aa-etc-swift podName:082aa898-adc9-4e0d-a5e3-329d36f391aa nodeName:}" failed. No retries permitted until 2025-10-11 01:10:47.889779732 +0000 UTC m=+1142.542760129 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/082aa898-adc9-4e0d-a5e3-329d36f391aa-etc-swift") pod "swift-storage-0" (UID: "082aa898-adc9-4e0d-a5e3-329d36f391aa") : configmap "swift-ring-files" not found Oct 11 01:10:46 crc kubenswrapper[4743]: I1011 01:10:46.990664 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5718aabd-82b4-4079-96f4-d241fb2c8efc-combined-ca-bundle\") pod \"swift-ring-rebalance-8qvc2\" (UID: \"5718aabd-82b4-4079-96f4-d241fb2c8efc\") " pod="openstack/swift-ring-rebalance-8qvc2" Oct 11 01:10:46 crc kubenswrapper[4743]: I1011 01:10:46.990982 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5718aabd-82b4-4079-96f4-d241fb2c8efc-ring-data-devices\") pod \"swift-ring-rebalance-8qvc2\" (UID: \"5718aabd-82b4-4079-96f4-d241fb2c8efc\") " pod="openstack/swift-ring-rebalance-8qvc2" Oct 11 01:10:46 crc kubenswrapper[4743]: I1011 01:10:46.991011 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5718aabd-82b4-4079-96f4-d241fb2c8efc-etc-swift\") pod \"swift-ring-rebalance-8qvc2\" (UID: 
\"5718aabd-82b4-4079-96f4-d241fb2c8efc\") " pod="openstack/swift-ring-rebalance-8qvc2" Oct 11 01:10:46 crc kubenswrapper[4743]: I1011 01:10:46.991053 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5718aabd-82b4-4079-96f4-d241fb2c8efc-dispersionconf\") pod \"swift-ring-rebalance-8qvc2\" (UID: \"5718aabd-82b4-4079-96f4-d241fb2c8efc\") " pod="openstack/swift-ring-rebalance-8qvc2" Oct 11 01:10:46 crc kubenswrapper[4743]: I1011 01:10:46.991073 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5718aabd-82b4-4079-96f4-d241fb2c8efc-scripts\") pod \"swift-ring-rebalance-8qvc2\" (UID: \"5718aabd-82b4-4079-96f4-d241fb2c8efc\") " pod="openstack/swift-ring-rebalance-8qvc2" Oct 11 01:10:46 crc kubenswrapper[4743]: I1011 01:10:46.991096 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mrvc\" (UniqueName: \"kubernetes.io/projected/5718aabd-82b4-4079-96f4-d241fb2c8efc-kube-api-access-4mrvc\") pod \"swift-ring-rebalance-8qvc2\" (UID: \"5718aabd-82b4-4079-96f4-d241fb2c8efc\") " pod="openstack/swift-ring-rebalance-8qvc2" Oct 11 01:10:46 crc kubenswrapper[4743]: I1011 01:10:46.991126 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5718aabd-82b4-4079-96f4-d241fb2c8efc-swiftconf\") pod \"swift-ring-rebalance-8qvc2\" (UID: \"5718aabd-82b4-4079-96f4-d241fb2c8efc\") " pod="openstack/swift-ring-rebalance-8qvc2" Oct 11 01:10:46 crc kubenswrapper[4743]: I1011 01:10:46.999579 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5718aabd-82b4-4079-96f4-d241fb2c8efc-scripts\") pod \"swift-ring-rebalance-8qvc2\" (UID: \"5718aabd-82b4-4079-96f4-d241fb2c8efc\") " pod="openstack/swift-ring-rebalance-8qvc2" Oct 11 01:10:47 crc 
kubenswrapper[4743]: I1011 01:10:47.000266 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5718aabd-82b4-4079-96f4-d241fb2c8efc-ring-data-devices\") pod \"swift-ring-rebalance-8qvc2\" (UID: \"5718aabd-82b4-4079-96f4-d241fb2c8efc\") " pod="openstack/swift-ring-rebalance-8qvc2" Oct 11 01:10:47 crc kubenswrapper[4743]: I1011 01:10:47.000533 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5718aabd-82b4-4079-96f4-d241fb2c8efc-etc-swift\") pod \"swift-ring-rebalance-8qvc2\" (UID: \"5718aabd-82b4-4079-96f4-d241fb2c8efc\") " pod="openstack/swift-ring-rebalance-8qvc2" Oct 11 01:10:47 crc kubenswrapper[4743]: I1011 01:10:47.001640 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5718aabd-82b4-4079-96f4-d241fb2c8efc-combined-ca-bundle\") pod \"swift-ring-rebalance-8qvc2\" (UID: \"5718aabd-82b4-4079-96f4-d241fb2c8efc\") " pod="openstack/swift-ring-rebalance-8qvc2" Oct 11 01:10:47 crc kubenswrapper[4743]: I1011 01:10:47.001985 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5718aabd-82b4-4079-96f4-d241fb2c8efc-swiftconf\") pod \"swift-ring-rebalance-8qvc2\" (UID: \"5718aabd-82b4-4079-96f4-d241fb2c8efc\") " pod="openstack/swift-ring-rebalance-8qvc2" Oct 11 01:10:47 crc kubenswrapper[4743]: I1011 01:10:47.017068 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5718aabd-82b4-4079-96f4-d241fb2c8efc-dispersionconf\") pod \"swift-ring-rebalance-8qvc2\" (UID: \"5718aabd-82b4-4079-96f4-d241fb2c8efc\") " pod="openstack/swift-ring-rebalance-8qvc2" Oct 11 01:10:47 crc kubenswrapper[4743]: I1011 01:10:47.025266 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-4mrvc\" (UniqueName: \"kubernetes.io/projected/5718aabd-82b4-4079-96f4-d241fb2c8efc-kube-api-access-4mrvc\") pod \"swift-ring-rebalance-8qvc2\" (UID: \"5718aabd-82b4-4079-96f4-d241fb2c8efc\") " pod="openstack/swift-ring-rebalance-8qvc2" Oct 11 01:10:47 crc kubenswrapper[4743]: I1011 01:10:47.097248 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-mhqx7" Oct 11 01:10:47 crc kubenswrapper[4743]: I1011 01:10:47.134613 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-8qvc2" Oct 11 01:10:47 crc kubenswrapper[4743]: I1011 01:10:47.193612 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d34a438-58ab-4f96-be16-b509c19cfd06-config\") pod \"0d34a438-58ab-4f96-be16-b509c19cfd06\" (UID: \"0d34a438-58ab-4f96-be16-b509c19cfd06\") " Oct 11 01:10:47 crc kubenswrapper[4743]: I1011 01:10:47.193737 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pccfh\" (UniqueName: \"kubernetes.io/projected/0d34a438-58ab-4f96-be16-b509c19cfd06-kube-api-access-pccfh\") pod \"0d34a438-58ab-4f96-be16-b509c19cfd06\" (UID: \"0d34a438-58ab-4f96-be16-b509c19cfd06\") " Oct 11 01:10:47 crc kubenswrapper[4743]: I1011 01:10:47.193916 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0d34a438-58ab-4f96-be16-b509c19cfd06-ovsdbserver-nb\") pod \"0d34a438-58ab-4f96-be16-b509c19cfd06\" (UID: \"0d34a438-58ab-4f96-be16-b509c19cfd06\") " Oct 11 01:10:47 crc kubenswrapper[4743]: I1011 01:10:47.193961 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0d34a438-58ab-4f96-be16-b509c19cfd06-ovsdbserver-sb\") pod \"0d34a438-58ab-4f96-be16-b509c19cfd06\" (UID: 
\"0d34a438-58ab-4f96-be16-b509c19cfd06\") " Oct 11 01:10:47 crc kubenswrapper[4743]: I1011 01:10:47.194004 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0d34a438-58ab-4f96-be16-b509c19cfd06-dns-svc\") pod \"0d34a438-58ab-4f96-be16-b509c19cfd06\" (UID: \"0d34a438-58ab-4f96-be16-b509c19cfd06\") " Oct 11 01:10:47 crc kubenswrapper[4743]: I1011 01:10:47.229091 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d34a438-58ab-4f96-be16-b509c19cfd06-kube-api-access-pccfh" (OuterVolumeSpecName: "kube-api-access-pccfh") pod "0d34a438-58ab-4f96-be16-b509c19cfd06" (UID: "0d34a438-58ab-4f96-be16-b509c19cfd06"). InnerVolumeSpecName "kube-api-access-pccfh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:10:47 crc kubenswrapper[4743]: I1011 01:10:47.286529 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d34a438-58ab-4f96-be16-b509c19cfd06-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0d34a438-58ab-4f96-be16-b509c19cfd06" (UID: "0d34a438-58ab-4f96-be16-b509c19cfd06"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:10:47 crc kubenswrapper[4743]: I1011 01:10:47.299217 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pccfh\" (UniqueName: \"kubernetes.io/projected/0d34a438-58ab-4f96-be16-b509c19cfd06-kube-api-access-pccfh\") on node \"crc\" DevicePath \"\"" Oct 11 01:10:47 crc kubenswrapper[4743]: I1011 01:10:47.299260 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0d34a438-58ab-4f96-be16-b509c19cfd06-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 11 01:10:47 crc kubenswrapper[4743]: I1011 01:10:47.344460 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d34a438-58ab-4f96-be16-b509c19cfd06-config" (OuterVolumeSpecName: "config") pod "0d34a438-58ab-4f96-be16-b509c19cfd06" (UID: "0d34a438-58ab-4f96-be16-b509c19cfd06"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:10:47 crc kubenswrapper[4743]: I1011 01:10:47.352088 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d34a438-58ab-4f96-be16-b509c19cfd06-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0d34a438-58ab-4f96-be16-b509c19cfd06" (UID: "0d34a438-58ab-4f96-be16-b509c19cfd06"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:10:47 crc kubenswrapper[4743]: I1011 01:10:47.388313 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d34a438-58ab-4f96-be16-b509c19cfd06-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0d34a438-58ab-4f96-be16-b509c19cfd06" (UID: "0d34a438-58ab-4f96-be16-b509c19cfd06"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:10:47 crc kubenswrapper[4743]: I1011 01:10:47.400673 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d34a438-58ab-4f96-be16-b509c19cfd06-config\") on node \"crc\" DevicePath \"\"" Oct 11 01:10:47 crc kubenswrapper[4743]: I1011 01:10:47.400703 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0d34a438-58ab-4f96-be16-b509c19cfd06-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 11 01:10:47 crc kubenswrapper[4743]: I1011 01:10:47.400713 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0d34a438-58ab-4f96-be16-b509c19cfd06-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 11 01:10:47 crc kubenswrapper[4743]: I1011 01:10:47.484303 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"aae1bfd4-4ee9-4b40-bc1b-241275ef8097","Type":"ContainerStarted","Data":"0f40ad2064c457ae87d25a4052a420b7099b173fbc4648047c13a34528534642"} Oct 11 01:10:47 crc kubenswrapper[4743]: I1011 01:10:47.484634 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"aae1bfd4-4ee9-4b40-bc1b-241275ef8097","Type":"ContainerStarted","Data":"49a8f673b8ce9c8260913f2ffb8079f0a20e52f2cb8351ca1348a23578bba6d3"} Oct 11 01:10:47 crc kubenswrapper[4743]: I1011 01:10:47.484649 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Oct 11 01:10:47 crc kubenswrapper[4743]: I1011 01:10:47.493681 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-zsfms" event={"ID":"6d320933-71b0-4dd8-abe3-a36a0ab9aa79","Type":"ContainerStarted","Data":"69da59d1eb2dce31de3404affb7b7abe455e1698feead3f4f007cd3c2e08a6eb"} Oct 11 01:10:47 crc kubenswrapper[4743]: I1011 01:10:47.493979 4743 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-zsfms" Oct 11 01:10:47 crc kubenswrapper[4743]: I1011 01:10:47.505714 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-mhqx7" event={"ID":"0d34a438-58ab-4f96-be16-b509c19cfd06","Type":"ContainerDied","Data":"a5ba665e164a91ad2935e0f9c16be9b0a6aff76202a2dbec8f083f85cc47590a"} Oct 11 01:10:47 crc kubenswrapper[4743]: I1011 01:10:47.505757 4743 scope.go:117] "RemoveContainer" containerID="e286f4ae34f380f5bc0f550d5d9b10179383f9e41f9998a20e8f4dac025d1cf3" Oct 11 01:10:47 crc kubenswrapper[4743]: I1011 01:10:47.505939 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-mhqx7" Oct 11 01:10:47 crc kubenswrapper[4743]: I1011 01:10:47.531646 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.769323608 podStartE2EDuration="4.531628284s" podCreationTimestamp="2025-10-11 01:10:43 +0000 UTC" firstStartedPulling="2025-10-11 01:10:44.819818831 +0000 UTC m=+1139.472799238" lastFinishedPulling="2025-10-11 01:10:46.582123517 +0000 UTC m=+1141.235103914" observedRunningTime="2025-10-11 01:10:47.526317759 +0000 UTC m=+1142.179298156" watchObservedRunningTime="2025-10-11 01:10:47.531628284 +0000 UTC m=+1142.184608681" Oct 11 01:10:47 crc kubenswrapper[4743]: I1011 01:10:47.571105 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b8fbc5445-zsfms" podStartSLOduration=3.571089665 podStartE2EDuration="3.571089665s" podCreationTimestamp="2025-10-11 01:10:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 01:10:47.5697279 +0000 UTC m=+1142.222708297" watchObservedRunningTime="2025-10-11 01:10:47.571089665 +0000 UTC m=+1142.224070062" Oct 11 01:10:47 crc 
kubenswrapper[4743]: I1011 01:10:47.619700 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-mhqx7"] Oct 11 01:10:47 crc kubenswrapper[4743]: I1011 01:10:47.628361 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8554648995-mhqx7"] Oct 11 01:10:47 crc kubenswrapper[4743]: I1011 01:10:47.874679 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-8qvc2"] Oct 11 01:10:47 crc kubenswrapper[4743]: I1011 01:10:47.916343 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/082aa898-adc9-4e0d-a5e3-329d36f391aa-etc-swift\") pod \"swift-storage-0\" (UID: \"082aa898-adc9-4e0d-a5e3-329d36f391aa\") " pod="openstack/swift-storage-0" Oct 11 01:10:47 crc kubenswrapper[4743]: E1011 01:10:47.916552 4743 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 11 01:10:47 crc kubenswrapper[4743]: E1011 01:10:47.916577 4743 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 11 01:10:47 crc kubenswrapper[4743]: E1011 01:10:47.916630 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/082aa898-adc9-4e0d-a5e3-329d36f391aa-etc-swift podName:082aa898-adc9-4e0d-a5e3-329d36f391aa nodeName:}" failed. No retries permitted until 2025-10-11 01:10:49.9166107 +0000 UTC m=+1144.569591097 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/082aa898-adc9-4e0d-a5e3-329d36f391aa-etc-swift") pod "swift-storage-0" (UID: "082aa898-adc9-4e0d-a5e3-329d36f391aa") : configmap "swift-ring-files" not found Oct 11 01:10:48 crc kubenswrapper[4743]: I1011 01:10:48.106453 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d34a438-58ab-4f96-be16-b509c19cfd06" path="/var/lib/kubelet/pods/0d34a438-58ab-4f96-be16-b509c19cfd06/volumes" Oct 11 01:10:48 crc kubenswrapper[4743]: I1011 01:10:48.516170 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-8qvc2" event={"ID":"5718aabd-82b4-4079-96f4-d241fb2c8efc","Type":"ContainerStarted","Data":"861503d235bee463affe4060cfeb8c5e7354d7f2776b480bf9e6d6f5f9904df2"} Oct 11 01:10:49 crc kubenswrapper[4743]: I1011 01:10:49.832083 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Oct 11 01:10:49 crc kubenswrapper[4743]: I1011 01:10:49.832375 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Oct 11 01:10:49 crc kubenswrapper[4743]: I1011 01:10:49.873844 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Oct 11 01:10:49 crc kubenswrapper[4743]: I1011 01:10:49.956137 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/082aa898-adc9-4e0d-a5e3-329d36f391aa-etc-swift\") pod \"swift-storage-0\" (UID: \"082aa898-adc9-4e0d-a5e3-329d36f391aa\") " pod="openstack/swift-storage-0" Oct 11 01:10:49 crc kubenswrapper[4743]: E1011 01:10:49.956351 4743 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 11 01:10:49 crc kubenswrapper[4743]: E1011 01:10:49.956382 4743 projected.go:194] Error preparing data for projected volume etc-swift for 
pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 11 01:10:49 crc kubenswrapper[4743]: E1011 01:10:49.956435 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/082aa898-adc9-4e0d-a5e3-329d36f391aa-etc-swift podName:082aa898-adc9-4e0d-a5e3-329d36f391aa nodeName:}" failed. No retries permitted until 2025-10-11 01:10:53.956416884 +0000 UTC m=+1148.609397271 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/082aa898-adc9-4e0d-a5e3-329d36f391aa-etc-swift") pod "swift-storage-0" (UID: "082aa898-adc9-4e0d-a5e3-329d36f391aa") : configmap "swift-ring-files" not found Oct 11 01:10:50 crc kubenswrapper[4743]: I1011 01:10:50.605188 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Oct 11 01:10:51 crc kubenswrapper[4743]: I1011 01:10:51.326543 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Oct 11 01:10:51 crc kubenswrapper[4743]: I1011 01:10:51.326797 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Oct 11 01:10:51 crc kubenswrapper[4743]: I1011 01:10:51.359014 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-wjqbw"] Oct 11 01:10:51 crc kubenswrapper[4743]: E1011 01:10:51.359453 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d34a438-58ab-4f96-be16-b509c19cfd06" containerName="init" Oct 11 01:10:51 crc kubenswrapper[4743]: I1011 01:10:51.359468 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d34a438-58ab-4f96-be16-b509c19cfd06" containerName="init" Oct 11 01:10:51 crc kubenswrapper[4743]: I1011 01:10:51.359668 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d34a438-58ab-4f96-be16-b509c19cfd06" containerName="init" Oct 11 01:10:51 crc kubenswrapper[4743]: I1011 
01:10:51.360400 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-wjqbw" Oct 11 01:10:51 crc kubenswrapper[4743]: I1011 01:10:51.367732 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-wjqbw"] Oct 11 01:10:51 crc kubenswrapper[4743]: I1011 01:10:51.389689 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Oct 11 01:10:51 crc kubenswrapper[4743]: I1011 01:10:51.439604 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-7bff76c5fd-k9d6v" podUID="4e0ffbae-47a1-49cd-93b6-46bc31e2ab74" containerName="console" containerID="cri-o://6024f101ee89d07405f68dd24eff958f2c6395a098f015f88711b3424ab17c14" gracePeriod=15 Oct 11 01:10:51 crc kubenswrapper[4743]: I1011 01:10:51.484282 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwl7k\" (UniqueName: \"kubernetes.io/projected/c6ae2ef5-9e7a-43c5-9772-7bcd53053fb6-kube-api-access-mwl7k\") pod \"keystone-db-create-wjqbw\" (UID: \"c6ae2ef5-9e7a-43c5-9772-7bcd53053fb6\") " pod="openstack/keystone-db-create-wjqbw" Oct 11 01:10:51 crc kubenswrapper[4743]: I1011 01:10:51.586723 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwl7k\" (UniqueName: \"kubernetes.io/projected/c6ae2ef5-9e7a-43c5-9772-7bcd53053fb6-kube-api-access-mwl7k\") pod \"keystone-db-create-wjqbw\" (UID: \"c6ae2ef5-9e7a-43c5-9772-7bcd53053fb6\") " pod="openstack/keystone-db-create-wjqbw" Oct 11 01:10:51 crc kubenswrapper[4743]: I1011 01:10:51.612704 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Oct 11 01:10:51 crc kubenswrapper[4743]: I1011 01:10:51.613702 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwl7k\" (UniqueName: 
\"kubernetes.io/projected/c6ae2ef5-9e7a-43c5-9772-7bcd53053fb6-kube-api-access-mwl7k\") pod \"keystone-db-create-wjqbw\" (UID: \"c6ae2ef5-9e7a-43c5-9772-7bcd53053fb6\") " pod="openstack/keystone-db-create-wjqbw" Oct 11 01:10:51 crc kubenswrapper[4743]: I1011 01:10:51.682269 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-wjqbw" Oct 11 01:10:51 crc kubenswrapper[4743]: I1011 01:10:51.737158 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-jdrfb"] Oct 11 01:10:51 crc kubenswrapper[4743]: I1011 01:10:51.738812 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-jdrfb" Oct 11 01:10:51 crc kubenswrapper[4743]: I1011 01:10:51.756311 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-jdrfb"] Oct 11 01:10:51 crc kubenswrapper[4743]: I1011 01:10:51.791708 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znlvq\" (UniqueName: \"kubernetes.io/projected/4aacb3f4-8c2a-4f30-b51a-c31b9c89cd28-kube-api-access-znlvq\") pod \"placement-db-create-jdrfb\" (UID: \"4aacb3f4-8c2a-4f30-b51a-c31b9c89cd28\") " pod="openstack/placement-db-create-jdrfb" Oct 11 01:10:51 crc kubenswrapper[4743]: I1011 01:10:51.847262 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-r8cj7"] Oct 11 01:10:51 crc kubenswrapper[4743]: I1011 01:10:51.848920 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-r8cj7" Oct 11 01:10:51 crc kubenswrapper[4743]: I1011 01:10:51.856129 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-r8cj7"] Oct 11 01:10:51 crc kubenswrapper[4743]: I1011 01:10:51.894392 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znlvq\" (UniqueName: \"kubernetes.io/projected/4aacb3f4-8c2a-4f30-b51a-c31b9c89cd28-kube-api-access-znlvq\") pod \"placement-db-create-jdrfb\" (UID: \"4aacb3f4-8c2a-4f30-b51a-c31b9c89cd28\") " pod="openstack/placement-db-create-jdrfb" Oct 11 01:10:51 crc kubenswrapper[4743]: I1011 01:10:51.894473 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pg7q\" (UniqueName: \"kubernetes.io/projected/1561cfd4-e0e2-4f2b-ae94-195bc9061df5-kube-api-access-9pg7q\") pod \"glance-db-create-r8cj7\" (UID: \"1561cfd4-e0e2-4f2b-ae94-195bc9061df5\") " pod="openstack/glance-db-create-r8cj7" Oct 11 01:10:51 crc kubenswrapper[4743]: I1011 01:10:51.912749 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znlvq\" (UniqueName: \"kubernetes.io/projected/4aacb3f4-8c2a-4f30-b51a-c31b9c89cd28-kube-api-access-znlvq\") pod \"placement-db-create-jdrfb\" (UID: \"4aacb3f4-8c2a-4f30-b51a-c31b9c89cd28\") " pod="openstack/placement-db-create-jdrfb" Oct 11 01:10:51 crc kubenswrapper[4743]: I1011 01:10:51.996182 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pg7q\" (UniqueName: \"kubernetes.io/projected/1561cfd4-e0e2-4f2b-ae94-195bc9061df5-kube-api-access-9pg7q\") pod \"glance-db-create-r8cj7\" (UID: \"1561cfd4-e0e2-4f2b-ae94-195bc9061df5\") " pod="openstack/glance-db-create-r8cj7" Oct 11 01:10:52 crc kubenswrapper[4743]: I1011 01:10:52.015412 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pg7q\" (UniqueName: 
\"kubernetes.io/projected/1561cfd4-e0e2-4f2b-ae94-195bc9061df5-kube-api-access-9pg7q\") pod \"glance-db-create-r8cj7\" (UID: \"1561cfd4-e0e2-4f2b-ae94-195bc9061df5\") " pod="openstack/glance-db-create-r8cj7" Oct 11 01:10:52 crc kubenswrapper[4743]: I1011 01:10:52.060932 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-jdrfb" Oct 11 01:10:52 crc kubenswrapper[4743]: I1011 01:10:52.167204 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-r8cj7" Oct 11 01:10:52 crc kubenswrapper[4743]: I1011 01:10:52.563963 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7bff76c5fd-k9d6v_4e0ffbae-47a1-49cd-93b6-46bc31e2ab74/console/0.log" Oct 11 01:10:52 crc kubenswrapper[4743]: I1011 01:10:52.563998 4743 generic.go:334] "Generic (PLEG): container finished" podID="4e0ffbae-47a1-49cd-93b6-46bc31e2ab74" containerID="6024f101ee89d07405f68dd24eff958f2c6395a098f015f88711b3424ab17c14" exitCode=2 Oct 11 01:10:52 crc kubenswrapper[4743]: I1011 01:10:52.565059 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7bff76c5fd-k9d6v" event={"ID":"4e0ffbae-47a1-49cd-93b6-46bc31e2ab74","Type":"ContainerDied","Data":"6024f101ee89d07405f68dd24eff958f2c6395a098f015f88711b3424ab17c14"} Oct 11 01:10:54 crc kubenswrapper[4743]: I1011 01:10:54.044975 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/082aa898-adc9-4e0d-a5e3-329d36f391aa-etc-swift\") pod \"swift-storage-0\" (UID: \"082aa898-adc9-4e0d-a5e3-329d36f391aa\") " pod="openstack/swift-storage-0" Oct 11 01:10:54 crc kubenswrapper[4743]: E1011 01:10:54.045163 4743 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 11 01:10:54 crc kubenswrapper[4743]: E1011 01:10:54.045407 4743 projected.go:194] Error preparing data 
for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 11 01:10:54 crc kubenswrapper[4743]: E1011 01:10:54.045476 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/082aa898-adc9-4e0d-a5e3-329d36f391aa-etc-swift podName:082aa898-adc9-4e0d-a5e3-329d36f391aa nodeName:}" failed. No retries permitted until 2025-10-11 01:11:02.045451993 +0000 UTC m=+1156.698432430 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/082aa898-adc9-4e0d-a5e3-329d36f391aa-etc-swift") pod "swift-storage-0" (UID: "082aa898-adc9-4e0d-a5e3-329d36f391aa") : configmap "swift-ring-files" not found Oct 11 01:10:54 crc kubenswrapper[4743]: I1011 01:10:54.868567 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-tf857"] Oct 11 01:10:54 crc kubenswrapper[4743]: I1011 01:10:54.871241 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-tf857" Oct 11 01:10:54 crc kubenswrapper[4743]: I1011 01:10:54.882530 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-tf857"] Oct 11 01:10:54 crc kubenswrapper[4743]: I1011 01:10:54.967668 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79zrn\" (UniqueName: \"kubernetes.io/projected/cad5491d-5b39-42df-b4d5-1d5be912561b-kube-api-access-79zrn\") pod \"mysqld-exporter-openstack-db-create-tf857\" (UID: \"cad5491d-5b39-42df-b4d5-1d5be912561b\") " pod="openstack/mysqld-exporter-openstack-db-create-tf857" Oct 11 01:10:55 crc kubenswrapper[4743]: I1011 01:10:55.070085 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79zrn\" (UniqueName: \"kubernetes.io/projected/cad5491d-5b39-42df-b4d5-1d5be912561b-kube-api-access-79zrn\") pod \"mysqld-exporter-openstack-db-create-tf857\" (UID: \"cad5491d-5b39-42df-b4d5-1d5be912561b\") " pod="openstack/mysqld-exporter-openstack-db-create-tf857" Oct 11 01:10:55 crc kubenswrapper[4743]: I1011 01:10:55.113556 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79zrn\" (UniqueName: \"kubernetes.io/projected/cad5491d-5b39-42df-b4d5-1d5be912561b-kube-api-access-79zrn\") pod \"mysqld-exporter-openstack-db-create-tf857\" (UID: \"cad5491d-5b39-42df-b4d5-1d5be912561b\") " pod="openstack/mysqld-exporter-openstack-db-create-tf857" Oct 11 01:10:55 crc kubenswrapper[4743]: I1011 01:10:55.199373 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-tf857" Oct 11 01:10:55 crc kubenswrapper[4743]: I1011 01:10:55.311156 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b8fbc5445-zsfms" Oct 11 01:10:55 crc kubenswrapper[4743]: I1011 01:10:55.371224 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-xfzvh"] Oct 11 01:10:55 crc kubenswrapper[4743]: I1011 01:10:55.371449 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-xfzvh" podUID="5052563c-4805-4a28-8aec-0fccc620857e" containerName="dnsmasq-dns" containerID="cri-o://8312b49fb0e108d182a24807f43d44a0671d33db69ae4c00e8bf979eb2f3e7c5" gracePeriod=10 Oct 11 01:10:55 crc kubenswrapper[4743]: I1011 01:10:55.616737 4743 generic.go:334] "Generic (PLEG): container finished" podID="5052563c-4805-4a28-8aec-0fccc620857e" containerID="8312b49fb0e108d182a24807f43d44a0671d33db69ae4c00e8bf979eb2f3e7c5" exitCode=0 Oct 11 01:10:55 crc kubenswrapper[4743]: I1011 01:10:55.616779 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-xfzvh" event={"ID":"5052563c-4805-4a28-8aec-0fccc620857e","Type":"ContainerDied","Data":"8312b49fb0e108d182a24807f43d44a0671d33db69ae4c00e8bf979eb2f3e7c5"} Oct 11 01:10:57 crc kubenswrapper[4743]: I1011 01:10:57.280452 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-57d769cc4f-xfzvh" podUID="5052563c-4805-4a28-8aec-0fccc620857e" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.118:5353: connect: connection refused" Oct 11 01:10:59 crc kubenswrapper[4743]: I1011 01:10:59.372711 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Oct 11 01:10:59 crc kubenswrapper[4743]: I1011 01:10:59.651630 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/prometheus-metric-storage-0" event={"ID":"e1a1779f-127f-4ea2-a937-b97f329e3878","Type":"ContainerStarted","Data":"db842a9aa8dd2163fe2dfdb4255214fea18de948432f619b277ea689c5fbc35c"} Oct 11 01:10:59 crc kubenswrapper[4743]: I1011 01:10:59.653613 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-8qvc2" event={"ID":"5718aabd-82b4-4079-96f4-d241fb2c8efc","Type":"ContainerStarted","Data":"445cac7298dd9d2e9edaa51b0dab2950ccf61078b3dfa076b3222d705b958060"} Oct 11 01:10:59 crc kubenswrapper[4743]: I1011 01:10:59.671752 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7bff76c5fd-k9d6v_4e0ffbae-47a1-49cd-93b6-46bc31e2ab74/console/0.log" Oct 11 01:10:59 crc kubenswrapper[4743]: I1011 01:10:59.671826 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7bff76c5fd-k9d6v" Oct 11 01:10:59 crc kubenswrapper[4743]: I1011 01:10:59.677348 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-8qvc2" podStartSLOduration=2.373429989 podStartE2EDuration="13.677330847s" podCreationTimestamp="2025-10-11 01:10:46 +0000 UTC" firstStartedPulling="2025-10-11 01:10:47.885305546 +0000 UTC m=+1142.538285943" lastFinishedPulling="2025-10-11 01:10:59.189206394 +0000 UTC m=+1153.842186801" observedRunningTime="2025-10-11 01:10:59.675391868 +0000 UTC m=+1154.328372285" watchObservedRunningTime="2025-10-11 01:10:59.677330847 +0000 UTC m=+1154.330311244" Oct 11 01:10:59 crc kubenswrapper[4743]: I1011 01:10:59.747439 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-xfzvh" Oct 11 01:10:59 crc kubenswrapper[4743]: I1011 01:10:59.762255 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4e0ffbae-47a1-49cd-93b6-46bc31e2ab74-console-serving-cert\") pod \"4e0ffbae-47a1-49cd-93b6-46bc31e2ab74\" (UID: \"4e0ffbae-47a1-49cd-93b6-46bc31e2ab74\") " Oct 11 01:10:59 crc kubenswrapper[4743]: I1011 01:10:59.762355 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4e0ffbae-47a1-49cd-93b6-46bc31e2ab74-console-config\") pod \"4e0ffbae-47a1-49cd-93b6-46bc31e2ab74\" (UID: \"4e0ffbae-47a1-49cd-93b6-46bc31e2ab74\") " Oct 11 01:10:59 crc kubenswrapper[4743]: I1011 01:10:59.762419 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4e0ffbae-47a1-49cd-93b6-46bc31e2ab74-console-oauth-config\") pod \"4e0ffbae-47a1-49cd-93b6-46bc31e2ab74\" (UID: \"4e0ffbae-47a1-49cd-93b6-46bc31e2ab74\") " Oct 11 01:10:59 crc kubenswrapper[4743]: I1011 01:10:59.762511 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mcjbg\" (UniqueName: \"kubernetes.io/projected/4e0ffbae-47a1-49cd-93b6-46bc31e2ab74-kube-api-access-mcjbg\") pod \"4e0ffbae-47a1-49cd-93b6-46bc31e2ab74\" (UID: \"4e0ffbae-47a1-49cd-93b6-46bc31e2ab74\") " Oct 11 01:10:59 crc kubenswrapper[4743]: I1011 01:10:59.762528 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4e0ffbae-47a1-49cd-93b6-46bc31e2ab74-service-ca\") pod \"4e0ffbae-47a1-49cd-93b6-46bc31e2ab74\" (UID: \"4e0ffbae-47a1-49cd-93b6-46bc31e2ab74\") " Oct 11 01:10:59 crc kubenswrapper[4743]: I1011 01:10:59.762557 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4e0ffbae-47a1-49cd-93b6-46bc31e2ab74-trusted-ca-bundle\") pod \"4e0ffbae-47a1-49cd-93b6-46bc31e2ab74\" (UID: \"4e0ffbae-47a1-49cd-93b6-46bc31e2ab74\") " Oct 11 01:10:59 crc kubenswrapper[4743]: I1011 01:10:59.762614 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4e0ffbae-47a1-49cd-93b6-46bc31e2ab74-oauth-serving-cert\") pod \"4e0ffbae-47a1-49cd-93b6-46bc31e2ab74\" (UID: \"4e0ffbae-47a1-49cd-93b6-46bc31e2ab74\") " Oct 11 01:10:59 crc kubenswrapper[4743]: I1011 01:10:59.766411 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e0ffbae-47a1-49cd-93b6-46bc31e2ab74-console-config" (OuterVolumeSpecName: "console-config") pod "4e0ffbae-47a1-49cd-93b6-46bc31e2ab74" (UID: "4e0ffbae-47a1-49cd-93b6-46bc31e2ab74"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:10:59 crc kubenswrapper[4743]: I1011 01:10:59.775385 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e0ffbae-47a1-49cd-93b6-46bc31e2ab74-service-ca" (OuterVolumeSpecName: "service-ca") pod "4e0ffbae-47a1-49cd-93b6-46bc31e2ab74" (UID: "4e0ffbae-47a1-49cd-93b6-46bc31e2ab74"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:10:59 crc kubenswrapper[4743]: I1011 01:10:59.775742 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e0ffbae-47a1-49cd-93b6-46bc31e2ab74-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "4e0ffbae-47a1-49cd-93b6-46bc31e2ab74" (UID: "4e0ffbae-47a1-49cd-93b6-46bc31e2ab74"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:10:59 crc kubenswrapper[4743]: I1011 01:10:59.776135 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e0ffbae-47a1-49cd-93b6-46bc31e2ab74-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "4e0ffbae-47a1-49cd-93b6-46bc31e2ab74" (UID: "4e0ffbae-47a1-49cd-93b6-46bc31e2ab74"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:10:59 crc kubenswrapper[4743]: I1011 01:10:59.777629 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e0ffbae-47a1-49cd-93b6-46bc31e2ab74-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "4e0ffbae-47a1-49cd-93b6-46bc31e2ab74" (UID: "4e0ffbae-47a1-49cd-93b6-46bc31e2ab74"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:10:59 crc kubenswrapper[4743]: I1011 01:10:59.780483 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e0ffbae-47a1-49cd-93b6-46bc31e2ab74-kube-api-access-mcjbg" (OuterVolumeSpecName: "kube-api-access-mcjbg") pod "4e0ffbae-47a1-49cd-93b6-46bc31e2ab74" (UID: "4e0ffbae-47a1-49cd-93b6-46bc31e2ab74"). InnerVolumeSpecName "kube-api-access-mcjbg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:10:59 crc kubenswrapper[4743]: I1011 01:10:59.799320 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e0ffbae-47a1-49cd-93b6-46bc31e2ab74-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "4e0ffbae-47a1-49cd-93b6-46bc31e2ab74" (UID: "4e0ffbae-47a1-49cd-93b6-46bc31e2ab74"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:10:59 crc kubenswrapper[4743]: I1011 01:10:59.864077 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5052563c-4805-4a28-8aec-0fccc620857e-dns-svc\") pod \"5052563c-4805-4a28-8aec-0fccc620857e\" (UID: \"5052563c-4805-4a28-8aec-0fccc620857e\") " Oct 11 01:10:59 crc kubenswrapper[4743]: I1011 01:10:59.864325 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8qbm\" (UniqueName: \"kubernetes.io/projected/5052563c-4805-4a28-8aec-0fccc620857e-kube-api-access-d8qbm\") pod \"5052563c-4805-4a28-8aec-0fccc620857e\" (UID: \"5052563c-4805-4a28-8aec-0fccc620857e\") " Oct 11 01:10:59 crc kubenswrapper[4743]: I1011 01:10:59.864543 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5052563c-4805-4a28-8aec-0fccc620857e-config\") pod \"5052563c-4805-4a28-8aec-0fccc620857e\" (UID: \"5052563c-4805-4a28-8aec-0fccc620857e\") " Oct 11 01:10:59 crc kubenswrapper[4743]: I1011 01:10:59.864961 4743 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4e0ffbae-47a1-49cd-93b6-46bc31e2ab74-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 11 01:10:59 crc kubenswrapper[4743]: I1011 01:10:59.864972 4743 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4e0ffbae-47a1-49cd-93b6-46bc31e2ab74-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 11 01:10:59 crc kubenswrapper[4743]: I1011 01:10:59.864981 4743 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4e0ffbae-47a1-49cd-93b6-46bc31e2ab74-console-config\") on node \"crc\" DevicePath \"\"" Oct 11 01:10:59 crc kubenswrapper[4743]: I1011 01:10:59.864989 4743 
reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4e0ffbae-47a1-49cd-93b6-46bc31e2ab74-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 11 01:10:59 crc kubenswrapper[4743]: I1011 01:10:59.864997 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mcjbg\" (UniqueName: \"kubernetes.io/projected/4e0ffbae-47a1-49cd-93b6-46bc31e2ab74-kube-api-access-mcjbg\") on node \"crc\" DevicePath \"\"" Oct 11 01:10:59 crc kubenswrapper[4743]: I1011 01:10:59.865006 4743 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4e0ffbae-47a1-49cd-93b6-46bc31e2ab74-service-ca\") on node \"crc\" DevicePath \"\"" Oct 11 01:10:59 crc kubenswrapper[4743]: I1011 01:10:59.865013 4743 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4e0ffbae-47a1-49cd-93b6-46bc31e2ab74-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 01:10:59 crc kubenswrapper[4743]: I1011 01:10:59.868841 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5052563c-4805-4a28-8aec-0fccc620857e-kube-api-access-d8qbm" (OuterVolumeSpecName: "kube-api-access-d8qbm") pod "5052563c-4805-4a28-8aec-0fccc620857e" (UID: "5052563c-4805-4a28-8aec-0fccc620857e"). InnerVolumeSpecName "kube-api-access-d8qbm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:10:59 crc kubenswrapper[4743]: I1011 01:10:59.918988 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5052563c-4805-4a28-8aec-0fccc620857e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5052563c-4805-4a28-8aec-0fccc620857e" (UID: "5052563c-4805-4a28-8aec-0fccc620857e"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:10:59 crc kubenswrapper[4743]: I1011 01:10:59.932284 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5052563c-4805-4a28-8aec-0fccc620857e-config" (OuterVolumeSpecName: "config") pod "5052563c-4805-4a28-8aec-0fccc620857e" (UID: "5052563c-4805-4a28-8aec-0fccc620857e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:10:59 crc kubenswrapper[4743]: I1011 01:10:59.966916 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5052563c-4805-4a28-8aec-0fccc620857e-config\") on node \"crc\" DevicePath \"\"" Oct 11 01:10:59 crc kubenswrapper[4743]: I1011 01:10:59.966942 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5052563c-4805-4a28-8aec-0fccc620857e-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 11 01:10:59 crc kubenswrapper[4743]: I1011 01:10:59.966952 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8qbm\" (UniqueName: \"kubernetes.io/projected/5052563c-4805-4a28-8aec-0fccc620857e-kube-api-access-d8qbm\") on node \"crc\" DevicePath \"\"" Oct 11 01:11:00 crc kubenswrapper[4743]: I1011 01:11:00.071410 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-jdrfb"] Oct 11 01:11:00 crc kubenswrapper[4743]: I1011 01:11:00.106791 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-r8cj7"] Oct 11 01:11:00 crc kubenswrapper[4743]: I1011 01:11:00.106826 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-wjqbw"] Oct 11 01:11:00 crc kubenswrapper[4743]: W1011 01:11:00.115392 4743 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc6ae2ef5_9e7a_43c5_9772_7bcd53053fb6.slice/crio-7f214444a67ee9df170896caae8886e58c1e345cb2156e5115cc7c25aeb645aa WatchSource:0}: Error finding container 7f214444a67ee9df170896caae8886e58c1e345cb2156e5115cc7c25aeb645aa: Status 404 returned error can't find the container with id 7f214444a67ee9df170896caae8886e58c1e345cb2156e5115cc7c25aeb645aa Oct 11 01:11:00 crc kubenswrapper[4743]: I1011 01:11:00.192000 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-tf857"] Oct 11 01:11:00 crc kubenswrapper[4743]: I1011 01:11:00.682337 4743 generic.go:334] "Generic (PLEG): container finished" podID="cad5491d-5b39-42df-b4d5-1d5be912561b" containerID="a79a5eec1c20e5887958d3732820b9542f9c0915ad81786cf52d1f0ae29650c8" exitCode=0 Oct 11 01:11:00 crc kubenswrapper[4743]: I1011 01:11:00.682667 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-tf857" event={"ID":"cad5491d-5b39-42df-b4d5-1d5be912561b","Type":"ContainerDied","Data":"a79a5eec1c20e5887958d3732820b9542f9c0915ad81786cf52d1f0ae29650c8"} Oct 11 01:11:00 crc kubenswrapper[4743]: I1011 01:11:00.682694 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-tf857" event={"ID":"cad5491d-5b39-42df-b4d5-1d5be912561b","Type":"ContainerStarted","Data":"44d97d0a43494012021f17376978e4ae83f5942d5868e30d2e5ded35ba13f3b6"} Oct 11 01:11:00 crc kubenswrapper[4743]: I1011 01:11:00.692652 4743 generic.go:334] "Generic (PLEG): container finished" podID="1561cfd4-e0e2-4f2b-ae94-195bc9061df5" containerID="c11ea7ff21062154fb575deb299c21943cc474dfb337c895f15ff7a8b32750e0" exitCode=0 Oct 11 01:11:00 crc kubenswrapper[4743]: I1011 01:11:00.692717 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-r8cj7" 
event={"ID":"1561cfd4-e0e2-4f2b-ae94-195bc9061df5","Type":"ContainerDied","Data":"c11ea7ff21062154fb575deb299c21943cc474dfb337c895f15ff7a8b32750e0"} Oct 11 01:11:00 crc kubenswrapper[4743]: I1011 01:11:00.692742 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-r8cj7" event={"ID":"1561cfd4-e0e2-4f2b-ae94-195bc9061df5","Type":"ContainerStarted","Data":"c0ce939433fe2d1f3abdfc03a14a5675ebe4d49357491e78651cfd47c8706312"} Oct 11 01:11:00 crc kubenswrapper[4743]: I1011 01:11:00.707247 4743 generic.go:334] "Generic (PLEG): container finished" podID="4aacb3f4-8c2a-4f30-b51a-c31b9c89cd28" containerID="0aed64549f096d70a12d4f364b00c9964a0086b84b5f5073f58af45f615622b3" exitCode=0 Oct 11 01:11:00 crc kubenswrapper[4743]: I1011 01:11:00.707330 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-jdrfb" event={"ID":"4aacb3f4-8c2a-4f30-b51a-c31b9c89cd28","Type":"ContainerDied","Data":"0aed64549f096d70a12d4f364b00c9964a0086b84b5f5073f58af45f615622b3"} Oct 11 01:11:00 crc kubenswrapper[4743]: I1011 01:11:00.707354 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-jdrfb" event={"ID":"4aacb3f4-8c2a-4f30-b51a-c31b9c89cd28","Type":"ContainerStarted","Data":"84d8c011f36d4444244f76b033854ac0cbae942eb5a455c3e28caa51414a29db"} Oct 11 01:11:00 crc kubenswrapper[4743]: I1011 01:11:00.727806 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7bff76c5fd-k9d6v_4e0ffbae-47a1-49cd-93b6-46bc31e2ab74/console/0.log" Oct 11 01:11:00 crc kubenswrapper[4743]: I1011 01:11:00.727946 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7bff76c5fd-k9d6v" event={"ID":"4e0ffbae-47a1-49cd-93b6-46bc31e2ab74","Type":"ContainerDied","Data":"a5bc4ecfb2ebfb4aad21b01d41c27d8cc141715c024bd41d07f8cdc8a60f0fbe"} Oct 11 01:11:00 crc kubenswrapper[4743]: I1011 01:11:00.727988 4743 scope.go:117] "RemoveContainer" 
containerID="6024f101ee89d07405f68dd24eff958f2c6395a098f015f88711b3424ab17c14" Oct 11 01:11:00 crc kubenswrapper[4743]: I1011 01:11:00.728128 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7bff76c5fd-k9d6v" Oct 11 01:11:00 crc kubenswrapper[4743]: I1011 01:11:00.751375 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-xfzvh" event={"ID":"5052563c-4805-4a28-8aec-0fccc620857e","Type":"ContainerDied","Data":"3dab0e2f837f1f65cc09aad53bc4cf8fb95632c60cc835f8142635d86949d13d"} Oct 11 01:11:00 crc kubenswrapper[4743]: I1011 01:11:00.751483 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-xfzvh" Oct 11 01:11:00 crc kubenswrapper[4743]: I1011 01:11:00.769289 4743 generic.go:334] "Generic (PLEG): container finished" podID="c6ae2ef5-9e7a-43c5-9772-7bcd53053fb6" containerID="9151a6ec813d0d3f00909a9b7ac2b314eef94a9d2ab2b385b47ee8ee09ea3695" exitCode=0 Oct 11 01:11:00 crc kubenswrapper[4743]: I1011 01:11:00.769712 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-wjqbw" event={"ID":"c6ae2ef5-9e7a-43c5-9772-7bcd53053fb6","Type":"ContainerDied","Data":"9151a6ec813d0d3f00909a9b7ac2b314eef94a9d2ab2b385b47ee8ee09ea3695"} Oct 11 01:11:00 crc kubenswrapper[4743]: I1011 01:11:00.769812 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-wjqbw" event={"ID":"c6ae2ef5-9e7a-43c5-9772-7bcd53053fb6","Type":"ContainerStarted","Data":"7f214444a67ee9df170896caae8886e58c1e345cb2156e5115cc7c25aeb645aa"} Oct 11 01:11:00 crc kubenswrapper[4743]: I1011 01:11:00.806590 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7bff76c5fd-k9d6v"] Oct 11 01:11:00 crc kubenswrapper[4743]: I1011 01:11:00.813255 4743 scope.go:117] "RemoveContainer" containerID="8312b49fb0e108d182a24807f43d44a0671d33db69ae4c00e8bf979eb2f3e7c5" 
Oct 11 01:11:00 crc kubenswrapper[4743]: I1011 01:11:00.815385 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7bff76c5fd-k9d6v"] Oct 11 01:11:00 crc kubenswrapper[4743]: I1011 01:11:00.823683 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-xfzvh"] Oct 11 01:11:00 crc kubenswrapper[4743]: I1011 01:11:00.846204 4743 scope.go:117] "RemoveContainer" containerID="b5674495de013435f6d4143555e6d0f21c6e2bd5bb78edf50529a27e3c8fe754" Oct 11 01:11:00 crc kubenswrapper[4743]: I1011 01:11:00.860226 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-xfzvh"] Oct 11 01:11:02 crc kubenswrapper[4743]: I1011 01:11:02.107823 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e0ffbae-47a1-49cd-93b6-46bc31e2ab74" path="/var/lib/kubelet/pods/4e0ffbae-47a1-49cd-93b6-46bc31e2ab74/volumes" Oct 11 01:11:02 crc kubenswrapper[4743]: I1011 01:11:02.109217 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5052563c-4805-4a28-8aec-0fccc620857e" path="/var/lib/kubelet/pods/5052563c-4805-4a28-8aec-0fccc620857e/volumes" Oct 11 01:11:02 crc kubenswrapper[4743]: I1011 01:11:02.114571 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/082aa898-adc9-4e0d-a5e3-329d36f391aa-etc-swift\") pod \"swift-storage-0\" (UID: \"082aa898-adc9-4e0d-a5e3-329d36f391aa\") " pod="openstack/swift-storage-0" Oct 11 01:11:02 crc kubenswrapper[4743]: E1011 01:11:02.114776 4743 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 11 01:11:02 crc kubenswrapper[4743]: E1011 01:11:02.114806 4743 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 11 01:11:02 crc kubenswrapper[4743]: E1011 01:11:02.114939 4743 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/082aa898-adc9-4e0d-a5e3-329d36f391aa-etc-swift podName:082aa898-adc9-4e0d-a5e3-329d36f391aa nodeName:}" failed. No retries permitted until 2025-10-11 01:11:18.114848201 +0000 UTC m=+1172.767828598 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/082aa898-adc9-4e0d-a5e3-329d36f391aa-etc-swift") pod "swift-storage-0" (UID: "082aa898-adc9-4e0d-a5e3-329d36f391aa") : configmap "swift-ring-files" not found Oct 11 01:11:02 crc kubenswrapper[4743]: I1011 01:11:02.260499 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-tf857" Oct 11 01:11:02 crc kubenswrapper[4743]: I1011 01:11:02.317429 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-79zrn\" (UniqueName: \"kubernetes.io/projected/cad5491d-5b39-42df-b4d5-1d5be912561b-kube-api-access-79zrn\") pod \"cad5491d-5b39-42df-b4d5-1d5be912561b\" (UID: \"cad5491d-5b39-42df-b4d5-1d5be912561b\") " Oct 11 01:11:02 crc kubenswrapper[4743]: I1011 01:11:02.332239 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cad5491d-5b39-42df-b4d5-1d5be912561b-kube-api-access-79zrn" (OuterVolumeSpecName: "kube-api-access-79zrn") pod "cad5491d-5b39-42df-b4d5-1d5be912561b" (UID: "cad5491d-5b39-42df-b4d5-1d5be912561b"). InnerVolumeSpecName "kube-api-access-79zrn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:11:02 crc kubenswrapper[4743]: I1011 01:11:02.419452 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-79zrn\" (UniqueName: \"kubernetes.io/projected/cad5491d-5b39-42df-b4d5-1d5be912561b-kube-api-access-79zrn\") on node \"crc\" DevicePath \"\"" Oct 11 01:11:02 crc kubenswrapper[4743]: I1011 01:11:02.465171 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-wjqbw" Oct 11 01:11:02 crc kubenswrapper[4743]: I1011 01:11:02.474280 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-jdrfb" Oct 11 01:11:02 crc kubenswrapper[4743]: I1011 01:11:02.482893 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-r8cj7" Oct 11 01:11:02 crc kubenswrapper[4743]: I1011 01:11:02.520596 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9pg7q\" (UniqueName: \"kubernetes.io/projected/1561cfd4-e0e2-4f2b-ae94-195bc9061df5-kube-api-access-9pg7q\") pod \"1561cfd4-e0e2-4f2b-ae94-195bc9061df5\" (UID: \"1561cfd4-e0e2-4f2b-ae94-195bc9061df5\") " Oct 11 01:11:02 crc kubenswrapper[4743]: I1011 01:11:02.520830 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mwl7k\" (UniqueName: \"kubernetes.io/projected/c6ae2ef5-9e7a-43c5-9772-7bcd53053fb6-kube-api-access-mwl7k\") pod \"c6ae2ef5-9e7a-43c5-9772-7bcd53053fb6\" (UID: \"c6ae2ef5-9e7a-43c5-9772-7bcd53053fb6\") " Oct 11 01:11:02 crc kubenswrapper[4743]: I1011 01:11:02.520944 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znlvq\" (UniqueName: \"kubernetes.io/projected/4aacb3f4-8c2a-4f30-b51a-c31b9c89cd28-kube-api-access-znlvq\") pod \"4aacb3f4-8c2a-4f30-b51a-c31b9c89cd28\" (UID: \"4aacb3f4-8c2a-4f30-b51a-c31b9c89cd28\") " Oct 11 01:11:02 crc kubenswrapper[4743]: I1011 01:11:02.524433 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4aacb3f4-8c2a-4f30-b51a-c31b9c89cd28-kube-api-access-znlvq" (OuterVolumeSpecName: "kube-api-access-znlvq") pod "4aacb3f4-8c2a-4f30-b51a-c31b9c89cd28" (UID: "4aacb3f4-8c2a-4f30-b51a-c31b9c89cd28"). InnerVolumeSpecName "kube-api-access-znlvq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:11:02 crc kubenswrapper[4743]: I1011 01:11:02.524546 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6ae2ef5-9e7a-43c5-9772-7bcd53053fb6-kube-api-access-mwl7k" (OuterVolumeSpecName: "kube-api-access-mwl7k") pod "c6ae2ef5-9e7a-43c5-9772-7bcd53053fb6" (UID: "c6ae2ef5-9e7a-43c5-9772-7bcd53053fb6"). InnerVolumeSpecName "kube-api-access-mwl7k". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:11:02 crc kubenswrapper[4743]: I1011 01:11:02.524624 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1561cfd4-e0e2-4f2b-ae94-195bc9061df5-kube-api-access-9pg7q" (OuterVolumeSpecName: "kube-api-access-9pg7q") pod "1561cfd4-e0e2-4f2b-ae94-195bc9061df5" (UID: "1561cfd4-e0e2-4f2b-ae94-195bc9061df5"). InnerVolumeSpecName "kube-api-access-9pg7q". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:11:02 crc kubenswrapper[4743]: I1011 01:11:02.622544 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mwl7k\" (UniqueName: \"kubernetes.io/projected/c6ae2ef5-9e7a-43c5-9772-7bcd53053fb6-kube-api-access-mwl7k\") on node \"crc\" DevicePath \"\"" Oct 11 01:11:02 crc kubenswrapper[4743]: I1011 01:11:02.622911 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-znlvq\" (UniqueName: \"kubernetes.io/projected/4aacb3f4-8c2a-4f30-b51a-c31b9c89cd28-kube-api-access-znlvq\") on node \"crc\" DevicePath \"\"" Oct 11 01:11:02 crc kubenswrapper[4743]: I1011 01:11:02.622979 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9pg7q\" (UniqueName: \"kubernetes.io/projected/1561cfd4-e0e2-4f2b-ae94-195bc9061df5-kube-api-access-9pg7q\") on node \"crc\" DevicePath \"\"" Oct 11 01:11:02 crc kubenswrapper[4743]: I1011 01:11:02.790906 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-wjqbw" 
event={"ID":"c6ae2ef5-9e7a-43c5-9772-7bcd53053fb6","Type":"ContainerDied","Data":"7f214444a67ee9df170896caae8886e58c1e345cb2156e5115cc7c25aeb645aa"} Oct 11 01:11:02 crc kubenswrapper[4743]: I1011 01:11:02.790953 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f214444a67ee9df170896caae8886e58c1e345cb2156e5115cc7c25aeb645aa" Oct 11 01:11:02 crc kubenswrapper[4743]: I1011 01:11:02.791007 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-wjqbw" Oct 11 01:11:02 crc kubenswrapper[4743]: I1011 01:11:02.792777 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-tf857" event={"ID":"cad5491d-5b39-42df-b4d5-1d5be912561b","Type":"ContainerDied","Data":"44d97d0a43494012021f17376978e4ae83f5942d5868e30d2e5ded35ba13f3b6"} Oct 11 01:11:02 crc kubenswrapper[4743]: I1011 01:11:02.792799 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44d97d0a43494012021f17376978e4ae83f5942d5868e30d2e5ded35ba13f3b6" Oct 11 01:11:02 crc kubenswrapper[4743]: I1011 01:11:02.792983 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-tf857" Oct 11 01:11:02 crc kubenswrapper[4743]: I1011 01:11:02.794565 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-r8cj7" Oct 11 01:11:02 crc kubenswrapper[4743]: I1011 01:11:02.794590 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-r8cj7" event={"ID":"1561cfd4-e0e2-4f2b-ae94-195bc9061df5","Type":"ContainerDied","Data":"c0ce939433fe2d1f3abdfc03a14a5675ebe4d49357491e78651cfd47c8706312"} Oct 11 01:11:02 crc kubenswrapper[4743]: I1011 01:11:02.794626 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c0ce939433fe2d1f3abdfc03a14a5675ebe4d49357491e78651cfd47c8706312" Oct 11 01:11:02 crc kubenswrapper[4743]: I1011 01:11:02.796190 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-jdrfb" event={"ID":"4aacb3f4-8c2a-4f30-b51a-c31b9c89cd28","Type":"ContainerDied","Data":"84d8c011f36d4444244f76b033854ac0cbae942eb5a455c3e28caa51414a29db"} Oct 11 01:11:02 crc kubenswrapper[4743]: I1011 01:11:02.796214 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="84d8c011f36d4444244f76b033854ac0cbae942eb5a455c3e28caa51414a29db" Oct 11 01:11:02 crc kubenswrapper[4743]: I1011 01:11:02.796246 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-jdrfb" Oct 11 01:11:02 crc kubenswrapper[4743]: I1011 01:11:02.806341 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e1a1779f-127f-4ea2-a937-b97f329e3878","Type":"ContainerStarted","Data":"de4931ad2988f95a60457da9c6bacff7ad35a2cfe3fe6d732a283229eb8c32bf"} Oct 11 01:11:05 crc kubenswrapper[4743]: I1011 01:11:05.834713 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e1a1779f-127f-4ea2-a937-b97f329e3878","Type":"ContainerStarted","Data":"b10dd53f96ee32ccfd746850af138144705d93b07074a54ed230a878ff2363c1"} Oct 11 01:11:05 crc kubenswrapper[4743]: I1011 01:11:05.860725 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=10.10189009 podStartE2EDuration="51.860709364s" podCreationTimestamp="2025-10-11 01:10:14 +0000 UTC" firstStartedPulling="2025-10-11 01:10:23.476487525 +0000 UTC m=+1118.129467922" lastFinishedPulling="2025-10-11 01:11:05.235306799 +0000 UTC m=+1159.888287196" observedRunningTime="2025-10-11 01:11:05.856290182 +0000 UTC m=+1160.509270579" watchObservedRunningTime="2025-10-11 01:11:05.860709364 +0000 UTC m=+1160.513689761" Oct 11 01:11:06 crc kubenswrapper[4743]: I1011 01:11:06.308192 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Oct 11 01:11:06 crc kubenswrapper[4743]: I1011 01:11:06.845849 4743 generic.go:334] "Generic (PLEG): container finished" podID="5718aabd-82b4-4079-96f4-d241fb2c8efc" containerID="445cac7298dd9d2e9edaa51b0dab2950ccf61078b3dfa076b3222d705b958060" exitCode=0 Oct 11 01:11:06 crc kubenswrapper[4743]: I1011 01:11:06.845952 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-8qvc2" 
event={"ID":"5718aabd-82b4-4079-96f4-d241fb2c8efc","Type":"ContainerDied","Data":"445cac7298dd9d2e9edaa51b0dab2950ccf61078b3dfa076b3222d705b958060"} Oct 11 01:11:07 crc kubenswrapper[4743]: I1011 01:11:07.861491 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-mwtxs" podUID="1ab33e99-afb5-4b67-89bd-a2eb540bf194" containerName="ovn-controller" probeResult="failure" output=< Oct 11 01:11:07 crc kubenswrapper[4743]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 11 01:11:07 crc kubenswrapper[4743]: > Oct 11 01:11:07 crc kubenswrapper[4743]: I1011 01:11:07.949646 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-6g6xb" Oct 11 01:11:07 crc kubenswrapper[4743]: I1011 01:11:07.960870 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-6g6xb" Oct 11 01:11:08 crc kubenswrapper[4743]: I1011 01:11:08.207906 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-mwtxs-config-cngsm"] Oct 11 01:11:08 crc kubenswrapper[4743]: E1011 01:11:08.208778 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4aacb3f4-8c2a-4f30-b51a-c31b9c89cd28" containerName="mariadb-database-create" Oct 11 01:11:08 crc kubenswrapper[4743]: I1011 01:11:08.208818 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="4aacb3f4-8c2a-4f30-b51a-c31b9c89cd28" containerName="mariadb-database-create" Oct 11 01:11:08 crc kubenswrapper[4743]: E1011 01:11:08.208836 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6ae2ef5-9e7a-43c5-9772-7bcd53053fb6" containerName="mariadb-database-create" Oct 11 01:11:08 crc kubenswrapper[4743]: I1011 01:11:08.208844 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6ae2ef5-9e7a-43c5-9772-7bcd53053fb6" containerName="mariadb-database-create" Oct 11 01:11:08 crc kubenswrapper[4743]: E1011 01:11:08.208893 
4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5052563c-4805-4a28-8aec-0fccc620857e" containerName="init" Oct 11 01:11:08 crc kubenswrapper[4743]: I1011 01:11:08.208904 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="5052563c-4805-4a28-8aec-0fccc620857e" containerName="init" Oct 11 01:11:08 crc kubenswrapper[4743]: E1011 01:11:08.208916 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cad5491d-5b39-42df-b4d5-1d5be912561b" containerName="mariadb-database-create" Oct 11 01:11:08 crc kubenswrapper[4743]: I1011 01:11:08.208923 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="cad5491d-5b39-42df-b4d5-1d5be912561b" containerName="mariadb-database-create" Oct 11 01:11:08 crc kubenswrapper[4743]: E1011 01:11:08.208960 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e0ffbae-47a1-49cd-93b6-46bc31e2ab74" containerName="console" Oct 11 01:11:08 crc kubenswrapper[4743]: I1011 01:11:08.208968 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e0ffbae-47a1-49cd-93b6-46bc31e2ab74" containerName="console" Oct 11 01:11:08 crc kubenswrapper[4743]: E1011 01:11:08.208999 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5052563c-4805-4a28-8aec-0fccc620857e" containerName="dnsmasq-dns" Oct 11 01:11:08 crc kubenswrapper[4743]: I1011 01:11:08.209007 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="5052563c-4805-4a28-8aec-0fccc620857e" containerName="dnsmasq-dns" Oct 11 01:11:08 crc kubenswrapper[4743]: E1011 01:11:08.209044 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1561cfd4-e0e2-4f2b-ae94-195bc9061df5" containerName="mariadb-database-create" Oct 11 01:11:08 crc kubenswrapper[4743]: I1011 01:11:08.209054 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1561cfd4-e0e2-4f2b-ae94-195bc9061df5" containerName="mariadb-database-create" Oct 11 01:11:08 crc kubenswrapper[4743]: I1011 01:11:08.209388 4743 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="c6ae2ef5-9e7a-43c5-9772-7bcd53053fb6" containerName="mariadb-database-create" Oct 11 01:11:08 crc kubenswrapper[4743]: I1011 01:11:08.209403 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="cad5491d-5b39-42df-b4d5-1d5be912561b" containerName="mariadb-database-create" Oct 11 01:11:08 crc kubenswrapper[4743]: I1011 01:11:08.209416 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="5052563c-4805-4a28-8aec-0fccc620857e" containerName="dnsmasq-dns" Oct 11 01:11:08 crc kubenswrapper[4743]: I1011 01:11:08.209426 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="1561cfd4-e0e2-4f2b-ae94-195bc9061df5" containerName="mariadb-database-create" Oct 11 01:11:08 crc kubenswrapper[4743]: I1011 01:11:08.209475 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="4aacb3f4-8c2a-4f30-b51a-c31b9c89cd28" containerName="mariadb-database-create" Oct 11 01:11:08 crc kubenswrapper[4743]: I1011 01:11:08.209493 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e0ffbae-47a1-49cd-93b6-46bc31e2ab74" containerName="console" Oct 11 01:11:08 crc kubenswrapper[4743]: I1011 01:11:08.210631 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-mwtxs-config-cngsm" Oct 11 01:11:08 crc kubenswrapper[4743]: I1011 01:11:08.212360 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 11 01:11:08 crc kubenswrapper[4743]: I1011 01:11:08.221821 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-mwtxs-config-cngsm"] Oct 11 01:11:08 crc kubenswrapper[4743]: I1011 01:11:08.244953 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/504e853d-0ce0-4817-bdc5-ee5222a9069f-scripts\") pod \"ovn-controller-mwtxs-config-cngsm\" (UID: \"504e853d-0ce0-4817-bdc5-ee5222a9069f\") " pod="openstack/ovn-controller-mwtxs-config-cngsm" Oct 11 01:11:08 crc kubenswrapper[4743]: I1011 01:11:08.245008 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/504e853d-0ce0-4817-bdc5-ee5222a9069f-var-run\") pod \"ovn-controller-mwtxs-config-cngsm\" (UID: \"504e853d-0ce0-4817-bdc5-ee5222a9069f\") " pod="openstack/ovn-controller-mwtxs-config-cngsm" Oct 11 01:11:08 crc kubenswrapper[4743]: I1011 01:11:08.245032 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/504e853d-0ce0-4817-bdc5-ee5222a9069f-var-log-ovn\") pod \"ovn-controller-mwtxs-config-cngsm\" (UID: \"504e853d-0ce0-4817-bdc5-ee5222a9069f\") " pod="openstack/ovn-controller-mwtxs-config-cngsm" Oct 11 01:11:08 crc kubenswrapper[4743]: I1011 01:11:08.245111 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/504e853d-0ce0-4817-bdc5-ee5222a9069f-additional-scripts\") pod \"ovn-controller-mwtxs-config-cngsm\" (UID: 
\"504e853d-0ce0-4817-bdc5-ee5222a9069f\") " pod="openstack/ovn-controller-mwtxs-config-cngsm" Oct 11 01:11:08 crc kubenswrapper[4743]: I1011 01:11:08.245174 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/504e853d-0ce0-4817-bdc5-ee5222a9069f-var-run-ovn\") pod \"ovn-controller-mwtxs-config-cngsm\" (UID: \"504e853d-0ce0-4817-bdc5-ee5222a9069f\") " pod="openstack/ovn-controller-mwtxs-config-cngsm" Oct 11 01:11:08 crc kubenswrapper[4743]: I1011 01:11:08.245229 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkcc6\" (UniqueName: \"kubernetes.io/projected/504e853d-0ce0-4817-bdc5-ee5222a9069f-kube-api-access-kkcc6\") pod \"ovn-controller-mwtxs-config-cngsm\" (UID: \"504e853d-0ce0-4817-bdc5-ee5222a9069f\") " pod="openstack/ovn-controller-mwtxs-config-cngsm" Oct 11 01:11:08 crc kubenswrapper[4743]: I1011 01:11:08.346817 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/504e853d-0ce0-4817-bdc5-ee5222a9069f-var-run\") pod \"ovn-controller-mwtxs-config-cngsm\" (UID: \"504e853d-0ce0-4817-bdc5-ee5222a9069f\") " pod="openstack/ovn-controller-mwtxs-config-cngsm" Oct 11 01:11:08 crc kubenswrapper[4743]: I1011 01:11:08.346872 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/504e853d-0ce0-4817-bdc5-ee5222a9069f-var-log-ovn\") pod \"ovn-controller-mwtxs-config-cngsm\" (UID: \"504e853d-0ce0-4817-bdc5-ee5222a9069f\") " pod="openstack/ovn-controller-mwtxs-config-cngsm" Oct 11 01:11:08 crc kubenswrapper[4743]: I1011 01:11:08.346934 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/504e853d-0ce0-4817-bdc5-ee5222a9069f-additional-scripts\") pod 
\"ovn-controller-mwtxs-config-cngsm\" (UID: \"504e853d-0ce0-4817-bdc5-ee5222a9069f\") " pod="openstack/ovn-controller-mwtxs-config-cngsm" Oct 11 01:11:08 crc kubenswrapper[4743]: I1011 01:11:08.346974 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/504e853d-0ce0-4817-bdc5-ee5222a9069f-var-run-ovn\") pod \"ovn-controller-mwtxs-config-cngsm\" (UID: \"504e853d-0ce0-4817-bdc5-ee5222a9069f\") " pod="openstack/ovn-controller-mwtxs-config-cngsm" Oct 11 01:11:08 crc kubenswrapper[4743]: I1011 01:11:08.347009 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkcc6\" (UniqueName: \"kubernetes.io/projected/504e853d-0ce0-4817-bdc5-ee5222a9069f-kube-api-access-kkcc6\") pod \"ovn-controller-mwtxs-config-cngsm\" (UID: \"504e853d-0ce0-4817-bdc5-ee5222a9069f\") " pod="openstack/ovn-controller-mwtxs-config-cngsm" Oct 11 01:11:08 crc kubenswrapper[4743]: I1011 01:11:08.347075 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/504e853d-0ce0-4817-bdc5-ee5222a9069f-scripts\") pod \"ovn-controller-mwtxs-config-cngsm\" (UID: \"504e853d-0ce0-4817-bdc5-ee5222a9069f\") " pod="openstack/ovn-controller-mwtxs-config-cngsm" Oct 11 01:11:08 crc kubenswrapper[4743]: I1011 01:11:08.348931 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/504e853d-0ce0-4817-bdc5-ee5222a9069f-scripts\") pod \"ovn-controller-mwtxs-config-cngsm\" (UID: \"504e853d-0ce0-4817-bdc5-ee5222a9069f\") " pod="openstack/ovn-controller-mwtxs-config-cngsm" Oct 11 01:11:08 crc kubenswrapper[4743]: I1011 01:11:08.349133 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/504e853d-0ce0-4817-bdc5-ee5222a9069f-var-run\") pod \"ovn-controller-mwtxs-config-cngsm\" (UID: 
\"504e853d-0ce0-4817-bdc5-ee5222a9069f\") " pod="openstack/ovn-controller-mwtxs-config-cngsm" Oct 11 01:11:08 crc kubenswrapper[4743]: I1011 01:11:08.349173 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/504e853d-0ce0-4817-bdc5-ee5222a9069f-var-log-ovn\") pod \"ovn-controller-mwtxs-config-cngsm\" (UID: \"504e853d-0ce0-4817-bdc5-ee5222a9069f\") " pod="openstack/ovn-controller-mwtxs-config-cngsm" Oct 11 01:11:08 crc kubenswrapper[4743]: I1011 01:11:08.349526 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/504e853d-0ce0-4817-bdc5-ee5222a9069f-additional-scripts\") pod \"ovn-controller-mwtxs-config-cngsm\" (UID: \"504e853d-0ce0-4817-bdc5-ee5222a9069f\") " pod="openstack/ovn-controller-mwtxs-config-cngsm" Oct 11 01:11:08 crc kubenswrapper[4743]: I1011 01:11:08.349576 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/504e853d-0ce0-4817-bdc5-ee5222a9069f-var-run-ovn\") pod \"ovn-controller-mwtxs-config-cngsm\" (UID: \"504e853d-0ce0-4817-bdc5-ee5222a9069f\") " pod="openstack/ovn-controller-mwtxs-config-cngsm" Oct 11 01:11:08 crc kubenswrapper[4743]: I1011 01:11:08.371842 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkcc6\" (UniqueName: \"kubernetes.io/projected/504e853d-0ce0-4817-bdc5-ee5222a9069f-kube-api-access-kkcc6\") pod \"ovn-controller-mwtxs-config-cngsm\" (UID: \"504e853d-0ce0-4817-bdc5-ee5222a9069f\") " pod="openstack/ovn-controller-mwtxs-config-cngsm" Oct 11 01:11:08 crc kubenswrapper[4743]: I1011 01:11:08.476079 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-mwtxs-config-cngsm" Oct 11 01:11:08 crc kubenswrapper[4743]: I1011 01:11:08.478950 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-8qvc2" Oct 11 01:11:08 crc kubenswrapper[4743]: I1011 01:11:08.553676 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4mrvc\" (UniqueName: \"kubernetes.io/projected/5718aabd-82b4-4079-96f4-d241fb2c8efc-kube-api-access-4mrvc\") pod \"5718aabd-82b4-4079-96f4-d241fb2c8efc\" (UID: \"5718aabd-82b4-4079-96f4-d241fb2c8efc\") " Oct 11 01:11:08 crc kubenswrapper[4743]: I1011 01:11:08.554214 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5718aabd-82b4-4079-96f4-d241fb2c8efc-combined-ca-bundle\") pod \"5718aabd-82b4-4079-96f4-d241fb2c8efc\" (UID: \"5718aabd-82b4-4079-96f4-d241fb2c8efc\") " Oct 11 01:11:08 crc kubenswrapper[4743]: I1011 01:11:08.554409 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5718aabd-82b4-4079-96f4-d241fb2c8efc-dispersionconf\") pod \"5718aabd-82b4-4079-96f4-d241fb2c8efc\" (UID: \"5718aabd-82b4-4079-96f4-d241fb2c8efc\") " Oct 11 01:11:08 crc kubenswrapper[4743]: I1011 01:11:08.554456 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5718aabd-82b4-4079-96f4-d241fb2c8efc-etc-swift\") pod \"5718aabd-82b4-4079-96f4-d241fb2c8efc\" (UID: \"5718aabd-82b4-4079-96f4-d241fb2c8efc\") " Oct 11 01:11:08 crc kubenswrapper[4743]: I1011 01:11:08.554490 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5718aabd-82b4-4079-96f4-d241fb2c8efc-ring-data-devices\") pod \"5718aabd-82b4-4079-96f4-d241fb2c8efc\" (UID: \"5718aabd-82b4-4079-96f4-d241fb2c8efc\") " Oct 11 01:11:08 crc kubenswrapper[4743]: I1011 01:11:08.554536 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/5718aabd-82b4-4079-96f4-d241fb2c8efc-scripts\") pod \"5718aabd-82b4-4079-96f4-d241fb2c8efc\" (UID: \"5718aabd-82b4-4079-96f4-d241fb2c8efc\") " Oct 11 01:11:08 crc kubenswrapper[4743]: I1011 01:11:08.554568 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5718aabd-82b4-4079-96f4-d241fb2c8efc-swiftconf\") pod \"5718aabd-82b4-4079-96f4-d241fb2c8efc\" (UID: \"5718aabd-82b4-4079-96f4-d241fb2c8efc\") " Oct 11 01:11:08 crc kubenswrapper[4743]: I1011 01:11:08.556486 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5718aabd-82b4-4079-96f4-d241fb2c8efc-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "5718aabd-82b4-4079-96f4-d241fb2c8efc" (UID: "5718aabd-82b4-4079-96f4-d241fb2c8efc"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:11:08 crc kubenswrapper[4743]: I1011 01:11:08.557305 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5718aabd-82b4-4079-96f4-d241fb2c8efc-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "5718aabd-82b4-4079-96f4-d241fb2c8efc" (UID: "5718aabd-82b4-4079-96f4-d241fb2c8efc"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:11:08 crc kubenswrapper[4743]: I1011 01:11:08.558267 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5718aabd-82b4-4079-96f4-d241fb2c8efc-kube-api-access-4mrvc" (OuterVolumeSpecName: "kube-api-access-4mrvc") pod "5718aabd-82b4-4079-96f4-d241fb2c8efc" (UID: "5718aabd-82b4-4079-96f4-d241fb2c8efc"). InnerVolumeSpecName "kube-api-access-4mrvc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:11:08 crc kubenswrapper[4743]: I1011 01:11:08.566686 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5718aabd-82b4-4079-96f4-d241fb2c8efc-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "5718aabd-82b4-4079-96f4-d241fb2c8efc" (UID: "5718aabd-82b4-4079-96f4-d241fb2c8efc"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:11:08 crc kubenswrapper[4743]: I1011 01:11:08.586731 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5718aabd-82b4-4079-96f4-d241fb2c8efc-scripts" (OuterVolumeSpecName: "scripts") pod "5718aabd-82b4-4079-96f4-d241fb2c8efc" (UID: "5718aabd-82b4-4079-96f4-d241fb2c8efc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:11:08 crc kubenswrapper[4743]: I1011 01:11:08.604955 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5718aabd-82b4-4079-96f4-d241fb2c8efc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5718aabd-82b4-4079-96f4-d241fb2c8efc" (UID: "5718aabd-82b4-4079-96f4-d241fb2c8efc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:11:08 crc kubenswrapper[4743]: I1011 01:11:08.610160 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5718aabd-82b4-4079-96f4-d241fb2c8efc-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "5718aabd-82b4-4079-96f4-d241fb2c8efc" (UID: "5718aabd-82b4-4079-96f4-d241fb2c8efc"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:11:08 crc kubenswrapper[4743]: I1011 01:11:08.657150 4743 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5718aabd-82b4-4079-96f4-d241fb2c8efc-dispersionconf\") on node \"crc\" DevicePath \"\"" Oct 11 01:11:08 crc kubenswrapper[4743]: I1011 01:11:08.657190 4743 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5718aabd-82b4-4079-96f4-d241fb2c8efc-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 11 01:11:08 crc kubenswrapper[4743]: I1011 01:11:08.657199 4743 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5718aabd-82b4-4079-96f4-d241fb2c8efc-ring-data-devices\") on node \"crc\" DevicePath \"\"" Oct 11 01:11:08 crc kubenswrapper[4743]: I1011 01:11:08.657208 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5718aabd-82b4-4079-96f4-d241fb2c8efc-scripts\") on node \"crc\" DevicePath \"\"" Oct 11 01:11:08 crc kubenswrapper[4743]: I1011 01:11:08.657217 4743 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5718aabd-82b4-4079-96f4-d241fb2c8efc-swiftconf\") on node \"crc\" DevicePath \"\"" Oct 11 01:11:08 crc kubenswrapper[4743]: I1011 01:11:08.657227 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4mrvc\" (UniqueName: \"kubernetes.io/projected/5718aabd-82b4-4079-96f4-d241fb2c8efc-kube-api-access-4mrvc\") on node \"crc\" DevicePath \"\"" Oct 11 01:11:08 crc kubenswrapper[4743]: I1011 01:11:08.657238 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5718aabd-82b4-4079-96f4-d241fb2c8efc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 01:11:08 crc kubenswrapper[4743]: I1011 01:11:08.865156 4743 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-8qvc2" event={"ID":"5718aabd-82b4-4079-96f4-d241fb2c8efc","Type":"ContainerDied","Data":"861503d235bee463affe4060cfeb8c5e7354d7f2776b480bf9e6d6f5f9904df2"} Oct 11 01:11:08 crc kubenswrapper[4743]: I1011 01:11:08.865193 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="861503d235bee463affe4060cfeb8c5e7354d7f2776b480bf9e6d6f5f9904df2" Oct 11 01:11:08 crc kubenswrapper[4743]: I1011 01:11:08.865245 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-8qvc2" Oct 11 01:11:08 crc kubenswrapper[4743]: I1011 01:11:08.868937 4743 generic.go:334] "Generic (PLEG): container finished" podID="73d7bbf0-dc76-4572-857e-fd0fb59d95cc" containerID="481a1258a6422d0adb2d8063cebd0064a3027d185956e75d1ac262e1e7ae0dbb" exitCode=0 Oct 11 01:11:08 crc kubenswrapper[4743]: I1011 01:11:08.869009 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"73d7bbf0-dc76-4572-857e-fd0fb59d95cc","Type":"ContainerDied","Data":"481a1258a6422d0adb2d8063cebd0064a3027d185956e75d1ac262e1e7ae0dbb"} Oct 11 01:11:08 crc kubenswrapper[4743]: I1011 01:11:08.871659 4743 generic.go:334] "Generic (PLEG): container finished" podID="9f596550-b88a-49d7-9cff-cbc2d4149a2e" containerID="71bd50ca24278810cfdfc1d2d9fe23b63e4ca59dbc1c1013a7228649e54927a6" exitCode=0 Oct 11 01:11:08 crc kubenswrapper[4743]: I1011 01:11:08.871745 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9f596550-b88a-49d7-9cff-cbc2d4149a2e","Type":"ContainerDied","Data":"71bd50ca24278810cfdfc1d2d9fe23b63e4ca59dbc1c1013a7228649e54927a6"} Oct 11 01:11:08 crc kubenswrapper[4743]: I1011 01:11:08.959157 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-mwtxs-config-cngsm"] Oct 11 01:11:09 crc kubenswrapper[4743]: I1011 01:11:09.892070 4743 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9f596550-b88a-49d7-9cff-cbc2d4149a2e","Type":"ContainerStarted","Data":"5088c25b9ecada27037e94ce63bb12f86d1d169720a14623b47acf2816b9127b"} Oct 11 01:11:09 crc kubenswrapper[4743]: I1011 01:11:09.893526 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 11 01:11:09 crc kubenswrapper[4743]: I1011 01:11:09.895126 4743 generic.go:334] "Generic (PLEG): container finished" podID="504e853d-0ce0-4817-bdc5-ee5222a9069f" containerID="90c860b4a98badaee54fc3acd4233190ce617e4e2decbf77a26d43c826a2f5e0" exitCode=0 Oct 11 01:11:09 crc kubenswrapper[4743]: I1011 01:11:09.895198 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-mwtxs-config-cngsm" event={"ID":"504e853d-0ce0-4817-bdc5-ee5222a9069f","Type":"ContainerDied","Data":"90c860b4a98badaee54fc3acd4233190ce617e4e2decbf77a26d43c826a2f5e0"} Oct 11 01:11:09 crc kubenswrapper[4743]: I1011 01:11:09.895226 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-mwtxs-config-cngsm" event={"ID":"504e853d-0ce0-4817-bdc5-ee5222a9069f","Type":"ContainerStarted","Data":"a2b7977915bf9fc5aefc37fe549e71c7b74f4c7f3405799ca345081cfe1ba21c"} Oct 11 01:11:09 crc kubenswrapper[4743]: I1011 01:11:09.908175 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"73d7bbf0-dc76-4572-857e-fd0fb59d95cc","Type":"ContainerStarted","Data":"18d5924ee91371fd1ad1e224761869e30787b5d666608ba26d05d7cefcfe9f7b"} Oct 11 01:11:09 crc kubenswrapper[4743]: I1011 01:11:09.908422 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 11 01:11:09 crc kubenswrapper[4743]: I1011 01:11:09.919875 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=53.916906631 
podStartE2EDuration="1m2.919841693s" podCreationTimestamp="2025-10-11 01:10:07 +0000 UTC" firstStartedPulling="2025-10-11 01:10:23.597946076 +0000 UTC m=+1118.250926473" lastFinishedPulling="2025-10-11 01:10:32.600881128 +0000 UTC m=+1127.253861535" observedRunningTime="2025-10-11 01:11:09.91656651 +0000 UTC m=+1164.569546907" watchObservedRunningTime="2025-10-11 01:11:09.919841693 +0000 UTC m=+1164.572822080" Oct 11 01:11:09 crc kubenswrapper[4743]: I1011 01:11:09.941190 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=54.902428555 podStartE2EDuration="1m3.941172175s" podCreationTimestamp="2025-10-11 01:10:06 +0000 UTC" firstStartedPulling="2025-10-11 01:10:23.560825755 +0000 UTC m=+1118.213806152" lastFinishedPulling="2025-10-11 01:10:32.599569355 +0000 UTC m=+1127.252549772" observedRunningTime="2025-10-11 01:11:09.936031784 +0000 UTC m=+1164.589012171" watchObservedRunningTime="2025-10-11 01:11:09.941172175 +0000 UTC m=+1164.594152572" Oct 11 01:11:11 crc kubenswrapper[4743]: I1011 01:11:11.319579 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-mwtxs-config-cngsm" Oct 11 01:11:11 crc kubenswrapper[4743]: I1011 01:11:11.395697 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-5f82-account-create-gq8c9"] Oct 11 01:11:11 crc kubenswrapper[4743]: E1011 01:11:11.396110 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5718aabd-82b4-4079-96f4-d241fb2c8efc" containerName="swift-ring-rebalance" Oct 11 01:11:11 crc kubenswrapper[4743]: I1011 01:11:11.396138 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="5718aabd-82b4-4079-96f4-d241fb2c8efc" containerName="swift-ring-rebalance" Oct 11 01:11:11 crc kubenswrapper[4743]: E1011 01:11:11.396158 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="504e853d-0ce0-4817-bdc5-ee5222a9069f" containerName="ovn-config" Oct 11 01:11:11 crc kubenswrapper[4743]: I1011 01:11:11.396167 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="504e853d-0ce0-4817-bdc5-ee5222a9069f" containerName="ovn-config" Oct 11 01:11:11 crc kubenswrapper[4743]: I1011 01:11:11.396439 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="5718aabd-82b4-4079-96f4-d241fb2c8efc" containerName="swift-ring-rebalance" Oct 11 01:11:11 crc kubenswrapper[4743]: I1011 01:11:11.396473 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="504e853d-0ce0-4817-bdc5-ee5222a9069f" containerName="ovn-config" Oct 11 01:11:11 crc kubenswrapper[4743]: I1011 01:11:11.397242 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-5f82-account-create-gq8c9" Oct 11 01:11:11 crc kubenswrapper[4743]: I1011 01:11:11.399676 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Oct 11 01:11:11 crc kubenswrapper[4743]: I1011 01:11:11.415640 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5f82-account-create-gq8c9"] Oct 11 01:11:11 crc kubenswrapper[4743]: I1011 01:11:11.425183 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/504e853d-0ce0-4817-bdc5-ee5222a9069f-scripts\") pod \"504e853d-0ce0-4817-bdc5-ee5222a9069f\" (UID: \"504e853d-0ce0-4817-bdc5-ee5222a9069f\") " Oct 11 01:11:11 crc kubenswrapper[4743]: I1011 01:11:11.425278 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/504e853d-0ce0-4817-bdc5-ee5222a9069f-additional-scripts\") pod \"504e853d-0ce0-4817-bdc5-ee5222a9069f\" (UID: \"504e853d-0ce0-4817-bdc5-ee5222a9069f\") " Oct 11 01:11:11 crc kubenswrapper[4743]: I1011 01:11:11.425314 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/504e853d-0ce0-4817-bdc5-ee5222a9069f-var-log-ovn\") pod \"504e853d-0ce0-4817-bdc5-ee5222a9069f\" (UID: \"504e853d-0ce0-4817-bdc5-ee5222a9069f\") " Oct 11 01:11:11 crc kubenswrapper[4743]: I1011 01:11:11.425394 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/504e853d-0ce0-4817-bdc5-ee5222a9069f-var-run-ovn\") pod \"504e853d-0ce0-4817-bdc5-ee5222a9069f\" (UID: \"504e853d-0ce0-4817-bdc5-ee5222a9069f\") " Oct 11 01:11:11 crc kubenswrapper[4743]: I1011 01:11:11.425413 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkcc6\" (UniqueName: 
\"kubernetes.io/projected/504e853d-0ce0-4817-bdc5-ee5222a9069f-kube-api-access-kkcc6\") pod \"504e853d-0ce0-4817-bdc5-ee5222a9069f\" (UID: \"504e853d-0ce0-4817-bdc5-ee5222a9069f\") " Oct 11 01:11:11 crc kubenswrapper[4743]: I1011 01:11:11.425553 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/504e853d-0ce0-4817-bdc5-ee5222a9069f-var-run\") pod \"504e853d-0ce0-4817-bdc5-ee5222a9069f\" (UID: \"504e853d-0ce0-4817-bdc5-ee5222a9069f\") " Oct 11 01:11:11 crc kubenswrapper[4743]: I1011 01:11:11.425647 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/504e853d-0ce0-4817-bdc5-ee5222a9069f-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "504e853d-0ce0-4817-bdc5-ee5222a9069f" (UID: "504e853d-0ce0-4817-bdc5-ee5222a9069f"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 01:11:11 crc kubenswrapper[4743]: I1011 01:11:11.425670 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/504e853d-0ce0-4817-bdc5-ee5222a9069f-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "504e853d-0ce0-4817-bdc5-ee5222a9069f" (UID: "504e853d-0ce0-4817-bdc5-ee5222a9069f"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 01:11:11 crc kubenswrapper[4743]: I1011 01:11:11.425772 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/504e853d-0ce0-4817-bdc5-ee5222a9069f-var-run" (OuterVolumeSpecName: "var-run") pod "504e853d-0ce0-4817-bdc5-ee5222a9069f" (UID: "504e853d-0ce0-4817-bdc5-ee5222a9069f"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 01:11:11 crc kubenswrapper[4743]: I1011 01:11:11.426273 4743 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/504e853d-0ce0-4817-bdc5-ee5222a9069f-var-run\") on node \"crc\" DevicePath \"\"" Oct 11 01:11:11 crc kubenswrapper[4743]: I1011 01:11:11.426293 4743 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/504e853d-0ce0-4817-bdc5-ee5222a9069f-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 11 01:11:11 crc kubenswrapper[4743]: I1011 01:11:11.426307 4743 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/504e853d-0ce0-4817-bdc5-ee5222a9069f-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 11 01:11:11 crc kubenswrapper[4743]: I1011 01:11:11.426507 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/504e853d-0ce0-4817-bdc5-ee5222a9069f-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "504e853d-0ce0-4817-bdc5-ee5222a9069f" (UID: "504e853d-0ce0-4817-bdc5-ee5222a9069f"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:11:11 crc kubenswrapper[4743]: I1011 01:11:11.426843 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/504e853d-0ce0-4817-bdc5-ee5222a9069f-scripts" (OuterVolumeSpecName: "scripts") pod "504e853d-0ce0-4817-bdc5-ee5222a9069f" (UID: "504e853d-0ce0-4817-bdc5-ee5222a9069f"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:11:11 crc kubenswrapper[4743]: I1011 01:11:11.435130 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/504e853d-0ce0-4817-bdc5-ee5222a9069f-kube-api-access-kkcc6" (OuterVolumeSpecName: "kube-api-access-kkcc6") pod "504e853d-0ce0-4817-bdc5-ee5222a9069f" (UID: "504e853d-0ce0-4817-bdc5-ee5222a9069f"). InnerVolumeSpecName "kube-api-access-kkcc6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:11:11 crc kubenswrapper[4743]: I1011 01:11:11.527736 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7n9l5\" (UniqueName: \"kubernetes.io/projected/421a65dd-2c9a-4807-bffa-c292f25a8263-kube-api-access-7n9l5\") pod \"keystone-5f82-account-create-gq8c9\" (UID: \"421a65dd-2c9a-4807-bffa-c292f25a8263\") " pod="openstack/keystone-5f82-account-create-gq8c9" Oct 11 01:11:11 crc kubenswrapper[4743]: I1011 01:11:11.527901 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/504e853d-0ce0-4817-bdc5-ee5222a9069f-scripts\") on node \"crc\" DevicePath \"\"" Oct 11 01:11:11 crc kubenswrapper[4743]: I1011 01:11:11.527924 4743 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/504e853d-0ce0-4817-bdc5-ee5222a9069f-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 11 01:11:11 crc kubenswrapper[4743]: I1011 01:11:11.527941 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kkcc6\" (UniqueName: \"kubernetes.io/projected/504e853d-0ce0-4817-bdc5-ee5222a9069f-kube-api-access-kkcc6\") on node \"crc\" DevicePath \"\"" Oct 11 01:11:11 crc kubenswrapper[4743]: I1011 01:11:11.629829 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7n9l5\" (UniqueName: 
\"kubernetes.io/projected/421a65dd-2c9a-4807-bffa-c292f25a8263-kube-api-access-7n9l5\") pod \"keystone-5f82-account-create-gq8c9\" (UID: \"421a65dd-2c9a-4807-bffa-c292f25a8263\") " pod="openstack/keystone-5f82-account-create-gq8c9" Oct 11 01:11:11 crc kubenswrapper[4743]: I1011 01:11:11.647533 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7n9l5\" (UniqueName: \"kubernetes.io/projected/421a65dd-2c9a-4807-bffa-c292f25a8263-kube-api-access-7n9l5\") pod \"keystone-5f82-account-create-gq8c9\" (UID: \"421a65dd-2c9a-4807-bffa-c292f25a8263\") " pod="openstack/keystone-5f82-account-create-gq8c9" Oct 11 01:11:11 crc kubenswrapper[4743]: I1011 01:11:11.716494 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5f82-account-create-gq8c9" Oct 11 01:11:11 crc kubenswrapper[4743]: I1011 01:11:11.831564 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-2a5c-account-create-6nzhk"] Oct 11 01:11:11 crc kubenswrapper[4743]: I1011 01:11:11.833683 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-2a5c-account-create-6nzhk" Oct 11 01:11:11 crc kubenswrapper[4743]: I1011 01:11:11.838159 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-2a5c-account-create-6nzhk"] Oct 11 01:11:11 crc kubenswrapper[4743]: I1011 01:11:11.840534 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Oct 11 01:11:11 crc kubenswrapper[4743]: I1011 01:11:11.923696 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-mwtxs-config-cngsm" event={"ID":"504e853d-0ce0-4817-bdc5-ee5222a9069f","Type":"ContainerDied","Data":"a2b7977915bf9fc5aefc37fe549e71c7b74f4c7f3405799ca345081cfe1ba21c"} Oct 11 01:11:11 crc kubenswrapper[4743]: I1011 01:11:11.923738 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a2b7977915bf9fc5aefc37fe549e71c7b74f4c7f3405799ca345081cfe1ba21c" Oct 11 01:11:11 crc kubenswrapper[4743]: I1011 01:11:11.923788 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-mwtxs-config-cngsm" Oct 11 01:11:11 crc kubenswrapper[4743]: I1011 01:11:11.996748 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-d5a7-account-create-z86xz"] Oct 11 01:11:11 crc kubenswrapper[4743]: I1011 01:11:11.997963 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-d5a7-account-create-z86xz" Oct 11 01:11:12 crc kubenswrapper[4743]: I1011 01:11:12.000067 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Oct 11 01:11:12 crc kubenswrapper[4743]: I1011 01:11:12.010453 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-d5a7-account-create-z86xz"] Oct 11 01:11:12 crc kubenswrapper[4743]: I1011 01:11:12.035617 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gd7xg\" (UniqueName: \"kubernetes.io/projected/c5dabefa-327b-4e8b-9a7f-81517a52c01b-kube-api-access-gd7xg\") pod \"placement-2a5c-account-create-6nzhk\" (UID: \"c5dabefa-327b-4e8b-9a7f-81517a52c01b\") " pod="openstack/placement-2a5c-account-create-6nzhk" Oct 11 01:11:12 crc kubenswrapper[4743]: I1011 01:11:12.136990 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bpfj\" (UniqueName: \"kubernetes.io/projected/b01c2590-9783-4392-8a81-a4a0ec37e88d-kube-api-access-5bpfj\") pod \"glance-d5a7-account-create-z86xz\" (UID: \"b01c2590-9783-4392-8a81-a4a0ec37e88d\") " pod="openstack/glance-d5a7-account-create-z86xz" Oct 11 01:11:12 crc kubenswrapper[4743]: I1011 01:11:12.137061 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gd7xg\" (UniqueName: \"kubernetes.io/projected/c5dabefa-327b-4e8b-9a7f-81517a52c01b-kube-api-access-gd7xg\") pod \"placement-2a5c-account-create-6nzhk\" (UID: \"c5dabefa-327b-4e8b-9a7f-81517a52c01b\") " pod="openstack/placement-2a5c-account-create-6nzhk" Oct 11 01:11:12 crc kubenswrapper[4743]: I1011 01:11:12.158479 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5f82-account-create-gq8c9"] Oct 11 01:11:12 crc kubenswrapper[4743]: W1011 01:11:12.159428 4743 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod421a65dd_2c9a_4807_bffa_c292f25a8263.slice/crio-311cea8862c5c3ce9a6c7ba7986b01dace90937f73422c4c91e4efe98f0dfc7c WatchSource:0}: Error finding container 311cea8862c5c3ce9a6c7ba7986b01dace90937f73422c4c91e4efe98f0dfc7c: Status 404 returned error can't find the container with id 311cea8862c5c3ce9a6c7ba7986b01dace90937f73422c4c91e4efe98f0dfc7c Oct 11 01:11:12 crc kubenswrapper[4743]: I1011 01:11:12.165441 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gd7xg\" (UniqueName: \"kubernetes.io/projected/c5dabefa-327b-4e8b-9a7f-81517a52c01b-kube-api-access-gd7xg\") pod \"placement-2a5c-account-create-6nzhk\" (UID: \"c5dabefa-327b-4e8b-9a7f-81517a52c01b\") " pod="openstack/placement-2a5c-account-create-6nzhk" Oct 11 01:11:12 crc kubenswrapper[4743]: I1011 01:11:12.239214 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bpfj\" (UniqueName: \"kubernetes.io/projected/b01c2590-9783-4392-8a81-a4a0ec37e88d-kube-api-access-5bpfj\") pod \"glance-d5a7-account-create-z86xz\" (UID: \"b01c2590-9783-4392-8a81-a4a0ec37e88d\") " pod="openstack/glance-d5a7-account-create-z86xz" Oct 11 01:11:12 crc kubenswrapper[4743]: I1011 01:11:12.266197 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bpfj\" (UniqueName: \"kubernetes.io/projected/b01c2590-9783-4392-8a81-a4a0ec37e88d-kube-api-access-5bpfj\") pod \"glance-d5a7-account-create-z86xz\" (UID: \"b01c2590-9783-4392-8a81-a4a0ec37e88d\") " pod="openstack/glance-d5a7-account-create-z86xz" Oct 11 01:11:12 crc kubenswrapper[4743]: I1011 01:11:12.314699 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-d5a7-account-create-z86xz" Oct 11 01:11:12 crc kubenswrapper[4743]: I1011 01:11:12.431172 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-mwtxs-config-cngsm"] Oct 11 01:11:12 crc kubenswrapper[4743]: I1011 01:11:12.437051 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-mwtxs-config-cngsm"] Oct 11 01:11:12 crc kubenswrapper[4743]: I1011 01:11:12.455319 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-2a5c-account-create-6nzhk" Oct 11 01:11:12 crc kubenswrapper[4743]: I1011 01:11:12.531510 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-mwtxs-config-mfgb9"] Oct 11 01:11:12 crc kubenswrapper[4743]: I1011 01:11:12.533085 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-mwtxs-config-mfgb9" Oct 11 01:11:12 crc kubenswrapper[4743]: I1011 01:11:12.535174 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 11 01:11:12 crc kubenswrapper[4743]: I1011 01:11:12.542340 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-mwtxs-config-mfgb9"] Oct 11 01:11:12 crc kubenswrapper[4743]: I1011 01:11:12.545928 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/99a29b50-a76e-4093-9642-9138fa937e0c-var-log-ovn\") pod \"ovn-controller-mwtxs-config-mfgb9\" (UID: \"99a29b50-a76e-4093-9642-9138fa937e0c\") " pod="openstack/ovn-controller-mwtxs-config-mfgb9" Oct 11 01:11:12 crc kubenswrapper[4743]: I1011 01:11:12.545975 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/99a29b50-a76e-4093-9642-9138fa937e0c-additional-scripts\") 
pod \"ovn-controller-mwtxs-config-mfgb9\" (UID: \"99a29b50-a76e-4093-9642-9138fa937e0c\") " pod="openstack/ovn-controller-mwtxs-config-mfgb9" Oct 11 01:11:12 crc kubenswrapper[4743]: I1011 01:11:12.545996 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/99a29b50-a76e-4093-9642-9138fa937e0c-var-run\") pod \"ovn-controller-mwtxs-config-mfgb9\" (UID: \"99a29b50-a76e-4093-9642-9138fa937e0c\") " pod="openstack/ovn-controller-mwtxs-config-mfgb9" Oct 11 01:11:12 crc kubenswrapper[4743]: I1011 01:11:12.546040 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/99a29b50-a76e-4093-9642-9138fa937e0c-var-run-ovn\") pod \"ovn-controller-mwtxs-config-mfgb9\" (UID: \"99a29b50-a76e-4093-9642-9138fa937e0c\") " pod="openstack/ovn-controller-mwtxs-config-mfgb9" Oct 11 01:11:12 crc kubenswrapper[4743]: I1011 01:11:12.546101 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xncd\" (UniqueName: \"kubernetes.io/projected/99a29b50-a76e-4093-9642-9138fa937e0c-kube-api-access-7xncd\") pod \"ovn-controller-mwtxs-config-mfgb9\" (UID: \"99a29b50-a76e-4093-9642-9138fa937e0c\") " pod="openstack/ovn-controller-mwtxs-config-mfgb9" Oct 11 01:11:12 crc kubenswrapper[4743]: I1011 01:11:12.546131 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/99a29b50-a76e-4093-9642-9138fa937e0c-scripts\") pod \"ovn-controller-mwtxs-config-mfgb9\" (UID: \"99a29b50-a76e-4093-9642-9138fa937e0c\") " pod="openstack/ovn-controller-mwtxs-config-mfgb9" Oct 11 01:11:12 crc kubenswrapper[4743]: I1011 01:11:12.647535 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/99a29b50-a76e-4093-9642-9138fa937e0c-var-log-ovn\") pod \"ovn-controller-mwtxs-config-mfgb9\" (UID: \"99a29b50-a76e-4093-9642-9138fa937e0c\") " pod="openstack/ovn-controller-mwtxs-config-mfgb9" Oct 11 01:11:12 crc kubenswrapper[4743]: I1011 01:11:12.647596 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/99a29b50-a76e-4093-9642-9138fa937e0c-additional-scripts\") pod \"ovn-controller-mwtxs-config-mfgb9\" (UID: \"99a29b50-a76e-4093-9642-9138fa937e0c\") " pod="openstack/ovn-controller-mwtxs-config-mfgb9" Oct 11 01:11:12 crc kubenswrapper[4743]: I1011 01:11:12.647618 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/99a29b50-a76e-4093-9642-9138fa937e0c-var-run\") pod \"ovn-controller-mwtxs-config-mfgb9\" (UID: \"99a29b50-a76e-4093-9642-9138fa937e0c\") " pod="openstack/ovn-controller-mwtxs-config-mfgb9" Oct 11 01:11:12 crc kubenswrapper[4743]: I1011 01:11:12.647691 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/99a29b50-a76e-4093-9642-9138fa937e0c-var-run-ovn\") pod \"ovn-controller-mwtxs-config-mfgb9\" (UID: \"99a29b50-a76e-4093-9642-9138fa937e0c\") " pod="openstack/ovn-controller-mwtxs-config-mfgb9" Oct 11 01:11:12 crc kubenswrapper[4743]: I1011 01:11:12.647740 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xncd\" (UniqueName: \"kubernetes.io/projected/99a29b50-a76e-4093-9642-9138fa937e0c-kube-api-access-7xncd\") pod \"ovn-controller-mwtxs-config-mfgb9\" (UID: \"99a29b50-a76e-4093-9642-9138fa937e0c\") " pod="openstack/ovn-controller-mwtxs-config-mfgb9" Oct 11 01:11:12 crc kubenswrapper[4743]: I1011 01:11:12.647767 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/99a29b50-a76e-4093-9642-9138fa937e0c-scripts\") pod \"ovn-controller-mwtxs-config-mfgb9\" (UID: \"99a29b50-a76e-4093-9642-9138fa937e0c\") " pod="openstack/ovn-controller-mwtxs-config-mfgb9" Oct 11 01:11:12 crc kubenswrapper[4743]: I1011 01:11:12.647879 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/99a29b50-a76e-4093-9642-9138fa937e0c-var-log-ovn\") pod \"ovn-controller-mwtxs-config-mfgb9\" (UID: \"99a29b50-a76e-4093-9642-9138fa937e0c\") " pod="openstack/ovn-controller-mwtxs-config-mfgb9" Oct 11 01:11:12 crc kubenswrapper[4743]: I1011 01:11:12.647964 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/99a29b50-a76e-4093-9642-9138fa937e0c-var-run-ovn\") pod \"ovn-controller-mwtxs-config-mfgb9\" (UID: \"99a29b50-a76e-4093-9642-9138fa937e0c\") " pod="openstack/ovn-controller-mwtxs-config-mfgb9" Oct 11 01:11:12 crc kubenswrapper[4743]: I1011 01:11:12.647885 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/99a29b50-a76e-4093-9642-9138fa937e0c-var-run\") pod \"ovn-controller-mwtxs-config-mfgb9\" (UID: \"99a29b50-a76e-4093-9642-9138fa937e0c\") " pod="openstack/ovn-controller-mwtxs-config-mfgb9" Oct 11 01:11:12 crc kubenswrapper[4743]: I1011 01:11:12.648660 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/99a29b50-a76e-4093-9642-9138fa937e0c-additional-scripts\") pod \"ovn-controller-mwtxs-config-mfgb9\" (UID: \"99a29b50-a76e-4093-9642-9138fa937e0c\") " pod="openstack/ovn-controller-mwtxs-config-mfgb9" Oct 11 01:11:12 crc kubenswrapper[4743]: I1011 01:11:12.649930 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/99a29b50-a76e-4093-9642-9138fa937e0c-scripts\") 
pod \"ovn-controller-mwtxs-config-mfgb9\" (UID: \"99a29b50-a76e-4093-9642-9138fa937e0c\") " pod="openstack/ovn-controller-mwtxs-config-mfgb9" Oct 11 01:11:12 crc kubenswrapper[4743]: I1011 01:11:12.667522 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xncd\" (UniqueName: \"kubernetes.io/projected/99a29b50-a76e-4093-9642-9138fa937e0c-kube-api-access-7xncd\") pod \"ovn-controller-mwtxs-config-mfgb9\" (UID: \"99a29b50-a76e-4093-9642-9138fa937e0c\") " pod="openstack/ovn-controller-mwtxs-config-mfgb9" Oct 11 01:11:12 crc kubenswrapper[4743]: I1011 01:11:12.844273 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-mwtxs" Oct 11 01:11:12 crc kubenswrapper[4743]: I1011 01:11:12.852699 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-mwtxs-config-mfgb9" Oct 11 01:11:12 crc kubenswrapper[4743]: W1011 01:11:12.882041 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb01c2590_9783_4392_8a81_a4a0ec37e88d.slice/crio-fc943ff209d1f7e81a928dc9ab7e882b13d179c7fb02a94c96a525347f7c3772 WatchSource:0}: Error finding container fc943ff209d1f7e81a928dc9ab7e882b13d179c7fb02a94c96a525347f7c3772: Status 404 returned error can't find the container with id fc943ff209d1f7e81a928dc9ab7e882b13d179c7fb02a94c96a525347f7c3772 Oct 11 01:11:12 crc kubenswrapper[4743]: I1011 01:11:12.886377 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-d5a7-account-create-z86xz"] Oct 11 01:11:12 crc kubenswrapper[4743]: I1011 01:11:12.938386 4743 generic.go:334] "Generic (PLEG): container finished" podID="421a65dd-2c9a-4807-bffa-c292f25a8263" containerID="5e011c8554ca34706c9228cf902c194456fe233ac9b626c43abf439c9320109c" exitCode=0 Oct 11 01:11:12 crc kubenswrapper[4743]: I1011 01:11:12.938562 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-5f82-account-create-gq8c9" event={"ID":"421a65dd-2c9a-4807-bffa-c292f25a8263","Type":"ContainerDied","Data":"5e011c8554ca34706c9228cf902c194456fe233ac9b626c43abf439c9320109c"} Oct 11 01:11:12 crc kubenswrapper[4743]: I1011 01:11:12.938820 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5f82-account-create-gq8c9" event={"ID":"421a65dd-2c9a-4807-bffa-c292f25a8263","Type":"ContainerStarted","Data":"311cea8862c5c3ce9a6c7ba7986b01dace90937f73422c4c91e4efe98f0dfc7c"} Oct 11 01:11:12 crc kubenswrapper[4743]: I1011 01:11:12.939735 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-d5a7-account-create-z86xz" event={"ID":"b01c2590-9783-4392-8a81-a4a0ec37e88d","Type":"ContainerStarted","Data":"fc943ff209d1f7e81a928dc9ab7e882b13d179c7fb02a94c96a525347f7c3772"} Oct 11 01:11:13 crc kubenswrapper[4743]: I1011 01:11:13.106804 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-2a5c-account-create-6nzhk"] Oct 11 01:11:13 crc kubenswrapper[4743]: I1011 01:11:13.498024 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-mwtxs-config-mfgb9"] Oct 11 01:11:13 crc kubenswrapper[4743]: W1011 01:11:13.510772 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod99a29b50_a76e_4093_9642_9138fa937e0c.slice/crio-c571eeb71694a98c76f62b39b0e01b05cc62a20d46d0bf1a74f413753763dd0e WatchSource:0}: Error finding container c571eeb71694a98c76f62b39b0e01b05cc62a20d46d0bf1a74f413753763dd0e: Status 404 returned error can't find the container with id c571eeb71694a98c76f62b39b0e01b05cc62a20d46d0bf1a74f413753763dd0e Oct 11 01:11:13 crc kubenswrapper[4743]: I1011 01:11:13.950160 4743 generic.go:334] "Generic (PLEG): container finished" podID="c5dabefa-327b-4e8b-9a7f-81517a52c01b" containerID="26ac38353778f0f982eb11e0c4b48a6fd455599647027554bbc74c4dc64cd8be" exitCode=0 Oct 11 01:11:13 crc 
kubenswrapper[4743]: I1011 01:11:13.950443 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-2a5c-account-create-6nzhk" event={"ID":"c5dabefa-327b-4e8b-9a7f-81517a52c01b","Type":"ContainerDied","Data":"26ac38353778f0f982eb11e0c4b48a6fd455599647027554bbc74c4dc64cd8be"} Oct 11 01:11:13 crc kubenswrapper[4743]: I1011 01:11:13.950468 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-2a5c-account-create-6nzhk" event={"ID":"c5dabefa-327b-4e8b-9a7f-81517a52c01b","Type":"ContainerStarted","Data":"d2d726032beb0e1e0b1202574f7d3a3212d5647a0e4d7a144163a4a49ea544f1"} Oct 11 01:11:13 crc kubenswrapper[4743]: I1011 01:11:13.952354 4743 generic.go:334] "Generic (PLEG): container finished" podID="b01c2590-9783-4392-8a81-a4a0ec37e88d" containerID="5d17ff7b9e1b806db0dcac19018f064d4b41fc11cdf8eed39cb3555c25f56cc8" exitCode=0 Oct 11 01:11:13 crc kubenswrapper[4743]: I1011 01:11:13.952402 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-d5a7-account-create-z86xz" event={"ID":"b01c2590-9783-4392-8a81-a4a0ec37e88d","Type":"ContainerDied","Data":"5d17ff7b9e1b806db0dcac19018f064d4b41fc11cdf8eed39cb3555c25f56cc8"} Oct 11 01:11:13 crc kubenswrapper[4743]: I1011 01:11:13.954324 4743 generic.go:334] "Generic (PLEG): container finished" podID="99a29b50-a76e-4093-9642-9138fa937e0c" containerID="b160b4394e8b0c73d683003f5311518dd3780e0891042cfe320546e6b9556f90" exitCode=0 Oct 11 01:11:13 crc kubenswrapper[4743]: I1011 01:11:13.954524 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-mwtxs-config-mfgb9" event={"ID":"99a29b50-a76e-4093-9642-9138fa937e0c","Type":"ContainerDied","Data":"b160b4394e8b0c73d683003f5311518dd3780e0891042cfe320546e6b9556f90"} Oct 11 01:11:13 crc kubenswrapper[4743]: I1011 01:11:13.954551 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-mwtxs-config-mfgb9" 
event={"ID":"99a29b50-a76e-4093-9642-9138fa937e0c","Type":"ContainerStarted","Data":"c571eeb71694a98c76f62b39b0e01b05cc62a20d46d0bf1a74f413753763dd0e"} Oct 11 01:11:14 crc kubenswrapper[4743]: I1011 01:11:14.111392 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="504e853d-0ce0-4817-bdc5-ee5222a9069f" path="/var/lib/kubelet/pods/504e853d-0ce0-4817-bdc5-ee5222a9069f/volumes" Oct 11 01:11:14 crc kubenswrapper[4743]: I1011 01:11:14.402233 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5f82-account-create-gq8c9" Oct 11 01:11:14 crc kubenswrapper[4743]: I1011 01:11:14.528407 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7n9l5\" (UniqueName: \"kubernetes.io/projected/421a65dd-2c9a-4807-bffa-c292f25a8263-kube-api-access-7n9l5\") pod \"421a65dd-2c9a-4807-bffa-c292f25a8263\" (UID: \"421a65dd-2c9a-4807-bffa-c292f25a8263\") " Oct 11 01:11:14 crc kubenswrapper[4743]: I1011 01:11:14.541133 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/421a65dd-2c9a-4807-bffa-c292f25a8263-kube-api-access-7n9l5" (OuterVolumeSpecName: "kube-api-access-7n9l5") pod "421a65dd-2c9a-4807-bffa-c292f25a8263" (UID: "421a65dd-2c9a-4807-bffa-c292f25a8263"). InnerVolumeSpecName "kube-api-access-7n9l5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:11:14 crc kubenswrapper[4743]: I1011 01:11:14.630886 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7n9l5\" (UniqueName: \"kubernetes.io/projected/421a65dd-2c9a-4807-bffa-c292f25a8263-kube-api-access-7n9l5\") on node \"crc\" DevicePath \"\"" Oct 11 01:11:14 crc kubenswrapper[4743]: I1011 01:11:14.920413 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-0c59-account-create-gzvmt"] Oct 11 01:11:14 crc kubenswrapper[4743]: E1011 01:11:14.921758 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="421a65dd-2c9a-4807-bffa-c292f25a8263" containerName="mariadb-account-create" Oct 11 01:11:14 crc kubenswrapper[4743]: I1011 01:11:14.921793 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="421a65dd-2c9a-4807-bffa-c292f25a8263" containerName="mariadb-account-create" Oct 11 01:11:14 crc kubenswrapper[4743]: I1011 01:11:14.922221 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="421a65dd-2c9a-4807-bffa-c292f25a8263" containerName="mariadb-account-create" Oct 11 01:11:14 crc kubenswrapper[4743]: I1011 01:11:14.923681 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-0c59-account-create-gzvmt" Oct 11 01:11:14 crc kubenswrapper[4743]: I1011 01:11:14.926164 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-openstack-db-secret" Oct 11 01:11:14 crc kubenswrapper[4743]: I1011 01:11:14.930649 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0c59-account-create-gzvmt"] Oct 11 01:11:14 crc kubenswrapper[4743]: I1011 01:11:14.964026 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5f82-account-create-gq8c9" event={"ID":"421a65dd-2c9a-4807-bffa-c292f25a8263","Type":"ContainerDied","Data":"311cea8862c5c3ce9a6c7ba7986b01dace90937f73422c4c91e4efe98f0dfc7c"} Oct 11 01:11:14 crc kubenswrapper[4743]: I1011 01:11:14.964071 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="311cea8862c5c3ce9a6c7ba7986b01dace90937f73422c4c91e4efe98f0dfc7c" Oct 11 01:11:14 crc kubenswrapper[4743]: I1011 01:11:14.964257 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-5f82-account-create-gq8c9" Oct 11 01:11:15 crc kubenswrapper[4743]: I1011 01:11:15.040091 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbpgq\" (UniqueName: \"kubernetes.io/projected/d22dfc6b-30a2-4d39-82d0-70bb32452261-kube-api-access-jbpgq\") pod \"mysqld-exporter-0c59-account-create-gzvmt\" (UID: \"d22dfc6b-30a2-4d39-82d0-70bb32452261\") " pod="openstack/mysqld-exporter-0c59-account-create-gzvmt" Oct 11 01:11:15 crc kubenswrapper[4743]: I1011 01:11:15.141576 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbpgq\" (UniqueName: \"kubernetes.io/projected/d22dfc6b-30a2-4d39-82d0-70bb32452261-kube-api-access-jbpgq\") pod \"mysqld-exporter-0c59-account-create-gzvmt\" (UID: \"d22dfc6b-30a2-4d39-82d0-70bb32452261\") " pod="openstack/mysqld-exporter-0c59-account-create-gzvmt" Oct 11 01:11:15 crc kubenswrapper[4743]: I1011 01:11:15.161143 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbpgq\" (UniqueName: \"kubernetes.io/projected/d22dfc6b-30a2-4d39-82d0-70bb32452261-kube-api-access-jbpgq\") pod \"mysqld-exporter-0c59-account-create-gzvmt\" (UID: \"d22dfc6b-30a2-4d39-82d0-70bb32452261\") " pod="openstack/mysqld-exporter-0c59-account-create-gzvmt" Oct 11 01:11:15 crc kubenswrapper[4743]: I1011 01:11:15.259470 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0c59-account-create-gzvmt" Oct 11 01:11:15 crc kubenswrapper[4743]: I1011 01:11:15.486530 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-mwtxs-config-mfgb9" Oct 11 01:11:15 crc kubenswrapper[4743]: I1011 01:11:15.555196 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/99a29b50-a76e-4093-9642-9138fa937e0c-var-log-ovn\") pod \"99a29b50-a76e-4093-9642-9138fa937e0c\" (UID: \"99a29b50-a76e-4093-9642-9138fa937e0c\") " Oct 11 01:11:15 crc kubenswrapper[4743]: I1011 01:11:15.555252 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/99a29b50-a76e-4093-9642-9138fa937e0c-var-run-ovn\") pod \"99a29b50-a76e-4093-9642-9138fa937e0c\" (UID: \"99a29b50-a76e-4093-9642-9138fa937e0c\") " Oct 11 01:11:15 crc kubenswrapper[4743]: I1011 01:11:15.555312 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/99a29b50-a76e-4093-9642-9138fa937e0c-additional-scripts\") pod \"99a29b50-a76e-4093-9642-9138fa937e0c\" (UID: \"99a29b50-a76e-4093-9642-9138fa937e0c\") " Oct 11 01:11:15 crc kubenswrapper[4743]: I1011 01:11:15.555381 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/99a29b50-a76e-4093-9642-9138fa937e0c-scripts\") pod \"99a29b50-a76e-4093-9642-9138fa937e0c\" (UID: \"99a29b50-a76e-4093-9642-9138fa937e0c\") " Oct 11 01:11:15 crc kubenswrapper[4743]: I1011 01:11:15.555427 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xncd\" (UniqueName: \"kubernetes.io/projected/99a29b50-a76e-4093-9642-9138fa937e0c-kube-api-access-7xncd\") pod \"99a29b50-a76e-4093-9642-9138fa937e0c\" (UID: \"99a29b50-a76e-4093-9642-9138fa937e0c\") " Oct 11 01:11:15 crc kubenswrapper[4743]: I1011 01:11:15.555476 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" 
(UniqueName: \"kubernetes.io/host-path/99a29b50-a76e-4093-9642-9138fa937e0c-var-run\") pod \"99a29b50-a76e-4093-9642-9138fa937e0c\" (UID: \"99a29b50-a76e-4093-9642-9138fa937e0c\") " Oct 11 01:11:15 crc kubenswrapper[4743]: I1011 01:11:15.555890 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/99a29b50-a76e-4093-9642-9138fa937e0c-var-run" (OuterVolumeSpecName: "var-run") pod "99a29b50-a76e-4093-9642-9138fa937e0c" (UID: "99a29b50-a76e-4093-9642-9138fa937e0c"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 01:11:15 crc kubenswrapper[4743]: I1011 01:11:15.555925 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/99a29b50-a76e-4093-9642-9138fa937e0c-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "99a29b50-a76e-4093-9642-9138fa937e0c" (UID: "99a29b50-a76e-4093-9642-9138fa937e0c"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 01:11:15 crc kubenswrapper[4743]: I1011 01:11:15.555944 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/99a29b50-a76e-4093-9642-9138fa937e0c-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "99a29b50-a76e-4093-9642-9138fa937e0c" (UID: "99a29b50-a76e-4093-9642-9138fa937e0c"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 01:11:15 crc kubenswrapper[4743]: I1011 01:11:15.556617 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99a29b50-a76e-4093-9642-9138fa937e0c-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "99a29b50-a76e-4093-9642-9138fa937e0c" (UID: "99a29b50-a76e-4093-9642-9138fa937e0c"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:11:15 crc kubenswrapper[4743]: I1011 01:11:15.557453 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99a29b50-a76e-4093-9642-9138fa937e0c-scripts" (OuterVolumeSpecName: "scripts") pod "99a29b50-a76e-4093-9642-9138fa937e0c" (UID: "99a29b50-a76e-4093-9642-9138fa937e0c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:11:15 crc kubenswrapper[4743]: I1011 01:11:15.560328 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99a29b50-a76e-4093-9642-9138fa937e0c-kube-api-access-7xncd" (OuterVolumeSpecName: "kube-api-access-7xncd") pod "99a29b50-a76e-4093-9642-9138fa937e0c" (UID: "99a29b50-a76e-4093-9642-9138fa937e0c"). InnerVolumeSpecName "kube-api-access-7xncd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:11:15 crc kubenswrapper[4743]: I1011 01:11:15.585992 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-d5a7-account-create-z86xz" Oct 11 01:11:15 crc kubenswrapper[4743]: I1011 01:11:15.596503 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-2a5c-account-create-6nzhk" Oct 11 01:11:15 crc kubenswrapper[4743]: I1011 01:11:15.656941 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5bpfj\" (UniqueName: \"kubernetes.io/projected/b01c2590-9783-4392-8a81-a4a0ec37e88d-kube-api-access-5bpfj\") pod \"b01c2590-9783-4392-8a81-a4a0ec37e88d\" (UID: \"b01c2590-9783-4392-8a81-a4a0ec37e88d\") " Oct 11 01:11:15 crc kubenswrapper[4743]: I1011 01:11:15.657004 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gd7xg\" (UniqueName: \"kubernetes.io/projected/c5dabefa-327b-4e8b-9a7f-81517a52c01b-kube-api-access-gd7xg\") pod \"c5dabefa-327b-4e8b-9a7f-81517a52c01b\" (UID: \"c5dabefa-327b-4e8b-9a7f-81517a52c01b\") " Oct 11 01:11:15 crc kubenswrapper[4743]: I1011 01:11:15.657390 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/99a29b50-a76e-4093-9642-9138fa937e0c-scripts\") on node \"crc\" DevicePath \"\"" Oct 11 01:11:15 crc kubenswrapper[4743]: I1011 01:11:15.657406 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xncd\" (UniqueName: \"kubernetes.io/projected/99a29b50-a76e-4093-9642-9138fa937e0c-kube-api-access-7xncd\") on node \"crc\" DevicePath \"\"" Oct 11 01:11:15 crc kubenswrapper[4743]: I1011 01:11:15.657418 4743 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/99a29b50-a76e-4093-9642-9138fa937e0c-var-run\") on node \"crc\" DevicePath \"\"" Oct 11 01:11:15 crc kubenswrapper[4743]: I1011 01:11:15.657428 4743 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/99a29b50-a76e-4093-9642-9138fa937e0c-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 11 01:11:15 crc kubenswrapper[4743]: I1011 01:11:15.657436 4743 reconciler_common.go:293] "Volume detached for volume 
\"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/99a29b50-a76e-4093-9642-9138fa937e0c-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 11 01:11:15 crc kubenswrapper[4743]: I1011 01:11:15.657445 4743 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/99a29b50-a76e-4093-9642-9138fa937e0c-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 11 01:11:15 crc kubenswrapper[4743]: I1011 01:11:15.661029 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b01c2590-9783-4392-8a81-a4a0ec37e88d-kube-api-access-5bpfj" (OuterVolumeSpecName: "kube-api-access-5bpfj") pod "b01c2590-9783-4392-8a81-a4a0ec37e88d" (UID: "b01c2590-9783-4392-8a81-a4a0ec37e88d"). InnerVolumeSpecName "kube-api-access-5bpfj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:11:15 crc kubenswrapper[4743]: I1011 01:11:15.661091 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5dabefa-327b-4e8b-9a7f-81517a52c01b-kube-api-access-gd7xg" (OuterVolumeSpecName: "kube-api-access-gd7xg") pod "c5dabefa-327b-4e8b-9a7f-81517a52c01b" (UID: "c5dabefa-327b-4e8b-9a7f-81517a52c01b"). InnerVolumeSpecName "kube-api-access-gd7xg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:11:15 crc kubenswrapper[4743]: I1011 01:11:15.759204 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5bpfj\" (UniqueName: \"kubernetes.io/projected/b01c2590-9783-4392-8a81-a4a0ec37e88d-kube-api-access-5bpfj\") on node \"crc\" DevicePath \"\"" Oct 11 01:11:15 crc kubenswrapper[4743]: I1011 01:11:15.759239 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gd7xg\" (UniqueName: \"kubernetes.io/projected/c5dabefa-327b-4e8b-9a7f-81517a52c01b-kube-api-access-gd7xg\") on node \"crc\" DevicePath \"\"" Oct 11 01:11:15 crc kubenswrapper[4743]: I1011 01:11:15.818666 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0c59-account-create-gzvmt"] Oct 11 01:11:15 crc kubenswrapper[4743]: I1011 01:11:15.975341 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0c59-account-create-gzvmt" event={"ID":"d22dfc6b-30a2-4d39-82d0-70bb32452261","Type":"ContainerStarted","Data":"2c7b3956dc1901cea2381f9d6552be6446d0c77adda981342617b1bec4ce4a68"} Oct 11 01:11:15 crc kubenswrapper[4743]: I1011 01:11:15.975886 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0c59-account-create-gzvmt" event={"ID":"d22dfc6b-30a2-4d39-82d0-70bb32452261","Type":"ContainerStarted","Data":"431cf207d6f6548a98bea3997512a04a982ba3097999dd108f8dfb57fcecf6b8"} Oct 11 01:11:15 crc kubenswrapper[4743]: I1011 01:11:15.978028 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-2a5c-account-create-6nzhk" event={"ID":"c5dabefa-327b-4e8b-9a7f-81517a52c01b","Type":"ContainerDied","Data":"d2d726032beb0e1e0b1202574f7d3a3212d5647a0e4d7a144163a4a49ea544f1"} Oct 11 01:11:15 crc kubenswrapper[4743]: I1011 01:11:15.978066 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d2d726032beb0e1e0b1202574f7d3a3212d5647a0e4d7a144163a4a49ea544f1" Oct 11 
01:11:15 crc kubenswrapper[4743]: I1011 01:11:15.978072 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-2a5c-account-create-6nzhk" Oct 11 01:11:15 crc kubenswrapper[4743]: I1011 01:11:15.979578 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-d5a7-account-create-z86xz" Oct 11 01:11:15 crc kubenswrapper[4743]: I1011 01:11:15.979565 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-d5a7-account-create-z86xz" event={"ID":"b01c2590-9783-4392-8a81-a4a0ec37e88d","Type":"ContainerDied","Data":"fc943ff209d1f7e81a928dc9ab7e882b13d179c7fb02a94c96a525347f7c3772"} Oct 11 01:11:15 crc kubenswrapper[4743]: I1011 01:11:15.979719 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc943ff209d1f7e81a928dc9ab7e882b13d179c7fb02a94c96a525347f7c3772" Oct 11 01:11:15 crc kubenswrapper[4743]: I1011 01:11:15.980744 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-mwtxs-config-mfgb9" event={"ID":"99a29b50-a76e-4093-9642-9138fa937e0c","Type":"ContainerDied","Data":"c571eeb71694a98c76f62b39b0e01b05cc62a20d46d0bf1a74f413753763dd0e"} Oct 11 01:11:15 crc kubenswrapper[4743]: I1011 01:11:15.980773 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c571eeb71694a98c76f62b39b0e01b05cc62a20d46d0bf1a74f413753763dd0e" Oct 11 01:11:15 crc kubenswrapper[4743]: I1011 01:11:15.980774 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-mwtxs-config-mfgb9" Oct 11 01:11:15 crc kubenswrapper[4743]: I1011 01:11:15.990052 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-0c59-account-create-gzvmt" podStartSLOduration=1.990031499 podStartE2EDuration="1.990031499s" podCreationTimestamp="2025-10-11 01:11:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 01:11:15.987659649 +0000 UTC m=+1170.640640056" watchObservedRunningTime="2025-10-11 01:11:15.990031499 +0000 UTC m=+1170.643011896" Oct 11 01:11:16 crc kubenswrapper[4743]: I1011 01:11:16.308259 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Oct 11 01:11:16 crc kubenswrapper[4743]: I1011 01:11:16.311045 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Oct 11 01:11:16 crc kubenswrapper[4743]: I1011 01:11:16.579039 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-mwtxs-config-mfgb9"] Oct 11 01:11:16 crc kubenswrapper[4743]: I1011 01:11:16.587574 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-mwtxs-config-mfgb9"] Oct 11 01:11:16 crc kubenswrapper[4743]: I1011 01:11:16.989502 4743 generic.go:334] "Generic (PLEG): container finished" podID="d22dfc6b-30a2-4d39-82d0-70bb32452261" containerID="2c7b3956dc1901cea2381f9d6552be6446d0c77adda981342617b1bec4ce4a68" exitCode=0 Oct 11 01:11:16 crc kubenswrapper[4743]: I1011 01:11:16.989571 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0c59-account-create-gzvmt" event={"ID":"d22dfc6b-30a2-4d39-82d0-70bb32452261","Type":"ContainerDied","Data":"2c7b3956dc1901cea2381f9d6552be6446d0c77adda981342617b1bec4ce4a68"} Oct 11 01:11:16 crc kubenswrapper[4743]: I1011 
01:11:16.992538 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Oct 11 01:11:17 crc kubenswrapper[4743]: I1011 01:11:17.138108 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-dznrn"] Oct 11 01:11:17 crc kubenswrapper[4743]: E1011 01:11:17.138513 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99a29b50-a76e-4093-9642-9138fa937e0c" containerName="ovn-config" Oct 11 01:11:17 crc kubenswrapper[4743]: I1011 01:11:17.138535 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="99a29b50-a76e-4093-9642-9138fa937e0c" containerName="ovn-config" Oct 11 01:11:17 crc kubenswrapper[4743]: E1011 01:11:17.138566 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b01c2590-9783-4392-8a81-a4a0ec37e88d" containerName="mariadb-account-create" Oct 11 01:11:17 crc kubenswrapper[4743]: I1011 01:11:17.138574 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="b01c2590-9783-4392-8a81-a4a0ec37e88d" containerName="mariadb-account-create" Oct 11 01:11:17 crc kubenswrapper[4743]: E1011 01:11:17.138656 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5dabefa-327b-4e8b-9a7f-81517a52c01b" containerName="mariadb-account-create" Oct 11 01:11:17 crc kubenswrapper[4743]: I1011 01:11:17.138668 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5dabefa-327b-4e8b-9a7f-81517a52c01b" containerName="mariadb-account-create" Oct 11 01:11:17 crc kubenswrapper[4743]: I1011 01:11:17.138885 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="99a29b50-a76e-4093-9642-9138fa937e0c" containerName="ovn-config" Oct 11 01:11:17 crc kubenswrapper[4743]: I1011 01:11:17.138913 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="b01c2590-9783-4392-8a81-a4a0ec37e88d" containerName="mariadb-account-create" Oct 11 01:11:17 crc kubenswrapper[4743]: I1011 01:11:17.138975 4743 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="c5dabefa-327b-4e8b-9a7f-81517a52c01b" containerName="mariadb-account-create" Oct 11 01:11:17 crc kubenswrapper[4743]: I1011 01:11:17.139790 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-dznrn" Oct 11 01:11:17 crc kubenswrapper[4743]: I1011 01:11:17.142363 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Oct 11 01:11:17 crc kubenswrapper[4743]: I1011 01:11:17.142462 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-9vxpt" Oct 11 01:11:17 crc kubenswrapper[4743]: I1011 01:11:17.151779 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-dznrn"] Oct 11 01:11:17 crc kubenswrapper[4743]: I1011 01:11:17.185338 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f60a76bb-fc23-4e3a-b7ad-123f65747952-combined-ca-bundle\") pod \"glance-db-sync-dznrn\" (UID: \"f60a76bb-fc23-4e3a-b7ad-123f65747952\") " pod="openstack/glance-db-sync-dznrn" Oct 11 01:11:17 crc kubenswrapper[4743]: I1011 01:11:17.185419 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f60a76bb-fc23-4e3a-b7ad-123f65747952-config-data\") pod \"glance-db-sync-dznrn\" (UID: \"f60a76bb-fc23-4e3a-b7ad-123f65747952\") " pod="openstack/glance-db-sync-dznrn" Oct 11 01:11:17 crc kubenswrapper[4743]: I1011 01:11:17.185477 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f60a76bb-fc23-4e3a-b7ad-123f65747952-db-sync-config-data\") pod \"glance-db-sync-dznrn\" (UID: \"f60a76bb-fc23-4e3a-b7ad-123f65747952\") " pod="openstack/glance-db-sync-dznrn" Oct 11 01:11:17 crc 
kubenswrapper[4743]: I1011 01:11:17.185809 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ng6gj\" (UniqueName: \"kubernetes.io/projected/f60a76bb-fc23-4e3a-b7ad-123f65747952-kube-api-access-ng6gj\") pod \"glance-db-sync-dznrn\" (UID: \"f60a76bb-fc23-4e3a-b7ad-123f65747952\") " pod="openstack/glance-db-sync-dznrn" Oct 11 01:11:17 crc kubenswrapper[4743]: I1011 01:11:17.287971 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f60a76bb-fc23-4e3a-b7ad-123f65747952-combined-ca-bundle\") pod \"glance-db-sync-dznrn\" (UID: \"f60a76bb-fc23-4e3a-b7ad-123f65747952\") " pod="openstack/glance-db-sync-dznrn" Oct 11 01:11:17 crc kubenswrapper[4743]: I1011 01:11:17.288022 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f60a76bb-fc23-4e3a-b7ad-123f65747952-config-data\") pod \"glance-db-sync-dznrn\" (UID: \"f60a76bb-fc23-4e3a-b7ad-123f65747952\") " pod="openstack/glance-db-sync-dznrn" Oct 11 01:11:17 crc kubenswrapper[4743]: I1011 01:11:17.288051 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f60a76bb-fc23-4e3a-b7ad-123f65747952-db-sync-config-data\") pod \"glance-db-sync-dznrn\" (UID: \"f60a76bb-fc23-4e3a-b7ad-123f65747952\") " pod="openstack/glance-db-sync-dznrn" Oct 11 01:11:17 crc kubenswrapper[4743]: I1011 01:11:17.288109 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ng6gj\" (UniqueName: \"kubernetes.io/projected/f60a76bb-fc23-4e3a-b7ad-123f65747952-kube-api-access-ng6gj\") pod \"glance-db-sync-dznrn\" (UID: \"f60a76bb-fc23-4e3a-b7ad-123f65747952\") " pod="openstack/glance-db-sync-dznrn" Oct 11 01:11:17 crc kubenswrapper[4743]: I1011 01:11:17.294938 4743 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f60a76bb-fc23-4e3a-b7ad-123f65747952-combined-ca-bundle\") pod \"glance-db-sync-dznrn\" (UID: \"f60a76bb-fc23-4e3a-b7ad-123f65747952\") " pod="openstack/glance-db-sync-dznrn" Oct 11 01:11:17 crc kubenswrapper[4743]: I1011 01:11:17.300000 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f60a76bb-fc23-4e3a-b7ad-123f65747952-db-sync-config-data\") pod \"glance-db-sync-dznrn\" (UID: \"f60a76bb-fc23-4e3a-b7ad-123f65747952\") " pod="openstack/glance-db-sync-dznrn" Oct 11 01:11:17 crc kubenswrapper[4743]: I1011 01:11:17.300546 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f60a76bb-fc23-4e3a-b7ad-123f65747952-config-data\") pod \"glance-db-sync-dznrn\" (UID: \"f60a76bb-fc23-4e3a-b7ad-123f65747952\") " pod="openstack/glance-db-sync-dznrn" Oct 11 01:11:17 crc kubenswrapper[4743]: I1011 01:11:17.302172 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ng6gj\" (UniqueName: \"kubernetes.io/projected/f60a76bb-fc23-4e3a-b7ad-123f65747952-kube-api-access-ng6gj\") pod \"glance-db-sync-dznrn\" (UID: \"f60a76bb-fc23-4e3a-b7ad-123f65747952\") " pod="openstack/glance-db-sync-dznrn" Oct 11 01:11:17 crc kubenswrapper[4743]: I1011 01:11:17.454613 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-dznrn" Oct 11 01:11:18 crc kubenswrapper[4743]: I1011 01:11:18.103676 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99a29b50-a76e-4093-9642-9138fa937e0c" path="/var/lib/kubelet/pods/99a29b50-a76e-4093-9642-9138fa937e0c/volumes" Oct 11 01:11:18 crc kubenswrapper[4743]: I1011 01:11:18.122069 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/082aa898-adc9-4e0d-a5e3-329d36f391aa-etc-swift\") pod \"swift-storage-0\" (UID: \"082aa898-adc9-4e0d-a5e3-329d36f391aa\") " pod="openstack/swift-storage-0" Oct 11 01:11:18 crc kubenswrapper[4743]: I1011 01:11:18.138952 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/082aa898-adc9-4e0d-a5e3-329d36f391aa-etc-swift\") pod \"swift-storage-0\" (UID: \"082aa898-adc9-4e0d-a5e3-329d36f391aa\") " pod="openstack/swift-storage-0" Oct 11 01:11:18 crc kubenswrapper[4743]: I1011 01:11:18.214236 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-dznrn"] Oct 11 01:11:18 crc kubenswrapper[4743]: W1011 01:11:18.227474 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf60a76bb_fc23_4e3a_b7ad_123f65747952.slice/crio-c4947028128045b422e02d3761752b8103fd1284c0452aaee87ecf7ca8fcc6e2 WatchSource:0}: Error finding container c4947028128045b422e02d3761752b8103fd1284c0452aaee87ecf7ca8fcc6e2: Status 404 returned error can't find the container with id c4947028128045b422e02d3761752b8103fd1284c0452aaee87ecf7ca8fcc6e2 Oct 11 01:11:18 crc kubenswrapper[4743]: I1011 01:11:18.280040 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-0c59-account-create-gzvmt" Oct 11 01:11:18 crc kubenswrapper[4743]: I1011 01:11:18.283341 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Oct 11 01:11:18 crc kubenswrapper[4743]: I1011 01:11:18.325761 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbpgq\" (UniqueName: \"kubernetes.io/projected/d22dfc6b-30a2-4d39-82d0-70bb32452261-kube-api-access-jbpgq\") pod \"d22dfc6b-30a2-4d39-82d0-70bb32452261\" (UID: \"d22dfc6b-30a2-4d39-82d0-70bb32452261\") " Oct 11 01:11:18 crc kubenswrapper[4743]: I1011 01:11:18.330643 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d22dfc6b-30a2-4d39-82d0-70bb32452261-kube-api-access-jbpgq" (OuterVolumeSpecName: "kube-api-access-jbpgq") pod "d22dfc6b-30a2-4d39-82d0-70bb32452261" (UID: "d22dfc6b-30a2-4d39-82d0-70bb32452261"). InnerVolumeSpecName "kube-api-access-jbpgq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:11:18 crc kubenswrapper[4743]: I1011 01:11:18.428344 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jbpgq\" (UniqueName: \"kubernetes.io/projected/d22dfc6b-30a2-4d39-82d0-70bb32452261-kube-api-access-jbpgq\") on node \"crc\" DevicePath \"\"" Oct 11 01:11:18 crc kubenswrapper[4743]: I1011 01:11:18.883591 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 11 01:11:19 crc kubenswrapper[4743]: I1011 01:11:19.025090 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"082aa898-adc9-4e0d-a5e3-329d36f391aa","Type":"ContainerStarted","Data":"c53b151356da9b8e1ba1303f53b04a64dc173357d4f3c48e5e19aa745e99e991"} Oct 11 01:11:19 crc kubenswrapper[4743]: I1011 01:11:19.034737 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-dznrn" event={"ID":"f60a76bb-fc23-4e3a-b7ad-123f65747952","Type":"ContainerStarted","Data":"c4947028128045b422e02d3761752b8103fd1284c0452aaee87ecf7ca8fcc6e2"} Oct 11 01:11:19 crc kubenswrapper[4743]: I1011 01:11:19.054692 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0c59-account-create-gzvmt" event={"ID":"d22dfc6b-30a2-4d39-82d0-70bb32452261","Type":"ContainerDied","Data":"431cf207d6f6548a98bea3997512a04a982ba3097999dd108f8dfb57fcecf6b8"} Oct 11 01:11:19 crc kubenswrapper[4743]: I1011 01:11:19.054943 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="431cf207d6f6548a98bea3997512a04a982ba3097999dd108f8dfb57fcecf6b8" Oct 11 01:11:19 crc kubenswrapper[4743]: I1011 01:11:19.055073 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-0c59-account-create-gzvmt" Oct 11 01:11:19 crc kubenswrapper[4743]: I1011 01:11:19.806485 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 11 01:11:19 crc kubenswrapper[4743]: I1011 01:11:19.808196 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="e1a1779f-127f-4ea2-a937-b97f329e3878" containerName="prometheus" containerID="cri-o://db842a9aa8dd2163fe2dfdb4255214fea18de948432f619b277ea689c5fbc35c" gracePeriod=600 Oct 11 01:11:19 crc kubenswrapper[4743]: I1011 01:11:19.809010 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="e1a1779f-127f-4ea2-a937-b97f329e3878" containerName="thanos-sidecar" containerID="cri-o://b10dd53f96ee32ccfd746850af138144705d93b07074a54ed230a878ff2363c1" gracePeriod=600 Oct 11 01:11:19 crc kubenswrapper[4743]: I1011 01:11:19.809001 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="e1a1779f-127f-4ea2-a937-b97f329e3878" containerName="config-reloader" containerID="cri-o://de4931ad2988f95a60457da9c6bacff7ad35a2cfe3fe6d732a283229eb8c32bf" gracePeriod=600 Oct 11 01:11:20 crc kubenswrapper[4743]: I1011 01:11:20.079613 4743 generic.go:334] "Generic (PLEG): container finished" podID="e1a1779f-127f-4ea2-a937-b97f329e3878" containerID="b10dd53f96ee32ccfd746850af138144705d93b07074a54ed230a878ff2363c1" exitCode=0 Oct 11 01:11:20 crc kubenswrapper[4743]: I1011 01:11:20.079642 4743 generic.go:334] "Generic (PLEG): container finished" podID="e1a1779f-127f-4ea2-a937-b97f329e3878" containerID="de4931ad2988f95a60457da9c6bacff7ad35a2cfe3fe6d732a283229eb8c32bf" exitCode=0 Oct 11 01:11:20 crc kubenswrapper[4743]: I1011 01:11:20.079653 4743 generic.go:334] "Generic (PLEG): container finished" 
podID="e1a1779f-127f-4ea2-a937-b97f329e3878" containerID="db842a9aa8dd2163fe2dfdb4255214fea18de948432f619b277ea689c5fbc35c" exitCode=0 Oct 11 01:11:20 crc kubenswrapper[4743]: I1011 01:11:20.079672 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e1a1779f-127f-4ea2-a937-b97f329e3878","Type":"ContainerDied","Data":"b10dd53f96ee32ccfd746850af138144705d93b07074a54ed230a878ff2363c1"} Oct 11 01:11:20 crc kubenswrapper[4743]: I1011 01:11:20.079697 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e1a1779f-127f-4ea2-a937-b97f329e3878","Type":"ContainerDied","Data":"de4931ad2988f95a60457da9c6bacff7ad35a2cfe3fe6d732a283229eb8c32bf"} Oct 11 01:11:20 crc kubenswrapper[4743]: I1011 01:11:20.079706 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e1a1779f-127f-4ea2-a937-b97f329e3878","Type":"ContainerDied","Data":"db842a9aa8dd2163fe2dfdb4255214fea18de948432f619b277ea689c5fbc35c"} Oct 11 01:11:20 crc kubenswrapper[4743]: I1011 01:11:20.152699 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-crf47"] Oct 11 01:11:20 crc kubenswrapper[4743]: E1011 01:11:20.153143 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d22dfc6b-30a2-4d39-82d0-70bb32452261" containerName="mariadb-account-create" Oct 11 01:11:20 crc kubenswrapper[4743]: I1011 01:11:20.153162 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d22dfc6b-30a2-4d39-82d0-70bb32452261" containerName="mariadb-account-create" Oct 11 01:11:20 crc kubenswrapper[4743]: I1011 01:11:20.153392 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d22dfc6b-30a2-4d39-82d0-70bb32452261" containerName="mariadb-account-create" Oct 11 01:11:20 crc kubenswrapper[4743]: I1011 01:11:20.154047 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-crf47" Oct 11 01:11:20 crc kubenswrapper[4743]: I1011 01:11:20.161404 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-crf47"] Oct 11 01:11:20 crc kubenswrapper[4743]: I1011 01:11:20.291699 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nznnh\" (UniqueName: \"kubernetes.io/projected/ae3a2cd6-d036-4c48-aaed-fc9750d6c0d0-kube-api-access-nznnh\") pod \"mysqld-exporter-openstack-cell1-db-create-crf47\" (UID: \"ae3a2cd6-d036-4c48-aaed-fc9750d6c0d0\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-crf47" Oct 11 01:11:20 crc kubenswrapper[4743]: I1011 01:11:20.394204 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nznnh\" (UniqueName: \"kubernetes.io/projected/ae3a2cd6-d036-4c48-aaed-fc9750d6c0d0-kube-api-access-nznnh\") pod \"mysqld-exporter-openstack-cell1-db-create-crf47\" (UID: \"ae3a2cd6-d036-4c48-aaed-fc9750d6c0d0\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-crf47" Oct 11 01:11:20 crc kubenswrapper[4743]: I1011 01:11:20.416302 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nznnh\" (UniqueName: \"kubernetes.io/projected/ae3a2cd6-d036-4c48-aaed-fc9750d6c0d0-kube-api-access-nznnh\") pod \"mysqld-exporter-openstack-cell1-db-create-crf47\" (UID: \"ae3a2cd6-d036-4c48-aaed-fc9750d6c0d0\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-crf47" Oct 11 01:11:20 crc kubenswrapper[4743]: I1011 01:11:20.497880 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-crf47" Oct 11 01:11:20 crc kubenswrapper[4743]: I1011 01:11:20.543400 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 11 01:11:20 crc kubenswrapper[4743]: I1011 01:11:20.599421 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6gsk9\" (UniqueName: \"kubernetes.io/projected/e1a1779f-127f-4ea2-a937-b97f329e3878-kube-api-access-6gsk9\") pod \"e1a1779f-127f-4ea2-a937-b97f329e3878\" (UID: \"e1a1779f-127f-4ea2-a937-b97f329e3878\") " Oct 11 01:11:20 crc kubenswrapper[4743]: I1011 01:11:20.599497 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e1a1779f-127f-4ea2-a937-b97f329e3878-config\") pod \"e1a1779f-127f-4ea2-a937-b97f329e3878\" (UID: \"e1a1779f-127f-4ea2-a937-b97f329e3878\") " Oct 11 01:11:20 crc kubenswrapper[4743]: I1011 01:11:20.599547 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e1a1779f-127f-4ea2-a937-b97f329e3878-thanos-prometheus-http-client-file\") pod \"e1a1779f-127f-4ea2-a937-b97f329e3878\" (UID: \"e1a1779f-127f-4ea2-a937-b97f329e3878\") " Oct 11 01:11:20 crc kubenswrapper[4743]: I1011 01:11:20.599781 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5f28cf97-999f-4f71-812b-e7f9e9accdc6\") pod \"e1a1779f-127f-4ea2-a937-b97f329e3878\" (UID: \"e1a1779f-127f-4ea2-a937-b97f329e3878\") " Oct 11 01:11:20 crc kubenswrapper[4743]: I1011 01:11:20.599839 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e1a1779f-127f-4ea2-a937-b97f329e3878-web-config\") pod \"e1a1779f-127f-4ea2-a937-b97f329e3878\" (UID: \"e1a1779f-127f-4ea2-a937-b97f329e3878\") " Oct 11 01:11:20 crc kubenswrapper[4743]: I1011 01:11:20.599907 4743 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e1a1779f-127f-4ea2-a937-b97f329e3878-tls-assets\") pod \"e1a1779f-127f-4ea2-a937-b97f329e3878\" (UID: \"e1a1779f-127f-4ea2-a937-b97f329e3878\") " Oct 11 01:11:20 crc kubenswrapper[4743]: I1011 01:11:20.607753 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1a1779f-127f-4ea2-a937-b97f329e3878-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "e1a1779f-127f-4ea2-a937-b97f329e3878" (UID: "e1a1779f-127f-4ea2-a937-b97f329e3878"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:11:20 crc kubenswrapper[4743]: I1011 01:11:20.619427 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1a1779f-127f-4ea2-a937-b97f329e3878-config" (OuterVolumeSpecName: "config") pod "e1a1779f-127f-4ea2-a937-b97f329e3878" (UID: "e1a1779f-127f-4ea2-a937-b97f329e3878"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:11:20 crc kubenswrapper[4743]: I1011 01:11:20.619509 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1a1779f-127f-4ea2-a937-b97f329e3878-kube-api-access-6gsk9" (OuterVolumeSpecName: "kube-api-access-6gsk9") pod "e1a1779f-127f-4ea2-a937-b97f329e3878" (UID: "e1a1779f-127f-4ea2-a937-b97f329e3878"). InnerVolumeSpecName "kube-api-access-6gsk9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:11:20 crc kubenswrapper[4743]: I1011 01:11:20.619577 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1a1779f-127f-4ea2-a937-b97f329e3878-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "e1a1779f-127f-4ea2-a937-b97f329e3878" (UID: "e1a1779f-127f-4ea2-a937-b97f329e3878"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:11:20 crc kubenswrapper[4743]: I1011 01:11:20.636222 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5f28cf97-999f-4f71-812b-e7f9e9accdc6" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "e1a1779f-127f-4ea2-a937-b97f329e3878" (UID: "e1a1779f-127f-4ea2-a937-b97f329e3878"). InnerVolumeSpecName "pvc-5f28cf97-999f-4f71-812b-e7f9e9accdc6". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 11 01:11:20 crc kubenswrapper[4743]: I1011 01:11:20.639375 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1a1779f-127f-4ea2-a937-b97f329e3878-web-config" (OuterVolumeSpecName: "web-config") pod "e1a1779f-127f-4ea2-a937-b97f329e3878" (UID: "e1a1779f-127f-4ea2-a937-b97f329e3878"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:11:20 crc kubenswrapper[4743]: I1011 01:11:20.703811 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e1a1779f-127f-4ea2-a937-b97f329e3878-config-out\") pod \"e1a1779f-127f-4ea2-a937-b97f329e3878\" (UID: \"e1a1779f-127f-4ea2-a937-b97f329e3878\") " Oct 11 01:11:20 crc kubenswrapper[4743]: I1011 01:11:20.703994 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e1a1779f-127f-4ea2-a937-b97f329e3878-prometheus-metric-storage-rulefiles-0\") pod \"e1a1779f-127f-4ea2-a937-b97f329e3878\" (UID: \"e1a1779f-127f-4ea2-a937-b97f329e3878\") " Oct 11 01:11:20 crc kubenswrapper[4743]: I1011 01:11:20.704391 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6gsk9\" (UniqueName: \"kubernetes.io/projected/e1a1779f-127f-4ea2-a937-b97f329e3878-kube-api-access-6gsk9\") on node \"crc\" DevicePath \"\"" 
Oct 11 01:11:20 crc kubenswrapper[4743]: I1011 01:11:20.704404 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/e1a1779f-127f-4ea2-a937-b97f329e3878-config\") on node \"crc\" DevicePath \"\"" Oct 11 01:11:20 crc kubenswrapper[4743]: I1011 01:11:20.704414 4743 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e1a1779f-127f-4ea2-a937-b97f329e3878-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Oct 11 01:11:20 crc kubenswrapper[4743]: I1011 01:11:20.704435 4743 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-5f28cf97-999f-4f71-812b-e7f9e9accdc6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5f28cf97-999f-4f71-812b-e7f9e9accdc6\") on node \"crc\" " Oct 11 01:11:20 crc kubenswrapper[4743]: I1011 01:11:20.704446 4743 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e1a1779f-127f-4ea2-a937-b97f329e3878-web-config\") on node \"crc\" DevicePath \"\"" Oct 11 01:11:20 crc kubenswrapper[4743]: I1011 01:11:20.704455 4743 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e1a1779f-127f-4ea2-a937-b97f329e3878-tls-assets\") on node \"crc\" DevicePath \"\"" Oct 11 01:11:20 crc kubenswrapper[4743]: I1011 01:11:20.705701 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1a1779f-127f-4ea2-a937-b97f329e3878-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "e1a1779f-127f-4ea2-a937-b97f329e3878" (UID: "e1a1779f-127f-4ea2-a937-b97f329e3878"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:11:20 crc kubenswrapper[4743]: I1011 01:11:20.708174 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1a1779f-127f-4ea2-a937-b97f329e3878-config-out" (OuterVolumeSpecName: "config-out") pod "e1a1779f-127f-4ea2-a937-b97f329e3878" (UID: "e1a1779f-127f-4ea2-a937-b97f329e3878"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:11:20 crc kubenswrapper[4743]: I1011 01:11:20.733092 4743 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Oct 11 01:11:20 crc kubenswrapper[4743]: I1011 01:11:20.733301 4743 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-5f28cf97-999f-4f71-812b-e7f9e9accdc6" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5f28cf97-999f-4f71-812b-e7f9e9accdc6") on node "crc" Oct 11 01:11:20 crc kubenswrapper[4743]: I1011 01:11:20.807269 4743 reconciler_common.go:293] "Volume detached for volume \"pvc-5f28cf97-999f-4f71-812b-e7f9e9accdc6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5f28cf97-999f-4f71-812b-e7f9e9accdc6\") on node \"crc\" DevicePath \"\"" Oct 11 01:11:20 crc kubenswrapper[4743]: I1011 01:11:20.807572 4743 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e1a1779f-127f-4ea2-a937-b97f329e3878-config-out\") on node \"crc\" DevicePath \"\"" Oct 11 01:11:20 crc kubenswrapper[4743]: I1011 01:11:20.807582 4743 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e1a1779f-127f-4ea2-a937-b97f329e3878-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Oct 11 01:11:21 crc kubenswrapper[4743]: I1011 01:11:21.013112 4743 provider.go:102] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Oct 11 01:11:21 crc kubenswrapper[4743]: I1011 01:11:21.051669 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-crf47"] Oct 11 01:11:21 crc kubenswrapper[4743]: W1011 01:11:21.062666 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podae3a2cd6_d036_4c48_aaed_fc9750d6c0d0.slice/crio-5dfda5948373cee3a91f6304ba6c31ae493c21f3aa9e10d3dffd01dd9da7da6f WatchSource:0}: Error finding container 5dfda5948373cee3a91f6304ba6c31ae493c21f3aa9e10d3dffd01dd9da7da6f: Status 404 returned error can't find the container with id 5dfda5948373cee3a91f6304ba6c31ae493c21f3aa9e10d3dffd01dd9da7da6f Oct 11 01:11:21 crc kubenswrapper[4743]: I1011 01:11:21.106402 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 11 01:11:21 crc kubenswrapper[4743]: I1011 01:11:21.106458 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e1a1779f-127f-4ea2-a937-b97f329e3878","Type":"ContainerDied","Data":"3878092d2f66d34b4df8c35f9e72877aa38718294bd86162acfee83fd2e0420a"} Oct 11 01:11:21 crc kubenswrapper[4743]: I1011 01:11:21.106523 4743 scope.go:117] "RemoveContainer" containerID="b10dd53f96ee32ccfd746850af138144705d93b07074a54ed230a878ff2363c1" Oct 11 01:11:21 crc kubenswrapper[4743]: I1011 01:11:21.110260 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"082aa898-adc9-4e0d-a5e3-329d36f391aa","Type":"ContainerStarted","Data":"ff22a83072260c111b2dea12392e073ec3a7b2dfc307cb31eff73281e1d27138"} Oct 11 01:11:21 crc kubenswrapper[4743]: I1011 01:11:21.110314 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"082aa898-adc9-4e0d-a5e3-329d36f391aa","Type":"ContainerStarted","Data":"8cf7186d96380e3a9f5d345a898dc3c50e5e83bfe5b37047ff35b692951ab0cc"} Oct 11 01:11:21 crc kubenswrapper[4743]: I1011 01:11:21.110324 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"082aa898-adc9-4e0d-a5e3-329d36f391aa","Type":"ContainerStarted","Data":"a8a9867ca1a063dfe427cff1ae1025905408b979f53f4d9bb836b298e6504a3c"} Oct 11 01:11:21 crc kubenswrapper[4743]: I1011 01:11:21.110333 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"082aa898-adc9-4e0d-a5e3-329d36f391aa","Type":"ContainerStarted","Data":"a9c352f992423d207b104acad08b8221a022ae3fcf94e883261071b2a4062578"} Oct 11 01:11:21 crc kubenswrapper[4743]: I1011 01:11:21.116070 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-crf47" event={"ID":"ae3a2cd6-d036-4c48-aaed-fc9750d6c0d0","Type":"ContainerStarted","Data":"5dfda5948373cee3a91f6304ba6c31ae493c21f3aa9e10d3dffd01dd9da7da6f"} Oct 11 01:11:21 crc kubenswrapper[4743]: I1011 01:11:21.132330 4743 scope.go:117] "RemoveContainer" containerID="de4931ad2988f95a60457da9c6bacff7ad35a2cfe3fe6d732a283229eb8c32bf" Oct 11 01:11:21 crc kubenswrapper[4743]: I1011 01:11:21.172417 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 11 01:11:21 crc kubenswrapper[4743]: I1011 01:11:21.181898 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 11 01:11:21 crc kubenswrapper[4743]: I1011 01:11:21.189145 4743 scope.go:117] "RemoveContainer" containerID="db842a9aa8dd2163fe2dfdb4255214fea18de948432f619b277ea689c5fbc35c" Oct 11 01:11:21 crc kubenswrapper[4743]: I1011 01:11:21.201458 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 11 01:11:21 crc kubenswrapper[4743]: E1011 01:11:21.201818 
4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1a1779f-127f-4ea2-a937-b97f329e3878" containerName="prometheus" Oct 11 01:11:21 crc kubenswrapper[4743]: I1011 01:11:21.201838 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1a1779f-127f-4ea2-a937-b97f329e3878" containerName="prometheus" Oct 11 01:11:21 crc kubenswrapper[4743]: E1011 01:11:21.201875 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1a1779f-127f-4ea2-a937-b97f329e3878" containerName="config-reloader" Oct 11 01:11:21 crc kubenswrapper[4743]: I1011 01:11:21.201884 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1a1779f-127f-4ea2-a937-b97f329e3878" containerName="config-reloader" Oct 11 01:11:21 crc kubenswrapper[4743]: E1011 01:11:21.201896 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1a1779f-127f-4ea2-a937-b97f329e3878" containerName="init-config-reloader" Oct 11 01:11:21 crc kubenswrapper[4743]: I1011 01:11:21.201903 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1a1779f-127f-4ea2-a937-b97f329e3878" containerName="init-config-reloader" Oct 11 01:11:21 crc kubenswrapper[4743]: E1011 01:11:21.201924 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1a1779f-127f-4ea2-a937-b97f329e3878" containerName="thanos-sidecar" Oct 11 01:11:21 crc kubenswrapper[4743]: I1011 01:11:21.201930 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1a1779f-127f-4ea2-a937-b97f329e3878" containerName="thanos-sidecar" Oct 11 01:11:21 crc kubenswrapper[4743]: I1011 01:11:21.202109 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1a1779f-127f-4ea2-a937-b97f329e3878" containerName="prometheus" Oct 11 01:11:21 crc kubenswrapper[4743]: I1011 01:11:21.202126 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1a1779f-127f-4ea2-a937-b97f329e3878" containerName="config-reloader" Oct 11 01:11:21 crc kubenswrapper[4743]: I1011 01:11:21.202138 4743 
memory_manager.go:354] "RemoveStaleState removing state" podUID="e1a1779f-127f-4ea2-a937-b97f329e3878" containerName="thanos-sidecar" Oct 11 01:11:21 crc kubenswrapper[4743]: I1011 01:11:21.203719 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 11 01:11:21 crc kubenswrapper[4743]: I1011 01:11:21.211696 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Oct 11 01:11:21 crc kubenswrapper[4743]: I1011 01:11:21.212117 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-tkdpc" Oct 11 01:11:21 crc kubenswrapper[4743]: I1011 01:11:21.212366 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Oct 11 01:11:21 crc kubenswrapper[4743]: I1011 01:11:21.212509 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Oct 11 01:11:21 crc kubenswrapper[4743]: I1011 01:11:21.212693 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Oct 11 01:11:21 crc kubenswrapper[4743]: I1011 01:11:21.212847 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Oct 11 01:11:21 crc kubenswrapper[4743]: I1011 01:11:21.235601 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Oct 11 01:11:21 crc kubenswrapper[4743]: I1011 01:11:21.249975 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 11 01:11:21 crc kubenswrapper[4743]: I1011 01:11:21.300952 4743 scope.go:117] "RemoveContainer" containerID="3ca7a025605c798f9dce3f44e26b3545c887fbf30de417079fadcd6d371106ca" Oct 11 01:11:21 crc kubenswrapper[4743]: I1011 
01:11:21.318702 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktxwb\" (UniqueName: \"kubernetes.io/projected/eed36ee9-8239-4139-97f3-0e7b2962f45b-kube-api-access-ktxwb\") pod \"prometheus-metric-storage-0\" (UID: \"eed36ee9-8239-4139-97f3-0e7b2962f45b\") " pod="openstack/prometheus-metric-storage-0" Oct 11 01:11:21 crc kubenswrapper[4743]: I1011 01:11:21.318744 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eed36ee9-8239-4139-97f3-0e7b2962f45b-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"eed36ee9-8239-4139-97f3-0e7b2962f45b\") " pod="openstack/prometheus-metric-storage-0" Oct 11 01:11:21 crc kubenswrapper[4743]: I1011 01:11:21.318788 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/eed36ee9-8239-4139-97f3-0e7b2962f45b-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"eed36ee9-8239-4139-97f3-0e7b2962f45b\") " pod="openstack/prometheus-metric-storage-0" Oct 11 01:11:21 crc kubenswrapper[4743]: I1011 01:11:21.318808 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/eed36ee9-8239-4139-97f3-0e7b2962f45b-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"eed36ee9-8239-4139-97f3-0e7b2962f45b\") " pod="openstack/prometheus-metric-storage-0" Oct 11 01:11:21 crc kubenswrapper[4743]: I1011 01:11:21.318838 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: 
\"kubernetes.io/secret/eed36ee9-8239-4139-97f3-0e7b2962f45b-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"eed36ee9-8239-4139-97f3-0e7b2962f45b\") " pod="openstack/prometheus-metric-storage-0" Oct 11 01:11:21 crc kubenswrapper[4743]: I1011 01:11:21.318888 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/eed36ee9-8239-4139-97f3-0e7b2962f45b-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"eed36ee9-8239-4139-97f3-0e7b2962f45b\") " pod="openstack/prometheus-metric-storage-0" Oct 11 01:11:21 crc kubenswrapper[4743]: I1011 01:11:21.318929 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/eed36ee9-8239-4139-97f3-0e7b2962f45b-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"eed36ee9-8239-4139-97f3-0e7b2962f45b\") " pod="openstack/prometheus-metric-storage-0" Oct 11 01:11:21 crc kubenswrapper[4743]: I1011 01:11:21.318951 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/eed36ee9-8239-4139-97f3-0e7b2962f45b-config\") pod \"prometheus-metric-storage-0\" (UID: \"eed36ee9-8239-4139-97f3-0e7b2962f45b\") " pod="openstack/prometheus-metric-storage-0" Oct 11 01:11:21 crc kubenswrapper[4743]: I1011 01:11:21.319003 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/eed36ee9-8239-4139-97f3-0e7b2962f45b-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"eed36ee9-8239-4139-97f3-0e7b2962f45b\") " pod="openstack/prometheus-metric-storage-0" Oct 11 01:11:21 crc kubenswrapper[4743]: I1011 
01:11:21.319032 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-5f28cf97-999f-4f71-812b-e7f9e9accdc6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5f28cf97-999f-4f71-812b-e7f9e9accdc6\") pod \"prometheus-metric-storage-0\" (UID: \"eed36ee9-8239-4139-97f3-0e7b2962f45b\") " pod="openstack/prometheus-metric-storage-0" Oct 11 01:11:21 crc kubenswrapper[4743]: I1011 01:11:21.319053 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/eed36ee9-8239-4139-97f3-0e7b2962f45b-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"eed36ee9-8239-4139-97f3-0e7b2962f45b\") " pod="openstack/prometheus-metric-storage-0" Oct 11 01:11:21 crc kubenswrapper[4743]: I1011 01:11:21.420734 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/eed36ee9-8239-4139-97f3-0e7b2962f45b-config\") pod \"prometheus-metric-storage-0\" (UID: \"eed36ee9-8239-4139-97f3-0e7b2962f45b\") " pod="openstack/prometheus-metric-storage-0" Oct 11 01:11:21 crc kubenswrapper[4743]: I1011 01:11:21.420818 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/eed36ee9-8239-4139-97f3-0e7b2962f45b-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"eed36ee9-8239-4139-97f3-0e7b2962f45b\") " pod="openstack/prometheus-metric-storage-0" Oct 11 01:11:21 crc kubenswrapper[4743]: I1011 01:11:21.420850 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-5f28cf97-999f-4f71-812b-e7f9e9accdc6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5f28cf97-999f-4f71-812b-e7f9e9accdc6\") pod \"prometheus-metric-storage-0\" (UID: \"eed36ee9-8239-4139-97f3-0e7b2962f45b\") " pod="openstack/prometheus-metric-storage-0" Oct 11 01:11:21 crc 
kubenswrapper[4743]: I1011 01:11:21.420888 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/eed36ee9-8239-4139-97f3-0e7b2962f45b-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"eed36ee9-8239-4139-97f3-0e7b2962f45b\") " pod="openstack/prometheus-metric-storage-0" Oct 11 01:11:21 crc kubenswrapper[4743]: I1011 01:11:21.420922 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktxwb\" (UniqueName: \"kubernetes.io/projected/eed36ee9-8239-4139-97f3-0e7b2962f45b-kube-api-access-ktxwb\") pod \"prometheus-metric-storage-0\" (UID: \"eed36ee9-8239-4139-97f3-0e7b2962f45b\") " pod="openstack/prometheus-metric-storage-0" Oct 11 01:11:21 crc kubenswrapper[4743]: I1011 01:11:21.420943 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eed36ee9-8239-4139-97f3-0e7b2962f45b-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"eed36ee9-8239-4139-97f3-0e7b2962f45b\") " pod="openstack/prometheus-metric-storage-0" Oct 11 01:11:21 crc kubenswrapper[4743]: I1011 01:11:21.420981 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/eed36ee9-8239-4139-97f3-0e7b2962f45b-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"eed36ee9-8239-4139-97f3-0e7b2962f45b\") " pod="openstack/prometheus-metric-storage-0" Oct 11 01:11:21 crc kubenswrapper[4743]: I1011 01:11:21.421004 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/eed36ee9-8239-4139-97f3-0e7b2962f45b-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"eed36ee9-8239-4139-97f3-0e7b2962f45b\") " 
pod="openstack/prometheus-metric-storage-0" Oct 11 01:11:21 crc kubenswrapper[4743]: I1011 01:11:21.421045 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/eed36ee9-8239-4139-97f3-0e7b2962f45b-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"eed36ee9-8239-4139-97f3-0e7b2962f45b\") " pod="openstack/prometheus-metric-storage-0" Oct 11 01:11:21 crc kubenswrapper[4743]: I1011 01:11:21.421070 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/eed36ee9-8239-4139-97f3-0e7b2962f45b-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"eed36ee9-8239-4139-97f3-0e7b2962f45b\") " pod="openstack/prometheus-metric-storage-0" Oct 11 01:11:21 crc kubenswrapper[4743]: I1011 01:11:21.421134 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/eed36ee9-8239-4139-97f3-0e7b2962f45b-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"eed36ee9-8239-4139-97f3-0e7b2962f45b\") " pod="openstack/prometheus-metric-storage-0" Oct 11 01:11:21 crc kubenswrapper[4743]: I1011 01:11:21.435601 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/eed36ee9-8239-4139-97f3-0e7b2962f45b-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"eed36ee9-8239-4139-97f3-0e7b2962f45b\") " pod="openstack/prometheus-metric-storage-0" Oct 11 01:11:21 crc kubenswrapper[4743]: I1011 01:11:21.459495 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/eed36ee9-8239-4139-97f3-0e7b2962f45b-tls-assets\") pod 
\"prometheus-metric-storage-0\" (UID: \"eed36ee9-8239-4139-97f3-0e7b2962f45b\") " pod="openstack/prometheus-metric-storage-0" Oct 11 01:11:21 crc kubenswrapper[4743]: I1011 01:11:21.459501 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/eed36ee9-8239-4139-97f3-0e7b2962f45b-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"eed36ee9-8239-4139-97f3-0e7b2962f45b\") " pod="openstack/prometheus-metric-storage-0" Oct 11 01:11:21 crc kubenswrapper[4743]: I1011 01:11:21.459933 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/eed36ee9-8239-4139-97f3-0e7b2962f45b-config\") pod \"prometheus-metric-storage-0\" (UID: \"eed36ee9-8239-4139-97f3-0e7b2962f45b\") " pod="openstack/prometheus-metric-storage-0" Oct 11 01:11:21 crc kubenswrapper[4743]: I1011 01:11:21.460713 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/eed36ee9-8239-4139-97f3-0e7b2962f45b-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"eed36ee9-8239-4139-97f3-0e7b2962f45b\") " pod="openstack/prometheus-metric-storage-0" Oct 11 01:11:21 crc kubenswrapper[4743]: I1011 01:11:21.463623 4743 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 11 01:11:21 crc kubenswrapper[4743]: I1011 01:11:21.463721 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-5f28cf97-999f-4f71-812b-e7f9e9accdc6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5f28cf97-999f-4f71-812b-e7f9e9accdc6\") pod \"prometheus-metric-storage-0\" (UID: \"eed36ee9-8239-4139-97f3-0e7b2962f45b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/20eacae98d4e3ade30240db8ae2c9a452ab5c4cf715521e04f6c7bc8a9fb59e6/globalmount\"" pod="openstack/prometheus-metric-storage-0" Oct 11 01:11:21 crc kubenswrapper[4743]: I1011 01:11:21.463928 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktxwb\" (UniqueName: \"kubernetes.io/projected/eed36ee9-8239-4139-97f3-0e7b2962f45b-kube-api-access-ktxwb\") pod \"prometheus-metric-storage-0\" (UID: \"eed36ee9-8239-4139-97f3-0e7b2962f45b\") " pod="openstack/prometheus-metric-storage-0" Oct 11 01:11:21 crc kubenswrapper[4743]: I1011 01:11:21.466238 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eed36ee9-8239-4139-97f3-0e7b2962f45b-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"eed36ee9-8239-4139-97f3-0e7b2962f45b\") " pod="openstack/prometheus-metric-storage-0" Oct 11 01:11:21 crc kubenswrapper[4743]: I1011 01:11:21.471887 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/eed36ee9-8239-4139-97f3-0e7b2962f45b-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"eed36ee9-8239-4139-97f3-0e7b2962f45b\") " pod="openstack/prometheus-metric-storage-0" Oct 11 01:11:21 crc kubenswrapper[4743]: I1011 01:11:21.473977 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"web-config\" (UniqueName: \"kubernetes.io/secret/eed36ee9-8239-4139-97f3-0e7b2962f45b-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"eed36ee9-8239-4139-97f3-0e7b2962f45b\") " pod="openstack/prometheus-metric-storage-0" Oct 11 01:11:21 crc kubenswrapper[4743]: I1011 01:11:21.495337 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/eed36ee9-8239-4139-97f3-0e7b2962f45b-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"eed36ee9-8239-4139-97f3-0e7b2962f45b\") " pod="openstack/prometheus-metric-storage-0" Oct 11 01:11:21 crc kubenswrapper[4743]: I1011 01:11:21.530351 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-5f28cf97-999f-4f71-812b-e7f9e9accdc6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5f28cf97-999f-4f71-812b-e7f9e9accdc6\") pod \"prometheus-metric-storage-0\" (UID: \"eed36ee9-8239-4139-97f3-0e7b2962f45b\") " pod="openstack/prometheus-metric-storage-0" Oct 11 01:11:21 crc kubenswrapper[4743]: I1011 01:11:21.578764 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 11 01:11:22 crc kubenswrapper[4743]: I1011 01:11:22.041927 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 11 01:11:22 crc kubenswrapper[4743]: I1011 01:11:22.107441 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1a1779f-127f-4ea2-a937-b97f329e3878" path="/var/lib/kubelet/pods/e1a1779f-127f-4ea2-a937-b97f329e3878/volumes" Oct 11 01:11:22 crc kubenswrapper[4743]: I1011 01:11:22.127804 4743 generic.go:334] "Generic (PLEG): container finished" podID="ae3a2cd6-d036-4c48-aaed-fc9750d6c0d0" containerID="463afe5f78b4ac572429650488710e5076ea660d5576f21e3d62c1ec4b3b3857" exitCode=0 Oct 11 01:11:22 crc kubenswrapper[4743]: I1011 01:11:22.127892 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-crf47" event={"ID":"ae3a2cd6-d036-4c48-aaed-fc9750d6c0d0","Type":"ContainerDied","Data":"463afe5f78b4ac572429650488710e5076ea660d5576f21e3d62c1ec4b3b3857"} Oct 11 01:11:23 crc kubenswrapper[4743]: I1011 01:11:23.150720 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"082aa898-adc9-4e0d-a5e3-329d36f391aa","Type":"ContainerStarted","Data":"bf1a24f45bb47e3b0d516cfbaf75b438bcf807a04ff810ab116978411edf4695"} Oct 11 01:11:23 crc kubenswrapper[4743]: I1011 01:11:23.151206 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"082aa898-adc9-4e0d-a5e3-329d36f391aa","Type":"ContainerStarted","Data":"1d5d34c42d6b72db4011dbbf90c8db09441df3ce0a1928d31f91752db684e8f3"} Oct 11 01:11:23 crc kubenswrapper[4743]: I1011 01:11:23.151217 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"082aa898-adc9-4e0d-a5e3-329d36f391aa","Type":"ContainerStarted","Data":"4b585d92c0100b26852ab0ee3cfd4923082b399983b629f6eed775fb421c2249"} Oct 11 01:11:23 crc 
kubenswrapper[4743]: I1011 01:11:23.151228 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"082aa898-adc9-4e0d-a5e3-329d36f391aa","Type":"ContainerStarted","Data":"846c51614361352b578d2dfa6a10812d44a52cb480e57cd428679653b74e82ec"} Oct 11 01:11:23 crc kubenswrapper[4743]: I1011 01:11:23.153156 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"eed36ee9-8239-4139-97f3-0e7b2962f45b","Type":"ContainerStarted","Data":"92da9157fe7cbea1457e989b9f4334faecf7a8abb338418a7176007f5aecf798"} Oct 11 01:11:23 crc kubenswrapper[4743]: I1011 01:11:23.540589 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-crf47" Oct 11 01:11:23 crc kubenswrapper[4743]: I1011 01:11:23.687780 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nznnh\" (UniqueName: \"kubernetes.io/projected/ae3a2cd6-d036-4c48-aaed-fc9750d6c0d0-kube-api-access-nznnh\") pod \"ae3a2cd6-d036-4c48-aaed-fc9750d6c0d0\" (UID: \"ae3a2cd6-d036-4c48-aaed-fc9750d6c0d0\") " Oct 11 01:11:23 crc kubenswrapper[4743]: I1011 01:11:23.695050 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae3a2cd6-d036-4c48-aaed-fc9750d6c0d0-kube-api-access-nznnh" (OuterVolumeSpecName: "kube-api-access-nznnh") pod "ae3a2cd6-d036-4c48-aaed-fc9750d6c0d0" (UID: "ae3a2cd6-d036-4c48-aaed-fc9750d6c0d0"). InnerVolumeSpecName "kube-api-access-nznnh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:11:23 crc kubenswrapper[4743]: I1011 01:11:23.789573 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nznnh\" (UniqueName: \"kubernetes.io/projected/ae3a2cd6-d036-4c48-aaed-fc9750d6c0d0-kube-api-access-nznnh\") on node \"crc\" DevicePath \"\"" Oct 11 01:11:24 crc kubenswrapper[4743]: I1011 01:11:24.164089 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-crf47" event={"ID":"ae3a2cd6-d036-4c48-aaed-fc9750d6c0d0","Type":"ContainerDied","Data":"5dfda5948373cee3a91f6304ba6c31ae493c21f3aa9e10d3dffd01dd9da7da6f"} Oct 11 01:11:24 crc kubenswrapper[4743]: I1011 01:11:24.164127 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5dfda5948373cee3a91f6304ba6c31ae493c21f3aa9e10d3dffd01dd9da7da6f" Oct 11 01:11:24 crc kubenswrapper[4743]: I1011 01:11:24.164189 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-crf47" Oct 11 01:11:25 crc kubenswrapper[4743]: I1011 01:11:25.176353 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"082aa898-adc9-4e0d-a5e3-329d36f391aa","Type":"ContainerStarted","Data":"f6e5971b16d67375ae6df1aaf256850b7b1b5a4a686d998b667ccad3cf2fa2c6"} Oct 11 01:11:26 crc kubenswrapper[4743]: I1011 01:11:26.186791 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"eed36ee9-8239-4139-97f3-0e7b2962f45b","Type":"ContainerStarted","Data":"363b58c9fb704ed36159ba5fd9ba18575d1bf189f256202f0a025abc798664c3"} Oct 11 01:11:26 crc kubenswrapper[4743]: I1011 01:11:26.194125 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"082aa898-adc9-4e0d-a5e3-329d36f391aa","Type":"ContainerStarted","Data":"ed9fc6224d475e597fd3943f968c898e3ac862180876ffb50428b2814a1e57d6"} Oct 11 01:11:28 crc kubenswrapper[4743]: I1011 01:11:28.130358 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 11 01:11:28 crc kubenswrapper[4743]: I1011 01:11:28.443079 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 11 01:11:28 crc kubenswrapper[4743]: I1011 01:11:28.562733 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-nmtm5"] Oct 11 01:11:28 crc kubenswrapper[4743]: E1011 01:11:28.563080 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae3a2cd6-d036-4c48-aaed-fc9750d6c0d0" containerName="mariadb-database-create" Oct 11 01:11:28 crc kubenswrapper[4743]: I1011 01:11:28.563097 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae3a2cd6-d036-4c48-aaed-fc9750d6c0d0" containerName="mariadb-database-create" Oct 11 01:11:28 crc kubenswrapper[4743]: I1011 01:11:28.563266 4743 
memory_manager.go:354] "RemoveStaleState removing state" podUID="ae3a2cd6-d036-4c48-aaed-fc9750d6c0d0" containerName="mariadb-database-create" Oct 11 01:11:28 crc kubenswrapper[4743]: I1011 01:11:28.563815 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-nmtm5" Oct 11 01:11:28 crc kubenswrapper[4743]: I1011 01:11:28.578053 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-nmtm5"] Oct 11 01:11:28 crc kubenswrapper[4743]: I1011 01:11:28.644206 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-rvpwq"] Oct 11 01:11:28 crc kubenswrapper[4743]: I1011 01:11:28.646963 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-rvpwq" Oct 11 01:11:28 crc kubenswrapper[4743]: I1011 01:11:28.661328 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-rvpwq"] Oct 11 01:11:28 crc kubenswrapper[4743]: I1011 01:11:28.693796 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnktr\" (UniqueName: \"kubernetes.io/projected/a2bc9888-ce44-44fc-84b0-747f726ec516-kube-api-access-jnktr\") pod \"cinder-db-create-rvpwq\" (UID: \"a2bc9888-ce44-44fc-84b0-747f726ec516\") " pod="openstack/cinder-db-create-rvpwq" Oct 11 01:11:28 crc kubenswrapper[4743]: I1011 01:11:28.694201 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9rvw\" (UniqueName: \"kubernetes.io/projected/85d3caa5-89e1-4b4b-a4e2-5ec2dcb90002-kube-api-access-j9rvw\") pod \"heat-db-create-nmtm5\" (UID: \"85d3caa5-89e1-4b4b-a4e2-5ec2dcb90002\") " pod="openstack/heat-db-create-nmtm5" Oct 11 01:11:28 crc kubenswrapper[4743]: I1011 01:11:28.749904 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-4wts2"] Oct 11 01:11:28 crc kubenswrapper[4743]: I1011 
01:11:28.752083 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-4wts2" Oct 11 01:11:28 crc kubenswrapper[4743]: I1011 01:11:28.760013 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-4wts2"] Oct 11 01:11:28 crc kubenswrapper[4743]: I1011 01:11:28.797873 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chqg4\" (UniqueName: \"kubernetes.io/projected/860e062d-6883-4ea9-8e44-8b2f4e9bae60-kube-api-access-chqg4\") pod \"barbican-db-create-4wts2\" (UID: \"860e062d-6883-4ea9-8e44-8b2f4e9bae60\") " pod="openstack/barbican-db-create-4wts2" Oct 11 01:11:28 crc kubenswrapper[4743]: I1011 01:11:28.797999 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9rvw\" (UniqueName: \"kubernetes.io/projected/85d3caa5-89e1-4b4b-a4e2-5ec2dcb90002-kube-api-access-j9rvw\") pod \"heat-db-create-nmtm5\" (UID: \"85d3caa5-89e1-4b4b-a4e2-5ec2dcb90002\") " pod="openstack/heat-db-create-nmtm5" Oct 11 01:11:28 crc kubenswrapper[4743]: I1011 01:11:28.798070 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnktr\" (UniqueName: \"kubernetes.io/projected/a2bc9888-ce44-44fc-84b0-747f726ec516-kube-api-access-jnktr\") pod \"cinder-db-create-rvpwq\" (UID: \"a2bc9888-ce44-44fc-84b0-747f726ec516\") " pod="openstack/cinder-db-create-rvpwq" Oct 11 01:11:28 crc kubenswrapper[4743]: I1011 01:11:28.818883 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9rvw\" (UniqueName: \"kubernetes.io/projected/85d3caa5-89e1-4b4b-a4e2-5ec2dcb90002-kube-api-access-j9rvw\") pod \"heat-db-create-nmtm5\" (UID: \"85d3caa5-89e1-4b4b-a4e2-5ec2dcb90002\") " pod="openstack/heat-db-create-nmtm5" Oct 11 01:11:28 crc kubenswrapper[4743]: I1011 01:11:28.820927 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-jnktr\" (UniqueName: \"kubernetes.io/projected/a2bc9888-ce44-44fc-84b0-747f726ec516-kube-api-access-jnktr\") pod \"cinder-db-create-rvpwq\" (UID: \"a2bc9888-ce44-44fc-84b0-747f726ec516\") " pod="openstack/cinder-db-create-rvpwq" Oct 11 01:11:28 crc kubenswrapper[4743]: I1011 01:11:28.859309 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-srz27"] Oct 11 01:11:28 crc kubenswrapper[4743]: I1011 01:11:28.860479 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-srz27" Oct 11 01:11:28 crc kubenswrapper[4743]: I1011 01:11:28.874328 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-srz27"] Oct 11 01:11:28 crc kubenswrapper[4743]: I1011 01:11:28.885262 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-nmtm5" Oct 11 01:11:28 crc kubenswrapper[4743]: I1011 01:11:28.900545 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chqg4\" (UniqueName: \"kubernetes.io/projected/860e062d-6883-4ea9-8e44-8b2f4e9bae60-kube-api-access-chqg4\") pod \"barbican-db-create-4wts2\" (UID: \"860e062d-6883-4ea9-8e44-8b2f4e9bae60\") " pod="openstack/barbican-db-create-4wts2" Oct 11 01:11:28 crc kubenswrapper[4743]: I1011 01:11:28.900619 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frzw5\" (UniqueName: \"kubernetes.io/projected/583eac1b-7b00-44ab-8f94-59b016a1d635-kube-api-access-frzw5\") pod \"neutron-db-create-srz27\" (UID: \"583eac1b-7b00-44ab-8f94-59b016a1d635\") " pod="openstack/neutron-db-create-srz27" Oct 11 01:11:28 crc kubenswrapper[4743]: I1011 01:11:28.914993 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-zdrsb"] Oct 11 01:11:28 crc kubenswrapper[4743]: I1011 01:11:28.916109 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-zdrsb" Oct 11 01:11:28 crc kubenswrapper[4743]: I1011 01:11:28.919467 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 11 01:11:28 crc kubenswrapper[4743]: I1011 01:11:28.920141 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 11 01:11:28 crc kubenswrapper[4743]: I1011 01:11:28.920367 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-4qpmq" Oct 11 01:11:28 crc kubenswrapper[4743]: I1011 01:11:28.920514 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 11 01:11:28 crc kubenswrapper[4743]: I1011 01:11:28.935958 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-zdrsb"] Oct 11 01:11:28 crc kubenswrapper[4743]: I1011 01:11:28.964099 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chqg4\" (UniqueName: \"kubernetes.io/projected/860e062d-6883-4ea9-8e44-8b2f4e9bae60-kube-api-access-chqg4\") pod \"barbican-db-create-4wts2\" (UID: \"860e062d-6883-4ea9-8e44-8b2f4e9bae60\") " pod="openstack/barbican-db-create-4wts2" Oct 11 01:11:28 crc kubenswrapper[4743]: I1011 01:11:28.968113 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-rvpwq" Oct 11 01:11:29 crc kubenswrapper[4743]: I1011 01:11:29.002396 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f546b6b-8484-4ad7-879f-593cf31efaaa-config-data\") pod \"keystone-db-sync-zdrsb\" (UID: \"1f546b6b-8484-4ad7-879f-593cf31efaaa\") " pod="openstack/keystone-db-sync-zdrsb" Oct 11 01:11:29 crc kubenswrapper[4743]: I1011 01:11:29.002437 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jn2r7\" (UniqueName: \"kubernetes.io/projected/1f546b6b-8484-4ad7-879f-593cf31efaaa-kube-api-access-jn2r7\") pod \"keystone-db-sync-zdrsb\" (UID: \"1f546b6b-8484-4ad7-879f-593cf31efaaa\") " pod="openstack/keystone-db-sync-zdrsb" Oct 11 01:11:29 crc kubenswrapper[4743]: I1011 01:11:29.002738 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frzw5\" (UniqueName: \"kubernetes.io/projected/583eac1b-7b00-44ab-8f94-59b016a1d635-kube-api-access-frzw5\") pod \"neutron-db-create-srz27\" (UID: \"583eac1b-7b00-44ab-8f94-59b016a1d635\") " pod="openstack/neutron-db-create-srz27" Oct 11 01:11:29 crc kubenswrapper[4743]: I1011 01:11:29.003261 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f546b6b-8484-4ad7-879f-593cf31efaaa-combined-ca-bundle\") pod \"keystone-db-sync-zdrsb\" (UID: \"1f546b6b-8484-4ad7-879f-593cf31efaaa\") " pod="openstack/keystone-db-sync-zdrsb" Oct 11 01:11:29 crc kubenswrapper[4743]: I1011 01:11:29.029710 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frzw5\" (UniqueName: \"kubernetes.io/projected/583eac1b-7b00-44ab-8f94-59b016a1d635-kube-api-access-frzw5\") pod \"neutron-db-create-srz27\" (UID: \"583eac1b-7b00-44ab-8f94-59b016a1d635\") " 
pod="openstack/neutron-db-create-srz27" Oct 11 01:11:29 crc kubenswrapper[4743]: I1011 01:11:29.066704 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-4wts2" Oct 11 01:11:29 crc kubenswrapper[4743]: I1011 01:11:29.104951 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f546b6b-8484-4ad7-879f-593cf31efaaa-config-data\") pod \"keystone-db-sync-zdrsb\" (UID: \"1f546b6b-8484-4ad7-879f-593cf31efaaa\") " pod="openstack/keystone-db-sync-zdrsb" Oct 11 01:11:29 crc kubenswrapper[4743]: I1011 01:11:29.105004 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jn2r7\" (UniqueName: \"kubernetes.io/projected/1f546b6b-8484-4ad7-879f-593cf31efaaa-kube-api-access-jn2r7\") pod \"keystone-db-sync-zdrsb\" (UID: \"1f546b6b-8484-4ad7-879f-593cf31efaaa\") " pod="openstack/keystone-db-sync-zdrsb" Oct 11 01:11:29 crc kubenswrapper[4743]: I1011 01:11:29.105140 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f546b6b-8484-4ad7-879f-593cf31efaaa-combined-ca-bundle\") pod \"keystone-db-sync-zdrsb\" (UID: \"1f546b6b-8484-4ad7-879f-593cf31efaaa\") " pod="openstack/keystone-db-sync-zdrsb" Oct 11 01:11:29 crc kubenswrapper[4743]: I1011 01:11:29.110454 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f546b6b-8484-4ad7-879f-593cf31efaaa-combined-ca-bundle\") pod \"keystone-db-sync-zdrsb\" (UID: \"1f546b6b-8484-4ad7-879f-593cf31efaaa\") " pod="openstack/keystone-db-sync-zdrsb" Oct 11 01:11:29 crc kubenswrapper[4743]: I1011 01:11:29.110671 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f546b6b-8484-4ad7-879f-593cf31efaaa-config-data\") pod 
\"keystone-db-sync-zdrsb\" (UID: \"1f546b6b-8484-4ad7-879f-593cf31efaaa\") " pod="openstack/keystone-db-sync-zdrsb" Oct 11 01:11:29 crc kubenswrapper[4743]: I1011 01:11:29.126677 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jn2r7\" (UniqueName: \"kubernetes.io/projected/1f546b6b-8484-4ad7-879f-593cf31efaaa-kube-api-access-jn2r7\") pod \"keystone-db-sync-zdrsb\" (UID: \"1f546b6b-8484-4ad7-879f-593cf31efaaa\") " pod="openstack/keystone-db-sync-zdrsb" Oct 11 01:11:29 crc kubenswrapper[4743]: I1011 01:11:29.217362 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-srz27" Oct 11 01:11:29 crc kubenswrapper[4743]: I1011 01:11:29.232964 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-zdrsb" Oct 11 01:11:30 crc kubenswrapper[4743]: I1011 01:11:30.367386 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-5027-account-create-9tvlm"] Oct 11 01:11:30 crc kubenswrapper[4743]: I1011 01:11:30.371073 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-5027-account-create-9tvlm" Oct 11 01:11:30 crc kubenswrapper[4743]: I1011 01:11:30.374773 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-openstack-cell1-db-secret" Oct 11 01:11:30 crc kubenswrapper[4743]: I1011 01:11:30.389876 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-5027-account-create-9tvlm"] Oct 11 01:11:30 crc kubenswrapper[4743]: I1011 01:11:30.430318 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zlsq\" (UniqueName: \"kubernetes.io/projected/3147b11a-05a1-4e5b-93b9-14748977e08e-kube-api-access-2zlsq\") pod \"mysqld-exporter-5027-account-create-9tvlm\" (UID: \"3147b11a-05a1-4e5b-93b9-14748977e08e\") " pod="openstack/mysqld-exporter-5027-account-create-9tvlm" Oct 11 01:11:30 crc kubenswrapper[4743]: I1011 01:11:30.531804 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zlsq\" (UniqueName: \"kubernetes.io/projected/3147b11a-05a1-4e5b-93b9-14748977e08e-kube-api-access-2zlsq\") pod \"mysqld-exporter-5027-account-create-9tvlm\" (UID: \"3147b11a-05a1-4e5b-93b9-14748977e08e\") " pod="openstack/mysqld-exporter-5027-account-create-9tvlm" Oct 11 01:11:30 crc kubenswrapper[4743]: I1011 01:11:30.553171 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zlsq\" (UniqueName: \"kubernetes.io/projected/3147b11a-05a1-4e5b-93b9-14748977e08e-kube-api-access-2zlsq\") pod \"mysqld-exporter-5027-account-create-9tvlm\" (UID: \"3147b11a-05a1-4e5b-93b9-14748977e08e\") " pod="openstack/mysqld-exporter-5027-account-create-9tvlm" Oct 11 01:11:30 crc kubenswrapper[4743]: I1011 01:11:30.691490 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-5027-account-create-9tvlm" Oct 11 01:11:32 crc kubenswrapper[4743]: I1011 01:11:32.249639 4743 generic.go:334] "Generic (PLEG): container finished" podID="eed36ee9-8239-4139-97f3-0e7b2962f45b" containerID="363b58c9fb704ed36159ba5fd9ba18575d1bf189f256202f0a025abc798664c3" exitCode=0 Oct 11 01:11:32 crc kubenswrapper[4743]: I1011 01:11:32.249691 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"eed36ee9-8239-4139-97f3-0e7b2962f45b","Type":"ContainerDied","Data":"363b58c9fb704ed36159ba5fd9ba18575d1bf189f256202f0a025abc798664c3"} Oct 11 01:11:34 crc kubenswrapper[4743]: I1011 01:11:34.281936 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"eed36ee9-8239-4139-97f3-0e7b2962f45b","Type":"ContainerStarted","Data":"00553b9cb314c79b196dd3872613ff19afcb983163ded4c5dfabd5bee7808008"} Oct 11 01:11:34 crc kubenswrapper[4743]: I1011 01:11:34.287956 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"082aa898-adc9-4e0d-a5e3-329d36f391aa","Type":"ContainerStarted","Data":"aebbfe1ba3e122c68255fb97129dab383c26f7e8d0f35161746a8a91fdf03dc6"} Oct 11 01:11:34 crc kubenswrapper[4743]: I1011 01:11:34.295390 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-rvpwq"] Oct 11 01:11:34 crc kubenswrapper[4743]: W1011 01:11:34.319824 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda2bc9888_ce44_44fc_84b0_747f726ec516.slice/crio-507af387a2499fee5be8e499e39f9e4633f6a0d434e0890b73f9c1f49d750fbc WatchSource:0}: Error finding container 507af387a2499fee5be8e499e39f9e4633f6a0d434e0890b73f9c1f49d750fbc: Status 404 returned error can't find the container with id 507af387a2499fee5be8e499e39f9e4633f6a0d434e0890b73f9c1f49d750fbc Oct 11 01:11:34 crc 
kubenswrapper[4743]: I1011 01:11:34.329632 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-5027-account-create-9tvlm"] Oct 11 01:11:34 crc kubenswrapper[4743]: I1011 01:11:34.689624 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-zdrsb"] Oct 11 01:11:34 crc kubenswrapper[4743]: I1011 01:11:34.744268 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-4wts2"] Oct 11 01:11:34 crc kubenswrapper[4743]: I1011 01:11:34.776456 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-nmtm5"] Oct 11 01:11:34 crc kubenswrapper[4743]: I1011 01:11:34.782461 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-srz27"] Oct 11 01:11:34 crc kubenswrapper[4743]: W1011 01:11:34.844212 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod583eac1b_7b00_44ab_8f94_59b016a1d635.slice/crio-e4a48b48b519e94fa6390903ee29966d0b808449d2e2decc278b3264551d77b1 WatchSource:0}: Error finding container e4a48b48b519e94fa6390903ee29966d0b808449d2e2decc278b3264551d77b1: Status 404 returned error can't find the container with id e4a48b48b519e94fa6390903ee29966d0b808449d2e2decc278b3264551d77b1 Oct 11 01:11:35 crc kubenswrapper[4743]: I1011 01:11:35.303747 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-4wts2" event={"ID":"860e062d-6883-4ea9-8e44-8b2f4e9bae60","Type":"ContainerStarted","Data":"cce7ba1556d93ffc1c9c1d3cd3274118e3c425e6dade6ed7a56a6db89b174f82"} Oct 11 01:11:35 crc kubenswrapper[4743]: I1011 01:11:35.306212 4743 generic.go:334] "Generic (PLEG): container finished" podID="3147b11a-05a1-4e5b-93b9-14748977e08e" containerID="70e9afaaf9657a2a88ff06cd48b80ae8cefd54ae9a3238e5e7ef25ea7d659cef" exitCode=0 Oct 11 01:11:35 crc kubenswrapper[4743]: I1011 01:11:35.306660 4743 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/mysqld-exporter-5027-account-create-9tvlm" event={"ID":"3147b11a-05a1-4e5b-93b9-14748977e08e","Type":"ContainerDied","Data":"70e9afaaf9657a2a88ff06cd48b80ae8cefd54ae9a3238e5e7ef25ea7d659cef"} Oct 11 01:11:35 crc kubenswrapper[4743]: I1011 01:11:35.306728 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-5027-account-create-9tvlm" event={"ID":"3147b11a-05a1-4e5b-93b9-14748977e08e","Type":"ContainerStarted","Data":"be0487d2e678d12031ad9cba4f90013cf91b82ba9f5847590c3651d8ac86f1cd"} Oct 11 01:11:35 crc kubenswrapper[4743]: I1011 01:11:35.308169 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-srz27" event={"ID":"583eac1b-7b00-44ab-8f94-59b016a1d635","Type":"ContainerStarted","Data":"e4a48b48b519e94fa6390903ee29966d0b808449d2e2decc278b3264551d77b1"} Oct 11 01:11:35 crc kubenswrapper[4743]: I1011 01:11:35.311603 4743 generic.go:334] "Generic (PLEG): container finished" podID="a2bc9888-ce44-44fc-84b0-747f726ec516" containerID="e83332b3e2b3333428ecde30b206b683751b46aea84ab7f7b12a0a0ab9be979d" exitCode=0 Oct 11 01:11:35 crc kubenswrapper[4743]: I1011 01:11:35.311657 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-rvpwq" event={"ID":"a2bc9888-ce44-44fc-84b0-747f726ec516","Type":"ContainerDied","Data":"e83332b3e2b3333428ecde30b206b683751b46aea84ab7f7b12a0a0ab9be979d"} Oct 11 01:11:35 crc kubenswrapper[4743]: I1011 01:11:35.311674 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-rvpwq" event={"ID":"a2bc9888-ce44-44fc-84b0-747f726ec516","Type":"ContainerStarted","Data":"507af387a2499fee5be8e499e39f9e4633f6a0d434e0890b73f9c1f49d750fbc"} Oct 11 01:11:35 crc kubenswrapper[4743]: I1011 01:11:35.318106 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-nmtm5" 
event={"ID":"85d3caa5-89e1-4b4b-a4e2-5ec2dcb90002","Type":"ContainerStarted","Data":"0a528ef68a858f321e9ea6ef54e525da7741312114fdc21eeef68e9d192935b5"} Oct 11 01:11:35 crc kubenswrapper[4743]: I1011 01:11:35.337132 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"082aa898-adc9-4e0d-a5e3-329d36f391aa","Type":"ContainerStarted","Data":"d813fc97985d738ee035d30bf2d533dd65d0ad6b97dae6d5b67e6491dfd4301b"} Oct 11 01:11:35 crc kubenswrapper[4743]: I1011 01:11:35.337199 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"082aa898-adc9-4e0d-a5e3-329d36f391aa","Type":"ContainerStarted","Data":"b9342f1823e3d6254aa5a3ad0703d94ac984e886b218dacf81482d89eecc7515"} Oct 11 01:11:35 crc kubenswrapper[4743]: I1011 01:11:35.339069 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-zdrsb" event={"ID":"1f546b6b-8484-4ad7-879f-593cf31efaaa","Type":"ContainerStarted","Data":"22c90d4b209239ba748eea890f2b409b14126efd48cc5b4410ef63bddd623662"} Oct 11 01:11:35 crc kubenswrapper[4743]: I1011 01:11:35.340556 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-dznrn" event={"ID":"f60a76bb-fc23-4e3a-b7ad-123f65747952","Type":"ContainerStarted","Data":"256ff5de7d671245b9e23bb339c4f711ebe44aff6310320abf53d018d077d3dd"} Oct 11 01:11:35 crc kubenswrapper[4743]: I1011 01:11:35.375199 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-dznrn" podStartSLOduration=2.706059924 podStartE2EDuration="18.3751833s" podCreationTimestamp="2025-10-11 01:11:17 +0000 UTC" firstStartedPulling="2025-10-11 01:11:18.233007288 +0000 UTC m=+1172.885987685" lastFinishedPulling="2025-10-11 01:11:33.902130654 +0000 UTC m=+1188.555111061" observedRunningTime="2025-10-11 01:11:35.36571085 +0000 UTC m=+1190.018691247" watchObservedRunningTime="2025-10-11 01:11:35.3751833 +0000 UTC m=+1190.028163697" Oct 11 
01:11:36 crc kubenswrapper[4743]: I1011 01:11:36.355331 4743 generic.go:334] "Generic (PLEG): container finished" podID="85d3caa5-89e1-4b4b-a4e2-5ec2dcb90002" containerID="af90b0865142410807d8db6070a8f38daa83fd613473c41eca141d2a24d258b2" exitCode=0 Oct 11 01:11:36 crc kubenswrapper[4743]: I1011 01:11:36.355376 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-nmtm5" event={"ID":"85d3caa5-89e1-4b4b-a4e2-5ec2dcb90002","Type":"ContainerDied","Data":"af90b0865142410807d8db6070a8f38daa83fd613473c41eca141d2a24d258b2"} Oct 11 01:11:36 crc kubenswrapper[4743]: I1011 01:11:36.363877 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"082aa898-adc9-4e0d-a5e3-329d36f391aa","Type":"ContainerStarted","Data":"e1f1e91fccd2499e5c5d0d0d23c17920f0536b8b13ab70d3b9ec198c7b6dc3ae"} Oct 11 01:11:36 crc kubenswrapper[4743]: I1011 01:11:36.363929 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"082aa898-adc9-4e0d-a5e3-329d36f391aa","Type":"ContainerStarted","Data":"59c0381758b38998c39d943bbacdde51a53307d8976a364f6aad2c9ab1b7b0ce"} Oct 11 01:11:36 crc kubenswrapper[4743]: I1011 01:11:36.365492 4743 generic.go:334] "Generic (PLEG): container finished" podID="860e062d-6883-4ea9-8e44-8b2f4e9bae60" containerID="1dff9ac51551dc7108a67e25d8c101ce005b31d733514e1d194a9523e985cc5f" exitCode=0 Oct 11 01:11:36 crc kubenswrapper[4743]: I1011 01:11:36.365629 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-4wts2" event={"ID":"860e062d-6883-4ea9-8e44-8b2f4e9bae60","Type":"ContainerDied","Data":"1dff9ac51551dc7108a67e25d8c101ce005b31d733514e1d194a9523e985cc5f"} Oct 11 01:11:36 crc kubenswrapper[4743]: I1011 01:11:36.369484 4743 generic.go:334] "Generic (PLEG): container finished" podID="583eac1b-7b00-44ab-8f94-59b016a1d635" containerID="08f957faee5a7ed7a08ec2e27ccb9a79aaad18f10f96bc3b3b821591a4f7fda6" exitCode=0 Oct 11 01:11:36 crc 
kubenswrapper[4743]: I1011 01:11:36.369605 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-srz27" event={"ID":"583eac1b-7b00-44ab-8f94-59b016a1d635","Type":"ContainerDied","Data":"08f957faee5a7ed7a08ec2e27ccb9a79aaad18f10f96bc3b3b821591a4f7fda6"} Oct 11 01:11:36 crc kubenswrapper[4743]: I1011 01:11:36.446810 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=46.023625052 podStartE2EDuration="51.446796004s" podCreationTimestamp="2025-10-11 01:10:45 +0000 UTC" firstStartedPulling="2025-10-11 01:11:18.887050259 +0000 UTC m=+1173.540030656" lastFinishedPulling="2025-10-11 01:11:24.310221211 +0000 UTC m=+1178.963201608" observedRunningTime="2025-10-11 01:11:36.443205423 +0000 UTC m=+1191.096185840" watchObservedRunningTime="2025-10-11 01:11:36.446796004 +0000 UTC m=+1191.099776401" Oct 11 01:11:36 crc kubenswrapper[4743]: I1011 01:11:36.739784 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-9jggh"] Oct 11 01:11:36 crc kubenswrapper[4743]: I1011 01:11:36.744696 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-9jggh" Oct 11 01:11:36 crc kubenswrapper[4743]: I1011 01:11:36.758316 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Oct 11 01:11:36 crc kubenswrapper[4743]: I1011 01:11:36.785645 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-9jggh"] Oct 11 01:11:36 crc kubenswrapper[4743]: I1011 01:11:36.880786 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/225b975c-f87f-4830-bedb-8ecbfb7bd941-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-9jggh\" (UID: \"225b975c-f87f-4830-bedb-8ecbfb7bd941\") " pod="openstack/dnsmasq-dns-5c79d794d7-9jggh" Oct 11 01:11:36 crc kubenswrapper[4743]: I1011 01:11:36.880960 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/225b975c-f87f-4830-bedb-8ecbfb7bd941-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-9jggh\" (UID: \"225b975c-f87f-4830-bedb-8ecbfb7bd941\") " pod="openstack/dnsmasq-dns-5c79d794d7-9jggh" Oct 11 01:11:36 crc kubenswrapper[4743]: I1011 01:11:36.881016 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztzzd\" (UniqueName: \"kubernetes.io/projected/225b975c-f87f-4830-bedb-8ecbfb7bd941-kube-api-access-ztzzd\") pod \"dnsmasq-dns-5c79d794d7-9jggh\" (UID: \"225b975c-f87f-4830-bedb-8ecbfb7bd941\") " pod="openstack/dnsmasq-dns-5c79d794d7-9jggh" Oct 11 01:11:36 crc kubenswrapper[4743]: I1011 01:11:36.881040 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/225b975c-f87f-4830-bedb-8ecbfb7bd941-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-9jggh\" (UID: 
\"225b975c-f87f-4830-bedb-8ecbfb7bd941\") " pod="openstack/dnsmasq-dns-5c79d794d7-9jggh" Oct 11 01:11:36 crc kubenswrapper[4743]: I1011 01:11:36.881096 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/225b975c-f87f-4830-bedb-8ecbfb7bd941-config\") pod \"dnsmasq-dns-5c79d794d7-9jggh\" (UID: \"225b975c-f87f-4830-bedb-8ecbfb7bd941\") " pod="openstack/dnsmasq-dns-5c79d794d7-9jggh" Oct 11 01:11:36 crc kubenswrapper[4743]: I1011 01:11:36.881154 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/225b975c-f87f-4830-bedb-8ecbfb7bd941-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-9jggh\" (UID: \"225b975c-f87f-4830-bedb-8ecbfb7bd941\") " pod="openstack/dnsmasq-dns-5c79d794d7-9jggh" Oct 11 01:11:36 crc kubenswrapper[4743]: I1011 01:11:36.975272 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-rvpwq" Oct 11 01:11:36 crc kubenswrapper[4743]: I1011 01:11:36.982930 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/225b975c-f87f-4830-bedb-8ecbfb7bd941-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-9jggh\" (UID: \"225b975c-f87f-4830-bedb-8ecbfb7bd941\") " pod="openstack/dnsmasq-dns-5c79d794d7-9jggh" Oct 11 01:11:36 crc kubenswrapper[4743]: I1011 01:11:36.983014 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/225b975c-f87f-4830-bedb-8ecbfb7bd941-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-9jggh\" (UID: \"225b975c-f87f-4830-bedb-8ecbfb7bd941\") " pod="openstack/dnsmasq-dns-5c79d794d7-9jggh" Oct 11 01:11:36 crc kubenswrapper[4743]: I1011 01:11:36.983132 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-ztzzd\" (UniqueName: \"kubernetes.io/projected/225b975c-f87f-4830-bedb-8ecbfb7bd941-kube-api-access-ztzzd\") pod \"dnsmasq-dns-5c79d794d7-9jggh\" (UID: \"225b975c-f87f-4830-bedb-8ecbfb7bd941\") " pod="openstack/dnsmasq-dns-5c79d794d7-9jggh" Oct 11 01:11:36 crc kubenswrapper[4743]: I1011 01:11:36.983163 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/225b975c-f87f-4830-bedb-8ecbfb7bd941-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-9jggh\" (UID: \"225b975c-f87f-4830-bedb-8ecbfb7bd941\") " pod="openstack/dnsmasq-dns-5c79d794d7-9jggh" Oct 11 01:11:36 crc kubenswrapper[4743]: I1011 01:11:36.983225 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/225b975c-f87f-4830-bedb-8ecbfb7bd941-config\") pod \"dnsmasq-dns-5c79d794d7-9jggh\" (UID: \"225b975c-f87f-4830-bedb-8ecbfb7bd941\") " pod="openstack/dnsmasq-dns-5c79d794d7-9jggh" Oct 11 01:11:36 crc kubenswrapper[4743]: I1011 01:11:36.983299 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/225b975c-f87f-4830-bedb-8ecbfb7bd941-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-9jggh\" (UID: \"225b975c-f87f-4830-bedb-8ecbfb7bd941\") " pod="openstack/dnsmasq-dns-5c79d794d7-9jggh" Oct 11 01:11:36 crc kubenswrapper[4743]: I1011 01:11:36.984069 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/225b975c-f87f-4830-bedb-8ecbfb7bd941-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-9jggh\" (UID: \"225b975c-f87f-4830-bedb-8ecbfb7bd941\") " pod="openstack/dnsmasq-dns-5c79d794d7-9jggh" Oct 11 01:11:36 crc kubenswrapper[4743]: I1011 01:11:36.984354 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/225b975c-f87f-4830-bedb-8ecbfb7bd941-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-9jggh\" (UID: \"225b975c-f87f-4830-bedb-8ecbfb7bd941\") " pod="openstack/dnsmasq-dns-5c79d794d7-9jggh" Oct 11 01:11:36 crc kubenswrapper[4743]: I1011 01:11:36.984355 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/225b975c-f87f-4830-bedb-8ecbfb7bd941-config\") pod \"dnsmasq-dns-5c79d794d7-9jggh\" (UID: \"225b975c-f87f-4830-bedb-8ecbfb7bd941\") " pod="openstack/dnsmasq-dns-5c79d794d7-9jggh" Oct 11 01:11:36 crc kubenswrapper[4743]: I1011 01:11:36.984372 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/225b975c-f87f-4830-bedb-8ecbfb7bd941-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-9jggh\" (UID: \"225b975c-f87f-4830-bedb-8ecbfb7bd941\") " pod="openstack/dnsmasq-dns-5c79d794d7-9jggh" Oct 11 01:11:36 crc kubenswrapper[4743]: I1011 01:11:36.984488 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-5027-account-create-9tvlm" Oct 11 01:11:36 crc kubenswrapper[4743]: I1011 01:11:36.985036 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/225b975c-f87f-4830-bedb-8ecbfb7bd941-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-9jggh\" (UID: \"225b975c-f87f-4830-bedb-8ecbfb7bd941\") " pod="openstack/dnsmasq-dns-5c79d794d7-9jggh" Oct 11 01:11:37 crc kubenswrapper[4743]: I1011 01:11:37.020129 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztzzd\" (UniqueName: \"kubernetes.io/projected/225b975c-f87f-4830-bedb-8ecbfb7bd941-kube-api-access-ztzzd\") pod \"dnsmasq-dns-5c79d794d7-9jggh\" (UID: \"225b975c-f87f-4830-bedb-8ecbfb7bd941\") " pod="openstack/dnsmasq-dns-5c79d794d7-9jggh" Oct 11 01:11:37 crc kubenswrapper[4743]: I1011 01:11:37.086566 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnktr\" (UniqueName: \"kubernetes.io/projected/a2bc9888-ce44-44fc-84b0-747f726ec516-kube-api-access-jnktr\") pod \"a2bc9888-ce44-44fc-84b0-747f726ec516\" (UID: \"a2bc9888-ce44-44fc-84b0-747f726ec516\") " Oct 11 01:11:37 crc kubenswrapper[4743]: I1011 01:11:37.086878 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2zlsq\" (UniqueName: \"kubernetes.io/projected/3147b11a-05a1-4e5b-93b9-14748977e08e-kube-api-access-2zlsq\") pod \"3147b11a-05a1-4e5b-93b9-14748977e08e\" (UID: \"3147b11a-05a1-4e5b-93b9-14748977e08e\") " Oct 11 01:11:37 crc kubenswrapper[4743]: I1011 01:11:37.088967 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-9jggh" Oct 11 01:11:37 crc kubenswrapper[4743]: I1011 01:11:37.089488 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2bc9888-ce44-44fc-84b0-747f726ec516-kube-api-access-jnktr" (OuterVolumeSpecName: "kube-api-access-jnktr") pod "a2bc9888-ce44-44fc-84b0-747f726ec516" (UID: "a2bc9888-ce44-44fc-84b0-747f726ec516"). InnerVolumeSpecName "kube-api-access-jnktr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:11:37 crc kubenswrapper[4743]: I1011 01:11:37.094677 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3147b11a-05a1-4e5b-93b9-14748977e08e-kube-api-access-2zlsq" (OuterVolumeSpecName: "kube-api-access-2zlsq") pod "3147b11a-05a1-4e5b-93b9-14748977e08e" (UID: "3147b11a-05a1-4e5b-93b9-14748977e08e"). InnerVolumeSpecName "kube-api-access-2zlsq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:11:37 crc kubenswrapper[4743]: I1011 01:11:37.188749 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jnktr\" (UniqueName: \"kubernetes.io/projected/a2bc9888-ce44-44fc-84b0-747f726ec516-kube-api-access-jnktr\") on node \"crc\" DevicePath \"\"" Oct 11 01:11:37 crc kubenswrapper[4743]: I1011 01:11:37.188787 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2zlsq\" (UniqueName: \"kubernetes.io/projected/3147b11a-05a1-4e5b-93b9-14748977e08e-kube-api-access-2zlsq\") on node \"crc\" DevicePath \"\"" Oct 11 01:11:37 crc kubenswrapper[4743]: I1011 01:11:37.379761 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-5027-account-create-9tvlm" Oct 11 01:11:37 crc kubenswrapper[4743]: I1011 01:11:37.380381 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-5027-account-create-9tvlm" event={"ID":"3147b11a-05a1-4e5b-93b9-14748977e08e","Type":"ContainerDied","Data":"be0487d2e678d12031ad9cba4f90013cf91b82ba9f5847590c3651d8ac86f1cd"} Oct 11 01:11:37 crc kubenswrapper[4743]: I1011 01:11:37.380415 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be0487d2e678d12031ad9cba4f90013cf91b82ba9f5847590c3651d8ac86f1cd" Oct 11 01:11:37 crc kubenswrapper[4743]: I1011 01:11:37.384365 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"eed36ee9-8239-4139-97f3-0e7b2962f45b","Type":"ContainerStarted","Data":"9afcbeb31a6f146804cd45304886cfe0f3486e3f085325e83c7175c1faa1b419"} Oct 11 01:11:37 crc kubenswrapper[4743]: I1011 01:11:37.384518 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"eed36ee9-8239-4139-97f3-0e7b2962f45b","Type":"ContainerStarted","Data":"aaee5363d06c135e13064c08e7e37db518da947eda8b9e1f02de9042213ce827"} Oct 11 01:11:37 crc kubenswrapper[4743]: I1011 01:11:37.399961 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-rvpwq" Oct 11 01:11:37 crc kubenswrapper[4743]: I1011 01:11:37.402129 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-rvpwq" event={"ID":"a2bc9888-ce44-44fc-84b0-747f726ec516","Type":"ContainerDied","Data":"507af387a2499fee5be8e499e39f9e4633f6a0d434e0890b73f9c1f49d750fbc"} Oct 11 01:11:37 crc kubenswrapper[4743]: I1011 01:11:37.402176 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="507af387a2499fee5be8e499e39f9e4633f6a0d434e0890b73f9c1f49d750fbc" Oct 11 01:11:37 crc kubenswrapper[4743]: I1011 01:11:37.417553 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=16.41753717 podStartE2EDuration="16.41753717s" podCreationTimestamp="2025-10-11 01:11:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 01:11:37.413443836 +0000 UTC m=+1192.066424253" watchObservedRunningTime="2025-10-11 01:11:37.41753717 +0000 UTC m=+1192.070517567" Oct 11 01:11:37 crc kubenswrapper[4743]: I1011 01:11:37.564389 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-9jggh"] Oct 11 01:11:38 crc kubenswrapper[4743]: I1011 01:11:38.415928 4743 generic.go:334] "Generic (PLEG): container finished" podID="225b975c-f87f-4830-bedb-8ecbfb7bd941" containerID="4560273b2cafc18804c5cecab511c9996bfa79ae2d9d8394e05c34af64ba2656" exitCode=0 Oct 11 01:11:38 crc kubenswrapper[4743]: I1011 01:11:38.415995 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-9jggh" event={"ID":"225b975c-f87f-4830-bedb-8ecbfb7bd941","Type":"ContainerDied","Data":"4560273b2cafc18804c5cecab511c9996bfa79ae2d9d8394e05c34af64ba2656"} Oct 11 01:11:38 crc kubenswrapper[4743]: I1011 01:11:38.416244 4743 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/dnsmasq-dns-5c79d794d7-9jggh" event={"ID":"225b975c-f87f-4830-bedb-8ecbfb7bd941","Type":"ContainerStarted","Data":"c86bb645f3cceec10612bfa9f7293f415e92e13cf968a0a10b25b953fbee1baa"} Oct 11 01:11:40 crc kubenswrapper[4743]: I1011 01:11:40.070294 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-nmtm5" Oct 11 01:11:40 crc kubenswrapper[4743]: I1011 01:11:40.077702 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-srz27" Oct 11 01:11:40 crc kubenswrapper[4743]: I1011 01:11:40.089523 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-4wts2" Oct 11 01:11:40 crc kubenswrapper[4743]: I1011 01:11:40.136472 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-chqg4\" (UniqueName: \"kubernetes.io/projected/860e062d-6883-4ea9-8e44-8b2f4e9bae60-kube-api-access-chqg4\") pod \"860e062d-6883-4ea9-8e44-8b2f4e9bae60\" (UID: \"860e062d-6883-4ea9-8e44-8b2f4e9bae60\") " Oct 11 01:11:40 crc kubenswrapper[4743]: I1011 01:11:40.136908 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9rvw\" (UniqueName: \"kubernetes.io/projected/85d3caa5-89e1-4b4b-a4e2-5ec2dcb90002-kube-api-access-j9rvw\") pod \"85d3caa5-89e1-4b4b-a4e2-5ec2dcb90002\" (UID: \"85d3caa5-89e1-4b4b-a4e2-5ec2dcb90002\") " Oct 11 01:11:40 crc kubenswrapper[4743]: I1011 01:11:40.137016 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frzw5\" (UniqueName: \"kubernetes.io/projected/583eac1b-7b00-44ab-8f94-59b016a1d635-kube-api-access-frzw5\") pod \"583eac1b-7b00-44ab-8f94-59b016a1d635\" (UID: \"583eac1b-7b00-44ab-8f94-59b016a1d635\") " Oct 11 01:11:40 crc kubenswrapper[4743]: I1011 01:11:40.155466 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/projected/85d3caa5-89e1-4b4b-a4e2-5ec2dcb90002-kube-api-access-j9rvw" (OuterVolumeSpecName: "kube-api-access-j9rvw") pod "85d3caa5-89e1-4b4b-a4e2-5ec2dcb90002" (UID: "85d3caa5-89e1-4b4b-a4e2-5ec2dcb90002"). InnerVolumeSpecName "kube-api-access-j9rvw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:11:40 crc kubenswrapper[4743]: I1011 01:11:40.161484 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/583eac1b-7b00-44ab-8f94-59b016a1d635-kube-api-access-frzw5" (OuterVolumeSpecName: "kube-api-access-frzw5") pod "583eac1b-7b00-44ab-8f94-59b016a1d635" (UID: "583eac1b-7b00-44ab-8f94-59b016a1d635"). InnerVolumeSpecName "kube-api-access-frzw5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:11:40 crc kubenswrapper[4743]: I1011 01:11:40.164394 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/860e062d-6883-4ea9-8e44-8b2f4e9bae60-kube-api-access-chqg4" (OuterVolumeSpecName: "kube-api-access-chqg4") pod "860e062d-6883-4ea9-8e44-8b2f4e9bae60" (UID: "860e062d-6883-4ea9-8e44-8b2f4e9bae60"). InnerVolumeSpecName "kube-api-access-chqg4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:11:40 crc kubenswrapper[4743]: I1011 01:11:40.252585 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-chqg4\" (UniqueName: \"kubernetes.io/projected/860e062d-6883-4ea9-8e44-8b2f4e9bae60-kube-api-access-chqg4\") on node \"crc\" DevicePath \"\"" Oct 11 01:11:40 crc kubenswrapper[4743]: I1011 01:11:40.252671 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j9rvw\" (UniqueName: \"kubernetes.io/projected/85d3caa5-89e1-4b4b-a4e2-5ec2dcb90002-kube-api-access-j9rvw\") on node \"crc\" DevicePath \"\"" Oct 11 01:11:40 crc kubenswrapper[4743]: I1011 01:11:40.252693 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frzw5\" (UniqueName: \"kubernetes.io/projected/583eac1b-7b00-44ab-8f94-59b016a1d635-kube-api-access-frzw5\") on node \"crc\" DevicePath \"\"" Oct 11 01:11:40 crc kubenswrapper[4743]: I1011 01:11:40.437646 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-9jggh" event={"ID":"225b975c-f87f-4830-bedb-8ecbfb7bd941","Type":"ContainerStarted","Data":"a8bcc499b0f5af70609fbf02db5bc142683a57c1e8969b30c81386b972c24ea7"} Oct 11 01:11:40 crc kubenswrapper[4743]: I1011 01:11:40.437797 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c79d794d7-9jggh" Oct 11 01:11:40 crc kubenswrapper[4743]: I1011 01:11:40.440961 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-nmtm5" event={"ID":"85d3caa5-89e1-4b4b-a4e2-5ec2dcb90002","Type":"ContainerDied","Data":"0a528ef68a858f321e9ea6ef54e525da7741312114fdc21eeef68e9d192935b5"} Oct 11 01:11:40 crc kubenswrapper[4743]: I1011 01:11:40.441012 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a528ef68a858f321e9ea6ef54e525da7741312114fdc21eeef68e9d192935b5" Oct 11 01:11:40 crc kubenswrapper[4743]: I1011 01:11:40.440982 4743 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-nmtm5" Oct 11 01:11:40 crc kubenswrapper[4743]: I1011 01:11:40.449700 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-zdrsb" event={"ID":"1f546b6b-8484-4ad7-879f-593cf31efaaa","Type":"ContainerStarted","Data":"c4640f891f229b11967e57e15d36e53eec7cf21df3cc39f4f37f338720a00c5e"} Oct 11 01:11:40 crc kubenswrapper[4743]: I1011 01:11:40.451226 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-4wts2" event={"ID":"860e062d-6883-4ea9-8e44-8b2f4e9bae60","Type":"ContainerDied","Data":"cce7ba1556d93ffc1c9c1d3cd3274118e3c425e6dade6ed7a56a6db89b174f82"} Oct 11 01:11:40 crc kubenswrapper[4743]: I1011 01:11:40.451254 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-4wts2" Oct 11 01:11:40 crc kubenswrapper[4743]: I1011 01:11:40.451253 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cce7ba1556d93ffc1c9c1d3cd3274118e3c425e6dade6ed7a56a6db89b174f82" Oct 11 01:11:40 crc kubenswrapper[4743]: I1011 01:11:40.452561 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-srz27" event={"ID":"583eac1b-7b00-44ab-8f94-59b016a1d635","Type":"ContainerDied","Data":"e4a48b48b519e94fa6390903ee29966d0b808449d2e2decc278b3264551d77b1"} Oct 11 01:11:40 crc kubenswrapper[4743]: I1011 01:11:40.452595 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e4a48b48b519e94fa6390903ee29966d0b808449d2e2decc278b3264551d77b1" Oct 11 01:11:40 crc kubenswrapper[4743]: I1011 01:11:40.452612 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-srz27" Oct 11 01:11:40 crc kubenswrapper[4743]: I1011 01:11:40.470202 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c79d794d7-9jggh" podStartSLOduration=4.470184548 podStartE2EDuration="4.470184548s" podCreationTimestamp="2025-10-11 01:11:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 01:11:40.464838922 +0000 UTC m=+1195.117819309" watchObservedRunningTime="2025-10-11 01:11:40.470184548 +0000 UTC m=+1195.123164945" Oct 11 01:11:40 crc kubenswrapper[4743]: I1011 01:11:40.492179 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-zdrsb" podStartSLOduration=7.078023572 podStartE2EDuration="12.492159635s" podCreationTimestamp="2025-10-11 01:11:28 +0000 UTC" firstStartedPulling="2025-10-11 01:11:34.714688775 +0000 UTC m=+1189.367669172" lastFinishedPulling="2025-10-11 01:11:40.128824838 +0000 UTC m=+1194.781805235" observedRunningTime="2025-10-11 01:11:40.488332258 +0000 UTC m=+1195.141312655" watchObservedRunningTime="2025-10-11 01:11:40.492159635 +0000 UTC m=+1195.145140042" Oct 11 01:11:40 crc kubenswrapper[4743]: I1011 01:11:40.518878 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-0"] Oct 11 01:11:40 crc kubenswrapper[4743]: E1011 01:11:40.519231 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2bc9888-ce44-44fc-84b0-747f726ec516" containerName="mariadb-database-create" Oct 11 01:11:40 crc kubenswrapper[4743]: I1011 01:11:40.519246 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2bc9888-ce44-44fc-84b0-747f726ec516" containerName="mariadb-database-create" Oct 11 01:11:40 crc kubenswrapper[4743]: E1011 01:11:40.519266 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="583eac1b-7b00-44ab-8f94-59b016a1d635" 
containerName="mariadb-database-create" Oct 11 01:11:40 crc kubenswrapper[4743]: I1011 01:11:40.519276 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="583eac1b-7b00-44ab-8f94-59b016a1d635" containerName="mariadb-database-create" Oct 11 01:11:40 crc kubenswrapper[4743]: E1011 01:11:40.519289 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3147b11a-05a1-4e5b-93b9-14748977e08e" containerName="mariadb-account-create" Oct 11 01:11:40 crc kubenswrapper[4743]: I1011 01:11:40.519295 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="3147b11a-05a1-4e5b-93b9-14748977e08e" containerName="mariadb-account-create" Oct 11 01:11:40 crc kubenswrapper[4743]: E1011 01:11:40.519317 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="860e062d-6883-4ea9-8e44-8b2f4e9bae60" containerName="mariadb-database-create" Oct 11 01:11:40 crc kubenswrapper[4743]: I1011 01:11:40.519322 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="860e062d-6883-4ea9-8e44-8b2f4e9bae60" containerName="mariadb-database-create" Oct 11 01:11:40 crc kubenswrapper[4743]: E1011 01:11:40.519334 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85d3caa5-89e1-4b4b-a4e2-5ec2dcb90002" containerName="mariadb-database-create" Oct 11 01:11:40 crc kubenswrapper[4743]: I1011 01:11:40.519342 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="85d3caa5-89e1-4b4b-a4e2-5ec2dcb90002" containerName="mariadb-database-create" Oct 11 01:11:40 crc kubenswrapper[4743]: I1011 01:11:40.519499 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2bc9888-ce44-44fc-84b0-747f726ec516" containerName="mariadb-database-create" Oct 11 01:11:40 crc kubenswrapper[4743]: I1011 01:11:40.519516 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="583eac1b-7b00-44ab-8f94-59b016a1d635" containerName="mariadb-database-create" Oct 11 01:11:40 crc kubenswrapper[4743]: I1011 01:11:40.519528 4743 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="85d3caa5-89e1-4b4b-a4e2-5ec2dcb90002" containerName="mariadb-database-create" Oct 11 01:11:40 crc kubenswrapper[4743]: I1011 01:11:40.519539 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="860e062d-6883-4ea9-8e44-8b2f4e9bae60" containerName="mariadb-database-create" Oct 11 01:11:40 crc kubenswrapper[4743]: I1011 01:11:40.519551 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="3147b11a-05a1-4e5b-93b9-14748977e08e" containerName="mariadb-account-create" Oct 11 01:11:40 crc kubenswrapper[4743]: I1011 01:11:40.520160 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Oct 11 01:11:40 crc kubenswrapper[4743]: I1011 01:11:40.522811 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-config-data" Oct 11 01:11:40 crc kubenswrapper[4743]: I1011 01:11:40.539635 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Oct 11 01:11:40 crc kubenswrapper[4743]: I1011 01:11:40.557256 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4981c3d3-04e6-4e36-8a2c-fa34b65d8621-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"4981c3d3-04e6-4e36-8a2c-fa34b65d8621\") " pod="openstack/mysqld-exporter-0" Oct 11 01:11:40 crc kubenswrapper[4743]: I1011 01:11:40.557350 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4981c3d3-04e6-4e36-8a2c-fa34b65d8621-config-data\") pod \"mysqld-exporter-0\" (UID: \"4981c3d3-04e6-4e36-8a2c-fa34b65d8621\") " pod="openstack/mysqld-exporter-0" Oct 11 01:11:40 crc kubenswrapper[4743]: I1011 01:11:40.557412 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtd5s\" 
(UniqueName: \"kubernetes.io/projected/4981c3d3-04e6-4e36-8a2c-fa34b65d8621-kube-api-access-wtd5s\") pod \"mysqld-exporter-0\" (UID: \"4981c3d3-04e6-4e36-8a2c-fa34b65d8621\") " pod="openstack/mysqld-exporter-0" Oct 11 01:11:40 crc kubenswrapper[4743]: I1011 01:11:40.658542 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4981c3d3-04e6-4e36-8a2c-fa34b65d8621-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"4981c3d3-04e6-4e36-8a2c-fa34b65d8621\") " pod="openstack/mysqld-exporter-0" Oct 11 01:11:40 crc kubenswrapper[4743]: I1011 01:11:40.658618 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4981c3d3-04e6-4e36-8a2c-fa34b65d8621-config-data\") pod \"mysqld-exporter-0\" (UID: \"4981c3d3-04e6-4e36-8a2c-fa34b65d8621\") " pod="openstack/mysqld-exporter-0" Oct 11 01:11:40 crc kubenswrapper[4743]: I1011 01:11:40.658666 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtd5s\" (UniqueName: \"kubernetes.io/projected/4981c3d3-04e6-4e36-8a2c-fa34b65d8621-kube-api-access-wtd5s\") pod \"mysqld-exporter-0\" (UID: \"4981c3d3-04e6-4e36-8a2c-fa34b65d8621\") " pod="openstack/mysqld-exporter-0" Oct 11 01:11:40 crc kubenswrapper[4743]: I1011 01:11:40.662330 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4981c3d3-04e6-4e36-8a2c-fa34b65d8621-config-data\") pod \"mysqld-exporter-0\" (UID: \"4981c3d3-04e6-4e36-8a2c-fa34b65d8621\") " pod="openstack/mysqld-exporter-0" Oct 11 01:11:40 crc kubenswrapper[4743]: I1011 01:11:40.662539 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4981c3d3-04e6-4e36-8a2c-fa34b65d8621-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"4981c3d3-04e6-4e36-8a2c-fa34b65d8621\") 
" pod="openstack/mysqld-exporter-0" Oct 11 01:11:40 crc kubenswrapper[4743]: I1011 01:11:40.675776 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtd5s\" (UniqueName: \"kubernetes.io/projected/4981c3d3-04e6-4e36-8a2c-fa34b65d8621-kube-api-access-wtd5s\") pod \"mysqld-exporter-0\" (UID: \"4981c3d3-04e6-4e36-8a2c-fa34b65d8621\") " pod="openstack/mysqld-exporter-0" Oct 11 01:11:40 crc kubenswrapper[4743]: I1011 01:11:40.834139 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Oct 11 01:11:41 crc kubenswrapper[4743]: I1011 01:11:41.297158 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Oct 11 01:11:41 crc kubenswrapper[4743]: I1011 01:11:41.471354 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"4981c3d3-04e6-4e36-8a2c-fa34b65d8621","Type":"ContainerStarted","Data":"6eebfb975ac36d7a6d0cbcbb162a2e9d833635fcd05ceaba60ac7a13d30de7f5"} Oct 11 01:11:41 crc kubenswrapper[4743]: I1011 01:11:41.579849 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Oct 11 01:11:42 crc kubenswrapper[4743]: I1011 01:11:42.480313 4743 generic.go:334] "Generic (PLEG): container finished" podID="f60a76bb-fc23-4e3a-b7ad-123f65747952" containerID="256ff5de7d671245b9e23bb339c4f711ebe44aff6310320abf53d018d077d3dd" exitCode=0 Oct 11 01:11:42 crc kubenswrapper[4743]: I1011 01:11:42.480372 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-dznrn" event={"ID":"f60a76bb-fc23-4e3a-b7ad-123f65747952","Type":"ContainerDied","Data":"256ff5de7d671245b9e23bb339c4f711ebe44aff6310320abf53d018d077d3dd"} Oct 11 01:11:43 crc kubenswrapper[4743]: I1011 01:11:43.505683 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" 
event={"ID":"4981c3d3-04e6-4e36-8a2c-fa34b65d8621","Type":"ContainerStarted","Data":"56e76dde75b702f858326d4fbaf27c11f7664ed304128b414db4e2e2b05e6fe4"} Oct 11 01:11:43 crc kubenswrapper[4743]: I1011 01:11:43.519028 4743 generic.go:334] "Generic (PLEG): container finished" podID="1f546b6b-8484-4ad7-879f-593cf31efaaa" containerID="c4640f891f229b11967e57e15d36e53eec7cf21df3cc39f4f37f338720a00c5e" exitCode=0 Oct 11 01:11:43 crc kubenswrapper[4743]: I1011 01:11:43.519156 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-zdrsb" event={"ID":"1f546b6b-8484-4ad7-879f-593cf31efaaa","Type":"ContainerDied","Data":"c4640f891f229b11967e57e15d36e53eec7cf21df3cc39f4f37f338720a00c5e"} Oct 11 01:11:43 crc kubenswrapper[4743]: I1011 01:11:43.539238 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-0" podStartSLOduration=2.032512651 podStartE2EDuration="3.539212101s" podCreationTimestamp="2025-10-11 01:11:40 +0000 UTC" firstStartedPulling="2025-10-11 01:11:41.339995613 +0000 UTC m=+1195.992976010" lastFinishedPulling="2025-10-11 01:11:42.846695063 +0000 UTC m=+1197.499675460" observedRunningTime="2025-10-11 01:11:43.531654269 +0000 UTC m=+1198.184634696" watchObservedRunningTime="2025-10-11 01:11:43.539212101 +0000 UTC m=+1198.192192518" Oct 11 01:11:44 crc kubenswrapper[4743]: I1011 01:11:44.289061 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-dznrn" Oct 11 01:11:44 crc kubenswrapper[4743]: I1011 01:11:44.332389 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ng6gj\" (UniqueName: \"kubernetes.io/projected/f60a76bb-fc23-4e3a-b7ad-123f65747952-kube-api-access-ng6gj\") pod \"f60a76bb-fc23-4e3a-b7ad-123f65747952\" (UID: \"f60a76bb-fc23-4e3a-b7ad-123f65747952\") " Oct 11 01:11:44 crc kubenswrapper[4743]: I1011 01:11:44.332617 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f60a76bb-fc23-4e3a-b7ad-123f65747952-config-data\") pod \"f60a76bb-fc23-4e3a-b7ad-123f65747952\" (UID: \"f60a76bb-fc23-4e3a-b7ad-123f65747952\") " Oct 11 01:11:44 crc kubenswrapper[4743]: I1011 01:11:44.332660 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f60a76bb-fc23-4e3a-b7ad-123f65747952-db-sync-config-data\") pod \"f60a76bb-fc23-4e3a-b7ad-123f65747952\" (UID: \"f60a76bb-fc23-4e3a-b7ad-123f65747952\") " Oct 11 01:11:44 crc kubenswrapper[4743]: I1011 01:11:44.332699 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f60a76bb-fc23-4e3a-b7ad-123f65747952-combined-ca-bundle\") pod \"f60a76bb-fc23-4e3a-b7ad-123f65747952\" (UID: \"f60a76bb-fc23-4e3a-b7ad-123f65747952\") " Oct 11 01:11:44 crc kubenswrapper[4743]: I1011 01:11:44.338851 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f60a76bb-fc23-4e3a-b7ad-123f65747952-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "f60a76bb-fc23-4e3a-b7ad-123f65747952" (UID: "f60a76bb-fc23-4e3a-b7ad-123f65747952"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:11:44 crc kubenswrapper[4743]: I1011 01:11:44.342319 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f60a76bb-fc23-4e3a-b7ad-123f65747952-kube-api-access-ng6gj" (OuterVolumeSpecName: "kube-api-access-ng6gj") pod "f60a76bb-fc23-4e3a-b7ad-123f65747952" (UID: "f60a76bb-fc23-4e3a-b7ad-123f65747952"). InnerVolumeSpecName "kube-api-access-ng6gj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:11:44 crc kubenswrapper[4743]: I1011 01:11:44.382313 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f60a76bb-fc23-4e3a-b7ad-123f65747952-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f60a76bb-fc23-4e3a-b7ad-123f65747952" (UID: "f60a76bb-fc23-4e3a-b7ad-123f65747952"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:11:44 crc kubenswrapper[4743]: I1011 01:11:44.407215 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f60a76bb-fc23-4e3a-b7ad-123f65747952-config-data" (OuterVolumeSpecName: "config-data") pod "f60a76bb-fc23-4e3a-b7ad-123f65747952" (UID: "f60a76bb-fc23-4e3a-b7ad-123f65747952"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:11:44 crc kubenswrapper[4743]: I1011 01:11:44.434901 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f60a76bb-fc23-4e3a-b7ad-123f65747952-config-data\") on node \"crc\" DevicePath \"\"" Oct 11 01:11:44 crc kubenswrapper[4743]: I1011 01:11:44.434941 4743 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f60a76bb-fc23-4e3a-b7ad-123f65747952-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 11 01:11:44 crc kubenswrapper[4743]: I1011 01:11:44.434955 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f60a76bb-fc23-4e3a-b7ad-123f65747952-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 01:11:44 crc kubenswrapper[4743]: I1011 01:11:44.434967 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ng6gj\" (UniqueName: \"kubernetes.io/projected/f60a76bb-fc23-4e3a-b7ad-123f65747952-kube-api-access-ng6gj\") on node \"crc\" DevicePath \"\"" Oct 11 01:11:44 crc kubenswrapper[4743]: I1011 01:11:44.530171 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-dznrn" event={"ID":"f60a76bb-fc23-4e3a-b7ad-123f65747952","Type":"ContainerDied","Data":"c4947028128045b422e02d3761752b8103fd1284c0452aaee87ecf7ca8fcc6e2"} Oct 11 01:11:44 crc kubenswrapper[4743]: I1011 01:11:44.530222 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4947028128045b422e02d3761752b8103fd1284c0452aaee87ecf7ca8fcc6e2" Oct 11 01:11:44 crc kubenswrapper[4743]: I1011 01:11:44.531763 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-dznrn" Oct 11 01:11:44 crc kubenswrapper[4743]: I1011 01:11:44.892821 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-9jggh"] Oct 11 01:11:44 crc kubenswrapper[4743]: I1011 01:11:44.894485 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c79d794d7-9jggh" podUID="225b975c-f87f-4830-bedb-8ecbfb7bd941" containerName="dnsmasq-dns" containerID="cri-o://a8bcc499b0f5af70609fbf02db5bc142683a57c1e8969b30c81386b972c24ea7" gracePeriod=10 Oct 11 01:11:44 crc kubenswrapper[4743]: I1011 01:11:44.911393 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-xlkcn"] Oct 11 01:11:44 crc kubenswrapper[4743]: E1011 01:11:44.911744 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f60a76bb-fc23-4e3a-b7ad-123f65747952" containerName="glance-db-sync" Oct 11 01:11:44 crc kubenswrapper[4743]: I1011 01:11:44.911759 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f60a76bb-fc23-4e3a-b7ad-123f65747952" containerName="glance-db-sync" Oct 11 01:11:44 crc kubenswrapper[4743]: I1011 01:11:44.911936 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="f60a76bb-fc23-4e3a-b7ad-123f65747952" containerName="glance-db-sync" Oct 11 01:11:44 crc kubenswrapper[4743]: I1011 01:11:44.912916 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-xlkcn" Oct 11 01:11:44 crc kubenswrapper[4743]: I1011 01:11:44.925522 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-zdrsb" Oct 11 01:11:44 crc kubenswrapper[4743]: I1011 01:11:44.944251 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f546b6b-8484-4ad7-879f-593cf31efaaa-combined-ca-bundle\") pod \"1f546b6b-8484-4ad7-879f-593cf31efaaa\" (UID: \"1f546b6b-8484-4ad7-879f-593cf31efaaa\") " Oct 11 01:11:44 crc kubenswrapper[4743]: I1011 01:11:44.944334 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jn2r7\" (UniqueName: \"kubernetes.io/projected/1f546b6b-8484-4ad7-879f-593cf31efaaa-kube-api-access-jn2r7\") pod \"1f546b6b-8484-4ad7-879f-593cf31efaaa\" (UID: \"1f546b6b-8484-4ad7-879f-593cf31efaaa\") " Oct 11 01:11:44 crc kubenswrapper[4743]: I1011 01:11:44.944435 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f546b6b-8484-4ad7-879f-593cf31efaaa-config-data\") pod \"1f546b6b-8484-4ad7-879f-593cf31efaaa\" (UID: \"1f546b6b-8484-4ad7-879f-593cf31efaaa\") " Oct 11 01:11:44 crc kubenswrapper[4743]: I1011 01:11:44.944695 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-xlkcn"] Oct 11 01:11:44 crc kubenswrapper[4743]: I1011 01:11:44.944712 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09c4deb5-c6d5-4eee-8de5-43299f3d6c62-config\") pod \"dnsmasq-dns-5f59b8f679-xlkcn\" (UID: \"09c4deb5-c6d5-4eee-8de5-43299f3d6c62\") " pod="openstack/dnsmasq-dns-5f59b8f679-xlkcn" Oct 11 01:11:44 crc kubenswrapper[4743]: I1011 01:11:44.944959 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/09c4deb5-c6d5-4eee-8de5-43299f3d6c62-ovsdbserver-nb\") pod 
\"dnsmasq-dns-5f59b8f679-xlkcn\" (UID: \"09c4deb5-c6d5-4eee-8de5-43299f3d6c62\") " pod="openstack/dnsmasq-dns-5f59b8f679-xlkcn" Oct 11 01:11:44 crc kubenswrapper[4743]: I1011 01:11:44.945046 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/09c4deb5-c6d5-4eee-8de5-43299f3d6c62-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-xlkcn\" (UID: \"09c4deb5-c6d5-4eee-8de5-43299f3d6c62\") " pod="openstack/dnsmasq-dns-5f59b8f679-xlkcn" Oct 11 01:11:44 crc kubenswrapper[4743]: I1011 01:11:44.945125 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/09c4deb5-c6d5-4eee-8de5-43299f3d6c62-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-xlkcn\" (UID: \"09c4deb5-c6d5-4eee-8de5-43299f3d6c62\") " pod="openstack/dnsmasq-dns-5f59b8f679-xlkcn" Oct 11 01:11:44 crc kubenswrapper[4743]: I1011 01:11:44.945146 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48sns\" (UniqueName: \"kubernetes.io/projected/09c4deb5-c6d5-4eee-8de5-43299f3d6c62-kube-api-access-48sns\") pod \"dnsmasq-dns-5f59b8f679-xlkcn\" (UID: \"09c4deb5-c6d5-4eee-8de5-43299f3d6c62\") " pod="openstack/dnsmasq-dns-5f59b8f679-xlkcn" Oct 11 01:11:44 crc kubenswrapper[4743]: I1011 01:11:44.945312 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/09c4deb5-c6d5-4eee-8de5-43299f3d6c62-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-xlkcn\" (UID: \"09c4deb5-c6d5-4eee-8de5-43299f3d6c62\") " pod="openstack/dnsmasq-dns-5f59b8f679-xlkcn" Oct 11 01:11:44 crc kubenswrapper[4743]: I1011 01:11:44.952043 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f546b6b-8484-4ad7-879f-593cf31efaaa-kube-api-access-jn2r7" 
(OuterVolumeSpecName: "kube-api-access-jn2r7") pod "1f546b6b-8484-4ad7-879f-593cf31efaaa" (UID: "1f546b6b-8484-4ad7-879f-593cf31efaaa"). InnerVolumeSpecName "kube-api-access-jn2r7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:11:44 crc kubenswrapper[4743]: I1011 01:11:44.982566 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f546b6b-8484-4ad7-879f-593cf31efaaa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1f546b6b-8484-4ad7-879f-593cf31efaaa" (UID: "1f546b6b-8484-4ad7-879f-593cf31efaaa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.006364 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f546b6b-8484-4ad7-879f-593cf31efaaa-config-data" (OuterVolumeSpecName: "config-data") pod "1f546b6b-8484-4ad7-879f-593cf31efaaa" (UID: "1f546b6b-8484-4ad7-879f-593cf31efaaa"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.047099 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/09c4deb5-c6d5-4eee-8de5-43299f3d6c62-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-xlkcn\" (UID: \"09c4deb5-c6d5-4eee-8de5-43299f3d6c62\") " pod="openstack/dnsmasq-dns-5f59b8f679-xlkcn" Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.047163 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/09c4deb5-c6d5-4eee-8de5-43299f3d6c62-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-xlkcn\" (UID: \"09c4deb5-c6d5-4eee-8de5-43299f3d6c62\") " pod="openstack/dnsmasq-dns-5f59b8f679-xlkcn" Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.047201 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/09c4deb5-c6d5-4eee-8de5-43299f3d6c62-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-xlkcn\" (UID: \"09c4deb5-c6d5-4eee-8de5-43299f3d6c62\") " pod="openstack/dnsmasq-dns-5f59b8f679-xlkcn" Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.047220 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48sns\" (UniqueName: \"kubernetes.io/projected/09c4deb5-c6d5-4eee-8de5-43299f3d6c62-kube-api-access-48sns\") pod \"dnsmasq-dns-5f59b8f679-xlkcn\" (UID: \"09c4deb5-c6d5-4eee-8de5-43299f3d6c62\") " pod="openstack/dnsmasq-dns-5f59b8f679-xlkcn" Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.047278 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/09c4deb5-c6d5-4eee-8de5-43299f3d6c62-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-xlkcn\" (UID: \"09c4deb5-c6d5-4eee-8de5-43299f3d6c62\") " 
pod="openstack/dnsmasq-dns-5f59b8f679-xlkcn" Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.047315 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09c4deb5-c6d5-4eee-8de5-43299f3d6c62-config\") pod \"dnsmasq-dns-5f59b8f679-xlkcn\" (UID: \"09c4deb5-c6d5-4eee-8de5-43299f3d6c62\") " pod="openstack/dnsmasq-dns-5f59b8f679-xlkcn" Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.047379 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f546b6b-8484-4ad7-879f-593cf31efaaa-config-data\") on node \"crc\" DevicePath \"\"" Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.047391 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f546b6b-8484-4ad7-879f-593cf31efaaa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.047399 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jn2r7\" (UniqueName: \"kubernetes.io/projected/1f546b6b-8484-4ad7-879f-593cf31efaaa-kube-api-access-jn2r7\") on node \"crc\" DevicePath \"\"" Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.048547 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09c4deb5-c6d5-4eee-8de5-43299f3d6c62-config\") pod \"dnsmasq-dns-5f59b8f679-xlkcn\" (UID: \"09c4deb5-c6d5-4eee-8de5-43299f3d6c62\") " pod="openstack/dnsmasq-dns-5f59b8f679-xlkcn" Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.049296 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/09c4deb5-c6d5-4eee-8de5-43299f3d6c62-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-xlkcn\" (UID: \"09c4deb5-c6d5-4eee-8de5-43299f3d6c62\") " pod="openstack/dnsmasq-dns-5f59b8f679-xlkcn" Oct 11 01:11:45 crc 
kubenswrapper[4743]: I1011 01:11:45.049916 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/09c4deb5-c6d5-4eee-8de5-43299f3d6c62-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-xlkcn\" (UID: \"09c4deb5-c6d5-4eee-8de5-43299f3d6c62\") " pod="openstack/dnsmasq-dns-5f59b8f679-xlkcn" Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.050671 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/09c4deb5-c6d5-4eee-8de5-43299f3d6c62-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-xlkcn\" (UID: \"09c4deb5-c6d5-4eee-8de5-43299f3d6c62\") " pod="openstack/dnsmasq-dns-5f59b8f679-xlkcn" Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.050694 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/09c4deb5-c6d5-4eee-8de5-43299f3d6c62-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-xlkcn\" (UID: \"09c4deb5-c6d5-4eee-8de5-43299f3d6c62\") " pod="openstack/dnsmasq-dns-5f59b8f679-xlkcn" Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.070709 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48sns\" (UniqueName: \"kubernetes.io/projected/09c4deb5-c6d5-4eee-8de5-43299f3d6c62-kube-api-access-48sns\") pod \"dnsmasq-dns-5f59b8f679-xlkcn\" (UID: \"09c4deb5-c6d5-4eee-8de5-43299f3d6c62\") " pod="openstack/dnsmasq-dns-5f59b8f679-xlkcn" Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.246116 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-xlkcn" Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.411920 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-9jggh" Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.455276 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/225b975c-f87f-4830-bedb-8ecbfb7bd941-dns-swift-storage-0\") pod \"225b975c-f87f-4830-bedb-8ecbfb7bd941\" (UID: \"225b975c-f87f-4830-bedb-8ecbfb7bd941\") " Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.455599 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/225b975c-f87f-4830-bedb-8ecbfb7bd941-dns-svc\") pod \"225b975c-f87f-4830-bedb-8ecbfb7bd941\" (UID: \"225b975c-f87f-4830-bedb-8ecbfb7bd941\") " Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.455680 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/225b975c-f87f-4830-bedb-8ecbfb7bd941-ovsdbserver-sb\") pod \"225b975c-f87f-4830-bedb-8ecbfb7bd941\" (UID: \"225b975c-f87f-4830-bedb-8ecbfb7bd941\") " Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.455698 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/225b975c-f87f-4830-bedb-8ecbfb7bd941-ovsdbserver-nb\") pod \"225b975c-f87f-4830-bedb-8ecbfb7bd941\" (UID: \"225b975c-f87f-4830-bedb-8ecbfb7bd941\") " Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.455722 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztzzd\" (UniqueName: \"kubernetes.io/projected/225b975c-f87f-4830-bedb-8ecbfb7bd941-kube-api-access-ztzzd\") pod \"225b975c-f87f-4830-bedb-8ecbfb7bd941\" (UID: \"225b975c-f87f-4830-bedb-8ecbfb7bd941\") " Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.455807 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/225b975c-f87f-4830-bedb-8ecbfb7bd941-config\") pod \"225b975c-f87f-4830-bedb-8ecbfb7bd941\" (UID: \"225b975c-f87f-4830-bedb-8ecbfb7bd941\") " Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.465765 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/225b975c-f87f-4830-bedb-8ecbfb7bd941-kube-api-access-ztzzd" (OuterVolumeSpecName: "kube-api-access-ztzzd") pod "225b975c-f87f-4830-bedb-8ecbfb7bd941" (UID: "225b975c-f87f-4830-bedb-8ecbfb7bd941"). InnerVolumeSpecName "kube-api-access-ztzzd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.530567 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/225b975c-f87f-4830-bedb-8ecbfb7bd941-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "225b975c-f87f-4830-bedb-8ecbfb7bd941" (UID: "225b975c-f87f-4830-bedb-8ecbfb7bd941"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.533301 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/225b975c-f87f-4830-bedb-8ecbfb7bd941-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "225b975c-f87f-4830-bedb-8ecbfb7bd941" (UID: "225b975c-f87f-4830-bedb-8ecbfb7bd941"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.540624 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-zdrsb" Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.540627 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-zdrsb" event={"ID":"1f546b6b-8484-4ad7-879f-593cf31efaaa","Type":"ContainerDied","Data":"22c90d4b209239ba748eea890f2b409b14126efd48cc5b4410ef63bddd623662"} Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.540731 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="22c90d4b209239ba748eea890f2b409b14126efd48cc5b4410ef63bddd623662" Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.542529 4743 generic.go:334] "Generic (PLEG): container finished" podID="225b975c-f87f-4830-bedb-8ecbfb7bd941" containerID="a8bcc499b0f5af70609fbf02db5bc142683a57c1e8969b30c81386b972c24ea7" exitCode=0 Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.542559 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-9jggh" event={"ID":"225b975c-f87f-4830-bedb-8ecbfb7bd941","Type":"ContainerDied","Data":"a8bcc499b0f5af70609fbf02db5bc142683a57c1e8969b30c81386b972c24ea7"} Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.542580 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-9jggh" event={"ID":"225b975c-f87f-4830-bedb-8ecbfb7bd941","Type":"ContainerDied","Data":"c86bb645f3cceec10612bfa9f7293f415e92e13cf968a0a10b25b953fbee1baa"} Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.542596 4743 scope.go:117] "RemoveContainer" containerID="a8bcc499b0f5af70609fbf02db5bc142683a57c1e8969b30c81386b972c24ea7" Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.542715 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-9jggh" Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.546604 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/225b975c-f87f-4830-bedb-8ecbfb7bd941-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "225b975c-f87f-4830-bedb-8ecbfb7bd941" (UID: "225b975c-f87f-4830-bedb-8ecbfb7bd941"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.547681 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/225b975c-f87f-4830-bedb-8ecbfb7bd941-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "225b975c-f87f-4830-bedb-8ecbfb7bd941" (UID: "225b975c-f87f-4830-bedb-8ecbfb7bd941"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.557800 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/225b975c-f87f-4830-bedb-8ecbfb7bd941-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.557821 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/225b975c-f87f-4830-bedb-8ecbfb7bd941-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.557833 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/225b975c-f87f-4830-bedb-8ecbfb7bd941-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.557842 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ztzzd\" (UniqueName: \"kubernetes.io/projected/225b975c-f87f-4830-bedb-8ecbfb7bd941-kube-api-access-ztzzd\") on 
node \"crc\" DevicePath \"\"" Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.557850 4743 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/225b975c-f87f-4830-bedb-8ecbfb7bd941-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.566569 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/225b975c-f87f-4830-bedb-8ecbfb7bd941-config" (OuterVolumeSpecName: "config") pod "225b975c-f87f-4830-bedb-8ecbfb7bd941" (UID: "225b975c-f87f-4830-bedb-8ecbfb7bd941"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.567338 4743 scope.go:117] "RemoveContainer" containerID="4560273b2cafc18804c5cecab511c9996bfa79ae2d9d8394e05c34af64ba2656" Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.603933 4743 scope.go:117] "RemoveContainer" containerID="a8bcc499b0f5af70609fbf02db5bc142683a57c1e8969b30c81386b972c24ea7" Oct 11 01:11:45 crc kubenswrapper[4743]: E1011 01:11:45.604673 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8bcc499b0f5af70609fbf02db5bc142683a57c1e8969b30c81386b972c24ea7\": container with ID starting with a8bcc499b0f5af70609fbf02db5bc142683a57c1e8969b30c81386b972c24ea7 not found: ID does not exist" containerID="a8bcc499b0f5af70609fbf02db5bc142683a57c1e8969b30c81386b972c24ea7" Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.604705 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8bcc499b0f5af70609fbf02db5bc142683a57c1e8969b30c81386b972c24ea7"} err="failed to get container status \"a8bcc499b0f5af70609fbf02db5bc142683a57c1e8969b30c81386b972c24ea7\": rpc error: code = NotFound desc = could not find container 
\"a8bcc499b0f5af70609fbf02db5bc142683a57c1e8969b30c81386b972c24ea7\": container with ID starting with a8bcc499b0f5af70609fbf02db5bc142683a57c1e8969b30c81386b972c24ea7 not found: ID does not exist" Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.604727 4743 scope.go:117] "RemoveContainer" containerID="4560273b2cafc18804c5cecab511c9996bfa79ae2d9d8394e05c34af64ba2656" Oct 11 01:11:45 crc kubenswrapper[4743]: E1011 01:11:45.605143 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4560273b2cafc18804c5cecab511c9996bfa79ae2d9d8394e05c34af64ba2656\": container with ID starting with 4560273b2cafc18804c5cecab511c9996bfa79ae2d9d8394e05c34af64ba2656 not found: ID does not exist" containerID="4560273b2cafc18804c5cecab511c9996bfa79ae2d9d8394e05c34af64ba2656" Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.605203 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4560273b2cafc18804c5cecab511c9996bfa79ae2d9d8394e05c34af64ba2656"} err="failed to get container status \"4560273b2cafc18804c5cecab511c9996bfa79ae2d9d8394e05c34af64ba2656\": rpc error: code = NotFound desc = could not find container \"4560273b2cafc18804c5cecab511c9996bfa79ae2d9d8394e05c34af64ba2656\": container with ID starting with 4560273b2cafc18804c5cecab511c9996bfa79ae2d9d8394e05c34af64ba2656 not found: ID does not exist" Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.659613 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/225b975c-f87f-4830-bedb-8ecbfb7bd941-config\") on node \"crc\" DevicePath \"\"" Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.702914 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-xlkcn"] Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.735994 4743 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-bbf5cc879-jvxjh"] Oct 11 01:11:45 crc kubenswrapper[4743]: E1011 01:11:45.736585 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f546b6b-8484-4ad7-879f-593cf31efaaa" containerName="keystone-db-sync" Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.736605 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f546b6b-8484-4ad7-879f-593cf31efaaa" containerName="keystone-db-sync" Oct 11 01:11:45 crc kubenswrapper[4743]: E1011 01:11:45.736622 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="225b975c-f87f-4830-bedb-8ecbfb7bd941" containerName="dnsmasq-dns" Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.736629 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="225b975c-f87f-4830-bedb-8ecbfb7bd941" containerName="dnsmasq-dns" Oct 11 01:11:45 crc kubenswrapper[4743]: E1011 01:11:45.736648 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="225b975c-f87f-4830-bedb-8ecbfb7bd941" containerName="init" Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.736655 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="225b975c-f87f-4830-bedb-8ecbfb7bd941" containerName="init" Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.736842 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f546b6b-8484-4ad7-879f-593cf31efaaa" containerName="keystone-db-sync" Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.736886 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="225b975c-f87f-4830-bedb-8ecbfb7bd941" containerName="dnsmasq-dns" Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.796482 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-jvxjh"] Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.801015 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-jvxjh" Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.814000 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-xqz2q"] Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.822284 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xqz2q" Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.826704 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.827176 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.827283 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.827419 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-4qpmq" Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.829120 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-xqz2q"] Oct 11 01:11:45 crc kubenswrapper[4743]: W1011 01:11:45.841376 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod09c4deb5_c6d5_4eee_8de5_43299f3d6c62.slice/crio-11a2ee15c45b331c331980f8cdaa4c89a95b85c773a9570e0178360a8e047489 WatchSource:0}: Error finding container 11a2ee15c45b331c331980f8cdaa4c89a95b85c773a9570e0178360a8e047489: Status 404 returned error can't find the container with id 11a2ee15c45b331c331980f8cdaa4c89a95b85c773a9570e0178360a8e047489 Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.849676 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-xlkcn"] Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.875547 4743 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/130eba3b-8f13-481a-8f27-60889a94bbe8-config\") pod \"dnsmasq-dns-bbf5cc879-jvxjh\" (UID: \"130eba3b-8f13-481a-8f27-60889a94bbe8\") " pod="openstack/dnsmasq-dns-bbf5cc879-jvxjh" Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.875583 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/130eba3b-8f13-481a-8f27-60889a94bbe8-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-jvxjh\" (UID: \"130eba3b-8f13-481a-8f27-60889a94bbe8\") " pod="openstack/dnsmasq-dns-bbf5cc879-jvxjh" Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.875624 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdxq2\" (UniqueName: \"kubernetes.io/projected/130eba3b-8f13-481a-8f27-60889a94bbe8-kube-api-access-hdxq2\") pod \"dnsmasq-dns-bbf5cc879-jvxjh\" (UID: \"130eba3b-8f13-481a-8f27-60889a94bbe8\") " pod="openstack/dnsmasq-dns-bbf5cc879-jvxjh" Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.875643 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3333bf90-e88b-455c-9719-60c0c49b83fe-scripts\") pod \"keystone-bootstrap-xqz2q\" (UID: \"3333bf90-e88b-455c-9719-60c0c49b83fe\") " pod="openstack/keystone-bootstrap-xqz2q" Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.875669 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3333bf90-e88b-455c-9719-60c0c49b83fe-config-data\") pod \"keystone-bootstrap-xqz2q\" (UID: \"3333bf90-e88b-455c-9719-60c0c49b83fe\") " pod="openstack/keystone-bootstrap-xqz2q" Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.875691 4743 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kj24b\" (UniqueName: \"kubernetes.io/projected/3333bf90-e88b-455c-9719-60c0c49b83fe-kube-api-access-kj24b\") pod \"keystone-bootstrap-xqz2q\" (UID: \"3333bf90-e88b-455c-9719-60c0c49b83fe\") " pod="openstack/keystone-bootstrap-xqz2q" Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.875723 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3333bf90-e88b-455c-9719-60c0c49b83fe-fernet-keys\") pod \"keystone-bootstrap-xqz2q\" (UID: \"3333bf90-e88b-455c-9719-60c0c49b83fe\") " pod="openstack/keystone-bootstrap-xqz2q" Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.875773 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/130eba3b-8f13-481a-8f27-60889a94bbe8-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-jvxjh\" (UID: \"130eba3b-8f13-481a-8f27-60889a94bbe8\") " pod="openstack/dnsmasq-dns-bbf5cc879-jvxjh" Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.875804 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/130eba3b-8f13-481a-8f27-60889a94bbe8-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-jvxjh\" (UID: \"130eba3b-8f13-481a-8f27-60889a94bbe8\") " pod="openstack/dnsmasq-dns-bbf5cc879-jvxjh" Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.875822 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3333bf90-e88b-455c-9719-60c0c49b83fe-combined-ca-bundle\") pod \"keystone-bootstrap-xqz2q\" (UID: \"3333bf90-e88b-455c-9719-60c0c49b83fe\") " pod="openstack/keystone-bootstrap-xqz2q" Oct 11 01:11:45 crc kubenswrapper[4743]: 
I1011 01:11:45.875841 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3333bf90-e88b-455c-9719-60c0c49b83fe-credential-keys\") pod \"keystone-bootstrap-xqz2q\" (UID: \"3333bf90-e88b-455c-9719-60c0c49b83fe\") " pod="openstack/keystone-bootstrap-xqz2q" Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.875870 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/130eba3b-8f13-481a-8f27-60889a94bbe8-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-jvxjh\" (UID: \"130eba3b-8f13-481a-8f27-60889a94bbe8\") " pod="openstack/dnsmasq-dns-bbf5cc879-jvxjh" Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.885664 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-9jggh"] Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.893712 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-9jggh"] Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.977307 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/130eba3b-8f13-481a-8f27-60889a94bbe8-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-jvxjh\" (UID: \"130eba3b-8f13-481a-8f27-60889a94bbe8\") " pod="openstack/dnsmasq-dns-bbf5cc879-jvxjh" Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.977347 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3333bf90-e88b-455c-9719-60c0c49b83fe-fernet-keys\") pod \"keystone-bootstrap-xqz2q\" (UID: \"3333bf90-e88b-455c-9719-60c0c49b83fe\") " pod="openstack/keystone-bootstrap-xqz2q" Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.977379 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/130eba3b-8f13-481a-8f27-60889a94bbe8-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-jvxjh\" (UID: \"130eba3b-8f13-481a-8f27-60889a94bbe8\") " pod="openstack/dnsmasq-dns-bbf5cc879-jvxjh" Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.977397 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3333bf90-e88b-455c-9719-60c0c49b83fe-combined-ca-bundle\") pod \"keystone-bootstrap-xqz2q\" (UID: \"3333bf90-e88b-455c-9719-60c0c49b83fe\") " pod="openstack/keystone-bootstrap-xqz2q" Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.977416 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3333bf90-e88b-455c-9719-60c0c49b83fe-credential-keys\") pod \"keystone-bootstrap-xqz2q\" (UID: \"3333bf90-e88b-455c-9719-60c0c49b83fe\") " pod="openstack/keystone-bootstrap-xqz2q" Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.977433 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/130eba3b-8f13-481a-8f27-60889a94bbe8-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-jvxjh\" (UID: \"130eba3b-8f13-481a-8f27-60889a94bbe8\") " pod="openstack/dnsmasq-dns-bbf5cc879-jvxjh" Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.977507 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/130eba3b-8f13-481a-8f27-60889a94bbe8-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-jvxjh\" (UID: \"130eba3b-8f13-481a-8f27-60889a94bbe8\") " pod="openstack/dnsmasq-dns-bbf5cc879-jvxjh" Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.977524 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/130eba3b-8f13-481a-8f27-60889a94bbe8-config\") pod \"dnsmasq-dns-bbf5cc879-jvxjh\" (UID: \"130eba3b-8f13-481a-8f27-60889a94bbe8\") " pod="openstack/dnsmasq-dns-bbf5cc879-jvxjh" Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.977560 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdxq2\" (UniqueName: \"kubernetes.io/projected/130eba3b-8f13-481a-8f27-60889a94bbe8-kube-api-access-hdxq2\") pod \"dnsmasq-dns-bbf5cc879-jvxjh\" (UID: \"130eba3b-8f13-481a-8f27-60889a94bbe8\") " pod="openstack/dnsmasq-dns-bbf5cc879-jvxjh" Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.977576 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3333bf90-e88b-455c-9719-60c0c49b83fe-scripts\") pod \"keystone-bootstrap-xqz2q\" (UID: \"3333bf90-e88b-455c-9719-60c0c49b83fe\") " pod="openstack/keystone-bootstrap-xqz2q" Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.977604 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3333bf90-e88b-455c-9719-60c0c49b83fe-config-data\") pod \"keystone-bootstrap-xqz2q\" (UID: \"3333bf90-e88b-455c-9719-60c0c49b83fe\") " pod="openstack/keystone-bootstrap-xqz2q" Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.977632 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kj24b\" (UniqueName: \"kubernetes.io/projected/3333bf90-e88b-455c-9719-60c0c49b83fe-kube-api-access-kj24b\") pod \"keystone-bootstrap-xqz2q\" (UID: \"3333bf90-e88b-455c-9719-60c0c49b83fe\") " pod="openstack/keystone-bootstrap-xqz2q" Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.978660 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/130eba3b-8f13-481a-8f27-60889a94bbe8-ovsdbserver-nb\") pod 
\"dnsmasq-dns-bbf5cc879-jvxjh\" (UID: \"130eba3b-8f13-481a-8f27-60889a94bbe8\") " pod="openstack/dnsmasq-dns-bbf5cc879-jvxjh" Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.979633 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/130eba3b-8f13-481a-8f27-60889a94bbe8-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-jvxjh\" (UID: \"130eba3b-8f13-481a-8f27-60889a94bbe8\") " pod="openstack/dnsmasq-dns-bbf5cc879-jvxjh" Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.980186 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/130eba3b-8f13-481a-8f27-60889a94bbe8-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-jvxjh\" (UID: \"130eba3b-8f13-481a-8f27-60889a94bbe8\") " pod="openstack/dnsmasq-dns-bbf5cc879-jvxjh" Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.980811 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/130eba3b-8f13-481a-8f27-60889a94bbe8-config\") pod \"dnsmasq-dns-bbf5cc879-jvxjh\" (UID: \"130eba3b-8f13-481a-8f27-60889a94bbe8\") " pod="openstack/dnsmasq-dns-bbf5cc879-jvxjh" Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.984374 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3333bf90-e88b-455c-9719-60c0c49b83fe-scripts\") pod \"keystone-bootstrap-xqz2q\" (UID: \"3333bf90-e88b-455c-9719-60c0c49b83fe\") " pod="openstack/keystone-bootstrap-xqz2q" Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.985043 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/130eba3b-8f13-481a-8f27-60889a94bbe8-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-jvxjh\" (UID: \"130eba3b-8f13-481a-8f27-60889a94bbe8\") " pod="openstack/dnsmasq-dns-bbf5cc879-jvxjh" Oct 11 01:11:45 crc 
kubenswrapper[4743]: I1011 01:11:45.989173 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3333bf90-e88b-455c-9719-60c0c49b83fe-config-data\") pod \"keystone-bootstrap-xqz2q\" (UID: \"3333bf90-e88b-455c-9719-60c0c49b83fe\") " pod="openstack/keystone-bootstrap-xqz2q" Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.989255 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3333bf90-e88b-455c-9719-60c0c49b83fe-fernet-keys\") pod \"keystone-bootstrap-xqz2q\" (UID: \"3333bf90-e88b-455c-9719-60c0c49b83fe\") " pod="openstack/keystone-bootstrap-xqz2q" Oct 11 01:11:45 crc kubenswrapper[4743]: I1011 01:11:45.994012 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3333bf90-e88b-455c-9719-60c0c49b83fe-credential-keys\") pod \"keystone-bootstrap-xqz2q\" (UID: \"3333bf90-e88b-455c-9719-60c0c49b83fe\") " pod="openstack/keystone-bootstrap-xqz2q" Oct 11 01:11:46 crc kubenswrapper[4743]: I1011 01:11:46.008599 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdxq2\" (UniqueName: \"kubernetes.io/projected/130eba3b-8f13-481a-8f27-60889a94bbe8-kube-api-access-hdxq2\") pod \"dnsmasq-dns-bbf5cc879-jvxjh\" (UID: \"130eba3b-8f13-481a-8f27-60889a94bbe8\") " pod="openstack/dnsmasq-dns-bbf5cc879-jvxjh" Oct 11 01:11:46 crc kubenswrapper[4743]: I1011 01:11:46.008627 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3333bf90-e88b-455c-9719-60c0c49b83fe-combined-ca-bundle\") pod \"keystone-bootstrap-xqz2q\" (UID: \"3333bf90-e88b-455c-9719-60c0c49b83fe\") " pod="openstack/keystone-bootstrap-xqz2q" Oct 11 01:11:46 crc kubenswrapper[4743]: I1011 01:11:46.009555 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-bbf5cc879-jvxjh"] Oct 11 01:11:46 crc kubenswrapper[4743]: I1011 01:11:46.010245 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-jvxjh" Oct 11 01:11:46 crc kubenswrapper[4743]: I1011 01:11:46.016828 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kj24b\" (UniqueName: \"kubernetes.io/projected/3333bf90-e88b-455c-9719-60c0c49b83fe-kube-api-access-kj24b\") pod \"keystone-bootstrap-xqz2q\" (UID: \"3333bf90-e88b-455c-9719-60c0c49b83fe\") " pod="openstack/keystone-bootstrap-xqz2q" Oct 11 01:11:46 crc kubenswrapper[4743]: I1011 01:11:46.069201 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-m99m4"] Oct 11 01:11:46 crc kubenswrapper[4743]: I1011 01:11:46.072454 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-m99m4" Oct 11 01:11:46 crc kubenswrapper[4743]: I1011 01:11:46.084488 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-m99m4"] Oct 11 01:11:46 crc kubenswrapper[4743]: I1011 01:11:46.151358 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="225b975c-f87f-4830-bedb-8ecbfb7bd941" path="/var/lib/kubelet/pods/225b975c-f87f-4830-bedb-8ecbfb7bd941/volumes" Oct 11 01:11:46 crc kubenswrapper[4743]: I1011 01:11:46.151904 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-z4tv4"] Oct 11 01:11:46 crc kubenswrapper[4743]: I1011 01:11:46.154769 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-z4tv4"] Oct 11 01:11:46 crc kubenswrapper[4743]: I1011 01:11:46.154876 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-z4tv4" Oct 11 01:11:46 crc kubenswrapper[4743]: I1011 01:11:46.169606 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 11 01:11:46 crc kubenswrapper[4743]: I1011 01:11:46.169804 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-4dds5" Oct 11 01:11:46 crc kubenswrapper[4743]: I1011 01:11:46.169933 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 11 01:11:46 crc kubenswrapper[4743]: I1011 01:11:46.174631 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-4qpmq" Oct 11 01:11:46 crc kubenswrapper[4743]: I1011 01:11:46.174749 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xqz2q" Oct 11 01:11:46 crc kubenswrapper[4743]: I1011 01:11:46.179323 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 11 01:11:46 crc kubenswrapper[4743]: I1011 01:11:46.180545 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 11 01:11:46 crc kubenswrapper[4743]: I1011 01:11:46.182341 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 11 01:11:46 crc kubenswrapper[4743]: I1011 01:11:46.186794 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 11 01:11:46 crc kubenswrapper[4743]: I1011 01:11:46.193975 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e9d40a6d-5220-4537-bb4b-d4248101d864-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-m99m4\" (UID: \"e9d40a6d-5220-4537-bb4b-d4248101d864\") " pod="openstack/dnsmasq-dns-56df8fb6b7-m99m4" Oct 11 01:11:46 crc kubenswrapper[4743]: I1011 01:11:46.194036 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e0bb9e3-337b-443b-baa8-8ea69d351ea1-logs\") pod \"placement-db-sync-z4tv4\" (UID: \"7e0bb9e3-337b-443b-baa8-8ea69d351ea1\") " pod="openstack/placement-db-sync-z4tv4" Oct 11 01:11:46 crc kubenswrapper[4743]: I1011 01:11:46.194065 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e0bb9e3-337b-443b-baa8-8ea69d351ea1-combined-ca-bundle\") pod \"placement-db-sync-z4tv4\" (UID: \"7e0bb9e3-337b-443b-baa8-8ea69d351ea1\") " pod="openstack/placement-db-sync-z4tv4" Oct 11 01:11:46 crc kubenswrapper[4743]: I1011 01:11:46.194084 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e0bb9e3-337b-443b-baa8-8ea69d351ea1-config-data\") pod \"placement-db-sync-z4tv4\" (UID: \"7e0bb9e3-337b-443b-baa8-8ea69d351ea1\") " pod="openstack/placement-db-sync-z4tv4" Oct 11 01:11:46 crc 
kubenswrapper[4743]: I1011 01:11:46.194184 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dq9z\" (UniqueName: \"kubernetes.io/projected/7e0bb9e3-337b-443b-baa8-8ea69d351ea1-kube-api-access-5dq9z\") pod \"placement-db-sync-z4tv4\" (UID: \"7e0bb9e3-337b-443b-baa8-8ea69d351ea1\") " pod="openstack/placement-db-sync-z4tv4" Oct 11 01:11:46 crc kubenswrapper[4743]: I1011 01:11:46.194256 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9d40a6d-5220-4537-bb4b-d4248101d864-config\") pod \"dnsmasq-dns-56df8fb6b7-m99m4\" (UID: \"e9d40a6d-5220-4537-bb4b-d4248101d864\") " pod="openstack/dnsmasq-dns-56df8fb6b7-m99m4" Oct 11 01:11:46 crc kubenswrapper[4743]: I1011 01:11:46.194367 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e0bb9e3-337b-443b-baa8-8ea69d351ea1-scripts\") pod \"placement-db-sync-z4tv4\" (UID: \"7e0bb9e3-337b-443b-baa8-8ea69d351ea1\") " pod="openstack/placement-db-sync-z4tv4" Oct 11 01:11:46 crc kubenswrapper[4743]: I1011 01:11:46.194402 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e9d40a6d-5220-4537-bb4b-d4248101d864-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-m99m4\" (UID: \"e9d40a6d-5220-4537-bb4b-d4248101d864\") " pod="openstack/dnsmasq-dns-56df8fb6b7-m99m4" Oct 11 01:11:46 crc kubenswrapper[4743]: I1011 01:11:46.194441 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgpfr\" (UniqueName: \"kubernetes.io/projected/e9d40a6d-5220-4537-bb4b-d4248101d864-kube-api-access-fgpfr\") pod \"dnsmasq-dns-56df8fb6b7-m99m4\" (UID: \"e9d40a6d-5220-4537-bb4b-d4248101d864\") " pod="openstack/dnsmasq-dns-56df8fb6b7-m99m4" Oct 11 01:11:46 
crc kubenswrapper[4743]: I1011 01:11:46.194504 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e9d40a6d-5220-4537-bb4b-d4248101d864-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-m99m4\" (UID: \"e9d40a6d-5220-4537-bb4b-d4248101d864\") " pod="openstack/dnsmasq-dns-56df8fb6b7-m99m4" Oct 11 01:11:46 crc kubenswrapper[4743]: I1011 01:11:46.194629 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e9d40a6d-5220-4537-bb4b-d4248101d864-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-m99m4\" (UID: \"e9d40a6d-5220-4537-bb4b-d4248101d864\") " pod="openstack/dnsmasq-dns-56df8fb6b7-m99m4" Oct 11 01:11:46 crc kubenswrapper[4743]: I1011 01:11:46.199815 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 11 01:11:46 crc kubenswrapper[4743]: I1011 01:11:46.297168 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/897a231b-400d-4d3b-b6eb-8bcee1ad5d1d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"897a231b-400d-4d3b-b6eb-8bcee1ad5d1d\") " pod="openstack/ceilometer-0" Oct 11 01:11:46 crc kubenswrapper[4743]: I1011 01:11:46.297228 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zv7m6\" (UniqueName: \"kubernetes.io/projected/897a231b-400d-4d3b-b6eb-8bcee1ad5d1d-kube-api-access-zv7m6\") pod \"ceilometer-0\" (UID: \"897a231b-400d-4d3b-b6eb-8bcee1ad5d1d\") " pod="openstack/ceilometer-0" Oct 11 01:11:46 crc kubenswrapper[4743]: I1011 01:11:46.297251 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/897a231b-400d-4d3b-b6eb-8bcee1ad5d1d-config-data\") pod 
\"ceilometer-0\" (UID: \"897a231b-400d-4d3b-b6eb-8bcee1ad5d1d\") " pod="openstack/ceilometer-0" Oct 11 01:11:46 crc kubenswrapper[4743]: I1011 01:11:46.297275 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/897a231b-400d-4d3b-b6eb-8bcee1ad5d1d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"897a231b-400d-4d3b-b6eb-8bcee1ad5d1d\") " pod="openstack/ceilometer-0" Oct 11 01:11:46 crc kubenswrapper[4743]: I1011 01:11:46.297297 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/897a231b-400d-4d3b-b6eb-8bcee1ad5d1d-run-httpd\") pod \"ceilometer-0\" (UID: \"897a231b-400d-4d3b-b6eb-8bcee1ad5d1d\") " pod="openstack/ceilometer-0" Oct 11 01:11:46 crc kubenswrapper[4743]: I1011 01:11:46.297313 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/897a231b-400d-4d3b-b6eb-8bcee1ad5d1d-scripts\") pod \"ceilometer-0\" (UID: \"897a231b-400d-4d3b-b6eb-8bcee1ad5d1d\") " pod="openstack/ceilometer-0" Oct 11 01:11:46 crc kubenswrapper[4743]: I1011 01:11:46.297365 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e9d40a6d-5220-4537-bb4b-d4248101d864-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-m99m4\" (UID: \"e9d40a6d-5220-4537-bb4b-d4248101d864\") " pod="openstack/dnsmasq-dns-56df8fb6b7-m99m4" Oct 11 01:11:46 crc kubenswrapper[4743]: I1011 01:11:46.297437 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e0bb9e3-337b-443b-baa8-8ea69d351ea1-logs\") pod \"placement-db-sync-z4tv4\" (UID: \"7e0bb9e3-337b-443b-baa8-8ea69d351ea1\") " pod="openstack/placement-db-sync-z4tv4" Oct 11 01:11:46 crc kubenswrapper[4743]: I1011 
01:11:46.297466 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e0bb9e3-337b-443b-baa8-8ea69d351ea1-combined-ca-bundle\") pod \"placement-db-sync-z4tv4\" (UID: \"7e0bb9e3-337b-443b-baa8-8ea69d351ea1\") " pod="openstack/placement-db-sync-z4tv4" Oct 11 01:11:46 crc kubenswrapper[4743]: I1011 01:11:46.297497 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e0bb9e3-337b-443b-baa8-8ea69d351ea1-config-data\") pod \"placement-db-sync-z4tv4\" (UID: \"7e0bb9e3-337b-443b-baa8-8ea69d351ea1\") " pod="openstack/placement-db-sync-z4tv4" Oct 11 01:11:46 crc kubenswrapper[4743]: I1011 01:11:46.297513 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dq9z\" (UniqueName: \"kubernetes.io/projected/7e0bb9e3-337b-443b-baa8-8ea69d351ea1-kube-api-access-5dq9z\") pod \"placement-db-sync-z4tv4\" (UID: \"7e0bb9e3-337b-443b-baa8-8ea69d351ea1\") " pod="openstack/placement-db-sync-z4tv4" Oct 11 01:11:46 crc kubenswrapper[4743]: I1011 01:11:46.297593 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9d40a6d-5220-4537-bb4b-d4248101d864-config\") pod \"dnsmasq-dns-56df8fb6b7-m99m4\" (UID: \"e9d40a6d-5220-4537-bb4b-d4248101d864\") " pod="openstack/dnsmasq-dns-56df8fb6b7-m99m4" Oct 11 01:11:46 crc kubenswrapper[4743]: I1011 01:11:46.297687 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e0bb9e3-337b-443b-baa8-8ea69d351ea1-scripts\") pod \"placement-db-sync-z4tv4\" (UID: \"7e0bb9e3-337b-443b-baa8-8ea69d351ea1\") " pod="openstack/placement-db-sync-z4tv4" Oct 11 01:11:46 crc kubenswrapper[4743]: I1011 01:11:46.297717 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/e9d40a6d-5220-4537-bb4b-d4248101d864-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-m99m4\" (UID: \"e9d40a6d-5220-4537-bb4b-d4248101d864\") " pod="openstack/dnsmasq-dns-56df8fb6b7-m99m4" Oct 11 01:11:46 crc kubenswrapper[4743]: I1011 01:11:46.297755 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgpfr\" (UniqueName: \"kubernetes.io/projected/e9d40a6d-5220-4537-bb4b-d4248101d864-kube-api-access-fgpfr\") pod \"dnsmasq-dns-56df8fb6b7-m99m4\" (UID: \"e9d40a6d-5220-4537-bb4b-d4248101d864\") " pod="openstack/dnsmasq-dns-56df8fb6b7-m99m4" Oct 11 01:11:46 crc kubenswrapper[4743]: I1011 01:11:46.297793 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e9d40a6d-5220-4537-bb4b-d4248101d864-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-m99m4\" (UID: \"e9d40a6d-5220-4537-bb4b-d4248101d864\") " pod="openstack/dnsmasq-dns-56df8fb6b7-m99m4" Oct 11 01:11:46 crc kubenswrapper[4743]: I1011 01:11:46.297844 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e9d40a6d-5220-4537-bb4b-d4248101d864-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-m99m4\" (UID: \"e9d40a6d-5220-4537-bb4b-d4248101d864\") " pod="openstack/dnsmasq-dns-56df8fb6b7-m99m4" Oct 11 01:11:46 crc kubenswrapper[4743]: I1011 01:11:46.297900 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/897a231b-400d-4d3b-b6eb-8bcee1ad5d1d-log-httpd\") pod \"ceilometer-0\" (UID: \"897a231b-400d-4d3b-b6eb-8bcee1ad5d1d\") " pod="openstack/ceilometer-0" Oct 11 01:11:46 crc kubenswrapper[4743]: I1011 01:11:46.298794 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/7e0bb9e3-337b-443b-baa8-8ea69d351ea1-logs\") pod \"placement-db-sync-z4tv4\" (UID: \"7e0bb9e3-337b-443b-baa8-8ea69d351ea1\") " pod="openstack/placement-db-sync-z4tv4" Oct 11 01:11:46 crc kubenswrapper[4743]: I1011 01:11:46.301597 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9d40a6d-5220-4537-bb4b-d4248101d864-config\") pod \"dnsmasq-dns-56df8fb6b7-m99m4\" (UID: \"e9d40a6d-5220-4537-bb4b-d4248101d864\") " pod="openstack/dnsmasq-dns-56df8fb6b7-m99m4" Oct 11 01:11:46 crc kubenswrapper[4743]: I1011 01:11:46.304408 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e0bb9e3-337b-443b-baa8-8ea69d351ea1-config-data\") pod \"placement-db-sync-z4tv4\" (UID: \"7e0bb9e3-337b-443b-baa8-8ea69d351ea1\") " pod="openstack/placement-db-sync-z4tv4" Oct 11 01:11:46 crc kubenswrapper[4743]: I1011 01:11:46.306037 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e0bb9e3-337b-443b-baa8-8ea69d351ea1-combined-ca-bundle\") pod \"placement-db-sync-z4tv4\" (UID: \"7e0bb9e3-337b-443b-baa8-8ea69d351ea1\") " pod="openstack/placement-db-sync-z4tv4" Oct 11 01:11:46 crc kubenswrapper[4743]: I1011 01:11:46.306628 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e9d40a6d-5220-4537-bb4b-d4248101d864-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-m99m4\" (UID: \"e9d40a6d-5220-4537-bb4b-d4248101d864\") " pod="openstack/dnsmasq-dns-56df8fb6b7-m99m4" Oct 11 01:11:46 crc kubenswrapper[4743]: I1011 01:11:46.307870 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e0bb9e3-337b-443b-baa8-8ea69d351ea1-scripts\") pod \"placement-db-sync-z4tv4\" (UID: \"7e0bb9e3-337b-443b-baa8-8ea69d351ea1\") " 
pod="openstack/placement-db-sync-z4tv4" Oct 11 01:11:46 crc kubenswrapper[4743]: I1011 01:11:46.308929 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e9d40a6d-5220-4537-bb4b-d4248101d864-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-m99m4\" (UID: \"e9d40a6d-5220-4537-bb4b-d4248101d864\") " pod="openstack/dnsmasq-dns-56df8fb6b7-m99m4" Oct 11 01:11:46 crc kubenswrapper[4743]: I1011 01:11:46.309226 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e9d40a6d-5220-4537-bb4b-d4248101d864-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-m99m4\" (UID: \"e9d40a6d-5220-4537-bb4b-d4248101d864\") " pod="openstack/dnsmasq-dns-56df8fb6b7-m99m4" Oct 11 01:11:46 crc kubenswrapper[4743]: I1011 01:11:46.309254 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e9d40a6d-5220-4537-bb4b-d4248101d864-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-m99m4\" (UID: \"e9d40a6d-5220-4537-bb4b-d4248101d864\") " pod="openstack/dnsmasq-dns-56df8fb6b7-m99m4" Oct 11 01:11:46 crc kubenswrapper[4743]: I1011 01:11:46.313835 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dq9z\" (UniqueName: \"kubernetes.io/projected/7e0bb9e3-337b-443b-baa8-8ea69d351ea1-kube-api-access-5dq9z\") pod \"placement-db-sync-z4tv4\" (UID: \"7e0bb9e3-337b-443b-baa8-8ea69d351ea1\") " pod="openstack/placement-db-sync-z4tv4" Oct 11 01:11:46 crc kubenswrapper[4743]: I1011 01:11:46.326764 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgpfr\" (UniqueName: \"kubernetes.io/projected/e9d40a6d-5220-4537-bb4b-d4248101d864-kube-api-access-fgpfr\") pod \"dnsmasq-dns-56df8fb6b7-m99m4\" (UID: \"e9d40a6d-5220-4537-bb4b-d4248101d864\") " pod="openstack/dnsmasq-dns-56df8fb6b7-m99m4" Oct 11 01:11:46 crc kubenswrapper[4743]: 
I1011 01:11:46.399219 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/897a231b-400d-4d3b-b6eb-8bcee1ad5d1d-log-httpd\") pod \"ceilometer-0\" (UID: \"897a231b-400d-4d3b-b6eb-8bcee1ad5d1d\") " pod="openstack/ceilometer-0" Oct 11 01:11:46 crc kubenswrapper[4743]: I1011 01:11:46.399544 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/897a231b-400d-4d3b-b6eb-8bcee1ad5d1d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"897a231b-400d-4d3b-b6eb-8bcee1ad5d1d\") " pod="openstack/ceilometer-0" Oct 11 01:11:46 crc kubenswrapper[4743]: I1011 01:11:46.399572 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zv7m6\" (UniqueName: \"kubernetes.io/projected/897a231b-400d-4d3b-b6eb-8bcee1ad5d1d-kube-api-access-zv7m6\") pod \"ceilometer-0\" (UID: \"897a231b-400d-4d3b-b6eb-8bcee1ad5d1d\") " pod="openstack/ceilometer-0" Oct 11 01:11:46 crc kubenswrapper[4743]: I1011 01:11:46.399590 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/897a231b-400d-4d3b-b6eb-8bcee1ad5d1d-config-data\") pod \"ceilometer-0\" (UID: \"897a231b-400d-4d3b-b6eb-8bcee1ad5d1d\") " pod="openstack/ceilometer-0" Oct 11 01:11:46 crc kubenswrapper[4743]: I1011 01:11:46.399615 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/897a231b-400d-4d3b-b6eb-8bcee1ad5d1d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"897a231b-400d-4d3b-b6eb-8bcee1ad5d1d\") " pod="openstack/ceilometer-0" Oct 11 01:11:46 crc kubenswrapper[4743]: I1011 01:11:46.399634 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/897a231b-400d-4d3b-b6eb-8bcee1ad5d1d-run-httpd\") pod 
\"ceilometer-0\" (UID: \"897a231b-400d-4d3b-b6eb-8bcee1ad5d1d\") " pod="openstack/ceilometer-0" Oct 11 01:11:46 crc kubenswrapper[4743]: I1011 01:11:46.399652 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/897a231b-400d-4d3b-b6eb-8bcee1ad5d1d-scripts\") pod \"ceilometer-0\" (UID: \"897a231b-400d-4d3b-b6eb-8bcee1ad5d1d\") " pod="openstack/ceilometer-0" Oct 11 01:11:46 crc kubenswrapper[4743]: I1011 01:11:46.400584 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/897a231b-400d-4d3b-b6eb-8bcee1ad5d1d-log-httpd\") pod \"ceilometer-0\" (UID: \"897a231b-400d-4d3b-b6eb-8bcee1ad5d1d\") " pod="openstack/ceilometer-0" Oct 11 01:11:46 crc kubenswrapper[4743]: I1011 01:11:46.404340 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/897a231b-400d-4d3b-b6eb-8bcee1ad5d1d-run-httpd\") pod \"ceilometer-0\" (UID: \"897a231b-400d-4d3b-b6eb-8bcee1ad5d1d\") " pod="openstack/ceilometer-0" Oct 11 01:11:46 crc kubenswrapper[4743]: I1011 01:11:46.408492 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/897a231b-400d-4d3b-b6eb-8bcee1ad5d1d-config-data\") pod \"ceilometer-0\" (UID: \"897a231b-400d-4d3b-b6eb-8bcee1ad5d1d\") " pod="openstack/ceilometer-0" Oct 11 01:11:46 crc kubenswrapper[4743]: I1011 01:11:46.408647 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/897a231b-400d-4d3b-b6eb-8bcee1ad5d1d-scripts\") pod \"ceilometer-0\" (UID: \"897a231b-400d-4d3b-b6eb-8bcee1ad5d1d\") " pod="openstack/ceilometer-0" Oct 11 01:11:46 crc kubenswrapper[4743]: I1011 01:11:46.410966 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/897a231b-400d-4d3b-b6eb-8bcee1ad5d1d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"897a231b-400d-4d3b-b6eb-8bcee1ad5d1d\") " pod="openstack/ceilometer-0" Oct 11 01:11:46 crc kubenswrapper[4743]: I1011 01:11:46.416430 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zv7m6\" (UniqueName: \"kubernetes.io/projected/897a231b-400d-4d3b-b6eb-8bcee1ad5d1d-kube-api-access-zv7m6\") pod \"ceilometer-0\" (UID: \"897a231b-400d-4d3b-b6eb-8bcee1ad5d1d\") " pod="openstack/ceilometer-0" Oct 11 01:11:46 crc kubenswrapper[4743]: I1011 01:11:46.417617 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/897a231b-400d-4d3b-b6eb-8bcee1ad5d1d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"897a231b-400d-4d3b-b6eb-8bcee1ad5d1d\") " pod="openstack/ceilometer-0" Oct 11 01:11:46 crc kubenswrapper[4743]: I1011 01:11:46.428723 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-m99m4" Oct 11 01:11:46 crc kubenswrapper[4743]: I1011 01:11:46.491394 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-z4tv4" Oct 11 01:11:46 crc kubenswrapper[4743]: I1011 01:11:46.525967 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 11 01:11:46 crc kubenswrapper[4743]: I1011 01:11:46.565467 4743 generic.go:334] "Generic (PLEG): container finished" podID="09c4deb5-c6d5-4eee-8de5-43299f3d6c62" containerID="40d06d4f8d7c4e5744682aa73c3100aa43887d364d0e94be4d4f62691da3177a" exitCode=0 Oct 11 01:11:46 crc kubenswrapper[4743]: I1011 01:11:46.565587 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-xlkcn" event={"ID":"09c4deb5-c6d5-4eee-8de5-43299f3d6c62","Type":"ContainerDied","Data":"40d06d4f8d7c4e5744682aa73c3100aa43887d364d0e94be4d4f62691da3177a"} Oct 11 01:11:46 crc kubenswrapper[4743]: I1011 01:11:46.565620 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-xlkcn" event={"ID":"09c4deb5-c6d5-4eee-8de5-43299f3d6c62","Type":"ContainerStarted","Data":"11a2ee15c45b331c331980f8cdaa4c89a95b85c773a9570e0178360a8e047489"} Oct 11 01:11:46 crc kubenswrapper[4743]: I1011 01:11:46.576661 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-jvxjh"] Oct 11 01:11:46 crc kubenswrapper[4743]: W1011 01:11:46.578420 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod130eba3b_8f13_481a_8f27_60889a94bbe8.slice/crio-e794a59cbb6be038b15586b8942dfd25e875520a346360c2686d8d88cb878a5c WatchSource:0}: Error finding container e794a59cbb6be038b15586b8942dfd25e875520a346360c2686d8d88cb878a5c: Status 404 returned error can't find the container with id e794a59cbb6be038b15586b8942dfd25e875520a346360c2686d8d88cb878a5c Oct 11 01:11:46 crc kubenswrapper[4743]: I1011 01:11:46.734678 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-xqz2q"] Oct 11 01:11:46 crc kubenswrapper[4743]: I1011 01:11:46.953713 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-m99m4"] Oct 11 01:11:46 crc 
kubenswrapper[4743]: W1011 01:11:46.960825 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode9d40a6d_5220_4537_bb4b_d4248101d864.slice/crio-e856e77f4cfc97b7f3f4de0cb6f7b1b55e284939ba6c68d89cb607fb0b22d158 WatchSource:0}: Error finding container e856e77f4cfc97b7f3f4de0cb6f7b1b55e284939ba6c68d89cb607fb0b22d158: Status 404 returned error can't find the container with id e856e77f4cfc97b7f3f4de0cb6f7b1b55e284939ba6c68d89cb607fb0b22d158 Oct 11 01:11:47 crc kubenswrapper[4743]: I1011 01:11:47.071442 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-xlkcn" Oct 11 01:11:47 crc kubenswrapper[4743]: I1011 01:11:47.130213 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48sns\" (UniqueName: \"kubernetes.io/projected/09c4deb5-c6d5-4eee-8de5-43299f3d6c62-kube-api-access-48sns\") pod \"09c4deb5-c6d5-4eee-8de5-43299f3d6c62\" (UID: \"09c4deb5-c6d5-4eee-8de5-43299f3d6c62\") " Oct 11 01:11:47 crc kubenswrapper[4743]: I1011 01:11:47.130276 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/09c4deb5-c6d5-4eee-8de5-43299f3d6c62-ovsdbserver-nb\") pod \"09c4deb5-c6d5-4eee-8de5-43299f3d6c62\" (UID: \"09c4deb5-c6d5-4eee-8de5-43299f3d6c62\") " Oct 11 01:11:47 crc kubenswrapper[4743]: I1011 01:11:47.130385 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/09c4deb5-c6d5-4eee-8de5-43299f3d6c62-ovsdbserver-sb\") pod \"09c4deb5-c6d5-4eee-8de5-43299f3d6c62\" (UID: \"09c4deb5-c6d5-4eee-8de5-43299f3d6c62\") " Oct 11 01:11:47 crc kubenswrapper[4743]: I1011 01:11:47.130412 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/09c4deb5-c6d5-4eee-8de5-43299f3d6c62-dns-svc\") pod \"09c4deb5-c6d5-4eee-8de5-43299f3d6c62\" (UID: \"09c4deb5-c6d5-4eee-8de5-43299f3d6c62\") " Oct 11 01:11:47 crc kubenswrapper[4743]: I1011 01:11:47.130506 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/09c4deb5-c6d5-4eee-8de5-43299f3d6c62-dns-swift-storage-0\") pod \"09c4deb5-c6d5-4eee-8de5-43299f3d6c62\" (UID: \"09c4deb5-c6d5-4eee-8de5-43299f3d6c62\") " Oct 11 01:11:47 crc kubenswrapper[4743]: I1011 01:11:47.130541 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09c4deb5-c6d5-4eee-8de5-43299f3d6c62-config\") pod \"09c4deb5-c6d5-4eee-8de5-43299f3d6c62\" (UID: \"09c4deb5-c6d5-4eee-8de5-43299f3d6c62\") " Oct 11 01:11:47 crc kubenswrapper[4743]: I1011 01:11:47.135198 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09c4deb5-c6d5-4eee-8de5-43299f3d6c62-kube-api-access-48sns" (OuterVolumeSpecName: "kube-api-access-48sns") pod "09c4deb5-c6d5-4eee-8de5-43299f3d6c62" (UID: "09c4deb5-c6d5-4eee-8de5-43299f3d6c62"). InnerVolumeSpecName "kube-api-access-48sns". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:11:47 crc kubenswrapper[4743]: I1011 01:11:47.155503 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09c4deb5-c6d5-4eee-8de5-43299f3d6c62-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "09c4deb5-c6d5-4eee-8de5-43299f3d6c62" (UID: "09c4deb5-c6d5-4eee-8de5-43299f3d6c62"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:11:47 crc kubenswrapper[4743]: I1011 01:11:47.173718 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09c4deb5-c6d5-4eee-8de5-43299f3d6c62-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "09c4deb5-c6d5-4eee-8de5-43299f3d6c62" (UID: "09c4deb5-c6d5-4eee-8de5-43299f3d6c62"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:11:47 crc kubenswrapper[4743]: I1011 01:11:47.173849 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09c4deb5-c6d5-4eee-8de5-43299f3d6c62-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "09c4deb5-c6d5-4eee-8de5-43299f3d6c62" (UID: "09c4deb5-c6d5-4eee-8de5-43299f3d6c62"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:11:47 crc kubenswrapper[4743]: I1011 01:11:47.182623 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09c4deb5-c6d5-4eee-8de5-43299f3d6c62-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "09c4deb5-c6d5-4eee-8de5-43299f3d6c62" (UID: "09c4deb5-c6d5-4eee-8de5-43299f3d6c62"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:11:47 crc kubenswrapper[4743]: I1011 01:11:47.190804 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09c4deb5-c6d5-4eee-8de5-43299f3d6c62-config" (OuterVolumeSpecName: "config") pod "09c4deb5-c6d5-4eee-8de5-43299f3d6c62" (UID: "09c4deb5-c6d5-4eee-8de5-43299f3d6c62"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:11:47 crc kubenswrapper[4743]: I1011 01:11:47.193783 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-z4tv4"] Oct 11 01:11:47 crc kubenswrapper[4743]: I1011 01:11:47.200264 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 11 01:11:47 crc kubenswrapper[4743]: I1011 01:11:47.236032 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/09c4deb5-c6d5-4eee-8de5-43299f3d6c62-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 11 01:11:47 crc kubenswrapper[4743]: I1011 01:11:47.236061 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/09c4deb5-c6d5-4eee-8de5-43299f3d6c62-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 11 01:11:47 crc kubenswrapper[4743]: I1011 01:11:47.236070 4743 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/09c4deb5-c6d5-4eee-8de5-43299f3d6c62-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 11 01:11:47 crc kubenswrapper[4743]: I1011 01:11:47.236080 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09c4deb5-c6d5-4eee-8de5-43299f3d6c62-config\") on node \"crc\" DevicePath \"\"" Oct 11 01:11:47 crc kubenswrapper[4743]: I1011 01:11:47.236088 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48sns\" (UniqueName: \"kubernetes.io/projected/09c4deb5-c6d5-4eee-8de5-43299f3d6c62-kube-api-access-48sns\") on node \"crc\" DevicePath \"\"" Oct 11 01:11:47 crc kubenswrapper[4743]: I1011 01:11:47.236095 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/09c4deb5-c6d5-4eee-8de5-43299f3d6c62-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 11 01:11:47 crc 
kubenswrapper[4743]: I1011 01:11:47.582307 4743 generic.go:334] "Generic (PLEG): container finished" podID="130eba3b-8f13-481a-8f27-60889a94bbe8" containerID="06cc457e035a4d9c6d674760cade7b216feb35f5a8b06392c123248d5502c581" exitCode=0 Oct 11 01:11:47 crc kubenswrapper[4743]: I1011 01:11:47.582374 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-jvxjh" event={"ID":"130eba3b-8f13-481a-8f27-60889a94bbe8","Type":"ContainerDied","Data":"06cc457e035a4d9c6d674760cade7b216feb35f5a8b06392c123248d5502c581"} Oct 11 01:11:47 crc kubenswrapper[4743]: I1011 01:11:47.582776 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-jvxjh" event={"ID":"130eba3b-8f13-481a-8f27-60889a94bbe8","Type":"ContainerStarted","Data":"e794a59cbb6be038b15586b8942dfd25e875520a346360c2686d8d88cb878a5c"} Oct 11 01:11:47 crc kubenswrapper[4743]: I1011 01:11:47.586846 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-xlkcn" event={"ID":"09c4deb5-c6d5-4eee-8de5-43299f3d6c62","Type":"ContainerDied","Data":"11a2ee15c45b331c331980f8cdaa4c89a95b85c773a9570e0178360a8e047489"} Oct 11 01:11:47 crc kubenswrapper[4743]: I1011 01:11:47.586890 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-xlkcn" Oct 11 01:11:47 crc kubenswrapper[4743]: I1011 01:11:47.586927 4743 scope.go:117] "RemoveContainer" containerID="40d06d4f8d7c4e5744682aa73c3100aa43887d364d0e94be4d4f62691da3177a" Oct 11 01:11:47 crc kubenswrapper[4743]: I1011 01:11:47.592762 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"897a231b-400d-4d3b-b6eb-8bcee1ad5d1d","Type":"ContainerStarted","Data":"182450f29bacd468b53883ed4ede946e02cb832335cd9fb784d7f0120ab454f1"} Oct 11 01:11:47 crc kubenswrapper[4743]: I1011 01:11:47.594587 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xqz2q" event={"ID":"3333bf90-e88b-455c-9719-60c0c49b83fe","Type":"ContainerStarted","Data":"77945c6fa9449bd57597a49f83be1a34d5a578743bf8d18b705387926f910e3d"} Oct 11 01:11:47 crc kubenswrapper[4743]: I1011 01:11:47.594625 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xqz2q" event={"ID":"3333bf90-e88b-455c-9719-60c0c49b83fe","Type":"ContainerStarted","Data":"4584a51687c5b1f909c0a273f081a540ed52054075bcb179d5f33ac61ff6952b"} Oct 11 01:11:47 crc kubenswrapper[4743]: I1011 01:11:47.597206 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-z4tv4" event={"ID":"7e0bb9e3-337b-443b-baa8-8ea69d351ea1","Type":"ContainerStarted","Data":"8c5663c91c4e792da1ff16a0e1ddc611dca00f90c25b87d0699b13a59ae362a8"} Oct 11 01:11:47 crc kubenswrapper[4743]: I1011 01:11:47.604811 4743 generic.go:334] "Generic (PLEG): container finished" podID="e9d40a6d-5220-4537-bb4b-d4248101d864" containerID="7ec6bf9ba82a05f95bba6448f0c6a65d716c6b6d4e212991d59441fa2916cea7" exitCode=0 Oct 11 01:11:47 crc kubenswrapper[4743]: I1011 01:11:47.605132 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-m99m4" 
event={"ID":"e9d40a6d-5220-4537-bb4b-d4248101d864","Type":"ContainerDied","Data":"7ec6bf9ba82a05f95bba6448f0c6a65d716c6b6d4e212991d59441fa2916cea7"} Oct 11 01:11:47 crc kubenswrapper[4743]: I1011 01:11:47.606049 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-m99m4" event={"ID":"e9d40a6d-5220-4537-bb4b-d4248101d864","Type":"ContainerStarted","Data":"e856e77f4cfc97b7f3f4de0cb6f7b1b55e284939ba6c68d89cb607fb0b22d158"} Oct 11 01:11:47 crc kubenswrapper[4743]: I1011 01:11:47.630450 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-xqz2q" podStartSLOduration=2.630428985 podStartE2EDuration="2.630428985s" podCreationTimestamp="2025-10-11 01:11:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 01:11:47.622903634 +0000 UTC m=+1202.275884041" watchObservedRunningTime="2025-10-11 01:11:47.630428985 +0000 UTC m=+1202.283409392" Oct 11 01:11:47 crc kubenswrapper[4743]: I1011 01:11:47.698278 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-xlkcn"] Oct 11 01:11:47 crc kubenswrapper[4743]: I1011 01:11:47.707191 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-xlkcn"] Oct 11 01:11:48 crc kubenswrapper[4743]: I1011 01:11:48.009159 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-jvxjh" Oct 11 01:11:48 crc kubenswrapper[4743]: I1011 01:11:48.052142 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/130eba3b-8f13-481a-8f27-60889a94bbe8-config\") pod \"130eba3b-8f13-481a-8f27-60889a94bbe8\" (UID: \"130eba3b-8f13-481a-8f27-60889a94bbe8\") " Oct 11 01:11:48 crc kubenswrapper[4743]: I1011 01:11:48.052265 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/130eba3b-8f13-481a-8f27-60889a94bbe8-ovsdbserver-nb\") pod \"130eba3b-8f13-481a-8f27-60889a94bbe8\" (UID: \"130eba3b-8f13-481a-8f27-60889a94bbe8\") " Oct 11 01:11:48 crc kubenswrapper[4743]: I1011 01:11:48.052367 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/130eba3b-8f13-481a-8f27-60889a94bbe8-dns-svc\") pod \"130eba3b-8f13-481a-8f27-60889a94bbe8\" (UID: \"130eba3b-8f13-481a-8f27-60889a94bbe8\") " Oct 11 01:11:48 crc kubenswrapper[4743]: I1011 01:11:48.052400 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/130eba3b-8f13-481a-8f27-60889a94bbe8-dns-swift-storage-0\") pod \"130eba3b-8f13-481a-8f27-60889a94bbe8\" (UID: \"130eba3b-8f13-481a-8f27-60889a94bbe8\") " Oct 11 01:11:48 crc kubenswrapper[4743]: I1011 01:11:48.052447 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hdxq2\" (UniqueName: \"kubernetes.io/projected/130eba3b-8f13-481a-8f27-60889a94bbe8-kube-api-access-hdxq2\") pod \"130eba3b-8f13-481a-8f27-60889a94bbe8\" (UID: \"130eba3b-8f13-481a-8f27-60889a94bbe8\") " Oct 11 01:11:48 crc kubenswrapper[4743]: I1011 01:11:48.052482 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/130eba3b-8f13-481a-8f27-60889a94bbe8-ovsdbserver-sb\") pod \"130eba3b-8f13-481a-8f27-60889a94bbe8\" (UID: \"130eba3b-8f13-481a-8f27-60889a94bbe8\") " Oct 11 01:11:48 crc kubenswrapper[4743]: I1011 01:11:48.071982 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/130eba3b-8f13-481a-8f27-60889a94bbe8-kube-api-access-hdxq2" (OuterVolumeSpecName: "kube-api-access-hdxq2") pod "130eba3b-8f13-481a-8f27-60889a94bbe8" (UID: "130eba3b-8f13-481a-8f27-60889a94bbe8"). InnerVolumeSpecName "kube-api-access-hdxq2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:11:48 crc kubenswrapper[4743]: I1011 01:11:48.090556 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/130eba3b-8f13-481a-8f27-60889a94bbe8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "130eba3b-8f13-481a-8f27-60889a94bbe8" (UID: "130eba3b-8f13-481a-8f27-60889a94bbe8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:11:48 crc kubenswrapper[4743]: I1011 01:11:48.103216 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/130eba3b-8f13-481a-8f27-60889a94bbe8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "130eba3b-8f13-481a-8f27-60889a94bbe8" (UID: "130eba3b-8f13-481a-8f27-60889a94bbe8"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:11:48 crc kubenswrapper[4743]: I1011 01:11:48.105505 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/130eba3b-8f13-481a-8f27-60889a94bbe8-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "130eba3b-8f13-481a-8f27-60889a94bbe8" (UID: "130eba3b-8f13-481a-8f27-60889a94bbe8"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:11:48 crc kubenswrapper[4743]: I1011 01:11:48.107721 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09c4deb5-c6d5-4eee-8de5-43299f3d6c62" path="/var/lib/kubelet/pods/09c4deb5-c6d5-4eee-8de5-43299f3d6c62/volumes" Oct 11 01:11:48 crc kubenswrapper[4743]: I1011 01:11:48.119877 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/130eba3b-8f13-481a-8f27-60889a94bbe8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "130eba3b-8f13-481a-8f27-60889a94bbe8" (UID: "130eba3b-8f13-481a-8f27-60889a94bbe8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:11:48 crc kubenswrapper[4743]: I1011 01:11:48.120556 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/130eba3b-8f13-481a-8f27-60889a94bbe8-config" (OuterVolumeSpecName: "config") pod "130eba3b-8f13-481a-8f27-60889a94bbe8" (UID: "130eba3b-8f13-481a-8f27-60889a94bbe8"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:11:48 crc kubenswrapper[4743]: I1011 01:11:48.155053 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/130eba3b-8f13-481a-8f27-60889a94bbe8-config\") on node \"crc\" DevicePath \"\"" Oct 11 01:11:48 crc kubenswrapper[4743]: I1011 01:11:48.155085 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/130eba3b-8f13-481a-8f27-60889a94bbe8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 11 01:11:48 crc kubenswrapper[4743]: I1011 01:11:48.155095 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/130eba3b-8f13-481a-8f27-60889a94bbe8-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 11 01:11:48 crc kubenswrapper[4743]: I1011 01:11:48.155104 4743 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/130eba3b-8f13-481a-8f27-60889a94bbe8-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 11 01:11:48 crc kubenswrapper[4743]: I1011 01:11:48.155113 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hdxq2\" (UniqueName: \"kubernetes.io/projected/130eba3b-8f13-481a-8f27-60889a94bbe8-kube-api-access-hdxq2\") on node \"crc\" DevicePath \"\"" Oct 11 01:11:48 crc kubenswrapper[4743]: I1011 01:11:48.155123 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/130eba3b-8f13-481a-8f27-60889a94bbe8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 11 01:11:48 crc kubenswrapper[4743]: I1011 01:11:48.552061 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-70e4-account-create-v8fz8"] Oct 11 01:11:48 crc kubenswrapper[4743]: E1011 01:11:48.552830 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09c4deb5-c6d5-4eee-8de5-43299f3d6c62" 
containerName="init" Oct 11 01:11:48 crc kubenswrapper[4743]: I1011 01:11:48.552844 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="09c4deb5-c6d5-4eee-8de5-43299f3d6c62" containerName="init" Oct 11 01:11:48 crc kubenswrapper[4743]: E1011 01:11:48.552905 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="130eba3b-8f13-481a-8f27-60889a94bbe8" containerName="init" Oct 11 01:11:48 crc kubenswrapper[4743]: I1011 01:11:48.552915 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="130eba3b-8f13-481a-8f27-60889a94bbe8" containerName="init" Oct 11 01:11:48 crc kubenswrapper[4743]: I1011 01:11:48.553120 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="130eba3b-8f13-481a-8f27-60889a94bbe8" containerName="init" Oct 11 01:11:48 crc kubenswrapper[4743]: I1011 01:11:48.553144 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="09c4deb5-c6d5-4eee-8de5-43299f3d6c62" containerName="init" Oct 11 01:11:48 crc kubenswrapper[4743]: I1011 01:11:48.553827 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-70e4-account-create-v8fz8" Oct 11 01:11:48 crc kubenswrapper[4743]: I1011 01:11:48.558514 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret" Oct 11 01:11:48 crc kubenswrapper[4743]: I1011 01:11:48.561012 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-70e4-account-create-v8fz8"] Oct 11 01:11:48 crc kubenswrapper[4743]: I1011 01:11:48.617702 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-jvxjh" event={"ID":"130eba3b-8f13-481a-8f27-60889a94bbe8","Type":"ContainerDied","Data":"e794a59cbb6be038b15586b8942dfd25e875520a346360c2686d8d88cb878a5c"} Oct 11 01:11:48 crc kubenswrapper[4743]: I1011 01:11:48.617755 4743 scope.go:117] "RemoveContainer" containerID="06cc457e035a4d9c6d674760cade7b216feb35f5a8b06392c123248d5502c581" Oct 11 01:11:48 crc kubenswrapper[4743]: I1011 01:11:48.617953 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-jvxjh" Oct 11 01:11:48 crc kubenswrapper[4743]: I1011 01:11:48.631360 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-m99m4" event={"ID":"e9d40a6d-5220-4537-bb4b-d4248101d864","Type":"ContainerStarted","Data":"ba06ee2a5eb0ceba4aabba735361a4537bee8d4058f703e2cb13ff6effec3d2e"} Oct 11 01:11:48 crc kubenswrapper[4743]: I1011 01:11:48.631441 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-56df8fb6b7-m99m4" Oct 11 01:11:48 crc kubenswrapper[4743]: I1011 01:11:48.664252 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-56df8fb6b7-m99m4" podStartSLOduration=2.66422919 podStartE2EDuration="2.66422919s" podCreationTimestamp="2025-10-11 01:11:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 01:11:48.661227384 +0000 UTC m=+1203.314207771" watchObservedRunningTime="2025-10-11 01:11:48.66422919 +0000 UTC m=+1203.317209587" Oct 11 01:11:48 crc kubenswrapper[4743]: I1011 01:11:48.681298 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24ktr\" (UniqueName: \"kubernetes.io/projected/3e5b2b21-68ca-41b3-96f4-f3dabc8cb94d-kube-api-access-24ktr\") pod \"heat-70e4-account-create-v8fz8\" (UID: \"3e5b2b21-68ca-41b3-96f4-f3dabc8cb94d\") " pod="openstack/heat-70e4-account-create-v8fz8" Oct 11 01:11:48 crc kubenswrapper[4743]: I1011 01:11:48.754781 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-jvxjh"] Oct 11 01:11:48 crc kubenswrapper[4743]: I1011 01:11:48.768919 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-jvxjh"] Oct 11 01:11:48 crc kubenswrapper[4743]: I1011 01:11:48.781487 4743 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/neutron-33f9-account-create-zvc5d"] Oct 11 01:11:48 crc kubenswrapper[4743]: I1011 01:11:48.782773 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-33f9-account-create-zvc5d" Oct 11 01:11:48 crc kubenswrapper[4743]: I1011 01:11:48.782928 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24ktr\" (UniqueName: \"kubernetes.io/projected/3e5b2b21-68ca-41b3-96f4-f3dabc8cb94d-kube-api-access-24ktr\") pod \"heat-70e4-account-create-v8fz8\" (UID: \"3e5b2b21-68ca-41b3-96f4-f3dabc8cb94d\") " pod="openstack/heat-70e4-account-create-v8fz8" Oct 11 01:11:48 crc kubenswrapper[4743]: I1011 01:11:48.786915 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Oct 11 01:11:48 crc kubenswrapper[4743]: I1011 01:11:48.794562 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-33f9-account-create-zvc5d"] Oct 11 01:11:48 crc kubenswrapper[4743]: I1011 01:11:48.819651 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24ktr\" (UniqueName: \"kubernetes.io/projected/3e5b2b21-68ca-41b3-96f4-f3dabc8cb94d-kube-api-access-24ktr\") pod \"heat-70e4-account-create-v8fz8\" (UID: \"3e5b2b21-68ca-41b3-96f4-f3dabc8cb94d\") " pod="openstack/heat-70e4-account-create-v8fz8" Oct 11 01:11:48 crc kubenswrapper[4743]: I1011 01:11:48.861603 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-11c8-account-create-n8n9x"] Oct 11 01:11:48 crc kubenswrapper[4743]: I1011 01:11:48.862907 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-11c8-account-create-n8n9x" Oct 11 01:11:48 crc kubenswrapper[4743]: I1011 01:11:48.867996 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Oct 11 01:11:48 crc kubenswrapper[4743]: I1011 01:11:48.885011 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-11c8-account-create-n8n9x"] Oct 11 01:11:48 crc kubenswrapper[4743]: I1011 01:11:48.886469 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pzcn\" (UniqueName: \"kubernetes.io/projected/227021ae-b786-43b0-b61b-c3317fd4ee34-kube-api-access-6pzcn\") pod \"neutron-33f9-account-create-zvc5d\" (UID: \"227021ae-b786-43b0-b61b-c3317fd4ee34\") " pod="openstack/neutron-33f9-account-create-zvc5d" Oct 11 01:11:48 crc kubenswrapper[4743]: I1011 01:11:48.901728 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-70e4-account-create-v8fz8" Oct 11 01:11:48 crc kubenswrapper[4743]: I1011 01:11:48.922466 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-1f1d-account-create-8q67r"] Oct 11 01:11:48 crc kubenswrapper[4743]: I1011 01:11:48.923703 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-1f1d-account-create-8q67r" Oct 11 01:11:48 crc kubenswrapper[4743]: I1011 01:11:48.926216 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Oct 11 01:11:48 crc kubenswrapper[4743]: I1011 01:11:48.941153 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-1f1d-account-create-8q67r"] Oct 11 01:11:48 crc kubenswrapper[4743]: I1011 01:11:48.988778 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pzcn\" (UniqueName: \"kubernetes.io/projected/227021ae-b786-43b0-b61b-c3317fd4ee34-kube-api-access-6pzcn\") pod \"neutron-33f9-account-create-zvc5d\" (UID: \"227021ae-b786-43b0-b61b-c3317fd4ee34\") " pod="openstack/neutron-33f9-account-create-zvc5d" Oct 11 01:11:48 crc kubenswrapper[4743]: I1011 01:11:48.990085 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgr7g\" (UniqueName: \"kubernetes.io/projected/ad7c265c-766a-446c-863d-e6137abdd0e9-kube-api-access-tgr7g\") pod \"cinder-1f1d-account-create-8q67r\" (UID: \"ad7c265c-766a-446c-863d-e6137abdd0e9\") " pod="openstack/cinder-1f1d-account-create-8q67r" Oct 11 01:11:48 crc kubenswrapper[4743]: I1011 01:11:48.990317 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgg6x\" (UniqueName: \"kubernetes.io/projected/d6c0d49c-5d16-433e-aba5-de2c1b0746bf-kube-api-access-sgg6x\") pod \"barbican-11c8-account-create-n8n9x\" (UID: \"d6c0d49c-5d16-433e-aba5-de2c1b0746bf\") " pod="openstack/barbican-11c8-account-create-n8n9x" Oct 11 01:11:49 crc kubenswrapper[4743]: I1011 01:11:49.007021 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pzcn\" (UniqueName: \"kubernetes.io/projected/227021ae-b786-43b0-b61b-c3317fd4ee34-kube-api-access-6pzcn\") pod \"neutron-33f9-account-create-zvc5d\" (UID: 
\"227021ae-b786-43b0-b61b-c3317fd4ee34\") " pod="openstack/neutron-33f9-account-create-zvc5d" Oct 11 01:11:49 crc kubenswrapper[4743]: I1011 01:11:49.096523 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgr7g\" (UniqueName: \"kubernetes.io/projected/ad7c265c-766a-446c-863d-e6137abdd0e9-kube-api-access-tgr7g\") pod \"cinder-1f1d-account-create-8q67r\" (UID: \"ad7c265c-766a-446c-863d-e6137abdd0e9\") " pod="openstack/cinder-1f1d-account-create-8q67r" Oct 11 01:11:49 crc kubenswrapper[4743]: I1011 01:11:49.097051 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgg6x\" (UniqueName: \"kubernetes.io/projected/d6c0d49c-5d16-433e-aba5-de2c1b0746bf-kube-api-access-sgg6x\") pod \"barbican-11c8-account-create-n8n9x\" (UID: \"d6c0d49c-5d16-433e-aba5-de2c1b0746bf\") " pod="openstack/barbican-11c8-account-create-n8n9x" Oct 11 01:11:49 crc kubenswrapper[4743]: I1011 01:11:49.106838 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-33f9-account-create-zvc5d" Oct 11 01:11:49 crc kubenswrapper[4743]: I1011 01:11:49.116345 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgg6x\" (UniqueName: \"kubernetes.io/projected/d6c0d49c-5d16-433e-aba5-de2c1b0746bf-kube-api-access-sgg6x\") pod \"barbican-11c8-account-create-n8n9x\" (UID: \"d6c0d49c-5d16-433e-aba5-de2c1b0746bf\") " pod="openstack/barbican-11c8-account-create-n8n9x" Oct 11 01:11:49 crc kubenswrapper[4743]: I1011 01:11:49.117541 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgr7g\" (UniqueName: \"kubernetes.io/projected/ad7c265c-766a-446c-863d-e6137abdd0e9-kube-api-access-tgr7g\") pod \"cinder-1f1d-account-create-8q67r\" (UID: \"ad7c265c-766a-446c-863d-e6137abdd0e9\") " pod="openstack/cinder-1f1d-account-create-8q67r" Oct 11 01:11:49 crc kubenswrapper[4743]: I1011 01:11:49.194274 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-11c8-account-create-n8n9x" Oct 11 01:11:49 crc kubenswrapper[4743]: I1011 01:11:49.292373 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-1f1d-account-create-8q67r" Oct 11 01:11:49 crc kubenswrapper[4743]: I1011 01:11:49.489040 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-70e4-account-create-v8fz8"] Oct 11 01:11:49 crc kubenswrapper[4743]: I1011 01:11:49.608249 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-33f9-account-create-zvc5d"] Oct 11 01:11:49 crc kubenswrapper[4743]: W1011 01:11:49.623988 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod227021ae_b786_43b0_b61b_c3317fd4ee34.slice/crio-df2486e4dd52add9c5f908622e2061c19cc813be704dbcf998136f2be2cf77e1 WatchSource:0}: Error finding container df2486e4dd52add9c5f908622e2061c19cc813be704dbcf998136f2be2cf77e1: Status 404 returned error can't find the container with id df2486e4dd52add9c5f908622e2061c19cc813be704dbcf998136f2be2cf77e1 Oct 11 01:11:49 crc kubenswrapper[4743]: I1011 01:11:49.651227 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-70e4-account-create-v8fz8" event={"ID":"3e5b2b21-68ca-41b3-96f4-f3dabc8cb94d","Type":"ContainerStarted","Data":"899b1ed883cef79d1e9012a21ffbd0575651b51a7ac22c35fab591dc15c1335b"} Oct 11 01:11:49 crc kubenswrapper[4743]: I1011 01:11:49.773465 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-11c8-account-create-n8n9x"] Oct 11 01:11:49 crc kubenswrapper[4743]: W1011 01:11:49.818020 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd6c0d49c_5d16_433e_aba5_de2c1b0746bf.slice/crio-f09e73fedd18e60a57382a9f7fad4d8b14c3a26d25815872ee4aef5ac6e3b313 WatchSource:0}: Error finding container f09e73fedd18e60a57382a9f7fad4d8b14c3a26d25815872ee4aef5ac6e3b313: Status 404 returned error can't find the container with id f09e73fedd18e60a57382a9f7fad4d8b14c3a26d25815872ee4aef5ac6e3b313 Oct 11 01:11:49 crc 
kubenswrapper[4743]: I1011 01:11:49.929769 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 11 01:11:49 crc kubenswrapper[4743]: I1011 01:11:49.942737 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-1f1d-account-create-8q67r"] Oct 11 01:11:50 crc kubenswrapper[4743]: I1011 01:11:50.128078 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="130eba3b-8f13-481a-8f27-60889a94bbe8" path="/var/lib/kubelet/pods/130eba3b-8f13-481a-8f27-60889a94bbe8/volumes" Oct 11 01:11:50 crc kubenswrapper[4743]: I1011 01:11:50.672205 4743 generic.go:334] "Generic (PLEG): container finished" podID="227021ae-b786-43b0-b61b-c3317fd4ee34" containerID="bcf05e2bc0f024539c7285640e08713d28742c1031c605bd17a58f39ab373328" exitCode=0 Oct 11 01:11:50 crc kubenswrapper[4743]: I1011 01:11:50.672579 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-33f9-account-create-zvc5d" event={"ID":"227021ae-b786-43b0-b61b-c3317fd4ee34","Type":"ContainerDied","Data":"bcf05e2bc0f024539c7285640e08713d28742c1031c605bd17a58f39ab373328"} Oct 11 01:11:50 crc kubenswrapper[4743]: I1011 01:11:50.672607 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-33f9-account-create-zvc5d" event={"ID":"227021ae-b786-43b0-b61b-c3317fd4ee34","Type":"ContainerStarted","Data":"df2486e4dd52add9c5f908622e2061c19cc813be704dbcf998136f2be2cf77e1"} Oct 11 01:11:50 crc kubenswrapper[4743]: I1011 01:11:50.674417 4743 generic.go:334] "Generic (PLEG): container finished" podID="3e5b2b21-68ca-41b3-96f4-f3dabc8cb94d" containerID="36d04f245bfd66df2b42f8088c0fe20486a083c7fdf3628d1cf0ed6e4a7b8686" exitCode=0 Oct 11 01:11:50 crc kubenswrapper[4743]: I1011 01:11:50.674504 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-70e4-account-create-v8fz8" 
event={"ID":"3e5b2b21-68ca-41b3-96f4-f3dabc8cb94d","Type":"ContainerDied","Data":"36d04f245bfd66df2b42f8088c0fe20486a083c7fdf3628d1cf0ed6e4a7b8686"} Oct 11 01:11:50 crc kubenswrapper[4743]: I1011 01:11:50.677176 4743 generic.go:334] "Generic (PLEG): container finished" podID="ad7c265c-766a-446c-863d-e6137abdd0e9" containerID="950dcfc39960b21ca647842c438cad4ee651fddcfadb031860c91b04c2afbe16" exitCode=0 Oct 11 01:11:50 crc kubenswrapper[4743]: I1011 01:11:50.677243 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-1f1d-account-create-8q67r" event={"ID":"ad7c265c-766a-446c-863d-e6137abdd0e9","Type":"ContainerDied","Data":"950dcfc39960b21ca647842c438cad4ee651fddcfadb031860c91b04c2afbe16"} Oct 11 01:11:50 crc kubenswrapper[4743]: I1011 01:11:50.677259 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-1f1d-account-create-8q67r" event={"ID":"ad7c265c-766a-446c-863d-e6137abdd0e9","Type":"ContainerStarted","Data":"ab9c7ca7c58a30a1c62d97f0efeac8df52fc777fd929123f66c0c78fb32640e0"} Oct 11 01:11:50 crc kubenswrapper[4743]: I1011 01:11:50.678461 4743 generic.go:334] "Generic (PLEG): container finished" podID="d6c0d49c-5d16-433e-aba5-de2c1b0746bf" containerID="4e1f81d4023643285b93e504f933c75d55c24ea4c3f44ac4cf984109a8b538ed" exitCode=0 Oct 11 01:11:50 crc kubenswrapper[4743]: I1011 01:11:50.678529 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-11c8-account-create-n8n9x" event={"ID":"d6c0d49c-5d16-433e-aba5-de2c1b0746bf","Type":"ContainerDied","Data":"4e1f81d4023643285b93e504f933c75d55c24ea4c3f44ac4cf984109a8b538ed"} Oct 11 01:11:50 crc kubenswrapper[4743]: I1011 01:11:50.678586 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-11c8-account-create-n8n9x" event={"ID":"d6c0d49c-5d16-433e-aba5-de2c1b0746bf","Type":"ContainerStarted","Data":"f09e73fedd18e60a57382a9f7fad4d8b14c3a26d25815872ee4aef5ac6e3b313"} Oct 11 01:11:51 crc kubenswrapper[4743]: I1011 01:11:51.579107 
4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Oct 11 01:11:51 crc kubenswrapper[4743]: I1011 01:11:51.587578 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Oct 11 01:11:51 crc kubenswrapper[4743]: I1011 01:11:51.700183 4743 generic.go:334] "Generic (PLEG): container finished" podID="3333bf90-e88b-455c-9719-60c0c49b83fe" containerID="77945c6fa9449bd57597a49f83be1a34d5a578743bf8d18b705387926f910e3d" exitCode=0 Oct 11 01:11:51 crc kubenswrapper[4743]: I1011 01:11:51.700694 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xqz2q" event={"ID":"3333bf90-e88b-455c-9719-60c0c49b83fe","Type":"ContainerDied","Data":"77945c6fa9449bd57597a49f83be1a34d5a578743bf8d18b705387926f910e3d"} Oct 11 01:11:51 crc kubenswrapper[4743]: I1011 01:11:51.705540 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Oct 11 01:11:52 crc kubenswrapper[4743]: I1011 01:11:52.845394 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-33f9-account-create-zvc5d" Oct 11 01:11:52 crc kubenswrapper[4743]: I1011 01:11:52.855330 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-11c8-account-create-n8n9x" Oct 11 01:11:52 crc kubenswrapper[4743]: I1011 01:11:52.880503 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-70e4-account-create-v8fz8" Oct 11 01:11:53 crc kubenswrapper[4743]: I1011 01:11:53.003532 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6pzcn\" (UniqueName: \"kubernetes.io/projected/227021ae-b786-43b0-b61b-c3317fd4ee34-kube-api-access-6pzcn\") pod \"227021ae-b786-43b0-b61b-c3317fd4ee34\" (UID: \"227021ae-b786-43b0-b61b-c3317fd4ee34\") " Oct 11 01:11:53 crc kubenswrapper[4743]: I1011 01:11:53.003681 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24ktr\" (UniqueName: \"kubernetes.io/projected/3e5b2b21-68ca-41b3-96f4-f3dabc8cb94d-kube-api-access-24ktr\") pod \"3e5b2b21-68ca-41b3-96f4-f3dabc8cb94d\" (UID: \"3e5b2b21-68ca-41b3-96f4-f3dabc8cb94d\") " Oct 11 01:11:53 crc kubenswrapper[4743]: I1011 01:11:53.003738 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgg6x\" (UniqueName: \"kubernetes.io/projected/d6c0d49c-5d16-433e-aba5-de2c1b0746bf-kube-api-access-sgg6x\") pod \"d6c0d49c-5d16-433e-aba5-de2c1b0746bf\" (UID: \"d6c0d49c-5d16-433e-aba5-de2c1b0746bf\") " Oct 11 01:11:53 crc kubenswrapper[4743]: I1011 01:11:53.014468 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/227021ae-b786-43b0-b61b-c3317fd4ee34-kube-api-access-6pzcn" (OuterVolumeSpecName: "kube-api-access-6pzcn") pod "227021ae-b786-43b0-b61b-c3317fd4ee34" (UID: "227021ae-b786-43b0-b61b-c3317fd4ee34"). InnerVolumeSpecName "kube-api-access-6pzcn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:11:53 crc kubenswrapper[4743]: I1011 01:11:53.014793 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6c0d49c-5d16-433e-aba5-de2c1b0746bf-kube-api-access-sgg6x" (OuterVolumeSpecName: "kube-api-access-sgg6x") pod "d6c0d49c-5d16-433e-aba5-de2c1b0746bf" (UID: "d6c0d49c-5d16-433e-aba5-de2c1b0746bf"). InnerVolumeSpecName "kube-api-access-sgg6x". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:11:53 crc kubenswrapper[4743]: I1011 01:11:53.036223 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e5b2b21-68ca-41b3-96f4-f3dabc8cb94d-kube-api-access-24ktr" (OuterVolumeSpecName: "kube-api-access-24ktr") pod "3e5b2b21-68ca-41b3-96f4-f3dabc8cb94d" (UID: "3e5b2b21-68ca-41b3-96f4-f3dabc8cb94d"). InnerVolumeSpecName "kube-api-access-24ktr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:11:53 crc kubenswrapper[4743]: I1011 01:11:53.107006 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24ktr\" (UniqueName: \"kubernetes.io/projected/3e5b2b21-68ca-41b3-96f4-f3dabc8cb94d-kube-api-access-24ktr\") on node \"crc\" DevicePath \"\"" Oct 11 01:11:53 crc kubenswrapper[4743]: I1011 01:11:53.107035 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sgg6x\" (UniqueName: \"kubernetes.io/projected/d6c0d49c-5d16-433e-aba5-de2c1b0746bf-kube-api-access-sgg6x\") on node \"crc\" DevicePath \"\"" Oct 11 01:11:53 crc kubenswrapper[4743]: I1011 01:11:53.107045 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6pzcn\" (UniqueName: \"kubernetes.io/projected/227021ae-b786-43b0-b61b-c3317fd4ee34-kube-api-access-6pzcn\") on node \"crc\" DevicePath \"\"" Oct 11 01:11:53 crc kubenswrapper[4743]: I1011 01:11:53.739132 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-11c8-account-create-n8n9x" 
event={"ID":"d6c0d49c-5d16-433e-aba5-de2c1b0746bf","Type":"ContainerDied","Data":"f09e73fedd18e60a57382a9f7fad4d8b14c3a26d25815872ee4aef5ac6e3b313"} Oct 11 01:11:53 crc kubenswrapper[4743]: I1011 01:11:53.739418 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f09e73fedd18e60a57382a9f7fad4d8b14c3a26d25815872ee4aef5ac6e3b313" Oct 11 01:11:53 crc kubenswrapper[4743]: I1011 01:11:53.739524 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-11c8-account-create-n8n9x" Oct 11 01:11:53 crc kubenswrapper[4743]: I1011 01:11:53.744045 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-33f9-account-create-zvc5d" event={"ID":"227021ae-b786-43b0-b61b-c3317fd4ee34","Type":"ContainerDied","Data":"df2486e4dd52add9c5f908622e2061c19cc813be704dbcf998136f2be2cf77e1"} Oct 11 01:11:53 crc kubenswrapper[4743]: I1011 01:11:53.744085 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df2486e4dd52add9c5f908622e2061c19cc813be704dbcf998136f2be2cf77e1" Oct 11 01:11:53 crc kubenswrapper[4743]: I1011 01:11:53.744150 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-33f9-account-create-zvc5d" Oct 11 01:11:53 crc kubenswrapper[4743]: I1011 01:11:53.752789 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-70e4-account-create-v8fz8" event={"ID":"3e5b2b21-68ca-41b3-96f4-f3dabc8cb94d","Type":"ContainerDied","Data":"899b1ed883cef79d1e9012a21ffbd0575651b51a7ac22c35fab591dc15c1335b"} Oct 11 01:11:53 crc kubenswrapper[4743]: I1011 01:11:53.752828 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="899b1ed883cef79d1e9012a21ffbd0575651b51a7ac22c35fab591dc15c1335b" Oct 11 01:11:53 crc kubenswrapper[4743]: I1011 01:11:53.752865 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-70e4-account-create-v8fz8" Oct 11 01:11:56 crc kubenswrapper[4743]: I1011 01:11:56.432075 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-56df8fb6b7-m99m4" Oct 11 01:11:56 crc kubenswrapper[4743]: I1011 01:11:56.512186 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-zsfms"] Oct 11 01:11:56 crc kubenswrapper[4743]: I1011 01:11:56.512483 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b8fbc5445-zsfms" podUID="6d320933-71b0-4dd8-abe3-a36a0ab9aa79" containerName="dnsmasq-dns" containerID="cri-o://69da59d1eb2dce31de3404affb7b7abe455e1698feead3f4f007cd3c2e08a6eb" gracePeriod=10 Oct 11 01:11:56 crc kubenswrapper[4743]: I1011 01:11:56.555827 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xqz2q" Oct 11 01:11:56 crc kubenswrapper[4743]: I1011 01:11:56.578024 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-1f1d-account-create-8q67r" Oct 11 01:11:56 crc kubenswrapper[4743]: I1011 01:11:56.692787 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3333bf90-e88b-455c-9719-60c0c49b83fe-combined-ca-bundle\") pod \"3333bf90-e88b-455c-9719-60c0c49b83fe\" (UID: \"3333bf90-e88b-455c-9719-60c0c49b83fe\") " Oct 11 01:11:56 crc kubenswrapper[4743]: I1011 01:11:56.693181 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3333bf90-e88b-455c-9719-60c0c49b83fe-fernet-keys\") pod \"3333bf90-e88b-455c-9719-60c0c49b83fe\" (UID: \"3333bf90-e88b-455c-9719-60c0c49b83fe\") " Oct 11 01:11:56 crc kubenswrapper[4743]: I1011 01:11:56.693209 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3333bf90-e88b-455c-9719-60c0c49b83fe-config-data\") pod \"3333bf90-e88b-455c-9719-60c0c49b83fe\" (UID: \"3333bf90-e88b-455c-9719-60c0c49b83fe\") " Oct 11 01:11:56 crc kubenswrapper[4743]: I1011 01:11:56.693228 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kj24b\" (UniqueName: \"kubernetes.io/projected/3333bf90-e88b-455c-9719-60c0c49b83fe-kube-api-access-kj24b\") pod \"3333bf90-e88b-455c-9719-60c0c49b83fe\" (UID: \"3333bf90-e88b-455c-9719-60c0c49b83fe\") " Oct 11 01:11:56 crc kubenswrapper[4743]: I1011 01:11:56.693341 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3333bf90-e88b-455c-9719-60c0c49b83fe-credential-keys\") pod \"3333bf90-e88b-455c-9719-60c0c49b83fe\" (UID: \"3333bf90-e88b-455c-9719-60c0c49b83fe\") " Oct 11 01:11:56 crc kubenswrapper[4743]: I1011 01:11:56.693370 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/3333bf90-e88b-455c-9719-60c0c49b83fe-scripts\") pod \"3333bf90-e88b-455c-9719-60c0c49b83fe\" (UID: \"3333bf90-e88b-455c-9719-60c0c49b83fe\") " Oct 11 01:11:56 crc kubenswrapper[4743]: I1011 01:11:56.693453 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgr7g\" (UniqueName: \"kubernetes.io/projected/ad7c265c-766a-446c-863d-e6137abdd0e9-kube-api-access-tgr7g\") pod \"ad7c265c-766a-446c-863d-e6137abdd0e9\" (UID: \"ad7c265c-766a-446c-863d-e6137abdd0e9\") " Oct 11 01:11:56 crc kubenswrapper[4743]: I1011 01:11:56.697218 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3333bf90-e88b-455c-9719-60c0c49b83fe-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "3333bf90-e88b-455c-9719-60c0c49b83fe" (UID: "3333bf90-e88b-455c-9719-60c0c49b83fe"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:11:56 crc kubenswrapper[4743]: I1011 01:11:56.697618 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad7c265c-766a-446c-863d-e6137abdd0e9-kube-api-access-tgr7g" (OuterVolumeSpecName: "kube-api-access-tgr7g") pod "ad7c265c-766a-446c-863d-e6137abdd0e9" (UID: "ad7c265c-766a-446c-863d-e6137abdd0e9"). InnerVolumeSpecName "kube-api-access-tgr7g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:11:56 crc kubenswrapper[4743]: I1011 01:11:56.698382 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3333bf90-e88b-455c-9719-60c0c49b83fe-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "3333bf90-e88b-455c-9719-60c0c49b83fe" (UID: "3333bf90-e88b-455c-9719-60c0c49b83fe"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:11:56 crc kubenswrapper[4743]: I1011 01:11:56.698750 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3333bf90-e88b-455c-9719-60c0c49b83fe-kube-api-access-kj24b" (OuterVolumeSpecName: "kube-api-access-kj24b") pod "3333bf90-e88b-455c-9719-60c0c49b83fe" (UID: "3333bf90-e88b-455c-9719-60c0c49b83fe"). InnerVolumeSpecName "kube-api-access-kj24b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:11:56 crc kubenswrapper[4743]: I1011 01:11:56.701285 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3333bf90-e88b-455c-9719-60c0c49b83fe-scripts" (OuterVolumeSpecName: "scripts") pod "3333bf90-e88b-455c-9719-60c0c49b83fe" (UID: "3333bf90-e88b-455c-9719-60c0c49b83fe"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:11:56 crc kubenswrapper[4743]: I1011 01:11:56.724588 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3333bf90-e88b-455c-9719-60c0c49b83fe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3333bf90-e88b-455c-9719-60c0c49b83fe" (UID: "3333bf90-e88b-455c-9719-60c0c49b83fe"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:11:56 crc kubenswrapper[4743]: I1011 01:11:56.731325 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3333bf90-e88b-455c-9719-60c0c49b83fe-config-data" (OuterVolumeSpecName: "config-data") pod "3333bf90-e88b-455c-9719-60c0c49b83fe" (UID: "3333bf90-e88b-455c-9719-60c0c49b83fe"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:11:56 crc kubenswrapper[4743]: I1011 01:11:56.795359 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgr7g\" (UniqueName: \"kubernetes.io/projected/ad7c265c-766a-446c-863d-e6137abdd0e9-kube-api-access-tgr7g\") on node \"crc\" DevicePath \"\"" Oct 11 01:11:56 crc kubenswrapper[4743]: I1011 01:11:56.795388 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3333bf90-e88b-455c-9719-60c0c49b83fe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 01:11:56 crc kubenswrapper[4743]: I1011 01:11:56.795398 4743 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3333bf90-e88b-455c-9719-60c0c49b83fe-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 11 01:11:56 crc kubenswrapper[4743]: I1011 01:11:56.795409 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3333bf90-e88b-455c-9719-60c0c49b83fe-config-data\") on node \"crc\" DevicePath \"\"" Oct 11 01:11:56 crc kubenswrapper[4743]: I1011 01:11:56.795432 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kj24b\" (UniqueName: \"kubernetes.io/projected/3333bf90-e88b-455c-9719-60c0c49b83fe-kube-api-access-kj24b\") on node \"crc\" DevicePath \"\"" Oct 11 01:11:56 crc kubenswrapper[4743]: I1011 01:11:56.795441 4743 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3333bf90-e88b-455c-9719-60c0c49b83fe-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 11 01:11:56 crc kubenswrapper[4743]: I1011 01:11:56.795449 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3333bf90-e88b-455c-9719-60c0c49b83fe-scripts\") on node \"crc\" DevicePath \"\"" Oct 11 01:11:56 crc kubenswrapper[4743]: I1011 01:11:56.805530 4743 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-z4tv4" event={"ID":"7e0bb9e3-337b-443b-baa8-8ea69d351ea1","Type":"ContainerStarted","Data":"2bfe9875a178be1418db1466a53073392e86b327aca29ae3ce6d9aa6667e8240"} Oct 11 01:11:56 crc kubenswrapper[4743]: I1011 01:11:56.819064 4743 generic.go:334] "Generic (PLEG): container finished" podID="6d320933-71b0-4dd8-abe3-a36a0ab9aa79" containerID="69da59d1eb2dce31de3404affb7b7abe455e1698feead3f4f007cd3c2e08a6eb" exitCode=0 Oct 11 01:11:56 crc kubenswrapper[4743]: I1011 01:11:56.819182 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-zsfms" event={"ID":"6d320933-71b0-4dd8-abe3-a36a0ab9aa79","Type":"ContainerDied","Data":"69da59d1eb2dce31de3404affb7b7abe455e1698feead3f4f007cd3c2e08a6eb"} Oct 11 01:11:56 crc kubenswrapper[4743]: I1011 01:11:56.820883 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-1f1d-account-create-8q67r" event={"ID":"ad7c265c-766a-446c-863d-e6137abdd0e9","Type":"ContainerDied","Data":"ab9c7ca7c58a30a1c62d97f0efeac8df52fc777fd929123f66c0c78fb32640e0"} Oct 11 01:11:56 crc kubenswrapper[4743]: I1011 01:11:56.820935 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab9c7ca7c58a30a1c62d97f0efeac8df52fc777fd929123f66c0c78fb32640e0" Oct 11 01:11:56 crc kubenswrapper[4743]: I1011 01:11:56.820987 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-1f1d-account-create-8q67r" Oct 11 01:11:56 crc kubenswrapper[4743]: I1011 01:11:56.826880 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-z4tv4" podStartSLOduration=1.7205408709999999 podStartE2EDuration="10.826834474s" podCreationTimestamp="2025-10-11 01:11:46 +0000 UTC" firstStartedPulling="2025-10-11 01:11:47.222143668 +0000 UTC m=+1201.875124065" lastFinishedPulling="2025-10-11 01:11:56.328437231 +0000 UTC m=+1210.981417668" observedRunningTime="2025-10-11 01:11:56.824349841 +0000 UTC m=+1211.477330238" watchObservedRunningTime="2025-10-11 01:11:56.826834474 +0000 UTC m=+1211.479814871" Oct 11 01:11:56 crc kubenswrapper[4743]: I1011 01:11:56.833326 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xqz2q" event={"ID":"3333bf90-e88b-455c-9719-60c0c49b83fe","Type":"ContainerDied","Data":"4584a51687c5b1f909c0a273f081a540ed52054075bcb179d5f33ac61ff6952b"} Oct 11 01:11:56 crc kubenswrapper[4743]: I1011 01:11:56.833368 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4584a51687c5b1f909c0a273f081a540ed52054075bcb179d5f33ac61ff6952b" Oct 11 01:11:56 crc kubenswrapper[4743]: I1011 01:11:56.833420 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xqz2q" Oct 11 01:11:56 crc kubenswrapper[4743]: I1011 01:11:56.842648 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"897a231b-400d-4d3b-b6eb-8bcee1ad5d1d","Type":"ContainerStarted","Data":"5e31815cc6bb002a54d9b48fcc0bd3a79d54a858afc2f6c912547a5573fc65ba"} Oct 11 01:11:56 crc kubenswrapper[4743]: I1011 01:11:56.983328 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-zsfms" Oct 11 01:11:57 crc kubenswrapper[4743]: I1011 01:11:57.099722 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6d320933-71b0-4dd8-abe3-a36a0ab9aa79-ovsdbserver-nb\") pod \"6d320933-71b0-4dd8-abe3-a36a0ab9aa79\" (UID: \"6d320933-71b0-4dd8-abe3-a36a0ab9aa79\") " Oct 11 01:11:57 crc kubenswrapper[4743]: I1011 01:11:57.099814 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6d320933-71b0-4dd8-abe3-a36a0ab9aa79-ovsdbserver-sb\") pod \"6d320933-71b0-4dd8-abe3-a36a0ab9aa79\" (UID: \"6d320933-71b0-4dd8-abe3-a36a0ab9aa79\") " Oct 11 01:11:57 crc kubenswrapper[4743]: I1011 01:11:57.099890 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tg42l\" (UniqueName: \"kubernetes.io/projected/6d320933-71b0-4dd8-abe3-a36a0ab9aa79-kube-api-access-tg42l\") pod \"6d320933-71b0-4dd8-abe3-a36a0ab9aa79\" (UID: \"6d320933-71b0-4dd8-abe3-a36a0ab9aa79\") " Oct 11 01:11:57 crc kubenswrapper[4743]: I1011 01:11:57.099981 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d320933-71b0-4dd8-abe3-a36a0ab9aa79-config\") pod \"6d320933-71b0-4dd8-abe3-a36a0ab9aa79\" (UID: \"6d320933-71b0-4dd8-abe3-a36a0ab9aa79\") " Oct 11 01:11:57 crc kubenswrapper[4743]: I1011 01:11:57.100008 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6d320933-71b0-4dd8-abe3-a36a0ab9aa79-dns-svc\") pod \"6d320933-71b0-4dd8-abe3-a36a0ab9aa79\" (UID: \"6d320933-71b0-4dd8-abe3-a36a0ab9aa79\") " Oct 11 01:11:57 crc kubenswrapper[4743]: I1011 01:11:57.115064 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/6d320933-71b0-4dd8-abe3-a36a0ab9aa79-kube-api-access-tg42l" (OuterVolumeSpecName: "kube-api-access-tg42l") pod "6d320933-71b0-4dd8-abe3-a36a0ab9aa79" (UID: "6d320933-71b0-4dd8-abe3-a36a0ab9aa79"). InnerVolumeSpecName "kube-api-access-tg42l". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:11:57 crc kubenswrapper[4743]: I1011 01:11:57.157534 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d320933-71b0-4dd8-abe3-a36a0ab9aa79-config" (OuterVolumeSpecName: "config") pod "6d320933-71b0-4dd8-abe3-a36a0ab9aa79" (UID: "6d320933-71b0-4dd8-abe3-a36a0ab9aa79"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:11:57 crc kubenswrapper[4743]: I1011 01:11:57.158895 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d320933-71b0-4dd8-abe3-a36a0ab9aa79-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6d320933-71b0-4dd8-abe3-a36a0ab9aa79" (UID: "6d320933-71b0-4dd8-abe3-a36a0ab9aa79"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:11:57 crc kubenswrapper[4743]: I1011 01:11:57.159045 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d320933-71b0-4dd8-abe3-a36a0ab9aa79-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6d320933-71b0-4dd8-abe3-a36a0ab9aa79" (UID: "6d320933-71b0-4dd8-abe3-a36a0ab9aa79"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:11:57 crc kubenswrapper[4743]: I1011 01:11:57.162264 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d320933-71b0-4dd8-abe3-a36a0ab9aa79-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6d320933-71b0-4dd8-abe3-a36a0ab9aa79" (UID: "6d320933-71b0-4dd8-abe3-a36a0ab9aa79"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:11:57 crc kubenswrapper[4743]: I1011 01:11:57.202194 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tg42l\" (UniqueName: \"kubernetes.io/projected/6d320933-71b0-4dd8-abe3-a36a0ab9aa79-kube-api-access-tg42l\") on node \"crc\" DevicePath \"\"" Oct 11 01:11:57 crc kubenswrapper[4743]: I1011 01:11:57.202225 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d320933-71b0-4dd8-abe3-a36a0ab9aa79-config\") on node \"crc\" DevicePath \"\"" Oct 11 01:11:57 crc kubenswrapper[4743]: I1011 01:11:57.202237 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6d320933-71b0-4dd8-abe3-a36a0ab9aa79-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 11 01:11:57 crc kubenswrapper[4743]: I1011 01:11:57.202245 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6d320933-71b0-4dd8-abe3-a36a0ab9aa79-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 11 01:11:57 crc kubenswrapper[4743]: I1011 01:11:57.202252 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6d320933-71b0-4dd8-abe3-a36a0ab9aa79-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 11 01:11:57 crc kubenswrapper[4743]: I1011 01:11:57.671151 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-xqz2q"] Oct 11 01:11:57 crc kubenswrapper[4743]: I1011 01:11:57.680301 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-xqz2q"] Oct 11 01:11:57 crc kubenswrapper[4743]: I1011 01:11:57.744125 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-xq7tg"] Oct 11 01:11:57 crc kubenswrapper[4743]: E1011 01:11:57.744578 4743 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3e5b2b21-68ca-41b3-96f4-f3dabc8cb94d" containerName="mariadb-account-create" Oct 11 01:11:57 crc kubenswrapper[4743]: I1011 01:11:57.744601 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e5b2b21-68ca-41b3-96f4-f3dabc8cb94d" containerName="mariadb-account-create" Oct 11 01:11:57 crc kubenswrapper[4743]: E1011 01:11:57.744635 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d320933-71b0-4dd8-abe3-a36a0ab9aa79" containerName="dnsmasq-dns" Oct 11 01:11:57 crc kubenswrapper[4743]: I1011 01:11:57.744645 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d320933-71b0-4dd8-abe3-a36a0ab9aa79" containerName="dnsmasq-dns" Oct 11 01:11:57 crc kubenswrapper[4743]: E1011 01:11:57.744657 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d320933-71b0-4dd8-abe3-a36a0ab9aa79" containerName="init" Oct 11 01:11:57 crc kubenswrapper[4743]: I1011 01:11:57.744667 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d320933-71b0-4dd8-abe3-a36a0ab9aa79" containerName="init" Oct 11 01:11:57 crc kubenswrapper[4743]: E1011 01:11:57.744695 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6c0d49c-5d16-433e-aba5-de2c1b0746bf" containerName="mariadb-account-create" Oct 11 01:11:57 crc kubenswrapper[4743]: I1011 01:11:57.744705 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6c0d49c-5d16-433e-aba5-de2c1b0746bf" containerName="mariadb-account-create" Oct 11 01:11:57 crc kubenswrapper[4743]: E1011 01:11:57.744718 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad7c265c-766a-446c-863d-e6137abdd0e9" containerName="mariadb-account-create" Oct 11 01:11:57 crc kubenswrapper[4743]: I1011 01:11:57.744725 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad7c265c-766a-446c-863d-e6137abdd0e9" containerName="mariadb-account-create" Oct 11 01:11:57 crc kubenswrapper[4743]: E1011 01:11:57.744742 4743 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="227021ae-b786-43b0-b61b-c3317fd4ee34" containerName="mariadb-account-create" Oct 11 01:11:57 crc kubenswrapper[4743]: I1011 01:11:57.744749 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="227021ae-b786-43b0-b61b-c3317fd4ee34" containerName="mariadb-account-create" Oct 11 01:11:57 crc kubenswrapper[4743]: E1011 01:11:57.744762 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3333bf90-e88b-455c-9719-60c0c49b83fe" containerName="keystone-bootstrap" Oct 11 01:11:57 crc kubenswrapper[4743]: I1011 01:11:57.744770 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="3333bf90-e88b-455c-9719-60c0c49b83fe" containerName="keystone-bootstrap" Oct 11 01:11:57 crc kubenswrapper[4743]: I1011 01:11:57.744985 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d320933-71b0-4dd8-abe3-a36a0ab9aa79" containerName="dnsmasq-dns" Oct 11 01:11:57 crc kubenswrapper[4743]: I1011 01:11:57.745000 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6c0d49c-5d16-433e-aba5-de2c1b0746bf" containerName="mariadb-account-create" Oct 11 01:11:57 crc kubenswrapper[4743]: I1011 01:11:57.745010 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad7c265c-766a-446c-863d-e6137abdd0e9" containerName="mariadb-account-create" Oct 11 01:11:57 crc kubenswrapper[4743]: I1011 01:11:57.745030 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="3333bf90-e88b-455c-9719-60c0c49b83fe" containerName="keystone-bootstrap" Oct 11 01:11:57 crc kubenswrapper[4743]: I1011 01:11:57.745046 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="227021ae-b786-43b0-b61b-c3317fd4ee34" containerName="mariadb-account-create" Oct 11 01:11:57 crc kubenswrapper[4743]: I1011 01:11:57.745060 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e5b2b21-68ca-41b3-96f4-f3dabc8cb94d" containerName="mariadb-account-create" Oct 11 01:11:57 crc kubenswrapper[4743]: I1011 01:11:57.745883 4743 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xq7tg" Oct 11 01:11:57 crc kubenswrapper[4743]: I1011 01:11:57.747840 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 11 01:11:57 crc kubenswrapper[4743]: I1011 01:11:57.748728 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 11 01:11:57 crc kubenswrapper[4743]: I1011 01:11:57.748790 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 11 01:11:57 crc kubenswrapper[4743]: I1011 01:11:57.748951 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-4qpmq" Oct 11 01:11:57 crc kubenswrapper[4743]: I1011 01:11:57.768265 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-xq7tg"] Oct 11 01:11:57 crc kubenswrapper[4743]: I1011 01:11:57.857416 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-zsfms" Oct 11 01:11:57 crc kubenswrapper[4743]: I1011 01:11:57.858221 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-zsfms" event={"ID":"6d320933-71b0-4dd8-abe3-a36a0ab9aa79","Type":"ContainerDied","Data":"f5f7f6a2692056529c4fa8cb328abf623877410cc03169a23f53618d967ac8e5"} Oct 11 01:11:57 crc kubenswrapper[4743]: I1011 01:11:57.858260 4743 scope.go:117] "RemoveContainer" containerID="69da59d1eb2dce31de3404affb7b7abe455e1698feead3f4f007cd3c2e08a6eb" Oct 11 01:11:57 crc kubenswrapper[4743]: I1011 01:11:57.890589 4743 scope.go:117] "RemoveContainer" containerID="4fa13e948631dee90c3ba0e53d0f99c64ccb6434dae2d7716fe3d6995d728301" Oct 11 01:11:57 crc kubenswrapper[4743]: I1011 01:11:57.896824 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-zsfms"] Oct 11 01:11:57 crc kubenswrapper[4743]: I1011 01:11:57.908160 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-zsfms"] Oct 11 01:11:57 crc kubenswrapper[4743]: I1011 01:11:57.923147 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bca21196-09b5-4f5a-b690-19afdf3318b4-config-data\") pod \"keystone-bootstrap-xq7tg\" (UID: \"bca21196-09b5-4f5a-b690-19afdf3318b4\") " pod="openstack/keystone-bootstrap-xq7tg" Oct 11 01:11:57 crc kubenswrapper[4743]: I1011 01:11:57.923215 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bca21196-09b5-4f5a-b690-19afdf3318b4-combined-ca-bundle\") pod \"keystone-bootstrap-xq7tg\" (UID: \"bca21196-09b5-4f5a-b690-19afdf3318b4\") " pod="openstack/keystone-bootstrap-xq7tg" Oct 11 01:11:57 crc kubenswrapper[4743]: I1011 01:11:57.923252 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bca21196-09b5-4f5a-b690-19afdf3318b4-credential-keys\") pod \"keystone-bootstrap-xq7tg\" (UID: \"bca21196-09b5-4f5a-b690-19afdf3318b4\") " pod="openstack/keystone-bootstrap-xq7tg" Oct 11 01:11:57 crc kubenswrapper[4743]: I1011 01:11:57.923319 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bca21196-09b5-4f5a-b690-19afdf3318b4-scripts\") pod \"keystone-bootstrap-xq7tg\" (UID: \"bca21196-09b5-4f5a-b690-19afdf3318b4\") " pod="openstack/keystone-bootstrap-xq7tg" Oct 11 01:11:57 crc kubenswrapper[4743]: I1011 01:11:57.923402 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bbj4\" (UniqueName: \"kubernetes.io/projected/bca21196-09b5-4f5a-b690-19afdf3318b4-kube-api-access-5bbj4\") pod \"keystone-bootstrap-xq7tg\" (UID: \"bca21196-09b5-4f5a-b690-19afdf3318b4\") " pod="openstack/keystone-bootstrap-xq7tg" Oct 11 01:11:57 crc kubenswrapper[4743]: I1011 01:11:57.923449 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bca21196-09b5-4f5a-b690-19afdf3318b4-fernet-keys\") pod \"keystone-bootstrap-xq7tg\" (UID: \"bca21196-09b5-4f5a-b690-19afdf3318b4\") " pod="openstack/keystone-bootstrap-xq7tg" Oct 11 01:11:58 crc kubenswrapper[4743]: I1011 01:11:58.025474 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bca21196-09b5-4f5a-b690-19afdf3318b4-scripts\") pod \"keystone-bootstrap-xq7tg\" (UID: \"bca21196-09b5-4f5a-b690-19afdf3318b4\") " pod="openstack/keystone-bootstrap-xq7tg" Oct 11 01:11:58 crc kubenswrapper[4743]: I1011 01:11:58.025711 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-5bbj4\" (UniqueName: \"kubernetes.io/projected/bca21196-09b5-4f5a-b690-19afdf3318b4-kube-api-access-5bbj4\") pod \"keystone-bootstrap-xq7tg\" (UID: \"bca21196-09b5-4f5a-b690-19afdf3318b4\") " pod="openstack/keystone-bootstrap-xq7tg" Oct 11 01:11:58 crc kubenswrapper[4743]: I1011 01:11:58.025780 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bca21196-09b5-4f5a-b690-19afdf3318b4-fernet-keys\") pod \"keystone-bootstrap-xq7tg\" (UID: \"bca21196-09b5-4f5a-b690-19afdf3318b4\") " pod="openstack/keystone-bootstrap-xq7tg" Oct 11 01:11:58 crc kubenswrapper[4743]: I1011 01:11:58.025916 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bca21196-09b5-4f5a-b690-19afdf3318b4-config-data\") pod \"keystone-bootstrap-xq7tg\" (UID: \"bca21196-09b5-4f5a-b690-19afdf3318b4\") " pod="openstack/keystone-bootstrap-xq7tg" Oct 11 01:11:58 crc kubenswrapper[4743]: I1011 01:11:58.025959 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bca21196-09b5-4f5a-b690-19afdf3318b4-combined-ca-bundle\") pod \"keystone-bootstrap-xq7tg\" (UID: \"bca21196-09b5-4f5a-b690-19afdf3318b4\") " pod="openstack/keystone-bootstrap-xq7tg" Oct 11 01:11:58 crc kubenswrapper[4743]: I1011 01:11:58.026014 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bca21196-09b5-4f5a-b690-19afdf3318b4-credential-keys\") pod \"keystone-bootstrap-xq7tg\" (UID: \"bca21196-09b5-4f5a-b690-19afdf3318b4\") " pod="openstack/keystone-bootstrap-xq7tg" Oct 11 01:11:58 crc kubenswrapper[4743]: I1011 01:11:58.035256 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bca21196-09b5-4f5a-b690-19afdf3318b4-scripts\") 
pod \"keystone-bootstrap-xq7tg\" (UID: \"bca21196-09b5-4f5a-b690-19afdf3318b4\") " pod="openstack/keystone-bootstrap-xq7tg" Oct 11 01:11:58 crc kubenswrapper[4743]: I1011 01:11:58.038276 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bca21196-09b5-4f5a-b690-19afdf3318b4-config-data\") pod \"keystone-bootstrap-xq7tg\" (UID: \"bca21196-09b5-4f5a-b690-19afdf3318b4\") " pod="openstack/keystone-bootstrap-xq7tg" Oct 11 01:11:58 crc kubenswrapper[4743]: I1011 01:11:58.038628 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bca21196-09b5-4f5a-b690-19afdf3318b4-credential-keys\") pod \"keystone-bootstrap-xq7tg\" (UID: \"bca21196-09b5-4f5a-b690-19afdf3318b4\") " pod="openstack/keystone-bootstrap-xq7tg" Oct 11 01:11:58 crc kubenswrapper[4743]: I1011 01:11:58.043436 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bca21196-09b5-4f5a-b690-19afdf3318b4-fernet-keys\") pod \"keystone-bootstrap-xq7tg\" (UID: \"bca21196-09b5-4f5a-b690-19afdf3318b4\") " pod="openstack/keystone-bootstrap-xq7tg" Oct 11 01:11:58 crc kubenswrapper[4743]: I1011 01:11:58.043689 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bca21196-09b5-4f5a-b690-19afdf3318b4-combined-ca-bundle\") pod \"keystone-bootstrap-xq7tg\" (UID: \"bca21196-09b5-4f5a-b690-19afdf3318b4\") " pod="openstack/keystone-bootstrap-xq7tg" Oct 11 01:11:58 crc kubenswrapper[4743]: I1011 01:11:58.047258 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bbj4\" (UniqueName: \"kubernetes.io/projected/bca21196-09b5-4f5a-b690-19afdf3318b4-kube-api-access-5bbj4\") pod \"keystone-bootstrap-xq7tg\" (UID: \"bca21196-09b5-4f5a-b690-19afdf3318b4\") " pod="openstack/keystone-bootstrap-xq7tg" Oct 11 
01:11:58 crc kubenswrapper[4743]: I1011 01:11:58.068718 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xq7tg" Oct 11 01:11:58 crc kubenswrapper[4743]: I1011 01:11:58.108206 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3333bf90-e88b-455c-9719-60c0c49b83fe" path="/var/lib/kubelet/pods/3333bf90-e88b-455c-9719-60c0c49b83fe/volumes" Oct 11 01:11:58 crc kubenswrapper[4743]: I1011 01:11:58.116541 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d320933-71b0-4dd8-abe3-a36a0ab9aa79" path="/var/lib/kubelet/pods/6d320933-71b0-4dd8-abe3-a36a0ab9aa79/volumes" Oct 11 01:11:58 crc kubenswrapper[4743]: I1011 01:11:58.556689 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-xq7tg"] Oct 11 01:11:58 crc kubenswrapper[4743]: I1011 01:11:58.690378 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-f4rpt"] Oct 11 01:11:58 crc kubenswrapper[4743]: I1011 01:11:58.692395 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-f4rpt" Oct 11 01:11:58 crc kubenswrapper[4743]: I1011 01:11:58.696239 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Oct 11 01:11:58 crc kubenswrapper[4743]: I1011 01:11:58.697305 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-8n2gf" Oct 11 01:11:58 crc kubenswrapper[4743]: I1011 01:11:58.699029 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-f4rpt"] Oct 11 01:11:58 crc kubenswrapper[4743]: I1011 01:11:58.847528 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5bpr\" (UniqueName: \"kubernetes.io/projected/476a4c6e-ddae-4974-a899-78a8f1ee973d-kube-api-access-q5bpr\") pod \"heat-db-sync-f4rpt\" (UID: \"476a4c6e-ddae-4974-a899-78a8f1ee973d\") " pod="openstack/heat-db-sync-f4rpt" Oct 11 01:11:58 crc kubenswrapper[4743]: I1011 01:11:58.847601 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/476a4c6e-ddae-4974-a899-78a8f1ee973d-config-data\") pod \"heat-db-sync-f4rpt\" (UID: \"476a4c6e-ddae-4974-a899-78a8f1ee973d\") " pod="openstack/heat-db-sync-f4rpt" Oct 11 01:11:58 crc kubenswrapper[4743]: I1011 01:11:58.847714 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/476a4c6e-ddae-4974-a899-78a8f1ee973d-combined-ca-bundle\") pod \"heat-db-sync-f4rpt\" (UID: \"476a4c6e-ddae-4974-a899-78a8f1ee973d\") " pod="openstack/heat-db-sync-f4rpt" Oct 11 01:11:58 crc kubenswrapper[4743]: I1011 01:11:58.877987 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xq7tg" 
event={"ID":"bca21196-09b5-4f5a-b690-19afdf3318b4","Type":"ContainerStarted","Data":"6cb7f6c66a09d4553d55d7d29dab74982ef213dba0017818e02949cad0b45d4e"} Oct 11 01:11:58 crc kubenswrapper[4743]: I1011 01:11:58.949264 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5bpr\" (UniqueName: \"kubernetes.io/projected/476a4c6e-ddae-4974-a899-78a8f1ee973d-kube-api-access-q5bpr\") pod \"heat-db-sync-f4rpt\" (UID: \"476a4c6e-ddae-4974-a899-78a8f1ee973d\") " pod="openstack/heat-db-sync-f4rpt" Oct 11 01:11:58 crc kubenswrapper[4743]: I1011 01:11:58.949614 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/476a4c6e-ddae-4974-a899-78a8f1ee973d-config-data\") pod \"heat-db-sync-f4rpt\" (UID: \"476a4c6e-ddae-4974-a899-78a8f1ee973d\") " pod="openstack/heat-db-sync-f4rpt" Oct 11 01:11:58 crc kubenswrapper[4743]: I1011 01:11:58.949688 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/476a4c6e-ddae-4974-a899-78a8f1ee973d-combined-ca-bundle\") pod \"heat-db-sync-f4rpt\" (UID: \"476a4c6e-ddae-4974-a899-78a8f1ee973d\") " pod="openstack/heat-db-sync-f4rpt" Oct 11 01:11:58 crc kubenswrapper[4743]: I1011 01:11:58.956727 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/476a4c6e-ddae-4974-a899-78a8f1ee973d-config-data\") pod \"heat-db-sync-f4rpt\" (UID: \"476a4c6e-ddae-4974-a899-78a8f1ee973d\") " pod="openstack/heat-db-sync-f4rpt" Oct 11 01:11:58 crc kubenswrapper[4743]: I1011 01:11:58.960416 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/476a4c6e-ddae-4974-a899-78a8f1ee973d-combined-ca-bundle\") pod \"heat-db-sync-f4rpt\" (UID: \"476a4c6e-ddae-4974-a899-78a8f1ee973d\") " pod="openstack/heat-db-sync-f4rpt" Oct 11 
01:11:58 crc kubenswrapper[4743]: I1011 01:11:58.973331 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5bpr\" (UniqueName: \"kubernetes.io/projected/476a4c6e-ddae-4974-a899-78a8f1ee973d-kube-api-access-q5bpr\") pod \"heat-db-sync-f4rpt\" (UID: \"476a4c6e-ddae-4974-a899-78a8f1ee973d\") " pod="openstack/heat-db-sync-f4rpt" Oct 11 01:11:59 crc kubenswrapper[4743]: I1011 01:11:59.090443 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-s49bx"] Oct 11 01:11:59 crc kubenswrapper[4743]: I1011 01:11:59.091613 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-s49bx" Oct 11 01:11:59 crc kubenswrapper[4743]: I1011 01:11:59.094743 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 11 01:11:59 crc kubenswrapper[4743]: I1011 01:11:59.094997 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-bgp4n" Oct 11 01:11:59 crc kubenswrapper[4743]: I1011 01:11:59.095225 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 11 01:11:59 crc kubenswrapper[4743]: I1011 01:11:59.111681 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-s49bx"] Oct 11 01:11:59 crc kubenswrapper[4743]: I1011 01:11:59.184772 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-6pzxd"] Oct 11 01:11:59 crc kubenswrapper[4743]: I1011 01:11:59.186904 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-6pzxd" Oct 11 01:11:59 crc kubenswrapper[4743]: I1011 01:11:59.188319 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-f4rpt" Oct 11 01:11:59 crc kubenswrapper[4743]: I1011 01:11:59.189843 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-5zgpr" Oct 11 01:11:59 crc kubenswrapper[4743]: I1011 01:11:59.190290 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 11 01:11:59 crc kubenswrapper[4743]: I1011 01:11:59.208840 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-6pzxd"] Oct 11 01:11:59 crc kubenswrapper[4743]: I1011 01:11:59.253844 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2f85e310-6acf-40d8-b07d-9e1c9b4d997b-config\") pod \"neutron-db-sync-s49bx\" (UID: \"2f85e310-6acf-40d8-b07d-9e1c9b4d997b\") " pod="openstack/neutron-db-sync-s49bx" Oct 11 01:11:59 crc kubenswrapper[4743]: I1011 01:11:59.253901 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f85e310-6acf-40d8-b07d-9e1c9b4d997b-combined-ca-bundle\") pod \"neutron-db-sync-s49bx\" (UID: \"2f85e310-6acf-40d8-b07d-9e1c9b4d997b\") " pod="openstack/neutron-db-sync-s49bx" Oct 11 01:11:59 crc kubenswrapper[4743]: I1011 01:11:59.253929 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x89p4\" (UniqueName: \"kubernetes.io/projected/2f85e310-6acf-40d8-b07d-9e1c9b4d997b-kube-api-access-x89p4\") pod \"neutron-db-sync-s49bx\" (UID: \"2f85e310-6acf-40d8-b07d-9e1c9b4d997b\") " pod="openstack/neutron-db-sync-s49bx" Oct 11 01:11:59 crc kubenswrapper[4743]: I1011 01:11:59.279012 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-przwv"] Oct 11 01:11:59 crc kubenswrapper[4743]: I1011 01:11:59.280169 4743 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/cinder-db-sync-przwv" Oct 11 01:11:59 crc kubenswrapper[4743]: I1011 01:11:59.291998 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 11 01:11:59 crc kubenswrapper[4743]: I1011 01:11:59.292422 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 11 01:11:59 crc kubenswrapper[4743]: I1011 01:11:59.292551 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-sdbk7" Oct 11 01:11:59 crc kubenswrapper[4743]: I1011 01:11:59.300623 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-przwv"] Oct 11 01:11:59 crc kubenswrapper[4743]: I1011 01:11:59.355777 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/615827fb-c1f4-46c6-8014-00c71fe2403b-combined-ca-bundle\") pod \"barbican-db-sync-6pzxd\" (UID: \"615827fb-c1f4-46c6-8014-00c71fe2403b\") " pod="openstack/barbican-db-sync-6pzxd" Oct 11 01:11:59 crc kubenswrapper[4743]: I1011 01:11:59.355848 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2f85e310-6acf-40d8-b07d-9e1c9b4d997b-config\") pod \"neutron-db-sync-s49bx\" (UID: \"2f85e310-6acf-40d8-b07d-9e1c9b4d997b\") " pod="openstack/neutron-db-sync-s49bx" Oct 11 01:11:59 crc kubenswrapper[4743]: I1011 01:11:59.355898 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f85e310-6acf-40d8-b07d-9e1c9b4d997b-combined-ca-bundle\") pod \"neutron-db-sync-s49bx\" (UID: \"2f85e310-6acf-40d8-b07d-9e1c9b4d997b\") " pod="openstack/neutron-db-sync-s49bx" Oct 11 01:11:59 crc kubenswrapper[4743]: I1011 01:11:59.355923 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-x89p4\" (UniqueName: \"kubernetes.io/projected/2f85e310-6acf-40d8-b07d-9e1c9b4d997b-kube-api-access-x89p4\") pod \"neutron-db-sync-s49bx\" (UID: \"2f85e310-6acf-40d8-b07d-9e1c9b4d997b\") " pod="openstack/neutron-db-sync-s49bx" Oct 11 01:11:59 crc kubenswrapper[4743]: I1011 01:11:59.356009 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4z8m\" (UniqueName: \"kubernetes.io/projected/615827fb-c1f4-46c6-8014-00c71fe2403b-kube-api-access-t4z8m\") pod \"barbican-db-sync-6pzxd\" (UID: \"615827fb-c1f4-46c6-8014-00c71fe2403b\") " pod="openstack/barbican-db-sync-6pzxd" Oct 11 01:11:59 crc kubenswrapper[4743]: I1011 01:11:59.356066 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/615827fb-c1f4-46c6-8014-00c71fe2403b-db-sync-config-data\") pod \"barbican-db-sync-6pzxd\" (UID: \"615827fb-c1f4-46c6-8014-00c71fe2403b\") " pod="openstack/barbican-db-sync-6pzxd" Oct 11 01:11:59 crc kubenswrapper[4743]: I1011 01:11:59.360322 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/2f85e310-6acf-40d8-b07d-9e1c9b4d997b-config\") pod \"neutron-db-sync-s49bx\" (UID: \"2f85e310-6acf-40d8-b07d-9e1c9b4d997b\") " pod="openstack/neutron-db-sync-s49bx" Oct 11 01:11:59 crc kubenswrapper[4743]: I1011 01:11:59.361832 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f85e310-6acf-40d8-b07d-9e1c9b4d997b-combined-ca-bundle\") pod \"neutron-db-sync-s49bx\" (UID: \"2f85e310-6acf-40d8-b07d-9e1c9b4d997b\") " pod="openstack/neutron-db-sync-s49bx" Oct 11 01:11:59 crc kubenswrapper[4743]: I1011 01:11:59.373838 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x89p4\" (UniqueName: 
\"kubernetes.io/projected/2f85e310-6acf-40d8-b07d-9e1c9b4d997b-kube-api-access-x89p4\") pod \"neutron-db-sync-s49bx\" (UID: \"2f85e310-6acf-40d8-b07d-9e1c9b4d997b\") " pod="openstack/neutron-db-sync-s49bx" Oct 11 01:11:59 crc kubenswrapper[4743]: I1011 01:11:59.458477 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4z8m\" (UniqueName: \"kubernetes.io/projected/615827fb-c1f4-46c6-8014-00c71fe2403b-kube-api-access-t4z8m\") pod \"barbican-db-sync-6pzxd\" (UID: \"615827fb-c1f4-46c6-8014-00c71fe2403b\") " pod="openstack/barbican-db-sync-6pzxd" Oct 11 01:11:59 crc kubenswrapper[4743]: I1011 01:11:59.458532 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65f773e4-09e5-4312-8c6b-9176a1a022f0-combined-ca-bundle\") pod \"cinder-db-sync-przwv\" (UID: \"65f773e4-09e5-4312-8c6b-9176a1a022f0\") " pod="openstack/cinder-db-sync-przwv" Oct 11 01:11:59 crc kubenswrapper[4743]: I1011 01:11:59.458559 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65f773e4-09e5-4312-8c6b-9176a1a022f0-scripts\") pod \"cinder-db-sync-przwv\" (UID: \"65f773e4-09e5-4312-8c6b-9176a1a022f0\") " pod="openstack/cinder-db-sync-przwv" Oct 11 01:11:59 crc kubenswrapper[4743]: I1011 01:11:59.458586 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/615827fb-c1f4-46c6-8014-00c71fe2403b-db-sync-config-data\") pod \"barbican-db-sync-6pzxd\" (UID: \"615827fb-c1f4-46c6-8014-00c71fe2403b\") " pod="openstack/barbican-db-sync-6pzxd" Oct 11 01:11:59 crc kubenswrapper[4743]: I1011 01:11:59.459265 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/615827fb-c1f4-46c6-8014-00c71fe2403b-combined-ca-bundle\") pod \"barbican-db-sync-6pzxd\" (UID: \"615827fb-c1f4-46c6-8014-00c71fe2403b\") " pod="openstack/barbican-db-sync-6pzxd" Oct 11 01:11:59 crc kubenswrapper[4743]: I1011 01:11:59.459319 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65f773e4-09e5-4312-8c6b-9176a1a022f0-config-data\") pod \"cinder-db-sync-przwv\" (UID: \"65f773e4-09e5-4312-8c6b-9176a1a022f0\") " pod="openstack/cinder-db-sync-przwv" Oct 11 01:11:59 crc kubenswrapper[4743]: I1011 01:11:59.459358 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlw9m\" (UniqueName: \"kubernetes.io/projected/65f773e4-09e5-4312-8c6b-9176a1a022f0-kube-api-access-dlw9m\") pod \"cinder-db-sync-przwv\" (UID: \"65f773e4-09e5-4312-8c6b-9176a1a022f0\") " pod="openstack/cinder-db-sync-przwv" Oct 11 01:11:59 crc kubenswrapper[4743]: I1011 01:11:59.459383 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/65f773e4-09e5-4312-8c6b-9176a1a022f0-etc-machine-id\") pod \"cinder-db-sync-przwv\" (UID: \"65f773e4-09e5-4312-8c6b-9176a1a022f0\") " pod="openstack/cinder-db-sync-przwv" Oct 11 01:11:59 crc kubenswrapper[4743]: I1011 01:11:59.459408 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/65f773e4-09e5-4312-8c6b-9176a1a022f0-db-sync-config-data\") pod \"cinder-db-sync-przwv\" (UID: \"65f773e4-09e5-4312-8c6b-9176a1a022f0\") " pod="openstack/cinder-db-sync-przwv" Oct 11 01:11:59 crc kubenswrapper[4743]: I1011 01:11:59.463094 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/615827fb-c1f4-46c6-8014-00c71fe2403b-combined-ca-bundle\") pod \"barbican-db-sync-6pzxd\" (UID: \"615827fb-c1f4-46c6-8014-00c71fe2403b\") " pod="openstack/barbican-db-sync-6pzxd" Oct 11 01:11:59 crc kubenswrapper[4743]: I1011 01:11:59.466904 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/615827fb-c1f4-46c6-8014-00c71fe2403b-db-sync-config-data\") pod \"barbican-db-sync-6pzxd\" (UID: \"615827fb-c1f4-46c6-8014-00c71fe2403b\") " pod="openstack/barbican-db-sync-6pzxd" Oct 11 01:11:59 crc kubenswrapper[4743]: I1011 01:11:59.468116 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-s49bx" Oct 11 01:11:59 crc kubenswrapper[4743]: I1011 01:11:59.475652 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4z8m\" (UniqueName: \"kubernetes.io/projected/615827fb-c1f4-46c6-8014-00c71fe2403b-kube-api-access-t4z8m\") pod \"barbican-db-sync-6pzxd\" (UID: \"615827fb-c1f4-46c6-8014-00c71fe2403b\") " pod="openstack/barbican-db-sync-6pzxd" Oct 11 01:11:59 crc kubenswrapper[4743]: I1011 01:11:59.502908 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-6pzxd" Oct 11 01:11:59 crc kubenswrapper[4743]: I1011 01:11:59.561206 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65f773e4-09e5-4312-8c6b-9176a1a022f0-config-data\") pod \"cinder-db-sync-przwv\" (UID: \"65f773e4-09e5-4312-8c6b-9176a1a022f0\") " pod="openstack/cinder-db-sync-przwv" Oct 11 01:11:59 crc kubenswrapper[4743]: I1011 01:11:59.561252 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlw9m\" (UniqueName: \"kubernetes.io/projected/65f773e4-09e5-4312-8c6b-9176a1a022f0-kube-api-access-dlw9m\") pod \"cinder-db-sync-przwv\" (UID: \"65f773e4-09e5-4312-8c6b-9176a1a022f0\") " pod="openstack/cinder-db-sync-przwv" Oct 11 01:11:59 crc kubenswrapper[4743]: I1011 01:11:59.561281 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/65f773e4-09e5-4312-8c6b-9176a1a022f0-etc-machine-id\") pod \"cinder-db-sync-przwv\" (UID: \"65f773e4-09e5-4312-8c6b-9176a1a022f0\") " pod="openstack/cinder-db-sync-przwv" Oct 11 01:11:59 crc kubenswrapper[4743]: I1011 01:11:59.561302 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/65f773e4-09e5-4312-8c6b-9176a1a022f0-db-sync-config-data\") pod \"cinder-db-sync-przwv\" (UID: \"65f773e4-09e5-4312-8c6b-9176a1a022f0\") " pod="openstack/cinder-db-sync-przwv" Oct 11 01:11:59 crc kubenswrapper[4743]: I1011 01:11:59.561344 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65f773e4-09e5-4312-8c6b-9176a1a022f0-combined-ca-bundle\") pod \"cinder-db-sync-przwv\" (UID: \"65f773e4-09e5-4312-8c6b-9176a1a022f0\") " pod="openstack/cinder-db-sync-przwv" Oct 11 01:11:59 crc kubenswrapper[4743]: I1011 
01:11:59.561363 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65f773e4-09e5-4312-8c6b-9176a1a022f0-scripts\") pod \"cinder-db-sync-przwv\" (UID: \"65f773e4-09e5-4312-8c6b-9176a1a022f0\") " pod="openstack/cinder-db-sync-przwv" Oct 11 01:11:59 crc kubenswrapper[4743]: I1011 01:11:59.563383 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/65f773e4-09e5-4312-8c6b-9176a1a022f0-etc-machine-id\") pod \"cinder-db-sync-przwv\" (UID: \"65f773e4-09e5-4312-8c6b-9176a1a022f0\") " pod="openstack/cinder-db-sync-przwv" Oct 11 01:11:59 crc kubenswrapper[4743]: I1011 01:11:59.567843 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65f773e4-09e5-4312-8c6b-9176a1a022f0-config-data\") pod \"cinder-db-sync-przwv\" (UID: \"65f773e4-09e5-4312-8c6b-9176a1a022f0\") " pod="openstack/cinder-db-sync-przwv" Oct 11 01:11:59 crc kubenswrapper[4743]: I1011 01:11:59.571283 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/65f773e4-09e5-4312-8c6b-9176a1a022f0-db-sync-config-data\") pod \"cinder-db-sync-przwv\" (UID: \"65f773e4-09e5-4312-8c6b-9176a1a022f0\") " pod="openstack/cinder-db-sync-przwv" Oct 11 01:11:59 crc kubenswrapper[4743]: I1011 01:11:59.579731 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65f773e4-09e5-4312-8c6b-9176a1a022f0-combined-ca-bundle\") pod \"cinder-db-sync-przwv\" (UID: \"65f773e4-09e5-4312-8c6b-9176a1a022f0\") " pod="openstack/cinder-db-sync-przwv" Oct 11 01:11:59 crc kubenswrapper[4743]: I1011 01:11:59.582877 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/65f773e4-09e5-4312-8c6b-9176a1a022f0-scripts\") pod \"cinder-db-sync-przwv\" (UID: \"65f773e4-09e5-4312-8c6b-9176a1a022f0\") " pod="openstack/cinder-db-sync-przwv" Oct 11 01:11:59 crc kubenswrapper[4743]: I1011 01:11:59.591151 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlw9m\" (UniqueName: \"kubernetes.io/projected/65f773e4-09e5-4312-8c6b-9176a1a022f0-kube-api-access-dlw9m\") pod \"cinder-db-sync-przwv\" (UID: \"65f773e4-09e5-4312-8c6b-9176a1a022f0\") " pod="openstack/cinder-db-sync-przwv" Oct 11 01:11:59 crc kubenswrapper[4743]: I1011 01:11:59.602332 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-przwv" Oct 11 01:11:59 crc kubenswrapper[4743]: I1011 01:11:59.640424 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-f4rpt"] Oct 11 01:11:59 crc kubenswrapper[4743]: W1011 01:11:59.677974 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod476a4c6e_ddae_4974_a899_78a8f1ee973d.slice/crio-5989f7e1faa849964c8d05d5d494b306b13ce63c3028c3f076fe4f2376cfa2f2 WatchSource:0}: Error finding container 5989f7e1faa849964c8d05d5d494b306b13ce63c3028c3f076fe4f2376cfa2f2: Status 404 returned error can't find the container with id 5989f7e1faa849964c8d05d5d494b306b13ce63c3028c3f076fe4f2376cfa2f2 Oct 11 01:11:59 crc kubenswrapper[4743]: I1011 01:11:59.894276 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"897a231b-400d-4d3b-b6eb-8bcee1ad5d1d","Type":"ContainerStarted","Data":"e086d9b7f67225d474a6748ad81b767991a3103ef0ccdcd7b0f7231e886e45f0"} Oct 11 01:11:59 crc kubenswrapper[4743]: I1011 01:11:59.897814 4743 generic.go:334] "Generic (PLEG): container finished" podID="7e0bb9e3-337b-443b-baa8-8ea69d351ea1" containerID="2bfe9875a178be1418db1466a53073392e86b327aca29ae3ce6d9aa6667e8240" exitCode=0 Oct 11 
01:11:59 crc kubenswrapper[4743]: I1011 01:11:59.897874 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-z4tv4" event={"ID":"7e0bb9e3-337b-443b-baa8-8ea69d351ea1","Type":"ContainerDied","Data":"2bfe9875a178be1418db1466a53073392e86b327aca29ae3ce6d9aa6667e8240"} Oct 11 01:11:59 crc kubenswrapper[4743]: I1011 01:11:59.901281 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xq7tg" event={"ID":"bca21196-09b5-4f5a-b690-19afdf3318b4","Type":"ContainerStarted","Data":"fe4f06c5afe259bd4b23460bfc524174724ed17a5da1b63ebc03431735a3daca"} Oct 11 01:11:59 crc kubenswrapper[4743]: I1011 01:11:59.902836 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-f4rpt" event={"ID":"476a4c6e-ddae-4974-a899-78a8f1ee973d","Type":"ContainerStarted","Data":"5989f7e1faa849964c8d05d5d494b306b13ce63c3028c3f076fe4f2376cfa2f2"} Oct 11 01:11:59 crc kubenswrapper[4743]: I1011 01:11:59.933613 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-xq7tg" podStartSLOduration=2.933595744 podStartE2EDuration="2.933595744s" podCreationTimestamp="2025-10-11 01:11:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 01:11:59.931266775 +0000 UTC m=+1214.584247182" watchObservedRunningTime="2025-10-11 01:11:59.933595744 +0000 UTC m=+1214.586576141" Oct 11 01:11:59 crc kubenswrapper[4743]: I1011 01:11:59.958436 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-s49bx"] Oct 11 01:11:59 crc kubenswrapper[4743]: W1011 01:11:59.977333 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2f85e310_6acf_40d8_b07d_9e1c9b4d997b.slice/crio-3b830f93be95d366b5539383f3016e9186805775008cd18356a4760de5b2c9e7 WatchSource:0}: Error finding container 
3b830f93be95d366b5539383f3016e9186805775008cd18356a4760de5b2c9e7: Status 404 returned error can't find the container with id 3b830f93be95d366b5539383f3016e9186805775008cd18356a4760de5b2c9e7 Oct 11 01:12:00 crc kubenswrapper[4743]: I1011 01:12:00.107556 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-6pzxd"] Oct 11 01:12:00 crc kubenswrapper[4743]: W1011 01:12:00.118584 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod615827fb_c1f4_46c6_8014_00c71fe2403b.slice/crio-d85b2952cca6b8b5169af722e61ce5c5349ed99f25b681cc42051f2a88760911 WatchSource:0}: Error finding container d85b2952cca6b8b5169af722e61ce5c5349ed99f25b681cc42051f2a88760911: Status 404 returned error can't find the container with id d85b2952cca6b8b5169af722e61ce5c5349ed99f25b681cc42051f2a88760911 Oct 11 01:12:00 crc kubenswrapper[4743]: I1011 01:12:00.218484 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-przwv"] Oct 11 01:12:00 crc kubenswrapper[4743]: W1011 01:12:00.228601 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod65f773e4_09e5_4312_8c6b_9176a1a022f0.slice/crio-69fb0628444ca1a53808d3b17d168b4ae98929f731e2b8062727367ca9ebedc4 WatchSource:0}: Error finding container 69fb0628444ca1a53808d3b17d168b4ae98929f731e2b8062727367ca9ebedc4: Status 404 returned error can't find the container with id 69fb0628444ca1a53808d3b17d168b4ae98929f731e2b8062727367ca9ebedc4 Oct 11 01:12:00 crc kubenswrapper[4743]: I1011 01:12:00.982606 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-s49bx" event={"ID":"2f85e310-6acf-40d8-b07d-9e1c9b4d997b","Type":"ContainerStarted","Data":"15eb13889fe265ecb53a8acb3699eb24d0795d5cfaded1b59312c356d6249424"} Oct 11 01:12:00 crc kubenswrapper[4743]: I1011 01:12:00.983376 4743 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/neutron-db-sync-s49bx" event={"ID":"2f85e310-6acf-40d8-b07d-9e1c9b4d997b","Type":"ContainerStarted","Data":"3b830f93be95d366b5539383f3016e9186805775008cd18356a4760de5b2c9e7"} Oct 11 01:12:00 crc kubenswrapper[4743]: I1011 01:12:00.986134 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-przwv" event={"ID":"65f773e4-09e5-4312-8c6b-9176a1a022f0","Type":"ContainerStarted","Data":"69fb0628444ca1a53808d3b17d168b4ae98929f731e2b8062727367ca9ebedc4"} Oct 11 01:12:00 crc kubenswrapper[4743]: I1011 01:12:00.988724 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-6pzxd" event={"ID":"615827fb-c1f4-46c6-8014-00c71fe2403b","Type":"ContainerStarted","Data":"d85b2952cca6b8b5169af722e61ce5c5349ed99f25b681cc42051f2a88760911"} Oct 11 01:12:01 crc kubenswrapper[4743]: I1011 01:12:01.018272 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-s49bx" podStartSLOduration=2.0182538 podStartE2EDuration="2.0182538s" podCreationTimestamp="2025-10-11 01:11:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 01:12:01.011982271 +0000 UTC m=+1215.664962658" watchObservedRunningTime="2025-10-11 01:12:01.0182538 +0000 UTC m=+1215.671234197" Oct 11 01:12:01 crc kubenswrapper[4743]: I1011 01:12:01.516342 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-z4tv4" Oct 11 01:12:01 crc kubenswrapper[4743]: I1011 01:12:01.603621 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e0bb9e3-337b-443b-baa8-8ea69d351ea1-scripts\") pod \"7e0bb9e3-337b-443b-baa8-8ea69d351ea1\" (UID: \"7e0bb9e3-337b-443b-baa8-8ea69d351ea1\") " Oct 11 01:12:01 crc kubenswrapper[4743]: I1011 01:12:01.603703 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e0bb9e3-337b-443b-baa8-8ea69d351ea1-config-data\") pod \"7e0bb9e3-337b-443b-baa8-8ea69d351ea1\" (UID: \"7e0bb9e3-337b-443b-baa8-8ea69d351ea1\") " Oct 11 01:12:01 crc kubenswrapper[4743]: I1011 01:12:01.603759 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e0bb9e3-337b-443b-baa8-8ea69d351ea1-combined-ca-bundle\") pod \"7e0bb9e3-337b-443b-baa8-8ea69d351ea1\" (UID: \"7e0bb9e3-337b-443b-baa8-8ea69d351ea1\") " Oct 11 01:12:01 crc kubenswrapper[4743]: I1011 01:12:01.603879 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e0bb9e3-337b-443b-baa8-8ea69d351ea1-logs\") pod \"7e0bb9e3-337b-443b-baa8-8ea69d351ea1\" (UID: \"7e0bb9e3-337b-443b-baa8-8ea69d351ea1\") " Oct 11 01:12:01 crc kubenswrapper[4743]: I1011 01:12:01.603903 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5dq9z\" (UniqueName: \"kubernetes.io/projected/7e0bb9e3-337b-443b-baa8-8ea69d351ea1-kube-api-access-5dq9z\") pod \"7e0bb9e3-337b-443b-baa8-8ea69d351ea1\" (UID: \"7e0bb9e3-337b-443b-baa8-8ea69d351ea1\") " Oct 11 01:12:01 crc kubenswrapper[4743]: I1011 01:12:01.627002 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/7e0bb9e3-337b-443b-baa8-8ea69d351ea1-scripts" (OuterVolumeSpecName: "scripts") pod "7e0bb9e3-337b-443b-baa8-8ea69d351ea1" (UID: "7e0bb9e3-337b-443b-baa8-8ea69d351ea1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:12:01 crc kubenswrapper[4743]: I1011 01:12:01.627942 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e0bb9e3-337b-443b-baa8-8ea69d351ea1-logs" (OuterVolumeSpecName: "logs") pod "7e0bb9e3-337b-443b-baa8-8ea69d351ea1" (UID: "7e0bb9e3-337b-443b-baa8-8ea69d351ea1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:12:01 crc kubenswrapper[4743]: I1011 01:12:01.648863 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e0bb9e3-337b-443b-baa8-8ea69d351ea1-kube-api-access-5dq9z" (OuterVolumeSpecName: "kube-api-access-5dq9z") pod "7e0bb9e3-337b-443b-baa8-8ea69d351ea1" (UID: "7e0bb9e3-337b-443b-baa8-8ea69d351ea1"). InnerVolumeSpecName "kube-api-access-5dq9z". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:12:01 crc kubenswrapper[4743]: I1011 01:12:01.700037 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e0bb9e3-337b-443b-baa8-8ea69d351ea1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7e0bb9e3-337b-443b-baa8-8ea69d351ea1" (UID: "7e0bb9e3-337b-443b-baa8-8ea69d351ea1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:12:01 crc kubenswrapper[4743]: I1011 01:12:01.731196 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e0bb9e3-337b-443b-baa8-8ea69d351ea1-scripts\") on node \"crc\" DevicePath \"\"" Oct 11 01:12:01 crc kubenswrapper[4743]: I1011 01:12:01.731227 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e0bb9e3-337b-443b-baa8-8ea69d351ea1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 01:12:01 crc kubenswrapper[4743]: I1011 01:12:01.731238 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e0bb9e3-337b-443b-baa8-8ea69d351ea1-logs\") on node \"crc\" DevicePath \"\"" Oct 11 01:12:01 crc kubenswrapper[4743]: I1011 01:12:01.731245 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5dq9z\" (UniqueName: \"kubernetes.io/projected/7e0bb9e3-337b-443b-baa8-8ea69d351ea1-kube-api-access-5dq9z\") on node \"crc\" DevicePath \"\"" Oct 11 01:12:01 crc kubenswrapper[4743]: I1011 01:12:01.784059 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e0bb9e3-337b-443b-baa8-8ea69d351ea1-config-data" (OuterVolumeSpecName: "config-data") pod "7e0bb9e3-337b-443b-baa8-8ea69d351ea1" (UID: "7e0bb9e3-337b-443b-baa8-8ea69d351ea1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:12:01 crc kubenswrapper[4743]: I1011 01:12:01.832371 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e0bb9e3-337b-443b-baa8-8ea69d351ea1-config-data\") on node \"crc\" DevicePath \"\"" Oct 11 01:12:02 crc kubenswrapper[4743]: I1011 01:12:02.022752 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-z4tv4" event={"ID":"7e0bb9e3-337b-443b-baa8-8ea69d351ea1","Type":"ContainerDied","Data":"8c5663c91c4e792da1ff16a0e1ddc611dca00f90c25b87d0699b13a59ae362a8"} Oct 11 01:12:02 crc kubenswrapper[4743]: I1011 01:12:02.022799 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8c5663c91c4e792da1ff16a0e1ddc611dca00f90c25b87d0699b13a59ae362a8" Oct 11 01:12:02 crc kubenswrapper[4743]: I1011 01:12:02.022830 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-z4tv4" Oct 11 01:12:02 crc kubenswrapper[4743]: I1011 01:12:02.026078 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-979bff964-bxbgb"] Oct 11 01:12:02 crc kubenswrapper[4743]: E1011 01:12:02.026496 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e0bb9e3-337b-443b-baa8-8ea69d351ea1" containerName="placement-db-sync" Oct 11 01:12:02 crc kubenswrapper[4743]: I1011 01:12:02.026513 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e0bb9e3-337b-443b-baa8-8ea69d351ea1" containerName="placement-db-sync" Oct 11 01:12:02 crc kubenswrapper[4743]: I1011 01:12:02.026731 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e0bb9e3-337b-443b-baa8-8ea69d351ea1" containerName="placement-db-sync" Oct 11 01:12:02 crc kubenswrapper[4743]: I1011 01:12:02.028231 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-979bff964-bxbgb" Oct 11 01:12:02 crc kubenswrapper[4743]: I1011 01:12:02.031624 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 11 01:12:02 crc kubenswrapper[4743]: I1011 01:12:02.031897 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-4dds5" Oct 11 01:12:02 crc kubenswrapper[4743]: I1011 01:12:02.032024 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Oct 11 01:12:02 crc kubenswrapper[4743]: I1011 01:12:02.032644 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Oct 11 01:12:02 crc kubenswrapper[4743]: I1011 01:12:02.035093 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 11 01:12:02 crc kubenswrapper[4743]: I1011 01:12:02.052289 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-979bff964-bxbgb"] Oct 11 01:12:02 crc kubenswrapper[4743]: I1011 01:12:02.139913 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjf8l\" (UniqueName: \"kubernetes.io/projected/8f846c8e-d28a-4a2e-a5b6-bfc739de275b-kube-api-access-jjf8l\") pod \"placement-979bff964-bxbgb\" (UID: \"8f846c8e-d28a-4a2e-a5b6-bfc739de275b\") " pod="openstack/placement-979bff964-bxbgb" Oct 11 01:12:02 crc kubenswrapper[4743]: I1011 01:12:02.139978 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f846c8e-d28a-4a2e-a5b6-bfc739de275b-combined-ca-bundle\") pod \"placement-979bff964-bxbgb\" (UID: \"8f846c8e-d28a-4a2e-a5b6-bfc739de275b\") " pod="openstack/placement-979bff964-bxbgb" Oct 11 01:12:02 crc kubenswrapper[4743]: I1011 01:12:02.140024 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f846c8e-d28a-4a2e-a5b6-bfc739de275b-public-tls-certs\") pod \"placement-979bff964-bxbgb\" (UID: \"8f846c8e-d28a-4a2e-a5b6-bfc739de275b\") " pod="openstack/placement-979bff964-bxbgb" Oct 11 01:12:02 crc kubenswrapper[4743]: I1011 01:12:02.140047 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f846c8e-d28a-4a2e-a5b6-bfc739de275b-config-data\") pod \"placement-979bff964-bxbgb\" (UID: \"8f846c8e-d28a-4a2e-a5b6-bfc739de275b\") " pod="openstack/placement-979bff964-bxbgb" Oct 11 01:12:02 crc kubenswrapper[4743]: I1011 01:12:02.140073 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f846c8e-d28a-4a2e-a5b6-bfc739de275b-logs\") pod \"placement-979bff964-bxbgb\" (UID: \"8f846c8e-d28a-4a2e-a5b6-bfc739de275b\") " pod="openstack/placement-979bff964-bxbgb" Oct 11 01:12:02 crc kubenswrapper[4743]: I1011 01:12:02.140096 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f846c8e-d28a-4a2e-a5b6-bfc739de275b-scripts\") pod \"placement-979bff964-bxbgb\" (UID: \"8f846c8e-d28a-4a2e-a5b6-bfc739de275b\") " pod="openstack/placement-979bff964-bxbgb" Oct 11 01:12:02 crc kubenswrapper[4743]: I1011 01:12:02.140154 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f846c8e-d28a-4a2e-a5b6-bfc739de275b-internal-tls-certs\") pod \"placement-979bff964-bxbgb\" (UID: \"8f846c8e-d28a-4a2e-a5b6-bfc739de275b\") " pod="openstack/placement-979bff964-bxbgb" Oct 11 01:12:02 crc kubenswrapper[4743]: I1011 01:12:02.241931 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f846c8e-d28a-4a2e-a5b6-bfc739de275b-public-tls-certs\") pod \"placement-979bff964-bxbgb\" (UID: \"8f846c8e-d28a-4a2e-a5b6-bfc739de275b\") " pod="openstack/placement-979bff964-bxbgb" Oct 11 01:12:02 crc kubenswrapper[4743]: I1011 01:12:02.242552 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f846c8e-d28a-4a2e-a5b6-bfc739de275b-config-data\") pod \"placement-979bff964-bxbgb\" (UID: \"8f846c8e-d28a-4a2e-a5b6-bfc739de275b\") " pod="openstack/placement-979bff964-bxbgb" Oct 11 01:12:02 crc kubenswrapper[4743]: I1011 01:12:02.243552 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f846c8e-d28a-4a2e-a5b6-bfc739de275b-logs\") pod \"placement-979bff964-bxbgb\" (UID: \"8f846c8e-d28a-4a2e-a5b6-bfc739de275b\") " pod="openstack/placement-979bff964-bxbgb" Oct 11 01:12:02 crc kubenswrapper[4743]: I1011 01:12:02.243616 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f846c8e-d28a-4a2e-a5b6-bfc739de275b-scripts\") pod \"placement-979bff964-bxbgb\" (UID: \"8f846c8e-d28a-4a2e-a5b6-bfc739de275b\") " pod="openstack/placement-979bff964-bxbgb" Oct 11 01:12:02 crc kubenswrapper[4743]: I1011 01:12:02.243710 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f846c8e-d28a-4a2e-a5b6-bfc739de275b-internal-tls-certs\") pod \"placement-979bff964-bxbgb\" (UID: \"8f846c8e-d28a-4a2e-a5b6-bfc739de275b\") " pod="openstack/placement-979bff964-bxbgb" Oct 11 01:12:02 crc kubenswrapper[4743]: I1011 01:12:02.243827 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjf8l\" (UniqueName: 
\"kubernetes.io/projected/8f846c8e-d28a-4a2e-a5b6-bfc739de275b-kube-api-access-jjf8l\") pod \"placement-979bff964-bxbgb\" (UID: \"8f846c8e-d28a-4a2e-a5b6-bfc739de275b\") " pod="openstack/placement-979bff964-bxbgb" Oct 11 01:12:02 crc kubenswrapper[4743]: I1011 01:12:02.243939 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f846c8e-d28a-4a2e-a5b6-bfc739de275b-combined-ca-bundle\") pod \"placement-979bff964-bxbgb\" (UID: \"8f846c8e-d28a-4a2e-a5b6-bfc739de275b\") " pod="openstack/placement-979bff964-bxbgb" Oct 11 01:12:02 crc kubenswrapper[4743]: I1011 01:12:02.244028 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f846c8e-d28a-4a2e-a5b6-bfc739de275b-logs\") pod \"placement-979bff964-bxbgb\" (UID: \"8f846c8e-d28a-4a2e-a5b6-bfc739de275b\") " pod="openstack/placement-979bff964-bxbgb" Oct 11 01:12:02 crc kubenswrapper[4743]: I1011 01:12:02.249382 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f846c8e-d28a-4a2e-a5b6-bfc739de275b-config-data\") pod \"placement-979bff964-bxbgb\" (UID: \"8f846c8e-d28a-4a2e-a5b6-bfc739de275b\") " pod="openstack/placement-979bff964-bxbgb" Oct 11 01:12:02 crc kubenswrapper[4743]: I1011 01:12:02.251099 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f846c8e-d28a-4a2e-a5b6-bfc739de275b-scripts\") pod \"placement-979bff964-bxbgb\" (UID: \"8f846c8e-d28a-4a2e-a5b6-bfc739de275b\") " pod="openstack/placement-979bff964-bxbgb" Oct 11 01:12:02 crc kubenswrapper[4743]: I1011 01:12:02.261586 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjf8l\" (UniqueName: \"kubernetes.io/projected/8f846c8e-d28a-4a2e-a5b6-bfc739de275b-kube-api-access-jjf8l\") pod \"placement-979bff964-bxbgb\" (UID: 
\"8f846c8e-d28a-4a2e-a5b6-bfc739de275b\") " pod="openstack/placement-979bff964-bxbgb" Oct 11 01:12:02 crc kubenswrapper[4743]: I1011 01:12:02.263055 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f846c8e-d28a-4a2e-a5b6-bfc739de275b-combined-ca-bundle\") pod \"placement-979bff964-bxbgb\" (UID: \"8f846c8e-d28a-4a2e-a5b6-bfc739de275b\") " pod="openstack/placement-979bff964-bxbgb" Oct 11 01:12:02 crc kubenswrapper[4743]: I1011 01:12:02.263412 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f846c8e-d28a-4a2e-a5b6-bfc739de275b-internal-tls-certs\") pod \"placement-979bff964-bxbgb\" (UID: \"8f846c8e-d28a-4a2e-a5b6-bfc739de275b\") " pod="openstack/placement-979bff964-bxbgb" Oct 11 01:12:02 crc kubenswrapper[4743]: I1011 01:12:02.266251 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f846c8e-d28a-4a2e-a5b6-bfc739de275b-public-tls-certs\") pod \"placement-979bff964-bxbgb\" (UID: \"8f846c8e-d28a-4a2e-a5b6-bfc739de275b\") " pod="openstack/placement-979bff964-bxbgb" Oct 11 01:12:02 crc kubenswrapper[4743]: I1011 01:12:02.355057 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-979bff964-bxbgb" Oct 11 01:12:02 crc kubenswrapper[4743]: I1011 01:12:02.818785 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-979bff964-bxbgb"] Oct 11 01:12:04 crc kubenswrapper[4743]: I1011 01:12:04.045906 4743 generic.go:334] "Generic (PLEG): container finished" podID="bca21196-09b5-4f5a-b690-19afdf3318b4" containerID="fe4f06c5afe259bd4b23460bfc524174724ed17a5da1b63ebc03431735a3daca" exitCode=0 Oct 11 01:12:04 crc kubenswrapper[4743]: I1011 01:12:04.046208 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xq7tg" event={"ID":"bca21196-09b5-4f5a-b690-19afdf3318b4","Type":"ContainerDied","Data":"fe4f06c5afe259bd4b23460bfc524174724ed17a5da1b63ebc03431735a3daca"} Oct 11 01:12:09 crc kubenswrapper[4743]: I1011 01:12:09.096571 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-979bff964-bxbgb" event={"ID":"8f846c8e-d28a-4a2e-a5b6-bfc739de275b","Type":"ContainerStarted","Data":"2ab78e2949fb4fa808d3b891b2a14ca767c41ebbf09dbafc58b68ba3c426a26c"} Oct 11 01:12:19 crc kubenswrapper[4743]: I1011 01:12:19.216340 4743 generic.go:334] "Generic (PLEG): container finished" podID="2f85e310-6acf-40d8-b07d-9e1c9b4d997b" containerID="15eb13889fe265ecb53a8acb3699eb24d0795d5cfaded1b59312c356d6249424" exitCode=0 Oct 11 01:12:19 crc kubenswrapper[4743]: I1011 01:12:19.216468 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-s49bx" event={"ID":"2f85e310-6acf-40d8-b07d-9e1c9b4d997b","Type":"ContainerDied","Data":"15eb13889fe265ecb53a8acb3699eb24d0795d5cfaded1b59312c356d6249424"} Oct 11 01:12:19 crc kubenswrapper[4743]: I1011 01:12:19.548026 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-xq7tg" Oct 11 01:12:19 crc kubenswrapper[4743]: I1011 01:12:19.701830 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bca21196-09b5-4f5a-b690-19afdf3318b4-combined-ca-bundle\") pod \"bca21196-09b5-4f5a-b690-19afdf3318b4\" (UID: \"bca21196-09b5-4f5a-b690-19afdf3318b4\") " Oct 11 01:12:19 crc kubenswrapper[4743]: I1011 01:12:19.702144 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bca21196-09b5-4f5a-b690-19afdf3318b4-fernet-keys\") pod \"bca21196-09b5-4f5a-b690-19afdf3318b4\" (UID: \"bca21196-09b5-4f5a-b690-19afdf3318b4\") " Oct 11 01:12:19 crc kubenswrapper[4743]: I1011 01:12:19.702167 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5bbj4\" (UniqueName: \"kubernetes.io/projected/bca21196-09b5-4f5a-b690-19afdf3318b4-kube-api-access-5bbj4\") pod \"bca21196-09b5-4f5a-b690-19afdf3318b4\" (UID: \"bca21196-09b5-4f5a-b690-19afdf3318b4\") " Oct 11 01:12:19 crc kubenswrapper[4743]: I1011 01:12:19.702224 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bca21196-09b5-4f5a-b690-19afdf3318b4-scripts\") pod \"bca21196-09b5-4f5a-b690-19afdf3318b4\" (UID: \"bca21196-09b5-4f5a-b690-19afdf3318b4\") " Oct 11 01:12:19 crc kubenswrapper[4743]: I1011 01:12:19.702247 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bca21196-09b5-4f5a-b690-19afdf3318b4-config-data\") pod \"bca21196-09b5-4f5a-b690-19afdf3318b4\" (UID: \"bca21196-09b5-4f5a-b690-19afdf3318b4\") " Oct 11 01:12:19 crc kubenswrapper[4743]: I1011 01:12:19.702279 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/bca21196-09b5-4f5a-b690-19afdf3318b4-credential-keys\") pod \"bca21196-09b5-4f5a-b690-19afdf3318b4\" (UID: \"bca21196-09b5-4f5a-b690-19afdf3318b4\") " Oct 11 01:12:19 crc kubenswrapper[4743]: I1011 01:12:19.708944 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bca21196-09b5-4f5a-b690-19afdf3318b4-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "bca21196-09b5-4f5a-b690-19afdf3318b4" (UID: "bca21196-09b5-4f5a-b690-19afdf3318b4"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:12:19 crc kubenswrapper[4743]: I1011 01:12:19.709935 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bca21196-09b5-4f5a-b690-19afdf3318b4-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "bca21196-09b5-4f5a-b690-19afdf3318b4" (UID: "bca21196-09b5-4f5a-b690-19afdf3318b4"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:12:19 crc kubenswrapper[4743]: I1011 01:12:19.709993 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bca21196-09b5-4f5a-b690-19afdf3318b4-scripts" (OuterVolumeSpecName: "scripts") pod "bca21196-09b5-4f5a-b690-19afdf3318b4" (UID: "bca21196-09b5-4f5a-b690-19afdf3318b4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:12:19 crc kubenswrapper[4743]: I1011 01:12:19.721637 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bca21196-09b5-4f5a-b690-19afdf3318b4-kube-api-access-5bbj4" (OuterVolumeSpecName: "kube-api-access-5bbj4") pod "bca21196-09b5-4f5a-b690-19afdf3318b4" (UID: "bca21196-09b5-4f5a-b690-19afdf3318b4"). InnerVolumeSpecName "kube-api-access-5bbj4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:12:19 crc kubenswrapper[4743]: I1011 01:12:19.744489 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bca21196-09b5-4f5a-b690-19afdf3318b4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bca21196-09b5-4f5a-b690-19afdf3318b4" (UID: "bca21196-09b5-4f5a-b690-19afdf3318b4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:12:19 crc kubenswrapper[4743]: I1011 01:12:19.760489 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bca21196-09b5-4f5a-b690-19afdf3318b4-config-data" (OuterVolumeSpecName: "config-data") pod "bca21196-09b5-4f5a-b690-19afdf3318b4" (UID: "bca21196-09b5-4f5a-b690-19afdf3318b4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:12:19 crc kubenswrapper[4743]: I1011 01:12:19.804970 4743 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bca21196-09b5-4f5a-b690-19afdf3318b4-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 11 01:12:19 crc kubenswrapper[4743]: I1011 01:12:19.805009 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bca21196-09b5-4f5a-b690-19afdf3318b4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 01:12:19 crc kubenswrapper[4743]: I1011 01:12:19.805023 4743 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bca21196-09b5-4f5a-b690-19afdf3318b4-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 11 01:12:19 crc kubenswrapper[4743]: I1011 01:12:19.805035 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5bbj4\" (UniqueName: \"kubernetes.io/projected/bca21196-09b5-4f5a-b690-19afdf3318b4-kube-api-access-5bbj4\") on node \"crc\" 
DevicePath \"\"" Oct 11 01:12:19 crc kubenswrapper[4743]: I1011 01:12:19.805050 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bca21196-09b5-4f5a-b690-19afdf3318b4-scripts\") on node \"crc\" DevicePath \"\"" Oct 11 01:12:19 crc kubenswrapper[4743]: I1011 01:12:19.805062 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bca21196-09b5-4f5a-b690-19afdf3318b4-config-data\") on node \"crc\" DevicePath \"\"" Oct 11 01:12:20 crc kubenswrapper[4743]: E1011 01:12:20.039716 4743 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Oct 11 01:12:20 crc kubenswrapper[4743]: E1011 01:12:20.039947 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t4z8m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-6pzxd_openstack(615827fb-c1f4-46c6-8014-00c71fe2403b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 11 01:12:20 crc kubenswrapper[4743]: E1011 01:12:20.041042 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-6pzxd" 
podUID="615827fb-c1f4-46c6-8014-00c71fe2403b" Oct 11 01:12:20 crc kubenswrapper[4743]: I1011 01:12:20.229131 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xq7tg" Oct 11 01:12:20 crc kubenswrapper[4743]: I1011 01:12:20.230391 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xq7tg" event={"ID":"bca21196-09b5-4f5a-b690-19afdf3318b4","Type":"ContainerDied","Data":"6cb7f6c66a09d4553d55d7d29dab74982ef213dba0017818e02949cad0b45d4e"} Oct 11 01:12:20 crc kubenswrapper[4743]: I1011 01:12:20.230472 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6cb7f6c66a09d4553d55d7d29dab74982ef213dba0017818e02949cad0b45d4e" Oct 11 01:12:20 crc kubenswrapper[4743]: E1011 01:12:20.231978 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-6pzxd" podUID="615827fb-c1f4-46c6-8014-00c71fe2403b" Oct 11 01:12:20 crc kubenswrapper[4743]: I1011 01:12:20.654058 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-5bd8997d9d-dpxn8"] Oct 11 01:12:20 crc kubenswrapper[4743]: E1011 01:12:20.654969 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bca21196-09b5-4f5a-b690-19afdf3318b4" containerName="keystone-bootstrap" Oct 11 01:12:20 crc kubenswrapper[4743]: I1011 01:12:20.654992 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="bca21196-09b5-4f5a-b690-19afdf3318b4" containerName="keystone-bootstrap" Oct 11 01:12:20 crc kubenswrapper[4743]: I1011 01:12:20.655221 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="bca21196-09b5-4f5a-b690-19afdf3318b4" containerName="keystone-bootstrap" Oct 11 01:12:20 crc kubenswrapper[4743]: I1011 01:12:20.656104 4743 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5bd8997d9d-dpxn8" Oct 11 01:12:20 crc kubenswrapper[4743]: I1011 01:12:20.660820 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 11 01:12:20 crc kubenswrapper[4743]: I1011 01:12:20.661088 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Oct 11 01:12:20 crc kubenswrapper[4743]: I1011 01:12:20.661245 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 11 01:12:20 crc kubenswrapper[4743]: I1011 01:12:20.661278 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 11 01:12:20 crc kubenswrapper[4743]: I1011 01:12:20.661506 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-4qpmq" Oct 11 01:12:20 crc kubenswrapper[4743]: I1011 01:12:20.663341 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Oct 11 01:12:20 crc kubenswrapper[4743]: I1011 01:12:20.678075 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5bd8997d9d-dpxn8"] Oct 11 01:12:20 crc kubenswrapper[4743]: I1011 01:12:20.824673 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f249d47-90a4-4fc9-8bb8-e61bc0143ae7-public-tls-certs\") pod \"keystone-5bd8997d9d-dpxn8\" (UID: \"8f249d47-90a4-4fc9-8bb8-e61bc0143ae7\") " pod="openstack/keystone-5bd8997d9d-dpxn8" Oct 11 01:12:20 crc kubenswrapper[4743]: I1011 01:12:20.824741 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f249d47-90a4-4fc9-8bb8-e61bc0143ae7-config-data\") pod \"keystone-5bd8997d9d-dpxn8\" (UID: 
\"8f249d47-90a4-4fc9-8bb8-e61bc0143ae7\") " pod="openstack/keystone-5bd8997d9d-dpxn8" Oct 11 01:12:20 crc kubenswrapper[4743]: I1011 01:12:20.824783 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8f249d47-90a4-4fc9-8bb8-e61bc0143ae7-fernet-keys\") pod \"keystone-5bd8997d9d-dpxn8\" (UID: \"8f249d47-90a4-4fc9-8bb8-e61bc0143ae7\") " pod="openstack/keystone-5bd8997d9d-dpxn8" Oct 11 01:12:20 crc kubenswrapper[4743]: I1011 01:12:20.824848 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f249d47-90a4-4fc9-8bb8-e61bc0143ae7-combined-ca-bundle\") pod \"keystone-5bd8997d9d-dpxn8\" (UID: \"8f249d47-90a4-4fc9-8bb8-e61bc0143ae7\") " pod="openstack/keystone-5bd8997d9d-dpxn8" Oct 11 01:12:20 crc kubenswrapper[4743]: I1011 01:12:20.825019 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8f249d47-90a4-4fc9-8bb8-e61bc0143ae7-credential-keys\") pod \"keystone-5bd8997d9d-dpxn8\" (UID: \"8f249d47-90a4-4fc9-8bb8-e61bc0143ae7\") " pod="openstack/keystone-5bd8997d9d-dpxn8" Oct 11 01:12:20 crc kubenswrapper[4743]: I1011 01:12:20.825161 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grl6f\" (UniqueName: \"kubernetes.io/projected/8f249d47-90a4-4fc9-8bb8-e61bc0143ae7-kube-api-access-grl6f\") pod \"keystone-5bd8997d9d-dpxn8\" (UID: \"8f249d47-90a4-4fc9-8bb8-e61bc0143ae7\") " pod="openstack/keystone-5bd8997d9d-dpxn8" Oct 11 01:12:20 crc kubenswrapper[4743]: I1011 01:12:20.825245 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f249d47-90a4-4fc9-8bb8-e61bc0143ae7-scripts\") pod \"keystone-5bd8997d9d-dpxn8\" 
(UID: \"8f249d47-90a4-4fc9-8bb8-e61bc0143ae7\") " pod="openstack/keystone-5bd8997d9d-dpxn8" Oct 11 01:12:20 crc kubenswrapper[4743]: I1011 01:12:20.825283 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f249d47-90a4-4fc9-8bb8-e61bc0143ae7-internal-tls-certs\") pod \"keystone-5bd8997d9d-dpxn8\" (UID: \"8f249d47-90a4-4fc9-8bb8-e61bc0143ae7\") " pod="openstack/keystone-5bd8997d9d-dpxn8" Oct 11 01:12:20 crc kubenswrapper[4743]: I1011 01:12:20.929016 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f249d47-90a4-4fc9-8bb8-e61bc0143ae7-internal-tls-certs\") pod \"keystone-5bd8997d9d-dpxn8\" (UID: \"8f249d47-90a4-4fc9-8bb8-e61bc0143ae7\") " pod="openstack/keystone-5bd8997d9d-dpxn8" Oct 11 01:12:20 crc kubenswrapper[4743]: I1011 01:12:20.929622 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f249d47-90a4-4fc9-8bb8-e61bc0143ae7-public-tls-certs\") pod \"keystone-5bd8997d9d-dpxn8\" (UID: \"8f249d47-90a4-4fc9-8bb8-e61bc0143ae7\") " pod="openstack/keystone-5bd8997d9d-dpxn8" Oct 11 01:12:20 crc kubenswrapper[4743]: I1011 01:12:20.929670 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f249d47-90a4-4fc9-8bb8-e61bc0143ae7-config-data\") pod \"keystone-5bd8997d9d-dpxn8\" (UID: \"8f249d47-90a4-4fc9-8bb8-e61bc0143ae7\") " pod="openstack/keystone-5bd8997d9d-dpxn8" Oct 11 01:12:20 crc kubenswrapper[4743]: I1011 01:12:20.929709 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8f249d47-90a4-4fc9-8bb8-e61bc0143ae7-fernet-keys\") pod \"keystone-5bd8997d9d-dpxn8\" (UID: \"8f249d47-90a4-4fc9-8bb8-e61bc0143ae7\") " 
pod="openstack/keystone-5bd8997d9d-dpxn8" Oct 11 01:12:20 crc kubenswrapper[4743]: I1011 01:12:20.929740 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f249d47-90a4-4fc9-8bb8-e61bc0143ae7-combined-ca-bundle\") pod \"keystone-5bd8997d9d-dpxn8\" (UID: \"8f249d47-90a4-4fc9-8bb8-e61bc0143ae7\") " pod="openstack/keystone-5bd8997d9d-dpxn8" Oct 11 01:12:20 crc kubenswrapper[4743]: I1011 01:12:20.929774 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8f249d47-90a4-4fc9-8bb8-e61bc0143ae7-credential-keys\") pod \"keystone-5bd8997d9d-dpxn8\" (UID: \"8f249d47-90a4-4fc9-8bb8-e61bc0143ae7\") " pod="openstack/keystone-5bd8997d9d-dpxn8" Oct 11 01:12:20 crc kubenswrapper[4743]: I1011 01:12:20.929817 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grl6f\" (UniqueName: \"kubernetes.io/projected/8f249d47-90a4-4fc9-8bb8-e61bc0143ae7-kube-api-access-grl6f\") pod \"keystone-5bd8997d9d-dpxn8\" (UID: \"8f249d47-90a4-4fc9-8bb8-e61bc0143ae7\") " pod="openstack/keystone-5bd8997d9d-dpxn8" Oct 11 01:12:20 crc kubenswrapper[4743]: I1011 01:12:20.929870 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f249d47-90a4-4fc9-8bb8-e61bc0143ae7-scripts\") pod \"keystone-5bd8997d9d-dpxn8\" (UID: \"8f249d47-90a4-4fc9-8bb8-e61bc0143ae7\") " pod="openstack/keystone-5bd8997d9d-dpxn8" Oct 11 01:12:20 crc kubenswrapper[4743]: I1011 01:12:20.934062 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8f249d47-90a4-4fc9-8bb8-e61bc0143ae7-credential-keys\") pod \"keystone-5bd8997d9d-dpxn8\" (UID: \"8f249d47-90a4-4fc9-8bb8-e61bc0143ae7\") " pod="openstack/keystone-5bd8997d9d-dpxn8" Oct 11 01:12:20 crc kubenswrapper[4743]: I1011 
01:12:20.934236 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f249d47-90a4-4fc9-8bb8-e61bc0143ae7-scripts\") pod \"keystone-5bd8997d9d-dpxn8\" (UID: \"8f249d47-90a4-4fc9-8bb8-e61bc0143ae7\") " pod="openstack/keystone-5bd8997d9d-dpxn8" Oct 11 01:12:20 crc kubenswrapper[4743]: I1011 01:12:20.935820 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f249d47-90a4-4fc9-8bb8-e61bc0143ae7-config-data\") pod \"keystone-5bd8997d9d-dpxn8\" (UID: \"8f249d47-90a4-4fc9-8bb8-e61bc0143ae7\") " pod="openstack/keystone-5bd8997d9d-dpxn8" Oct 11 01:12:20 crc kubenswrapper[4743]: I1011 01:12:20.936083 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f249d47-90a4-4fc9-8bb8-e61bc0143ae7-public-tls-certs\") pod \"keystone-5bd8997d9d-dpxn8\" (UID: \"8f249d47-90a4-4fc9-8bb8-e61bc0143ae7\") " pod="openstack/keystone-5bd8997d9d-dpxn8" Oct 11 01:12:20 crc kubenswrapper[4743]: I1011 01:12:20.936407 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f249d47-90a4-4fc9-8bb8-e61bc0143ae7-combined-ca-bundle\") pod \"keystone-5bd8997d9d-dpxn8\" (UID: \"8f249d47-90a4-4fc9-8bb8-e61bc0143ae7\") " pod="openstack/keystone-5bd8997d9d-dpxn8" Oct 11 01:12:20 crc kubenswrapper[4743]: I1011 01:12:20.937404 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f249d47-90a4-4fc9-8bb8-e61bc0143ae7-internal-tls-certs\") pod \"keystone-5bd8997d9d-dpxn8\" (UID: \"8f249d47-90a4-4fc9-8bb8-e61bc0143ae7\") " pod="openstack/keystone-5bd8997d9d-dpxn8" Oct 11 01:12:20 crc kubenswrapper[4743]: I1011 01:12:20.937685 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/8f249d47-90a4-4fc9-8bb8-e61bc0143ae7-fernet-keys\") pod \"keystone-5bd8997d9d-dpxn8\" (UID: \"8f249d47-90a4-4fc9-8bb8-e61bc0143ae7\") " pod="openstack/keystone-5bd8997d9d-dpxn8" Oct 11 01:12:20 crc kubenswrapper[4743]: I1011 01:12:20.950476 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grl6f\" (UniqueName: \"kubernetes.io/projected/8f249d47-90a4-4fc9-8bb8-e61bc0143ae7-kube-api-access-grl6f\") pod \"keystone-5bd8997d9d-dpxn8\" (UID: \"8f249d47-90a4-4fc9-8bb8-e61bc0143ae7\") " pod="openstack/keystone-5bd8997d9d-dpxn8" Oct 11 01:12:20 crc kubenswrapper[4743]: I1011 01:12:20.987622 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5bd8997d9d-dpxn8" Oct 11 01:12:21 crc kubenswrapper[4743]: E1011 01:12:21.329312 4743 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Oct 11 01:12:21 crc kubenswrapper[4743]: E1011 01:12:21.329968 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dlw9m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-przwv_openstack(65f773e4-09e5-4312-8c6b-9176a1a022f0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 11 01:12:21 crc kubenswrapper[4743]: E1011 01:12:21.331154 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-przwv" podUID="65f773e4-09e5-4312-8c6b-9176a1a022f0" Oct 11 01:12:21 crc kubenswrapper[4743]: I1011 01:12:21.440007 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-s49bx" Oct 11 01:12:21 crc kubenswrapper[4743]: I1011 01:12:21.546105 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x89p4\" (UniqueName: \"kubernetes.io/projected/2f85e310-6acf-40d8-b07d-9e1c9b4d997b-kube-api-access-x89p4\") pod \"2f85e310-6acf-40d8-b07d-9e1c9b4d997b\" (UID: \"2f85e310-6acf-40d8-b07d-9e1c9b4d997b\") " Oct 11 01:12:21 crc kubenswrapper[4743]: I1011 01:12:21.546273 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2f85e310-6acf-40d8-b07d-9e1c9b4d997b-config\") pod \"2f85e310-6acf-40d8-b07d-9e1c9b4d997b\" (UID: \"2f85e310-6acf-40d8-b07d-9e1c9b4d997b\") " Oct 11 01:12:21 crc kubenswrapper[4743]: I1011 01:12:21.546376 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f85e310-6acf-40d8-b07d-9e1c9b4d997b-combined-ca-bundle\") pod \"2f85e310-6acf-40d8-b07d-9e1c9b4d997b\" (UID: \"2f85e310-6acf-40d8-b07d-9e1c9b4d997b\") " Oct 11 
01:12:21 crc kubenswrapper[4743]: I1011 01:12:21.569385 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f85e310-6acf-40d8-b07d-9e1c9b4d997b-kube-api-access-x89p4" (OuterVolumeSpecName: "kube-api-access-x89p4") pod "2f85e310-6acf-40d8-b07d-9e1c9b4d997b" (UID: "2f85e310-6acf-40d8-b07d-9e1c9b4d997b"). InnerVolumeSpecName "kube-api-access-x89p4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:12:21 crc kubenswrapper[4743]: I1011 01:12:21.587540 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f85e310-6acf-40d8-b07d-9e1c9b4d997b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2f85e310-6acf-40d8-b07d-9e1c9b4d997b" (UID: "2f85e310-6acf-40d8-b07d-9e1c9b4d997b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:12:21 crc kubenswrapper[4743]: I1011 01:12:21.597961 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f85e310-6acf-40d8-b07d-9e1c9b4d997b-config" (OuterVolumeSpecName: "config") pod "2f85e310-6acf-40d8-b07d-9e1c9b4d997b" (UID: "2f85e310-6acf-40d8-b07d-9e1c9b4d997b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:12:21 crc kubenswrapper[4743]: I1011 01:12:21.649659 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x89p4\" (UniqueName: \"kubernetes.io/projected/2f85e310-6acf-40d8-b07d-9e1c9b4d997b-kube-api-access-x89p4\") on node \"crc\" DevicePath \"\"" Oct 11 01:12:21 crc kubenswrapper[4743]: I1011 01:12:21.649692 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/2f85e310-6acf-40d8-b07d-9e1c9b4d997b-config\") on node \"crc\" DevicePath \"\"" Oct 11 01:12:21 crc kubenswrapper[4743]: I1011 01:12:21.649702 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f85e310-6acf-40d8-b07d-9e1c9b4d997b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 01:12:21 crc kubenswrapper[4743]: I1011 01:12:21.858798 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5bd8997d9d-dpxn8"] Oct 11 01:12:22 crc kubenswrapper[4743]: I1011 01:12:22.266721 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-979bff964-bxbgb" event={"ID":"8f846c8e-d28a-4a2e-a5b6-bfc739de275b","Type":"ContainerStarted","Data":"30445e453bf532074710cdf19b68179e0c9d3fa12b79158cc8faa19f3e879fa4"} Oct 11 01:12:22 crc kubenswrapper[4743]: I1011 01:12:22.266960 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-979bff964-bxbgb" event={"ID":"8f846c8e-d28a-4a2e-a5b6-bfc739de275b","Type":"ContainerStarted","Data":"1f842279e80ae654d04c317b2a3cfeaad4b493858cc0b0f9fb9bd36c65c120e1"} Oct 11 01:12:22 crc kubenswrapper[4743]: I1011 01:12:22.267969 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-979bff964-bxbgb" Oct 11 01:12:22 crc kubenswrapper[4743]: I1011 01:12:22.268064 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-979bff964-bxbgb" Oct 11 
01:12:22 crc kubenswrapper[4743]: I1011 01:12:22.269738 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-f4rpt" event={"ID":"476a4c6e-ddae-4974-a899-78a8f1ee973d","Type":"ContainerStarted","Data":"acca0d906d588645941325e81b8e7c694904bf1ff868d3295899384509bd4526"} Oct 11 01:12:22 crc kubenswrapper[4743]: I1011 01:12:22.271485 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5bd8997d9d-dpxn8" event={"ID":"8f249d47-90a4-4fc9-8bb8-e61bc0143ae7","Type":"ContainerStarted","Data":"405d849c454cec87e2ad320ae15381ac9b6c76285f773f310de4a2d101f85238"} Oct 11 01:12:22 crc kubenswrapper[4743]: I1011 01:12:22.271523 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5bd8997d9d-dpxn8" event={"ID":"8f249d47-90a4-4fc9-8bb8-e61bc0143ae7","Type":"ContainerStarted","Data":"2bdb8b5c8805529e54dde9fd4d0460927f30b42b793fe4c3075b94a8e50f7036"} Oct 11 01:12:22 crc kubenswrapper[4743]: I1011 01:12:22.271581 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-5bd8997d9d-dpxn8" Oct 11 01:12:22 crc kubenswrapper[4743]: I1011 01:12:22.274068 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-s49bx" Oct 11 01:12:22 crc kubenswrapper[4743]: I1011 01:12:22.274068 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-s49bx" event={"ID":"2f85e310-6acf-40d8-b07d-9e1c9b4d997b","Type":"ContainerDied","Data":"3b830f93be95d366b5539383f3016e9186805775008cd18356a4760de5b2c9e7"} Oct 11 01:12:22 crc kubenswrapper[4743]: I1011 01:12:22.274108 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b830f93be95d366b5539383f3016e9186805775008cd18356a4760de5b2c9e7" Oct 11 01:12:22 crc kubenswrapper[4743]: I1011 01:12:22.276540 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"897a231b-400d-4d3b-b6eb-8bcee1ad5d1d","Type":"ContainerStarted","Data":"a0c401940204f28e5b9ce6a4c314a41e9043ee912db8b9d4a80b86a9f2610cee"} Oct 11 01:12:22 crc kubenswrapper[4743]: E1011 01:12:22.277672 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-przwv" podUID="65f773e4-09e5-4312-8c6b-9176a1a022f0" Oct 11 01:12:22 crc kubenswrapper[4743]: I1011 01:12:22.319327 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-f4rpt" podStartSLOduration=2.731292523 podStartE2EDuration="24.319309666s" podCreationTimestamp="2025-10-11 01:11:58 +0000 UTC" firstStartedPulling="2025-10-11 01:11:59.730211125 +0000 UTC m=+1214.383191522" lastFinishedPulling="2025-10-11 01:12:21.318228258 +0000 UTC m=+1235.971208665" observedRunningTime="2025-10-11 01:12:22.313569214 +0000 UTC m=+1236.966549611" watchObservedRunningTime="2025-10-11 01:12:22.319309666 +0000 UTC m=+1236.972290063" Oct 11 01:12:22 crc kubenswrapper[4743]: I1011 01:12:22.321453 4743 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/placement-979bff964-bxbgb" podStartSLOduration=20.321444405 podStartE2EDuration="20.321444405s" podCreationTimestamp="2025-10-11 01:12:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 01:12:22.295235891 +0000 UTC m=+1236.948216288" watchObservedRunningTime="2025-10-11 01:12:22.321444405 +0000 UTC m=+1236.974424802" Oct 11 01:12:22 crc kubenswrapper[4743]: I1011 01:12:22.346514 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-5bd8997d9d-dpxn8" podStartSLOduration=2.346497612 podStartE2EDuration="2.346497612s" podCreationTimestamp="2025-10-11 01:12:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 01:12:22.343200716 +0000 UTC m=+1236.996181113" watchObservedRunningTime="2025-10-11 01:12:22.346497612 +0000 UTC m=+1236.999478029" Oct 11 01:12:22 crc kubenswrapper[4743]: I1011 01:12:22.701938 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-55n9s"] Oct 11 01:12:22 crc kubenswrapper[4743]: E1011 01:12:22.702309 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f85e310-6acf-40d8-b07d-9e1c9b4d997b" containerName="neutron-db-sync" Oct 11 01:12:22 crc kubenswrapper[4743]: I1011 01:12:22.702324 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f85e310-6acf-40d8-b07d-9e1c9b4d997b" containerName="neutron-db-sync" Oct 11 01:12:22 crc kubenswrapper[4743]: I1011 01:12:22.702488 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f85e310-6acf-40d8-b07d-9e1c9b4d997b" containerName="neutron-db-sync" Oct 11 01:12:22 crc kubenswrapper[4743]: I1011 01:12:22.703435 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-55n9s" Oct 11 01:12:22 crc kubenswrapper[4743]: I1011 01:12:22.722898 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-55n9s"] Oct 11 01:12:22 crc kubenswrapper[4743]: I1011 01:12:22.788313 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7dfd9c7974-zppf2"] Oct 11 01:12:22 crc kubenswrapper[4743]: I1011 01:12:22.789784 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7dfd9c7974-zppf2" Oct 11 01:12:22 crc kubenswrapper[4743]: I1011 01:12:22.793530 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 11 01:12:22 crc kubenswrapper[4743]: I1011 01:12:22.793801 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Oct 11 01:12:22 crc kubenswrapper[4743]: I1011 01:12:22.794082 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 11 01:12:22 crc kubenswrapper[4743]: I1011 01:12:22.794212 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-bgp4n" Oct 11 01:12:22 crc kubenswrapper[4743]: I1011 01:12:22.821886 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7dfd9c7974-zppf2"] Oct 11 01:12:22 crc kubenswrapper[4743]: I1011 01:12:22.874999 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ba69e70e-4af2-485e-9566-ec04e2a71a12-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-55n9s\" (UID: \"ba69e70e-4af2-485e-9566-ec04e2a71a12\") " pod="openstack/dnsmasq-dns-6b7b667979-55n9s" Oct 11 01:12:22 crc kubenswrapper[4743]: I1011 01:12:22.875091 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/ba69e70e-4af2-485e-9566-ec04e2a71a12-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-55n9s\" (UID: \"ba69e70e-4af2-485e-9566-ec04e2a71a12\") " pod="openstack/dnsmasq-dns-6b7b667979-55n9s" Oct 11 01:12:22 crc kubenswrapper[4743]: I1011 01:12:22.875167 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba69e70e-4af2-485e-9566-ec04e2a71a12-config\") pod \"dnsmasq-dns-6b7b667979-55n9s\" (UID: \"ba69e70e-4af2-485e-9566-ec04e2a71a12\") " pod="openstack/dnsmasq-dns-6b7b667979-55n9s" Oct 11 01:12:22 crc kubenswrapper[4743]: I1011 01:12:22.875416 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mq9fj\" (UniqueName: \"kubernetes.io/projected/ba69e70e-4af2-485e-9566-ec04e2a71a12-kube-api-access-mq9fj\") pod \"dnsmasq-dns-6b7b667979-55n9s\" (UID: \"ba69e70e-4af2-485e-9566-ec04e2a71a12\") " pod="openstack/dnsmasq-dns-6b7b667979-55n9s" Oct 11 01:12:22 crc kubenswrapper[4743]: I1011 01:12:22.875528 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba69e70e-4af2-485e-9566-ec04e2a71a12-dns-svc\") pod \"dnsmasq-dns-6b7b667979-55n9s\" (UID: \"ba69e70e-4af2-485e-9566-ec04e2a71a12\") " pod="openstack/dnsmasq-dns-6b7b667979-55n9s" Oct 11 01:12:22 crc kubenswrapper[4743]: I1011 01:12:22.875615 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ba69e70e-4af2-485e-9566-ec04e2a71a12-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-55n9s\" (UID: \"ba69e70e-4af2-485e-9566-ec04e2a71a12\") " pod="openstack/dnsmasq-dns-6b7b667979-55n9s" Oct 11 01:12:22 crc kubenswrapper[4743]: I1011 01:12:22.977479 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/ba69e70e-4af2-485e-9566-ec04e2a71a12-config\") pod \"dnsmasq-dns-6b7b667979-55n9s\" (UID: \"ba69e70e-4af2-485e-9566-ec04e2a71a12\") " pod="openstack/dnsmasq-dns-6b7b667979-55n9s" Oct 11 01:12:22 crc kubenswrapper[4743]: I1011 01:12:22.977547 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/54eafe9f-024f-4d60-917b-6e867458632d-ovndb-tls-certs\") pod \"neutron-7dfd9c7974-zppf2\" (UID: \"54eafe9f-024f-4d60-917b-6e867458632d\") " pod="openstack/neutron-7dfd9c7974-zppf2" Oct 11 01:12:22 crc kubenswrapper[4743]: I1011 01:12:22.977589 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/54eafe9f-024f-4d60-917b-6e867458632d-httpd-config\") pod \"neutron-7dfd9c7974-zppf2\" (UID: \"54eafe9f-024f-4d60-917b-6e867458632d\") " pod="openstack/neutron-7dfd9c7974-zppf2" Oct 11 01:12:22 crc kubenswrapper[4743]: I1011 01:12:22.977635 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mq9fj\" (UniqueName: \"kubernetes.io/projected/ba69e70e-4af2-485e-9566-ec04e2a71a12-kube-api-access-mq9fj\") pod \"dnsmasq-dns-6b7b667979-55n9s\" (UID: \"ba69e70e-4af2-485e-9566-ec04e2a71a12\") " pod="openstack/dnsmasq-dns-6b7b667979-55n9s" Oct 11 01:12:22 crc kubenswrapper[4743]: I1011 01:12:22.977659 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/54eafe9f-024f-4d60-917b-6e867458632d-config\") pod \"neutron-7dfd9c7974-zppf2\" (UID: \"54eafe9f-024f-4d60-917b-6e867458632d\") " pod="openstack/neutron-7dfd9c7974-zppf2" Oct 11 01:12:22 crc kubenswrapper[4743]: I1011 01:12:22.977686 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/ba69e70e-4af2-485e-9566-ec04e2a71a12-dns-svc\") pod \"dnsmasq-dns-6b7b667979-55n9s\" (UID: \"ba69e70e-4af2-485e-9566-ec04e2a71a12\") " pod="openstack/dnsmasq-dns-6b7b667979-55n9s" Oct 11 01:12:22 crc kubenswrapper[4743]: I1011 01:12:22.977712 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54eafe9f-024f-4d60-917b-6e867458632d-combined-ca-bundle\") pod \"neutron-7dfd9c7974-zppf2\" (UID: \"54eafe9f-024f-4d60-917b-6e867458632d\") " pod="openstack/neutron-7dfd9c7974-zppf2" Oct 11 01:12:22 crc kubenswrapper[4743]: I1011 01:12:22.977736 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ba69e70e-4af2-485e-9566-ec04e2a71a12-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-55n9s\" (UID: \"ba69e70e-4af2-485e-9566-ec04e2a71a12\") " pod="openstack/dnsmasq-dns-6b7b667979-55n9s" Oct 11 01:12:22 crc kubenswrapper[4743]: I1011 01:12:22.977763 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ba69e70e-4af2-485e-9566-ec04e2a71a12-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-55n9s\" (UID: \"ba69e70e-4af2-485e-9566-ec04e2a71a12\") " pod="openstack/dnsmasq-dns-6b7b667979-55n9s" Oct 11 01:12:22 crc kubenswrapper[4743]: I1011 01:12:22.977801 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbvzf\" (UniqueName: \"kubernetes.io/projected/54eafe9f-024f-4d60-917b-6e867458632d-kube-api-access-qbvzf\") pod \"neutron-7dfd9c7974-zppf2\" (UID: \"54eafe9f-024f-4d60-917b-6e867458632d\") " pod="openstack/neutron-7dfd9c7974-zppf2" Oct 11 01:12:22 crc kubenswrapper[4743]: I1011 01:12:22.977823 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/ba69e70e-4af2-485e-9566-ec04e2a71a12-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-55n9s\" (UID: \"ba69e70e-4af2-485e-9566-ec04e2a71a12\") " pod="openstack/dnsmasq-dns-6b7b667979-55n9s" Oct 11 01:12:22 crc kubenswrapper[4743]: I1011 01:12:22.978974 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ba69e70e-4af2-485e-9566-ec04e2a71a12-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-55n9s\" (UID: \"ba69e70e-4af2-485e-9566-ec04e2a71a12\") " pod="openstack/dnsmasq-dns-6b7b667979-55n9s" Oct 11 01:12:22 crc kubenswrapper[4743]: I1011 01:12:22.978999 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba69e70e-4af2-485e-9566-ec04e2a71a12-dns-svc\") pod \"dnsmasq-dns-6b7b667979-55n9s\" (UID: \"ba69e70e-4af2-485e-9566-ec04e2a71a12\") " pod="openstack/dnsmasq-dns-6b7b667979-55n9s" Oct 11 01:12:22 crc kubenswrapper[4743]: I1011 01:12:22.979025 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba69e70e-4af2-485e-9566-ec04e2a71a12-config\") pod \"dnsmasq-dns-6b7b667979-55n9s\" (UID: \"ba69e70e-4af2-485e-9566-ec04e2a71a12\") " pod="openstack/dnsmasq-dns-6b7b667979-55n9s" Oct 11 01:12:22 crc kubenswrapper[4743]: I1011 01:12:22.979418 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ba69e70e-4af2-485e-9566-ec04e2a71a12-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-55n9s\" (UID: \"ba69e70e-4af2-485e-9566-ec04e2a71a12\") " pod="openstack/dnsmasq-dns-6b7b667979-55n9s" Oct 11 01:12:22 crc kubenswrapper[4743]: I1011 01:12:22.979893 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ba69e70e-4af2-485e-9566-ec04e2a71a12-ovsdbserver-sb\") pod 
\"dnsmasq-dns-6b7b667979-55n9s\" (UID: \"ba69e70e-4af2-485e-9566-ec04e2a71a12\") " pod="openstack/dnsmasq-dns-6b7b667979-55n9s" Oct 11 01:12:22 crc kubenswrapper[4743]: I1011 01:12:22.993868 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mq9fj\" (UniqueName: \"kubernetes.io/projected/ba69e70e-4af2-485e-9566-ec04e2a71a12-kube-api-access-mq9fj\") pod \"dnsmasq-dns-6b7b667979-55n9s\" (UID: \"ba69e70e-4af2-485e-9566-ec04e2a71a12\") " pod="openstack/dnsmasq-dns-6b7b667979-55n9s" Oct 11 01:12:23 crc kubenswrapper[4743]: I1011 01:12:23.026885 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-55n9s" Oct 11 01:12:23 crc kubenswrapper[4743]: I1011 01:12:23.079809 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/54eafe9f-024f-4d60-917b-6e867458632d-config\") pod \"neutron-7dfd9c7974-zppf2\" (UID: \"54eafe9f-024f-4d60-917b-6e867458632d\") " pod="openstack/neutron-7dfd9c7974-zppf2" Oct 11 01:12:23 crc kubenswrapper[4743]: I1011 01:12:23.080188 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54eafe9f-024f-4d60-917b-6e867458632d-combined-ca-bundle\") pod \"neutron-7dfd9c7974-zppf2\" (UID: \"54eafe9f-024f-4d60-917b-6e867458632d\") " pod="openstack/neutron-7dfd9c7974-zppf2" Oct 11 01:12:23 crc kubenswrapper[4743]: I1011 01:12:23.080252 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbvzf\" (UniqueName: \"kubernetes.io/projected/54eafe9f-024f-4d60-917b-6e867458632d-kube-api-access-qbvzf\") pod \"neutron-7dfd9c7974-zppf2\" (UID: \"54eafe9f-024f-4d60-917b-6e867458632d\") " pod="openstack/neutron-7dfd9c7974-zppf2" Oct 11 01:12:23 crc kubenswrapper[4743]: I1011 01:12:23.080298 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/54eafe9f-024f-4d60-917b-6e867458632d-ovndb-tls-certs\") pod \"neutron-7dfd9c7974-zppf2\" (UID: \"54eafe9f-024f-4d60-917b-6e867458632d\") " pod="openstack/neutron-7dfd9c7974-zppf2" Oct 11 01:12:23 crc kubenswrapper[4743]: I1011 01:12:23.080337 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/54eafe9f-024f-4d60-917b-6e867458632d-httpd-config\") pod \"neutron-7dfd9c7974-zppf2\" (UID: \"54eafe9f-024f-4d60-917b-6e867458632d\") " pod="openstack/neutron-7dfd9c7974-zppf2" Oct 11 01:12:23 crc kubenswrapper[4743]: I1011 01:12:23.088603 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/54eafe9f-024f-4d60-917b-6e867458632d-ovndb-tls-certs\") pod \"neutron-7dfd9c7974-zppf2\" (UID: \"54eafe9f-024f-4d60-917b-6e867458632d\") " pod="openstack/neutron-7dfd9c7974-zppf2" Oct 11 01:12:23 crc kubenswrapper[4743]: I1011 01:12:23.089110 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54eafe9f-024f-4d60-917b-6e867458632d-combined-ca-bundle\") pod \"neutron-7dfd9c7974-zppf2\" (UID: \"54eafe9f-024f-4d60-917b-6e867458632d\") " pod="openstack/neutron-7dfd9c7974-zppf2" Oct 11 01:12:23 crc kubenswrapper[4743]: I1011 01:12:23.089968 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/54eafe9f-024f-4d60-917b-6e867458632d-config\") pod \"neutron-7dfd9c7974-zppf2\" (UID: \"54eafe9f-024f-4d60-917b-6e867458632d\") " pod="openstack/neutron-7dfd9c7974-zppf2" Oct 11 01:12:23 crc kubenswrapper[4743]: I1011 01:12:23.092585 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/54eafe9f-024f-4d60-917b-6e867458632d-httpd-config\") pod \"neutron-7dfd9c7974-zppf2\" (UID: 
\"54eafe9f-024f-4d60-917b-6e867458632d\") " pod="openstack/neutron-7dfd9c7974-zppf2" Oct 11 01:12:23 crc kubenswrapper[4743]: I1011 01:12:23.101359 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbvzf\" (UniqueName: \"kubernetes.io/projected/54eafe9f-024f-4d60-917b-6e867458632d-kube-api-access-qbvzf\") pod \"neutron-7dfd9c7974-zppf2\" (UID: \"54eafe9f-024f-4d60-917b-6e867458632d\") " pod="openstack/neutron-7dfd9c7974-zppf2" Oct 11 01:12:23 crc kubenswrapper[4743]: I1011 01:12:23.119197 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7dfd9c7974-zppf2" Oct 11 01:12:23 crc kubenswrapper[4743]: I1011 01:12:23.479540 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-55n9s"] Oct 11 01:12:23 crc kubenswrapper[4743]: W1011 01:12:23.481128 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba69e70e_4af2_485e_9566_ec04e2a71a12.slice/crio-d109268df20299a9476693016b217bb32fa8f5d8b15fb7771f32289003619271 WatchSource:0}: Error finding container d109268df20299a9476693016b217bb32fa8f5d8b15fb7771f32289003619271: Status 404 returned error can't find the container with id d109268df20299a9476693016b217bb32fa8f5d8b15fb7771f32289003619271 Oct 11 01:12:23 crc kubenswrapper[4743]: I1011 01:12:23.677191 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7dfd9c7974-zppf2"] Oct 11 01:12:23 crc kubenswrapper[4743]: W1011 01:12:23.695213 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod54eafe9f_024f_4d60_917b_6e867458632d.slice/crio-f3ba5a3847eade89ce93855d754f6890a97719696b02cf1445a1dac50e1d4fa3 WatchSource:0}: Error finding container f3ba5a3847eade89ce93855d754f6890a97719696b02cf1445a1dac50e1d4fa3: Status 404 returned error can't find the container with id 
f3ba5a3847eade89ce93855d754f6890a97719696b02cf1445a1dac50e1d4fa3 Oct 11 01:12:24 crc kubenswrapper[4743]: I1011 01:12:24.298677 4743 generic.go:334] "Generic (PLEG): container finished" podID="ba69e70e-4af2-485e-9566-ec04e2a71a12" containerID="953514ea9d56bd3c8b6498d971973cd9187040f217c98c7d08f4c23765125a71" exitCode=0 Oct 11 01:12:24 crc kubenswrapper[4743]: I1011 01:12:24.298978 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-55n9s" event={"ID":"ba69e70e-4af2-485e-9566-ec04e2a71a12","Type":"ContainerDied","Data":"953514ea9d56bd3c8b6498d971973cd9187040f217c98c7d08f4c23765125a71"} Oct 11 01:12:24 crc kubenswrapper[4743]: I1011 01:12:24.299001 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-55n9s" event={"ID":"ba69e70e-4af2-485e-9566-ec04e2a71a12","Type":"ContainerStarted","Data":"d109268df20299a9476693016b217bb32fa8f5d8b15fb7771f32289003619271"} Oct 11 01:12:24 crc kubenswrapper[4743]: I1011 01:12:24.303945 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7dfd9c7974-zppf2" event={"ID":"54eafe9f-024f-4d60-917b-6e867458632d","Type":"ContainerStarted","Data":"f71b4cf1f0d1af682e06225191bf46c6508528b97a6d00b4bea132078cf70eca"} Oct 11 01:12:24 crc kubenswrapper[4743]: I1011 01:12:24.303982 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7dfd9c7974-zppf2" event={"ID":"54eafe9f-024f-4d60-917b-6e867458632d","Type":"ContainerStarted","Data":"5f210a41596dfcc8d1509da34eba889691b3ccabdb5ae5b9a810960d3018b7f8"} Oct 11 01:12:24 crc kubenswrapper[4743]: I1011 01:12:24.303992 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7dfd9c7974-zppf2" event={"ID":"54eafe9f-024f-4d60-917b-6e867458632d","Type":"ContainerStarted","Data":"f3ba5a3847eade89ce93855d754f6890a97719696b02cf1445a1dac50e1d4fa3"} Oct 11 01:12:24 crc kubenswrapper[4743]: I1011 01:12:24.304035 4743 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/neutron-7dfd9c7974-zppf2" Oct 11 01:12:24 crc kubenswrapper[4743]: I1011 01:12:24.349541 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7dfd9c7974-zppf2" podStartSLOduration=2.349522607 podStartE2EDuration="2.349522607s" podCreationTimestamp="2025-10-11 01:12:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 01:12:24.339207519 +0000 UTC m=+1238.992187926" watchObservedRunningTime="2025-10-11 01:12:24.349522607 +0000 UTC m=+1239.002503004" Oct 11 01:12:25 crc kubenswrapper[4743]: I1011 01:12:25.138333 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5496cd5f5c-c9jx6"] Oct 11 01:12:25 crc kubenswrapper[4743]: I1011 01:12:25.140889 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5496cd5f5c-c9jx6" Oct 11 01:12:25 crc kubenswrapper[4743]: I1011 01:12:25.143323 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Oct 11 01:12:25 crc kubenswrapper[4743]: I1011 01:12:25.145627 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Oct 11 01:12:25 crc kubenswrapper[4743]: I1011 01:12:25.153888 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5496cd5f5c-c9jx6"] Oct 11 01:12:25 crc kubenswrapper[4743]: I1011 01:12:25.312666 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-55n9s" event={"ID":"ba69e70e-4af2-485e-9566-ec04e2a71a12","Type":"ContainerStarted","Data":"df89795983d01f7c54609fcbf03ce7a51c26b7ec1aeae9f825eb3bf09a9318ac"} Oct 11 01:12:25 crc kubenswrapper[4743]: I1011 01:12:25.313254 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6b7b667979-55n9s" Oct 11 01:12:25 crc 
kubenswrapper[4743]: I1011 01:12:25.324140 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdc6c654-6370-4cc4-99a1-c13dfd402b14-combined-ca-bundle\") pod \"neutron-5496cd5f5c-c9jx6\" (UID: \"fdc6c654-6370-4cc4-99a1-c13dfd402b14\") " pod="openstack/neutron-5496cd5f5c-c9jx6" Oct 11 01:12:25 crc kubenswrapper[4743]: I1011 01:12:25.324234 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/fdc6c654-6370-4cc4-99a1-c13dfd402b14-httpd-config\") pod \"neutron-5496cd5f5c-c9jx6\" (UID: \"fdc6c654-6370-4cc4-99a1-c13dfd402b14\") " pod="openstack/neutron-5496cd5f5c-c9jx6" Oct 11 01:12:25 crc kubenswrapper[4743]: I1011 01:12:25.324493 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fdc6c654-6370-4cc4-99a1-c13dfd402b14-config\") pod \"neutron-5496cd5f5c-c9jx6\" (UID: \"fdc6c654-6370-4cc4-99a1-c13dfd402b14\") " pod="openstack/neutron-5496cd5f5c-c9jx6" Oct 11 01:12:25 crc kubenswrapper[4743]: I1011 01:12:25.324619 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fdc6c654-6370-4cc4-99a1-c13dfd402b14-internal-tls-certs\") pod \"neutron-5496cd5f5c-c9jx6\" (UID: \"fdc6c654-6370-4cc4-99a1-c13dfd402b14\") " pod="openstack/neutron-5496cd5f5c-c9jx6" Oct 11 01:12:25 crc kubenswrapper[4743]: I1011 01:12:25.324720 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6wtc\" (UniqueName: \"kubernetes.io/projected/fdc6c654-6370-4cc4-99a1-c13dfd402b14-kube-api-access-f6wtc\") pod \"neutron-5496cd5f5c-c9jx6\" (UID: \"fdc6c654-6370-4cc4-99a1-c13dfd402b14\") " pod="openstack/neutron-5496cd5f5c-c9jx6" Oct 11 01:12:25 crc 
kubenswrapper[4743]: I1011 01:12:25.324795 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fdc6c654-6370-4cc4-99a1-c13dfd402b14-ovndb-tls-certs\") pod \"neutron-5496cd5f5c-c9jx6\" (UID: \"fdc6c654-6370-4cc4-99a1-c13dfd402b14\") " pod="openstack/neutron-5496cd5f5c-c9jx6" Oct 11 01:12:25 crc kubenswrapper[4743]: I1011 01:12:25.324821 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fdc6c654-6370-4cc4-99a1-c13dfd402b14-public-tls-certs\") pod \"neutron-5496cd5f5c-c9jx6\" (UID: \"fdc6c654-6370-4cc4-99a1-c13dfd402b14\") " pod="openstack/neutron-5496cd5f5c-c9jx6" Oct 11 01:12:25 crc kubenswrapper[4743]: I1011 01:12:25.332047 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b7b667979-55n9s" podStartSLOduration=3.332027396 podStartE2EDuration="3.332027396s" podCreationTimestamp="2025-10-11 01:12:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 01:12:25.330514401 +0000 UTC m=+1239.983494798" watchObservedRunningTime="2025-10-11 01:12:25.332027396 +0000 UTC m=+1239.985007793" Oct 11 01:12:25 crc kubenswrapper[4743]: I1011 01:12:25.426160 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fdc6c654-6370-4cc4-99a1-c13dfd402b14-config\") pod \"neutron-5496cd5f5c-c9jx6\" (UID: \"fdc6c654-6370-4cc4-99a1-c13dfd402b14\") " pod="openstack/neutron-5496cd5f5c-c9jx6" Oct 11 01:12:25 crc kubenswrapper[4743]: I1011 01:12:25.426843 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fdc6c654-6370-4cc4-99a1-c13dfd402b14-internal-tls-certs\") pod 
\"neutron-5496cd5f5c-c9jx6\" (UID: \"fdc6c654-6370-4cc4-99a1-c13dfd402b14\") " pod="openstack/neutron-5496cd5f5c-c9jx6" Oct 11 01:12:25 crc kubenswrapper[4743]: I1011 01:12:25.427385 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6wtc\" (UniqueName: \"kubernetes.io/projected/fdc6c654-6370-4cc4-99a1-c13dfd402b14-kube-api-access-f6wtc\") pod \"neutron-5496cd5f5c-c9jx6\" (UID: \"fdc6c654-6370-4cc4-99a1-c13dfd402b14\") " pod="openstack/neutron-5496cd5f5c-c9jx6" Oct 11 01:12:25 crc kubenswrapper[4743]: I1011 01:12:25.427446 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fdc6c654-6370-4cc4-99a1-c13dfd402b14-ovndb-tls-certs\") pod \"neutron-5496cd5f5c-c9jx6\" (UID: \"fdc6c654-6370-4cc4-99a1-c13dfd402b14\") " pod="openstack/neutron-5496cd5f5c-c9jx6" Oct 11 01:12:25 crc kubenswrapper[4743]: I1011 01:12:25.427473 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fdc6c654-6370-4cc4-99a1-c13dfd402b14-public-tls-certs\") pod \"neutron-5496cd5f5c-c9jx6\" (UID: \"fdc6c654-6370-4cc4-99a1-c13dfd402b14\") " pod="openstack/neutron-5496cd5f5c-c9jx6" Oct 11 01:12:25 crc kubenswrapper[4743]: I1011 01:12:25.427535 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdc6c654-6370-4cc4-99a1-c13dfd402b14-combined-ca-bundle\") pod \"neutron-5496cd5f5c-c9jx6\" (UID: \"fdc6c654-6370-4cc4-99a1-c13dfd402b14\") " pod="openstack/neutron-5496cd5f5c-c9jx6" Oct 11 01:12:25 crc kubenswrapper[4743]: I1011 01:12:25.427564 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/fdc6c654-6370-4cc4-99a1-c13dfd402b14-httpd-config\") pod \"neutron-5496cd5f5c-c9jx6\" (UID: \"fdc6c654-6370-4cc4-99a1-c13dfd402b14\") " 
pod="openstack/neutron-5496cd5f5c-c9jx6" Oct 11 01:12:25 crc kubenswrapper[4743]: I1011 01:12:25.431484 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/fdc6c654-6370-4cc4-99a1-c13dfd402b14-config\") pod \"neutron-5496cd5f5c-c9jx6\" (UID: \"fdc6c654-6370-4cc4-99a1-c13dfd402b14\") " pod="openstack/neutron-5496cd5f5c-c9jx6" Oct 11 01:12:25 crc kubenswrapper[4743]: I1011 01:12:25.431832 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fdc6c654-6370-4cc4-99a1-c13dfd402b14-ovndb-tls-certs\") pod \"neutron-5496cd5f5c-c9jx6\" (UID: \"fdc6c654-6370-4cc4-99a1-c13dfd402b14\") " pod="openstack/neutron-5496cd5f5c-c9jx6" Oct 11 01:12:25 crc kubenswrapper[4743]: I1011 01:12:25.432263 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fdc6c654-6370-4cc4-99a1-c13dfd402b14-public-tls-certs\") pod \"neutron-5496cd5f5c-c9jx6\" (UID: \"fdc6c654-6370-4cc4-99a1-c13dfd402b14\") " pod="openstack/neutron-5496cd5f5c-c9jx6" Oct 11 01:12:25 crc kubenswrapper[4743]: I1011 01:12:25.434828 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fdc6c654-6370-4cc4-99a1-c13dfd402b14-internal-tls-certs\") pod \"neutron-5496cd5f5c-c9jx6\" (UID: \"fdc6c654-6370-4cc4-99a1-c13dfd402b14\") " pod="openstack/neutron-5496cd5f5c-c9jx6" Oct 11 01:12:25 crc kubenswrapper[4743]: I1011 01:12:25.436064 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdc6c654-6370-4cc4-99a1-c13dfd402b14-combined-ca-bundle\") pod \"neutron-5496cd5f5c-c9jx6\" (UID: \"fdc6c654-6370-4cc4-99a1-c13dfd402b14\") " pod="openstack/neutron-5496cd5f5c-c9jx6" Oct 11 01:12:25 crc kubenswrapper[4743]: I1011 01:12:25.439135 4743 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/fdc6c654-6370-4cc4-99a1-c13dfd402b14-httpd-config\") pod \"neutron-5496cd5f5c-c9jx6\" (UID: \"fdc6c654-6370-4cc4-99a1-c13dfd402b14\") " pod="openstack/neutron-5496cd5f5c-c9jx6" Oct 11 01:12:25 crc kubenswrapper[4743]: I1011 01:12:25.456265 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6wtc\" (UniqueName: \"kubernetes.io/projected/fdc6c654-6370-4cc4-99a1-c13dfd402b14-kube-api-access-f6wtc\") pod \"neutron-5496cd5f5c-c9jx6\" (UID: \"fdc6c654-6370-4cc4-99a1-c13dfd402b14\") " pod="openstack/neutron-5496cd5f5c-c9jx6" Oct 11 01:12:25 crc kubenswrapper[4743]: I1011 01:12:25.468740 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5496cd5f5c-c9jx6" Oct 11 01:12:25 crc kubenswrapper[4743]: I1011 01:12:25.980958 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5496cd5f5c-c9jx6"] Oct 11 01:12:29 crc kubenswrapper[4743]: I1011 01:12:29.353075 4743 generic.go:334] "Generic (PLEG): container finished" podID="476a4c6e-ddae-4974-a899-78a8f1ee973d" containerID="acca0d906d588645941325e81b8e7c694904bf1ff868d3295899384509bd4526" exitCode=0 Oct 11 01:12:29 crc kubenswrapper[4743]: I1011 01:12:29.353151 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-f4rpt" event={"ID":"476a4c6e-ddae-4974-a899-78a8f1ee973d","Type":"ContainerDied","Data":"acca0d906d588645941325e81b8e7c694904bf1ff868d3295899384509bd4526"} Oct 11 01:12:29 crc kubenswrapper[4743]: I1011 01:12:29.356576 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5496cd5f5c-c9jx6" event={"ID":"fdc6c654-6370-4cc4-99a1-c13dfd402b14","Type":"ContainerStarted","Data":"d35dd274fdfb7b258695d45624559dac48af6c3ff60224df4bbed74fd8ec211f"} Oct 11 01:12:30 crc kubenswrapper[4743]: I1011 01:12:30.369702 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"897a231b-400d-4d3b-b6eb-8bcee1ad5d1d","Type":"ContainerStarted","Data":"2cd44e542b88d959e7c7eb1910b04761f93430d968b425410133aebfc269c77c"} Oct 11 01:12:30 crc kubenswrapper[4743]: I1011 01:12:30.369763 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="897a231b-400d-4d3b-b6eb-8bcee1ad5d1d" containerName="ceilometer-central-agent" containerID="cri-o://5e31815cc6bb002a54d9b48fcc0bd3a79d54a858afc2f6c912547a5573fc65ba" gracePeriod=30 Oct 11 01:12:30 crc kubenswrapper[4743]: I1011 01:12:30.369793 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="897a231b-400d-4d3b-b6eb-8bcee1ad5d1d" containerName="sg-core" containerID="cri-o://a0c401940204f28e5b9ce6a4c314a41e9043ee912db8b9d4a80b86a9f2610cee" gracePeriod=30 Oct 11 01:12:30 crc kubenswrapper[4743]: I1011 01:12:30.369804 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="897a231b-400d-4d3b-b6eb-8bcee1ad5d1d" containerName="ceilometer-notification-agent" containerID="cri-o://e086d9b7f67225d474a6748ad81b767991a3103ef0ccdcd7b0f7231e886e45f0" gracePeriod=30 Oct 11 01:12:30 crc kubenswrapper[4743]: I1011 01:12:30.369829 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="897a231b-400d-4d3b-b6eb-8bcee1ad5d1d" containerName="proxy-httpd" containerID="cri-o://2cd44e542b88d959e7c7eb1910b04761f93430d968b425410133aebfc269c77c" gracePeriod=30 Oct 11 01:12:30 crc kubenswrapper[4743]: I1011 01:12:30.371069 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 11 01:12:30 crc kubenswrapper[4743]: I1011 01:12:30.384966 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5496cd5f5c-c9jx6" 
event={"ID":"fdc6c654-6370-4cc4-99a1-c13dfd402b14","Type":"ContainerStarted","Data":"046b0858dad1491905216cb2b18dae27e52ff60e50b1fee9e6cf88b1457f2a7d"} Oct 11 01:12:30 crc kubenswrapper[4743]: I1011 01:12:30.410084 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.7261469969999998 podStartE2EDuration="44.410051245s" podCreationTimestamp="2025-10-11 01:11:46 +0000 UTC" firstStartedPulling="2025-10-11 01:11:47.222420395 +0000 UTC m=+1201.875400792" lastFinishedPulling="2025-10-11 01:12:29.906324633 +0000 UTC m=+1244.559305040" observedRunningTime="2025-10-11 01:12:30.398050249 +0000 UTC m=+1245.051030696" watchObservedRunningTime="2025-10-11 01:12:30.410051245 +0000 UTC m=+1245.063031682" Oct 11 01:12:30 crc kubenswrapper[4743]: I1011 01:12:30.878195 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-f4rpt" Oct 11 01:12:31 crc kubenswrapper[4743]: I1011 01:12:31.039361 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/476a4c6e-ddae-4974-a899-78a8f1ee973d-config-data\") pod \"476a4c6e-ddae-4974-a899-78a8f1ee973d\" (UID: \"476a4c6e-ddae-4974-a899-78a8f1ee973d\") " Oct 11 01:12:31 crc kubenswrapper[4743]: I1011 01:12:31.039451 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/476a4c6e-ddae-4974-a899-78a8f1ee973d-combined-ca-bundle\") pod \"476a4c6e-ddae-4974-a899-78a8f1ee973d\" (UID: \"476a4c6e-ddae-4974-a899-78a8f1ee973d\") " Oct 11 01:12:31 crc kubenswrapper[4743]: I1011 01:12:31.039559 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5bpr\" (UniqueName: \"kubernetes.io/projected/476a4c6e-ddae-4974-a899-78a8f1ee973d-kube-api-access-q5bpr\") pod \"476a4c6e-ddae-4974-a899-78a8f1ee973d\" (UID: 
\"476a4c6e-ddae-4974-a899-78a8f1ee973d\") " Oct 11 01:12:31 crc kubenswrapper[4743]: I1011 01:12:31.044198 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/476a4c6e-ddae-4974-a899-78a8f1ee973d-kube-api-access-q5bpr" (OuterVolumeSpecName: "kube-api-access-q5bpr") pod "476a4c6e-ddae-4974-a899-78a8f1ee973d" (UID: "476a4c6e-ddae-4974-a899-78a8f1ee973d"). InnerVolumeSpecName "kube-api-access-q5bpr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:12:31 crc kubenswrapper[4743]: I1011 01:12:31.064358 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/476a4c6e-ddae-4974-a899-78a8f1ee973d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "476a4c6e-ddae-4974-a899-78a8f1ee973d" (UID: "476a4c6e-ddae-4974-a899-78a8f1ee973d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:12:31 crc kubenswrapper[4743]: I1011 01:12:31.142143 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q5bpr\" (UniqueName: \"kubernetes.io/projected/476a4c6e-ddae-4974-a899-78a8f1ee973d-kube-api-access-q5bpr\") on node \"crc\" DevicePath \"\"" Oct 11 01:12:31 crc kubenswrapper[4743]: I1011 01:12:31.142434 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/476a4c6e-ddae-4974-a899-78a8f1ee973d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 01:12:31 crc kubenswrapper[4743]: I1011 01:12:31.144419 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/476a4c6e-ddae-4974-a899-78a8f1ee973d-config-data" (OuterVolumeSpecName: "config-data") pod "476a4c6e-ddae-4974-a899-78a8f1ee973d" (UID: "476a4c6e-ddae-4974-a899-78a8f1ee973d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:12:31 crc kubenswrapper[4743]: I1011 01:12:31.244900 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/476a4c6e-ddae-4974-a899-78a8f1ee973d-config-data\") on node \"crc\" DevicePath \"\"" Oct 11 01:12:31 crc kubenswrapper[4743]: I1011 01:12:31.431062 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5496cd5f5c-c9jx6" event={"ID":"fdc6c654-6370-4cc4-99a1-c13dfd402b14","Type":"ContainerStarted","Data":"0ee9ef0e4027d41bcf3fcce3f34247b45a1ac205d9e9e4477038ab98a41312d2"} Oct 11 01:12:31 crc kubenswrapper[4743]: I1011 01:12:31.431169 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5496cd5f5c-c9jx6" Oct 11 01:12:31 crc kubenswrapper[4743]: I1011 01:12:31.437729 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-f4rpt" event={"ID":"476a4c6e-ddae-4974-a899-78a8f1ee973d","Type":"ContainerDied","Data":"5989f7e1faa849964c8d05d5d494b306b13ce63c3028c3f076fe4f2376cfa2f2"} Oct 11 01:12:31 crc kubenswrapper[4743]: I1011 01:12:31.437773 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5989f7e1faa849964c8d05d5d494b306b13ce63c3028c3f076fe4f2376cfa2f2" Oct 11 01:12:31 crc kubenswrapper[4743]: I1011 01:12:31.437737 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-f4rpt" Oct 11 01:12:31 crc kubenswrapper[4743]: I1011 01:12:31.453711 4743 generic.go:334] "Generic (PLEG): container finished" podID="897a231b-400d-4d3b-b6eb-8bcee1ad5d1d" containerID="2cd44e542b88d959e7c7eb1910b04761f93430d968b425410133aebfc269c77c" exitCode=0 Oct 11 01:12:31 crc kubenswrapper[4743]: I1011 01:12:31.453744 4743 generic.go:334] "Generic (PLEG): container finished" podID="897a231b-400d-4d3b-b6eb-8bcee1ad5d1d" containerID="a0c401940204f28e5b9ce6a4c314a41e9043ee912db8b9d4a80b86a9f2610cee" exitCode=2 Oct 11 01:12:31 crc kubenswrapper[4743]: I1011 01:12:31.453752 4743 generic.go:334] "Generic (PLEG): container finished" podID="897a231b-400d-4d3b-b6eb-8bcee1ad5d1d" containerID="5e31815cc6bb002a54d9b48fcc0bd3a79d54a858afc2f6c912547a5573fc65ba" exitCode=0 Oct 11 01:12:31 crc kubenswrapper[4743]: I1011 01:12:31.453774 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"897a231b-400d-4d3b-b6eb-8bcee1ad5d1d","Type":"ContainerDied","Data":"2cd44e542b88d959e7c7eb1910b04761f93430d968b425410133aebfc269c77c"} Oct 11 01:12:31 crc kubenswrapper[4743]: I1011 01:12:31.453802 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"897a231b-400d-4d3b-b6eb-8bcee1ad5d1d","Type":"ContainerDied","Data":"a0c401940204f28e5b9ce6a4c314a41e9043ee912db8b9d4a80b86a9f2610cee"} Oct 11 01:12:31 crc kubenswrapper[4743]: I1011 01:12:31.453813 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"897a231b-400d-4d3b-b6eb-8bcee1ad5d1d","Type":"ContainerDied","Data":"5e31815cc6bb002a54d9b48fcc0bd3a79d54a858afc2f6c912547a5573fc65ba"} Oct 11 01:12:31 crc kubenswrapper[4743]: I1011 01:12:31.465196 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5496cd5f5c-c9jx6" podStartSLOduration=6.465176237 podStartE2EDuration="6.465176237s" podCreationTimestamp="2025-10-11 
01:12:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 01:12:31.456956778 +0000 UTC m=+1246.109937195" watchObservedRunningTime="2025-10-11 01:12:31.465176237 +0000 UTC m=+1246.118156634" Oct 11 01:12:33 crc kubenswrapper[4743]: I1011 01:12:33.030041 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6b7b667979-55n9s" Oct 11 01:12:33 crc kubenswrapper[4743]: I1011 01:12:33.090080 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-m99m4"] Oct 11 01:12:33 crc kubenswrapper[4743]: I1011 01:12:33.090369 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-56df8fb6b7-m99m4" podUID="e9d40a6d-5220-4537-bb4b-d4248101d864" containerName="dnsmasq-dns" containerID="cri-o://ba06ee2a5eb0ceba4aabba735361a4537bee8d4058f703e2cb13ff6effec3d2e" gracePeriod=10 Oct 11 01:12:33 crc kubenswrapper[4743]: I1011 01:12:33.474124 4743 generic.go:334] "Generic (PLEG): container finished" podID="897a231b-400d-4d3b-b6eb-8bcee1ad5d1d" containerID="e086d9b7f67225d474a6748ad81b767991a3103ef0ccdcd7b0f7231e886e45f0" exitCode=0 Oct 11 01:12:33 crc kubenswrapper[4743]: I1011 01:12:33.474170 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"897a231b-400d-4d3b-b6eb-8bcee1ad5d1d","Type":"ContainerDied","Data":"e086d9b7f67225d474a6748ad81b767991a3103ef0ccdcd7b0f7231e886e45f0"} Oct 11 01:12:34 crc kubenswrapper[4743]: I1011 01:12:34.359516 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-979bff964-bxbgb" Oct 11 01:12:34 crc kubenswrapper[4743]: I1011 01:12:34.500712 4743 generic.go:334] "Generic (PLEG): container finished" podID="e9d40a6d-5220-4537-bb4b-d4248101d864" containerID="ba06ee2a5eb0ceba4aabba735361a4537bee8d4058f703e2cb13ff6effec3d2e" exitCode=0 Oct 11 01:12:34 
crc kubenswrapper[4743]: I1011 01:12:34.500776 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-m99m4" event={"ID":"e9d40a6d-5220-4537-bb4b-d4248101d864","Type":"ContainerDied","Data":"ba06ee2a5eb0ceba4aabba735361a4537bee8d4058f703e2cb13ff6effec3d2e"} Oct 11 01:12:35 crc kubenswrapper[4743]: I1011 01:12:35.210556 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-979bff964-bxbgb" Oct 11 01:12:36 crc kubenswrapper[4743]: I1011 01:12:36.429437 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-56df8fb6b7-m99m4" podUID="e9d40a6d-5220-4537-bb4b-d4248101d864" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.163:5353: connect: connection refused" Oct 11 01:12:37 crc kubenswrapper[4743]: I1011 01:12:37.889244 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 11 01:12:37 crc kubenswrapper[4743]: I1011 01:12:37.907417 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-m99m4" Oct 11 01:12:37 crc kubenswrapper[4743]: I1011 01:12:37.983919 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/897a231b-400d-4d3b-b6eb-8bcee1ad5d1d-scripts\") pod \"897a231b-400d-4d3b-b6eb-8bcee1ad5d1d\" (UID: \"897a231b-400d-4d3b-b6eb-8bcee1ad5d1d\") " Oct 11 01:12:37 crc kubenswrapper[4743]: I1011 01:12:37.983988 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/897a231b-400d-4d3b-b6eb-8bcee1ad5d1d-log-httpd\") pod \"897a231b-400d-4d3b-b6eb-8bcee1ad5d1d\" (UID: \"897a231b-400d-4d3b-b6eb-8bcee1ad5d1d\") " Oct 11 01:12:37 crc kubenswrapper[4743]: I1011 01:12:37.984028 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/897a231b-400d-4d3b-b6eb-8bcee1ad5d1d-sg-core-conf-yaml\") pod \"897a231b-400d-4d3b-b6eb-8bcee1ad5d1d\" (UID: \"897a231b-400d-4d3b-b6eb-8bcee1ad5d1d\") " Oct 11 01:12:37 crc kubenswrapper[4743]: I1011 01:12:37.984104 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/897a231b-400d-4d3b-b6eb-8bcee1ad5d1d-run-httpd\") pod \"897a231b-400d-4d3b-b6eb-8bcee1ad5d1d\" (UID: \"897a231b-400d-4d3b-b6eb-8bcee1ad5d1d\") " Oct 11 01:12:37 crc kubenswrapper[4743]: I1011 01:12:37.984163 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/897a231b-400d-4d3b-b6eb-8bcee1ad5d1d-combined-ca-bundle\") pod \"897a231b-400d-4d3b-b6eb-8bcee1ad5d1d\" (UID: \"897a231b-400d-4d3b-b6eb-8bcee1ad5d1d\") " Oct 11 01:12:37 crc kubenswrapper[4743]: I1011 01:12:37.984212 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/897a231b-400d-4d3b-b6eb-8bcee1ad5d1d-config-data\") pod \"897a231b-400d-4d3b-b6eb-8bcee1ad5d1d\" (UID: \"897a231b-400d-4d3b-b6eb-8bcee1ad5d1d\") " Oct 11 01:12:37 crc kubenswrapper[4743]: I1011 01:12:37.984275 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zv7m6\" (UniqueName: \"kubernetes.io/projected/897a231b-400d-4d3b-b6eb-8bcee1ad5d1d-kube-api-access-zv7m6\") pod \"897a231b-400d-4d3b-b6eb-8bcee1ad5d1d\" (UID: \"897a231b-400d-4d3b-b6eb-8bcee1ad5d1d\") " Oct 11 01:12:37 crc kubenswrapper[4743]: I1011 01:12:37.984604 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/897a231b-400d-4d3b-b6eb-8bcee1ad5d1d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "897a231b-400d-4d3b-b6eb-8bcee1ad5d1d" (UID: "897a231b-400d-4d3b-b6eb-8bcee1ad5d1d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:12:37 crc kubenswrapper[4743]: I1011 01:12:37.984688 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/897a231b-400d-4d3b-b6eb-8bcee1ad5d1d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "897a231b-400d-4d3b-b6eb-8bcee1ad5d1d" (UID: "897a231b-400d-4d3b-b6eb-8bcee1ad5d1d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:12:37 crc kubenswrapper[4743]: I1011 01:12:37.990920 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/897a231b-400d-4d3b-b6eb-8bcee1ad5d1d-kube-api-access-zv7m6" (OuterVolumeSpecName: "kube-api-access-zv7m6") pod "897a231b-400d-4d3b-b6eb-8bcee1ad5d1d" (UID: "897a231b-400d-4d3b-b6eb-8bcee1ad5d1d"). InnerVolumeSpecName "kube-api-access-zv7m6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:12:38 crc kubenswrapper[4743]: I1011 01:12:38.007138 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/897a231b-400d-4d3b-b6eb-8bcee1ad5d1d-scripts" (OuterVolumeSpecName: "scripts") pod "897a231b-400d-4d3b-b6eb-8bcee1ad5d1d" (UID: "897a231b-400d-4d3b-b6eb-8bcee1ad5d1d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:12:38 crc kubenswrapper[4743]: I1011 01:12:38.032391 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/897a231b-400d-4d3b-b6eb-8bcee1ad5d1d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "897a231b-400d-4d3b-b6eb-8bcee1ad5d1d" (UID: "897a231b-400d-4d3b-b6eb-8bcee1ad5d1d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:12:38 crc kubenswrapper[4743]: I1011 01:12:38.080902 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/897a231b-400d-4d3b-b6eb-8bcee1ad5d1d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "897a231b-400d-4d3b-b6eb-8bcee1ad5d1d" (UID: "897a231b-400d-4d3b-b6eb-8bcee1ad5d1d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:12:38 crc kubenswrapper[4743]: I1011 01:12:38.085574 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e9d40a6d-5220-4537-bb4b-d4248101d864-ovsdbserver-nb\") pod \"e9d40a6d-5220-4537-bb4b-d4248101d864\" (UID: \"e9d40a6d-5220-4537-bb4b-d4248101d864\") " Oct 11 01:12:38 crc kubenswrapper[4743]: I1011 01:12:38.085612 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e9d40a6d-5220-4537-bb4b-d4248101d864-dns-svc\") pod \"e9d40a6d-5220-4537-bb4b-d4248101d864\" (UID: \"e9d40a6d-5220-4537-bb4b-d4248101d864\") " Oct 11 01:12:38 crc kubenswrapper[4743]: I1011 01:12:38.085778 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgpfr\" (UniqueName: \"kubernetes.io/projected/e9d40a6d-5220-4537-bb4b-d4248101d864-kube-api-access-fgpfr\") pod \"e9d40a6d-5220-4537-bb4b-d4248101d864\" (UID: \"e9d40a6d-5220-4537-bb4b-d4248101d864\") " Oct 11 01:12:38 crc kubenswrapper[4743]: I1011 01:12:38.085885 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9d40a6d-5220-4537-bb4b-d4248101d864-config\") pod \"e9d40a6d-5220-4537-bb4b-d4248101d864\" (UID: \"e9d40a6d-5220-4537-bb4b-d4248101d864\") " Oct 11 01:12:38 crc kubenswrapper[4743]: I1011 01:12:38.085903 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e9d40a6d-5220-4537-bb4b-d4248101d864-dns-swift-storage-0\") pod \"e9d40a6d-5220-4537-bb4b-d4248101d864\" (UID: \"e9d40a6d-5220-4537-bb4b-d4248101d864\") " Oct 11 01:12:38 crc kubenswrapper[4743]: I1011 01:12:38.085947 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/e9d40a6d-5220-4537-bb4b-d4248101d864-ovsdbserver-sb\") pod \"e9d40a6d-5220-4537-bb4b-d4248101d864\" (UID: \"e9d40a6d-5220-4537-bb4b-d4248101d864\") " Oct 11 01:12:38 crc kubenswrapper[4743]: I1011 01:12:38.086322 4743 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/897a231b-400d-4d3b-b6eb-8bcee1ad5d1d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 11 01:12:38 crc kubenswrapper[4743]: I1011 01:12:38.086340 4743 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/897a231b-400d-4d3b-b6eb-8bcee1ad5d1d-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 11 01:12:38 crc kubenswrapper[4743]: I1011 01:12:38.086349 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/897a231b-400d-4d3b-b6eb-8bcee1ad5d1d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 01:12:38 crc kubenswrapper[4743]: I1011 01:12:38.086358 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zv7m6\" (UniqueName: \"kubernetes.io/projected/897a231b-400d-4d3b-b6eb-8bcee1ad5d1d-kube-api-access-zv7m6\") on node \"crc\" DevicePath \"\"" Oct 11 01:12:38 crc kubenswrapper[4743]: I1011 01:12:38.086367 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/897a231b-400d-4d3b-b6eb-8bcee1ad5d1d-scripts\") on node \"crc\" DevicePath \"\"" Oct 11 01:12:38 crc kubenswrapper[4743]: I1011 01:12:38.086375 4743 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/897a231b-400d-4d3b-b6eb-8bcee1ad5d1d-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 11 01:12:38 crc kubenswrapper[4743]: I1011 01:12:38.089201 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/897a231b-400d-4d3b-b6eb-8bcee1ad5d1d-config-data" 
(OuterVolumeSpecName: "config-data") pod "897a231b-400d-4d3b-b6eb-8bcee1ad5d1d" (UID: "897a231b-400d-4d3b-b6eb-8bcee1ad5d1d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:12:38 crc kubenswrapper[4743]: I1011 01:12:38.090026 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9d40a6d-5220-4537-bb4b-d4248101d864-kube-api-access-fgpfr" (OuterVolumeSpecName: "kube-api-access-fgpfr") pod "e9d40a6d-5220-4537-bb4b-d4248101d864" (UID: "e9d40a6d-5220-4537-bb4b-d4248101d864"). InnerVolumeSpecName "kube-api-access-fgpfr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:12:38 crc kubenswrapper[4743]: I1011 01:12:38.128143 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9d40a6d-5220-4537-bb4b-d4248101d864-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e9d40a6d-5220-4537-bb4b-d4248101d864" (UID: "e9d40a6d-5220-4537-bb4b-d4248101d864"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:12:38 crc kubenswrapper[4743]: I1011 01:12:38.132522 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9d40a6d-5220-4537-bb4b-d4248101d864-config" (OuterVolumeSpecName: "config") pod "e9d40a6d-5220-4537-bb4b-d4248101d864" (UID: "e9d40a6d-5220-4537-bb4b-d4248101d864"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:12:38 crc kubenswrapper[4743]: I1011 01:12:38.138922 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9d40a6d-5220-4537-bb4b-d4248101d864-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e9d40a6d-5220-4537-bb4b-d4248101d864" (UID: "e9d40a6d-5220-4537-bb4b-d4248101d864"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:12:38 crc kubenswrapper[4743]: I1011 01:12:38.146990 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9d40a6d-5220-4537-bb4b-d4248101d864-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e9d40a6d-5220-4537-bb4b-d4248101d864" (UID: "e9d40a6d-5220-4537-bb4b-d4248101d864"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:12:38 crc kubenswrapper[4743]: I1011 01:12:38.163764 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9d40a6d-5220-4537-bb4b-d4248101d864-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e9d40a6d-5220-4537-bb4b-d4248101d864" (UID: "e9d40a6d-5220-4537-bb4b-d4248101d864"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:12:38 crc kubenswrapper[4743]: I1011 01:12:38.188514 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/897a231b-400d-4d3b-b6eb-8bcee1ad5d1d-config-data\") on node \"crc\" DevicePath \"\"" Oct 11 01:12:38 crc kubenswrapper[4743]: I1011 01:12:38.188539 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fgpfr\" (UniqueName: \"kubernetes.io/projected/e9d40a6d-5220-4537-bb4b-d4248101d864-kube-api-access-fgpfr\") on node \"crc\" DevicePath \"\"" Oct 11 01:12:38 crc kubenswrapper[4743]: I1011 01:12:38.188549 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9d40a6d-5220-4537-bb4b-d4248101d864-config\") on node \"crc\" DevicePath \"\"" Oct 11 01:12:38 crc kubenswrapper[4743]: I1011 01:12:38.188556 4743 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e9d40a6d-5220-4537-bb4b-d4248101d864-dns-swift-storage-0\") on node \"crc\" DevicePath 
\"\"" Oct 11 01:12:38 crc kubenswrapper[4743]: I1011 01:12:38.188564 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e9d40a6d-5220-4537-bb4b-d4248101d864-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 11 01:12:38 crc kubenswrapper[4743]: I1011 01:12:38.188572 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e9d40a6d-5220-4537-bb4b-d4248101d864-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 11 01:12:38 crc kubenswrapper[4743]: I1011 01:12:38.188579 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e9d40a6d-5220-4537-bb4b-d4248101d864-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 11 01:12:38 crc kubenswrapper[4743]: I1011 01:12:38.544648 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"897a231b-400d-4d3b-b6eb-8bcee1ad5d1d","Type":"ContainerDied","Data":"182450f29bacd468b53883ed4ede946e02cb832335cd9fb784d7f0120ab454f1"} Oct 11 01:12:38 crc kubenswrapper[4743]: I1011 01:12:38.544762 4743 scope.go:117] "RemoveContainer" containerID="2cd44e542b88d959e7c7eb1910b04761f93430d968b425410133aebfc269c77c" Oct 11 01:12:38 crc kubenswrapper[4743]: I1011 01:12:38.544839 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 11 01:12:38 crc kubenswrapper[4743]: I1011 01:12:38.546512 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-m99m4" event={"ID":"e9d40a6d-5220-4537-bb4b-d4248101d864","Type":"ContainerDied","Data":"e856e77f4cfc97b7f3f4de0cb6f7b1b55e284939ba6c68d89cb607fb0b22d158"} Oct 11 01:12:38 crc kubenswrapper[4743]: I1011 01:12:38.546616 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-m99m4" Oct 11 01:12:38 crc kubenswrapper[4743]: I1011 01:12:38.595256 4743 scope.go:117] "RemoveContainer" containerID="a0c401940204f28e5b9ce6a4c314a41e9043ee912db8b9d4a80b86a9f2610cee" Oct 11 01:12:38 crc kubenswrapper[4743]: I1011 01:12:38.609992 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 11 01:12:38 crc kubenswrapper[4743]: I1011 01:12:38.622942 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 11 01:12:38 crc kubenswrapper[4743]: I1011 01:12:38.633298 4743 scope.go:117] "RemoveContainer" containerID="e086d9b7f67225d474a6748ad81b767991a3103ef0ccdcd7b0f7231e886e45f0" Oct 11 01:12:38 crc kubenswrapper[4743]: I1011 01:12:38.645959 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-m99m4"] Oct 11 01:12:38 crc kubenswrapper[4743]: I1011 01:12:38.654640 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-m99m4"] Oct 11 01:12:38 crc kubenswrapper[4743]: I1011 01:12:38.670995 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 11 01:12:38 crc kubenswrapper[4743]: E1011 01:12:38.671391 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="897a231b-400d-4d3b-b6eb-8bcee1ad5d1d" containerName="ceilometer-notification-agent" Oct 11 01:12:38 crc kubenswrapper[4743]: I1011 01:12:38.671404 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="897a231b-400d-4d3b-b6eb-8bcee1ad5d1d" containerName="ceilometer-notification-agent" Oct 11 01:12:38 crc kubenswrapper[4743]: E1011 01:12:38.671424 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9d40a6d-5220-4537-bb4b-d4248101d864" containerName="dnsmasq-dns" Oct 11 01:12:38 crc kubenswrapper[4743]: I1011 01:12:38.671430 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9d40a6d-5220-4537-bb4b-d4248101d864" 
containerName="dnsmasq-dns" Oct 11 01:12:38 crc kubenswrapper[4743]: E1011 01:12:38.671437 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="897a231b-400d-4d3b-b6eb-8bcee1ad5d1d" containerName="ceilometer-central-agent" Oct 11 01:12:38 crc kubenswrapper[4743]: I1011 01:12:38.671444 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="897a231b-400d-4d3b-b6eb-8bcee1ad5d1d" containerName="ceilometer-central-agent" Oct 11 01:12:38 crc kubenswrapper[4743]: E1011 01:12:38.671463 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9d40a6d-5220-4537-bb4b-d4248101d864" containerName="init" Oct 11 01:12:38 crc kubenswrapper[4743]: I1011 01:12:38.671468 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9d40a6d-5220-4537-bb4b-d4248101d864" containerName="init" Oct 11 01:12:38 crc kubenswrapper[4743]: E1011 01:12:38.671478 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="897a231b-400d-4d3b-b6eb-8bcee1ad5d1d" containerName="proxy-httpd" Oct 11 01:12:38 crc kubenswrapper[4743]: I1011 01:12:38.671484 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="897a231b-400d-4d3b-b6eb-8bcee1ad5d1d" containerName="proxy-httpd" Oct 11 01:12:38 crc kubenswrapper[4743]: E1011 01:12:38.671507 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="476a4c6e-ddae-4974-a899-78a8f1ee973d" containerName="heat-db-sync" Oct 11 01:12:38 crc kubenswrapper[4743]: I1011 01:12:38.671514 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="476a4c6e-ddae-4974-a899-78a8f1ee973d" containerName="heat-db-sync" Oct 11 01:12:38 crc kubenswrapper[4743]: E1011 01:12:38.671523 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="897a231b-400d-4d3b-b6eb-8bcee1ad5d1d" containerName="sg-core" Oct 11 01:12:38 crc kubenswrapper[4743]: I1011 01:12:38.671530 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="897a231b-400d-4d3b-b6eb-8bcee1ad5d1d" containerName="sg-core" Oct 11 01:12:38 crc 
kubenswrapper[4743]: I1011 01:12:38.671697 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9d40a6d-5220-4537-bb4b-d4248101d864" containerName="dnsmasq-dns" Oct 11 01:12:38 crc kubenswrapper[4743]: I1011 01:12:38.671715 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="897a231b-400d-4d3b-b6eb-8bcee1ad5d1d" containerName="proxy-httpd" Oct 11 01:12:38 crc kubenswrapper[4743]: I1011 01:12:38.671726 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="897a231b-400d-4d3b-b6eb-8bcee1ad5d1d" containerName="ceilometer-notification-agent" Oct 11 01:12:38 crc kubenswrapper[4743]: I1011 01:12:38.671742 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="897a231b-400d-4d3b-b6eb-8bcee1ad5d1d" containerName="ceilometer-central-agent" Oct 11 01:12:38 crc kubenswrapper[4743]: I1011 01:12:38.671754 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="897a231b-400d-4d3b-b6eb-8bcee1ad5d1d" containerName="sg-core" Oct 11 01:12:38 crc kubenswrapper[4743]: I1011 01:12:38.671764 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="476a4c6e-ddae-4974-a899-78a8f1ee973d" containerName="heat-db-sync" Oct 11 01:12:38 crc kubenswrapper[4743]: I1011 01:12:38.673486 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 11 01:12:38 crc kubenswrapper[4743]: I1011 01:12:38.675720 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 11 01:12:38 crc kubenswrapper[4743]: I1011 01:12:38.675732 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 11 01:12:38 crc kubenswrapper[4743]: I1011 01:12:38.680996 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 11 01:12:38 crc kubenswrapper[4743]: I1011 01:12:38.693072 4743 scope.go:117] "RemoveContainer" containerID="5e31815cc6bb002a54d9b48fcc0bd3a79d54a858afc2f6c912547a5573fc65ba" Oct 11 01:12:38 crc kubenswrapper[4743]: I1011 01:12:38.724865 4743 scope.go:117] "RemoveContainer" containerID="ba06ee2a5eb0ceba4aabba735361a4537bee8d4058f703e2cb13ff6effec3d2e" Oct 11 01:12:38 crc kubenswrapper[4743]: I1011 01:12:38.767981 4743 scope.go:117] "RemoveContainer" containerID="7ec6bf9ba82a05f95bba6448f0c6a65d716c6b6d4e212991d59441fa2916cea7" Oct 11 01:12:38 crc kubenswrapper[4743]: I1011 01:12:38.801229 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7246d8a-9560-4212-8219-c7ac80cd7152-scripts\") pod \"ceilometer-0\" (UID: \"d7246d8a-9560-4212-8219-c7ac80cd7152\") " pod="openstack/ceilometer-0" Oct 11 01:12:38 crc kubenswrapper[4743]: I1011 01:12:38.801320 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kns5r\" (UniqueName: \"kubernetes.io/projected/d7246d8a-9560-4212-8219-c7ac80cd7152-kube-api-access-kns5r\") pod \"ceilometer-0\" (UID: \"d7246d8a-9560-4212-8219-c7ac80cd7152\") " pod="openstack/ceilometer-0" Oct 11 01:12:38 crc kubenswrapper[4743]: I1011 01:12:38.801358 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d7246d8a-9560-4212-8219-c7ac80cd7152-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d7246d8a-9560-4212-8219-c7ac80cd7152\") " pod="openstack/ceilometer-0" Oct 11 01:12:38 crc kubenswrapper[4743]: I1011 01:12:38.801387 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d7246d8a-9560-4212-8219-c7ac80cd7152-log-httpd\") pod \"ceilometer-0\" (UID: \"d7246d8a-9560-4212-8219-c7ac80cd7152\") " pod="openstack/ceilometer-0" Oct 11 01:12:38 crc kubenswrapper[4743]: I1011 01:12:38.801428 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d7246d8a-9560-4212-8219-c7ac80cd7152-run-httpd\") pod \"ceilometer-0\" (UID: \"d7246d8a-9560-4212-8219-c7ac80cd7152\") " pod="openstack/ceilometer-0" Oct 11 01:12:38 crc kubenswrapper[4743]: I1011 01:12:38.801697 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7246d8a-9560-4212-8219-c7ac80cd7152-config-data\") pod \"ceilometer-0\" (UID: \"d7246d8a-9560-4212-8219-c7ac80cd7152\") " pod="openstack/ceilometer-0" Oct 11 01:12:38 crc kubenswrapper[4743]: I1011 01:12:38.801793 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7246d8a-9560-4212-8219-c7ac80cd7152-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d7246d8a-9560-4212-8219-c7ac80cd7152\") " pod="openstack/ceilometer-0" Oct 11 01:12:38 crc kubenswrapper[4743]: I1011 01:12:38.903405 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d7246d8a-9560-4212-8219-c7ac80cd7152-log-httpd\") pod \"ceilometer-0\" (UID: 
\"d7246d8a-9560-4212-8219-c7ac80cd7152\") " pod="openstack/ceilometer-0" Oct 11 01:12:38 crc kubenswrapper[4743]: I1011 01:12:38.903471 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d7246d8a-9560-4212-8219-c7ac80cd7152-run-httpd\") pod \"ceilometer-0\" (UID: \"d7246d8a-9560-4212-8219-c7ac80cd7152\") " pod="openstack/ceilometer-0" Oct 11 01:12:38 crc kubenswrapper[4743]: I1011 01:12:38.903529 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7246d8a-9560-4212-8219-c7ac80cd7152-config-data\") pod \"ceilometer-0\" (UID: \"d7246d8a-9560-4212-8219-c7ac80cd7152\") " pod="openstack/ceilometer-0" Oct 11 01:12:38 crc kubenswrapper[4743]: I1011 01:12:38.903551 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7246d8a-9560-4212-8219-c7ac80cd7152-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d7246d8a-9560-4212-8219-c7ac80cd7152\") " pod="openstack/ceilometer-0" Oct 11 01:12:38 crc kubenswrapper[4743]: I1011 01:12:38.903616 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7246d8a-9560-4212-8219-c7ac80cd7152-scripts\") pod \"ceilometer-0\" (UID: \"d7246d8a-9560-4212-8219-c7ac80cd7152\") " pod="openstack/ceilometer-0" Oct 11 01:12:38 crc kubenswrapper[4743]: I1011 01:12:38.903636 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kns5r\" (UniqueName: \"kubernetes.io/projected/d7246d8a-9560-4212-8219-c7ac80cd7152-kube-api-access-kns5r\") pod \"ceilometer-0\" (UID: \"d7246d8a-9560-4212-8219-c7ac80cd7152\") " pod="openstack/ceilometer-0" Oct 11 01:12:38 crc kubenswrapper[4743]: I1011 01:12:38.903654 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d7246d8a-9560-4212-8219-c7ac80cd7152-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d7246d8a-9560-4212-8219-c7ac80cd7152\") " pod="openstack/ceilometer-0" Oct 11 01:12:38 crc kubenswrapper[4743]: I1011 01:12:38.903789 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d7246d8a-9560-4212-8219-c7ac80cd7152-log-httpd\") pod \"ceilometer-0\" (UID: \"d7246d8a-9560-4212-8219-c7ac80cd7152\") " pod="openstack/ceilometer-0" Oct 11 01:12:38 crc kubenswrapper[4743]: I1011 01:12:38.904379 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d7246d8a-9560-4212-8219-c7ac80cd7152-run-httpd\") pod \"ceilometer-0\" (UID: \"d7246d8a-9560-4212-8219-c7ac80cd7152\") " pod="openstack/ceilometer-0" Oct 11 01:12:38 crc kubenswrapper[4743]: I1011 01:12:38.907156 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7246d8a-9560-4212-8219-c7ac80cd7152-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d7246d8a-9560-4212-8219-c7ac80cd7152\") " pod="openstack/ceilometer-0" Oct 11 01:12:38 crc kubenswrapper[4743]: I1011 01:12:38.907401 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d7246d8a-9560-4212-8219-c7ac80cd7152-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d7246d8a-9560-4212-8219-c7ac80cd7152\") " pod="openstack/ceilometer-0" Oct 11 01:12:38 crc kubenswrapper[4743]: I1011 01:12:38.908844 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7246d8a-9560-4212-8219-c7ac80cd7152-config-data\") pod \"ceilometer-0\" (UID: \"d7246d8a-9560-4212-8219-c7ac80cd7152\") " pod="openstack/ceilometer-0" Oct 11 01:12:38 crc kubenswrapper[4743]: I1011 
01:12:38.910908 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7246d8a-9560-4212-8219-c7ac80cd7152-scripts\") pod \"ceilometer-0\" (UID: \"d7246d8a-9560-4212-8219-c7ac80cd7152\") " pod="openstack/ceilometer-0" Oct 11 01:12:38 crc kubenswrapper[4743]: I1011 01:12:38.920316 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kns5r\" (UniqueName: \"kubernetes.io/projected/d7246d8a-9560-4212-8219-c7ac80cd7152-kube-api-access-kns5r\") pod \"ceilometer-0\" (UID: \"d7246d8a-9560-4212-8219-c7ac80cd7152\") " pod="openstack/ceilometer-0" Oct 11 01:12:39 crc kubenswrapper[4743]: I1011 01:12:39.002585 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 11 01:12:39 crc kubenswrapper[4743]: I1011 01:12:39.470960 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 11 01:12:39 crc kubenswrapper[4743]: W1011 01:12:39.476009 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd7246d8a_9560_4212_8219_c7ac80cd7152.slice/crio-9c5a1e2503354355dc2581b3639bc19a615d6c5e22d20664a235a14351a418b0 WatchSource:0}: Error finding container 9c5a1e2503354355dc2581b3639bc19a615d6c5e22d20664a235a14351a418b0: Status 404 returned error can't find the container with id 9c5a1e2503354355dc2581b3639bc19a615d6c5e22d20664a235a14351a418b0 Oct 11 01:12:39 crc kubenswrapper[4743]: I1011 01:12:39.562595 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-przwv" event={"ID":"65f773e4-09e5-4312-8c6b-9176a1a022f0","Type":"ContainerStarted","Data":"ddf99e43c15d33e7535c0a343463db45039ea0315650d5d7bfd64c5d18796d4a"} Oct 11 01:12:39 crc kubenswrapper[4743]: I1011 01:12:39.566774 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"d7246d8a-9560-4212-8219-c7ac80cd7152","Type":"ContainerStarted","Data":"9c5a1e2503354355dc2581b3639bc19a615d6c5e22d20664a235a14351a418b0"} Oct 11 01:12:39 crc kubenswrapper[4743]: I1011 01:12:39.569180 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-6pzxd" event={"ID":"615827fb-c1f4-46c6-8014-00c71fe2403b","Type":"ContainerStarted","Data":"ea0d864863ed6198fb923dbfb39f2a713c60ca3cab3a50f8f2e3148ceea188ea"} Oct 11 01:12:39 crc kubenswrapper[4743]: I1011 01:12:39.582213 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-przwv" podStartSLOduration=2.11286409 podStartE2EDuration="40.582194731s" podCreationTimestamp="2025-10-11 01:11:59 +0000 UTC" firstStartedPulling="2025-10-11 01:12:00.231100552 +0000 UTC m=+1214.884080949" lastFinishedPulling="2025-10-11 01:12:38.700431193 +0000 UTC m=+1253.353411590" observedRunningTime="2025-10-11 01:12:39.57390218 +0000 UTC m=+1254.226882577" watchObservedRunningTime="2025-10-11 01:12:39.582194731 +0000 UTC m=+1254.235175138" Oct 11 01:12:39 crc kubenswrapper[4743]: I1011 01:12:39.595251 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-6pzxd" podStartSLOduration=2.242140391 podStartE2EDuration="40.595233231s" podCreationTimestamp="2025-10-11 01:11:59 +0000 UTC" firstStartedPulling="2025-10-11 01:12:00.125558844 +0000 UTC m=+1214.778539241" lastFinishedPulling="2025-10-11 01:12:38.478651684 +0000 UTC m=+1253.131632081" observedRunningTime="2025-10-11 01:12:39.591224649 +0000 UTC m=+1254.244205086" watchObservedRunningTime="2025-10-11 01:12:39.595233231 +0000 UTC m=+1254.248213648" Oct 11 01:12:40 crc kubenswrapper[4743]: I1011 01:12:40.106502 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="897a231b-400d-4d3b-b6eb-8bcee1ad5d1d" path="/var/lib/kubelet/pods/897a231b-400d-4d3b-b6eb-8bcee1ad5d1d/volumes" Oct 11 01:12:40 crc kubenswrapper[4743]: I1011 
01:12:40.107529 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9d40a6d-5220-4537-bb4b-d4248101d864" path="/var/lib/kubelet/pods/e9d40a6d-5220-4537-bb4b-d4248101d864/volumes" Oct 11 01:12:40 crc kubenswrapper[4743]: I1011 01:12:40.604275 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d7246d8a-9560-4212-8219-c7ac80cd7152","Type":"ContainerStarted","Data":"27fd22327259dc3b21f40a159ab8864185943820c6906261cc15648a64eafc5e"} Oct 11 01:12:41 crc kubenswrapper[4743]: I1011 01:12:41.621806 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d7246d8a-9560-4212-8219-c7ac80cd7152","Type":"ContainerStarted","Data":"0a94cc2959a3caeae8a1da71778a8aee6fad22ab3592c272efa29502d5ebb7ed"} Oct 11 01:12:41 crc kubenswrapper[4743]: I1011 01:12:41.622341 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d7246d8a-9560-4212-8219-c7ac80cd7152","Type":"ContainerStarted","Data":"c5f9e37a17454f1ed04f86d7dae58744f264061e298c54740d65a49894bd0560"} Oct 11 01:12:41 crc kubenswrapper[4743]: I1011 01:12:41.623614 4743 generic.go:334] "Generic (PLEG): container finished" podID="615827fb-c1f4-46c6-8014-00c71fe2403b" containerID="ea0d864863ed6198fb923dbfb39f2a713c60ca3cab3a50f8f2e3148ceea188ea" exitCode=0 Oct 11 01:12:41 crc kubenswrapper[4743]: I1011 01:12:41.623641 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-6pzxd" event={"ID":"615827fb-c1f4-46c6-8014-00c71fe2403b","Type":"ContainerDied","Data":"ea0d864863ed6198fb923dbfb39f2a713c60ca3cab3a50f8f2e3148ceea188ea"} Oct 11 01:12:43 crc kubenswrapper[4743]: I1011 01:12:43.146579 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-6pzxd" Oct 11 01:12:43 crc kubenswrapper[4743]: I1011 01:12:43.289396 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/615827fb-c1f4-46c6-8014-00c71fe2403b-db-sync-config-data\") pod \"615827fb-c1f4-46c6-8014-00c71fe2403b\" (UID: \"615827fb-c1f4-46c6-8014-00c71fe2403b\") " Oct 11 01:12:43 crc kubenswrapper[4743]: I1011 01:12:43.289616 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t4z8m\" (UniqueName: \"kubernetes.io/projected/615827fb-c1f4-46c6-8014-00c71fe2403b-kube-api-access-t4z8m\") pod \"615827fb-c1f4-46c6-8014-00c71fe2403b\" (UID: \"615827fb-c1f4-46c6-8014-00c71fe2403b\") " Oct 11 01:12:43 crc kubenswrapper[4743]: I1011 01:12:43.289665 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/615827fb-c1f4-46c6-8014-00c71fe2403b-combined-ca-bundle\") pod \"615827fb-c1f4-46c6-8014-00c71fe2403b\" (UID: \"615827fb-c1f4-46c6-8014-00c71fe2403b\") " Oct 11 01:12:43 crc kubenswrapper[4743]: I1011 01:12:43.293962 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/615827fb-c1f4-46c6-8014-00c71fe2403b-kube-api-access-t4z8m" (OuterVolumeSpecName: "kube-api-access-t4z8m") pod "615827fb-c1f4-46c6-8014-00c71fe2403b" (UID: "615827fb-c1f4-46c6-8014-00c71fe2403b"). InnerVolumeSpecName "kube-api-access-t4z8m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:12:43 crc kubenswrapper[4743]: I1011 01:12:43.294331 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/615827fb-c1f4-46c6-8014-00c71fe2403b-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "615827fb-c1f4-46c6-8014-00c71fe2403b" (UID: "615827fb-c1f4-46c6-8014-00c71fe2403b"). 
InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:12:43 crc kubenswrapper[4743]: I1011 01:12:43.326153 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/615827fb-c1f4-46c6-8014-00c71fe2403b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "615827fb-c1f4-46c6-8014-00c71fe2403b" (UID: "615827fb-c1f4-46c6-8014-00c71fe2403b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:12:43 crc kubenswrapper[4743]: I1011 01:12:43.391502 4743 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/615827fb-c1f4-46c6-8014-00c71fe2403b-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 11 01:12:43 crc kubenswrapper[4743]: I1011 01:12:43.391531 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t4z8m\" (UniqueName: \"kubernetes.io/projected/615827fb-c1f4-46c6-8014-00c71fe2403b-kube-api-access-t4z8m\") on node \"crc\" DevicePath \"\"" Oct 11 01:12:43 crc kubenswrapper[4743]: I1011 01:12:43.391543 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/615827fb-c1f4-46c6-8014-00c71fe2403b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 01:12:43 crc kubenswrapper[4743]: I1011 01:12:43.648025 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d7246d8a-9560-4212-8219-c7ac80cd7152","Type":"ContainerStarted","Data":"d4f73c76f87cb075388136a5c2308bb0f7c040ab5eea76e45c831be279d24d65"} Oct 11 01:12:43 crc kubenswrapper[4743]: I1011 01:12:43.648112 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 11 01:12:43 crc kubenswrapper[4743]: I1011 01:12:43.651021 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-6pzxd" Oct 11 01:12:43 crc kubenswrapper[4743]: I1011 01:12:43.651085 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-6pzxd" event={"ID":"615827fb-c1f4-46c6-8014-00c71fe2403b","Type":"ContainerDied","Data":"d85b2952cca6b8b5169af722e61ce5c5349ed99f25b681cc42051f2a88760911"} Oct 11 01:12:43 crc kubenswrapper[4743]: I1011 01:12:43.651180 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d85b2952cca6b8b5169af722e61ce5c5349ed99f25b681cc42051f2a88760911" Oct 11 01:12:43 crc kubenswrapper[4743]: I1011 01:12:43.663518 4743 generic.go:334] "Generic (PLEG): container finished" podID="65f773e4-09e5-4312-8c6b-9176a1a022f0" containerID="ddf99e43c15d33e7535c0a343463db45039ea0315650d5d7bfd64c5d18796d4a" exitCode=0 Oct 11 01:12:43 crc kubenswrapper[4743]: I1011 01:12:43.663604 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-przwv" event={"ID":"65f773e4-09e5-4312-8c6b-9176a1a022f0","Type":"ContainerDied","Data":"ddf99e43c15d33e7535c0a343463db45039ea0315650d5d7bfd64c5d18796d4a"} Oct 11 01:12:43 crc kubenswrapper[4743]: I1011 01:12:43.696691 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.965484129 podStartE2EDuration="5.696657667s" podCreationTimestamp="2025-10-11 01:12:38 +0000 UTC" firstStartedPulling="2025-10-11 01:12:39.482138056 +0000 UTC m=+1254.135118443" lastFinishedPulling="2025-10-11 01:12:43.213311584 +0000 UTC m=+1257.866291981" observedRunningTime="2025-10-11 01:12:43.677804343 +0000 UTC m=+1258.330784760" watchObservedRunningTime="2025-10-11 01:12:43.696657667 +0000 UTC m=+1258.349638114" Oct 11 01:12:43 crc kubenswrapper[4743]: I1011 01:12:43.982461 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-5f98bddd87-6kv6r"] Oct 11 01:12:43 crc kubenswrapper[4743]: E1011 01:12:43.982908 4743 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="615827fb-c1f4-46c6-8014-00c71fe2403b" containerName="barbican-db-sync" Oct 11 01:12:43 crc kubenswrapper[4743]: I1011 01:12:43.982920 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="615827fb-c1f4-46c6-8014-00c71fe2403b" containerName="barbican-db-sync" Oct 11 01:12:43 crc kubenswrapper[4743]: I1011 01:12:43.983133 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="615827fb-c1f4-46c6-8014-00c71fe2403b" containerName="barbican-db-sync" Oct 11 01:12:43 crc kubenswrapper[4743]: I1011 01:12:43.984254 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5f98bddd87-6kv6r" Oct 11 01:12:43 crc kubenswrapper[4743]: I1011 01:12:43.992677 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-5zgpr" Oct 11 01:12:43 crc kubenswrapper[4743]: I1011 01:12:43.993147 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 11 01:12:43 crc kubenswrapper[4743]: I1011 01:12:43.993312 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Oct 11 01:12:44 crc kubenswrapper[4743]: I1011 01:12:44.000914 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-785cc87c98-slsn7"] Oct 11 01:12:44 crc kubenswrapper[4743]: I1011 01:12:44.002600 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-785cc87c98-slsn7" Oct 11 01:12:44 crc kubenswrapper[4743]: I1011 01:12:44.006711 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Oct 11 01:12:44 crc kubenswrapper[4743]: I1011 01:12:44.011100 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5f98bddd87-6kv6r"] Oct 11 01:12:44 crc kubenswrapper[4743]: I1011 01:12:44.026611 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-785cc87c98-slsn7"] Oct 11 01:12:44 crc kubenswrapper[4743]: I1011 01:12:44.090246 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-tjssf"] Oct 11 01:12:44 crc kubenswrapper[4743]: I1011 01:12:44.094514 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-tjssf" Oct 11 01:12:44 crc kubenswrapper[4743]: I1011 01:12:44.104593 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4wll\" (UniqueName: \"kubernetes.io/projected/7f793c4b-6627-4c4b-9f2c-529641700221-kube-api-access-z4wll\") pod \"barbican-keystone-listener-785cc87c98-slsn7\" (UID: \"7f793c4b-6627-4c4b-9f2c-529641700221\") " pod="openstack/barbican-keystone-listener-785cc87c98-slsn7" Oct 11 01:12:44 crc kubenswrapper[4743]: I1011 01:12:44.104642 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7f793c4b-6627-4c4b-9f2c-529641700221-config-data-custom\") pod \"barbican-keystone-listener-785cc87c98-slsn7\" (UID: \"7f793c4b-6627-4c4b-9f2c-529641700221\") " pod="openstack/barbican-keystone-listener-785cc87c98-slsn7" Oct 11 01:12:44 crc kubenswrapper[4743]: I1011 01:12:44.104666 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f793c4b-6627-4c4b-9f2c-529641700221-config-data\") pod \"barbican-keystone-listener-785cc87c98-slsn7\" (UID: \"7f793c4b-6627-4c4b-9f2c-529641700221\") " pod="openstack/barbican-keystone-listener-785cc87c98-slsn7" Oct 11 01:12:44 crc kubenswrapper[4743]: I1011 01:12:44.104693 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f793c4b-6627-4c4b-9f2c-529641700221-logs\") pod \"barbican-keystone-listener-785cc87c98-slsn7\" (UID: \"7f793c4b-6627-4c4b-9f2c-529641700221\") " pod="openstack/barbican-keystone-listener-785cc87c98-slsn7" Oct 11 01:12:44 crc kubenswrapper[4743]: I1011 01:12:44.104718 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b8ff2f6e-e247-4ff9-9c07-8ba2bf4d5b1d-config-data-custom\") pod \"barbican-worker-5f98bddd87-6kv6r\" (UID: \"b8ff2f6e-e247-4ff9-9c07-8ba2bf4d5b1d\") " pod="openstack/barbican-worker-5f98bddd87-6kv6r" Oct 11 01:12:44 crc kubenswrapper[4743]: I1011 01:12:44.104761 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8ff2f6e-e247-4ff9-9c07-8ba2bf4d5b1d-config-data\") pod \"barbican-worker-5f98bddd87-6kv6r\" (UID: \"b8ff2f6e-e247-4ff9-9c07-8ba2bf4d5b1d\") " pod="openstack/barbican-worker-5f98bddd87-6kv6r" Oct 11 01:12:44 crc kubenswrapper[4743]: I1011 01:12:44.104885 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8ff2f6e-e247-4ff9-9c07-8ba2bf4d5b1d-combined-ca-bundle\") pod \"barbican-worker-5f98bddd87-6kv6r\" (UID: \"b8ff2f6e-e247-4ff9-9c07-8ba2bf4d5b1d\") " pod="openstack/barbican-worker-5f98bddd87-6kv6r" Oct 11 01:12:44 crc kubenswrapper[4743]: I1011 
01:12:44.104943 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8ff2f6e-e247-4ff9-9c07-8ba2bf4d5b1d-logs\") pod \"barbican-worker-5f98bddd87-6kv6r\" (UID: \"b8ff2f6e-e247-4ff9-9c07-8ba2bf4d5b1d\") " pod="openstack/barbican-worker-5f98bddd87-6kv6r" Oct 11 01:12:44 crc kubenswrapper[4743]: I1011 01:12:44.104960 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f793c4b-6627-4c4b-9f2c-529641700221-combined-ca-bundle\") pod \"barbican-keystone-listener-785cc87c98-slsn7\" (UID: \"7f793c4b-6627-4c4b-9f2c-529641700221\") " pod="openstack/barbican-keystone-listener-785cc87c98-slsn7" Oct 11 01:12:44 crc kubenswrapper[4743]: I1011 01:12:44.104980 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6t6m\" (UniqueName: \"kubernetes.io/projected/b8ff2f6e-e247-4ff9-9c07-8ba2bf4d5b1d-kube-api-access-n6t6m\") pod \"barbican-worker-5f98bddd87-6kv6r\" (UID: \"b8ff2f6e-e247-4ff9-9c07-8ba2bf4d5b1d\") " pod="openstack/barbican-worker-5f98bddd87-6kv6r" Oct 11 01:12:44 crc kubenswrapper[4743]: I1011 01:12:44.115555 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-tjssf"] Oct 11 01:12:44 crc kubenswrapper[4743]: I1011 01:12:44.178817 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-8464ff7fb4-cflsg"] Oct 11 01:12:44 crc kubenswrapper[4743]: I1011 01:12:44.185596 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-8464ff7fb4-cflsg" Oct 11 01:12:44 crc kubenswrapper[4743]: I1011 01:12:44.189481 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Oct 11 01:12:44 crc kubenswrapper[4743]: I1011 01:12:44.196286 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-8464ff7fb4-cflsg"] Oct 11 01:12:44 crc kubenswrapper[4743]: I1011 01:12:44.210324 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/129c74fb-0970-4b08-958c-b0d079c8e65a-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-tjssf\" (UID: \"129c74fb-0970-4b08-958c-b0d079c8e65a\") " pod="openstack/dnsmasq-dns-848cf88cfc-tjssf" Oct 11 01:12:44 crc kubenswrapper[4743]: I1011 01:12:44.210382 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8ff2f6e-e247-4ff9-9c07-8ba2bf4d5b1d-config-data\") pod \"barbican-worker-5f98bddd87-6kv6r\" (UID: \"b8ff2f6e-e247-4ff9-9c07-8ba2bf4d5b1d\") " pod="openstack/barbican-worker-5f98bddd87-6kv6r" Oct 11 01:12:44 crc kubenswrapper[4743]: I1011 01:12:44.210476 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8ff2f6e-e247-4ff9-9c07-8ba2bf4d5b1d-combined-ca-bundle\") pod \"barbican-worker-5f98bddd87-6kv6r\" (UID: \"b8ff2f6e-e247-4ff9-9c07-8ba2bf4d5b1d\") " pod="openstack/barbican-worker-5f98bddd87-6kv6r" Oct 11 01:12:44 crc kubenswrapper[4743]: I1011 01:12:44.210497 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/129c74fb-0970-4b08-958c-b0d079c8e65a-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-tjssf\" (UID: \"129c74fb-0970-4b08-958c-b0d079c8e65a\") " 
pod="openstack/dnsmasq-dns-848cf88cfc-tjssf" Oct 11 01:12:44 crc kubenswrapper[4743]: I1011 01:12:44.210552 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8ff2f6e-e247-4ff9-9c07-8ba2bf4d5b1d-logs\") pod \"barbican-worker-5f98bddd87-6kv6r\" (UID: \"b8ff2f6e-e247-4ff9-9c07-8ba2bf4d5b1d\") " pod="openstack/barbican-worker-5f98bddd87-6kv6r" Oct 11 01:12:44 crc kubenswrapper[4743]: I1011 01:12:44.210577 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f793c4b-6627-4c4b-9f2c-529641700221-combined-ca-bundle\") pod \"barbican-keystone-listener-785cc87c98-slsn7\" (UID: \"7f793c4b-6627-4c4b-9f2c-529641700221\") " pod="openstack/barbican-keystone-listener-785cc87c98-slsn7" Oct 11 01:12:44 crc kubenswrapper[4743]: I1011 01:12:44.210601 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6t6m\" (UniqueName: \"kubernetes.io/projected/b8ff2f6e-e247-4ff9-9c07-8ba2bf4d5b1d-kube-api-access-n6t6m\") pod \"barbican-worker-5f98bddd87-6kv6r\" (UID: \"b8ff2f6e-e247-4ff9-9c07-8ba2bf4d5b1d\") " pod="openstack/barbican-worker-5f98bddd87-6kv6r" Oct 11 01:12:44 crc kubenswrapper[4743]: I1011 01:12:44.210663 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/129c74fb-0970-4b08-958c-b0d079c8e65a-config\") pod \"dnsmasq-dns-848cf88cfc-tjssf\" (UID: \"129c74fb-0970-4b08-958c-b0d079c8e65a\") " pod="openstack/dnsmasq-dns-848cf88cfc-tjssf" Oct 11 01:12:44 crc kubenswrapper[4743]: I1011 01:12:44.210704 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/129c74fb-0970-4b08-958c-b0d079c8e65a-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-tjssf\" (UID: 
\"129c74fb-0970-4b08-958c-b0d079c8e65a\") " pod="openstack/dnsmasq-dns-848cf88cfc-tjssf" Oct 11 01:12:44 crc kubenswrapper[4743]: I1011 01:12:44.210748 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xw7jl\" (UniqueName: \"kubernetes.io/projected/129c74fb-0970-4b08-958c-b0d079c8e65a-kube-api-access-xw7jl\") pod \"dnsmasq-dns-848cf88cfc-tjssf\" (UID: \"129c74fb-0970-4b08-958c-b0d079c8e65a\") " pod="openstack/dnsmasq-dns-848cf88cfc-tjssf" Oct 11 01:12:44 crc kubenswrapper[4743]: I1011 01:12:44.210806 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4wll\" (UniqueName: \"kubernetes.io/projected/7f793c4b-6627-4c4b-9f2c-529641700221-kube-api-access-z4wll\") pod \"barbican-keystone-listener-785cc87c98-slsn7\" (UID: \"7f793c4b-6627-4c4b-9f2c-529641700221\") " pod="openstack/barbican-keystone-listener-785cc87c98-slsn7" Oct 11 01:12:44 crc kubenswrapper[4743]: I1011 01:12:44.210835 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7f793c4b-6627-4c4b-9f2c-529641700221-config-data-custom\") pod \"barbican-keystone-listener-785cc87c98-slsn7\" (UID: \"7f793c4b-6627-4c4b-9f2c-529641700221\") " pod="openstack/barbican-keystone-listener-785cc87c98-slsn7" Oct 11 01:12:44 crc kubenswrapper[4743]: I1011 01:12:44.210876 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f793c4b-6627-4c4b-9f2c-529641700221-config-data\") pod \"barbican-keystone-listener-785cc87c98-slsn7\" (UID: \"7f793c4b-6627-4c4b-9f2c-529641700221\") " pod="openstack/barbican-keystone-listener-785cc87c98-slsn7" Oct 11 01:12:44 crc kubenswrapper[4743]: I1011 01:12:44.210910 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/129c74fb-0970-4b08-958c-b0d079c8e65a-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-tjssf\" (UID: \"129c74fb-0970-4b08-958c-b0d079c8e65a\") " pod="openstack/dnsmasq-dns-848cf88cfc-tjssf" Oct 11 01:12:44 crc kubenswrapper[4743]: I1011 01:12:44.210944 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f793c4b-6627-4c4b-9f2c-529641700221-logs\") pod \"barbican-keystone-listener-785cc87c98-slsn7\" (UID: \"7f793c4b-6627-4c4b-9f2c-529641700221\") " pod="openstack/barbican-keystone-listener-785cc87c98-slsn7" Oct 11 01:12:44 crc kubenswrapper[4743]: I1011 01:12:44.210973 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b8ff2f6e-e247-4ff9-9c07-8ba2bf4d5b1d-config-data-custom\") pod \"barbican-worker-5f98bddd87-6kv6r\" (UID: \"b8ff2f6e-e247-4ff9-9c07-8ba2bf4d5b1d\") " pod="openstack/barbican-worker-5f98bddd87-6kv6r" Oct 11 01:12:44 crc kubenswrapper[4743]: I1011 01:12:44.212846 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8ff2f6e-e247-4ff9-9c07-8ba2bf4d5b1d-logs\") pod \"barbican-worker-5f98bddd87-6kv6r\" (UID: \"b8ff2f6e-e247-4ff9-9c07-8ba2bf4d5b1d\") " pod="openstack/barbican-worker-5f98bddd87-6kv6r" Oct 11 01:12:44 crc kubenswrapper[4743]: I1011 01:12:44.219871 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7f793c4b-6627-4c4b-9f2c-529641700221-config-data-custom\") pod \"barbican-keystone-listener-785cc87c98-slsn7\" (UID: \"7f793c4b-6627-4c4b-9f2c-529641700221\") " pod="openstack/barbican-keystone-listener-785cc87c98-slsn7" Oct 11 01:12:44 crc kubenswrapper[4743]: I1011 01:12:44.219898 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b8ff2f6e-e247-4ff9-9c07-8ba2bf4d5b1d-combined-ca-bundle\") pod \"barbican-worker-5f98bddd87-6kv6r\" (UID: \"b8ff2f6e-e247-4ff9-9c07-8ba2bf4d5b1d\") " pod="openstack/barbican-worker-5f98bddd87-6kv6r" Oct 11 01:12:44 crc kubenswrapper[4743]: I1011 01:12:44.220741 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f793c4b-6627-4c4b-9f2c-529641700221-combined-ca-bundle\") pod \"barbican-keystone-listener-785cc87c98-slsn7\" (UID: \"7f793c4b-6627-4c4b-9f2c-529641700221\") " pod="openstack/barbican-keystone-listener-785cc87c98-slsn7" Oct 11 01:12:44 crc kubenswrapper[4743]: I1011 01:12:44.222794 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f793c4b-6627-4c4b-9f2c-529641700221-logs\") pod \"barbican-keystone-listener-785cc87c98-slsn7\" (UID: \"7f793c4b-6627-4c4b-9f2c-529641700221\") " pod="openstack/barbican-keystone-listener-785cc87c98-slsn7" Oct 11 01:12:44 crc kubenswrapper[4743]: I1011 01:12:44.226669 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b8ff2f6e-e247-4ff9-9c07-8ba2bf4d5b1d-config-data-custom\") pod \"barbican-worker-5f98bddd87-6kv6r\" (UID: \"b8ff2f6e-e247-4ff9-9c07-8ba2bf4d5b1d\") " pod="openstack/barbican-worker-5f98bddd87-6kv6r" Oct 11 01:12:44 crc kubenswrapper[4743]: I1011 01:12:44.227529 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8ff2f6e-e247-4ff9-9c07-8ba2bf4d5b1d-config-data\") pod \"barbican-worker-5f98bddd87-6kv6r\" (UID: \"b8ff2f6e-e247-4ff9-9c07-8ba2bf4d5b1d\") " pod="openstack/barbican-worker-5f98bddd87-6kv6r" Oct 11 01:12:44 crc kubenswrapper[4743]: I1011 01:12:44.235965 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6t6m\" (UniqueName: 
\"kubernetes.io/projected/b8ff2f6e-e247-4ff9-9c07-8ba2bf4d5b1d-kube-api-access-n6t6m\") pod \"barbican-worker-5f98bddd87-6kv6r\" (UID: \"b8ff2f6e-e247-4ff9-9c07-8ba2bf4d5b1d\") " pod="openstack/barbican-worker-5f98bddd87-6kv6r" Oct 11 01:12:44 crc kubenswrapper[4743]: I1011 01:12:44.237606 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4wll\" (UniqueName: \"kubernetes.io/projected/7f793c4b-6627-4c4b-9f2c-529641700221-kube-api-access-z4wll\") pod \"barbican-keystone-listener-785cc87c98-slsn7\" (UID: \"7f793c4b-6627-4c4b-9f2c-529641700221\") " pod="openstack/barbican-keystone-listener-785cc87c98-slsn7" Oct 11 01:12:44 crc kubenswrapper[4743]: I1011 01:12:44.240783 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f793c4b-6627-4c4b-9f2c-529641700221-config-data\") pod \"barbican-keystone-listener-785cc87c98-slsn7\" (UID: \"7f793c4b-6627-4c4b-9f2c-529641700221\") " pod="openstack/barbican-keystone-listener-785cc87c98-slsn7" Oct 11 01:12:44 crc kubenswrapper[4743]: I1011 01:12:44.313230 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/129c74fb-0970-4b08-958c-b0d079c8e65a-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-tjssf\" (UID: \"129c74fb-0970-4b08-958c-b0d079c8e65a\") " pod="openstack/dnsmasq-dns-848cf88cfc-tjssf" Oct 11 01:12:44 crc kubenswrapper[4743]: I1011 01:12:44.313536 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25jm4\" (UniqueName: \"kubernetes.io/projected/ffe3be15-eb32-4556-b0a8-099aa3f9e09b-kube-api-access-25jm4\") pod \"barbican-api-8464ff7fb4-cflsg\" (UID: \"ffe3be15-eb32-4556-b0a8-099aa3f9e09b\") " pod="openstack/barbican-api-8464ff7fb4-cflsg" Oct 11 01:12:44 crc kubenswrapper[4743]: I1011 01:12:44.314304 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/129c74fb-0970-4b08-958c-b0d079c8e65a-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-tjssf\" (UID: \"129c74fb-0970-4b08-958c-b0d079c8e65a\") " pod="openstack/dnsmasq-dns-848cf88cfc-tjssf" Oct 11 01:12:44 crc kubenswrapper[4743]: I1011 01:12:44.314414 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffe3be15-eb32-4556-b0a8-099aa3f9e09b-config-data\") pod \"barbican-api-8464ff7fb4-cflsg\" (UID: \"ffe3be15-eb32-4556-b0a8-099aa3f9e09b\") " pod="openstack/barbican-api-8464ff7fb4-cflsg" Oct 11 01:12:44 crc kubenswrapper[4743]: I1011 01:12:44.314528 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/129c74fb-0970-4b08-958c-b0d079c8e65a-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-tjssf\" (UID: \"129c74fb-0970-4b08-958c-b0d079c8e65a\") " pod="openstack/dnsmasq-dns-848cf88cfc-tjssf" Oct 11 01:12:44 crc kubenswrapper[4743]: I1011 01:12:44.314628 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ffe3be15-eb32-4556-b0a8-099aa3f9e09b-config-data-custom\") pod \"barbican-api-8464ff7fb4-cflsg\" (UID: \"ffe3be15-eb32-4556-b0a8-099aa3f9e09b\") " pod="openstack/barbican-api-8464ff7fb4-cflsg" Oct 11 01:12:44 crc kubenswrapper[4743]: I1011 01:12:44.314697 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffe3be15-eb32-4556-b0a8-099aa3f9e09b-combined-ca-bundle\") pod \"barbican-api-8464ff7fb4-cflsg\" (UID: \"ffe3be15-eb32-4556-b0a8-099aa3f9e09b\") " pod="openstack/barbican-api-8464ff7fb4-cflsg" Oct 11 01:12:44 crc kubenswrapper[4743]: I1011 01:12:44.314778 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ffe3be15-eb32-4556-b0a8-099aa3f9e09b-logs\") pod \"barbican-api-8464ff7fb4-cflsg\" (UID: \"ffe3be15-eb32-4556-b0a8-099aa3f9e09b\") " pod="openstack/barbican-api-8464ff7fb4-cflsg" Oct 11 01:12:44 crc kubenswrapper[4743]: I1011 01:12:44.314879 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/129c74fb-0970-4b08-958c-b0d079c8e65a-config\") pod \"dnsmasq-dns-848cf88cfc-tjssf\" (UID: \"129c74fb-0970-4b08-958c-b0d079c8e65a\") " pod="openstack/dnsmasq-dns-848cf88cfc-tjssf" Oct 11 01:12:44 crc kubenswrapper[4743]: I1011 01:12:44.314958 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/129c74fb-0970-4b08-958c-b0d079c8e65a-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-tjssf\" (UID: \"129c74fb-0970-4b08-958c-b0d079c8e65a\") " pod="openstack/dnsmasq-dns-848cf88cfc-tjssf" Oct 11 01:12:44 crc kubenswrapper[4743]: I1011 01:12:44.315043 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xw7jl\" (UniqueName: \"kubernetes.io/projected/129c74fb-0970-4b08-958c-b0d079c8e65a-kube-api-access-xw7jl\") pod \"dnsmasq-dns-848cf88cfc-tjssf\" (UID: \"129c74fb-0970-4b08-958c-b0d079c8e65a\") " pod="openstack/dnsmasq-dns-848cf88cfc-tjssf" Oct 11 01:12:44 crc kubenswrapper[4743]: I1011 01:12:44.315330 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/129c74fb-0970-4b08-958c-b0d079c8e65a-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-tjssf\" (UID: \"129c74fb-0970-4b08-958c-b0d079c8e65a\") " pod="openstack/dnsmasq-dns-848cf88cfc-tjssf" Oct 11 01:12:44 crc kubenswrapper[4743]: I1011 01:12:44.315398 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/129c74fb-0970-4b08-958c-b0d079c8e65a-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-tjssf\" (UID: \"129c74fb-0970-4b08-958c-b0d079c8e65a\") " pod="openstack/dnsmasq-dns-848cf88cfc-tjssf" Oct 11 01:12:44 crc kubenswrapper[4743]: I1011 01:12:44.314714 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/129c74fb-0970-4b08-958c-b0d079c8e65a-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-tjssf\" (UID: \"129c74fb-0970-4b08-958c-b0d079c8e65a\") " pod="openstack/dnsmasq-dns-848cf88cfc-tjssf" Oct 11 01:12:44 crc kubenswrapper[4743]: I1011 01:12:44.316216 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/129c74fb-0970-4b08-958c-b0d079c8e65a-config\") pod \"dnsmasq-dns-848cf88cfc-tjssf\" (UID: \"129c74fb-0970-4b08-958c-b0d079c8e65a\") " pod="openstack/dnsmasq-dns-848cf88cfc-tjssf" Oct 11 01:12:44 crc kubenswrapper[4743]: I1011 01:12:44.316430 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/129c74fb-0970-4b08-958c-b0d079c8e65a-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-tjssf\" (UID: \"129c74fb-0970-4b08-958c-b0d079c8e65a\") " pod="openstack/dnsmasq-dns-848cf88cfc-tjssf" Oct 11 01:12:44 crc kubenswrapper[4743]: I1011 01:12:44.331559 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xw7jl\" (UniqueName: \"kubernetes.io/projected/129c74fb-0970-4b08-958c-b0d079c8e65a-kube-api-access-xw7jl\") pod \"dnsmasq-dns-848cf88cfc-tjssf\" (UID: \"129c74fb-0970-4b08-958c-b0d079c8e65a\") " pod="openstack/dnsmasq-dns-848cf88cfc-tjssf" Oct 11 01:12:44 crc kubenswrapper[4743]: I1011 01:12:44.342480 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-5f98bddd87-6kv6r" Oct 11 01:12:44 crc kubenswrapper[4743]: I1011 01:12:44.349655 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-785cc87c98-slsn7" Oct 11 01:12:44 crc kubenswrapper[4743]: I1011 01:12:44.413788 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-tjssf" Oct 11 01:12:44 crc kubenswrapper[4743]: I1011 01:12:44.416948 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffe3be15-eb32-4556-b0a8-099aa3f9e09b-config-data\") pod \"barbican-api-8464ff7fb4-cflsg\" (UID: \"ffe3be15-eb32-4556-b0a8-099aa3f9e09b\") " pod="openstack/barbican-api-8464ff7fb4-cflsg" Oct 11 01:12:44 crc kubenswrapper[4743]: I1011 01:12:44.417057 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ffe3be15-eb32-4556-b0a8-099aa3f9e09b-config-data-custom\") pod \"barbican-api-8464ff7fb4-cflsg\" (UID: \"ffe3be15-eb32-4556-b0a8-099aa3f9e09b\") " pod="openstack/barbican-api-8464ff7fb4-cflsg" Oct 11 01:12:44 crc kubenswrapper[4743]: I1011 01:12:44.417091 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffe3be15-eb32-4556-b0a8-099aa3f9e09b-combined-ca-bundle\") pod \"barbican-api-8464ff7fb4-cflsg\" (UID: \"ffe3be15-eb32-4556-b0a8-099aa3f9e09b\") " pod="openstack/barbican-api-8464ff7fb4-cflsg" Oct 11 01:12:44 crc kubenswrapper[4743]: I1011 01:12:44.417120 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ffe3be15-eb32-4556-b0a8-099aa3f9e09b-logs\") pod \"barbican-api-8464ff7fb4-cflsg\" (UID: \"ffe3be15-eb32-4556-b0a8-099aa3f9e09b\") " 
pod="openstack/barbican-api-8464ff7fb4-cflsg" Oct 11 01:12:44 crc kubenswrapper[4743]: I1011 01:12:44.417240 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25jm4\" (UniqueName: \"kubernetes.io/projected/ffe3be15-eb32-4556-b0a8-099aa3f9e09b-kube-api-access-25jm4\") pod \"barbican-api-8464ff7fb4-cflsg\" (UID: \"ffe3be15-eb32-4556-b0a8-099aa3f9e09b\") " pod="openstack/barbican-api-8464ff7fb4-cflsg" Oct 11 01:12:44 crc kubenswrapper[4743]: I1011 01:12:44.418366 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ffe3be15-eb32-4556-b0a8-099aa3f9e09b-logs\") pod \"barbican-api-8464ff7fb4-cflsg\" (UID: \"ffe3be15-eb32-4556-b0a8-099aa3f9e09b\") " pod="openstack/barbican-api-8464ff7fb4-cflsg" Oct 11 01:12:44 crc kubenswrapper[4743]: I1011 01:12:44.424384 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ffe3be15-eb32-4556-b0a8-099aa3f9e09b-config-data-custom\") pod \"barbican-api-8464ff7fb4-cflsg\" (UID: \"ffe3be15-eb32-4556-b0a8-099aa3f9e09b\") " pod="openstack/barbican-api-8464ff7fb4-cflsg" Oct 11 01:12:44 crc kubenswrapper[4743]: I1011 01:12:44.429930 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffe3be15-eb32-4556-b0a8-099aa3f9e09b-config-data\") pod \"barbican-api-8464ff7fb4-cflsg\" (UID: \"ffe3be15-eb32-4556-b0a8-099aa3f9e09b\") " pod="openstack/barbican-api-8464ff7fb4-cflsg" Oct 11 01:12:44 crc kubenswrapper[4743]: I1011 01:12:44.430921 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffe3be15-eb32-4556-b0a8-099aa3f9e09b-combined-ca-bundle\") pod \"barbican-api-8464ff7fb4-cflsg\" (UID: \"ffe3be15-eb32-4556-b0a8-099aa3f9e09b\") " pod="openstack/barbican-api-8464ff7fb4-cflsg" Oct 11 01:12:44 crc 
kubenswrapper[4743]: I1011 01:12:44.448760 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25jm4\" (UniqueName: \"kubernetes.io/projected/ffe3be15-eb32-4556-b0a8-099aa3f9e09b-kube-api-access-25jm4\") pod \"barbican-api-8464ff7fb4-cflsg\" (UID: \"ffe3be15-eb32-4556-b0a8-099aa3f9e09b\") " pod="openstack/barbican-api-8464ff7fb4-cflsg" Oct 11 01:12:44 crc kubenswrapper[4743]: I1011 01:12:44.457669 4743 patch_prober.go:28] interesting pod/machine-config-daemon-cvm72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 01:12:44 crc kubenswrapper[4743]: I1011 01:12:44.457982 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 01:12:44 crc kubenswrapper[4743]: I1011 01:12:44.501987 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-8464ff7fb4-cflsg" Oct 11 01:12:44 crc kubenswrapper[4743]: I1011 01:12:44.815553 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-785cc87c98-slsn7"] Oct 11 01:12:44 crc kubenswrapper[4743]: W1011 01:12:44.819634 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f793c4b_6627_4c4b_9f2c_529641700221.slice/crio-355332f7fa19661b39266cfb007d7ba9248efcfa443a1b8054309e69c4f0ebc6 WatchSource:0}: Error finding container 355332f7fa19661b39266cfb007d7ba9248efcfa443a1b8054309e69c4f0ebc6: Status 404 returned error can't find the container with id 355332f7fa19661b39266cfb007d7ba9248efcfa443a1b8054309e69c4f0ebc6 Oct 11 01:12:44 crc kubenswrapper[4743]: I1011 01:12:44.864150 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5f98bddd87-6kv6r"] Oct 11 01:12:44 crc kubenswrapper[4743]: W1011 01:12:44.869981 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8ff2f6e_e247_4ff9_9c07_8ba2bf4d5b1d.slice/crio-91278b959edfe85f154dca948e188ee5a7425abcded489568571fc5c64538468 WatchSource:0}: Error finding container 91278b959edfe85f154dca948e188ee5a7425abcded489568571fc5c64538468: Status 404 returned error can't find the container with id 91278b959edfe85f154dca948e188ee5a7425abcded489568571fc5c64538468 Oct 11 01:12:45 crc kubenswrapper[4743]: I1011 01:12:45.048517 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-tjssf"] Oct 11 01:12:45 crc kubenswrapper[4743]: I1011 01:12:45.070460 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-8464ff7fb4-cflsg"] Oct 11 01:12:45 crc kubenswrapper[4743]: I1011 01:12:45.262626 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-przwv" Oct 11 01:12:45 crc kubenswrapper[4743]: I1011 01:12:45.336123 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/65f773e4-09e5-4312-8c6b-9176a1a022f0-etc-machine-id\") pod \"65f773e4-09e5-4312-8c6b-9176a1a022f0\" (UID: \"65f773e4-09e5-4312-8c6b-9176a1a022f0\") " Oct 11 01:12:45 crc kubenswrapper[4743]: I1011 01:12:45.336459 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlw9m\" (UniqueName: \"kubernetes.io/projected/65f773e4-09e5-4312-8c6b-9176a1a022f0-kube-api-access-dlw9m\") pod \"65f773e4-09e5-4312-8c6b-9176a1a022f0\" (UID: \"65f773e4-09e5-4312-8c6b-9176a1a022f0\") " Oct 11 01:12:45 crc kubenswrapper[4743]: I1011 01:12:45.336605 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65f773e4-09e5-4312-8c6b-9176a1a022f0-scripts\") pod \"65f773e4-09e5-4312-8c6b-9176a1a022f0\" (UID: \"65f773e4-09e5-4312-8c6b-9176a1a022f0\") " Oct 11 01:12:45 crc kubenswrapper[4743]: I1011 01:12:45.337531 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/65f773e4-09e5-4312-8c6b-9176a1a022f0-db-sync-config-data\") pod \"65f773e4-09e5-4312-8c6b-9176a1a022f0\" (UID: \"65f773e4-09e5-4312-8c6b-9176a1a022f0\") " Oct 11 01:12:45 crc kubenswrapper[4743]: I1011 01:12:45.338278 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65f773e4-09e5-4312-8c6b-9176a1a022f0-config-data\") pod \"65f773e4-09e5-4312-8c6b-9176a1a022f0\" (UID: \"65f773e4-09e5-4312-8c6b-9176a1a022f0\") " Oct 11 01:12:45 crc kubenswrapper[4743]: I1011 01:12:45.338485 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/65f773e4-09e5-4312-8c6b-9176a1a022f0-combined-ca-bundle\") pod \"65f773e4-09e5-4312-8c6b-9176a1a022f0\" (UID: \"65f773e4-09e5-4312-8c6b-9176a1a022f0\") " Oct 11 01:12:45 crc kubenswrapper[4743]: I1011 01:12:45.336259 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/65f773e4-09e5-4312-8c6b-9176a1a022f0-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "65f773e4-09e5-4312-8c6b-9176a1a022f0" (UID: "65f773e4-09e5-4312-8c6b-9176a1a022f0"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 01:12:45 crc kubenswrapper[4743]: I1011 01:12:45.339884 4743 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/65f773e4-09e5-4312-8c6b-9176a1a022f0-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 11 01:12:45 crc kubenswrapper[4743]: I1011 01:12:45.342353 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65f773e4-09e5-4312-8c6b-9176a1a022f0-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "65f773e4-09e5-4312-8c6b-9176a1a022f0" (UID: "65f773e4-09e5-4312-8c6b-9176a1a022f0"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:12:45 crc kubenswrapper[4743]: I1011 01:12:45.342616 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65f773e4-09e5-4312-8c6b-9176a1a022f0-scripts" (OuterVolumeSpecName: "scripts") pod "65f773e4-09e5-4312-8c6b-9176a1a022f0" (UID: "65f773e4-09e5-4312-8c6b-9176a1a022f0"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:12:45 crc kubenswrapper[4743]: I1011 01:12:45.342639 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65f773e4-09e5-4312-8c6b-9176a1a022f0-kube-api-access-dlw9m" (OuterVolumeSpecName: "kube-api-access-dlw9m") pod "65f773e4-09e5-4312-8c6b-9176a1a022f0" (UID: "65f773e4-09e5-4312-8c6b-9176a1a022f0"). InnerVolumeSpecName "kube-api-access-dlw9m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:12:45 crc kubenswrapper[4743]: I1011 01:12:45.363713 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65f773e4-09e5-4312-8c6b-9176a1a022f0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "65f773e4-09e5-4312-8c6b-9176a1a022f0" (UID: "65f773e4-09e5-4312-8c6b-9176a1a022f0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:12:45 crc kubenswrapper[4743]: I1011 01:12:45.409262 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65f773e4-09e5-4312-8c6b-9176a1a022f0-config-data" (OuterVolumeSpecName: "config-data") pod "65f773e4-09e5-4312-8c6b-9176a1a022f0" (UID: "65f773e4-09e5-4312-8c6b-9176a1a022f0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:12:45 crc kubenswrapper[4743]: I1011 01:12:45.441945 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dlw9m\" (UniqueName: \"kubernetes.io/projected/65f773e4-09e5-4312-8c6b-9176a1a022f0-kube-api-access-dlw9m\") on node \"crc\" DevicePath \"\"" Oct 11 01:12:45 crc kubenswrapper[4743]: I1011 01:12:45.441978 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65f773e4-09e5-4312-8c6b-9176a1a022f0-scripts\") on node \"crc\" DevicePath \"\"" Oct 11 01:12:45 crc kubenswrapper[4743]: I1011 01:12:45.441988 4743 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/65f773e4-09e5-4312-8c6b-9176a1a022f0-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 11 01:12:45 crc kubenswrapper[4743]: I1011 01:12:45.441996 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65f773e4-09e5-4312-8c6b-9176a1a022f0-config-data\") on node \"crc\" DevicePath \"\"" Oct 11 01:12:45 crc kubenswrapper[4743]: I1011 01:12:45.442005 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65f773e4-09e5-4312-8c6b-9176a1a022f0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 01:12:45 crc kubenswrapper[4743]: I1011 01:12:45.698403 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-8464ff7fb4-cflsg" event={"ID":"ffe3be15-eb32-4556-b0a8-099aa3f9e09b","Type":"ContainerStarted","Data":"d4a7bd7d5a4901c481938f32145318f55b80f105120d6621da256b1884c3e303"} Oct 11 01:12:45 crc kubenswrapper[4743]: I1011 01:12:45.698451 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-8464ff7fb4-cflsg" 
event={"ID":"ffe3be15-eb32-4556-b0a8-099aa3f9e09b","Type":"ContainerStarted","Data":"accc35cb6f42fa340ac6711d731006e67208463df221504816393af32b3ce5f2"} Oct 11 01:12:45 crc kubenswrapper[4743]: I1011 01:12:45.699474 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-785cc87c98-slsn7" event={"ID":"7f793c4b-6627-4c4b-9f2c-529641700221","Type":"ContainerStarted","Data":"355332f7fa19661b39266cfb007d7ba9248efcfa443a1b8054309e69c4f0ebc6"} Oct 11 01:12:45 crc kubenswrapper[4743]: I1011 01:12:45.701033 4743 generic.go:334] "Generic (PLEG): container finished" podID="129c74fb-0970-4b08-958c-b0d079c8e65a" containerID="4e12013ab5986bd51d18a28768e1d34e44ee9f24d9e4e67c430e65d484e05711" exitCode=0 Oct 11 01:12:45 crc kubenswrapper[4743]: I1011 01:12:45.701077 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-tjssf" event={"ID":"129c74fb-0970-4b08-958c-b0d079c8e65a","Type":"ContainerDied","Data":"4e12013ab5986bd51d18a28768e1d34e44ee9f24d9e4e67c430e65d484e05711"} Oct 11 01:12:45 crc kubenswrapper[4743]: I1011 01:12:45.701145 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-tjssf" event={"ID":"129c74fb-0970-4b08-958c-b0d079c8e65a","Type":"ContainerStarted","Data":"5bb155e2ccc540894ac2c0375423fe6bd2e6f8727606f6298d4a794e7bbf590b"} Oct 11 01:12:45 crc kubenswrapper[4743]: I1011 01:12:45.716617 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-przwv" Oct 11 01:12:45 crc kubenswrapper[4743]: I1011 01:12:45.717154 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-przwv" event={"ID":"65f773e4-09e5-4312-8c6b-9176a1a022f0","Type":"ContainerDied","Data":"69fb0628444ca1a53808d3b17d168b4ae98929f731e2b8062727367ca9ebedc4"} Oct 11 01:12:45 crc kubenswrapper[4743]: I1011 01:12:45.717193 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="69fb0628444ca1a53808d3b17d168b4ae98929f731e2b8062727367ca9ebedc4" Oct 11 01:12:45 crc kubenswrapper[4743]: I1011 01:12:45.722270 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5f98bddd87-6kv6r" event={"ID":"b8ff2f6e-e247-4ff9-9c07-8ba2bf4d5b1d","Type":"ContainerStarted","Data":"91278b959edfe85f154dca948e188ee5a7425abcded489568571fc5c64538468"} Oct 11 01:12:45 crc kubenswrapper[4743]: I1011 01:12:45.872288 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 11 01:12:45 crc kubenswrapper[4743]: E1011 01:12:45.872895 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65f773e4-09e5-4312-8c6b-9176a1a022f0" containerName="cinder-db-sync" Oct 11 01:12:45 crc kubenswrapper[4743]: I1011 01:12:45.872920 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="65f773e4-09e5-4312-8c6b-9176a1a022f0" containerName="cinder-db-sync" Oct 11 01:12:45 crc kubenswrapper[4743]: I1011 01:12:45.873175 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="65f773e4-09e5-4312-8c6b-9176a1a022f0" containerName="cinder-db-sync" Oct 11 01:12:45 crc kubenswrapper[4743]: I1011 01:12:45.874276 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 11 01:12:45 crc kubenswrapper[4743]: I1011 01:12:45.880001 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-sdbk7" Oct 11 01:12:45 crc kubenswrapper[4743]: I1011 01:12:45.880128 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 11 01:12:45 crc kubenswrapper[4743]: I1011 01:12:45.880683 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 11 01:12:45 crc kubenswrapper[4743]: I1011 01:12:45.880928 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 11 01:12:45 crc kubenswrapper[4743]: I1011 01:12:45.888754 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 11 01:12:45 crc kubenswrapper[4743]: I1011 01:12:45.963273 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/47a67cb1-322e-4da4-b240-1a89ff62fa51-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"47a67cb1-322e-4da4-b240-1a89ff62fa51\") " pod="openstack/cinder-scheduler-0" Oct 11 01:12:45 crc kubenswrapper[4743]: I1011 01:12:45.963457 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47a67cb1-322e-4da4-b240-1a89ff62fa51-config-data\") pod \"cinder-scheduler-0\" (UID: \"47a67cb1-322e-4da4-b240-1a89ff62fa51\") " pod="openstack/cinder-scheduler-0" Oct 11 01:12:45 crc kubenswrapper[4743]: I1011 01:12:45.963531 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/47a67cb1-322e-4da4-b240-1a89ff62fa51-config-data-custom\") pod \"cinder-scheduler-0\" (UID: 
\"47a67cb1-322e-4da4-b240-1a89ff62fa51\") " pod="openstack/cinder-scheduler-0" Oct 11 01:12:45 crc kubenswrapper[4743]: I1011 01:12:45.963632 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqmks\" (UniqueName: \"kubernetes.io/projected/47a67cb1-322e-4da4-b240-1a89ff62fa51-kube-api-access-tqmks\") pod \"cinder-scheduler-0\" (UID: \"47a67cb1-322e-4da4-b240-1a89ff62fa51\") " pod="openstack/cinder-scheduler-0" Oct 11 01:12:45 crc kubenswrapper[4743]: I1011 01:12:45.963754 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47a67cb1-322e-4da4-b240-1a89ff62fa51-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"47a67cb1-322e-4da4-b240-1a89ff62fa51\") " pod="openstack/cinder-scheduler-0" Oct 11 01:12:45 crc kubenswrapper[4743]: I1011 01:12:45.963840 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47a67cb1-322e-4da4-b240-1a89ff62fa51-scripts\") pod \"cinder-scheduler-0\" (UID: \"47a67cb1-322e-4da4-b240-1a89ff62fa51\") " pod="openstack/cinder-scheduler-0" Oct 11 01:12:45 crc kubenswrapper[4743]: I1011 01:12:45.968504 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-tjssf"] Oct 11 01:12:45 crc kubenswrapper[4743]: I1011 01:12:45.984929 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-bg4dw"] Oct 11 01:12:45 crc kubenswrapper[4743]: I1011 01:12:45.986978 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-bg4dw" Oct 11 01:12:45 crc kubenswrapper[4743]: I1011 01:12:45.990373 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-bg4dw"] Oct 11 01:12:46 crc kubenswrapper[4743]: I1011 01:12:46.065621 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef875954-7f31-4d4d-acec-56789e002001-config\") pod \"dnsmasq-dns-6578955fd5-bg4dw\" (UID: \"ef875954-7f31-4d4d-acec-56789e002001\") " pod="openstack/dnsmasq-dns-6578955fd5-bg4dw" Oct 11 01:12:46 crc kubenswrapper[4743]: I1011 01:12:46.067089 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/47a67cb1-322e-4da4-b240-1a89ff62fa51-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"47a67cb1-322e-4da4-b240-1a89ff62fa51\") " pod="openstack/cinder-scheduler-0" Oct 11 01:12:46 crc kubenswrapper[4743]: I1011 01:12:46.067155 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef875954-7f31-4d4d-acec-56789e002001-dns-svc\") pod \"dnsmasq-dns-6578955fd5-bg4dw\" (UID: \"ef875954-7f31-4d4d-acec-56789e002001\") " pod="openstack/dnsmasq-dns-6578955fd5-bg4dw" Oct 11 01:12:46 crc kubenswrapper[4743]: I1011 01:12:46.067181 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2f2xk\" (UniqueName: \"kubernetes.io/projected/ef875954-7f31-4d4d-acec-56789e002001-kube-api-access-2f2xk\") pod \"dnsmasq-dns-6578955fd5-bg4dw\" (UID: \"ef875954-7f31-4d4d-acec-56789e002001\") " pod="openstack/dnsmasq-dns-6578955fd5-bg4dw" Oct 11 01:12:46 crc kubenswrapper[4743]: I1011 01:12:46.067210 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/47a67cb1-322e-4da4-b240-1a89ff62fa51-config-data\") pod \"cinder-scheduler-0\" (UID: \"47a67cb1-322e-4da4-b240-1a89ff62fa51\") " pod="openstack/cinder-scheduler-0" Oct 11 01:12:46 crc kubenswrapper[4743]: I1011 01:12:46.067233 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ef875954-7f31-4d4d-acec-56789e002001-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-bg4dw\" (UID: \"ef875954-7f31-4d4d-acec-56789e002001\") " pod="openstack/dnsmasq-dns-6578955fd5-bg4dw" Oct 11 01:12:46 crc kubenswrapper[4743]: I1011 01:12:46.067268 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/47a67cb1-322e-4da4-b240-1a89ff62fa51-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"47a67cb1-322e-4da4-b240-1a89ff62fa51\") " pod="openstack/cinder-scheduler-0" Oct 11 01:12:46 crc kubenswrapper[4743]: I1011 01:12:46.067316 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqmks\" (UniqueName: \"kubernetes.io/projected/47a67cb1-322e-4da4-b240-1a89ff62fa51-kube-api-access-tqmks\") pod \"cinder-scheduler-0\" (UID: \"47a67cb1-322e-4da4-b240-1a89ff62fa51\") " pod="openstack/cinder-scheduler-0" Oct 11 01:12:46 crc kubenswrapper[4743]: I1011 01:12:46.067363 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ef875954-7f31-4d4d-acec-56789e002001-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-bg4dw\" (UID: \"ef875954-7f31-4d4d-acec-56789e002001\") " pod="openstack/dnsmasq-dns-6578955fd5-bg4dw" Oct 11 01:12:46 crc kubenswrapper[4743]: I1011 01:12:46.067421 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/ef875954-7f31-4d4d-acec-56789e002001-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-bg4dw\" (UID: \"ef875954-7f31-4d4d-acec-56789e002001\") " pod="openstack/dnsmasq-dns-6578955fd5-bg4dw" Oct 11 01:12:46 crc kubenswrapper[4743]: I1011 01:12:46.067609 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47a67cb1-322e-4da4-b240-1a89ff62fa51-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"47a67cb1-322e-4da4-b240-1a89ff62fa51\") " pod="openstack/cinder-scheduler-0" Oct 11 01:12:46 crc kubenswrapper[4743]: I1011 01:12:46.067660 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47a67cb1-322e-4da4-b240-1a89ff62fa51-scripts\") pod \"cinder-scheduler-0\" (UID: \"47a67cb1-322e-4da4-b240-1a89ff62fa51\") " pod="openstack/cinder-scheduler-0" Oct 11 01:12:46 crc kubenswrapper[4743]: I1011 01:12:46.067886 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/47a67cb1-322e-4da4-b240-1a89ff62fa51-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"47a67cb1-322e-4da4-b240-1a89ff62fa51\") " pod="openstack/cinder-scheduler-0" Oct 11 01:12:46 crc kubenswrapper[4743]: I1011 01:12:46.074049 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 11 01:12:46 crc kubenswrapper[4743]: I1011 01:12:46.074249 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 11 01:12:46 crc kubenswrapper[4743]: I1011 01:12:46.074317 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47a67cb1-322e-4da4-b240-1a89ff62fa51-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"47a67cb1-322e-4da4-b240-1a89ff62fa51\") " pod="openstack/cinder-scheduler-0" Oct 11 
01:12:46 crc kubenswrapper[4743]: I1011 01:12:46.074392 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 11 01:12:46 crc kubenswrapper[4743]: E1011 01:12:46.074454 4743 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Oct 11 01:12:46 crc kubenswrapper[4743]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/129c74fb-0970-4b08-958c-b0d079c8e65a/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Oct 11 01:12:46 crc kubenswrapper[4743]: > podSandboxID="5bb155e2ccc540894ac2c0375423fe6bd2e6f8727606f6298d4a794e7bbf590b" Oct 11 01:12:46 crc kubenswrapper[4743]: E1011 01:12:46.074597 4743 kuberuntime_manager.go:1274] "Unhandled Error" err=< Oct 11 01:12:46 crc kubenswrapper[4743]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv 
--log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n99h8bhd9h696h649h588h5c6h658h5b4h57fh65h89h5f5h56h696h5dh8h57h597h68ch568h58dh66hf4h675h598h588h67dhb5h69h5dh6bq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-swift-storage-0,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-swift-storage-0,SubPath:dns-swift-storage-0,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-nb,SubPath:ovsdbserver-nb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-sb,SubPath:ovsdbserver-sb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xw7jl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-848cf88cfc-tjssf_openstack(129c74fb-0970-4b08-958c-b0d079c8e65a): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/129c74fb-0970-4b08-958c-b0d079c8e65a/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Oct 11 01:12:46 crc kubenswrapper[4743]: > logger="UnhandledError" Oct 11 01:12:46 crc kubenswrapper[4743]: E1011 01:12:46.076028 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/129c74fb-0970-4b08-958c-b0d079c8e65a/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-848cf88cfc-tjssf" podUID="129c74fb-0970-4b08-958c-b0d079c8e65a" Oct 11 01:12:46 crc kubenswrapper[4743]: I1011 01:12:46.083480 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/47a67cb1-322e-4da4-b240-1a89ff62fa51-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"47a67cb1-322e-4da4-b240-1a89ff62fa51\") " 
pod="openstack/cinder-scheduler-0" Oct 11 01:12:46 crc kubenswrapper[4743]: I1011 01:12:46.084736 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47a67cb1-322e-4da4-b240-1a89ff62fa51-scripts\") pod \"cinder-scheduler-0\" (UID: \"47a67cb1-322e-4da4-b240-1a89ff62fa51\") " pod="openstack/cinder-scheduler-0" Oct 11 01:12:46 crc kubenswrapper[4743]: I1011 01:12:46.089605 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqmks\" (UniqueName: \"kubernetes.io/projected/47a67cb1-322e-4da4-b240-1a89ff62fa51-kube-api-access-tqmks\") pod \"cinder-scheduler-0\" (UID: \"47a67cb1-322e-4da4-b240-1a89ff62fa51\") " pod="openstack/cinder-scheduler-0" Oct 11 01:12:46 crc kubenswrapper[4743]: I1011 01:12:46.120083 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47a67cb1-322e-4da4-b240-1a89ff62fa51-config-data\") pod \"cinder-scheduler-0\" (UID: \"47a67cb1-322e-4da4-b240-1a89ff62fa51\") " pod="openstack/cinder-scheduler-0" Oct 11 01:12:46 crc kubenswrapper[4743]: I1011 01:12:46.170468 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ef875954-7f31-4d4d-acec-56789e002001-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-bg4dw\" (UID: \"ef875954-7f31-4d4d-acec-56789e002001\") " pod="openstack/dnsmasq-dns-6578955fd5-bg4dw" Oct 11 01:12:46 crc kubenswrapper[4743]: I1011 01:12:46.170565 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ef875954-7f31-4d4d-acec-56789e002001-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-bg4dw\" (UID: \"ef875954-7f31-4d4d-acec-56789e002001\") " pod="openstack/dnsmasq-dns-6578955fd5-bg4dw" Oct 11 01:12:46 crc kubenswrapper[4743]: I1011 01:12:46.170613 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ef875954-7f31-4d4d-acec-56789e002001-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-bg4dw\" (UID: \"ef875954-7f31-4d4d-acec-56789e002001\") " pod="openstack/dnsmasq-dns-6578955fd5-bg4dw" Oct 11 01:12:46 crc kubenswrapper[4743]: I1011 01:12:46.170687 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef875954-7f31-4d4d-acec-56789e002001-config\") pod \"dnsmasq-dns-6578955fd5-bg4dw\" (UID: \"ef875954-7f31-4d4d-acec-56789e002001\") " pod="openstack/dnsmasq-dns-6578955fd5-bg4dw" Oct 11 01:12:46 crc kubenswrapper[4743]: I1011 01:12:46.170743 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef875954-7f31-4d4d-acec-56789e002001-dns-svc\") pod \"dnsmasq-dns-6578955fd5-bg4dw\" (UID: \"ef875954-7f31-4d4d-acec-56789e002001\") " pod="openstack/dnsmasq-dns-6578955fd5-bg4dw" Oct 11 01:12:46 crc kubenswrapper[4743]: I1011 01:12:46.170774 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2f2xk\" (UniqueName: \"kubernetes.io/projected/ef875954-7f31-4d4d-acec-56789e002001-kube-api-access-2f2xk\") pod \"dnsmasq-dns-6578955fd5-bg4dw\" (UID: \"ef875954-7f31-4d4d-acec-56789e002001\") " pod="openstack/dnsmasq-dns-6578955fd5-bg4dw" Oct 11 01:12:46 crc kubenswrapper[4743]: I1011 01:12:46.171903 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ef875954-7f31-4d4d-acec-56789e002001-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-bg4dw\" (UID: \"ef875954-7f31-4d4d-acec-56789e002001\") " pod="openstack/dnsmasq-dns-6578955fd5-bg4dw" Oct 11 01:12:46 crc kubenswrapper[4743]: I1011 01:12:46.173808 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ef875954-7f31-4d4d-acec-56789e002001-config\") pod \"dnsmasq-dns-6578955fd5-bg4dw\" (UID: \"ef875954-7f31-4d4d-acec-56789e002001\") " pod="openstack/dnsmasq-dns-6578955fd5-bg4dw" Oct 11 01:12:46 crc kubenswrapper[4743]: I1011 01:12:46.174378 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ef875954-7f31-4d4d-acec-56789e002001-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-bg4dw\" (UID: \"ef875954-7f31-4d4d-acec-56789e002001\") " pod="openstack/dnsmasq-dns-6578955fd5-bg4dw" Oct 11 01:12:46 crc kubenswrapper[4743]: I1011 01:12:46.175093 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef875954-7f31-4d4d-acec-56789e002001-dns-svc\") pod \"dnsmasq-dns-6578955fd5-bg4dw\" (UID: \"ef875954-7f31-4d4d-acec-56789e002001\") " pod="openstack/dnsmasq-dns-6578955fd5-bg4dw" Oct 11 01:12:46 crc kubenswrapper[4743]: I1011 01:12:46.183618 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ef875954-7f31-4d4d-acec-56789e002001-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-bg4dw\" (UID: \"ef875954-7f31-4d4d-acec-56789e002001\") " pod="openstack/dnsmasq-dns-6578955fd5-bg4dw" Oct 11 01:12:46 crc kubenswrapper[4743]: I1011 01:12:46.190932 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 11 01:12:46 crc kubenswrapper[4743]: I1011 01:12:46.196658 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 11 01:12:46 crc kubenswrapper[4743]: I1011 01:12:46.205817 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 11 01:12:46 crc kubenswrapper[4743]: I1011 01:12:46.208343 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2f2xk\" (UniqueName: \"kubernetes.io/projected/ef875954-7f31-4d4d-acec-56789e002001-kube-api-access-2f2xk\") pod \"dnsmasq-dns-6578955fd5-bg4dw\" (UID: \"ef875954-7f31-4d4d-acec-56789e002001\") " pod="openstack/dnsmasq-dns-6578955fd5-bg4dw" Oct 11 01:12:46 crc kubenswrapper[4743]: I1011 01:12:46.222974 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-sdbk7" Oct 11 01:12:46 crc kubenswrapper[4743]: I1011 01:12:46.223129 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 11 01:12:46 crc kubenswrapper[4743]: I1011 01:12:46.231650 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 11 01:12:46 crc kubenswrapper[4743]: I1011 01:12:46.278321 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/61c1298d-9f47-46f2-8ed9-53e958ba4893-config-data-custom\") pod \"cinder-api-0\" (UID: \"61c1298d-9f47-46f2-8ed9-53e958ba4893\") " pod="openstack/cinder-api-0" Oct 11 01:12:46 crc kubenswrapper[4743]: I1011 01:12:46.279035 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61c1298d-9f47-46f2-8ed9-53e958ba4893-logs\") pod \"cinder-api-0\" (UID: \"61c1298d-9f47-46f2-8ed9-53e958ba4893\") " pod="openstack/cinder-api-0" Oct 11 01:12:46 crc kubenswrapper[4743]: I1011 01:12:46.279177 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61c1298d-9f47-46f2-8ed9-53e958ba4893-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"61c1298d-9f47-46f2-8ed9-53e958ba4893\") " pod="openstack/cinder-api-0" Oct 11 01:12:46 crc kubenswrapper[4743]: I1011 01:12:46.279286 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrbgl\" (UniqueName: \"kubernetes.io/projected/61c1298d-9f47-46f2-8ed9-53e958ba4893-kube-api-access-xrbgl\") pod \"cinder-api-0\" (UID: \"61c1298d-9f47-46f2-8ed9-53e958ba4893\") " pod="openstack/cinder-api-0" Oct 11 01:12:46 crc kubenswrapper[4743]: I1011 01:12:46.279607 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61c1298d-9f47-46f2-8ed9-53e958ba4893-config-data\") pod \"cinder-api-0\" (UID: \"61c1298d-9f47-46f2-8ed9-53e958ba4893\") " pod="openstack/cinder-api-0" Oct 11 01:12:46 crc kubenswrapper[4743]: I1011 01:12:46.280649 
4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61c1298d-9f47-46f2-8ed9-53e958ba4893-scripts\") pod \"cinder-api-0\" (UID: \"61c1298d-9f47-46f2-8ed9-53e958ba4893\") " pod="openstack/cinder-api-0" Oct 11 01:12:46 crc kubenswrapper[4743]: I1011 01:12:46.281126 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/61c1298d-9f47-46f2-8ed9-53e958ba4893-etc-machine-id\") pod \"cinder-api-0\" (UID: \"61c1298d-9f47-46f2-8ed9-53e958ba4893\") " pod="openstack/cinder-api-0" Oct 11 01:12:46 crc kubenswrapper[4743]: I1011 01:12:46.404091 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/61c1298d-9f47-46f2-8ed9-53e958ba4893-config-data-custom\") pod \"cinder-api-0\" (UID: \"61c1298d-9f47-46f2-8ed9-53e958ba4893\") " pod="openstack/cinder-api-0" Oct 11 01:12:46 crc kubenswrapper[4743]: I1011 01:12:46.404345 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61c1298d-9f47-46f2-8ed9-53e958ba4893-logs\") pod \"cinder-api-0\" (UID: \"61c1298d-9f47-46f2-8ed9-53e958ba4893\") " pod="openstack/cinder-api-0" Oct 11 01:12:46 crc kubenswrapper[4743]: I1011 01:12:46.404370 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61c1298d-9f47-46f2-8ed9-53e958ba4893-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"61c1298d-9f47-46f2-8ed9-53e958ba4893\") " pod="openstack/cinder-api-0" Oct 11 01:12:46 crc kubenswrapper[4743]: I1011 01:12:46.404386 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrbgl\" (UniqueName: 
\"kubernetes.io/projected/61c1298d-9f47-46f2-8ed9-53e958ba4893-kube-api-access-xrbgl\") pod \"cinder-api-0\" (UID: \"61c1298d-9f47-46f2-8ed9-53e958ba4893\") " pod="openstack/cinder-api-0" Oct 11 01:12:46 crc kubenswrapper[4743]: I1011 01:12:46.404404 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61c1298d-9f47-46f2-8ed9-53e958ba4893-config-data\") pod \"cinder-api-0\" (UID: \"61c1298d-9f47-46f2-8ed9-53e958ba4893\") " pod="openstack/cinder-api-0" Oct 11 01:12:46 crc kubenswrapper[4743]: I1011 01:12:46.404471 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61c1298d-9f47-46f2-8ed9-53e958ba4893-scripts\") pod \"cinder-api-0\" (UID: \"61c1298d-9f47-46f2-8ed9-53e958ba4893\") " pod="openstack/cinder-api-0" Oct 11 01:12:46 crc kubenswrapper[4743]: I1011 01:12:46.404504 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/61c1298d-9f47-46f2-8ed9-53e958ba4893-etc-machine-id\") pod \"cinder-api-0\" (UID: \"61c1298d-9f47-46f2-8ed9-53e958ba4893\") " pod="openstack/cinder-api-0" Oct 11 01:12:46 crc kubenswrapper[4743]: I1011 01:12:46.405641 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61c1298d-9f47-46f2-8ed9-53e958ba4893-logs\") pod \"cinder-api-0\" (UID: \"61c1298d-9f47-46f2-8ed9-53e958ba4893\") " pod="openstack/cinder-api-0" Oct 11 01:12:46 crc kubenswrapper[4743]: I1011 01:12:46.404609 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/61c1298d-9f47-46f2-8ed9-53e958ba4893-etc-machine-id\") pod \"cinder-api-0\" (UID: \"61c1298d-9f47-46f2-8ed9-53e958ba4893\") " pod="openstack/cinder-api-0" Oct 11 01:12:46 crc kubenswrapper[4743]: I1011 01:12:46.420304 4743 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/61c1298d-9f47-46f2-8ed9-53e958ba4893-config-data-custom\") pod \"cinder-api-0\" (UID: \"61c1298d-9f47-46f2-8ed9-53e958ba4893\") " pod="openstack/cinder-api-0" Oct 11 01:12:46 crc kubenswrapper[4743]: I1011 01:12:46.425571 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61c1298d-9f47-46f2-8ed9-53e958ba4893-config-data\") pod \"cinder-api-0\" (UID: \"61c1298d-9f47-46f2-8ed9-53e958ba4893\") " pod="openstack/cinder-api-0" Oct 11 01:12:46 crc kubenswrapper[4743]: I1011 01:12:46.427014 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrbgl\" (UniqueName: \"kubernetes.io/projected/61c1298d-9f47-46f2-8ed9-53e958ba4893-kube-api-access-xrbgl\") pod \"cinder-api-0\" (UID: \"61c1298d-9f47-46f2-8ed9-53e958ba4893\") " pod="openstack/cinder-api-0" Oct 11 01:12:46 crc kubenswrapper[4743]: I1011 01:12:46.428238 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61c1298d-9f47-46f2-8ed9-53e958ba4893-scripts\") pod \"cinder-api-0\" (UID: \"61c1298d-9f47-46f2-8ed9-53e958ba4893\") " pod="openstack/cinder-api-0" Oct 11 01:12:46 crc kubenswrapper[4743]: I1011 01:12:46.437004 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61c1298d-9f47-46f2-8ed9-53e958ba4893-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"61c1298d-9f47-46f2-8ed9-53e958ba4893\") " pod="openstack/cinder-api-0" Oct 11 01:12:46 crc kubenswrapper[4743]: I1011 01:12:46.483426 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-bg4dw" Oct 11 01:12:46 crc kubenswrapper[4743]: I1011 01:12:46.724796 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 11 01:12:46 crc kubenswrapper[4743]: I1011 01:12:46.779511 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-8464ff7fb4-cflsg" event={"ID":"ffe3be15-eb32-4556-b0a8-099aa3f9e09b","Type":"ContainerStarted","Data":"b16be9c60d9b68b7eb2c64a3b283c287454d626202e5f9800cad210192e952fa"} Oct 11 01:12:46 crc kubenswrapper[4743]: I1011 01:12:46.779551 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-8464ff7fb4-cflsg" Oct 11 01:12:46 crc kubenswrapper[4743]: I1011 01:12:46.779899 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-8464ff7fb4-cflsg" Oct 11 01:12:46 crc kubenswrapper[4743]: I1011 01:12:46.823112 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 11 01:12:46 crc kubenswrapper[4743]: I1011 01:12:46.827367 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-8464ff7fb4-cflsg" podStartSLOduration=2.827350914 podStartE2EDuration="2.827350914s" podCreationTimestamp="2025-10-11 01:12:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 01:12:46.802339288 +0000 UTC m=+1261.455319705" watchObservedRunningTime="2025-10-11 01:12:46.827350914 +0000 UTC m=+1261.480331301" Oct 11 01:12:47 crc kubenswrapper[4743]: I1011 01:12:47.003704 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-bg4dw"] Oct 11 01:12:47 crc kubenswrapper[4743]: W1011 01:12:47.233289 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef875954_7f31_4d4d_acec_56789e002001.slice/crio-589f592d9fe67ced8fc084adbbc61f2bfb0afcf430a0e99c36dfaf9fd836c9c4 WatchSource:0}: Error finding container 
589f592d9fe67ced8fc084adbbc61f2bfb0afcf430a0e99c36dfaf9fd836c9c4: Status 404 returned error can't find the container with id 589f592d9fe67ced8fc084adbbc61f2bfb0afcf430a0e99c36dfaf9fd836c9c4 Oct 11 01:12:47 crc kubenswrapper[4743]: W1011 01:12:47.236138 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod47a67cb1_322e_4da4_b240_1a89ff62fa51.slice/crio-8c3adb3c7faffca0559338d0856a94e83c99a1f7bf5b63c7710dcc1ba3d96069 WatchSource:0}: Error finding container 8c3adb3c7faffca0559338d0856a94e83c99a1f7bf5b63c7710dcc1ba3d96069: Status 404 returned error can't find the container with id 8c3adb3c7faffca0559338d0856a94e83c99a1f7bf5b63c7710dcc1ba3d96069 Oct 11 01:12:47 crc kubenswrapper[4743]: I1011 01:12:47.324165 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-tjssf" Oct 11 01:12:47 crc kubenswrapper[4743]: I1011 01:12:47.426751 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/129c74fb-0970-4b08-958c-b0d079c8e65a-ovsdbserver-sb\") pod \"129c74fb-0970-4b08-958c-b0d079c8e65a\" (UID: \"129c74fb-0970-4b08-958c-b0d079c8e65a\") " Oct 11 01:12:47 crc kubenswrapper[4743]: I1011 01:12:47.427213 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/129c74fb-0970-4b08-958c-b0d079c8e65a-config\") pod \"129c74fb-0970-4b08-958c-b0d079c8e65a\" (UID: \"129c74fb-0970-4b08-958c-b0d079c8e65a\") " Oct 11 01:12:47 crc kubenswrapper[4743]: I1011 01:12:47.427614 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/129c74fb-0970-4b08-958c-b0d079c8e65a-dns-swift-storage-0\") pod \"129c74fb-0970-4b08-958c-b0d079c8e65a\" (UID: \"129c74fb-0970-4b08-958c-b0d079c8e65a\") " Oct 11 01:12:47 crc 
kubenswrapper[4743]: I1011 01:12:47.427826 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/129c74fb-0970-4b08-958c-b0d079c8e65a-dns-svc\") pod \"129c74fb-0970-4b08-958c-b0d079c8e65a\" (UID: \"129c74fb-0970-4b08-958c-b0d079c8e65a\") " Oct 11 01:12:47 crc kubenswrapper[4743]: I1011 01:12:47.427992 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xw7jl\" (UniqueName: \"kubernetes.io/projected/129c74fb-0970-4b08-958c-b0d079c8e65a-kube-api-access-xw7jl\") pod \"129c74fb-0970-4b08-958c-b0d079c8e65a\" (UID: \"129c74fb-0970-4b08-958c-b0d079c8e65a\") " Oct 11 01:12:47 crc kubenswrapper[4743]: I1011 01:12:47.430022 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/129c74fb-0970-4b08-958c-b0d079c8e65a-ovsdbserver-nb\") pod \"129c74fb-0970-4b08-958c-b0d079c8e65a\" (UID: \"129c74fb-0970-4b08-958c-b0d079c8e65a\") " Oct 11 01:12:47 crc kubenswrapper[4743]: I1011 01:12:47.432873 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/129c74fb-0970-4b08-958c-b0d079c8e65a-kube-api-access-xw7jl" (OuterVolumeSpecName: "kube-api-access-xw7jl") pod "129c74fb-0970-4b08-958c-b0d079c8e65a" (UID: "129c74fb-0970-4b08-958c-b0d079c8e65a"). InnerVolumeSpecName "kube-api-access-xw7jl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:12:47 crc kubenswrapper[4743]: I1011 01:12:47.494123 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/129c74fb-0970-4b08-958c-b0d079c8e65a-config" (OuterVolumeSpecName: "config") pod "129c74fb-0970-4b08-958c-b0d079c8e65a" (UID: "129c74fb-0970-4b08-958c-b0d079c8e65a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:12:47 crc kubenswrapper[4743]: I1011 01:12:47.496753 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/129c74fb-0970-4b08-958c-b0d079c8e65a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "129c74fb-0970-4b08-958c-b0d079c8e65a" (UID: "129c74fb-0970-4b08-958c-b0d079c8e65a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:12:47 crc kubenswrapper[4743]: I1011 01:12:47.500393 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/129c74fb-0970-4b08-958c-b0d079c8e65a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "129c74fb-0970-4b08-958c-b0d079c8e65a" (UID: "129c74fb-0970-4b08-958c-b0d079c8e65a"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:12:47 crc kubenswrapper[4743]: I1011 01:12:47.518330 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/129c74fb-0970-4b08-958c-b0d079c8e65a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "129c74fb-0970-4b08-958c-b0d079c8e65a" (UID: "129c74fb-0970-4b08-958c-b0d079c8e65a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:12:47 crc kubenswrapper[4743]: I1011 01:12:47.531536 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/129c74fb-0970-4b08-958c-b0d079c8e65a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "129c74fb-0970-4b08-958c-b0d079c8e65a" (UID: "129c74fb-0970-4b08-958c-b0d079c8e65a"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:12:47 crc kubenswrapper[4743]: I1011 01:12:47.532260 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/129c74fb-0970-4b08-958c-b0d079c8e65a-ovsdbserver-nb\") pod \"129c74fb-0970-4b08-958c-b0d079c8e65a\" (UID: \"129c74fb-0970-4b08-958c-b0d079c8e65a\") " Oct 11 01:12:47 crc kubenswrapper[4743]: W1011 01:12:47.532419 4743 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/129c74fb-0970-4b08-958c-b0d079c8e65a/volumes/kubernetes.io~configmap/ovsdbserver-nb Oct 11 01:12:47 crc kubenswrapper[4743]: I1011 01:12:47.532438 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/129c74fb-0970-4b08-958c-b0d079c8e65a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "129c74fb-0970-4b08-958c-b0d079c8e65a" (UID: "129c74fb-0970-4b08-958c-b0d079c8e65a"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:12:47 crc kubenswrapper[4743]: I1011 01:12:47.532983 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/129c74fb-0970-4b08-958c-b0d079c8e65a-config\") on node \"crc\" DevicePath \"\"" Oct 11 01:12:47 crc kubenswrapper[4743]: I1011 01:12:47.533103 4743 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/129c74fb-0970-4b08-958c-b0d079c8e65a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 11 01:12:47 crc kubenswrapper[4743]: I1011 01:12:47.533208 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/129c74fb-0970-4b08-958c-b0d079c8e65a-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 11 01:12:47 crc kubenswrapper[4743]: I1011 01:12:47.533921 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xw7jl\" (UniqueName: \"kubernetes.io/projected/129c74fb-0970-4b08-958c-b0d079c8e65a-kube-api-access-xw7jl\") on node \"crc\" DevicePath \"\"" Oct 11 01:12:47 crc kubenswrapper[4743]: I1011 01:12:47.534339 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/129c74fb-0970-4b08-958c-b0d079c8e65a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 11 01:12:47 crc kubenswrapper[4743]: I1011 01:12:47.534480 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/129c74fb-0970-4b08-958c-b0d079c8e65a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 11 01:12:47 crc kubenswrapper[4743]: I1011 01:12:47.844076 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-tjssf" event={"ID":"129c74fb-0970-4b08-958c-b0d079c8e65a","Type":"ContainerDied","Data":"5bb155e2ccc540894ac2c0375423fe6bd2e6f8727606f6298d4a794e7bbf590b"} Oct 11 01:12:47 crc 
kubenswrapper[4743]: I1011 01:12:47.844409 4743 scope.go:117] "RemoveContainer" containerID="4e12013ab5986bd51d18a28768e1d34e44ee9f24d9e4e67c430e65d484e05711" Oct 11 01:12:47 crc kubenswrapper[4743]: I1011 01:12:47.844569 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-tjssf" Oct 11 01:12:47 crc kubenswrapper[4743]: I1011 01:12:47.874082 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"47a67cb1-322e-4da4-b240-1a89ff62fa51","Type":"ContainerStarted","Data":"8c3adb3c7faffca0559338d0856a94e83c99a1f7bf5b63c7710dcc1ba3d96069"} Oct 11 01:12:47 crc kubenswrapper[4743]: I1011 01:12:47.876080 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-bg4dw" event={"ID":"ef875954-7f31-4d4d-acec-56789e002001","Type":"ContainerStarted","Data":"589f592d9fe67ced8fc084adbbc61f2bfb0afcf430a0e99c36dfaf9fd836c9c4"} Oct 11 01:12:47 crc kubenswrapper[4743]: I1011 01:12:47.968131 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-tjssf"] Oct 11 01:12:47 crc kubenswrapper[4743]: I1011 01:12:47.988548 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-tjssf"] Oct 11 01:12:48 crc kubenswrapper[4743]: I1011 01:12:48.118207 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="129c74fb-0970-4b08-958c-b0d079c8e65a" path="/var/lib/kubelet/pods/129c74fb-0970-4b08-958c-b0d079c8e65a/volumes" Oct 11 01:12:48 crc kubenswrapper[4743]: I1011 01:12:48.259657 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 11 01:12:48 crc kubenswrapper[4743]: I1011 01:12:48.898789 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"47a67cb1-322e-4da4-b240-1a89ff62fa51","Type":"ContainerStarted","Data":"6846e6af5444ce8c462a5a23f7cd099fd16fb4ebfc5877d55b867bda3337d2e6"} 
Oct 11 01:12:48 crc kubenswrapper[4743]: I1011 01:12:48.901878 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-785cc87c98-slsn7" event={"ID":"7f793c4b-6627-4c4b-9f2c-529641700221","Type":"ContainerStarted","Data":"370e4d0b1347402a53017d20c718caa5fec8c1075ddf3c2f3234a45a9d423ba6"} Oct 11 01:12:48 crc kubenswrapper[4743]: I1011 01:12:48.901917 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-785cc87c98-slsn7" event={"ID":"7f793c4b-6627-4c4b-9f2c-529641700221","Type":"ContainerStarted","Data":"ccf9b4880ab376dbc618692966608edaed11f6fd69880c7cd2143f2f6d7701be"} Oct 11 01:12:48 crc kubenswrapper[4743]: I1011 01:12:48.909732 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"61c1298d-9f47-46f2-8ed9-53e958ba4893","Type":"ContainerStarted","Data":"7bfa0d65a772ebd14e9550586daa0254deebcb737bc0a66450295d3b19c744da"} Oct 11 01:12:48 crc kubenswrapper[4743]: I1011 01:12:48.921329 4743 generic.go:334] "Generic (PLEG): container finished" podID="ef875954-7f31-4d4d-acec-56789e002001" containerID="e26153410b18f47bcd88287e1c66b5fd354ff6a52689cc62138a3101449d1e81" exitCode=0 Oct 11 01:12:48 crc kubenswrapper[4743]: I1011 01:12:48.922230 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-bg4dw" event={"ID":"ef875954-7f31-4d4d-acec-56789e002001","Type":"ContainerDied","Data":"e26153410b18f47bcd88287e1c66b5fd354ff6a52689cc62138a3101449d1e81"} Oct 11 01:12:48 crc kubenswrapper[4743]: I1011 01:12:48.931962 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-785cc87c98-slsn7" podStartSLOduration=2.9015728320000003 podStartE2EDuration="5.931947509s" podCreationTimestamp="2025-10-11 01:12:43 +0000 UTC" firstStartedPulling="2025-10-11 01:12:44.823657555 +0000 UTC m=+1259.476637952" lastFinishedPulling="2025-10-11 01:12:47.854032232 +0000 UTC 
m=+1262.507012629" observedRunningTime="2025-10-11 01:12:48.917379313 +0000 UTC m=+1263.570359710" watchObservedRunningTime="2025-10-11 01:12:48.931947509 +0000 UTC m=+1263.584927906" Oct 11 01:12:48 crc kubenswrapper[4743]: I1011 01:12:48.932947 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5f98bddd87-6kv6r" event={"ID":"b8ff2f6e-e247-4ff9-9c07-8ba2bf4d5b1d","Type":"ContainerStarted","Data":"c941f1e4ac60e80b8cd2905ad639f1ef2fdd89af8699174178aac9881de370e8"} Oct 11 01:12:48 crc kubenswrapper[4743]: I1011 01:12:48.933006 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5f98bddd87-6kv6r" event={"ID":"b8ff2f6e-e247-4ff9-9c07-8ba2bf4d5b1d","Type":"ContainerStarted","Data":"4e9d98294dac84f37b15208a55372339d86b7ecbf69119637325105404e6e47f"} Oct 11 01:12:48 crc kubenswrapper[4743]: I1011 01:12:48.960196 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-5f98bddd87-6kv6r" podStartSLOduration=3.041593447 podStartE2EDuration="5.960180309s" podCreationTimestamp="2025-10-11 01:12:43 +0000 UTC" firstStartedPulling="2025-10-11 01:12:44.872210783 +0000 UTC m=+1259.525191180" lastFinishedPulling="2025-10-11 01:12:47.790797645 +0000 UTC m=+1262.443778042" observedRunningTime="2025-10-11 01:12:48.958269505 +0000 UTC m=+1263.611249902" watchObservedRunningTime="2025-10-11 01:12:48.960180309 +0000 UTC m=+1263.613160706" Oct 11 01:12:49 crc kubenswrapper[4743]: I1011 01:12:49.944768 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"61c1298d-9f47-46f2-8ed9-53e958ba4893","Type":"ContainerStarted","Data":"f62e9a61b06eea87d05cedd2b6ede6b032a9756be08b3ad7dd48d161f963249d"} Oct 11 01:12:49 crc kubenswrapper[4743]: I1011 01:12:49.945307 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"61c1298d-9f47-46f2-8ed9-53e958ba4893","Type":"ContainerStarted","Data":"3597209a3d060ce1ea56604372cd198ebe067d778d7e848118067956b1e93c29"} Oct 11 01:12:49 crc kubenswrapper[4743]: I1011 01:12:49.948138 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"47a67cb1-322e-4da4-b240-1a89ff62fa51","Type":"ContainerStarted","Data":"cc15c18c0b3aab5a7851e156d795c905240d201714233ac67857ccd85cc21b78"} Oct 11 01:12:49 crc kubenswrapper[4743]: I1011 01:12:49.952315 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-bg4dw" event={"ID":"ef875954-7f31-4d4d-acec-56789e002001","Type":"ContainerStarted","Data":"efc5feb2db287d79f71abf737483c9a699a58fff5eb0a5bbe557a237d343efd7"} Oct 11 01:12:49 crc kubenswrapper[4743]: I1011 01:12:49.952348 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6578955fd5-bg4dw" Oct 11 01:12:49 crc kubenswrapper[4743]: I1011 01:12:49.994883 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.050365715 podStartE2EDuration="4.994845189s" podCreationTimestamp="2025-10-11 01:12:45 +0000 UTC" firstStartedPulling="2025-10-11 01:12:47.239263282 +0000 UTC m=+1261.892243679" lastFinishedPulling="2025-10-11 01:12:48.183742756 +0000 UTC m=+1262.836723153" observedRunningTime="2025-10-11 01:12:49.986246411 +0000 UTC m=+1264.639226808" watchObservedRunningTime="2025-10-11 01:12:49.994845189 +0000 UTC m=+1264.647825596" Oct 11 01:12:49 crc kubenswrapper[4743]: I1011 01:12:49.997079 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.99707125 podStartE2EDuration="3.99707125s" podCreationTimestamp="2025-10-11 01:12:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 
01:12:49.971590374 +0000 UTC m=+1264.624570771" watchObservedRunningTime="2025-10-11 01:12:49.99707125 +0000 UTC m=+1264.650051657" Oct 11 01:12:50 crc kubenswrapper[4743]: I1011 01:12:50.006087 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6578955fd5-bg4dw" podStartSLOduration=5.006064978 podStartE2EDuration="5.006064978s" podCreationTimestamp="2025-10-11 01:12:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 01:12:49.999951167 +0000 UTC m=+1264.652931564" watchObservedRunningTime="2025-10-11 01:12:50.006064978 +0000 UTC m=+1264.659045375" Oct 11 01:12:50 crc kubenswrapper[4743]: I1011 01:12:50.959463 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 11 01:12:51 crc kubenswrapper[4743]: I1011 01:12:51.161602 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 11 01:12:51 crc kubenswrapper[4743]: I1011 01:12:51.233845 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 11 01:12:51 crc kubenswrapper[4743]: I1011 01:12:51.624309 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-67c9948594-q58d2"] Oct 11 01:12:51 crc kubenswrapper[4743]: E1011 01:12:51.624722 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="129c74fb-0970-4b08-958c-b0d079c8e65a" containerName="init" Oct 11 01:12:51 crc kubenswrapper[4743]: I1011 01:12:51.624738 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="129c74fb-0970-4b08-958c-b0d079c8e65a" containerName="init" Oct 11 01:12:51 crc kubenswrapper[4743]: I1011 01:12:51.624916 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="129c74fb-0970-4b08-958c-b0d079c8e65a" containerName="init" Oct 11 01:12:51 crc kubenswrapper[4743]: I1011 01:12:51.625965 4743 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-67c9948594-q58d2" Oct 11 01:12:51 crc kubenswrapper[4743]: I1011 01:12:51.627550 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Oct 11 01:12:51 crc kubenswrapper[4743]: I1011 01:12:51.627809 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Oct 11 01:12:51 crc kubenswrapper[4743]: I1011 01:12:51.634703 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-67c9948594-q58d2"] Oct 11 01:12:51 crc kubenswrapper[4743]: I1011 01:12:51.727415 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9cd00887-e25b-4548-8084-5efab4f9cb27-config-data-custom\") pod \"barbican-api-67c9948594-q58d2\" (UID: \"9cd00887-e25b-4548-8084-5efab4f9cb27\") " pod="openstack/barbican-api-67c9948594-q58d2" Oct 11 01:12:51 crc kubenswrapper[4743]: I1011 01:12:51.727463 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cd00887-e25b-4548-8084-5efab4f9cb27-combined-ca-bundle\") pod \"barbican-api-67c9948594-q58d2\" (UID: \"9cd00887-e25b-4548-8084-5efab4f9cb27\") " pod="openstack/barbican-api-67c9948594-q58d2" Oct 11 01:12:51 crc kubenswrapper[4743]: I1011 01:12:51.727496 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cd00887-e25b-4548-8084-5efab4f9cb27-public-tls-certs\") pod \"barbican-api-67c9948594-q58d2\" (UID: \"9cd00887-e25b-4548-8084-5efab4f9cb27\") " pod="openstack/barbican-api-67c9948594-q58d2" Oct 11 01:12:51 crc kubenswrapper[4743]: I1011 01:12:51.727531 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9cd00887-e25b-4548-8084-5efab4f9cb27-logs\") pod \"barbican-api-67c9948594-q58d2\" (UID: \"9cd00887-e25b-4548-8084-5efab4f9cb27\") " pod="openstack/barbican-api-67c9948594-q58d2" Oct 11 01:12:51 crc kubenswrapper[4743]: I1011 01:12:51.727557 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cd00887-e25b-4548-8084-5efab4f9cb27-config-data\") pod \"barbican-api-67c9948594-q58d2\" (UID: \"9cd00887-e25b-4548-8084-5efab4f9cb27\") " pod="openstack/barbican-api-67c9948594-q58d2" Oct 11 01:12:51 crc kubenswrapper[4743]: I1011 01:12:51.727623 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cd00887-e25b-4548-8084-5efab4f9cb27-internal-tls-certs\") pod \"barbican-api-67c9948594-q58d2\" (UID: \"9cd00887-e25b-4548-8084-5efab4f9cb27\") " pod="openstack/barbican-api-67c9948594-q58d2" Oct 11 01:12:51 crc kubenswrapper[4743]: I1011 01:12:51.727688 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dw6fs\" (UniqueName: \"kubernetes.io/projected/9cd00887-e25b-4548-8084-5efab4f9cb27-kube-api-access-dw6fs\") pod \"barbican-api-67c9948594-q58d2\" (UID: \"9cd00887-e25b-4548-8084-5efab4f9cb27\") " pod="openstack/barbican-api-67c9948594-q58d2" Oct 11 01:12:51 crc kubenswrapper[4743]: I1011 01:12:51.829388 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9cd00887-e25b-4548-8084-5efab4f9cb27-config-data-custom\") pod \"barbican-api-67c9948594-q58d2\" (UID: \"9cd00887-e25b-4548-8084-5efab4f9cb27\") " pod="openstack/barbican-api-67c9948594-q58d2" Oct 11 01:12:51 crc kubenswrapper[4743]: I1011 01:12:51.829464 4743 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cd00887-e25b-4548-8084-5efab4f9cb27-combined-ca-bundle\") pod \"barbican-api-67c9948594-q58d2\" (UID: \"9cd00887-e25b-4548-8084-5efab4f9cb27\") " pod="openstack/barbican-api-67c9948594-q58d2" Oct 11 01:12:51 crc kubenswrapper[4743]: I1011 01:12:51.829500 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cd00887-e25b-4548-8084-5efab4f9cb27-public-tls-certs\") pod \"barbican-api-67c9948594-q58d2\" (UID: \"9cd00887-e25b-4548-8084-5efab4f9cb27\") " pod="openstack/barbican-api-67c9948594-q58d2" Oct 11 01:12:51 crc kubenswrapper[4743]: I1011 01:12:51.829538 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9cd00887-e25b-4548-8084-5efab4f9cb27-logs\") pod \"barbican-api-67c9948594-q58d2\" (UID: \"9cd00887-e25b-4548-8084-5efab4f9cb27\") " pod="openstack/barbican-api-67c9948594-q58d2" Oct 11 01:12:51 crc kubenswrapper[4743]: I1011 01:12:51.829572 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cd00887-e25b-4548-8084-5efab4f9cb27-config-data\") pod \"barbican-api-67c9948594-q58d2\" (UID: \"9cd00887-e25b-4548-8084-5efab4f9cb27\") " pod="openstack/barbican-api-67c9948594-q58d2" Oct 11 01:12:51 crc kubenswrapper[4743]: I1011 01:12:51.829663 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cd00887-e25b-4548-8084-5efab4f9cb27-internal-tls-certs\") pod \"barbican-api-67c9948594-q58d2\" (UID: \"9cd00887-e25b-4548-8084-5efab4f9cb27\") " pod="openstack/barbican-api-67c9948594-q58d2" Oct 11 01:12:51 crc kubenswrapper[4743]: I1011 01:12:51.829715 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dw6fs\" (UniqueName: 
\"kubernetes.io/projected/9cd00887-e25b-4548-8084-5efab4f9cb27-kube-api-access-dw6fs\") pod \"barbican-api-67c9948594-q58d2\" (UID: \"9cd00887-e25b-4548-8084-5efab4f9cb27\") " pod="openstack/barbican-api-67c9948594-q58d2" Oct 11 01:12:51 crc kubenswrapper[4743]: I1011 01:12:51.833175 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9cd00887-e25b-4548-8084-5efab4f9cb27-logs\") pod \"barbican-api-67c9948594-q58d2\" (UID: \"9cd00887-e25b-4548-8084-5efab4f9cb27\") " pod="openstack/barbican-api-67c9948594-q58d2" Oct 11 01:12:51 crc kubenswrapper[4743]: I1011 01:12:51.839043 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cd00887-e25b-4548-8084-5efab4f9cb27-public-tls-certs\") pod \"barbican-api-67c9948594-q58d2\" (UID: \"9cd00887-e25b-4548-8084-5efab4f9cb27\") " pod="openstack/barbican-api-67c9948594-q58d2" Oct 11 01:12:51 crc kubenswrapper[4743]: I1011 01:12:51.840990 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cd00887-e25b-4548-8084-5efab4f9cb27-internal-tls-certs\") pod \"barbican-api-67c9948594-q58d2\" (UID: \"9cd00887-e25b-4548-8084-5efab4f9cb27\") " pod="openstack/barbican-api-67c9948594-q58d2" Oct 11 01:12:51 crc kubenswrapper[4743]: I1011 01:12:51.841795 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9cd00887-e25b-4548-8084-5efab4f9cb27-config-data-custom\") pod \"barbican-api-67c9948594-q58d2\" (UID: \"9cd00887-e25b-4548-8084-5efab4f9cb27\") " pod="openstack/barbican-api-67c9948594-q58d2" Oct 11 01:12:51 crc kubenswrapper[4743]: I1011 01:12:51.843518 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cd00887-e25b-4548-8084-5efab4f9cb27-config-data\") pod 
\"barbican-api-67c9948594-q58d2\" (UID: \"9cd00887-e25b-4548-8084-5efab4f9cb27\") " pod="openstack/barbican-api-67c9948594-q58d2" Oct 11 01:12:51 crc kubenswrapper[4743]: I1011 01:12:51.847477 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cd00887-e25b-4548-8084-5efab4f9cb27-combined-ca-bundle\") pod \"barbican-api-67c9948594-q58d2\" (UID: \"9cd00887-e25b-4548-8084-5efab4f9cb27\") " pod="openstack/barbican-api-67c9948594-q58d2" Oct 11 01:12:51 crc kubenswrapper[4743]: I1011 01:12:51.853794 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dw6fs\" (UniqueName: \"kubernetes.io/projected/9cd00887-e25b-4548-8084-5efab4f9cb27-kube-api-access-dw6fs\") pod \"barbican-api-67c9948594-q58d2\" (UID: \"9cd00887-e25b-4548-8084-5efab4f9cb27\") " pod="openstack/barbican-api-67c9948594-q58d2" Oct 11 01:12:51 crc kubenswrapper[4743]: I1011 01:12:51.951441 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-67c9948594-q58d2" Oct 11 01:12:52 crc kubenswrapper[4743]: I1011 01:12:52.460961 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-67c9948594-q58d2"] Oct 11 01:12:52 crc kubenswrapper[4743]: W1011 01:12:52.469335 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9cd00887_e25b_4548_8084_5efab4f9cb27.slice/crio-5ab752eb5ac887e2a3811ee72be798610a1f04093b7da39bc3e06f59f8bbc5f5 WatchSource:0}: Error finding container 5ab752eb5ac887e2a3811ee72be798610a1f04093b7da39bc3e06f59f8bbc5f5: Status 404 returned error can't find the container with id 5ab752eb5ac887e2a3811ee72be798610a1f04093b7da39bc3e06f59f8bbc5f5 Oct 11 01:12:52 crc kubenswrapper[4743]: I1011 01:12:52.975902 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-5bd8997d9d-dpxn8" Oct 11 01:12:52 crc kubenswrapper[4743]: I1011 01:12:52.981208 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-67c9948594-q58d2" event={"ID":"9cd00887-e25b-4548-8084-5efab4f9cb27","Type":"ContainerStarted","Data":"610277826f8a318459a2e39422ba6579c282bf7745a5a2a28cd87eacc3523b8a"} Oct 11 01:12:52 crc kubenswrapper[4743]: I1011 01:12:52.981255 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-67c9948594-q58d2" event={"ID":"9cd00887-e25b-4548-8084-5efab4f9cb27","Type":"ContainerStarted","Data":"175f6d07bacbf3172a4fcba5100a62fb11e06644d6da756a8f64cc4232bed9c4"} Oct 11 01:12:52 crc kubenswrapper[4743]: I1011 01:12:52.981273 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-67c9948594-q58d2" event={"ID":"9cd00887-e25b-4548-8084-5efab4f9cb27","Type":"ContainerStarted","Data":"5ab752eb5ac887e2a3811ee72be798610a1f04093b7da39bc3e06f59f8bbc5f5"} Oct 11 01:12:52 crc kubenswrapper[4743]: I1011 01:12:52.981319 4743 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="61c1298d-9f47-46f2-8ed9-53e958ba4893" containerName="cinder-api-log" containerID="cri-o://3597209a3d060ce1ea56604372cd198ebe067d778d7e848118067956b1e93c29" gracePeriod=30 Oct 11 01:12:52 crc kubenswrapper[4743]: I1011 01:12:52.981372 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="61c1298d-9f47-46f2-8ed9-53e958ba4893" containerName="cinder-api" containerID="cri-o://f62e9a61b06eea87d05cedd2b6ede6b032a9756be08b3ad7dd48d161f963249d" gracePeriod=30 Oct 11 01:12:52 crc kubenswrapper[4743]: I1011 01:12:52.981633 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-67c9948594-q58d2" Oct 11 01:12:52 crc kubenswrapper[4743]: I1011 01:12:52.981659 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-67c9948594-q58d2" Oct 11 01:12:53 crc kubenswrapper[4743]: I1011 01:12:53.071684 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-67c9948594-q58d2" podStartSLOduration=2.071663456 podStartE2EDuration="2.071663456s" podCreationTimestamp="2025-10-11 01:12:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 01:12:53.06530793 +0000 UTC m=+1267.718288327" watchObservedRunningTime="2025-10-11 01:12:53.071663456 +0000 UTC m=+1267.724643853" Oct 11 01:12:53 crc kubenswrapper[4743]: I1011 01:12:53.165478 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7dfd9c7974-zppf2" Oct 11 01:12:53 crc kubenswrapper[4743]: I1011 01:12:53.688391 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 11 01:12:53 crc kubenswrapper[4743]: I1011 01:12:53.777566 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61c1298d-9f47-46f2-8ed9-53e958ba4893-logs\") pod \"61c1298d-9f47-46f2-8ed9-53e958ba4893\" (UID: \"61c1298d-9f47-46f2-8ed9-53e958ba4893\") " Oct 11 01:12:53 crc kubenswrapper[4743]: I1011 01:12:53.777651 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/61c1298d-9f47-46f2-8ed9-53e958ba4893-etc-machine-id\") pod \"61c1298d-9f47-46f2-8ed9-53e958ba4893\" (UID: \"61c1298d-9f47-46f2-8ed9-53e958ba4893\") " Oct 11 01:12:53 crc kubenswrapper[4743]: I1011 01:12:53.777779 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrbgl\" (UniqueName: \"kubernetes.io/projected/61c1298d-9f47-46f2-8ed9-53e958ba4893-kube-api-access-xrbgl\") pod \"61c1298d-9f47-46f2-8ed9-53e958ba4893\" (UID: \"61c1298d-9f47-46f2-8ed9-53e958ba4893\") " Oct 11 01:12:53 crc kubenswrapper[4743]: I1011 01:12:53.777825 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61c1298d-9f47-46f2-8ed9-53e958ba4893-combined-ca-bundle\") pod \"61c1298d-9f47-46f2-8ed9-53e958ba4893\" (UID: \"61c1298d-9f47-46f2-8ed9-53e958ba4893\") " Oct 11 01:12:53 crc kubenswrapper[4743]: I1011 01:12:53.777846 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/61c1298d-9f47-46f2-8ed9-53e958ba4893-config-data-custom\") pod \"61c1298d-9f47-46f2-8ed9-53e958ba4893\" (UID: \"61c1298d-9f47-46f2-8ed9-53e958ba4893\") " Oct 11 01:12:53 crc kubenswrapper[4743]: I1011 01:12:53.777942 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/61c1298d-9f47-46f2-8ed9-53e958ba4893-config-data\") pod \"61c1298d-9f47-46f2-8ed9-53e958ba4893\" (UID: \"61c1298d-9f47-46f2-8ed9-53e958ba4893\") " Oct 11 01:12:53 crc kubenswrapper[4743]: I1011 01:12:53.777989 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61c1298d-9f47-46f2-8ed9-53e958ba4893-scripts\") pod \"61c1298d-9f47-46f2-8ed9-53e958ba4893\" (UID: \"61c1298d-9f47-46f2-8ed9-53e958ba4893\") " Oct 11 01:12:53 crc kubenswrapper[4743]: I1011 01:12:53.778115 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61c1298d-9f47-46f2-8ed9-53e958ba4893-logs" (OuterVolumeSpecName: "logs") pod "61c1298d-9f47-46f2-8ed9-53e958ba4893" (UID: "61c1298d-9f47-46f2-8ed9-53e958ba4893"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:12:53 crc kubenswrapper[4743]: I1011 01:12:53.778695 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/61c1298d-9f47-46f2-8ed9-53e958ba4893-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "61c1298d-9f47-46f2-8ed9-53e958ba4893" (UID: "61c1298d-9f47-46f2-8ed9-53e958ba4893"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 01:12:53 crc kubenswrapper[4743]: I1011 01:12:53.779037 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61c1298d-9f47-46f2-8ed9-53e958ba4893-logs\") on node \"crc\" DevicePath \"\"" Oct 11 01:12:53 crc kubenswrapper[4743]: I1011 01:12:53.779055 4743 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/61c1298d-9f47-46f2-8ed9-53e958ba4893-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 11 01:12:53 crc kubenswrapper[4743]: I1011 01:12:53.783589 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61c1298d-9f47-46f2-8ed9-53e958ba4893-scripts" (OuterVolumeSpecName: "scripts") pod "61c1298d-9f47-46f2-8ed9-53e958ba4893" (UID: "61c1298d-9f47-46f2-8ed9-53e958ba4893"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:12:53 crc kubenswrapper[4743]: I1011 01:12:53.784074 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61c1298d-9f47-46f2-8ed9-53e958ba4893-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "61c1298d-9f47-46f2-8ed9-53e958ba4893" (UID: "61c1298d-9f47-46f2-8ed9-53e958ba4893"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:12:53 crc kubenswrapper[4743]: I1011 01:12:53.784149 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61c1298d-9f47-46f2-8ed9-53e958ba4893-kube-api-access-xrbgl" (OuterVolumeSpecName: "kube-api-access-xrbgl") pod "61c1298d-9f47-46f2-8ed9-53e958ba4893" (UID: "61c1298d-9f47-46f2-8ed9-53e958ba4893"). InnerVolumeSpecName "kube-api-access-xrbgl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:12:53 crc kubenswrapper[4743]: I1011 01:12:53.810240 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61c1298d-9f47-46f2-8ed9-53e958ba4893-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "61c1298d-9f47-46f2-8ed9-53e958ba4893" (UID: "61c1298d-9f47-46f2-8ed9-53e958ba4893"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:12:53 crc kubenswrapper[4743]: I1011 01:12:53.833314 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61c1298d-9f47-46f2-8ed9-53e958ba4893-config-data" (OuterVolumeSpecName: "config-data") pod "61c1298d-9f47-46f2-8ed9-53e958ba4893" (UID: "61c1298d-9f47-46f2-8ed9-53e958ba4893"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:12:53 crc kubenswrapper[4743]: I1011 01:12:53.880518 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xrbgl\" (UniqueName: \"kubernetes.io/projected/61c1298d-9f47-46f2-8ed9-53e958ba4893-kube-api-access-xrbgl\") on node \"crc\" DevicePath \"\"" Oct 11 01:12:53 crc kubenswrapper[4743]: I1011 01:12:53.880567 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61c1298d-9f47-46f2-8ed9-53e958ba4893-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 01:12:53 crc kubenswrapper[4743]: I1011 01:12:53.880578 4743 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/61c1298d-9f47-46f2-8ed9-53e958ba4893-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 11 01:12:53 crc kubenswrapper[4743]: I1011 01:12:53.880587 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61c1298d-9f47-46f2-8ed9-53e958ba4893-config-data\") on node 
\"crc\" DevicePath \"\"" Oct 11 01:12:53 crc kubenswrapper[4743]: I1011 01:12:53.880597 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61c1298d-9f47-46f2-8ed9-53e958ba4893-scripts\") on node \"crc\" DevicePath \"\"" Oct 11 01:12:53 crc kubenswrapper[4743]: I1011 01:12:53.994986 4743 generic.go:334] "Generic (PLEG): container finished" podID="61c1298d-9f47-46f2-8ed9-53e958ba4893" containerID="f62e9a61b06eea87d05cedd2b6ede6b032a9756be08b3ad7dd48d161f963249d" exitCode=0 Oct 11 01:12:53 crc kubenswrapper[4743]: I1011 01:12:53.995024 4743 generic.go:334] "Generic (PLEG): container finished" podID="61c1298d-9f47-46f2-8ed9-53e958ba4893" containerID="3597209a3d060ce1ea56604372cd198ebe067d778d7e848118067956b1e93c29" exitCode=143 Oct 11 01:12:53 crc kubenswrapper[4743]: I1011 01:12:53.995033 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 11 01:12:53 crc kubenswrapper[4743]: I1011 01:12:53.995092 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"61c1298d-9f47-46f2-8ed9-53e958ba4893","Type":"ContainerDied","Data":"f62e9a61b06eea87d05cedd2b6ede6b032a9756be08b3ad7dd48d161f963249d"} Oct 11 01:12:53 crc kubenswrapper[4743]: I1011 01:12:53.995124 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"61c1298d-9f47-46f2-8ed9-53e958ba4893","Type":"ContainerDied","Data":"3597209a3d060ce1ea56604372cd198ebe067d778d7e848118067956b1e93c29"} Oct 11 01:12:53 crc kubenswrapper[4743]: I1011 01:12:53.995168 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"61c1298d-9f47-46f2-8ed9-53e958ba4893","Type":"ContainerDied","Data":"7bfa0d65a772ebd14e9550586daa0254deebcb737bc0a66450295d3b19c744da"} Oct 11 01:12:53 crc kubenswrapper[4743]: I1011 01:12:53.995191 4743 scope.go:117] "RemoveContainer" 
containerID="f62e9a61b06eea87d05cedd2b6ede6b032a9756be08b3ad7dd48d161f963249d" Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.035120 4743 scope.go:117] "RemoveContainer" containerID="3597209a3d060ce1ea56604372cd198ebe067d778d7e848118067956b1e93c29" Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.055796 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.070141 4743 scope.go:117] "RemoveContainer" containerID="f62e9a61b06eea87d05cedd2b6ede6b032a9756be08b3ad7dd48d161f963249d" Oct 11 01:12:54 crc kubenswrapper[4743]: E1011 01:12:54.071206 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f62e9a61b06eea87d05cedd2b6ede6b032a9756be08b3ad7dd48d161f963249d\": container with ID starting with f62e9a61b06eea87d05cedd2b6ede6b032a9756be08b3ad7dd48d161f963249d not found: ID does not exist" containerID="f62e9a61b06eea87d05cedd2b6ede6b032a9756be08b3ad7dd48d161f963249d" Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.071258 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f62e9a61b06eea87d05cedd2b6ede6b032a9756be08b3ad7dd48d161f963249d"} err="failed to get container status \"f62e9a61b06eea87d05cedd2b6ede6b032a9756be08b3ad7dd48d161f963249d\": rpc error: code = NotFound desc = could not find container \"f62e9a61b06eea87d05cedd2b6ede6b032a9756be08b3ad7dd48d161f963249d\": container with ID starting with f62e9a61b06eea87d05cedd2b6ede6b032a9756be08b3ad7dd48d161f963249d not found: ID does not exist" Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.071286 4743 scope.go:117] "RemoveContainer" containerID="3597209a3d060ce1ea56604372cd198ebe067d778d7e848118067956b1e93c29" Oct 11 01:12:54 crc kubenswrapper[4743]: E1011 01:12:54.071950 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not 
find container \"3597209a3d060ce1ea56604372cd198ebe067d778d7e848118067956b1e93c29\": container with ID starting with 3597209a3d060ce1ea56604372cd198ebe067d778d7e848118067956b1e93c29 not found: ID does not exist" containerID="3597209a3d060ce1ea56604372cd198ebe067d778d7e848118067956b1e93c29" Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.071983 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3597209a3d060ce1ea56604372cd198ebe067d778d7e848118067956b1e93c29"} err="failed to get container status \"3597209a3d060ce1ea56604372cd198ebe067d778d7e848118067956b1e93c29\": rpc error: code = NotFound desc = could not find container \"3597209a3d060ce1ea56604372cd198ebe067d778d7e848118067956b1e93c29\": container with ID starting with 3597209a3d060ce1ea56604372cd198ebe067d778d7e848118067956b1e93c29 not found: ID does not exist" Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.072002 4743 scope.go:117] "RemoveContainer" containerID="f62e9a61b06eea87d05cedd2b6ede6b032a9756be08b3ad7dd48d161f963249d" Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.072381 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f62e9a61b06eea87d05cedd2b6ede6b032a9756be08b3ad7dd48d161f963249d"} err="failed to get container status \"f62e9a61b06eea87d05cedd2b6ede6b032a9756be08b3ad7dd48d161f963249d\": rpc error: code = NotFound desc = could not find container \"f62e9a61b06eea87d05cedd2b6ede6b032a9756be08b3ad7dd48d161f963249d\": container with ID starting with f62e9a61b06eea87d05cedd2b6ede6b032a9756be08b3ad7dd48d161f963249d not found: ID does not exist" Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.072406 4743 scope.go:117] "RemoveContainer" containerID="3597209a3d060ce1ea56604372cd198ebe067d778d7e848118067956b1e93c29" Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.073336 4743 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3597209a3d060ce1ea56604372cd198ebe067d778d7e848118067956b1e93c29"} err="failed to get container status \"3597209a3d060ce1ea56604372cd198ebe067d778d7e848118067956b1e93c29\": rpc error: code = NotFound desc = could not find container \"3597209a3d060ce1ea56604372cd198ebe067d778d7e848118067956b1e93c29\": container with ID starting with 3597209a3d060ce1ea56604372cd198ebe067d778d7e848118067956b1e93c29 not found: ID does not exist" Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.082103 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.113174 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61c1298d-9f47-46f2-8ed9-53e958ba4893" path="/var/lib/kubelet/pods/61c1298d-9f47-46f2-8ed9-53e958ba4893/volumes" Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.114028 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 11 01:12:54 crc kubenswrapper[4743]: E1011 01:12:54.114417 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61c1298d-9f47-46f2-8ed9-53e958ba4893" containerName="cinder-api" Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.114436 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="61c1298d-9f47-46f2-8ed9-53e958ba4893" containerName="cinder-api" Oct 11 01:12:54 crc kubenswrapper[4743]: E1011 01:12:54.114461 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61c1298d-9f47-46f2-8ed9-53e958ba4893" containerName="cinder-api-log" Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.114470 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="61c1298d-9f47-46f2-8ed9-53e958ba4893" containerName="cinder-api-log" Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.114709 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="61c1298d-9f47-46f2-8ed9-53e958ba4893" containerName="cinder-api" Oct 11 
01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.114744 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="61c1298d-9f47-46f2-8ed9-53e958ba4893" containerName="cinder-api-log" Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.116519 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.119784 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.120104 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.120276 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.136058 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.187630 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f904889-28d2-4cfd-86ee-2e5841f9fc04-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9f904889-28d2-4cfd-86ee-2e5841f9fc04\") " pod="openstack/cinder-api-0" Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.187672 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9f904889-28d2-4cfd-86ee-2e5841f9fc04-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9f904889-28d2-4cfd-86ee-2e5841f9fc04\") " pod="openstack/cinder-api-0" Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.187755 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/9f904889-28d2-4cfd-86ee-2e5841f9fc04-logs\") pod \"cinder-api-0\" (UID: \"9f904889-28d2-4cfd-86ee-2e5841f9fc04\") " pod="openstack/cinder-api-0" Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.187796 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f904889-28d2-4cfd-86ee-2e5841f9fc04-scripts\") pod \"cinder-api-0\" (UID: \"9f904889-28d2-4cfd-86ee-2e5841f9fc04\") " pod="openstack/cinder-api-0" Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.187813 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9f904889-28d2-4cfd-86ee-2e5841f9fc04-config-data-custom\") pod \"cinder-api-0\" (UID: \"9f904889-28d2-4cfd-86ee-2e5841f9fc04\") " pod="openstack/cinder-api-0" Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.187841 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f904889-28d2-4cfd-86ee-2e5841f9fc04-public-tls-certs\") pod \"cinder-api-0\" (UID: \"9f904889-28d2-4cfd-86ee-2e5841f9fc04\") " pod="openstack/cinder-api-0" Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.187879 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkwvk\" (UniqueName: \"kubernetes.io/projected/9f904889-28d2-4cfd-86ee-2e5841f9fc04-kube-api-access-pkwvk\") pod \"cinder-api-0\" (UID: \"9f904889-28d2-4cfd-86ee-2e5841f9fc04\") " pod="openstack/cinder-api-0" Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.187951 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f904889-28d2-4cfd-86ee-2e5841f9fc04-internal-tls-certs\") pod \"cinder-api-0\" (UID: 
\"9f904889-28d2-4cfd-86ee-2e5841f9fc04\") " pod="openstack/cinder-api-0" Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.189937 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f904889-28d2-4cfd-86ee-2e5841f9fc04-config-data\") pod \"cinder-api-0\" (UID: \"9f904889-28d2-4cfd-86ee-2e5841f9fc04\") " pod="openstack/cinder-api-0" Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.212567 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.214013 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.219352 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.219380 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.219436 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-9qk2g" Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.223168 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.292145 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f904889-28d2-4cfd-86ee-2e5841f9fc04-scripts\") pod \"cinder-api-0\" (UID: \"9f904889-28d2-4cfd-86ee-2e5841f9fc04\") " pod="openstack/cinder-api-0" Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.292225 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/9f904889-28d2-4cfd-86ee-2e5841f9fc04-config-data-custom\") pod \"cinder-api-0\" (UID: \"9f904889-28d2-4cfd-86ee-2e5841f9fc04\") " pod="openstack/cinder-api-0" Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.292268 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f904889-28d2-4cfd-86ee-2e5841f9fc04-public-tls-certs\") pod \"cinder-api-0\" (UID: \"9f904889-28d2-4cfd-86ee-2e5841f9fc04\") " pod="openstack/cinder-api-0" Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.292326 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkwvk\" (UniqueName: \"kubernetes.io/projected/9f904889-28d2-4cfd-86ee-2e5841f9fc04-kube-api-access-pkwvk\") pod \"cinder-api-0\" (UID: \"9f904889-28d2-4cfd-86ee-2e5841f9fc04\") " pod="openstack/cinder-api-0" Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.292418 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d37388f-7a99-412d-80ca-5799c0b51dce-combined-ca-bundle\") pod \"openstackclient\" (UID: \"2d37388f-7a99-412d-80ca-5799c0b51dce\") " pod="openstack/openstackclient" Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.292509 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mndb\" (UniqueName: \"kubernetes.io/projected/2d37388f-7a99-412d-80ca-5799c0b51dce-kube-api-access-4mndb\") pod \"openstackclient\" (UID: \"2d37388f-7a99-412d-80ca-5799c0b51dce\") " pod="openstack/openstackclient" Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.292579 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f904889-28d2-4cfd-86ee-2e5841f9fc04-internal-tls-certs\") pod \"cinder-api-0\" (UID: 
\"9f904889-28d2-4cfd-86ee-2e5841f9fc04\") " pod="openstack/cinder-api-0" Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.292648 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f904889-28d2-4cfd-86ee-2e5841f9fc04-config-data\") pod \"cinder-api-0\" (UID: \"9f904889-28d2-4cfd-86ee-2e5841f9fc04\") " pod="openstack/cinder-api-0" Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.292725 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/2d37388f-7a99-412d-80ca-5799c0b51dce-openstack-config-secret\") pod \"openstackclient\" (UID: \"2d37388f-7a99-412d-80ca-5799c0b51dce\") " pod="openstack/openstackclient" Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.292830 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f904889-28d2-4cfd-86ee-2e5841f9fc04-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9f904889-28d2-4cfd-86ee-2e5841f9fc04\") " pod="openstack/cinder-api-0" Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.292933 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9f904889-28d2-4cfd-86ee-2e5841f9fc04-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9f904889-28d2-4cfd-86ee-2e5841f9fc04\") " pod="openstack/cinder-api-0" Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.293030 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/2d37388f-7a99-412d-80ca-5799c0b51dce-openstack-config\") pod \"openstackclient\" (UID: \"2d37388f-7a99-412d-80ca-5799c0b51dce\") " pod="openstack/openstackclient" Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.293104 4743 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f904889-28d2-4cfd-86ee-2e5841f9fc04-logs\") pod \"cinder-api-0\" (UID: \"9f904889-28d2-4cfd-86ee-2e5841f9fc04\") " pod="openstack/cinder-api-0" Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.294403 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f904889-28d2-4cfd-86ee-2e5841f9fc04-logs\") pod \"cinder-api-0\" (UID: \"9f904889-28d2-4cfd-86ee-2e5841f9fc04\") " pod="openstack/cinder-api-0" Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.294730 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9f904889-28d2-4cfd-86ee-2e5841f9fc04-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9f904889-28d2-4cfd-86ee-2e5841f9fc04\") " pod="openstack/cinder-api-0" Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.300472 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f904889-28d2-4cfd-86ee-2e5841f9fc04-scripts\") pod \"cinder-api-0\" (UID: \"9f904889-28d2-4cfd-86ee-2e5841f9fc04\") " pod="openstack/cinder-api-0" Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.311488 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f904889-28d2-4cfd-86ee-2e5841f9fc04-public-tls-certs\") pod \"cinder-api-0\" (UID: \"9f904889-28d2-4cfd-86ee-2e5841f9fc04\") " pod="openstack/cinder-api-0" Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.312156 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f904889-28d2-4cfd-86ee-2e5841f9fc04-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"9f904889-28d2-4cfd-86ee-2e5841f9fc04\") " pod="openstack/cinder-api-0" Oct 11 01:12:54 
crc kubenswrapper[4743]: I1011 01:12:54.313314 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f904889-28d2-4cfd-86ee-2e5841f9fc04-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9f904889-28d2-4cfd-86ee-2e5841f9fc04\") " pod="openstack/cinder-api-0" Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.313492 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f904889-28d2-4cfd-86ee-2e5841f9fc04-config-data\") pod \"cinder-api-0\" (UID: \"9f904889-28d2-4cfd-86ee-2e5841f9fc04\") " pod="openstack/cinder-api-0" Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.314034 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkwvk\" (UniqueName: \"kubernetes.io/projected/9f904889-28d2-4cfd-86ee-2e5841f9fc04-kube-api-access-pkwvk\") pod \"cinder-api-0\" (UID: \"9f904889-28d2-4cfd-86ee-2e5841f9fc04\") " pod="openstack/cinder-api-0" Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.315687 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9f904889-28d2-4cfd-86ee-2e5841f9fc04-config-data-custom\") pod \"cinder-api-0\" (UID: \"9f904889-28d2-4cfd-86ee-2e5841f9fc04\") " pod="openstack/cinder-api-0" Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.394559 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/2d37388f-7a99-412d-80ca-5799c0b51dce-openstack-config\") pod \"openstackclient\" (UID: \"2d37388f-7a99-412d-80ca-5799c0b51dce\") " pod="openstack/openstackclient" Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.394667 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2d37388f-7a99-412d-80ca-5799c0b51dce-combined-ca-bundle\") pod \"openstackclient\" (UID: \"2d37388f-7a99-412d-80ca-5799c0b51dce\") " pod="openstack/openstackclient" Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.394718 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mndb\" (UniqueName: \"kubernetes.io/projected/2d37388f-7a99-412d-80ca-5799c0b51dce-kube-api-access-4mndb\") pod \"openstackclient\" (UID: \"2d37388f-7a99-412d-80ca-5799c0b51dce\") " pod="openstack/openstackclient" Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.394777 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/2d37388f-7a99-412d-80ca-5799c0b51dce-openstack-config-secret\") pod \"openstackclient\" (UID: \"2d37388f-7a99-412d-80ca-5799c0b51dce\") " pod="openstack/openstackclient" Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.395380 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/2d37388f-7a99-412d-80ca-5799c0b51dce-openstack-config\") pod \"openstackclient\" (UID: \"2d37388f-7a99-412d-80ca-5799c0b51dce\") " pod="openstack/openstackclient" Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.399659 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/2d37388f-7a99-412d-80ca-5799c0b51dce-openstack-config-secret\") pod \"openstackclient\" (UID: \"2d37388f-7a99-412d-80ca-5799c0b51dce\") " pod="openstack/openstackclient" Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.399995 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d37388f-7a99-412d-80ca-5799c0b51dce-combined-ca-bundle\") pod \"openstackclient\" (UID: \"2d37388f-7a99-412d-80ca-5799c0b51dce\") " 
pod="openstack/openstackclient" Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.411143 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mndb\" (UniqueName: \"kubernetes.io/projected/2d37388f-7a99-412d-80ca-5799c0b51dce-kube-api-access-4mndb\") pod \"openstackclient\" (UID: \"2d37388f-7a99-412d-80ca-5799c0b51dce\") " pod="openstack/openstackclient" Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.437944 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.495929 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.499074 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.507267 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.545348 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.546535 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.597919 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 11 01:12:54 crc kubenswrapper[4743]: E1011 01:12:54.647706 4743 log.go:32] "RunPodSandbox from runtime service failed" err=< Oct 11 01:12:54 crc kubenswrapper[4743]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_2d37388f-7a99-412d-80ca-5799c0b51dce_0(111159acf727f33a19681ca53c67fc33636aa782ea6879c1e1bf9889d6d8072c): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"111159acf727f33a19681ca53c67fc33636aa782ea6879c1e1bf9889d6d8072c" Netns:"/var/run/netns/1b77f7a9-9715-439b-95d4-debb743d18e7" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=111159acf727f33a19681ca53c67fc33636aa782ea6879c1e1bf9889d6d8072c;K8S_POD_UID=2d37388f-7a99-412d-80ca-5799c0b51dce" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/2d37388f-7a99-412d-80ca-5799c0b51dce]: expected pod UID "2d37388f-7a99-412d-80ca-5799c0b51dce" but got "c02b1352-1ccf-4856-ad8e-328dab03135e" from Kube API Oct 11 01:12:54 crc kubenswrapper[4743]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Oct 11 01:12:54 crc kubenswrapper[4743]: > Oct 11 01:12:54 crc kubenswrapper[4743]: E1011 01:12:54.647761 4743 kuberuntime_sandbox.go:72] "Failed to 
create sandbox for pod" err=< Oct 11 01:12:54 crc kubenswrapper[4743]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_2d37388f-7a99-412d-80ca-5799c0b51dce_0(111159acf727f33a19681ca53c67fc33636aa782ea6879c1e1bf9889d6d8072c): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"111159acf727f33a19681ca53c67fc33636aa782ea6879c1e1bf9889d6d8072c" Netns:"/var/run/netns/1b77f7a9-9715-439b-95d4-debb743d18e7" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=111159acf727f33a19681ca53c67fc33636aa782ea6879c1e1bf9889d6d8072c;K8S_POD_UID=2d37388f-7a99-412d-80ca-5799c0b51dce" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/2d37388f-7a99-412d-80ca-5799c0b51dce]: expected pod UID "2d37388f-7a99-412d-80ca-5799c0b51dce" but got "c02b1352-1ccf-4856-ad8e-328dab03135e" from Kube API Oct 11 01:12:54 crc kubenswrapper[4743]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Oct 11 01:12:54 crc kubenswrapper[4743]: > pod="openstack/openstackclient" Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.699544 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c02b1352-1ccf-4856-ad8e-328dab03135e-openstack-config-secret\") pod \"openstackclient\" (UID: \"c02b1352-1ccf-4856-ad8e-328dab03135e\") " 
pod="openstack/openstackclient" Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.699660 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c02b1352-1ccf-4856-ad8e-328dab03135e-openstack-config\") pod \"openstackclient\" (UID: \"c02b1352-1ccf-4856-ad8e-328dab03135e\") " pod="openstack/openstackclient" Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.699708 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lx79g\" (UniqueName: \"kubernetes.io/projected/c02b1352-1ccf-4856-ad8e-328dab03135e-kube-api-access-lx79g\") pod \"openstackclient\" (UID: \"c02b1352-1ccf-4856-ad8e-328dab03135e\") " pod="openstack/openstackclient" Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.699730 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c02b1352-1ccf-4856-ad8e-328dab03135e-combined-ca-bundle\") pod \"openstackclient\" (UID: \"c02b1352-1ccf-4856-ad8e-328dab03135e\") " pod="openstack/openstackclient" Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.800989 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c02b1352-1ccf-4856-ad8e-328dab03135e-openstack-config-secret\") pod \"openstackclient\" (UID: \"c02b1352-1ccf-4856-ad8e-328dab03135e\") " pod="openstack/openstackclient" Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.801154 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c02b1352-1ccf-4856-ad8e-328dab03135e-openstack-config\") pod \"openstackclient\" (UID: \"c02b1352-1ccf-4856-ad8e-328dab03135e\") " pod="openstack/openstackclient" Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 
01:12:54.801222 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lx79g\" (UniqueName: \"kubernetes.io/projected/c02b1352-1ccf-4856-ad8e-328dab03135e-kube-api-access-lx79g\") pod \"openstackclient\" (UID: \"c02b1352-1ccf-4856-ad8e-328dab03135e\") " pod="openstack/openstackclient" Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.801246 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c02b1352-1ccf-4856-ad8e-328dab03135e-combined-ca-bundle\") pod \"openstackclient\" (UID: \"c02b1352-1ccf-4856-ad8e-328dab03135e\") " pod="openstack/openstackclient" Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.802633 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c02b1352-1ccf-4856-ad8e-328dab03135e-openstack-config\") pod \"openstackclient\" (UID: \"c02b1352-1ccf-4856-ad8e-328dab03135e\") " pod="openstack/openstackclient" Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.809613 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c02b1352-1ccf-4856-ad8e-328dab03135e-openstack-config-secret\") pod \"openstackclient\" (UID: \"c02b1352-1ccf-4856-ad8e-328dab03135e\") " pod="openstack/openstackclient" Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.810455 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c02b1352-1ccf-4856-ad8e-328dab03135e-combined-ca-bundle\") pod \"openstackclient\" (UID: \"c02b1352-1ccf-4856-ad8e-328dab03135e\") " pod="openstack/openstackclient" Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.819431 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lx79g\" (UniqueName: 
\"kubernetes.io/projected/c02b1352-1ccf-4856-ad8e-328dab03135e-kube-api-access-lx79g\") pod \"openstackclient\" (UID: \"c02b1352-1ccf-4856-ad8e-328dab03135e\") " pod="openstack/openstackclient" Oct 11 01:12:54 crc kubenswrapper[4743]: I1011 01:12:54.864490 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 11 01:12:55 crc kubenswrapper[4743]: I1011 01:12:55.002625 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 11 01:12:55 crc kubenswrapper[4743]: W1011 01:12:55.010407 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f904889_28d2_4cfd_86ee_2e5841f9fc04.slice/crio-f9ae06e1fbcf3405ef60600902f6da3a03707825529de76312049ed3ed38e909 WatchSource:0}: Error finding container f9ae06e1fbcf3405ef60600902f6da3a03707825529de76312049ed3ed38e909: Status 404 returned error can't find the container with id f9ae06e1fbcf3405ef60600902f6da3a03707825529de76312049ed3ed38e909 Oct 11 01:12:55 crc kubenswrapper[4743]: I1011 01:12:55.035119 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 11 01:12:55 crc kubenswrapper[4743]: I1011 01:12:55.041252 4743 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="2d37388f-7a99-412d-80ca-5799c0b51dce" podUID="c02b1352-1ccf-4856-ad8e-328dab03135e" Oct 11 01:12:55 crc kubenswrapper[4743]: I1011 01:12:55.079330 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 11 01:12:55 crc kubenswrapper[4743]: I1011 01:12:55.207924 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4mndb\" (UniqueName: \"kubernetes.io/projected/2d37388f-7a99-412d-80ca-5799c0b51dce-kube-api-access-4mndb\") pod \"2d37388f-7a99-412d-80ca-5799c0b51dce\" (UID: \"2d37388f-7a99-412d-80ca-5799c0b51dce\") " Oct 11 01:12:55 crc kubenswrapper[4743]: I1011 01:12:55.208369 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d37388f-7a99-412d-80ca-5799c0b51dce-combined-ca-bundle\") pod \"2d37388f-7a99-412d-80ca-5799c0b51dce\" (UID: \"2d37388f-7a99-412d-80ca-5799c0b51dce\") " Oct 11 01:12:55 crc kubenswrapper[4743]: I1011 01:12:55.208577 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/2d37388f-7a99-412d-80ca-5799c0b51dce-openstack-config-secret\") pod \"2d37388f-7a99-412d-80ca-5799c0b51dce\" (UID: \"2d37388f-7a99-412d-80ca-5799c0b51dce\") " Oct 11 01:12:55 crc kubenswrapper[4743]: I1011 01:12:55.208687 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/2d37388f-7a99-412d-80ca-5799c0b51dce-openstack-config\") pod \"2d37388f-7a99-412d-80ca-5799c0b51dce\" (UID: \"2d37388f-7a99-412d-80ca-5799c0b51dce\") " Oct 11 01:12:55 crc kubenswrapper[4743]: I1011 01:12:55.209664 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d37388f-7a99-412d-80ca-5799c0b51dce-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "2d37388f-7a99-412d-80ca-5799c0b51dce" (UID: "2d37388f-7a99-412d-80ca-5799c0b51dce"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:12:55 crc kubenswrapper[4743]: I1011 01:12:55.216041 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d37388f-7a99-412d-80ca-5799c0b51dce-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "2d37388f-7a99-412d-80ca-5799c0b51dce" (UID: "2d37388f-7a99-412d-80ca-5799c0b51dce"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:12:55 crc kubenswrapper[4743]: I1011 01:12:55.228083 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d37388f-7a99-412d-80ca-5799c0b51dce-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2d37388f-7a99-412d-80ca-5799c0b51dce" (UID: "2d37388f-7a99-412d-80ca-5799c0b51dce"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:12:55 crc kubenswrapper[4743]: I1011 01:12:55.228149 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d37388f-7a99-412d-80ca-5799c0b51dce-kube-api-access-4mndb" (OuterVolumeSpecName: "kube-api-access-4mndb") pod "2d37388f-7a99-412d-80ca-5799c0b51dce" (UID: "2d37388f-7a99-412d-80ca-5799c0b51dce"). InnerVolumeSpecName "kube-api-access-4mndb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:12:55 crc kubenswrapper[4743]: I1011 01:12:55.311657 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d37388f-7a99-412d-80ca-5799c0b51dce-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 01:12:55 crc kubenswrapper[4743]: I1011 01:12:55.311711 4743 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/2d37388f-7a99-412d-80ca-5799c0b51dce-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 11 01:12:55 crc kubenswrapper[4743]: I1011 01:12:55.311722 4743 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/2d37388f-7a99-412d-80ca-5799c0b51dce-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 11 01:12:55 crc kubenswrapper[4743]: I1011 01:12:55.311731 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4mndb\" (UniqueName: \"kubernetes.io/projected/2d37388f-7a99-412d-80ca-5799c0b51dce-kube-api-access-4mndb\") on node \"crc\" DevicePath \"\"" Oct 11 01:12:55 crc kubenswrapper[4743]: I1011 01:12:55.354207 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 11 01:12:55 crc kubenswrapper[4743]: I1011 01:12:55.485572 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5496cd5f5c-c9jx6" Oct 11 01:12:55 crc kubenswrapper[4743]: I1011 01:12:55.542875 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7dfd9c7974-zppf2"] Oct 11 01:12:55 crc kubenswrapper[4743]: I1011 01:12:55.543119 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7dfd9c7974-zppf2" podUID="54eafe9f-024f-4d60-917b-6e867458632d" containerName="neutron-api" containerID="cri-o://5f210a41596dfcc8d1509da34eba889691b3ccabdb5ae5b9a810960d3018b7f8" 
gracePeriod=30 Oct 11 01:12:55 crc kubenswrapper[4743]: I1011 01:12:55.543579 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7dfd9c7974-zppf2" podUID="54eafe9f-024f-4d60-917b-6e867458632d" containerName="neutron-httpd" containerID="cri-o://f71b4cf1f0d1af682e06225191bf46c6508528b97a6d00b4bea132078cf70eca" gracePeriod=30 Oct 11 01:12:56 crc kubenswrapper[4743]: I1011 01:12:56.054423 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9f904889-28d2-4cfd-86ee-2e5841f9fc04","Type":"ContainerStarted","Data":"74b45fdceb934947a4b24e566010255e45173c8c8950280fd04bfce7e116115a"} Oct 11 01:12:56 crc kubenswrapper[4743]: I1011 01:12:56.054715 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9f904889-28d2-4cfd-86ee-2e5841f9fc04","Type":"ContainerStarted","Data":"f9ae06e1fbcf3405ef60600902f6da3a03707825529de76312049ed3ed38e909"} Oct 11 01:12:56 crc kubenswrapper[4743]: I1011 01:12:56.059535 4743 generic.go:334] "Generic (PLEG): container finished" podID="54eafe9f-024f-4d60-917b-6e867458632d" containerID="f71b4cf1f0d1af682e06225191bf46c6508528b97a6d00b4bea132078cf70eca" exitCode=0 Oct 11 01:12:56 crc kubenswrapper[4743]: I1011 01:12:56.059616 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7dfd9c7974-zppf2" event={"ID":"54eafe9f-024f-4d60-917b-6e867458632d","Type":"ContainerDied","Data":"f71b4cf1f0d1af682e06225191bf46c6508528b97a6d00b4bea132078cf70eca"} Oct 11 01:12:56 crc kubenswrapper[4743]: I1011 01:12:56.063944 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 11 01:12:56 crc kubenswrapper[4743]: I1011 01:12:56.065944 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"c02b1352-1ccf-4856-ad8e-328dab03135e","Type":"ContainerStarted","Data":"42ec3a92626f42a1c88c0eff30f2e87f25f2591a31e48c5fa35e9a7691e61b0e"} Oct 11 01:12:56 crc kubenswrapper[4743]: I1011 01:12:56.068807 4743 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="2d37388f-7a99-412d-80ca-5799c0b51dce" podUID="c02b1352-1ccf-4856-ad8e-328dab03135e" Oct 11 01:12:56 crc kubenswrapper[4743]: I1011 01:12:56.111550 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d37388f-7a99-412d-80ca-5799c0b51dce" path="/var/lib/kubelet/pods/2d37388f-7a99-412d-80ca-5799c0b51dce/volumes" Oct 11 01:12:56 crc kubenswrapper[4743]: I1011 01:12:56.272828 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-8464ff7fb4-cflsg" Oct 11 01:12:56 crc kubenswrapper[4743]: I1011 01:12:56.286396 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-8464ff7fb4-cflsg" Oct 11 01:12:56 crc kubenswrapper[4743]: I1011 01:12:56.474601 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 11 01:12:56 crc kubenswrapper[4743]: I1011 01:12:56.489829 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6578955fd5-bg4dw" Oct 11 01:12:56 crc kubenswrapper[4743]: I1011 01:12:56.519218 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 11 01:12:56 crc kubenswrapper[4743]: I1011 01:12:56.565956 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-55n9s"] Oct 11 01:12:56 crc kubenswrapper[4743]: I1011 01:12:56.566235 4743 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6b7b667979-55n9s" podUID="ba69e70e-4af2-485e-9566-ec04e2a71a12" containerName="dnsmasq-dns" containerID="cri-o://df89795983d01f7c54609fcbf03ce7a51c26b7ec1aeae9f825eb3bf09a9318ac" gracePeriod=10 Oct 11 01:12:57 crc kubenswrapper[4743]: I1011 01:12:57.079367 4743 generic.go:334] "Generic (PLEG): container finished" podID="ba69e70e-4af2-485e-9566-ec04e2a71a12" containerID="df89795983d01f7c54609fcbf03ce7a51c26b7ec1aeae9f825eb3bf09a9318ac" exitCode=0 Oct 11 01:12:57 crc kubenswrapper[4743]: I1011 01:12:57.079742 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-55n9s" event={"ID":"ba69e70e-4af2-485e-9566-ec04e2a71a12","Type":"ContainerDied","Data":"df89795983d01f7c54609fcbf03ce7a51c26b7ec1aeae9f825eb3bf09a9318ac"} Oct 11 01:12:57 crc kubenswrapper[4743]: I1011 01:12:57.087601 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="47a67cb1-322e-4da4-b240-1a89ff62fa51" containerName="cinder-scheduler" containerID="cri-o://6846e6af5444ce8c462a5a23f7cd099fd16fb4ebfc5877d55b867bda3337d2e6" gracePeriod=30 Oct 11 01:12:57 crc kubenswrapper[4743]: I1011 01:12:57.088640 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9f904889-28d2-4cfd-86ee-2e5841f9fc04","Type":"ContainerStarted","Data":"ab4ce4a6d4b47727bae0bdfb5e1b88513ce7e3d37962e9084ea1f39b776aa1a4"} Oct 11 01:12:57 crc kubenswrapper[4743]: I1011 01:12:57.088686 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 11 01:12:57 crc kubenswrapper[4743]: I1011 01:12:57.089671 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="47a67cb1-322e-4da4-b240-1a89ff62fa51" containerName="probe" 
containerID="cri-o://cc15c18c0b3aab5a7851e156d795c905240d201714233ac67857ccd85cc21b78" gracePeriod=30 Oct 11 01:12:57 crc kubenswrapper[4743]: I1011 01:12:57.116655 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.116637651 podStartE2EDuration="3.116637651s" podCreationTimestamp="2025-10-11 01:12:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 01:12:57.110670333 +0000 UTC m=+1271.763650750" watchObservedRunningTime="2025-10-11 01:12:57.116637651 +0000 UTC m=+1271.769618038" Oct 11 01:12:57 crc kubenswrapper[4743]: I1011 01:12:57.208051 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-55n9s" Oct 11 01:12:57 crc kubenswrapper[4743]: I1011 01:12:57.376277 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ba69e70e-4af2-485e-9566-ec04e2a71a12-dns-swift-storage-0\") pod \"ba69e70e-4af2-485e-9566-ec04e2a71a12\" (UID: \"ba69e70e-4af2-485e-9566-ec04e2a71a12\") " Oct 11 01:12:57 crc kubenswrapper[4743]: I1011 01:12:57.376337 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba69e70e-4af2-485e-9566-ec04e2a71a12-dns-svc\") pod \"ba69e70e-4af2-485e-9566-ec04e2a71a12\" (UID: \"ba69e70e-4af2-485e-9566-ec04e2a71a12\") " Oct 11 01:12:57 crc kubenswrapper[4743]: I1011 01:12:57.376402 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ba69e70e-4af2-485e-9566-ec04e2a71a12-ovsdbserver-nb\") pod \"ba69e70e-4af2-485e-9566-ec04e2a71a12\" (UID: \"ba69e70e-4af2-485e-9566-ec04e2a71a12\") " Oct 11 01:12:57 crc kubenswrapper[4743]: I1011 01:12:57.376472 4743 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-mq9fj\" (UniqueName: \"kubernetes.io/projected/ba69e70e-4af2-485e-9566-ec04e2a71a12-kube-api-access-mq9fj\") pod \"ba69e70e-4af2-485e-9566-ec04e2a71a12\" (UID: \"ba69e70e-4af2-485e-9566-ec04e2a71a12\") " Oct 11 01:12:57 crc kubenswrapper[4743]: I1011 01:12:57.376521 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba69e70e-4af2-485e-9566-ec04e2a71a12-config\") pod \"ba69e70e-4af2-485e-9566-ec04e2a71a12\" (UID: \"ba69e70e-4af2-485e-9566-ec04e2a71a12\") " Oct 11 01:12:57 crc kubenswrapper[4743]: I1011 01:12:57.376556 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ba69e70e-4af2-485e-9566-ec04e2a71a12-ovsdbserver-sb\") pod \"ba69e70e-4af2-485e-9566-ec04e2a71a12\" (UID: \"ba69e70e-4af2-485e-9566-ec04e2a71a12\") " Oct 11 01:12:57 crc kubenswrapper[4743]: I1011 01:12:57.395051 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba69e70e-4af2-485e-9566-ec04e2a71a12-kube-api-access-mq9fj" (OuterVolumeSpecName: "kube-api-access-mq9fj") pod "ba69e70e-4af2-485e-9566-ec04e2a71a12" (UID: "ba69e70e-4af2-485e-9566-ec04e2a71a12"). InnerVolumeSpecName "kube-api-access-mq9fj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:12:57 crc kubenswrapper[4743]: I1011 01:12:57.444459 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba69e70e-4af2-485e-9566-ec04e2a71a12-config" (OuterVolumeSpecName: "config") pod "ba69e70e-4af2-485e-9566-ec04e2a71a12" (UID: "ba69e70e-4af2-485e-9566-ec04e2a71a12"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:12:57 crc kubenswrapper[4743]: I1011 01:12:57.454196 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba69e70e-4af2-485e-9566-ec04e2a71a12-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ba69e70e-4af2-485e-9566-ec04e2a71a12" (UID: "ba69e70e-4af2-485e-9566-ec04e2a71a12"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:12:57 crc kubenswrapper[4743]: I1011 01:12:57.460209 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba69e70e-4af2-485e-9566-ec04e2a71a12-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ba69e70e-4af2-485e-9566-ec04e2a71a12" (UID: "ba69e70e-4af2-485e-9566-ec04e2a71a12"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:12:57 crc kubenswrapper[4743]: I1011 01:12:57.460540 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba69e70e-4af2-485e-9566-ec04e2a71a12-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ba69e70e-4af2-485e-9566-ec04e2a71a12" (UID: "ba69e70e-4af2-485e-9566-ec04e2a71a12"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:12:57 crc kubenswrapper[4743]: I1011 01:12:57.478977 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ba69e70e-4af2-485e-9566-ec04e2a71a12-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 11 01:12:57 crc kubenswrapper[4743]: I1011 01:12:57.479005 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mq9fj\" (UniqueName: \"kubernetes.io/projected/ba69e70e-4af2-485e-9566-ec04e2a71a12-kube-api-access-mq9fj\") on node \"crc\" DevicePath \"\"" Oct 11 01:12:57 crc kubenswrapper[4743]: I1011 01:12:57.479016 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba69e70e-4af2-485e-9566-ec04e2a71a12-config\") on node \"crc\" DevicePath \"\"" Oct 11 01:12:57 crc kubenswrapper[4743]: I1011 01:12:57.479025 4743 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ba69e70e-4af2-485e-9566-ec04e2a71a12-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 11 01:12:57 crc kubenswrapper[4743]: I1011 01:12:57.479033 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba69e70e-4af2-485e-9566-ec04e2a71a12-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 11 01:12:57 crc kubenswrapper[4743]: I1011 01:12:57.498872 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba69e70e-4af2-485e-9566-ec04e2a71a12-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ba69e70e-4af2-485e-9566-ec04e2a71a12" (UID: "ba69e70e-4af2-485e-9566-ec04e2a71a12"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:12:57 crc kubenswrapper[4743]: I1011 01:12:57.580280 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ba69e70e-4af2-485e-9566-ec04e2a71a12-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 11 01:12:58 crc kubenswrapper[4743]: I1011 01:12:58.131105 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-55n9s" event={"ID":"ba69e70e-4af2-485e-9566-ec04e2a71a12","Type":"ContainerDied","Data":"d109268df20299a9476693016b217bb32fa8f5d8b15fb7771f32289003619271"} Oct 11 01:12:58 crc kubenswrapper[4743]: I1011 01:12:58.131388 4743 scope.go:117] "RemoveContainer" containerID="df89795983d01f7c54609fcbf03ce7a51c26b7ec1aeae9f825eb3bf09a9318ac" Oct 11 01:12:58 crc kubenswrapper[4743]: I1011 01:12:58.131164 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-55n9s" Oct 11 01:12:58 crc kubenswrapper[4743]: I1011 01:12:58.167177 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-55n9s"] Oct 11 01:12:58 crc kubenswrapper[4743]: I1011 01:12:58.175614 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-55n9s"] Oct 11 01:12:58 crc kubenswrapper[4743]: I1011 01:12:58.185015 4743 scope.go:117] "RemoveContainer" containerID="953514ea9d56bd3c8b6498d971973cd9187040f217c98c7d08f4c23765125a71" Oct 11 01:12:59 crc kubenswrapper[4743]: I1011 01:12:59.145632 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-744b8cd687-p7lgl"] Oct 11 01:12:59 crc kubenswrapper[4743]: I1011 01:12:59.146051 4743 generic.go:334] "Generic (PLEG): container finished" podID="47a67cb1-322e-4da4-b240-1a89ff62fa51" containerID="cc15c18c0b3aab5a7851e156d795c905240d201714233ac67857ccd85cc21b78" exitCode=0 Oct 11 01:12:59 crc kubenswrapper[4743]: E1011 01:12:59.146800 4743 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba69e70e-4af2-485e-9566-ec04e2a71a12" containerName="dnsmasq-dns" Oct 11 01:12:59 crc kubenswrapper[4743]: I1011 01:12:59.146819 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba69e70e-4af2-485e-9566-ec04e2a71a12" containerName="dnsmasq-dns" Oct 11 01:12:59 crc kubenswrapper[4743]: E1011 01:12:59.146829 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba69e70e-4af2-485e-9566-ec04e2a71a12" containerName="init" Oct 11 01:12:59 crc kubenswrapper[4743]: I1011 01:12:59.146835 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba69e70e-4af2-485e-9566-ec04e2a71a12" containerName="init" Oct 11 01:12:59 crc kubenswrapper[4743]: I1011 01:12:59.147071 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba69e70e-4af2-485e-9566-ec04e2a71a12" containerName="dnsmasq-dns" Oct 11 01:12:59 crc kubenswrapper[4743]: I1011 01:12:59.155337 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"47a67cb1-322e-4da4-b240-1a89ff62fa51","Type":"ContainerDied","Data":"cc15c18c0b3aab5a7851e156d795c905240d201714233ac67857ccd85cc21b78"} Oct 11 01:12:59 crc kubenswrapper[4743]: I1011 01:12:59.155446 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-744b8cd687-p7lgl" Oct 11 01:12:59 crc kubenswrapper[4743]: I1011 01:12:59.160309 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-744b8cd687-p7lgl"] Oct 11 01:12:59 crc kubenswrapper[4743]: I1011 01:12:59.167130 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Oct 11 01:12:59 crc kubenswrapper[4743]: I1011 01:12:59.167342 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Oct 11 01:12:59 crc kubenswrapper[4743]: I1011 01:12:59.168465 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 11 01:12:59 crc kubenswrapper[4743]: I1011 01:12:59.310953 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68219217-d875-4eb2-9611-b9afb0f64c45-log-httpd\") pod \"swift-proxy-744b8cd687-p7lgl\" (UID: \"68219217-d875-4eb2-9611-b9afb0f64c45\") " pod="openstack/swift-proxy-744b8cd687-p7lgl" Oct 11 01:12:59 crc kubenswrapper[4743]: I1011 01:12:59.311000 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/68219217-d875-4eb2-9611-b9afb0f64c45-public-tls-certs\") pod \"swift-proxy-744b8cd687-p7lgl\" (UID: \"68219217-d875-4eb2-9611-b9afb0f64c45\") " pod="openstack/swift-proxy-744b8cd687-p7lgl" Oct 11 01:12:59 crc kubenswrapper[4743]: I1011 01:12:59.311049 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68219217-d875-4eb2-9611-b9afb0f64c45-combined-ca-bundle\") pod \"swift-proxy-744b8cd687-p7lgl\" (UID: \"68219217-d875-4eb2-9611-b9afb0f64c45\") " pod="openstack/swift-proxy-744b8cd687-p7lgl" Oct 11 01:12:59 crc kubenswrapper[4743]: 
I1011 01:12:59.311093 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/68219217-d875-4eb2-9611-b9afb0f64c45-internal-tls-certs\") pod \"swift-proxy-744b8cd687-p7lgl\" (UID: \"68219217-d875-4eb2-9611-b9afb0f64c45\") " pod="openstack/swift-proxy-744b8cd687-p7lgl" Oct 11 01:12:59 crc kubenswrapper[4743]: I1011 01:12:59.311117 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzt4n\" (UniqueName: \"kubernetes.io/projected/68219217-d875-4eb2-9611-b9afb0f64c45-kube-api-access-jzt4n\") pod \"swift-proxy-744b8cd687-p7lgl\" (UID: \"68219217-d875-4eb2-9611-b9afb0f64c45\") " pod="openstack/swift-proxy-744b8cd687-p7lgl" Oct 11 01:12:59 crc kubenswrapper[4743]: I1011 01:12:59.311159 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68219217-d875-4eb2-9611-b9afb0f64c45-run-httpd\") pod \"swift-proxy-744b8cd687-p7lgl\" (UID: \"68219217-d875-4eb2-9611-b9afb0f64c45\") " pod="openstack/swift-proxy-744b8cd687-p7lgl" Oct 11 01:12:59 crc kubenswrapper[4743]: I1011 01:12:59.311181 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/68219217-d875-4eb2-9611-b9afb0f64c45-etc-swift\") pod \"swift-proxy-744b8cd687-p7lgl\" (UID: \"68219217-d875-4eb2-9611-b9afb0f64c45\") " pod="openstack/swift-proxy-744b8cd687-p7lgl" Oct 11 01:12:59 crc kubenswrapper[4743]: I1011 01:12:59.311196 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68219217-d875-4eb2-9611-b9afb0f64c45-config-data\") pod \"swift-proxy-744b8cd687-p7lgl\" (UID: \"68219217-d875-4eb2-9611-b9afb0f64c45\") " pod="openstack/swift-proxy-744b8cd687-p7lgl" Oct 11 
01:12:59 crc kubenswrapper[4743]: I1011 01:12:59.412525 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/68219217-d875-4eb2-9611-b9afb0f64c45-internal-tls-certs\") pod \"swift-proxy-744b8cd687-p7lgl\" (UID: \"68219217-d875-4eb2-9611-b9afb0f64c45\") " pod="openstack/swift-proxy-744b8cd687-p7lgl" Oct 11 01:12:59 crc kubenswrapper[4743]: I1011 01:12:59.412577 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzt4n\" (UniqueName: \"kubernetes.io/projected/68219217-d875-4eb2-9611-b9afb0f64c45-kube-api-access-jzt4n\") pod \"swift-proxy-744b8cd687-p7lgl\" (UID: \"68219217-d875-4eb2-9611-b9afb0f64c45\") " pod="openstack/swift-proxy-744b8cd687-p7lgl" Oct 11 01:12:59 crc kubenswrapper[4743]: I1011 01:12:59.412632 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68219217-d875-4eb2-9611-b9afb0f64c45-run-httpd\") pod \"swift-proxy-744b8cd687-p7lgl\" (UID: \"68219217-d875-4eb2-9611-b9afb0f64c45\") " pod="openstack/swift-proxy-744b8cd687-p7lgl" Oct 11 01:12:59 crc kubenswrapper[4743]: I1011 01:12:59.412659 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/68219217-d875-4eb2-9611-b9afb0f64c45-etc-swift\") pod \"swift-proxy-744b8cd687-p7lgl\" (UID: \"68219217-d875-4eb2-9611-b9afb0f64c45\") " pod="openstack/swift-proxy-744b8cd687-p7lgl" Oct 11 01:12:59 crc kubenswrapper[4743]: I1011 01:12:59.412676 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68219217-d875-4eb2-9611-b9afb0f64c45-config-data\") pod \"swift-proxy-744b8cd687-p7lgl\" (UID: \"68219217-d875-4eb2-9611-b9afb0f64c45\") " pod="openstack/swift-proxy-744b8cd687-p7lgl" Oct 11 01:12:59 crc kubenswrapper[4743]: I1011 01:12:59.412726 4743 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68219217-d875-4eb2-9611-b9afb0f64c45-log-httpd\") pod \"swift-proxy-744b8cd687-p7lgl\" (UID: \"68219217-d875-4eb2-9611-b9afb0f64c45\") " pod="openstack/swift-proxy-744b8cd687-p7lgl" Oct 11 01:12:59 crc kubenswrapper[4743]: I1011 01:12:59.412746 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/68219217-d875-4eb2-9611-b9afb0f64c45-public-tls-certs\") pod \"swift-proxy-744b8cd687-p7lgl\" (UID: \"68219217-d875-4eb2-9611-b9afb0f64c45\") " pod="openstack/swift-proxy-744b8cd687-p7lgl" Oct 11 01:12:59 crc kubenswrapper[4743]: I1011 01:12:59.412792 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68219217-d875-4eb2-9611-b9afb0f64c45-combined-ca-bundle\") pod \"swift-proxy-744b8cd687-p7lgl\" (UID: \"68219217-d875-4eb2-9611-b9afb0f64c45\") " pod="openstack/swift-proxy-744b8cd687-p7lgl" Oct 11 01:12:59 crc kubenswrapper[4743]: I1011 01:12:59.413294 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68219217-d875-4eb2-9611-b9afb0f64c45-run-httpd\") pod \"swift-proxy-744b8cd687-p7lgl\" (UID: \"68219217-d875-4eb2-9611-b9afb0f64c45\") " pod="openstack/swift-proxy-744b8cd687-p7lgl" Oct 11 01:12:59 crc kubenswrapper[4743]: I1011 01:12:59.413351 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68219217-d875-4eb2-9611-b9afb0f64c45-log-httpd\") pod \"swift-proxy-744b8cd687-p7lgl\" (UID: \"68219217-d875-4eb2-9611-b9afb0f64c45\") " pod="openstack/swift-proxy-744b8cd687-p7lgl" Oct 11 01:12:59 crc kubenswrapper[4743]: I1011 01:12:59.425534 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/68219217-d875-4eb2-9611-b9afb0f64c45-internal-tls-certs\") pod \"swift-proxy-744b8cd687-p7lgl\" (UID: \"68219217-d875-4eb2-9611-b9afb0f64c45\") " pod="openstack/swift-proxy-744b8cd687-p7lgl" Oct 11 01:12:59 crc kubenswrapper[4743]: I1011 01:12:59.425801 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68219217-d875-4eb2-9611-b9afb0f64c45-config-data\") pod \"swift-proxy-744b8cd687-p7lgl\" (UID: \"68219217-d875-4eb2-9611-b9afb0f64c45\") " pod="openstack/swift-proxy-744b8cd687-p7lgl" Oct 11 01:12:59 crc kubenswrapper[4743]: I1011 01:12:59.425907 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/68219217-d875-4eb2-9611-b9afb0f64c45-etc-swift\") pod \"swift-proxy-744b8cd687-p7lgl\" (UID: \"68219217-d875-4eb2-9611-b9afb0f64c45\") " pod="openstack/swift-proxy-744b8cd687-p7lgl" Oct 11 01:12:59 crc kubenswrapper[4743]: I1011 01:12:59.426252 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/68219217-d875-4eb2-9611-b9afb0f64c45-public-tls-certs\") pod \"swift-proxy-744b8cd687-p7lgl\" (UID: \"68219217-d875-4eb2-9611-b9afb0f64c45\") " pod="openstack/swift-proxy-744b8cd687-p7lgl" Oct 11 01:12:59 crc kubenswrapper[4743]: I1011 01:12:59.426567 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68219217-d875-4eb2-9611-b9afb0f64c45-combined-ca-bundle\") pod \"swift-proxy-744b8cd687-p7lgl\" (UID: \"68219217-d875-4eb2-9611-b9afb0f64c45\") " pod="openstack/swift-proxy-744b8cd687-p7lgl" Oct 11 01:12:59 crc kubenswrapper[4743]: I1011 01:12:59.429024 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzt4n\" (UniqueName: 
\"kubernetes.io/projected/68219217-d875-4eb2-9611-b9afb0f64c45-kube-api-access-jzt4n\") pod \"swift-proxy-744b8cd687-p7lgl\" (UID: \"68219217-d875-4eb2-9611-b9afb0f64c45\") " pod="openstack/swift-proxy-744b8cd687-p7lgl" Oct 11 01:12:59 crc kubenswrapper[4743]: I1011 01:12:59.478562 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-744b8cd687-p7lgl" Oct 11 01:13:00 crc kubenswrapper[4743]: I1011 01:13:00.040816 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-744b8cd687-p7lgl"] Oct 11 01:13:00 crc kubenswrapper[4743]: W1011 01:13:00.048514 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod68219217_d875_4eb2_9611_b9afb0f64c45.slice/crio-21c2b9a1d7da144cdf5233af0687550ea6ba35861497bddbe2136d7602026da5 WatchSource:0}: Error finding container 21c2b9a1d7da144cdf5233af0687550ea6ba35861497bddbe2136d7602026da5: Status 404 returned error can't find the container with id 21c2b9a1d7da144cdf5233af0687550ea6ba35861497bddbe2136d7602026da5 Oct 11 01:13:00 crc kubenswrapper[4743]: I1011 01:13:00.125414 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba69e70e-4af2-485e-9566-ec04e2a71a12" path="/var/lib/kubelet/pods/ba69e70e-4af2-485e-9566-ec04e2a71a12/volumes" Oct 11 01:13:00 crc kubenswrapper[4743]: I1011 01:13:00.138097 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 11 01:13:00 crc kubenswrapper[4743]: I1011 01:13:00.141258 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d7246d8a-9560-4212-8219-c7ac80cd7152" containerName="ceilometer-central-agent" containerID="cri-o://27fd22327259dc3b21f40a159ab8864185943820c6906261cc15648a64eafc5e" gracePeriod=30 Oct 11 01:13:00 crc kubenswrapper[4743]: I1011 01:13:00.141272 4743 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/ceilometer-0" podUID="d7246d8a-9560-4212-8219-c7ac80cd7152" containerName="proxy-httpd" containerID="cri-o://d4f73c76f87cb075388136a5c2308bb0f7c040ab5eea76e45c831be279d24d65" gracePeriod=30 Oct 11 01:13:00 crc kubenswrapper[4743]: I1011 01:13:00.141304 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d7246d8a-9560-4212-8219-c7ac80cd7152" containerName="sg-core" containerID="cri-o://0a94cc2959a3caeae8a1da71778a8aee6fad22ab3592c272efa29502d5ebb7ed" gracePeriod=30 Oct 11 01:13:00 crc kubenswrapper[4743]: I1011 01:13:00.141320 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d7246d8a-9560-4212-8219-c7ac80cd7152" containerName="ceilometer-notification-agent" containerID="cri-o://c5f9e37a17454f1ed04f86d7dae58744f264061e298c54740d65a49894bd0560" gracePeriod=30 Oct 11 01:13:00 crc kubenswrapper[4743]: I1011 01:13:00.147589 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 11 01:13:00 crc kubenswrapper[4743]: I1011 01:13:00.185688 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-744b8cd687-p7lgl" event={"ID":"68219217-d875-4eb2-9611-b9afb0f64c45","Type":"ContainerStarted","Data":"21c2b9a1d7da144cdf5233af0687550ea6ba35861497bddbe2136d7602026da5"} Oct 11 01:13:00 crc kubenswrapper[4743]: I1011 01:13:00.461784 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-h4nh5"] Oct 11 01:13:00 crc kubenswrapper[4743]: I1011 01:13:00.463248 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-h4nh5" Oct 11 01:13:00 crc kubenswrapper[4743]: I1011 01:13:00.488746 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-h4nh5"] Oct 11 01:13:00 crc kubenswrapper[4743]: I1011 01:13:00.535624 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ls7jv\" (UniqueName: \"kubernetes.io/projected/41e4f286-9bff-400a-9604-81e12333eb6c-kube-api-access-ls7jv\") pod \"nova-api-db-create-h4nh5\" (UID: \"41e4f286-9bff-400a-9604-81e12333eb6c\") " pod="openstack/nova-api-db-create-h4nh5" Oct 11 01:13:00 crc kubenswrapper[4743]: I1011 01:13:00.540212 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-dhxf8"] Oct 11 01:13:00 crc kubenswrapper[4743]: I1011 01:13:00.545982 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-dhxf8" Oct 11 01:13:00 crc kubenswrapper[4743]: I1011 01:13:00.563910 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-dhxf8"] Oct 11 01:13:00 crc kubenswrapper[4743]: I1011 01:13:00.637667 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7sv8v\" (UniqueName: \"kubernetes.io/projected/d4193e99-7b11-4285-86c4-7fe1689e4aaa-kube-api-access-7sv8v\") pod \"nova-cell0-db-create-dhxf8\" (UID: \"d4193e99-7b11-4285-86c4-7fe1689e4aaa\") " pod="openstack/nova-cell0-db-create-dhxf8" Oct 11 01:13:00 crc kubenswrapper[4743]: I1011 01:13:00.638143 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ls7jv\" (UniqueName: \"kubernetes.io/projected/41e4f286-9bff-400a-9604-81e12333eb6c-kube-api-access-ls7jv\") pod \"nova-api-db-create-h4nh5\" (UID: \"41e4f286-9bff-400a-9604-81e12333eb6c\") " pod="openstack/nova-api-db-create-h4nh5" Oct 11 01:13:00 crc kubenswrapper[4743]: 
I1011 01:13:00.666932 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ls7jv\" (UniqueName: \"kubernetes.io/projected/41e4f286-9bff-400a-9604-81e12333eb6c-kube-api-access-ls7jv\") pod \"nova-api-db-create-h4nh5\" (UID: \"41e4f286-9bff-400a-9604-81e12333eb6c\") " pod="openstack/nova-api-db-create-h4nh5" Oct 11 01:13:00 crc kubenswrapper[4743]: I1011 01:13:00.734244 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-q9wxz"] Oct 11 01:13:00 crc kubenswrapper[4743]: I1011 01:13:00.735730 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-q9wxz" Oct 11 01:13:00 crc kubenswrapper[4743]: I1011 01:13:00.739315 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7sv8v\" (UniqueName: \"kubernetes.io/projected/d4193e99-7b11-4285-86c4-7fe1689e4aaa-kube-api-access-7sv8v\") pod \"nova-cell0-db-create-dhxf8\" (UID: \"d4193e99-7b11-4285-86c4-7fe1689e4aaa\") " pod="openstack/nova-cell0-db-create-dhxf8" Oct 11 01:13:00 crc kubenswrapper[4743]: I1011 01:13:00.745221 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-q9wxz"] Oct 11 01:13:00 crc kubenswrapper[4743]: I1011 01:13:00.761256 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7sv8v\" (UniqueName: \"kubernetes.io/projected/d4193e99-7b11-4285-86c4-7fe1689e4aaa-kube-api-access-7sv8v\") pod \"nova-cell0-db-create-dhxf8\" (UID: \"d4193e99-7b11-4285-86c4-7fe1689e4aaa\") " pod="openstack/nova-cell0-db-create-dhxf8" Oct 11 01:13:00 crc kubenswrapper[4743]: I1011 01:13:00.791128 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 11 01:13:00 crc kubenswrapper[4743]: I1011 01:13:00.840344 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47a67cb1-322e-4da4-b240-1a89ff62fa51-combined-ca-bundle\") pod \"47a67cb1-322e-4da4-b240-1a89ff62fa51\" (UID: \"47a67cb1-322e-4da4-b240-1a89ff62fa51\") " Oct 11 01:13:00 crc kubenswrapper[4743]: I1011 01:13:00.840480 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47a67cb1-322e-4da4-b240-1a89ff62fa51-scripts\") pod \"47a67cb1-322e-4da4-b240-1a89ff62fa51\" (UID: \"47a67cb1-322e-4da4-b240-1a89ff62fa51\") " Oct 11 01:13:00 crc kubenswrapper[4743]: I1011 01:13:00.840505 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47a67cb1-322e-4da4-b240-1a89ff62fa51-config-data\") pod \"47a67cb1-322e-4da4-b240-1a89ff62fa51\" (UID: \"47a67cb1-322e-4da4-b240-1a89ff62fa51\") " Oct 11 01:13:00 crc kubenswrapper[4743]: I1011 01:13:00.840604 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/47a67cb1-322e-4da4-b240-1a89ff62fa51-etc-machine-id\") pod \"47a67cb1-322e-4da4-b240-1a89ff62fa51\" (UID: \"47a67cb1-322e-4da4-b240-1a89ff62fa51\") " Oct 11 01:13:00 crc kubenswrapper[4743]: I1011 01:13:00.841045 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/47a67cb1-322e-4da4-b240-1a89ff62fa51-config-data-custom\") pod \"47a67cb1-322e-4da4-b240-1a89ff62fa51\" (UID: \"47a67cb1-322e-4da4-b240-1a89ff62fa51\") " Oct 11 01:13:00 crc kubenswrapper[4743]: I1011 01:13:00.841070 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tqmks\" (UniqueName: 
\"kubernetes.io/projected/47a67cb1-322e-4da4-b240-1a89ff62fa51-kube-api-access-tqmks\") pod \"47a67cb1-322e-4da4-b240-1a89ff62fa51\" (UID: \"47a67cb1-322e-4da4-b240-1a89ff62fa51\") " Oct 11 01:13:00 crc kubenswrapper[4743]: I1011 01:13:00.841159 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/47a67cb1-322e-4da4-b240-1a89ff62fa51-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "47a67cb1-322e-4da4-b240-1a89ff62fa51" (UID: "47a67cb1-322e-4da4-b240-1a89ff62fa51"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 01:13:00 crc kubenswrapper[4743]: I1011 01:13:00.841337 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58b2j\" (UniqueName: \"kubernetes.io/projected/43d98f64-a3a8-4260-94e4-565c740912d9-kube-api-access-58b2j\") pod \"nova-cell1-db-create-q9wxz\" (UID: \"43d98f64-a3a8-4260-94e4-565c740912d9\") " pod="openstack/nova-cell1-db-create-q9wxz" Oct 11 01:13:00 crc kubenswrapper[4743]: I1011 01:13:00.841721 4743 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/47a67cb1-322e-4da4-b240-1a89ff62fa51-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 11 01:13:00 crc kubenswrapper[4743]: I1011 01:13:00.845694 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47a67cb1-322e-4da4-b240-1a89ff62fa51-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "47a67cb1-322e-4da4-b240-1a89ff62fa51" (UID: "47a67cb1-322e-4da4-b240-1a89ff62fa51"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:13:00 crc kubenswrapper[4743]: I1011 01:13:00.846264 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47a67cb1-322e-4da4-b240-1a89ff62fa51-kube-api-access-tqmks" (OuterVolumeSpecName: "kube-api-access-tqmks") pod "47a67cb1-322e-4da4-b240-1a89ff62fa51" (UID: "47a67cb1-322e-4da4-b240-1a89ff62fa51"). InnerVolumeSpecName "kube-api-access-tqmks". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:13:00 crc kubenswrapper[4743]: I1011 01:13:00.847722 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47a67cb1-322e-4da4-b240-1a89ff62fa51-scripts" (OuterVolumeSpecName: "scripts") pod "47a67cb1-322e-4da4-b240-1a89ff62fa51" (UID: "47a67cb1-322e-4da4-b240-1a89ff62fa51"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:13:00 crc kubenswrapper[4743]: I1011 01:13:00.859035 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-h4nh5" Oct 11 01:13:00 crc kubenswrapper[4743]: I1011 01:13:00.919845 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-dhxf8" Oct 11 01:13:00 crc kubenswrapper[4743]: I1011 01:13:00.935961 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47a67cb1-322e-4da4-b240-1a89ff62fa51-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "47a67cb1-322e-4da4-b240-1a89ff62fa51" (UID: "47a67cb1-322e-4da4-b240-1a89ff62fa51"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:13:00 crc kubenswrapper[4743]: I1011 01:13:00.943297 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58b2j\" (UniqueName: \"kubernetes.io/projected/43d98f64-a3a8-4260-94e4-565c740912d9-kube-api-access-58b2j\") pod \"nova-cell1-db-create-q9wxz\" (UID: \"43d98f64-a3a8-4260-94e4-565c740912d9\") " pod="openstack/nova-cell1-db-create-q9wxz" Oct 11 01:13:00 crc kubenswrapper[4743]: I1011 01:13:00.943485 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47a67cb1-322e-4da4-b240-1a89ff62fa51-scripts\") on node \"crc\" DevicePath \"\"" Oct 11 01:13:00 crc kubenswrapper[4743]: I1011 01:13:00.943500 4743 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/47a67cb1-322e-4da4-b240-1a89ff62fa51-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 11 01:13:00 crc kubenswrapper[4743]: I1011 01:13:00.943511 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tqmks\" (UniqueName: \"kubernetes.io/projected/47a67cb1-322e-4da4-b240-1a89ff62fa51-kube-api-access-tqmks\") on node \"crc\" DevicePath \"\"" Oct 11 01:13:00 crc kubenswrapper[4743]: I1011 01:13:00.943520 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47a67cb1-322e-4da4-b240-1a89ff62fa51-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 01:13:00 crc kubenswrapper[4743]: I1011 01:13:00.962547 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58b2j\" (UniqueName: \"kubernetes.io/projected/43d98f64-a3a8-4260-94e4-565c740912d9-kube-api-access-58b2j\") pod \"nova-cell1-db-create-q9wxz\" (UID: \"43d98f64-a3a8-4260-94e4-565c740912d9\") " pod="openstack/nova-cell1-db-create-q9wxz" Oct 11 01:13:00 crc kubenswrapper[4743]: I1011 01:13:00.971255 4743 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47a67cb1-322e-4da4-b240-1a89ff62fa51-config-data" (OuterVolumeSpecName: "config-data") pod "47a67cb1-322e-4da4-b240-1a89ff62fa51" (UID: "47a67cb1-322e-4da4-b240-1a89ff62fa51"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:13:01 crc kubenswrapper[4743]: I1011 01:13:01.048368 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47a67cb1-322e-4da4-b240-1a89ff62fa51-config-data\") on node \"crc\" DevicePath \"\"" Oct 11 01:13:01 crc kubenswrapper[4743]: I1011 01:13:01.126533 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-q9wxz" Oct 11 01:13:01 crc kubenswrapper[4743]: I1011 01:13:01.226472 4743 generic.go:334] "Generic (PLEG): container finished" podID="47a67cb1-322e-4da4-b240-1a89ff62fa51" containerID="6846e6af5444ce8c462a5a23f7cd099fd16fb4ebfc5877d55b867bda3337d2e6" exitCode=0 Oct 11 01:13:01 crc kubenswrapper[4743]: I1011 01:13:01.226550 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"47a67cb1-322e-4da4-b240-1a89ff62fa51","Type":"ContainerDied","Data":"6846e6af5444ce8c462a5a23f7cd099fd16fb4ebfc5877d55b867bda3337d2e6"} Oct 11 01:13:01 crc kubenswrapper[4743]: I1011 01:13:01.226576 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"47a67cb1-322e-4da4-b240-1a89ff62fa51","Type":"ContainerDied","Data":"8c3adb3c7faffca0559338d0856a94e83c99a1f7bf5b63c7710dcc1ba3d96069"} Oct 11 01:13:01 crc kubenswrapper[4743]: I1011 01:13:01.226595 4743 scope.go:117] "RemoveContainer" containerID="cc15c18c0b3aab5a7851e156d795c905240d201714233ac67857ccd85cc21b78" Oct 11 01:13:01 crc kubenswrapper[4743]: I1011 01:13:01.226710 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 11 01:13:01 crc kubenswrapper[4743]: I1011 01:13:01.249699 4743 generic.go:334] "Generic (PLEG): container finished" podID="d7246d8a-9560-4212-8219-c7ac80cd7152" containerID="d4f73c76f87cb075388136a5c2308bb0f7c040ab5eea76e45c831be279d24d65" exitCode=0 Oct 11 01:13:01 crc kubenswrapper[4743]: I1011 01:13:01.249961 4743 generic.go:334] "Generic (PLEG): container finished" podID="d7246d8a-9560-4212-8219-c7ac80cd7152" containerID="0a94cc2959a3caeae8a1da71778a8aee6fad22ab3592c272efa29502d5ebb7ed" exitCode=2 Oct 11 01:13:01 crc kubenswrapper[4743]: I1011 01:13:01.249971 4743 generic.go:334] "Generic (PLEG): container finished" podID="d7246d8a-9560-4212-8219-c7ac80cd7152" containerID="27fd22327259dc3b21f40a159ab8864185943820c6906261cc15648a64eafc5e" exitCode=0 Oct 11 01:13:01 crc kubenswrapper[4743]: I1011 01:13:01.250020 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d7246d8a-9560-4212-8219-c7ac80cd7152","Type":"ContainerDied","Data":"d4f73c76f87cb075388136a5c2308bb0f7c040ab5eea76e45c831be279d24d65"} Oct 11 01:13:01 crc kubenswrapper[4743]: I1011 01:13:01.250045 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d7246d8a-9560-4212-8219-c7ac80cd7152","Type":"ContainerDied","Data":"0a94cc2959a3caeae8a1da71778a8aee6fad22ab3592c272efa29502d5ebb7ed"} Oct 11 01:13:01 crc kubenswrapper[4743]: I1011 01:13:01.250055 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d7246d8a-9560-4212-8219-c7ac80cd7152","Type":"ContainerDied","Data":"27fd22327259dc3b21f40a159ab8864185943820c6906261cc15648a64eafc5e"} Oct 11 01:13:01 crc kubenswrapper[4743]: I1011 01:13:01.282645 4743 scope.go:117] "RemoveContainer" containerID="6846e6af5444ce8c462a5a23f7cd099fd16fb4ebfc5877d55b867bda3337d2e6" Oct 11 01:13:01 crc kubenswrapper[4743]: I1011 01:13:01.283792 4743 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/swift-proxy-744b8cd687-p7lgl" event={"ID":"68219217-d875-4eb2-9611-b9afb0f64c45","Type":"ContainerStarted","Data":"8f20d87864859123580ef06c7727b040d9aa278bfcd7d6fb10f3748b20df7b05"} Oct 11 01:13:01 crc kubenswrapper[4743]: I1011 01:13:01.283839 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-744b8cd687-p7lgl" event={"ID":"68219217-d875-4eb2-9611-b9afb0f64c45","Type":"ContainerStarted","Data":"6ff8733d70987b39c56bd93c82df7f6779e937141a7a27c65084d34c01b7a205"} Oct 11 01:13:01 crc kubenswrapper[4743]: I1011 01:13:01.284199 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-744b8cd687-p7lgl" Oct 11 01:13:01 crc kubenswrapper[4743]: I1011 01:13:01.284340 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-744b8cd687-p7lgl" Oct 11 01:13:01 crc kubenswrapper[4743]: I1011 01:13:01.316919 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 11 01:13:01 crc kubenswrapper[4743]: I1011 01:13:01.335901 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 11 01:13:01 crc kubenswrapper[4743]: I1011 01:13:01.355634 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 11 01:13:01 crc kubenswrapper[4743]: E1011 01:13:01.356105 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47a67cb1-322e-4da4-b240-1a89ff62fa51" containerName="probe" Oct 11 01:13:01 crc kubenswrapper[4743]: I1011 01:13:01.356119 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="47a67cb1-322e-4da4-b240-1a89ff62fa51" containerName="probe" Oct 11 01:13:01 crc kubenswrapper[4743]: E1011 01:13:01.356158 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47a67cb1-322e-4da4-b240-1a89ff62fa51" containerName="cinder-scheduler" Oct 11 01:13:01 crc kubenswrapper[4743]: I1011 01:13:01.356164 4743 
state_mem.go:107] "Deleted CPUSet assignment" podUID="47a67cb1-322e-4da4-b240-1a89ff62fa51" containerName="cinder-scheduler" Oct 11 01:13:01 crc kubenswrapper[4743]: I1011 01:13:01.356410 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="47a67cb1-322e-4da4-b240-1a89ff62fa51" containerName="cinder-scheduler" Oct 11 01:13:01 crc kubenswrapper[4743]: I1011 01:13:01.356422 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="47a67cb1-322e-4da4-b240-1a89ff62fa51" containerName="probe" Oct 11 01:13:01 crc kubenswrapper[4743]: I1011 01:13:01.357920 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 11 01:13:01 crc kubenswrapper[4743]: I1011 01:13:01.359141 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-744b8cd687-p7lgl" podStartSLOduration=2.359123576 podStartE2EDuration="2.359123576s" podCreationTimestamp="2025-10-11 01:12:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 01:13:01.327737273 +0000 UTC m=+1275.980717670" watchObservedRunningTime="2025-10-11 01:13:01.359123576 +0000 UTC m=+1276.012103973" Oct 11 01:13:01 crc kubenswrapper[4743]: I1011 01:13:01.367921 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 11 01:13:01 crc kubenswrapper[4743]: I1011 01:13:01.390809 4743 scope.go:117] "RemoveContainer" containerID="cc15c18c0b3aab5a7851e156d795c905240d201714233ac67857ccd85cc21b78" Oct 11 01:13:01 crc kubenswrapper[4743]: E1011 01:13:01.395499 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc15c18c0b3aab5a7851e156d795c905240d201714233ac67857ccd85cc21b78\": container with ID starting with cc15c18c0b3aab5a7851e156d795c905240d201714233ac67857ccd85cc21b78 not found: ID does not 
exist" containerID="cc15c18c0b3aab5a7851e156d795c905240d201714233ac67857ccd85cc21b78" Oct 11 01:13:01 crc kubenswrapper[4743]: I1011 01:13:01.395554 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc15c18c0b3aab5a7851e156d795c905240d201714233ac67857ccd85cc21b78"} err="failed to get container status \"cc15c18c0b3aab5a7851e156d795c905240d201714233ac67857ccd85cc21b78\": rpc error: code = NotFound desc = could not find container \"cc15c18c0b3aab5a7851e156d795c905240d201714233ac67857ccd85cc21b78\": container with ID starting with cc15c18c0b3aab5a7851e156d795c905240d201714233ac67857ccd85cc21b78 not found: ID does not exist" Oct 11 01:13:01 crc kubenswrapper[4743]: I1011 01:13:01.395580 4743 scope.go:117] "RemoveContainer" containerID="6846e6af5444ce8c462a5a23f7cd099fd16fb4ebfc5877d55b867bda3337d2e6" Oct 11 01:13:01 crc kubenswrapper[4743]: E1011 01:13:01.399145 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6846e6af5444ce8c462a5a23f7cd099fd16fb4ebfc5877d55b867bda3337d2e6\": container with ID starting with 6846e6af5444ce8c462a5a23f7cd099fd16fb4ebfc5877d55b867bda3337d2e6 not found: ID does not exist" containerID="6846e6af5444ce8c462a5a23f7cd099fd16fb4ebfc5877d55b867bda3337d2e6" Oct 11 01:13:01 crc kubenswrapper[4743]: I1011 01:13:01.399173 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6846e6af5444ce8c462a5a23f7cd099fd16fb4ebfc5877d55b867bda3337d2e6"} err="failed to get container status \"6846e6af5444ce8c462a5a23f7cd099fd16fb4ebfc5877d55b867bda3337d2e6\": rpc error: code = NotFound desc = could not find container \"6846e6af5444ce8c462a5a23f7cd099fd16fb4ebfc5877d55b867bda3337d2e6\": container with ID starting with 6846e6af5444ce8c462a5a23f7cd099fd16fb4ebfc5877d55b867bda3337d2e6 not found: ID does not exist" Oct 11 01:13:01 crc kubenswrapper[4743]: I1011 01:13:01.429949 4743 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 11 01:13:01 crc kubenswrapper[4743]: I1011 01:13:01.463230 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1570e831-5132-4e30-b791-6ac13faaeea4-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"1570e831-5132-4e30-b791-6ac13faaeea4\") " pod="openstack/cinder-scheduler-0" Oct 11 01:13:01 crc kubenswrapper[4743]: I1011 01:13:01.463329 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljkvx\" (UniqueName: \"kubernetes.io/projected/1570e831-5132-4e30-b791-6ac13faaeea4-kube-api-access-ljkvx\") pod \"cinder-scheduler-0\" (UID: \"1570e831-5132-4e30-b791-6ac13faaeea4\") " pod="openstack/cinder-scheduler-0" Oct 11 01:13:01 crc kubenswrapper[4743]: I1011 01:13:01.463357 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1570e831-5132-4e30-b791-6ac13faaeea4-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"1570e831-5132-4e30-b791-6ac13faaeea4\") " pod="openstack/cinder-scheduler-0" Oct 11 01:13:01 crc kubenswrapper[4743]: I1011 01:13:01.463386 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1570e831-5132-4e30-b791-6ac13faaeea4-scripts\") pod \"cinder-scheduler-0\" (UID: \"1570e831-5132-4e30-b791-6ac13faaeea4\") " pod="openstack/cinder-scheduler-0" Oct 11 01:13:01 crc kubenswrapper[4743]: I1011 01:13:01.463449 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1570e831-5132-4e30-b791-6ac13faaeea4-config-data\") pod \"cinder-scheduler-0\" (UID: \"1570e831-5132-4e30-b791-6ac13faaeea4\") " 
pod="openstack/cinder-scheduler-0" Oct 11 01:13:01 crc kubenswrapper[4743]: I1011 01:13:01.463506 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1570e831-5132-4e30-b791-6ac13faaeea4-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"1570e831-5132-4e30-b791-6ac13faaeea4\") " pod="openstack/cinder-scheduler-0" Oct 11 01:13:01 crc kubenswrapper[4743]: I1011 01:13:01.498565 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-h4nh5"] Oct 11 01:13:01 crc kubenswrapper[4743]: I1011 01:13:01.565482 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1570e831-5132-4e30-b791-6ac13faaeea4-scripts\") pod \"cinder-scheduler-0\" (UID: \"1570e831-5132-4e30-b791-6ac13faaeea4\") " pod="openstack/cinder-scheduler-0" Oct 11 01:13:01 crc kubenswrapper[4743]: I1011 01:13:01.565567 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1570e831-5132-4e30-b791-6ac13faaeea4-config-data\") pod \"cinder-scheduler-0\" (UID: \"1570e831-5132-4e30-b791-6ac13faaeea4\") " pod="openstack/cinder-scheduler-0" Oct 11 01:13:01 crc kubenswrapper[4743]: I1011 01:13:01.565620 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1570e831-5132-4e30-b791-6ac13faaeea4-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"1570e831-5132-4e30-b791-6ac13faaeea4\") " pod="openstack/cinder-scheduler-0" Oct 11 01:13:01 crc kubenswrapper[4743]: I1011 01:13:01.565669 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1570e831-5132-4e30-b791-6ac13faaeea4-config-data-custom\") pod \"cinder-scheduler-0\" (UID: 
\"1570e831-5132-4e30-b791-6ac13faaeea4\") " pod="openstack/cinder-scheduler-0" Oct 11 01:13:01 crc kubenswrapper[4743]: I1011 01:13:01.565716 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljkvx\" (UniqueName: \"kubernetes.io/projected/1570e831-5132-4e30-b791-6ac13faaeea4-kube-api-access-ljkvx\") pod \"cinder-scheduler-0\" (UID: \"1570e831-5132-4e30-b791-6ac13faaeea4\") " pod="openstack/cinder-scheduler-0" Oct 11 01:13:01 crc kubenswrapper[4743]: I1011 01:13:01.565733 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1570e831-5132-4e30-b791-6ac13faaeea4-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"1570e831-5132-4e30-b791-6ac13faaeea4\") " pod="openstack/cinder-scheduler-0" Oct 11 01:13:01 crc kubenswrapper[4743]: I1011 01:13:01.565800 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1570e831-5132-4e30-b791-6ac13faaeea4-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"1570e831-5132-4e30-b791-6ac13faaeea4\") " pod="openstack/cinder-scheduler-0" Oct 11 01:13:01 crc kubenswrapper[4743]: I1011 01:13:01.571827 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1570e831-5132-4e30-b791-6ac13faaeea4-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"1570e831-5132-4e30-b791-6ac13faaeea4\") " pod="openstack/cinder-scheduler-0" Oct 11 01:13:01 crc kubenswrapper[4743]: I1011 01:13:01.574164 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1570e831-5132-4e30-b791-6ac13faaeea4-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"1570e831-5132-4e30-b791-6ac13faaeea4\") " pod="openstack/cinder-scheduler-0" Oct 11 01:13:01 crc kubenswrapper[4743]: I1011 01:13:01.574482 4743 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1570e831-5132-4e30-b791-6ac13faaeea4-scripts\") pod \"cinder-scheduler-0\" (UID: \"1570e831-5132-4e30-b791-6ac13faaeea4\") " pod="openstack/cinder-scheduler-0" Oct 11 01:13:01 crc kubenswrapper[4743]: I1011 01:13:01.576042 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1570e831-5132-4e30-b791-6ac13faaeea4-config-data\") pod \"cinder-scheduler-0\" (UID: \"1570e831-5132-4e30-b791-6ac13faaeea4\") " pod="openstack/cinder-scheduler-0" Oct 11 01:13:01 crc kubenswrapper[4743]: I1011 01:13:01.594181 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljkvx\" (UniqueName: \"kubernetes.io/projected/1570e831-5132-4e30-b791-6ac13faaeea4-kube-api-access-ljkvx\") pod \"cinder-scheduler-0\" (UID: \"1570e831-5132-4e30-b791-6ac13faaeea4\") " pod="openstack/cinder-scheduler-0" Oct 11 01:13:01 crc kubenswrapper[4743]: I1011 01:13:01.638748 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-dhxf8"] Oct 11 01:13:01 crc kubenswrapper[4743]: I1011 01:13:01.745471 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-q9wxz"] Oct 11 01:13:01 crc kubenswrapper[4743]: W1011 01:13:01.764620 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43d98f64_a3a8_4260_94e4_565c740912d9.slice/crio-c2f4f16797dc5f0d908e99e41e4390cc719f9e351b69560211ea49c4f115ec4e WatchSource:0}: Error finding container c2f4f16797dc5f0d908e99e41e4390cc719f9e351b69560211ea49c4f115ec4e: Status 404 returned error can't find the container with id c2f4f16797dc5f0d908e99e41e4390cc719f9e351b69560211ea49c4f115ec4e Oct 11 01:13:01 crc kubenswrapper[4743]: I1011 01:13:01.865387 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 11 01:13:02 crc kubenswrapper[4743]: I1011 01:13:02.024798 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7dfd9c7974-zppf2" Oct 11 01:13:02 crc kubenswrapper[4743]: I1011 01:13:02.078122 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54eafe9f-024f-4d60-917b-6e867458632d-combined-ca-bundle\") pod \"54eafe9f-024f-4d60-917b-6e867458632d\" (UID: \"54eafe9f-024f-4d60-917b-6e867458632d\") " Oct 11 01:13:02 crc kubenswrapper[4743]: I1011 01:13:02.078572 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/54eafe9f-024f-4d60-917b-6e867458632d-ovndb-tls-certs\") pod \"54eafe9f-024f-4d60-917b-6e867458632d\" (UID: \"54eafe9f-024f-4d60-917b-6e867458632d\") " Oct 11 01:13:02 crc kubenswrapper[4743]: I1011 01:13:02.078673 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/54eafe9f-024f-4d60-917b-6e867458632d-config\") pod \"54eafe9f-024f-4d60-917b-6e867458632d\" (UID: \"54eafe9f-024f-4d60-917b-6e867458632d\") " Oct 11 01:13:02 crc kubenswrapper[4743]: I1011 01:13:02.078880 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qbvzf\" (UniqueName: \"kubernetes.io/projected/54eafe9f-024f-4d60-917b-6e867458632d-kube-api-access-qbvzf\") pod \"54eafe9f-024f-4d60-917b-6e867458632d\" (UID: \"54eafe9f-024f-4d60-917b-6e867458632d\") " Oct 11 01:13:02 crc kubenswrapper[4743]: I1011 01:13:02.078954 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/54eafe9f-024f-4d60-917b-6e867458632d-httpd-config\") pod \"54eafe9f-024f-4d60-917b-6e867458632d\" (UID: 
\"54eafe9f-024f-4d60-917b-6e867458632d\") " Oct 11 01:13:02 crc kubenswrapper[4743]: I1011 01:13:02.086955 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54eafe9f-024f-4d60-917b-6e867458632d-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "54eafe9f-024f-4d60-917b-6e867458632d" (UID: "54eafe9f-024f-4d60-917b-6e867458632d"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:13:02 crc kubenswrapper[4743]: I1011 01:13:02.091003 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54eafe9f-024f-4d60-917b-6e867458632d-kube-api-access-qbvzf" (OuterVolumeSpecName: "kube-api-access-qbvzf") pod "54eafe9f-024f-4d60-917b-6e867458632d" (UID: "54eafe9f-024f-4d60-917b-6e867458632d"). InnerVolumeSpecName "kube-api-access-qbvzf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:13:02 crc kubenswrapper[4743]: I1011 01:13:02.116117 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47a67cb1-322e-4da4-b240-1a89ff62fa51" path="/var/lib/kubelet/pods/47a67cb1-322e-4da4-b240-1a89ff62fa51/volumes" Oct 11 01:13:02 crc kubenswrapper[4743]: I1011 01:13:02.182785 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qbvzf\" (UniqueName: \"kubernetes.io/projected/54eafe9f-024f-4d60-917b-6e867458632d-kube-api-access-qbvzf\") on node \"crc\" DevicePath \"\"" Oct 11 01:13:02 crc kubenswrapper[4743]: I1011 01:13:02.182828 4743 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/54eafe9f-024f-4d60-917b-6e867458632d-httpd-config\") on node \"crc\" DevicePath \"\"" Oct 11 01:13:02 crc kubenswrapper[4743]: I1011 01:13:02.197812 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54eafe9f-024f-4d60-917b-6e867458632d-config" (OuterVolumeSpecName: "config") pod 
"54eafe9f-024f-4d60-917b-6e867458632d" (UID: "54eafe9f-024f-4d60-917b-6e867458632d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:13:02 crc kubenswrapper[4743]: I1011 01:13:02.202275 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54eafe9f-024f-4d60-917b-6e867458632d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "54eafe9f-024f-4d60-917b-6e867458632d" (UID: "54eafe9f-024f-4d60-917b-6e867458632d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:13:02 crc kubenswrapper[4743]: I1011 01:13:02.222558 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54eafe9f-024f-4d60-917b-6e867458632d-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "54eafe9f-024f-4d60-917b-6e867458632d" (UID: "54eafe9f-024f-4d60-917b-6e867458632d"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:13:02 crc kubenswrapper[4743]: I1011 01:13:02.286425 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54eafe9f-024f-4d60-917b-6e867458632d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 01:13:02 crc kubenswrapper[4743]: I1011 01:13:02.286453 4743 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/54eafe9f-024f-4d60-917b-6e867458632d-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 11 01:13:02 crc kubenswrapper[4743]: I1011 01:13:02.286462 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/54eafe9f-024f-4d60-917b-6e867458632d-config\") on node \"crc\" DevicePath \"\"" Oct 11 01:13:02 crc kubenswrapper[4743]: I1011 01:13:02.310482 4743 generic.go:334] "Generic (PLEG): container finished" podID="54eafe9f-024f-4d60-917b-6e867458632d" containerID="5f210a41596dfcc8d1509da34eba889691b3ccabdb5ae5b9a810960d3018b7f8" exitCode=0 Oct 11 01:13:02 crc kubenswrapper[4743]: I1011 01:13:02.310796 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7dfd9c7974-zppf2" event={"ID":"54eafe9f-024f-4d60-917b-6e867458632d","Type":"ContainerDied","Data":"5f210a41596dfcc8d1509da34eba889691b3ccabdb5ae5b9a810960d3018b7f8"} Oct 11 01:13:02 crc kubenswrapper[4743]: I1011 01:13:02.310826 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7dfd9c7974-zppf2" event={"ID":"54eafe9f-024f-4d60-917b-6e867458632d","Type":"ContainerDied","Data":"f3ba5a3847eade89ce93855d754f6890a97719696b02cf1445a1dac50e1d4fa3"} Oct 11 01:13:02 crc kubenswrapper[4743]: I1011 01:13:02.310842 4743 scope.go:117] "RemoveContainer" containerID="f71b4cf1f0d1af682e06225191bf46c6508528b97a6d00b4bea132078cf70eca" Oct 11 01:13:02 crc kubenswrapper[4743]: I1011 01:13:02.311051 4743 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/neutron-7dfd9c7974-zppf2" Oct 11 01:13:02 crc kubenswrapper[4743]: I1011 01:13:02.317631 4743 generic.go:334] "Generic (PLEG): container finished" podID="41e4f286-9bff-400a-9604-81e12333eb6c" containerID="2ff0e655e08fe7c8152322d7f32d94261beb30382bc21ec202b87af66cc7e7bd" exitCode=0 Oct 11 01:13:02 crc kubenswrapper[4743]: I1011 01:13:02.317909 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-h4nh5" event={"ID":"41e4f286-9bff-400a-9604-81e12333eb6c","Type":"ContainerDied","Data":"2ff0e655e08fe7c8152322d7f32d94261beb30382bc21ec202b87af66cc7e7bd"} Oct 11 01:13:02 crc kubenswrapper[4743]: I1011 01:13:02.318031 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-h4nh5" event={"ID":"41e4f286-9bff-400a-9604-81e12333eb6c","Type":"ContainerStarted","Data":"85c58f2deb3a9d473742c721088e04600848383a2270b78fb9a5b129e9071cb8"} Oct 11 01:13:02 crc kubenswrapper[4743]: I1011 01:13:02.357239 4743 generic.go:334] "Generic (PLEG): container finished" podID="43d98f64-a3a8-4260-94e4-565c740912d9" containerID="5d7c355d803aba783fa48c6398af9330605a486c6f2f725224e4a3e8985f9615" exitCode=0 Oct 11 01:13:02 crc kubenswrapper[4743]: I1011 01:13:02.357321 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-q9wxz" event={"ID":"43d98f64-a3a8-4260-94e4-565c740912d9","Type":"ContainerDied","Data":"5d7c355d803aba783fa48c6398af9330605a486c6f2f725224e4a3e8985f9615"} Oct 11 01:13:02 crc kubenswrapper[4743]: I1011 01:13:02.357345 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-q9wxz" event={"ID":"43d98f64-a3a8-4260-94e4-565c740912d9","Type":"ContainerStarted","Data":"c2f4f16797dc5f0d908e99e41e4390cc719f9e351b69560211ea49c4f115ec4e"} Oct 11 01:13:02 crc kubenswrapper[4743]: I1011 01:13:02.366465 4743 generic.go:334] "Generic (PLEG): container finished" podID="d4193e99-7b11-4285-86c4-7fe1689e4aaa" 
containerID="01d3f5b2ec3ede995b2098ba25deb010dab6bd5d670dc106b836f5684602edc9" exitCode=0 Oct 11 01:13:02 crc kubenswrapper[4743]: I1011 01:13:02.367518 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-dhxf8" event={"ID":"d4193e99-7b11-4285-86c4-7fe1689e4aaa","Type":"ContainerDied","Data":"01d3f5b2ec3ede995b2098ba25deb010dab6bd5d670dc106b836f5684602edc9"} Oct 11 01:13:02 crc kubenswrapper[4743]: I1011 01:13:02.367545 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-dhxf8" event={"ID":"d4193e99-7b11-4285-86c4-7fe1689e4aaa","Type":"ContainerStarted","Data":"d2a7716f61bf671ef4b7474bc11d4a7408553a30b70132a0bd8105ae82cd67c7"} Oct 11 01:13:02 crc kubenswrapper[4743]: I1011 01:13:02.374990 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7dfd9c7974-zppf2"] Oct 11 01:13:02 crc kubenswrapper[4743]: I1011 01:13:02.375662 4743 scope.go:117] "RemoveContainer" containerID="5f210a41596dfcc8d1509da34eba889691b3ccabdb5ae5b9a810960d3018b7f8" Oct 11 01:13:02 crc kubenswrapper[4743]: I1011 01:13:02.393909 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-7dfd9c7974-zppf2"] Oct 11 01:13:02 crc kubenswrapper[4743]: I1011 01:13:02.480043 4743 scope.go:117] "RemoveContainer" containerID="f71b4cf1f0d1af682e06225191bf46c6508528b97a6d00b4bea132078cf70eca" Oct 11 01:13:02 crc kubenswrapper[4743]: E1011 01:13:02.489021 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f71b4cf1f0d1af682e06225191bf46c6508528b97a6d00b4bea132078cf70eca\": container with ID starting with f71b4cf1f0d1af682e06225191bf46c6508528b97a6d00b4bea132078cf70eca not found: ID does not exist" containerID="f71b4cf1f0d1af682e06225191bf46c6508528b97a6d00b4bea132078cf70eca" Oct 11 01:13:02 crc kubenswrapper[4743]: I1011 01:13:02.489079 4743 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f71b4cf1f0d1af682e06225191bf46c6508528b97a6d00b4bea132078cf70eca"} err="failed to get container status \"f71b4cf1f0d1af682e06225191bf46c6508528b97a6d00b4bea132078cf70eca\": rpc error: code = NotFound desc = could not find container \"f71b4cf1f0d1af682e06225191bf46c6508528b97a6d00b4bea132078cf70eca\": container with ID starting with f71b4cf1f0d1af682e06225191bf46c6508528b97a6d00b4bea132078cf70eca not found: ID does not exist" Oct 11 01:13:02 crc kubenswrapper[4743]: I1011 01:13:02.489107 4743 scope.go:117] "RemoveContainer" containerID="5f210a41596dfcc8d1509da34eba889691b3ccabdb5ae5b9a810960d3018b7f8" Oct 11 01:13:02 crc kubenswrapper[4743]: E1011 01:13:02.497008 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f210a41596dfcc8d1509da34eba889691b3ccabdb5ae5b9a810960d3018b7f8\": container with ID starting with 5f210a41596dfcc8d1509da34eba889691b3ccabdb5ae5b9a810960d3018b7f8 not found: ID does not exist" containerID="5f210a41596dfcc8d1509da34eba889691b3ccabdb5ae5b9a810960d3018b7f8" Oct 11 01:13:02 crc kubenswrapper[4743]: I1011 01:13:02.497055 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f210a41596dfcc8d1509da34eba889691b3ccabdb5ae5b9a810960d3018b7f8"} err="failed to get container status \"5f210a41596dfcc8d1509da34eba889691b3ccabdb5ae5b9a810960d3018b7f8\": rpc error: code = NotFound desc = could not find container \"5f210a41596dfcc8d1509da34eba889691b3ccabdb5ae5b9a810960d3018b7f8\": container with ID starting with 5f210a41596dfcc8d1509da34eba889691b3ccabdb5ae5b9a810960d3018b7f8 not found: ID does not exist" Oct 11 01:13:02 crc kubenswrapper[4743]: I1011 01:13:02.546536 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 11 01:13:03 crc kubenswrapper[4743]: I1011 01:13:03.170550 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 11 01:13:03 crc kubenswrapper[4743]: I1011 01:13:03.226412 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d7246d8a-9560-4212-8219-c7ac80cd7152-sg-core-conf-yaml\") pod \"d7246d8a-9560-4212-8219-c7ac80cd7152\" (UID: \"d7246d8a-9560-4212-8219-c7ac80cd7152\") " Oct 11 01:13:03 crc kubenswrapper[4743]: I1011 01:13:03.226464 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7246d8a-9560-4212-8219-c7ac80cd7152-config-data\") pod \"d7246d8a-9560-4212-8219-c7ac80cd7152\" (UID: \"d7246d8a-9560-4212-8219-c7ac80cd7152\") " Oct 11 01:13:03 crc kubenswrapper[4743]: I1011 01:13:03.226525 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d7246d8a-9560-4212-8219-c7ac80cd7152-log-httpd\") pod \"d7246d8a-9560-4212-8219-c7ac80cd7152\" (UID: \"d7246d8a-9560-4212-8219-c7ac80cd7152\") " Oct 11 01:13:03 crc kubenswrapper[4743]: I1011 01:13:03.226549 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7246d8a-9560-4212-8219-c7ac80cd7152-scripts\") pod \"d7246d8a-9560-4212-8219-c7ac80cd7152\" (UID: \"d7246d8a-9560-4212-8219-c7ac80cd7152\") " Oct 11 01:13:03 crc kubenswrapper[4743]: I1011 01:13:03.226630 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kns5r\" (UniqueName: \"kubernetes.io/projected/d7246d8a-9560-4212-8219-c7ac80cd7152-kube-api-access-kns5r\") pod \"d7246d8a-9560-4212-8219-c7ac80cd7152\" (UID: \"d7246d8a-9560-4212-8219-c7ac80cd7152\") " Oct 11 01:13:03 crc kubenswrapper[4743]: I1011 01:13:03.226667 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d7246d8a-9560-4212-8219-c7ac80cd7152-combined-ca-bundle\") pod \"d7246d8a-9560-4212-8219-c7ac80cd7152\" (UID: \"d7246d8a-9560-4212-8219-c7ac80cd7152\") " Oct 11 01:13:03 crc kubenswrapper[4743]: I1011 01:13:03.226730 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d7246d8a-9560-4212-8219-c7ac80cd7152-run-httpd\") pod \"d7246d8a-9560-4212-8219-c7ac80cd7152\" (UID: \"d7246d8a-9560-4212-8219-c7ac80cd7152\") " Oct 11 01:13:03 crc kubenswrapper[4743]: I1011 01:13:03.227431 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7246d8a-9560-4212-8219-c7ac80cd7152-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d7246d8a-9560-4212-8219-c7ac80cd7152" (UID: "d7246d8a-9560-4212-8219-c7ac80cd7152"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:13:03 crc kubenswrapper[4743]: I1011 01:13:03.227551 4743 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d7246d8a-9560-4212-8219-c7ac80cd7152-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 11 01:13:03 crc kubenswrapper[4743]: I1011 01:13:03.230052 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7246d8a-9560-4212-8219-c7ac80cd7152-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d7246d8a-9560-4212-8219-c7ac80cd7152" (UID: "d7246d8a-9560-4212-8219-c7ac80cd7152"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:13:03 crc kubenswrapper[4743]: I1011 01:13:03.233384 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7246d8a-9560-4212-8219-c7ac80cd7152-scripts" (OuterVolumeSpecName: "scripts") pod "d7246d8a-9560-4212-8219-c7ac80cd7152" (UID: "d7246d8a-9560-4212-8219-c7ac80cd7152"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:13:03 crc kubenswrapper[4743]: I1011 01:13:03.235071 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7246d8a-9560-4212-8219-c7ac80cd7152-kube-api-access-kns5r" (OuterVolumeSpecName: "kube-api-access-kns5r") pod "d7246d8a-9560-4212-8219-c7ac80cd7152" (UID: "d7246d8a-9560-4212-8219-c7ac80cd7152"). InnerVolumeSpecName "kube-api-access-kns5r". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:13:03 crc kubenswrapper[4743]: I1011 01:13:03.285059 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7246d8a-9560-4212-8219-c7ac80cd7152-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d7246d8a-9560-4212-8219-c7ac80cd7152" (UID: "d7246d8a-9560-4212-8219-c7ac80cd7152"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:13:03 crc kubenswrapper[4743]: I1011 01:13:03.309402 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7246d8a-9560-4212-8219-c7ac80cd7152-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d7246d8a-9560-4212-8219-c7ac80cd7152" (UID: "d7246d8a-9560-4212-8219-c7ac80cd7152"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:13:03 crc kubenswrapper[4743]: I1011 01:13:03.329439 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kns5r\" (UniqueName: \"kubernetes.io/projected/d7246d8a-9560-4212-8219-c7ac80cd7152-kube-api-access-kns5r\") on node \"crc\" DevicePath \"\"" Oct 11 01:13:03 crc kubenswrapper[4743]: I1011 01:13:03.329470 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7246d8a-9560-4212-8219-c7ac80cd7152-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 01:13:03 crc kubenswrapper[4743]: I1011 01:13:03.329482 4743 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d7246d8a-9560-4212-8219-c7ac80cd7152-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 11 01:13:03 crc kubenswrapper[4743]: I1011 01:13:03.329490 4743 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d7246d8a-9560-4212-8219-c7ac80cd7152-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 11 01:13:03 crc kubenswrapper[4743]: I1011 01:13:03.329498 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7246d8a-9560-4212-8219-c7ac80cd7152-scripts\") on node \"crc\" DevicePath \"\"" Oct 11 01:13:03 crc kubenswrapper[4743]: I1011 01:13:03.370119 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7246d8a-9560-4212-8219-c7ac80cd7152-config-data" (OuterVolumeSpecName: "config-data") pod "d7246d8a-9560-4212-8219-c7ac80cd7152" (UID: "d7246d8a-9560-4212-8219-c7ac80cd7152"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:13:03 crc kubenswrapper[4743]: I1011 01:13:03.389113 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1570e831-5132-4e30-b791-6ac13faaeea4","Type":"ContainerStarted","Data":"e50ef9022d9157cc370c2ccffe0275436b2db94c28d204c3b0c6e1c7cef34a8c"} Oct 11 01:13:03 crc kubenswrapper[4743]: I1011 01:13:03.393378 4743 generic.go:334] "Generic (PLEG): container finished" podID="d7246d8a-9560-4212-8219-c7ac80cd7152" containerID="c5f9e37a17454f1ed04f86d7dae58744f264061e298c54740d65a49894bd0560" exitCode=0 Oct 11 01:13:03 crc kubenswrapper[4743]: I1011 01:13:03.393488 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d7246d8a-9560-4212-8219-c7ac80cd7152","Type":"ContainerDied","Data":"c5f9e37a17454f1ed04f86d7dae58744f264061e298c54740d65a49894bd0560"} Oct 11 01:13:03 crc kubenswrapper[4743]: I1011 01:13:03.393526 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d7246d8a-9560-4212-8219-c7ac80cd7152","Type":"ContainerDied","Data":"9c5a1e2503354355dc2581b3639bc19a615d6c5e22d20664a235a14351a418b0"} Oct 11 01:13:03 crc kubenswrapper[4743]: I1011 01:13:03.393551 4743 scope.go:117] "RemoveContainer" containerID="d4f73c76f87cb075388136a5c2308bb0f7c040ab5eea76e45c831be279d24d65" Oct 11 01:13:03 crc kubenswrapper[4743]: I1011 01:13:03.393675 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 11 01:13:03 crc kubenswrapper[4743]: I1011 01:13:03.431401 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7246d8a-9560-4212-8219-c7ac80cd7152-config-data\") on node \"crc\" DevicePath \"\"" Oct 11 01:13:03 crc kubenswrapper[4743]: I1011 01:13:03.437932 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 11 01:13:03 crc kubenswrapper[4743]: I1011 01:13:03.457029 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 11 01:13:03 crc kubenswrapper[4743]: I1011 01:13:03.465904 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 11 01:13:03 crc kubenswrapper[4743]: I1011 01:13:03.471192 4743 scope.go:117] "RemoveContainer" containerID="0a94cc2959a3caeae8a1da71778a8aee6fad22ab3592c272efa29502d5ebb7ed" Oct 11 01:13:03 crc kubenswrapper[4743]: E1011 01:13:03.472002 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54eafe9f-024f-4d60-917b-6e867458632d" containerName="neutron-api" Oct 11 01:13:03 crc kubenswrapper[4743]: I1011 01:13:03.472035 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="54eafe9f-024f-4d60-917b-6e867458632d" containerName="neutron-api" Oct 11 01:13:03 crc kubenswrapper[4743]: E1011 01:13:03.472047 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7246d8a-9560-4212-8219-c7ac80cd7152" containerName="ceilometer-central-agent" Oct 11 01:13:03 crc kubenswrapper[4743]: I1011 01:13:03.472054 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7246d8a-9560-4212-8219-c7ac80cd7152" containerName="ceilometer-central-agent" Oct 11 01:13:03 crc kubenswrapper[4743]: E1011 01:13:03.472076 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7246d8a-9560-4212-8219-c7ac80cd7152" containerName="ceilometer-notification-agent" Oct 11 01:13:03 crc kubenswrapper[4743]: 
I1011 01:13:03.472082 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7246d8a-9560-4212-8219-c7ac80cd7152" containerName="ceilometer-notification-agent" Oct 11 01:13:03 crc kubenswrapper[4743]: E1011 01:13:03.472094 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7246d8a-9560-4212-8219-c7ac80cd7152" containerName="sg-core" Oct 11 01:13:03 crc kubenswrapper[4743]: I1011 01:13:03.472100 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7246d8a-9560-4212-8219-c7ac80cd7152" containerName="sg-core" Oct 11 01:13:03 crc kubenswrapper[4743]: E1011 01:13:03.472111 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7246d8a-9560-4212-8219-c7ac80cd7152" containerName="proxy-httpd" Oct 11 01:13:03 crc kubenswrapper[4743]: I1011 01:13:03.472116 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7246d8a-9560-4212-8219-c7ac80cd7152" containerName="proxy-httpd" Oct 11 01:13:03 crc kubenswrapper[4743]: E1011 01:13:03.472135 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54eafe9f-024f-4d60-917b-6e867458632d" containerName="neutron-httpd" Oct 11 01:13:03 crc kubenswrapper[4743]: I1011 01:13:03.472143 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="54eafe9f-024f-4d60-917b-6e867458632d" containerName="neutron-httpd" Oct 11 01:13:03 crc kubenswrapper[4743]: I1011 01:13:03.472323 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7246d8a-9560-4212-8219-c7ac80cd7152" containerName="sg-core" Oct 11 01:13:03 crc kubenswrapper[4743]: I1011 01:13:03.472332 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="54eafe9f-024f-4d60-917b-6e867458632d" containerName="neutron-httpd" Oct 11 01:13:03 crc kubenswrapper[4743]: I1011 01:13:03.472345 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7246d8a-9560-4212-8219-c7ac80cd7152" containerName="ceilometer-central-agent" Oct 11 01:13:03 crc kubenswrapper[4743]: I1011 01:13:03.472358 
4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7246d8a-9560-4212-8219-c7ac80cd7152" containerName="ceilometer-notification-agent" Oct 11 01:13:03 crc kubenswrapper[4743]: I1011 01:13:03.472374 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="54eafe9f-024f-4d60-917b-6e867458632d" containerName="neutron-api" Oct 11 01:13:03 crc kubenswrapper[4743]: I1011 01:13:03.472383 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7246d8a-9560-4212-8219-c7ac80cd7152" containerName="proxy-httpd" Oct 11 01:13:03 crc kubenswrapper[4743]: I1011 01:13:03.474508 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 11 01:13:03 crc kubenswrapper[4743]: I1011 01:13:03.479385 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 11 01:13:03 crc kubenswrapper[4743]: I1011 01:13:03.481410 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 11 01:13:03 crc kubenswrapper[4743]: I1011 01:13:03.482067 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 11 01:13:03 crc kubenswrapper[4743]: I1011 01:13:03.538104 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d\") " pod="openstack/ceilometer-0" Oct 11 01:13:03 crc kubenswrapper[4743]: I1011 01:13:03.538165 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d-scripts\") pod \"ceilometer-0\" (UID: \"1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d\") " pod="openstack/ceilometer-0" Oct 11 01:13:03 crc kubenswrapper[4743]: I1011 
01:13:03.538222 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d-run-httpd\") pod \"ceilometer-0\" (UID: \"1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d\") " pod="openstack/ceilometer-0" Oct 11 01:13:03 crc kubenswrapper[4743]: I1011 01:13:03.538246 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d\") " pod="openstack/ceilometer-0" Oct 11 01:13:03 crc kubenswrapper[4743]: I1011 01:13:03.538346 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fq69m\" (UniqueName: \"kubernetes.io/projected/1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d-kube-api-access-fq69m\") pod \"ceilometer-0\" (UID: \"1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d\") " pod="openstack/ceilometer-0" Oct 11 01:13:03 crc kubenswrapper[4743]: I1011 01:13:03.538370 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d-log-httpd\") pod \"ceilometer-0\" (UID: \"1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d\") " pod="openstack/ceilometer-0" Oct 11 01:13:03 crc kubenswrapper[4743]: I1011 01:13:03.538422 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d-config-data\") pod \"ceilometer-0\" (UID: \"1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d\") " pod="openstack/ceilometer-0" Oct 11 01:13:03 crc kubenswrapper[4743]: I1011 01:13:03.548103 4743 scope.go:117] "RemoveContainer" containerID="c5f9e37a17454f1ed04f86d7dae58744f264061e298c54740d65a49894bd0560" 
Oct 11 01:13:03 crc kubenswrapper[4743]: I1011 01:13:03.639936 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fq69m\" (UniqueName: \"kubernetes.io/projected/1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d-kube-api-access-fq69m\") pod \"ceilometer-0\" (UID: \"1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d\") " pod="openstack/ceilometer-0" Oct 11 01:13:03 crc kubenswrapper[4743]: I1011 01:13:03.639978 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d-log-httpd\") pod \"ceilometer-0\" (UID: \"1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d\") " pod="openstack/ceilometer-0" Oct 11 01:13:03 crc kubenswrapper[4743]: I1011 01:13:03.640025 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d-config-data\") pod \"ceilometer-0\" (UID: \"1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d\") " pod="openstack/ceilometer-0" Oct 11 01:13:03 crc kubenswrapper[4743]: I1011 01:13:03.640055 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d\") " pod="openstack/ceilometer-0" Oct 11 01:13:03 crc kubenswrapper[4743]: I1011 01:13:03.640076 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d-scripts\") pod \"ceilometer-0\" (UID: \"1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d\") " pod="openstack/ceilometer-0" Oct 11 01:13:03 crc kubenswrapper[4743]: I1011 01:13:03.640120 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d-run-httpd\") pod \"ceilometer-0\" (UID: \"1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d\") " pod="openstack/ceilometer-0" Oct 11 01:13:03 crc kubenswrapper[4743]: I1011 01:13:03.640142 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d\") " pod="openstack/ceilometer-0" Oct 11 01:13:03 crc kubenswrapper[4743]: I1011 01:13:03.643705 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d-run-httpd\") pod \"ceilometer-0\" (UID: \"1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d\") " pod="openstack/ceilometer-0" Oct 11 01:13:03 crc kubenswrapper[4743]: I1011 01:13:03.645267 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d-log-httpd\") pod \"ceilometer-0\" (UID: \"1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d\") " pod="openstack/ceilometer-0" Oct 11 01:13:03 crc kubenswrapper[4743]: I1011 01:13:03.646752 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d\") " pod="openstack/ceilometer-0" Oct 11 01:13:03 crc kubenswrapper[4743]: I1011 01:13:03.647544 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d-config-data\") pod \"ceilometer-0\" (UID: \"1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d\") " pod="openstack/ceilometer-0" Oct 11 01:13:03 crc kubenswrapper[4743]: I1011 01:13:03.647993 4743 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d-scripts\") pod \"ceilometer-0\" (UID: \"1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d\") " pod="openstack/ceilometer-0" Oct 11 01:13:03 crc kubenswrapper[4743]: I1011 01:13:03.651260 4743 scope.go:117] "RemoveContainer" containerID="27fd22327259dc3b21f40a159ab8864185943820c6906261cc15648a64eafc5e" Oct 11 01:13:03 crc kubenswrapper[4743]: I1011 01:13:03.651800 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d\") " pod="openstack/ceilometer-0" Oct 11 01:13:03 crc kubenswrapper[4743]: I1011 01:13:03.659896 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fq69m\" (UniqueName: \"kubernetes.io/projected/1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d-kube-api-access-fq69m\") pod \"ceilometer-0\" (UID: \"1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d\") " pod="openstack/ceilometer-0" Oct 11 01:13:03 crc kubenswrapper[4743]: I1011 01:13:03.774460 4743 scope.go:117] "RemoveContainer" containerID="d4f73c76f87cb075388136a5c2308bb0f7c040ab5eea76e45c831be279d24d65" Oct 11 01:13:03 crc kubenswrapper[4743]: E1011 01:13:03.777554 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4f73c76f87cb075388136a5c2308bb0f7c040ab5eea76e45c831be279d24d65\": container with ID starting with d4f73c76f87cb075388136a5c2308bb0f7c040ab5eea76e45c831be279d24d65 not found: ID does not exist" containerID="d4f73c76f87cb075388136a5c2308bb0f7c040ab5eea76e45c831be279d24d65" Oct 11 01:13:03 crc kubenswrapper[4743]: I1011 01:13:03.777588 4743 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d4f73c76f87cb075388136a5c2308bb0f7c040ab5eea76e45c831be279d24d65"} err="failed to get container status \"d4f73c76f87cb075388136a5c2308bb0f7c040ab5eea76e45c831be279d24d65\": rpc error: code = NotFound desc = could not find container \"d4f73c76f87cb075388136a5c2308bb0f7c040ab5eea76e45c831be279d24d65\": container with ID starting with d4f73c76f87cb075388136a5c2308bb0f7c040ab5eea76e45c831be279d24d65 not found: ID does not exist" Oct 11 01:13:03 crc kubenswrapper[4743]: I1011 01:13:03.777623 4743 scope.go:117] "RemoveContainer" containerID="0a94cc2959a3caeae8a1da71778a8aee6fad22ab3592c272efa29502d5ebb7ed" Oct 11 01:13:03 crc kubenswrapper[4743]: E1011 01:13:03.778143 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a94cc2959a3caeae8a1da71778a8aee6fad22ab3592c272efa29502d5ebb7ed\": container with ID starting with 0a94cc2959a3caeae8a1da71778a8aee6fad22ab3592c272efa29502d5ebb7ed not found: ID does not exist" containerID="0a94cc2959a3caeae8a1da71778a8aee6fad22ab3592c272efa29502d5ebb7ed" Oct 11 01:13:03 crc kubenswrapper[4743]: I1011 01:13:03.778166 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a94cc2959a3caeae8a1da71778a8aee6fad22ab3592c272efa29502d5ebb7ed"} err="failed to get container status \"0a94cc2959a3caeae8a1da71778a8aee6fad22ab3592c272efa29502d5ebb7ed\": rpc error: code = NotFound desc = could not find container \"0a94cc2959a3caeae8a1da71778a8aee6fad22ab3592c272efa29502d5ebb7ed\": container with ID starting with 0a94cc2959a3caeae8a1da71778a8aee6fad22ab3592c272efa29502d5ebb7ed not found: ID does not exist" Oct 11 01:13:03 crc kubenswrapper[4743]: I1011 01:13:03.778178 4743 scope.go:117] "RemoveContainer" containerID="c5f9e37a17454f1ed04f86d7dae58744f264061e298c54740d65a49894bd0560" Oct 11 01:13:03 crc kubenswrapper[4743]: E1011 01:13:03.779369 4743 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"c5f9e37a17454f1ed04f86d7dae58744f264061e298c54740d65a49894bd0560\": container with ID starting with c5f9e37a17454f1ed04f86d7dae58744f264061e298c54740d65a49894bd0560 not found: ID does not exist" containerID="c5f9e37a17454f1ed04f86d7dae58744f264061e298c54740d65a49894bd0560" Oct 11 01:13:03 crc kubenswrapper[4743]: I1011 01:13:03.779391 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5f9e37a17454f1ed04f86d7dae58744f264061e298c54740d65a49894bd0560"} err="failed to get container status \"c5f9e37a17454f1ed04f86d7dae58744f264061e298c54740d65a49894bd0560\": rpc error: code = NotFound desc = could not find container \"c5f9e37a17454f1ed04f86d7dae58744f264061e298c54740d65a49894bd0560\": container with ID starting with c5f9e37a17454f1ed04f86d7dae58744f264061e298c54740d65a49894bd0560 not found: ID does not exist" Oct 11 01:13:03 crc kubenswrapper[4743]: I1011 01:13:03.779406 4743 scope.go:117] "RemoveContainer" containerID="27fd22327259dc3b21f40a159ab8864185943820c6906261cc15648a64eafc5e" Oct 11 01:13:03 crc kubenswrapper[4743]: E1011 01:13:03.779683 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27fd22327259dc3b21f40a159ab8864185943820c6906261cc15648a64eafc5e\": container with ID starting with 27fd22327259dc3b21f40a159ab8864185943820c6906261cc15648a64eafc5e not found: ID does not exist" containerID="27fd22327259dc3b21f40a159ab8864185943820c6906261cc15648a64eafc5e" Oct 11 01:13:03 crc kubenswrapper[4743]: I1011 01:13:03.779712 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27fd22327259dc3b21f40a159ab8864185943820c6906261cc15648a64eafc5e"} err="failed to get container status \"27fd22327259dc3b21f40a159ab8864185943820c6906261cc15648a64eafc5e\": rpc error: code = NotFound desc = could not find container 
\"27fd22327259dc3b21f40a159ab8864185943820c6906261cc15648a64eafc5e\": container with ID starting with 27fd22327259dc3b21f40a159ab8864185943820c6906261cc15648a64eafc5e not found: ID does not exist" Oct 11 01:13:03 crc kubenswrapper[4743]: I1011 01:13:03.942664 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 11 01:13:03 crc kubenswrapper[4743]: I1011 01:13:03.950759 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-h4nh5" Oct 11 01:13:04 crc kubenswrapper[4743]: I1011 01:13:04.047916 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ls7jv\" (UniqueName: \"kubernetes.io/projected/41e4f286-9bff-400a-9604-81e12333eb6c-kube-api-access-ls7jv\") pod \"41e4f286-9bff-400a-9604-81e12333eb6c\" (UID: \"41e4f286-9bff-400a-9604-81e12333eb6c\") " Oct 11 01:13:04 crc kubenswrapper[4743]: I1011 01:13:04.056275 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41e4f286-9bff-400a-9604-81e12333eb6c-kube-api-access-ls7jv" (OuterVolumeSpecName: "kube-api-access-ls7jv") pod "41e4f286-9bff-400a-9604-81e12333eb6c" (UID: "41e4f286-9bff-400a-9604-81e12333eb6c"). InnerVolumeSpecName "kube-api-access-ls7jv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:13:04 crc kubenswrapper[4743]: I1011 01:13:04.150851 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54eafe9f-024f-4d60-917b-6e867458632d" path="/var/lib/kubelet/pods/54eafe9f-024f-4d60-917b-6e867458632d/volumes" Oct 11 01:13:04 crc kubenswrapper[4743]: I1011 01:13:04.151778 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7246d8a-9560-4212-8219-c7ac80cd7152" path="/var/lib/kubelet/pods/d7246d8a-9560-4212-8219-c7ac80cd7152/volumes" Oct 11 01:13:04 crc kubenswrapper[4743]: I1011 01:13:04.155915 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ls7jv\" (UniqueName: \"kubernetes.io/projected/41e4f286-9bff-400a-9604-81e12333eb6c-kube-api-access-ls7jv\") on node \"crc\" DevicePath \"\"" Oct 11 01:13:04 crc kubenswrapper[4743]: I1011 01:13:04.176736 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-67c9948594-q58d2" Oct 11 01:13:04 crc kubenswrapper[4743]: I1011 01:13:04.245524 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-dhxf8" Oct 11 01:13:04 crc kubenswrapper[4743]: I1011 01:13:04.325821 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-q9wxz" Oct 11 01:13:04 crc kubenswrapper[4743]: I1011 01:13:04.362103 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58b2j\" (UniqueName: \"kubernetes.io/projected/43d98f64-a3a8-4260-94e4-565c740912d9-kube-api-access-58b2j\") pod \"43d98f64-a3a8-4260-94e4-565c740912d9\" (UID: \"43d98f64-a3a8-4260-94e4-565c740912d9\") " Oct 11 01:13:04 crc kubenswrapper[4743]: I1011 01:13:04.362397 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7sv8v\" (UniqueName: \"kubernetes.io/projected/d4193e99-7b11-4285-86c4-7fe1689e4aaa-kube-api-access-7sv8v\") pod \"d4193e99-7b11-4285-86c4-7fe1689e4aaa\" (UID: \"d4193e99-7b11-4285-86c4-7fe1689e4aaa\") " Oct 11 01:13:04 crc kubenswrapper[4743]: I1011 01:13:04.390218 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-67c9948594-q58d2" Oct 11 01:13:04 crc kubenswrapper[4743]: I1011 01:13:04.393650 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43d98f64-a3a8-4260-94e4-565c740912d9-kube-api-access-58b2j" (OuterVolumeSpecName: "kube-api-access-58b2j") pod "43d98f64-a3a8-4260-94e4-565c740912d9" (UID: "43d98f64-a3a8-4260-94e4-565c740912d9"). InnerVolumeSpecName "kube-api-access-58b2j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:13:04 crc kubenswrapper[4743]: I1011 01:13:04.397199 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4193e99-7b11-4285-86c4-7fe1689e4aaa-kube-api-access-7sv8v" (OuterVolumeSpecName: "kube-api-access-7sv8v") pod "d4193e99-7b11-4285-86c4-7fe1689e4aaa" (UID: "d4193e99-7b11-4285-86c4-7fe1689e4aaa"). InnerVolumeSpecName "kube-api-access-7sv8v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:13:04 crc kubenswrapper[4743]: I1011 01:13:04.422361 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1570e831-5132-4e30-b791-6ac13faaeea4","Type":"ContainerStarted","Data":"33fba52c777b1a9593f4ba7a1f412c502fce7ee90b4e95345b911d6ef136ff8e"} Oct 11 01:13:04 crc kubenswrapper[4743]: I1011 01:13:04.438944 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-q9wxz" Oct 11 01:13:04 crc kubenswrapper[4743]: I1011 01:13:04.441176 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-q9wxz" event={"ID":"43d98f64-a3a8-4260-94e4-565c740912d9","Type":"ContainerDied","Data":"c2f4f16797dc5f0d908e99e41e4390cc719f9e351b69560211ea49c4f115ec4e"} Oct 11 01:13:04 crc kubenswrapper[4743]: I1011 01:13:04.441239 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2f4f16797dc5f0d908e99e41e4390cc719f9e351b69560211ea49c4f115ec4e" Oct 11 01:13:04 crc kubenswrapper[4743]: I1011 01:13:04.470447 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7sv8v\" (UniqueName: \"kubernetes.io/projected/d4193e99-7b11-4285-86c4-7fe1689e4aaa-kube-api-access-7sv8v\") on node \"crc\" DevicePath \"\"" Oct 11 01:13:04 crc kubenswrapper[4743]: I1011 01:13:04.470482 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58b2j\" (UniqueName: \"kubernetes.io/projected/43d98f64-a3a8-4260-94e4-565c740912d9-kube-api-access-58b2j\") on node \"crc\" DevicePath \"\"" Oct 11 01:13:04 crc kubenswrapper[4743]: I1011 01:13:04.473139 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-dhxf8" event={"ID":"d4193e99-7b11-4285-86c4-7fe1689e4aaa","Type":"ContainerDied","Data":"d2a7716f61bf671ef4b7474bc11d4a7408553a30b70132a0bd8105ae82cd67c7"} Oct 11 01:13:04 crc kubenswrapper[4743]: I1011 
01:13:04.473194 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d2a7716f61bf671ef4b7474bc11d4a7408553a30b70132a0bd8105ae82cd67c7" Oct 11 01:13:04 crc kubenswrapper[4743]: I1011 01:13:04.473277 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-dhxf8" Oct 11 01:13:04 crc kubenswrapper[4743]: I1011 01:13:04.478439 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-8464ff7fb4-cflsg"] Oct 11 01:13:04 crc kubenswrapper[4743]: I1011 01:13:04.478660 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-8464ff7fb4-cflsg" podUID="ffe3be15-eb32-4556-b0a8-099aa3f9e09b" containerName="barbican-api-log" containerID="cri-o://d4a7bd7d5a4901c481938f32145318f55b80f105120d6621da256b1884c3e303" gracePeriod=30 Oct 11 01:13:04 crc kubenswrapper[4743]: I1011 01:13:04.479068 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-8464ff7fb4-cflsg" podUID="ffe3be15-eb32-4556-b0a8-099aa3f9e09b" containerName="barbican-api" containerID="cri-o://b16be9c60d9b68b7eb2c64a3b283c287454d626202e5f9800cad210192e952fa" gracePeriod=30 Oct 11 01:13:04 crc kubenswrapper[4743]: I1011 01:13:04.524410 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-h4nh5" Oct 11 01:13:04 crc kubenswrapper[4743]: I1011 01:13:04.524606 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-h4nh5" event={"ID":"41e4f286-9bff-400a-9604-81e12333eb6c","Type":"ContainerDied","Data":"85c58f2deb3a9d473742c721088e04600848383a2270b78fb9a5b129e9071cb8"} Oct 11 01:13:04 crc kubenswrapper[4743]: I1011 01:13:04.524655 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="85c58f2deb3a9d473742c721088e04600848383a2270b78fb9a5b129e9071cb8" Oct 11 01:13:04 crc kubenswrapper[4743]: I1011 01:13:04.594567 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 11 01:13:05 crc kubenswrapper[4743]: I1011 01:13:05.533120 4743 generic.go:334] "Generic (PLEG): container finished" podID="ffe3be15-eb32-4556-b0a8-099aa3f9e09b" containerID="d4a7bd7d5a4901c481938f32145318f55b80f105120d6621da256b1884c3e303" exitCode=143 Oct 11 01:13:05 crc kubenswrapper[4743]: I1011 01:13:05.533312 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-8464ff7fb4-cflsg" event={"ID":"ffe3be15-eb32-4556-b0a8-099aa3f9e09b","Type":"ContainerDied","Data":"d4a7bd7d5a4901c481938f32145318f55b80f105120d6621da256b1884c3e303"} Oct 11 01:13:05 crc kubenswrapper[4743]: I1011 01:13:05.535524 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1570e831-5132-4e30-b791-6ac13faaeea4","Type":"ContainerStarted","Data":"6e782156749d6bb331bc56fb8c32beb6d0c436dffdcf6cacef10bb5afab1434c"} Oct 11 01:13:05 crc kubenswrapper[4743]: I1011 01:13:05.536645 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d","Type":"ContainerStarted","Data":"bbc46afe7a604db04d4dab09cc6412577995db755909c365552311c78140a1eb"} Oct 11 01:13:05 crc kubenswrapper[4743]: I1011 01:13:05.552610 4743 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.552593532 podStartE2EDuration="4.552593532s" podCreationTimestamp="2025-10-11 01:13:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 01:13:05.549911821 +0000 UTC m=+1280.202892218" watchObservedRunningTime="2025-10-11 01:13:05.552593532 +0000 UTC m=+1280.205573929" Oct 11 01:13:06 crc kubenswrapper[4743]: I1011 01:13:06.622093 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Oct 11 01:13:06 crc kubenswrapper[4743]: I1011 01:13:06.865935 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 11 01:13:07 crc kubenswrapper[4743]: I1011 01:13:07.643356 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-8464ff7fb4-cflsg" podUID="ffe3be15-eb32-4556-b0a8-099aa3f9e09b" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.184:9311/healthcheck\": read tcp 10.217.0.2:53396->10.217.0.184:9311: read: connection reset by peer" Oct 11 01:13:07 crc kubenswrapper[4743]: I1011 01:13:07.643357 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-8464ff7fb4-cflsg" podUID="ffe3be15-eb32-4556-b0a8-099aa3f9e09b" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.184:9311/healthcheck\": read tcp 10.217.0.2:53406->10.217.0.184:9311: read: connection reset by peer" Oct 11 01:13:08 crc kubenswrapper[4743]: I1011 01:13:08.566732 4743 generic.go:334] "Generic (PLEG): container finished" podID="ffe3be15-eb32-4556-b0a8-099aa3f9e09b" containerID="b16be9c60d9b68b7eb2c64a3b283c287454d626202e5f9800cad210192e952fa" exitCode=0 Oct 11 01:13:08 crc kubenswrapper[4743]: I1011 01:13:08.566811 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-api-8464ff7fb4-cflsg" event={"ID":"ffe3be15-eb32-4556-b0a8-099aa3f9e09b","Type":"ContainerDied","Data":"b16be9c60d9b68b7eb2c64a3b283c287454d626202e5f9800cad210192e952fa"} Oct 11 01:13:09 crc kubenswrapper[4743]: I1011 01:13:09.484983 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-744b8cd687-p7lgl" Oct 11 01:13:09 crc kubenswrapper[4743]: I1011 01:13:09.485399 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-744b8cd687-p7lgl" Oct 11 01:13:09 crc kubenswrapper[4743]: I1011 01:13:09.507358 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-8464ff7fb4-cflsg" podUID="ffe3be15-eb32-4556-b0a8-099aa3f9e09b" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.184:9311/healthcheck\": dial tcp 10.217.0.184:9311: connect: connection refused" Oct 11 01:13:09 crc kubenswrapper[4743]: I1011 01:13:09.507377 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-8464ff7fb4-cflsg" podUID="ffe3be15-eb32-4556-b0a8-099aa3f9e09b" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.184:9311/healthcheck\": dial tcp 10.217.0.184:9311: connect: connection refused" Oct 11 01:13:10 crc kubenswrapper[4743]: I1011 01:13:10.675111 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-bf85-account-create-9q879"] Oct 11 01:13:10 crc kubenswrapper[4743]: E1011 01:13:10.675933 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41e4f286-9bff-400a-9604-81e12333eb6c" containerName="mariadb-database-create" Oct 11 01:13:10 crc kubenswrapper[4743]: I1011 01:13:10.675952 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="41e4f286-9bff-400a-9604-81e12333eb6c" containerName="mariadb-database-create" Oct 11 01:13:10 crc kubenswrapper[4743]: E1011 01:13:10.675992 4743 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="43d98f64-a3a8-4260-94e4-565c740912d9" containerName="mariadb-database-create" Oct 11 01:13:10 crc kubenswrapper[4743]: I1011 01:13:10.676000 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="43d98f64-a3a8-4260-94e4-565c740912d9" containerName="mariadb-database-create" Oct 11 01:13:10 crc kubenswrapper[4743]: E1011 01:13:10.676015 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4193e99-7b11-4285-86c4-7fe1689e4aaa" containerName="mariadb-database-create" Oct 11 01:13:10 crc kubenswrapper[4743]: I1011 01:13:10.676023 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4193e99-7b11-4285-86c4-7fe1689e4aaa" containerName="mariadb-database-create" Oct 11 01:13:10 crc kubenswrapper[4743]: I1011 01:13:10.676259 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="41e4f286-9bff-400a-9604-81e12333eb6c" containerName="mariadb-database-create" Oct 11 01:13:10 crc kubenswrapper[4743]: I1011 01:13:10.676280 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4193e99-7b11-4285-86c4-7fe1689e4aaa" containerName="mariadb-database-create" Oct 11 01:13:10 crc kubenswrapper[4743]: I1011 01:13:10.676292 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="43d98f64-a3a8-4260-94e4-565c740912d9" containerName="mariadb-database-create" Oct 11 01:13:10 crc kubenswrapper[4743]: I1011 01:13:10.677196 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-bf85-account-create-9q879" Oct 11 01:13:10 crc kubenswrapper[4743]: I1011 01:13:10.690510 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-bf85-account-create-9q879"] Oct 11 01:13:10 crc kubenswrapper[4743]: I1011 01:13:10.690535 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Oct 11 01:13:10 crc kubenswrapper[4743]: I1011 01:13:10.748305 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9l9n\" (UniqueName: \"kubernetes.io/projected/be9b0c9b-a917-422d-b3aa-c9011eda53c9-kube-api-access-k9l9n\") pod \"nova-api-bf85-account-create-9q879\" (UID: \"be9b0c9b-a917-422d-b3aa-c9011eda53c9\") " pod="openstack/nova-api-bf85-account-create-9q879" Oct 11 01:13:10 crc kubenswrapper[4743]: I1011 01:13:10.850979 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9l9n\" (UniqueName: \"kubernetes.io/projected/be9b0c9b-a917-422d-b3aa-c9011eda53c9-kube-api-access-k9l9n\") pod \"nova-api-bf85-account-create-9q879\" (UID: \"be9b0c9b-a917-422d-b3aa-c9011eda53c9\") " pod="openstack/nova-api-bf85-account-create-9q879" Oct 11 01:13:10 crc kubenswrapper[4743]: I1011 01:13:10.865616 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-c102-account-create-rcsw5"] Oct 11 01:13:10 crc kubenswrapper[4743]: I1011 01:13:10.867710 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-c102-account-create-rcsw5" Oct 11 01:13:10 crc kubenswrapper[4743]: I1011 01:13:10.871219 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Oct 11 01:13:10 crc kubenswrapper[4743]: I1011 01:13:10.877683 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9l9n\" (UniqueName: \"kubernetes.io/projected/be9b0c9b-a917-422d-b3aa-c9011eda53c9-kube-api-access-k9l9n\") pod \"nova-api-bf85-account-create-9q879\" (UID: \"be9b0c9b-a917-422d-b3aa-c9011eda53c9\") " pod="openstack/nova-api-bf85-account-create-9q879" Oct 11 01:13:10 crc kubenswrapper[4743]: I1011 01:13:10.881147 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-c102-account-create-rcsw5"] Oct 11 01:13:10 crc kubenswrapper[4743]: I1011 01:13:10.898442 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-bf85-account-create-9q879" Oct 11 01:13:10 crc kubenswrapper[4743]: I1011 01:13:10.953642 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d98kf\" (UniqueName: \"kubernetes.io/projected/0e22d773-b000-4dea-aa30-1134c593e7cc-kube-api-access-d98kf\") pod \"nova-cell0-c102-account-create-rcsw5\" (UID: \"0e22d773-b000-4dea-aa30-1134c593e7cc\") " pod="openstack/nova-cell0-c102-account-create-rcsw5" Oct 11 01:13:11 crc kubenswrapper[4743]: I1011 01:13:11.058498 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d98kf\" (UniqueName: \"kubernetes.io/projected/0e22d773-b000-4dea-aa30-1134c593e7cc-kube-api-access-d98kf\") pod \"nova-cell0-c102-account-create-rcsw5\" (UID: \"0e22d773-b000-4dea-aa30-1134c593e7cc\") " pod="openstack/nova-cell0-c102-account-create-rcsw5" Oct 11 01:13:11 crc kubenswrapper[4743]: I1011 01:13:11.066681 4743 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-cell1-4caf-account-create-6hbb4"] Oct 11 01:13:11 crc kubenswrapper[4743]: I1011 01:13:11.068327 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-4caf-account-create-6hbb4" Oct 11 01:13:11 crc kubenswrapper[4743]: I1011 01:13:11.072326 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Oct 11 01:13:11 crc kubenswrapper[4743]: I1011 01:13:11.078845 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-4caf-account-create-6hbb4"] Oct 11 01:13:11 crc kubenswrapper[4743]: I1011 01:13:11.096564 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d98kf\" (UniqueName: \"kubernetes.io/projected/0e22d773-b000-4dea-aa30-1134c593e7cc-kube-api-access-d98kf\") pod \"nova-cell0-c102-account-create-rcsw5\" (UID: \"0e22d773-b000-4dea-aa30-1134c593e7cc\") " pod="openstack/nova-cell0-c102-account-create-rcsw5" Oct 11 01:13:11 crc kubenswrapper[4743]: I1011 01:13:11.155307 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-8464ff7fb4-cflsg" Oct 11 01:13:11 crc kubenswrapper[4743]: I1011 01:13:11.164429 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ps8zn\" (UniqueName: \"kubernetes.io/projected/b83d6968-cf34-4111-b66c-f2de3eb8abce-kube-api-access-ps8zn\") pod \"nova-cell1-4caf-account-create-6hbb4\" (UID: \"b83d6968-cf34-4111-b66c-f2de3eb8abce\") " pod="openstack/nova-cell1-4caf-account-create-6hbb4" Oct 11 01:13:11 crc kubenswrapper[4743]: I1011 01:13:11.232293 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-c102-account-create-rcsw5" Oct 11 01:13:11 crc kubenswrapper[4743]: I1011 01:13:11.265814 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ffe3be15-eb32-4556-b0a8-099aa3f9e09b-config-data-custom\") pod \"ffe3be15-eb32-4556-b0a8-099aa3f9e09b\" (UID: \"ffe3be15-eb32-4556-b0a8-099aa3f9e09b\") " Oct 11 01:13:11 crc kubenswrapper[4743]: I1011 01:13:11.266170 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffe3be15-eb32-4556-b0a8-099aa3f9e09b-config-data\") pod \"ffe3be15-eb32-4556-b0a8-099aa3f9e09b\" (UID: \"ffe3be15-eb32-4556-b0a8-099aa3f9e09b\") " Oct 11 01:13:11 crc kubenswrapper[4743]: I1011 01:13:11.267214 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffe3be15-eb32-4556-b0a8-099aa3f9e09b-combined-ca-bundle\") pod \"ffe3be15-eb32-4556-b0a8-099aa3f9e09b\" (UID: \"ffe3be15-eb32-4556-b0a8-099aa3f9e09b\") " Oct 11 01:13:11 crc kubenswrapper[4743]: I1011 01:13:11.267365 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ffe3be15-eb32-4556-b0a8-099aa3f9e09b-logs\") pod \"ffe3be15-eb32-4556-b0a8-099aa3f9e09b\" (UID: \"ffe3be15-eb32-4556-b0a8-099aa3f9e09b\") " Oct 11 01:13:11 crc kubenswrapper[4743]: I1011 01:13:11.267532 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25jm4\" (UniqueName: \"kubernetes.io/projected/ffe3be15-eb32-4556-b0a8-099aa3f9e09b-kube-api-access-25jm4\") pod \"ffe3be15-eb32-4556-b0a8-099aa3f9e09b\" (UID: \"ffe3be15-eb32-4556-b0a8-099aa3f9e09b\") " Oct 11 01:13:11 crc kubenswrapper[4743]: I1011 01:13:11.268206 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-ps8zn\" (UniqueName: \"kubernetes.io/projected/b83d6968-cf34-4111-b66c-f2de3eb8abce-kube-api-access-ps8zn\") pod \"nova-cell1-4caf-account-create-6hbb4\" (UID: \"b83d6968-cf34-4111-b66c-f2de3eb8abce\") " pod="openstack/nova-cell1-4caf-account-create-6hbb4" Oct 11 01:13:11 crc kubenswrapper[4743]: I1011 01:13:11.271455 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ffe3be15-eb32-4556-b0a8-099aa3f9e09b-logs" (OuterVolumeSpecName: "logs") pod "ffe3be15-eb32-4556-b0a8-099aa3f9e09b" (UID: "ffe3be15-eb32-4556-b0a8-099aa3f9e09b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:13:11 crc kubenswrapper[4743]: I1011 01:13:11.274319 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffe3be15-eb32-4556-b0a8-099aa3f9e09b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ffe3be15-eb32-4556-b0a8-099aa3f9e09b" (UID: "ffe3be15-eb32-4556-b0a8-099aa3f9e09b"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:13:11 crc kubenswrapper[4743]: I1011 01:13:11.275030 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffe3be15-eb32-4556-b0a8-099aa3f9e09b-kube-api-access-25jm4" (OuterVolumeSpecName: "kube-api-access-25jm4") pod "ffe3be15-eb32-4556-b0a8-099aa3f9e09b" (UID: "ffe3be15-eb32-4556-b0a8-099aa3f9e09b"). InnerVolumeSpecName "kube-api-access-25jm4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:13:11 crc kubenswrapper[4743]: I1011 01:13:11.288693 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ps8zn\" (UniqueName: \"kubernetes.io/projected/b83d6968-cf34-4111-b66c-f2de3eb8abce-kube-api-access-ps8zn\") pod \"nova-cell1-4caf-account-create-6hbb4\" (UID: \"b83d6968-cf34-4111-b66c-f2de3eb8abce\") " pod="openstack/nova-cell1-4caf-account-create-6hbb4" Oct 11 01:13:11 crc kubenswrapper[4743]: I1011 01:13:11.309276 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffe3be15-eb32-4556-b0a8-099aa3f9e09b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ffe3be15-eb32-4556-b0a8-099aa3f9e09b" (UID: "ffe3be15-eb32-4556-b0a8-099aa3f9e09b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:13:11 crc kubenswrapper[4743]: I1011 01:13:11.349344 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffe3be15-eb32-4556-b0a8-099aa3f9e09b-config-data" (OuterVolumeSpecName: "config-data") pod "ffe3be15-eb32-4556-b0a8-099aa3f9e09b" (UID: "ffe3be15-eb32-4556-b0a8-099aa3f9e09b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:13:11 crc kubenswrapper[4743]: I1011 01:13:11.371351 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffe3be15-eb32-4556-b0a8-099aa3f9e09b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 01:13:11 crc kubenswrapper[4743]: I1011 01:13:11.371392 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ffe3be15-eb32-4556-b0a8-099aa3f9e09b-logs\") on node \"crc\" DevicePath \"\"" Oct 11 01:13:11 crc kubenswrapper[4743]: I1011 01:13:11.371404 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25jm4\" (UniqueName: \"kubernetes.io/projected/ffe3be15-eb32-4556-b0a8-099aa3f9e09b-kube-api-access-25jm4\") on node \"crc\" DevicePath \"\"" Oct 11 01:13:11 crc kubenswrapper[4743]: I1011 01:13:11.371416 4743 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ffe3be15-eb32-4556-b0a8-099aa3f9e09b-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 11 01:13:11 crc kubenswrapper[4743]: I1011 01:13:11.371425 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffe3be15-eb32-4556-b0a8-099aa3f9e09b-config-data\") on node \"crc\" DevicePath \"\"" Oct 11 01:13:11 crc kubenswrapper[4743]: I1011 01:13:11.448470 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-4caf-account-create-6hbb4" Oct 11 01:13:11 crc kubenswrapper[4743]: I1011 01:13:11.477550 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-bf85-account-create-9q879"] Oct 11 01:13:11 crc kubenswrapper[4743]: I1011 01:13:11.606045 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-bf85-account-create-9q879" event={"ID":"be9b0c9b-a917-422d-b3aa-c9011eda53c9","Type":"ContainerStarted","Data":"952ac55361d232921e43f879be4a372bf0949f1a0ecd424dcd198ca0b5218719"} Oct 11 01:13:11 crc kubenswrapper[4743]: I1011 01:13:11.616276 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-8464ff7fb4-cflsg" event={"ID":"ffe3be15-eb32-4556-b0a8-099aa3f9e09b","Type":"ContainerDied","Data":"accc35cb6f42fa340ac6711d731006e67208463df221504816393af32b3ce5f2"} Oct 11 01:13:11 crc kubenswrapper[4743]: I1011 01:13:11.616343 4743 scope.go:117] "RemoveContainer" containerID="b16be9c60d9b68b7eb2c64a3b283c287454d626202e5f9800cad210192e952fa" Oct 11 01:13:11 crc kubenswrapper[4743]: I1011 01:13:11.616510 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-8464ff7fb4-cflsg" Oct 11 01:13:11 crc kubenswrapper[4743]: I1011 01:13:11.631002 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"c02b1352-1ccf-4856-ad8e-328dab03135e","Type":"ContainerStarted","Data":"51056bd11920be2da6165f6c222ba473ae1d02a6baec8f33600bc17349fc3e48"} Oct 11 01:13:11 crc kubenswrapper[4743]: I1011 01:13:11.638047 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d","Type":"ContainerStarted","Data":"9d69be4229595b4449ea06e4b1ea8db3911c5ab0dcdb6905b1b450714646acbf"} Oct 11 01:13:11 crc kubenswrapper[4743]: I1011 01:13:11.659212 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.254443863 podStartE2EDuration="17.659190282s" podCreationTimestamp="2025-10-11 01:12:54 +0000 UTC" firstStartedPulling="2025-10-11 01:12:55.348709392 +0000 UTC m=+1270.001689789" lastFinishedPulling="2025-10-11 01:13:10.753455801 +0000 UTC m=+1285.406436208" observedRunningTime="2025-10-11 01:13:11.648562607 +0000 UTC m=+1286.301543024" watchObservedRunningTime="2025-10-11 01:13:11.659190282 +0000 UTC m=+1286.312170679" Oct 11 01:13:11 crc kubenswrapper[4743]: I1011 01:13:11.722434 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-c102-account-create-rcsw5"] Oct 11 01:13:11 crc kubenswrapper[4743]: I1011 01:13:11.832262 4743 scope.go:117] "RemoveContainer" containerID="d4a7bd7d5a4901c481938f32145318f55b80f105120d6621da256b1884c3e303" Oct 11 01:13:11 crc kubenswrapper[4743]: I1011 01:13:11.864061 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-8464ff7fb4-cflsg"] Oct 11 01:13:11 crc kubenswrapper[4743]: I1011 01:13:11.873682 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-8464ff7fb4-cflsg"] Oct 11 01:13:11 crc 
kubenswrapper[4743]: I1011 01:13:11.967961 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-4caf-account-create-6hbb4"] Oct 11 01:13:11 crc kubenswrapper[4743]: W1011 01:13:11.975475 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb83d6968_cf34_4111_b66c_f2de3eb8abce.slice/crio-5f58fe72705e571e8d2969895e7910f34af935ad5b2aed53fc1d2157c24631a2 WatchSource:0}: Error finding container 5f58fe72705e571e8d2969895e7910f34af935ad5b2aed53fc1d2157c24631a2: Status 404 returned error can't find the container with id 5f58fe72705e571e8d2969895e7910f34af935ad5b2aed53fc1d2157c24631a2 Oct 11 01:13:12 crc kubenswrapper[4743]: I1011 01:13:12.108390 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffe3be15-eb32-4556-b0a8-099aa3f9e09b" path="/var/lib/kubelet/pods/ffe3be15-eb32-4556-b0a8-099aa3f9e09b/volumes" Oct 11 01:13:12 crc kubenswrapper[4743]: I1011 01:13:12.215468 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 11 01:13:12 crc kubenswrapper[4743]: I1011 01:13:12.671579 4743 generic.go:334] "Generic (PLEG): container finished" podID="0e22d773-b000-4dea-aa30-1134c593e7cc" containerID="b828d9a92d81901efc0daf4decce896ac4c9fefa736474149001eb5d8d0464bf" exitCode=0 Oct 11 01:13:12 crc kubenswrapper[4743]: I1011 01:13:12.671964 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-c102-account-create-rcsw5" event={"ID":"0e22d773-b000-4dea-aa30-1134c593e7cc","Type":"ContainerDied","Data":"b828d9a92d81901efc0daf4decce896ac4c9fefa736474149001eb5d8d0464bf"} Oct 11 01:13:12 crc kubenswrapper[4743]: I1011 01:13:12.672012 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-c102-account-create-rcsw5" 
event={"ID":"0e22d773-b000-4dea-aa30-1134c593e7cc","Type":"ContainerStarted","Data":"b653eee5c1a556911f773baea282368bcce5723f621c997fc99a6671dcc74efc"} Oct 11 01:13:12 crc kubenswrapper[4743]: I1011 01:13:12.676480 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d","Type":"ContainerStarted","Data":"ca00a88290bf7e8f01c0cf70fd87a61cb2ad734ca7dc7c0269e108c52d00faa2"} Oct 11 01:13:12 crc kubenswrapper[4743]: I1011 01:13:12.676507 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d","Type":"ContainerStarted","Data":"c030322bf3f9d22192cda6219d287c61f6d4b397687de7b0ba1b5fade85a26fd"} Oct 11 01:13:12 crc kubenswrapper[4743]: I1011 01:13:12.681291 4743 generic.go:334] "Generic (PLEG): container finished" podID="b83d6968-cf34-4111-b66c-f2de3eb8abce" containerID="9519dc35711165dabb642d86082e4eac64a6d882fb5cae7a93bfede010f611d4" exitCode=0 Oct 11 01:13:12 crc kubenswrapper[4743]: I1011 01:13:12.681338 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-4caf-account-create-6hbb4" event={"ID":"b83d6968-cf34-4111-b66c-f2de3eb8abce","Type":"ContainerDied","Data":"9519dc35711165dabb642d86082e4eac64a6d882fb5cae7a93bfede010f611d4"} Oct 11 01:13:12 crc kubenswrapper[4743]: I1011 01:13:12.681355 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-4caf-account-create-6hbb4" event={"ID":"b83d6968-cf34-4111-b66c-f2de3eb8abce","Type":"ContainerStarted","Data":"5f58fe72705e571e8d2969895e7910f34af935ad5b2aed53fc1d2157c24631a2"} Oct 11 01:13:12 crc kubenswrapper[4743]: I1011 01:13:12.688629 4743 generic.go:334] "Generic (PLEG): container finished" podID="be9b0c9b-a917-422d-b3aa-c9011eda53c9" containerID="d8b39937da4ba3865c6e25f8b9d4d67d3cd574c66ea1501c9949d4bd121e12fb" exitCode=0 Oct 11 01:13:12 crc kubenswrapper[4743]: I1011 01:13:12.688732 4743 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/nova-api-bf85-account-create-9q879" event={"ID":"be9b0c9b-a917-422d-b3aa-c9011eda53c9","Type":"ContainerDied","Data":"d8b39937da4ba3865c6e25f8b9d4d67d3cd574c66ea1501c9949d4bd121e12fb"} Oct 11 01:13:14 crc kubenswrapper[4743]: I1011 01:13:14.345399 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-4caf-account-create-6hbb4" Oct 11 01:13:14 crc kubenswrapper[4743]: I1011 01:13:14.355072 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-c102-account-create-rcsw5" Oct 11 01:13:14 crc kubenswrapper[4743]: I1011 01:13:14.360274 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-bf85-account-create-9q879" Oct 11 01:13:14 crc kubenswrapper[4743]: I1011 01:13:14.444084 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ps8zn\" (UniqueName: \"kubernetes.io/projected/b83d6968-cf34-4111-b66c-f2de3eb8abce-kube-api-access-ps8zn\") pod \"b83d6968-cf34-4111-b66c-f2de3eb8abce\" (UID: \"b83d6968-cf34-4111-b66c-f2de3eb8abce\") " Oct 11 01:13:14 crc kubenswrapper[4743]: I1011 01:13:14.444127 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d98kf\" (UniqueName: \"kubernetes.io/projected/0e22d773-b000-4dea-aa30-1134c593e7cc-kube-api-access-d98kf\") pod \"0e22d773-b000-4dea-aa30-1134c593e7cc\" (UID: \"0e22d773-b000-4dea-aa30-1134c593e7cc\") " Oct 11 01:13:14 crc kubenswrapper[4743]: I1011 01:13:14.444246 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9l9n\" (UniqueName: \"kubernetes.io/projected/be9b0c9b-a917-422d-b3aa-c9011eda53c9-kube-api-access-k9l9n\") pod \"be9b0c9b-a917-422d-b3aa-c9011eda53c9\" (UID: \"be9b0c9b-a917-422d-b3aa-c9011eda53c9\") " Oct 11 01:13:14 crc kubenswrapper[4743]: I1011 01:13:14.449755 4743 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e22d773-b000-4dea-aa30-1134c593e7cc-kube-api-access-d98kf" (OuterVolumeSpecName: "kube-api-access-d98kf") pod "0e22d773-b000-4dea-aa30-1134c593e7cc" (UID: "0e22d773-b000-4dea-aa30-1134c593e7cc"). InnerVolumeSpecName "kube-api-access-d98kf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 11 01:13:14 crc kubenswrapper[4743]: I1011 01:13:14.449803 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be9b0c9b-a917-422d-b3aa-c9011eda53c9-kube-api-access-k9l9n" (OuterVolumeSpecName: "kube-api-access-k9l9n") pod "be9b0c9b-a917-422d-b3aa-c9011eda53c9" (UID: "be9b0c9b-a917-422d-b3aa-c9011eda53c9"). InnerVolumeSpecName "kube-api-access-k9l9n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 11 01:13:14 crc kubenswrapper[4743]: I1011 01:13:14.451514 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b83d6968-cf34-4111-b66c-f2de3eb8abce-kube-api-access-ps8zn" (OuterVolumeSpecName: "kube-api-access-ps8zn") pod "b83d6968-cf34-4111-b66c-f2de3eb8abce" (UID: "b83d6968-cf34-4111-b66c-f2de3eb8abce"). InnerVolumeSpecName "kube-api-access-ps8zn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 11 01:13:14 crc kubenswrapper[4743]: I1011 01:13:14.459975 4743 patch_prober.go:28] interesting pod/machine-config-daemon-cvm72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 11 01:13:14 crc kubenswrapper[4743]: I1011 01:13:14.460023 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 11 01:13:14 crc kubenswrapper[4743]: I1011 01:13:14.546149 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ps8zn\" (UniqueName: \"kubernetes.io/projected/b83d6968-cf34-4111-b66c-f2de3eb8abce-kube-api-access-ps8zn\") on node \"crc\" DevicePath \"\""
Oct 11 01:13:14 crc kubenswrapper[4743]: I1011 01:13:14.546504 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d98kf\" (UniqueName: \"kubernetes.io/projected/0e22d773-b000-4dea-aa30-1134c593e7cc-kube-api-access-d98kf\") on node \"crc\" DevicePath \"\""
Oct 11 01:13:14 crc kubenswrapper[4743]: I1011 01:13:14.546517 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k9l9n\" (UniqueName: \"kubernetes.io/projected/be9b0c9b-a917-422d-b3aa-c9011eda53c9-kube-api-access-k9l9n\") on node \"crc\" DevicePath \"\""
Oct 11 01:13:14 crc kubenswrapper[4743]: I1011 01:13:14.680647 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-6c678b5cf4-bmh48"]
Oct 11 01:13:14 crc kubenswrapper[4743]: E1011 01:13:14.681120 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be9b0c9b-a917-422d-b3aa-c9011eda53c9" containerName="mariadb-account-create"
Oct 11 01:13:14 crc kubenswrapper[4743]: I1011 01:13:14.681143 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="be9b0c9b-a917-422d-b3aa-c9011eda53c9" containerName="mariadb-account-create"
Oct 11 01:13:14 crc kubenswrapper[4743]: E1011 01:13:14.681153 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b83d6968-cf34-4111-b66c-f2de3eb8abce" containerName="mariadb-account-create"
Oct 11 01:13:14 crc kubenswrapper[4743]: I1011 01:13:14.681163 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="b83d6968-cf34-4111-b66c-f2de3eb8abce" containerName="mariadb-account-create"
Oct 11 01:13:14 crc kubenswrapper[4743]: E1011 01:13:14.681173 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffe3be15-eb32-4556-b0a8-099aa3f9e09b" containerName="barbican-api-log"
Oct 11 01:13:14 crc kubenswrapper[4743]: I1011 01:13:14.681181 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffe3be15-eb32-4556-b0a8-099aa3f9e09b" containerName="barbican-api-log"
Oct 11 01:13:14 crc kubenswrapper[4743]: E1011 01:13:14.681205 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffe3be15-eb32-4556-b0a8-099aa3f9e09b" containerName="barbican-api"
Oct 11 01:13:14 crc kubenswrapper[4743]: I1011 01:13:14.681213 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffe3be15-eb32-4556-b0a8-099aa3f9e09b" containerName="barbican-api"
Oct 11 01:13:14 crc kubenswrapper[4743]: E1011 01:13:14.681226 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e22d773-b000-4dea-aa30-1134c593e7cc" containerName="mariadb-account-create"
Oct 11 01:13:14 crc kubenswrapper[4743]: I1011 01:13:14.681236 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e22d773-b000-4dea-aa30-1134c593e7cc" containerName="mariadb-account-create"
Oct 11 01:13:14 crc kubenswrapper[4743]: I1011 01:13:14.681494 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffe3be15-eb32-4556-b0a8-099aa3f9e09b" containerName="barbican-api-log"
Oct 11 01:13:14 crc kubenswrapper[4743]: I1011 01:13:14.681533 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffe3be15-eb32-4556-b0a8-099aa3f9e09b" containerName="barbican-api"
Oct 11 01:13:14 crc kubenswrapper[4743]: I1011 01:13:14.681545 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e22d773-b000-4dea-aa30-1134c593e7cc" containerName="mariadb-account-create"
Oct 11 01:13:14 crc kubenswrapper[4743]: I1011 01:13:14.681557 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="be9b0c9b-a917-422d-b3aa-c9011eda53c9" containerName="mariadb-account-create"
Oct 11 01:13:14 crc kubenswrapper[4743]: I1011 01:13:14.681567 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="b83d6968-cf34-4111-b66c-f2de3eb8abce" containerName="mariadb-account-create"
Oct 11 01:13:14 crc kubenswrapper[4743]: I1011 01:13:14.682408 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-6c678b5cf4-bmh48"
Oct 11 01:13:14 crc kubenswrapper[4743]: I1011 01:13:14.686343 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-8n2gf"
Oct 11 01:13:14 crc kubenswrapper[4743]: I1011 01:13:14.686507 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data"
Oct 11 01:13:14 crc kubenswrapper[4743]: I1011 01:13:14.688709 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data"
Oct 11 01:13:14 crc kubenswrapper[4743]: I1011 01:13:14.713589 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-6c678b5cf4-bmh48"]
Oct 11 01:13:14 crc kubenswrapper[4743]: I1011 01:13:14.713762 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-c102-account-create-rcsw5"
Oct 11 01:13:14 crc kubenswrapper[4743]: I1011 01:13:14.714958 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-c102-account-create-rcsw5" event={"ID":"0e22d773-b000-4dea-aa30-1134c593e7cc","Type":"ContainerDied","Data":"b653eee5c1a556911f773baea282368bcce5723f621c997fc99a6671dcc74efc"}
Oct 11 01:13:14 crc kubenswrapper[4743]: I1011 01:13:14.714989 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b653eee5c1a556911f773baea282368bcce5723f621c997fc99a6671dcc74efc"
Oct 11 01:13:14 crc kubenswrapper[4743]: I1011 01:13:14.732410 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d","Type":"ContainerStarted","Data":"2fa637cdca5ec7b8ba10e3f5500c78d59f8d3fc766b66b9206dac1eabf945d23"}
Oct 11 01:13:14 crc kubenswrapper[4743]: I1011 01:13:14.732478 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Oct 11 01:13:14 crc kubenswrapper[4743]: I1011 01:13:14.741705 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-4caf-account-create-6hbb4"
Oct 11 01:13:14 crc kubenswrapper[4743]: I1011 01:13:14.744353 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-4caf-account-create-6hbb4" event={"ID":"b83d6968-cf34-4111-b66c-f2de3eb8abce","Type":"ContainerDied","Data":"5f58fe72705e571e8d2969895e7910f34af935ad5b2aed53fc1d2157c24631a2"}
Oct 11 01:13:14 crc kubenswrapper[4743]: I1011 01:13:14.744381 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f58fe72705e571e8d2969895e7910f34af935ad5b2aed53fc1d2157c24631a2"
Oct 11 01:13:14 crc kubenswrapper[4743]: I1011 01:13:14.752171 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/af4301e6-88e1-4694-85a7-1215badf534d-config-data-custom\") pod \"heat-engine-6c678b5cf4-bmh48\" (UID: \"af4301e6-88e1-4694-85a7-1215badf534d\") " pod="openstack/heat-engine-6c678b5cf4-bmh48"
Oct 11 01:13:14 crc kubenswrapper[4743]: I1011 01:13:14.752440 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8xjx\" (UniqueName: \"kubernetes.io/projected/af4301e6-88e1-4694-85a7-1215badf534d-kube-api-access-n8xjx\") pod \"heat-engine-6c678b5cf4-bmh48\" (UID: \"af4301e6-88e1-4694-85a7-1215badf534d\") " pod="openstack/heat-engine-6c678b5cf4-bmh48"
Oct 11 01:13:14 crc kubenswrapper[4743]: I1011 01:13:14.757237 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af4301e6-88e1-4694-85a7-1215badf534d-combined-ca-bundle\") pod \"heat-engine-6c678b5cf4-bmh48\" (UID: \"af4301e6-88e1-4694-85a7-1215badf534d\") " pod="openstack/heat-engine-6c678b5cf4-bmh48"
Oct 11 01:13:14 crc kubenswrapper[4743]: I1011 01:13:14.757469 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af4301e6-88e1-4694-85a7-1215badf534d-config-data\") pod \"heat-engine-6c678b5cf4-bmh48\" (UID: \"af4301e6-88e1-4694-85a7-1215badf534d\") " pod="openstack/heat-engine-6c678b5cf4-bmh48"
Oct 11 01:13:14 crc kubenswrapper[4743]: I1011 01:13:14.766343 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.193233634 podStartE2EDuration="11.766326957s" podCreationTimestamp="2025-10-11 01:13:03 +0000 UTC" firstStartedPulling="2025-10-11 01:13:04.614576397 +0000 UTC m=+1279.267556794" lastFinishedPulling="2025-10-11 01:13:14.18766972 +0000 UTC m=+1288.840650117" observedRunningTime="2025-10-11 01:13:14.756782447 +0000 UTC m=+1289.409762854" watchObservedRunningTime="2025-10-11 01:13:14.766326957 +0000 UTC m=+1289.419307354"
Oct 11 01:13:14 crc kubenswrapper[4743]: I1011 01:13:14.766457 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-bf85-account-create-9q879"
Oct 11 01:13:14 crc kubenswrapper[4743]: I1011 01:13:14.766880 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-bf85-account-create-9q879" event={"ID":"be9b0c9b-a917-422d-b3aa-c9011eda53c9","Type":"ContainerDied","Data":"952ac55361d232921e43f879be4a372bf0949f1a0ecd424dcd198ca0b5218719"}
Oct 11 01:13:14 crc kubenswrapper[4743]: I1011 01:13:14.767845 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="952ac55361d232921e43f879be4a372bf0949f1a0ecd424dcd198ca0b5218719"
Oct 11 01:13:14 crc kubenswrapper[4743]: I1011 01:13:14.816494 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-59cbbb87f4-s7qvc"]
Oct 11 01:13:14 crc kubenswrapper[4743]: I1011 01:13:14.817892 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-59cbbb87f4-s7qvc"
Oct 11 01:13:14 crc kubenswrapper[4743]: I1011 01:13:14.828446 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data"
Oct 11 01:13:14 crc kubenswrapper[4743]: I1011 01:13:14.838969 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-688b9f5b49-zn2q6"]
Oct 11 01:13:14 crc kubenswrapper[4743]: I1011 01:13:14.840699 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-688b9f5b49-zn2q6"
Oct 11 01:13:14 crc kubenswrapper[4743]: I1011 01:13:14.860931 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/af4301e6-88e1-4694-85a7-1215badf534d-config-data-custom\") pod \"heat-engine-6c678b5cf4-bmh48\" (UID: \"af4301e6-88e1-4694-85a7-1215badf534d\") " pod="openstack/heat-engine-6c678b5cf4-bmh48"
Oct 11 01:13:14 crc kubenswrapper[4743]: I1011 01:13:14.861026 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8xjx\" (UniqueName: \"kubernetes.io/projected/af4301e6-88e1-4694-85a7-1215badf534d-kube-api-access-n8xjx\") pod \"heat-engine-6c678b5cf4-bmh48\" (UID: \"af4301e6-88e1-4694-85a7-1215badf534d\") " pod="openstack/heat-engine-6c678b5cf4-bmh48"
Oct 11 01:13:14 crc kubenswrapper[4743]: I1011 01:13:14.861054 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af4301e6-88e1-4694-85a7-1215badf534d-combined-ca-bundle\") pod \"heat-engine-6c678b5cf4-bmh48\" (UID: \"af4301e6-88e1-4694-85a7-1215badf534d\") " pod="openstack/heat-engine-6c678b5cf4-bmh48"
Oct 11 01:13:14 crc kubenswrapper[4743]: I1011 01:13:14.861115 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af4301e6-88e1-4694-85a7-1215badf534d-config-data\") pod \"heat-engine-6c678b5cf4-bmh48\" (UID: \"af4301e6-88e1-4694-85a7-1215badf534d\") " pod="openstack/heat-engine-6c678b5cf4-bmh48"
Oct 11 01:13:14 crc kubenswrapper[4743]: I1011 01:13:14.863243 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-59cbbb87f4-s7qvc"]
Oct 11 01:13:14 crc kubenswrapper[4743]: I1011 01:13:14.872030 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af4301e6-88e1-4694-85a7-1215badf534d-combined-ca-bundle\") pod \"heat-engine-6c678b5cf4-bmh48\" (UID: \"af4301e6-88e1-4694-85a7-1215badf534d\") " pod="openstack/heat-engine-6c678b5cf4-bmh48"
Oct 11 01:13:14 crc kubenswrapper[4743]: I1011 01:13:14.873047 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af4301e6-88e1-4694-85a7-1215badf534d-config-data\") pod \"heat-engine-6c678b5cf4-bmh48\" (UID: \"af4301e6-88e1-4694-85a7-1215badf534d\") " pod="openstack/heat-engine-6c678b5cf4-bmh48"
Oct 11 01:13:14 crc kubenswrapper[4743]: I1011 01:13:14.874686 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/af4301e6-88e1-4694-85a7-1215badf534d-config-data-custom\") pod \"heat-engine-6c678b5cf4-bmh48\" (UID: \"af4301e6-88e1-4694-85a7-1215badf534d\") " pod="openstack/heat-engine-6c678b5cf4-bmh48"
Oct 11 01:13:14 crc kubenswrapper[4743]: I1011 01:13:14.885901 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-688b9f5b49-zn2q6"]
Oct 11 01:13:14 crc kubenswrapper[4743]: I1011 01:13:14.889993 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8xjx\" (UniqueName: \"kubernetes.io/projected/af4301e6-88e1-4694-85a7-1215badf534d-kube-api-access-n8xjx\") pod \"heat-engine-6c678b5cf4-bmh48\" (UID: \"af4301e6-88e1-4694-85a7-1215badf534d\") " pod="openstack/heat-engine-6c678b5cf4-bmh48"
Oct 11 01:13:14 crc kubenswrapper[4743]: I1011 01:13:14.894995 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-744f6c8d8b-rxjq4"]
Oct 11 01:13:14 crc kubenswrapper[4743]: I1011 01:13:14.896300 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-744f6c8d8b-rxjq4"
Oct 11 01:13:14 crc kubenswrapper[4743]: I1011 01:13:14.901653 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data"
Oct 11 01:13:14 crc kubenswrapper[4743]: I1011 01:13:14.920403 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-744f6c8d8b-rxjq4"]
Oct 11 01:13:14 crc kubenswrapper[4743]: I1011 01:13:14.969266 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4h7nt\" (UniqueName: \"kubernetes.io/projected/503af092-8c8f-4bdb-a27c-a76f98794769-kube-api-access-4h7nt\") pod \"heat-cfnapi-59cbbb87f4-s7qvc\" (UID: \"503af092-8c8f-4bdb-a27c-a76f98794769\") " pod="openstack/heat-cfnapi-59cbbb87f4-s7qvc"
Oct 11 01:13:14 crc kubenswrapper[4743]: I1011 01:13:14.969340 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/503af092-8c8f-4bdb-a27c-a76f98794769-combined-ca-bundle\") pod \"heat-cfnapi-59cbbb87f4-s7qvc\" (UID: \"503af092-8c8f-4bdb-a27c-a76f98794769\") " pod="openstack/heat-cfnapi-59cbbb87f4-s7qvc"
Oct 11 01:13:14 crc kubenswrapper[4743]: I1011 01:13:14.969371 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0eee5a3c-bfbc-4975-ae73-2a33d414993d-dns-svc\") pod \"dnsmasq-dns-688b9f5b49-zn2q6\" (UID: \"0eee5a3c-bfbc-4975-ae73-2a33d414993d\") " pod="openstack/dnsmasq-dns-688b9f5b49-zn2q6"
Oct 11 01:13:14 crc kubenswrapper[4743]: I1011 01:13:14.969414 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0eee5a3c-bfbc-4975-ae73-2a33d414993d-dns-swift-storage-0\") pod \"dnsmasq-dns-688b9f5b49-zn2q6\" (UID: \"0eee5a3c-bfbc-4975-ae73-2a33d414993d\") " pod="openstack/dnsmasq-dns-688b9f5b49-zn2q6"
Oct 11 01:13:14 crc kubenswrapper[4743]: I1011 01:13:14.969481 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8844a903-e7bc-4d03-902d-981c12d7875e-config-data-custom\") pod \"heat-api-744f6c8d8b-rxjq4\" (UID: \"8844a903-e7bc-4d03-902d-981c12d7875e\") " pod="openstack/heat-api-744f6c8d8b-rxjq4"
Oct 11 01:13:14 crc kubenswrapper[4743]: I1011 01:13:14.969542 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tljpl\" (UniqueName: \"kubernetes.io/projected/0eee5a3c-bfbc-4975-ae73-2a33d414993d-kube-api-access-tljpl\") pod \"dnsmasq-dns-688b9f5b49-zn2q6\" (UID: \"0eee5a3c-bfbc-4975-ae73-2a33d414993d\") " pod="openstack/dnsmasq-dns-688b9f5b49-zn2q6"
Oct 11 01:13:14 crc kubenswrapper[4743]: I1011 01:13:14.969608 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0eee5a3c-bfbc-4975-ae73-2a33d414993d-ovsdbserver-sb\") pod \"dnsmasq-dns-688b9f5b49-zn2q6\" (UID: \"0eee5a3c-bfbc-4975-ae73-2a33d414993d\") " pod="openstack/dnsmasq-dns-688b9f5b49-zn2q6"
Oct 11 01:13:14 crc kubenswrapper[4743]: I1011 01:13:14.969682 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8844a903-e7bc-4d03-902d-981c12d7875e-config-data\") pod \"heat-api-744f6c8d8b-rxjq4\" (UID: \"8844a903-e7bc-4d03-902d-981c12d7875e\") " pod="openstack/heat-api-744f6c8d8b-rxjq4"
Oct 11 01:13:14 crc kubenswrapper[4743]: I1011 01:13:14.969715 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mxtp\" (UniqueName: \"kubernetes.io/projected/8844a903-e7bc-4d03-902d-981c12d7875e-kube-api-access-7mxtp\") pod \"heat-api-744f6c8d8b-rxjq4\" (UID: \"8844a903-e7bc-4d03-902d-981c12d7875e\") " pod="openstack/heat-api-744f6c8d8b-rxjq4"
Oct 11 01:13:14 crc kubenswrapper[4743]: I1011 01:13:14.969770 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8844a903-e7bc-4d03-902d-981c12d7875e-combined-ca-bundle\") pod \"heat-api-744f6c8d8b-rxjq4\" (UID: \"8844a903-e7bc-4d03-902d-981c12d7875e\") " pod="openstack/heat-api-744f6c8d8b-rxjq4"
Oct 11 01:13:14 crc kubenswrapper[4743]: I1011 01:13:14.969795 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0eee5a3c-bfbc-4975-ae73-2a33d414993d-config\") pod \"dnsmasq-dns-688b9f5b49-zn2q6\" (UID: \"0eee5a3c-bfbc-4975-ae73-2a33d414993d\") " pod="openstack/dnsmasq-dns-688b9f5b49-zn2q6"
Oct 11 01:13:14 crc kubenswrapper[4743]: I1011 01:13:14.969845 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/503af092-8c8f-4bdb-a27c-a76f98794769-config-data-custom\") pod \"heat-cfnapi-59cbbb87f4-s7qvc\" (UID: \"503af092-8c8f-4bdb-a27c-a76f98794769\") " pod="openstack/heat-cfnapi-59cbbb87f4-s7qvc"
Oct 11 01:13:14 crc kubenswrapper[4743]: I1011 01:13:14.971020 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/503af092-8c8f-4bdb-a27c-a76f98794769-config-data\") pod \"heat-cfnapi-59cbbb87f4-s7qvc\" (UID: \"503af092-8c8f-4bdb-a27c-a76f98794769\") " pod="openstack/heat-cfnapi-59cbbb87f4-s7qvc"
Oct 11 01:13:14 crc kubenswrapper[4743]: I1011 01:13:14.971121 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0eee5a3c-bfbc-4975-ae73-2a33d414993d-ovsdbserver-nb\") pod \"dnsmasq-dns-688b9f5b49-zn2q6\" (UID: \"0eee5a3c-bfbc-4975-ae73-2a33d414993d\") " pod="openstack/dnsmasq-dns-688b9f5b49-zn2q6"
Oct 11 01:13:14 crc kubenswrapper[4743]: I1011 01:13:14.998795 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-6c678b5cf4-bmh48"
Oct 11 01:13:15 crc kubenswrapper[4743]: I1011 01:13:15.072955 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4h7nt\" (UniqueName: \"kubernetes.io/projected/503af092-8c8f-4bdb-a27c-a76f98794769-kube-api-access-4h7nt\") pod \"heat-cfnapi-59cbbb87f4-s7qvc\" (UID: \"503af092-8c8f-4bdb-a27c-a76f98794769\") " pod="openstack/heat-cfnapi-59cbbb87f4-s7qvc"
Oct 11 01:13:15 crc kubenswrapper[4743]: I1011 01:13:15.073007 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/503af092-8c8f-4bdb-a27c-a76f98794769-combined-ca-bundle\") pod \"heat-cfnapi-59cbbb87f4-s7qvc\" (UID: \"503af092-8c8f-4bdb-a27c-a76f98794769\") " pod="openstack/heat-cfnapi-59cbbb87f4-s7qvc"
Oct 11 01:13:15 crc kubenswrapper[4743]: I1011 01:13:15.073027 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0eee5a3c-bfbc-4975-ae73-2a33d414993d-dns-svc\") pod \"dnsmasq-dns-688b9f5b49-zn2q6\" (UID: \"0eee5a3c-bfbc-4975-ae73-2a33d414993d\") " pod="openstack/dnsmasq-dns-688b9f5b49-zn2q6"
Oct 11 01:13:15 crc kubenswrapper[4743]: I1011 01:13:15.073050 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0eee5a3c-bfbc-4975-ae73-2a33d414993d-dns-swift-storage-0\") pod \"dnsmasq-dns-688b9f5b49-zn2q6\" (UID: \"0eee5a3c-bfbc-4975-ae73-2a33d414993d\") " pod="openstack/dnsmasq-dns-688b9f5b49-zn2q6"
Oct 11 01:13:15 crc kubenswrapper[4743]: I1011 01:13:15.073080 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8844a903-e7bc-4d03-902d-981c12d7875e-config-data-custom\") pod \"heat-api-744f6c8d8b-rxjq4\" (UID: \"8844a903-e7bc-4d03-902d-981c12d7875e\") " pod="openstack/heat-api-744f6c8d8b-rxjq4"
Oct 11 01:13:15 crc kubenswrapper[4743]: I1011 01:13:15.073107 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tljpl\" (UniqueName: \"kubernetes.io/projected/0eee5a3c-bfbc-4975-ae73-2a33d414993d-kube-api-access-tljpl\") pod \"dnsmasq-dns-688b9f5b49-zn2q6\" (UID: \"0eee5a3c-bfbc-4975-ae73-2a33d414993d\") " pod="openstack/dnsmasq-dns-688b9f5b49-zn2q6"
Oct 11 01:13:15 crc kubenswrapper[4743]: I1011 01:13:15.073135 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0eee5a3c-bfbc-4975-ae73-2a33d414993d-ovsdbserver-sb\") pod \"dnsmasq-dns-688b9f5b49-zn2q6\" (UID: \"0eee5a3c-bfbc-4975-ae73-2a33d414993d\") " pod="openstack/dnsmasq-dns-688b9f5b49-zn2q6"
Oct 11 01:13:15 crc kubenswrapper[4743]: I1011 01:13:15.073164 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8844a903-e7bc-4d03-902d-981c12d7875e-config-data\") pod \"heat-api-744f6c8d8b-rxjq4\" (UID: \"8844a903-e7bc-4d03-902d-981c12d7875e\") " pod="openstack/heat-api-744f6c8d8b-rxjq4"
Oct 11 01:13:15 crc kubenswrapper[4743]: I1011 01:13:15.073186 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mxtp\" (UniqueName: \"kubernetes.io/projected/8844a903-e7bc-4d03-902d-981c12d7875e-kube-api-access-7mxtp\") pod \"heat-api-744f6c8d8b-rxjq4\" (UID: \"8844a903-e7bc-4d03-902d-981c12d7875e\") " pod="openstack/heat-api-744f6c8d8b-rxjq4"
Oct 11 01:13:15 crc kubenswrapper[4743]: I1011 01:13:15.073212 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8844a903-e7bc-4d03-902d-981c12d7875e-combined-ca-bundle\") pod \"heat-api-744f6c8d8b-rxjq4\" (UID: \"8844a903-e7bc-4d03-902d-981c12d7875e\") " pod="openstack/heat-api-744f6c8d8b-rxjq4"
Oct 11 01:13:15 crc kubenswrapper[4743]: I1011 01:13:15.073231 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0eee5a3c-bfbc-4975-ae73-2a33d414993d-config\") pod \"dnsmasq-dns-688b9f5b49-zn2q6\" (UID: \"0eee5a3c-bfbc-4975-ae73-2a33d414993d\") " pod="openstack/dnsmasq-dns-688b9f5b49-zn2q6"
Oct 11 01:13:15 crc kubenswrapper[4743]: I1011 01:13:15.073256 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/503af092-8c8f-4bdb-a27c-a76f98794769-config-data-custom\") pod \"heat-cfnapi-59cbbb87f4-s7qvc\" (UID: \"503af092-8c8f-4bdb-a27c-a76f98794769\") " pod="openstack/heat-cfnapi-59cbbb87f4-s7qvc"
Oct 11 01:13:15 crc kubenswrapper[4743]: I1011 01:13:15.073284 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/503af092-8c8f-4bdb-a27c-a76f98794769-config-data\") pod \"heat-cfnapi-59cbbb87f4-s7qvc\" (UID: \"503af092-8c8f-4bdb-a27c-a76f98794769\") " pod="openstack/heat-cfnapi-59cbbb87f4-s7qvc"
Oct 11 01:13:15 crc kubenswrapper[4743]: I1011 01:13:15.073318 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0eee5a3c-bfbc-4975-ae73-2a33d414993d-ovsdbserver-nb\") pod \"dnsmasq-dns-688b9f5b49-zn2q6\" (UID: \"0eee5a3c-bfbc-4975-ae73-2a33d414993d\") " pod="openstack/dnsmasq-dns-688b9f5b49-zn2q6"
Oct 11 01:13:15 crc kubenswrapper[4743]: I1011 01:13:15.074402 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0eee5a3c-bfbc-4975-ae73-2a33d414993d-dns-svc\") pod \"dnsmasq-dns-688b9f5b49-zn2q6\" (UID: \"0eee5a3c-bfbc-4975-ae73-2a33d414993d\") " pod="openstack/dnsmasq-dns-688b9f5b49-zn2q6"
Oct 11 01:13:15 crc kubenswrapper[4743]: I1011 01:13:15.075880 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0eee5a3c-bfbc-4975-ae73-2a33d414993d-ovsdbserver-nb\") pod \"dnsmasq-dns-688b9f5b49-zn2q6\" (UID: \"0eee5a3c-bfbc-4975-ae73-2a33d414993d\") " pod="openstack/dnsmasq-dns-688b9f5b49-zn2q6"
Oct 11 01:13:15 crc kubenswrapper[4743]: I1011 01:13:15.075894 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0eee5a3c-bfbc-4975-ae73-2a33d414993d-config\") pod \"dnsmasq-dns-688b9f5b49-zn2q6\" (UID: \"0eee5a3c-bfbc-4975-ae73-2a33d414993d\") " pod="openstack/dnsmasq-dns-688b9f5b49-zn2q6"
Oct 11 01:13:15 crc kubenswrapper[4743]: I1011 01:13:15.075956 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0eee5a3c-bfbc-4975-ae73-2a33d414993d-dns-swift-storage-0\") pod \"dnsmasq-dns-688b9f5b49-zn2q6\" (UID: \"0eee5a3c-bfbc-4975-ae73-2a33d414993d\") " pod="openstack/dnsmasq-dns-688b9f5b49-zn2q6"
Oct 11 01:13:15 crc kubenswrapper[4743]: I1011 01:13:15.076563 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0eee5a3c-bfbc-4975-ae73-2a33d414993d-ovsdbserver-sb\") pod \"dnsmasq-dns-688b9f5b49-zn2q6\" (UID: \"0eee5a3c-bfbc-4975-ae73-2a33d414993d\") " pod="openstack/dnsmasq-dns-688b9f5b49-zn2q6"
Oct 11 01:13:15 crc kubenswrapper[4743]: I1011 01:13:15.079344 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/503af092-8c8f-4bdb-a27c-a76f98794769-config-data\") pod \"heat-cfnapi-59cbbb87f4-s7qvc\" (UID: \"503af092-8c8f-4bdb-a27c-a76f98794769\") " pod="openstack/heat-cfnapi-59cbbb87f4-s7qvc"
Oct 11 01:13:15 crc kubenswrapper[4743]: I1011 01:13:15.080127 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8844a903-e7bc-4d03-902d-981c12d7875e-combined-ca-bundle\") pod \"heat-api-744f6c8d8b-rxjq4\" (UID: \"8844a903-e7bc-4d03-902d-981c12d7875e\") " pod="openstack/heat-api-744f6c8d8b-rxjq4"
Oct 11 01:13:15 crc kubenswrapper[4743]: I1011 01:13:15.081402 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/503af092-8c8f-4bdb-a27c-a76f98794769-combined-ca-bundle\") pod \"heat-cfnapi-59cbbb87f4-s7qvc\" (UID: \"503af092-8c8f-4bdb-a27c-a76f98794769\") " pod="openstack/heat-cfnapi-59cbbb87f4-s7qvc"
Oct 11 01:13:15 crc kubenswrapper[4743]: I1011 01:13:15.081993 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/503af092-8c8f-4bdb-a27c-a76f98794769-config-data-custom\") pod \"heat-cfnapi-59cbbb87f4-s7qvc\" (UID: \"503af092-8c8f-4bdb-a27c-a76f98794769\") " pod="openstack/heat-cfnapi-59cbbb87f4-s7qvc"
Oct 11 01:13:15 crc kubenswrapper[4743]: I1011 01:13:15.083935 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8844a903-e7bc-4d03-902d-981c12d7875e-config-data\") pod \"heat-api-744f6c8d8b-rxjq4\" (UID: \"8844a903-e7bc-4d03-902d-981c12d7875e\") " pod="openstack/heat-api-744f6c8d8b-rxjq4"
Oct 11 01:13:15 crc kubenswrapper[4743]: I1011 01:13:15.085035 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8844a903-e7bc-4d03-902d-981c12d7875e-config-data-custom\") pod \"heat-api-744f6c8d8b-rxjq4\" (UID: \"8844a903-e7bc-4d03-902d-981c12d7875e\") " pod="openstack/heat-api-744f6c8d8b-rxjq4"
Oct 11 01:13:15 crc kubenswrapper[4743]: I1011 01:13:15.097255 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tljpl\" (UniqueName: \"kubernetes.io/projected/0eee5a3c-bfbc-4975-ae73-2a33d414993d-kube-api-access-tljpl\") pod \"dnsmasq-dns-688b9f5b49-zn2q6\" (UID: \"0eee5a3c-bfbc-4975-ae73-2a33d414993d\") " pod="openstack/dnsmasq-dns-688b9f5b49-zn2q6"
Oct 11 01:13:15 crc kubenswrapper[4743]: I1011 01:13:15.103023 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4h7nt\" (UniqueName: \"kubernetes.io/projected/503af092-8c8f-4bdb-a27c-a76f98794769-kube-api-access-4h7nt\") pod \"heat-cfnapi-59cbbb87f4-s7qvc\" (UID: \"503af092-8c8f-4bdb-a27c-a76f98794769\") " pod="openstack/heat-cfnapi-59cbbb87f4-s7qvc"
Oct 11 01:13:15 crc kubenswrapper[4743]: I1011 01:13:15.103033 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mxtp\" (UniqueName: \"kubernetes.io/projected/8844a903-e7bc-4d03-902d-981c12d7875e-kube-api-access-7mxtp\") pod \"heat-api-744f6c8d8b-rxjq4\" (UID: \"8844a903-e7bc-4d03-902d-981c12d7875e\") " pod="openstack/heat-api-744f6c8d8b-rxjq4"
Oct 11 01:13:15 crc kubenswrapper[4743]: I1011 01:13:15.144183 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-59cbbb87f4-s7qvc"
Oct 11 01:13:15 crc kubenswrapper[4743]: I1011 01:13:15.216846 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-688b9f5b49-zn2q6"
Oct 11 01:13:15 crc kubenswrapper[4743]: I1011 01:13:15.218565 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-744f6c8d8b-rxjq4"
Oct 11 01:13:15 crc kubenswrapper[4743]: I1011 01:13:15.491607 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-6c678b5cf4-bmh48"]
Oct 11 01:13:15 crc kubenswrapper[4743]: I1011 01:13:15.714101 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-59cbbb87f4-s7qvc"]
Oct 11 01:13:15 crc kubenswrapper[4743]: I1011 01:13:15.865897 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-59cbbb87f4-s7qvc" event={"ID":"503af092-8c8f-4bdb-a27c-a76f98794769","Type":"ContainerStarted","Data":"e35cbe6a63ad222f2b8416f90ed48d8cf5026c126ea5ffb8289700c64a6fa046"}
Oct 11 01:13:15 crc kubenswrapper[4743]: I1011 01:13:15.881276 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6c678b5cf4-bmh48" event={"ID":"af4301e6-88e1-4694-85a7-1215badf534d","Type":"ContainerStarted","Data":"3377de58d0fbfefa741404c142433eaa5a579518b7b24367dcd7f35e82241895"}
Oct 11 01:13:15 crc kubenswrapper[4743]: I1011 01:13:15.925924 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-744f6c8d8b-rxjq4"]
Oct 11 01:13:15 crc kubenswrapper[4743]: I1011 01:13:15.938807 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-688b9f5b49-zn2q6"]
Oct 11 01:13:15 crc kubenswrapper[4743]: W1011 01:13:15.945461 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0eee5a3c_bfbc_4975_ae73_2a33d414993d.slice/crio-d81cc8604a270822001d29705df8201b8f9a2d7fc2322c78dba5267eb154f4ac WatchSource:0}: Error finding container d81cc8604a270822001d29705df8201b8f9a2d7fc2322c78dba5267eb154f4ac: Status 404 returned error can't find the container with id d81cc8604a270822001d29705df8201b8f9a2d7fc2322c78dba5267eb154f4ac
Oct 11 01:13:16 crc kubenswrapper[4743]: I1011 01:13:16.145927 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-89lgh"]
Oct 11 01:13:16 crc kubenswrapper[4743]: I1011 01:13:16.147569 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-89lgh"
Oct 11 01:13:16 crc kubenswrapper[4743]: I1011 01:13:16.152289 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Oct 11 01:13:16 crc kubenswrapper[4743]: I1011 01:13:16.152464 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts"
Oct 11 01:13:16 crc kubenswrapper[4743]: I1011 01:13:16.152579 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-7hvcg"
Oct 11 01:13:16 crc kubenswrapper[4743]: I1011 01:13:16.192567 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-89lgh"]
Oct 11 01:13:16 crc kubenswrapper[4743]: I1011 01:13:16.215999 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5wv5\" (UniqueName: \"kubernetes.io/projected/69e15d6c-dacd-4466-aac4-050cda6242aa-kube-api-access-b5wv5\") pod \"nova-cell0-conductor-db-sync-89lgh\" (UID: \"69e15d6c-dacd-4466-aac4-050cda6242aa\") " pod="openstack/nova-cell0-conductor-db-sync-89lgh"
Oct 11 01:13:16 crc kubenswrapper[4743]: I1011 01:13:16.216096 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69e15d6c-dacd-4466-aac4-050cda6242aa-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-89lgh\" (UID: \"69e15d6c-dacd-4466-aac4-050cda6242aa\") " pod="openstack/nova-cell0-conductor-db-sync-89lgh"
Oct 11 01:13:16 crc kubenswrapper[4743]: I1011 01:13:16.216152 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69e15d6c-dacd-4466-aac4-050cda6242aa-scripts\") pod \"nova-cell0-conductor-db-sync-89lgh\" (UID: \"69e15d6c-dacd-4466-aac4-050cda6242aa\") " pod="openstack/nova-cell0-conductor-db-sync-89lgh"
Oct 11 01:13:16 crc kubenswrapper[4743]: I1011 01:13:16.216284 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69e15d6c-dacd-4466-aac4-050cda6242aa-config-data\") pod \"nova-cell0-conductor-db-sync-89lgh\" (UID: \"69e15d6c-dacd-4466-aac4-050cda6242aa\") " pod="openstack/nova-cell0-conductor-db-sync-89lgh"
Oct 11 01:13:16 crc kubenswrapper[4743]: I1011 01:13:16.318212 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69e15d6c-dacd-4466-aac4-050cda6242aa-scripts\") pod \"nova-cell0-conductor-db-sync-89lgh\" (UID: \"69e15d6c-dacd-4466-aac4-050cda6242aa\") " pod="openstack/nova-cell0-conductor-db-sync-89lgh"
Oct 11 01:13:16 crc kubenswrapper[4743]: I1011 01:13:16.318371 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69e15d6c-dacd-4466-aac4-050cda6242aa-config-data\") pod \"nova-cell0-conductor-db-sync-89lgh\" (UID: \"69e15d6c-dacd-4466-aac4-050cda6242aa\") " pod="openstack/nova-cell0-conductor-db-sync-89lgh"
Oct 11 01:13:16 crc kubenswrapper[4743]: I1011 01:13:16.318441 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5wv5\" (UniqueName: \"kubernetes.io/projected/69e15d6c-dacd-4466-aac4-050cda6242aa-kube-api-access-b5wv5\") pod \"nova-cell0-conductor-db-sync-89lgh\" (UID: \"69e15d6c-dacd-4466-aac4-050cda6242aa\") " pod="openstack/nova-cell0-conductor-db-sync-89lgh"
Oct 11 01:13:16 crc kubenswrapper[4743]: I1011 01:13:16.318563 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69e15d6c-dacd-4466-aac4-050cda6242aa-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-89lgh\" (UID: \"69e15d6c-dacd-4466-aac4-050cda6242aa\") " pod="openstack/nova-cell0-conductor-db-sync-89lgh" Oct 11 01:13:16 crc kubenswrapper[4743]: I1011 01:13:16.323342 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69e15d6c-dacd-4466-aac4-050cda6242aa-scripts\") pod \"nova-cell0-conductor-db-sync-89lgh\" (UID: \"69e15d6c-dacd-4466-aac4-050cda6242aa\") " pod="openstack/nova-cell0-conductor-db-sync-89lgh" Oct 11 01:13:16 crc kubenswrapper[4743]: I1011 01:13:16.325387 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69e15d6c-dacd-4466-aac4-050cda6242aa-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-89lgh\" (UID: \"69e15d6c-dacd-4466-aac4-050cda6242aa\") " pod="openstack/nova-cell0-conductor-db-sync-89lgh" Oct 11 01:13:16 crc kubenswrapper[4743]: I1011 01:13:16.332359 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69e15d6c-dacd-4466-aac4-050cda6242aa-config-data\") pod \"nova-cell0-conductor-db-sync-89lgh\" (UID: \"69e15d6c-dacd-4466-aac4-050cda6242aa\") " pod="openstack/nova-cell0-conductor-db-sync-89lgh" Oct 11 01:13:16 crc kubenswrapper[4743]: I1011 01:13:16.335803 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5wv5\" (UniqueName: \"kubernetes.io/projected/69e15d6c-dacd-4466-aac4-050cda6242aa-kube-api-access-b5wv5\") pod \"nova-cell0-conductor-db-sync-89lgh\" (UID: \"69e15d6c-dacd-4466-aac4-050cda6242aa\") " pod="openstack/nova-cell0-conductor-db-sync-89lgh" Oct 11 01:13:16 crc 
kubenswrapper[4743]: I1011 01:13:16.480414 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-89lgh" Oct 11 01:13:16 crc kubenswrapper[4743]: I1011 01:13:16.918227 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-744f6c8d8b-rxjq4" event={"ID":"8844a903-e7bc-4d03-902d-981c12d7875e","Type":"ContainerStarted","Data":"a667b397d447ac68a7273de136d997ec6b3904a5891a652489326691fcd1ce7a"} Oct 11 01:13:16 crc kubenswrapper[4743]: I1011 01:13:16.935592 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6c678b5cf4-bmh48" event={"ID":"af4301e6-88e1-4694-85a7-1215badf534d","Type":"ContainerStarted","Data":"8c0dd7350e0076132704f385fde81eb45a4f8bb25fde0b3984cb36baadd16b1b"} Oct 11 01:13:16 crc kubenswrapper[4743]: I1011 01:13:16.935647 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-6c678b5cf4-bmh48" Oct 11 01:13:16 crc kubenswrapper[4743]: I1011 01:13:16.944706 4743 generic.go:334] "Generic (PLEG): container finished" podID="0eee5a3c-bfbc-4975-ae73-2a33d414993d" containerID="f7267109be097986e6cf7305fe5daa2a45df167979ef4f092249b0121c3614cb" exitCode=0 Oct 11 01:13:16 crc kubenswrapper[4743]: I1011 01:13:16.944739 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688b9f5b49-zn2q6" event={"ID":"0eee5a3c-bfbc-4975-ae73-2a33d414993d","Type":"ContainerDied","Data":"f7267109be097986e6cf7305fe5daa2a45df167979ef4f092249b0121c3614cb"} Oct 11 01:13:16 crc kubenswrapper[4743]: I1011 01:13:16.944760 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688b9f5b49-zn2q6" event={"ID":"0eee5a3c-bfbc-4975-ae73-2a33d414993d","Type":"ContainerStarted","Data":"d81cc8604a270822001d29705df8201b8f9a2d7fc2322c78dba5267eb154f4ac"} Oct 11 01:13:16 crc kubenswrapper[4743]: I1011 01:13:16.986406 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/heat-engine-6c678b5cf4-bmh48" podStartSLOduration=2.986388401 podStartE2EDuration="2.986388401s" podCreationTimestamp="2025-10-11 01:13:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 01:13:16.960505674 +0000 UTC m=+1291.613486071" watchObservedRunningTime="2025-10-11 01:13:16.986388401 +0000 UTC m=+1291.639368798" Oct 11 01:13:17 crc kubenswrapper[4743]: I1011 01:13:17.081777 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-89lgh"] Oct 11 01:13:17 crc kubenswrapper[4743]: I1011 01:13:17.958800 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-89lgh" event={"ID":"69e15d6c-dacd-4466-aac4-050cda6242aa","Type":"ContainerStarted","Data":"893e2c5dfbe80b1c351c0a5ff5017a062d0333b20e215236b783db626b9b27b9"} Oct 11 01:13:17 crc kubenswrapper[4743]: I1011 01:13:17.969447 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688b9f5b49-zn2q6" event={"ID":"0eee5a3c-bfbc-4975-ae73-2a33d414993d","Type":"ContainerStarted","Data":"a47dc85cfad70d954a488b40336b8c7e14e47decfbc6af22ee4cd25cfd7faf3a"} Oct 11 01:13:17 crc kubenswrapper[4743]: I1011 01:13:17.969499 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-688b9f5b49-zn2q6" Oct 11 01:13:17 crc kubenswrapper[4743]: I1011 01:13:17.990415 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-688b9f5b49-zn2q6" podStartSLOduration=3.990398436 podStartE2EDuration="3.990398436s" podCreationTimestamp="2025-10-11 01:13:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 01:13:17.989565066 +0000 UTC m=+1292.642545473" watchObservedRunningTime="2025-10-11 01:13:17.990398436 +0000 UTC m=+1292.643378833" Oct 11 
01:13:18 crc kubenswrapper[4743]: I1011 01:13:18.982474 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-744f6c8d8b-rxjq4" event={"ID":"8844a903-e7bc-4d03-902d-981c12d7875e","Type":"ContainerStarted","Data":"79931a3b1a2a3de0f75bbddd6835b3e060312b1568e64c19ce7de9f33d285b62"} Oct 11 01:13:18 crc kubenswrapper[4743]: I1011 01:13:18.982941 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-744f6c8d8b-rxjq4" Oct 11 01:13:18 crc kubenswrapper[4743]: I1011 01:13:18.990673 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-59cbbb87f4-s7qvc" event={"ID":"503af092-8c8f-4bdb-a27c-a76f98794769","Type":"ContainerStarted","Data":"431a075a88c2da6003e5c1ad5469b0301c72d0144d978d2f1c306e7f3284e882"} Oct 11 01:13:18 crc kubenswrapper[4743]: I1011 01:13:18.997694 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-744f6c8d8b-rxjq4" podStartSLOduration=2.427279623 podStartE2EDuration="4.997678065s" podCreationTimestamp="2025-10-11 01:13:14 +0000 UTC" firstStartedPulling="2025-10-11 01:13:15.937233446 +0000 UTC m=+1290.590213843" lastFinishedPulling="2025-10-11 01:13:18.507631888 +0000 UTC m=+1293.160612285" observedRunningTime="2025-10-11 01:13:18.9970146 +0000 UTC m=+1293.649994997" watchObservedRunningTime="2025-10-11 01:13:18.997678065 +0000 UTC m=+1293.650658462" Oct 11 01:13:19 crc kubenswrapper[4743]: I1011 01:13:19.020544 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-59cbbb87f4-s7qvc" podStartSLOduration=2.254220207 podStartE2EDuration="5.020524941s" podCreationTimestamp="2025-10-11 01:13:14 +0000 UTC" firstStartedPulling="2025-10-11 01:13:15.735045319 +0000 UTC m=+1290.388025716" lastFinishedPulling="2025-10-11 01:13:18.501350053 +0000 UTC m=+1293.154330450" observedRunningTime="2025-10-11 01:13:19.016443497 +0000 UTC m=+1293.669423894" watchObservedRunningTime="2025-10-11 
01:13:19.020524941 +0000 UTC m=+1293.673505338" Oct 11 01:13:19 crc kubenswrapper[4743]: I1011 01:13:19.184587 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 11 01:13:19 crc kubenswrapper[4743]: I1011 01:13:19.184870 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d" containerName="ceilometer-central-agent" containerID="cri-o://9d69be4229595b4449ea06e4b1ea8db3911c5ab0dcdb6905b1b450714646acbf" gracePeriod=30 Oct 11 01:13:19 crc kubenswrapper[4743]: I1011 01:13:19.185058 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d" containerName="ceilometer-notification-agent" containerID="cri-o://c030322bf3f9d22192cda6219d287c61f6d4b397687de7b0ba1b5fade85a26fd" gracePeriod=30 Oct 11 01:13:19 crc kubenswrapper[4743]: I1011 01:13:19.185073 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d" containerName="proxy-httpd" containerID="cri-o://2fa637cdca5ec7b8ba10e3f5500c78d59f8d3fc766b66b9206dac1eabf945d23" gracePeriod=30 Oct 11 01:13:19 crc kubenswrapper[4743]: I1011 01:13:19.185143 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d" containerName="sg-core" containerID="cri-o://ca00a88290bf7e8f01c0cf70fd87a61cb2ad734ca7dc7c0269e108c52d00faa2" gracePeriod=30 Oct 11 01:13:20 crc kubenswrapper[4743]: I1011 01:13:20.008796 4743 generic.go:334] "Generic (PLEG): container finished" podID="1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d" containerID="2fa637cdca5ec7b8ba10e3f5500c78d59f8d3fc766b66b9206dac1eabf945d23" exitCode=0 Oct 11 01:13:20 crc kubenswrapper[4743]: I1011 01:13:20.009239 4743 generic.go:334] "Generic (PLEG): container finished" 
podID="1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d" containerID="ca00a88290bf7e8f01c0cf70fd87a61cb2ad734ca7dc7c0269e108c52d00faa2" exitCode=2 Oct 11 01:13:20 crc kubenswrapper[4743]: I1011 01:13:20.009250 4743 generic.go:334] "Generic (PLEG): container finished" podID="1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d" containerID="c030322bf3f9d22192cda6219d287c61f6d4b397687de7b0ba1b5fade85a26fd" exitCode=0 Oct 11 01:13:20 crc kubenswrapper[4743]: I1011 01:13:20.009260 4743 generic.go:334] "Generic (PLEG): container finished" podID="1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d" containerID="9d69be4229595b4449ea06e4b1ea8db3911c5ab0dcdb6905b1b450714646acbf" exitCode=0 Oct 11 01:13:20 crc kubenswrapper[4743]: I1011 01:13:20.010123 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d","Type":"ContainerDied","Data":"2fa637cdca5ec7b8ba10e3f5500c78d59f8d3fc766b66b9206dac1eabf945d23"} Oct 11 01:13:20 crc kubenswrapper[4743]: I1011 01:13:20.010236 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d","Type":"ContainerDied","Data":"ca00a88290bf7e8f01c0cf70fd87a61cb2ad734ca7dc7c0269e108c52d00faa2"} Oct 11 01:13:20 crc kubenswrapper[4743]: I1011 01:13:20.010252 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d","Type":"ContainerDied","Data":"c030322bf3f9d22192cda6219d287c61f6d4b397687de7b0ba1b5fade85a26fd"} Oct 11 01:13:20 crc kubenswrapper[4743]: I1011 01:13:20.010287 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d","Type":"ContainerDied","Data":"9d69be4229595b4449ea06e4b1ea8db3911c5ab0dcdb6905b1b450714646acbf"} Oct 11 01:13:20 crc kubenswrapper[4743]: I1011 01:13:20.010854 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/heat-cfnapi-59cbbb87f4-s7qvc" Oct 11 01:13:20 crc kubenswrapper[4743]: I1011 01:13:20.165400 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 11 01:13:20 crc kubenswrapper[4743]: I1011 01:13:20.221027 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d-sg-core-conf-yaml\") pod \"1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d\" (UID: \"1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d\") " Oct 11 01:13:20 crc kubenswrapper[4743]: I1011 01:13:20.221168 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d-config-data\") pod \"1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d\" (UID: \"1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d\") " Oct 11 01:13:20 crc kubenswrapper[4743]: I1011 01:13:20.221368 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d-run-httpd\") pod \"1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d\" (UID: \"1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d\") " Oct 11 01:13:20 crc kubenswrapper[4743]: I1011 01:13:20.221423 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fq69m\" (UniqueName: \"kubernetes.io/projected/1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d-kube-api-access-fq69m\") pod \"1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d\" (UID: \"1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d\") " Oct 11 01:13:20 crc kubenswrapper[4743]: I1011 01:13:20.221489 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d-scripts\") pod \"1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d\" (UID: \"1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d\") " Oct 11 01:13:20 crc 
kubenswrapper[4743]: I1011 01:13:20.221586 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d-combined-ca-bundle\") pod \"1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d\" (UID: \"1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d\") " Oct 11 01:13:20 crc kubenswrapper[4743]: I1011 01:13:20.221681 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d-log-httpd\") pod \"1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d\" (UID: \"1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d\") " Oct 11 01:13:20 crc kubenswrapper[4743]: I1011 01:13:20.222741 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d" (UID: "1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:13:20 crc kubenswrapper[4743]: I1011 01:13:20.231636 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d" (UID: "1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:13:20 crc kubenswrapper[4743]: I1011 01:13:20.235521 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d-scripts" (OuterVolumeSpecName: "scripts") pod "1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d" (UID: "1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:13:20 crc kubenswrapper[4743]: I1011 01:13:20.251037 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d-kube-api-access-fq69m" (OuterVolumeSpecName: "kube-api-access-fq69m") pod "1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d" (UID: "1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d"). InnerVolumeSpecName "kube-api-access-fq69m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:13:20 crc kubenswrapper[4743]: I1011 01:13:20.281472 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d" (UID: "1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:13:20 crc kubenswrapper[4743]: I1011 01:13:20.325516 4743 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 11 01:13:20 crc kubenswrapper[4743]: I1011 01:13:20.325547 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fq69m\" (UniqueName: \"kubernetes.io/projected/1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d-kube-api-access-fq69m\") on node \"crc\" DevicePath \"\"" Oct 11 01:13:20 crc kubenswrapper[4743]: I1011 01:13:20.325559 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d-scripts\") on node \"crc\" DevicePath \"\"" Oct 11 01:13:20 crc kubenswrapper[4743]: I1011 01:13:20.325566 4743 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d-log-httpd\") on node \"crc\" 
DevicePath \"\"" Oct 11 01:13:20 crc kubenswrapper[4743]: I1011 01:13:20.325574 4743 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 11 01:13:20 crc kubenswrapper[4743]: I1011 01:13:20.379035 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d" (UID: "1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:13:20 crc kubenswrapper[4743]: I1011 01:13:20.418549 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d-config-data" (OuterVolumeSpecName: "config-data") pod "1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d" (UID: "1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:13:20 crc kubenswrapper[4743]: I1011 01:13:20.429798 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 01:13:20 crc kubenswrapper[4743]: I1011 01:13:20.429832 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d-config-data\") on node \"crc\" DevicePath \"\"" Oct 11 01:13:21 crc kubenswrapper[4743]: I1011 01:13:21.023965 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d","Type":"ContainerDied","Data":"bbc46afe7a604db04d4dab09cc6412577995db755909c365552311c78140a1eb"} Oct 11 01:13:21 crc kubenswrapper[4743]: I1011 01:13:21.024260 4743 scope.go:117] "RemoveContainer" containerID="2fa637cdca5ec7b8ba10e3f5500c78d59f8d3fc766b66b9206dac1eabf945d23" Oct 11 01:13:21 crc kubenswrapper[4743]: I1011 01:13:21.024032 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 11 01:13:21 crc kubenswrapper[4743]: I1011 01:13:21.059366 4743 scope.go:117] "RemoveContainer" containerID="ca00a88290bf7e8f01c0cf70fd87a61cb2ad734ca7dc7c0269e108c52d00faa2" Oct 11 01:13:21 crc kubenswrapper[4743]: I1011 01:13:21.059570 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 11 01:13:21 crc kubenswrapper[4743]: I1011 01:13:21.067783 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 11 01:13:21 crc kubenswrapper[4743]: I1011 01:13:21.083843 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 11 01:13:21 crc kubenswrapper[4743]: E1011 01:13:21.084290 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d" containerName="ceilometer-central-agent" Oct 11 01:13:21 crc kubenswrapper[4743]: I1011 01:13:21.084309 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d" containerName="ceilometer-central-agent" Oct 11 01:13:21 crc kubenswrapper[4743]: E1011 01:13:21.084333 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d" containerName="ceilometer-notification-agent" Oct 11 01:13:21 crc kubenswrapper[4743]: I1011 01:13:21.084341 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d" containerName="ceilometer-notification-agent" Oct 11 01:13:21 crc kubenswrapper[4743]: E1011 01:13:21.084352 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d" containerName="sg-core" Oct 11 01:13:21 crc kubenswrapper[4743]: I1011 01:13:21.084358 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d" containerName="sg-core" Oct 11 01:13:21 crc kubenswrapper[4743]: E1011 01:13:21.084390 4743 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d" containerName="proxy-httpd" Oct 11 01:13:21 crc kubenswrapper[4743]: I1011 01:13:21.084398 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d" containerName="proxy-httpd" Oct 11 01:13:21 crc kubenswrapper[4743]: I1011 01:13:21.084593 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d" containerName="ceilometer-notification-agent" Oct 11 01:13:21 crc kubenswrapper[4743]: I1011 01:13:21.084615 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d" containerName="proxy-httpd" Oct 11 01:13:21 crc kubenswrapper[4743]: I1011 01:13:21.084627 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d" containerName="sg-core" Oct 11 01:13:21 crc kubenswrapper[4743]: I1011 01:13:21.084641 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d" containerName="ceilometer-central-agent" Oct 11 01:13:21 crc kubenswrapper[4743]: I1011 01:13:21.087138 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 11 01:13:21 crc kubenswrapper[4743]: I1011 01:13:21.108653 4743 scope.go:117] "RemoveContainer" containerID="c030322bf3f9d22192cda6219d287c61f6d4b397687de7b0ba1b5fade85a26fd" Oct 11 01:13:21 crc kubenswrapper[4743]: I1011 01:13:21.110238 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 11 01:13:21 crc kubenswrapper[4743]: I1011 01:13:21.110419 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 11 01:13:21 crc kubenswrapper[4743]: I1011 01:13:21.113782 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 11 01:13:21 crc kubenswrapper[4743]: I1011 01:13:21.156248 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f-config-data\") pod \"ceilometer-0\" (UID: \"8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f\") " pod="openstack/ceilometer-0" Oct 11 01:13:21 crc kubenswrapper[4743]: I1011 01:13:21.156315 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f-scripts\") pod \"ceilometer-0\" (UID: \"8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f\") " pod="openstack/ceilometer-0" Oct 11 01:13:21 crc kubenswrapper[4743]: I1011 01:13:21.156337 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f\") " pod="openstack/ceilometer-0" Oct 11 01:13:21 crc kubenswrapper[4743]: I1011 01:13:21.156350 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f-run-httpd\") pod \"ceilometer-0\" (UID: \"8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f\") " pod="openstack/ceilometer-0" Oct 11 01:13:21 crc kubenswrapper[4743]: I1011 01:13:21.156390 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f\") " pod="openstack/ceilometer-0" Oct 11 01:13:21 crc kubenswrapper[4743]: I1011 01:13:21.156418 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhvs5\" (UniqueName: \"kubernetes.io/projected/8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f-kube-api-access-dhvs5\") pod \"ceilometer-0\" (UID: \"8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f\") " pod="openstack/ceilometer-0" Oct 11 01:13:21 crc kubenswrapper[4743]: I1011 01:13:21.156436 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f-log-httpd\") pod \"ceilometer-0\" (UID: \"8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f\") " pod="openstack/ceilometer-0" Oct 11 01:13:21 crc kubenswrapper[4743]: I1011 01:13:21.200140 4743 scope.go:117] "RemoveContainer" containerID="9d69be4229595b4449ea06e4b1ea8db3911c5ab0dcdb6905b1b450714646acbf" Oct 11 01:13:21 crc kubenswrapper[4743]: I1011 01:13:21.280888 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhvs5\" (UniqueName: \"kubernetes.io/projected/8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f-kube-api-access-dhvs5\") pod \"ceilometer-0\" (UID: \"8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f\") " pod="openstack/ceilometer-0" Oct 11 01:13:21 crc kubenswrapper[4743]: I1011 01:13:21.280943 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f-log-httpd\") pod \"ceilometer-0\" (UID: \"8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f\") " pod="openstack/ceilometer-0" Oct 11 01:13:21 crc kubenswrapper[4743]: I1011 01:13:21.281034 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f-config-data\") pod \"ceilometer-0\" (UID: \"8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f\") " pod="openstack/ceilometer-0" Oct 11 01:13:21 crc kubenswrapper[4743]: I1011 01:13:21.281089 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f-scripts\") pod \"ceilometer-0\" (UID: \"8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f\") " pod="openstack/ceilometer-0" Oct 11 01:13:21 crc kubenswrapper[4743]: I1011 01:13:21.281107 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f\") " pod="openstack/ceilometer-0" Oct 11 01:13:21 crc kubenswrapper[4743]: I1011 01:13:21.281122 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f-run-httpd\") pod \"ceilometer-0\" (UID: \"8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f\") " pod="openstack/ceilometer-0" Oct 11 01:13:21 crc kubenswrapper[4743]: I1011 01:13:21.281165 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f\") " 
pod="openstack/ceilometer-0" Oct 11 01:13:21 crc kubenswrapper[4743]: I1011 01:13:21.297491 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f-log-httpd\") pod \"ceilometer-0\" (UID: \"8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f\") " pod="openstack/ceilometer-0" Oct 11 01:13:21 crc kubenswrapper[4743]: I1011 01:13:21.298596 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f-run-httpd\") pod \"ceilometer-0\" (UID: \"8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f\") " pod="openstack/ceilometer-0" Oct 11 01:13:21 crc kubenswrapper[4743]: I1011 01:13:21.303784 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f-config-data\") pod \"ceilometer-0\" (UID: \"8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f\") " pod="openstack/ceilometer-0" Oct 11 01:13:21 crc kubenswrapper[4743]: I1011 01:13:21.309293 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f\") " pod="openstack/ceilometer-0" Oct 11 01:13:21 crc kubenswrapper[4743]: I1011 01:13:21.315438 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f\") " pod="openstack/ceilometer-0" Oct 11 01:13:21 crc kubenswrapper[4743]: I1011 01:13:21.315948 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f-scripts\") pod 
\"ceilometer-0\" (UID: \"8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f\") " pod="openstack/ceilometer-0" Oct 11 01:13:21 crc kubenswrapper[4743]: I1011 01:13:21.326488 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhvs5\" (UniqueName: \"kubernetes.io/projected/8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f-kube-api-access-dhvs5\") pod \"ceilometer-0\" (UID: \"8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f\") " pod="openstack/ceilometer-0" Oct 11 01:13:21 crc kubenswrapper[4743]: I1011 01:13:21.407417 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 11 01:13:21 crc kubenswrapper[4743]: I1011 01:13:21.934057 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 11 01:13:21 crc kubenswrapper[4743]: W1011 01:13:21.938582 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a5ec0fc_3819_4bbf_81d8_ef9d01e2b96f.slice/crio-9636c2404d3e927d58328feea44f3e5ec43ec692622c74b9e7c585561b13cd92 WatchSource:0}: Error finding container 9636c2404d3e927d58328feea44f3e5ec43ec692622c74b9e7c585561b13cd92: Status 404 returned error can't find the container with id 9636c2404d3e927d58328feea44f3e5ec43ec692622c74b9e7c585561b13cd92 Oct 11 01:13:22 crc kubenswrapper[4743]: I1011 01:13:22.038539 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f","Type":"ContainerStarted","Data":"9636c2404d3e927d58328feea44f3e5ec43ec692622c74b9e7c585561b13cd92"} Oct 11 01:13:22 crc kubenswrapper[4743]: I1011 01:13:22.103801 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d" path="/var/lib/kubelet/pods/1d6e69fe-90ee-4a77-81e9-b5ec5d864d9d/volumes" Oct 11 01:13:22 crc kubenswrapper[4743]: I1011 01:13:22.240463 4743 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/heat-engine-fdd7c75fc-rtmvl"] Oct 11 01:13:22 crc kubenswrapper[4743]: I1011 01:13:22.242335 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-fdd7c75fc-rtmvl" Oct 11 01:13:22 crc kubenswrapper[4743]: I1011 01:13:22.248690 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-fdd7c75fc-rtmvl"] Oct 11 01:13:22 crc kubenswrapper[4743]: I1011 01:13:22.307238 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-5dc9695786-kqfsr"] Oct 11 01:13:22 crc kubenswrapper[4743]: I1011 01:13:22.308668 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5dc9695786-kqfsr" Oct 11 01:13:22 crc kubenswrapper[4743]: I1011 01:13:22.317553 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-5dbbddbc6d-2nprx"] Oct 11 01:13:22 crc kubenswrapper[4743]: I1011 01:13:22.319006 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-5dbbddbc6d-2nprx" Oct 11 01:13:22 crc kubenswrapper[4743]: I1011 01:13:22.328050 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-5dc9695786-kqfsr"] Oct 11 01:13:22 crc kubenswrapper[4743]: I1011 01:13:22.341033 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-5dbbddbc6d-2nprx"] Oct 11 01:13:22 crc kubenswrapper[4743]: I1011 01:13:22.412987 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0b5e85f-5e3b-4230-861b-f124e542d8db-config-data\") pod \"heat-cfnapi-5dbbddbc6d-2nprx\" (UID: \"e0b5e85f-5e3b-4230-861b-f124e542d8db\") " pod="openstack/heat-cfnapi-5dbbddbc6d-2nprx" Oct 11 01:13:22 crc kubenswrapper[4743]: I1011 01:13:22.413043 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5e39acaa-0992-471f-a015-46714fde82cf-config-data-custom\") pod \"heat-engine-fdd7c75fc-rtmvl\" (UID: \"5e39acaa-0992-471f-a015-46714fde82cf\") " pod="openstack/heat-engine-fdd7c75fc-rtmvl" Oct 11 01:13:22 crc kubenswrapper[4743]: I1011 01:13:22.413069 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e39acaa-0992-471f-a015-46714fde82cf-config-data\") pod \"heat-engine-fdd7c75fc-rtmvl\" (UID: \"5e39acaa-0992-471f-a015-46714fde82cf\") " pod="openstack/heat-engine-fdd7c75fc-rtmvl" Oct 11 01:13:22 crc kubenswrapper[4743]: I1011 01:13:22.413103 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/af1f59ec-ca6c-4b6b-afb9-c8e6dc21e52d-config-data-custom\") pod \"heat-api-5dc9695786-kqfsr\" (UID: \"af1f59ec-ca6c-4b6b-afb9-c8e6dc21e52d\") " 
pod="openstack/heat-api-5dc9695786-kqfsr" Oct 11 01:13:22 crc kubenswrapper[4743]: I1011 01:13:22.413197 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e0b5e85f-5e3b-4230-861b-f124e542d8db-config-data-custom\") pod \"heat-cfnapi-5dbbddbc6d-2nprx\" (UID: \"e0b5e85f-5e3b-4230-861b-f124e542d8db\") " pod="openstack/heat-cfnapi-5dbbddbc6d-2nprx" Oct 11 01:13:22 crc kubenswrapper[4743]: I1011 01:13:22.413397 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af1f59ec-ca6c-4b6b-afb9-c8e6dc21e52d-combined-ca-bundle\") pod \"heat-api-5dc9695786-kqfsr\" (UID: \"af1f59ec-ca6c-4b6b-afb9-c8e6dc21e52d\") " pod="openstack/heat-api-5dc9695786-kqfsr" Oct 11 01:13:22 crc kubenswrapper[4743]: I1011 01:13:22.413542 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e39acaa-0992-471f-a015-46714fde82cf-combined-ca-bundle\") pod \"heat-engine-fdd7c75fc-rtmvl\" (UID: \"5e39acaa-0992-471f-a015-46714fde82cf\") " pod="openstack/heat-engine-fdd7c75fc-rtmvl" Oct 11 01:13:22 crc kubenswrapper[4743]: I1011 01:13:22.413580 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af1f59ec-ca6c-4b6b-afb9-c8e6dc21e52d-config-data\") pod \"heat-api-5dc9695786-kqfsr\" (UID: \"af1f59ec-ca6c-4b6b-afb9-c8e6dc21e52d\") " pod="openstack/heat-api-5dc9695786-kqfsr" Oct 11 01:13:22 crc kubenswrapper[4743]: I1011 01:13:22.413695 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvql9\" (UniqueName: \"kubernetes.io/projected/e0b5e85f-5e3b-4230-861b-f124e542d8db-kube-api-access-xvql9\") pod \"heat-cfnapi-5dbbddbc6d-2nprx\" (UID: 
\"e0b5e85f-5e3b-4230-861b-f124e542d8db\") " pod="openstack/heat-cfnapi-5dbbddbc6d-2nprx" Oct 11 01:13:22 crc kubenswrapper[4743]: I1011 01:13:22.413795 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qd48r\" (UniqueName: \"kubernetes.io/projected/5e39acaa-0992-471f-a015-46714fde82cf-kube-api-access-qd48r\") pod \"heat-engine-fdd7c75fc-rtmvl\" (UID: \"5e39acaa-0992-471f-a015-46714fde82cf\") " pod="openstack/heat-engine-fdd7c75fc-rtmvl" Oct 11 01:13:22 crc kubenswrapper[4743]: I1011 01:13:22.413815 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0b5e85f-5e3b-4230-861b-f124e542d8db-combined-ca-bundle\") pod \"heat-cfnapi-5dbbddbc6d-2nprx\" (UID: \"e0b5e85f-5e3b-4230-861b-f124e542d8db\") " pod="openstack/heat-cfnapi-5dbbddbc6d-2nprx" Oct 11 01:13:22 crc kubenswrapper[4743]: I1011 01:13:22.413850 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5p2q\" (UniqueName: \"kubernetes.io/projected/af1f59ec-ca6c-4b6b-afb9-c8e6dc21e52d-kube-api-access-g5p2q\") pod \"heat-api-5dc9695786-kqfsr\" (UID: \"af1f59ec-ca6c-4b6b-afb9-c8e6dc21e52d\") " pod="openstack/heat-api-5dc9695786-kqfsr" Oct 11 01:13:22 crc kubenswrapper[4743]: I1011 01:13:22.515255 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvql9\" (UniqueName: \"kubernetes.io/projected/e0b5e85f-5e3b-4230-861b-f124e542d8db-kube-api-access-xvql9\") pod \"heat-cfnapi-5dbbddbc6d-2nprx\" (UID: \"e0b5e85f-5e3b-4230-861b-f124e542d8db\") " pod="openstack/heat-cfnapi-5dbbddbc6d-2nprx" Oct 11 01:13:22 crc kubenswrapper[4743]: I1011 01:13:22.515317 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qd48r\" (UniqueName: 
\"kubernetes.io/projected/5e39acaa-0992-471f-a015-46714fde82cf-kube-api-access-qd48r\") pod \"heat-engine-fdd7c75fc-rtmvl\" (UID: \"5e39acaa-0992-471f-a015-46714fde82cf\") " pod="openstack/heat-engine-fdd7c75fc-rtmvl" Oct 11 01:13:22 crc kubenswrapper[4743]: I1011 01:13:22.515340 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0b5e85f-5e3b-4230-861b-f124e542d8db-combined-ca-bundle\") pod \"heat-cfnapi-5dbbddbc6d-2nprx\" (UID: \"e0b5e85f-5e3b-4230-861b-f124e542d8db\") " pod="openstack/heat-cfnapi-5dbbddbc6d-2nprx" Oct 11 01:13:22 crc kubenswrapper[4743]: I1011 01:13:22.515357 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5p2q\" (UniqueName: \"kubernetes.io/projected/af1f59ec-ca6c-4b6b-afb9-c8e6dc21e52d-kube-api-access-g5p2q\") pod \"heat-api-5dc9695786-kqfsr\" (UID: \"af1f59ec-ca6c-4b6b-afb9-c8e6dc21e52d\") " pod="openstack/heat-api-5dc9695786-kqfsr" Oct 11 01:13:22 crc kubenswrapper[4743]: I1011 01:13:22.515393 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0b5e85f-5e3b-4230-861b-f124e542d8db-config-data\") pod \"heat-cfnapi-5dbbddbc6d-2nprx\" (UID: \"e0b5e85f-5e3b-4230-861b-f124e542d8db\") " pod="openstack/heat-cfnapi-5dbbddbc6d-2nprx" Oct 11 01:13:22 crc kubenswrapper[4743]: I1011 01:13:22.515423 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5e39acaa-0992-471f-a015-46714fde82cf-config-data-custom\") pod \"heat-engine-fdd7c75fc-rtmvl\" (UID: \"5e39acaa-0992-471f-a015-46714fde82cf\") " pod="openstack/heat-engine-fdd7c75fc-rtmvl" Oct 11 01:13:22 crc kubenswrapper[4743]: I1011 01:13:22.515446 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/5e39acaa-0992-471f-a015-46714fde82cf-config-data\") pod \"heat-engine-fdd7c75fc-rtmvl\" (UID: \"5e39acaa-0992-471f-a015-46714fde82cf\") " pod="openstack/heat-engine-fdd7c75fc-rtmvl" Oct 11 01:13:22 crc kubenswrapper[4743]: I1011 01:13:22.515476 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/af1f59ec-ca6c-4b6b-afb9-c8e6dc21e52d-config-data-custom\") pod \"heat-api-5dc9695786-kqfsr\" (UID: \"af1f59ec-ca6c-4b6b-afb9-c8e6dc21e52d\") " pod="openstack/heat-api-5dc9695786-kqfsr" Oct 11 01:13:22 crc kubenswrapper[4743]: I1011 01:13:22.515536 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e0b5e85f-5e3b-4230-861b-f124e542d8db-config-data-custom\") pod \"heat-cfnapi-5dbbddbc6d-2nprx\" (UID: \"e0b5e85f-5e3b-4230-861b-f124e542d8db\") " pod="openstack/heat-cfnapi-5dbbddbc6d-2nprx" Oct 11 01:13:22 crc kubenswrapper[4743]: I1011 01:13:22.515572 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af1f59ec-ca6c-4b6b-afb9-c8e6dc21e52d-combined-ca-bundle\") pod \"heat-api-5dc9695786-kqfsr\" (UID: \"af1f59ec-ca6c-4b6b-afb9-c8e6dc21e52d\") " pod="openstack/heat-api-5dc9695786-kqfsr" Oct 11 01:13:22 crc kubenswrapper[4743]: I1011 01:13:22.515608 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e39acaa-0992-471f-a015-46714fde82cf-combined-ca-bundle\") pod \"heat-engine-fdd7c75fc-rtmvl\" (UID: \"5e39acaa-0992-471f-a015-46714fde82cf\") " pod="openstack/heat-engine-fdd7c75fc-rtmvl" Oct 11 01:13:22 crc kubenswrapper[4743]: I1011 01:13:22.515629 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/af1f59ec-ca6c-4b6b-afb9-c8e6dc21e52d-config-data\") pod \"heat-api-5dc9695786-kqfsr\" (UID: \"af1f59ec-ca6c-4b6b-afb9-c8e6dc21e52d\") " pod="openstack/heat-api-5dc9695786-kqfsr" Oct 11 01:13:22 crc kubenswrapper[4743]: I1011 01:13:22.519236 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5e39acaa-0992-471f-a015-46714fde82cf-config-data-custom\") pod \"heat-engine-fdd7c75fc-rtmvl\" (UID: \"5e39acaa-0992-471f-a015-46714fde82cf\") " pod="openstack/heat-engine-fdd7c75fc-rtmvl" Oct 11 01:13:22 crc kubenswrapper[4743]: I1011 01:13:22.521194 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/af1f59ec-ca6c-4b6b-afb9-c8e6dc21e52d-config-data-custom\") pod \"heat-api-5dc9695786-kqfsr\" (UID: \"af1f59ec-ca6c-4b6b-afb9-c8e6dc21e52d\") " pod="openstack/heat-api-5dc9695786-kqfsr" Oct 11 01:13:22 crc kubenswrapper[4743]: I1011 01:13:22.521402 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0b5e85f-5e3b-4230-861b-f124e542d8db-combined-ca-bundle\") pod \"heat-cfnapi-5dbbddbc6d-2nprx\" (UID: \"e0b5e85f-5e3b-4230-861b-f124e542d8db\") " pod="openstack/heat-cfnapi-5dbbddbc6d-2nprx" Oct 11 01:13:22 crc kubenswrapper[4743]: I1011 01:13:22.521750 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e39acaa-0992-471f-a015-46714fde82cf-combined-ca-bundle\") pod \"heat-engine-fdd7c75fc-rtmvl\" (UID: \"5e39acaa-0992-471f-a015-46714fde82cf\") " pod="openstack/heat-engine-fdd7c75fc-rtmvl" Oct 11 01:13:22 crc kubenswrapper[4743]: I1011 01:13:22.525548 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af1f59ec-ca6c-4b6b-afb9-c8e6dc21e52d-config-data\") pod 
\"heat-api-5dc9695786-kqfsr\" (UID: \"af1f59ec-ca6c-4b6b-afb9-c8e6dc21e52d\") " pod="openstack/heat-api-5dc9695786-kqfsr" Oct 11 01:13:22 crc kubenswrapper[4743]: I1011 01:13:22.529219 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e0b5e85f-5e3b-4230-861b-f124e542d8db-config-data-custom\") pod \"heat-cfnapi-5dbbddbc6d-2nprx\" (UID: \"e0b5e85f-5e3b-4230-861b-f124e542d8db\") " pod="openstack/heat-cfnapi-5dbbddbc6d-2nprx" Oct 11 01:13:22 crc kubenswrapper[4743]: I1011 01:13:22.531306 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0b5e85f-5e3b-4230-861b-f124e542d8db-config-data\") pod \"heat-cfnapi-5dbbddbc6d-2nprx\" (UID: \"e0b5e85f-5e3b-4230-861b-f124e542d8db\") " pod="openstack/heat-cfnapi-5dbbddbc6d-2nprx" Oct 11 01:13:22 crc kubenswrapper[4743]: I1011 01:13:22.533553 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvql9\" (UniqueName: \"kubernetes.io/projected/e0b5e85f-5e3b-4230-861b-f124e542d8db-kube-api-access-xvql9\") pod \"heat-cfnapi-5dbbddbc6d-2nprx\" (UID: \"e0b5e85f-5e3b-4230-861b-f124e542d8db\") " pod="openstack/heat-cfnapi-5dbbddbc6d-2nprx" Oct 11 01:13:22 crc kubenswrapper[4743]: I1011 01:13:22.534396 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e39acaa-0992-471f-a015-46714fde82cf-config-data\") pod \"heat-engine-fdd7c75fc-rtmvl\" (UID: \"5e39acaa-0992-471f-a015-46714fde82cf\") " pod="openstack/heat-engine-fdd7c75fc-rtmvl" Oct 11 01:13:22 crc kubenswrapper[4743]: I1011 01:13:22.536731 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af1f59ec-ca6c-4b6b-afb9-c8e6dc21e52d-combined-ca-bundle\") pod \"heat-api-5dc9695786-kqfsr\" (UID: \"af1f59ec-ca6c-4b6b-afb9-c8e6dc21e52d\") " 
pod="openstack/heat-api-5dc9695786-kqfsr" Oct 11 01:13:22 crc kubenswrapper[4743]: I1011 01:13:22.541542 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qd48r\" (UniqueName: \"kubernetes.io/projected/5e39acaa-0992-471f-a015-46714fde82cf-kube-api-access-qd48r\") pod \"heat-engine-fdd7c75fc-rtmvl\" (UID: \"5e39acaa-0992-471f-a015-46714fde82cf\") " pod="openstack/heat-engine-fdd7c75fc-rtmvl" Oct 11 01:13:22 crc kubenswrapper[4743]: I1011 01:13:22.542545 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5p2q\" (UniqueName: \"kubernetes.io/projected/af1f59ec-ca6c-4b6b-afb9-c8e6dc21e52d-kube-api-access-g5p2q\") pod \"heat-api-5dc9695786-kqfsr\" (UID: \"af1f59ec-ca6c-4b6b-afb9-c8e6dc21e52d\") " pod="openstack/heat-api-5dc9695786-kqfsr" Oct 11 01:13:22 crc kubenswrapper[4743]: I1011 01:13:22.591917 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-fdd7c75fc-rtmvl" Oct 11 01:13:22 crc kubenswrapper[4743]: I1011 01:13:22.633365 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5dc9695786-kqfsr" Oct 11 01:13:22 crc kubenswrapper[4743]: I1011 01:13:22.657757 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-5dbbddbc6d-2nprx" Oct 11 01:13:23 crc kubenswrapper[4743]: I1011 01:13:23.059338 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-fdd7c75fc-rtmvl"] Oct 11 01:13:23 crc kubenswrapper[4743]: I1011 01:13:23.068736 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f","Type":"ContainerStarted","Data":"a408fbe7dcffa686789536eab5166c077dac6c7b88aa79bf50345b509e3f9f06"} Oct 11 01:13:23 crc kubenswrapper[4743]: I1011 01:13:23.240933 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-5dc9695786-kqfsr"] Oct 11 01:13:23 crc kubenswrapper[4743]: I1011 01:13:23.252694 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-5dbbddbc6d-2nprx"] Oct 11 01:13:23 crc kubenswrapper[4743]: W1011 01:13:23.252982 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0b5e85f_5e3b_4230_861b_f124e542d8db.slice/crio-0516aafc275b32bb280cf6d320b07016239d44a16fafa5a2917515720201379c WatchSource:0}: Error finding container 0516aafc275b32bb280cf6d320b07016239d44a16fafa5a2917515720201379c: Status 404 returned error can't find the container with id 0516aafc275b32bb280cf6d320b07016239d44a16fafa5a2917515720201379c Oct 11 01:13:23 crc kubenswrapper[4743]: I1011 01:13:23.987825 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-744f6c8d8b-rxjq4"] Oct 11 01:13:23 crc kubenswrapper[4743]: I1011 01:13:23.988040 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-api-744f6c8d8b-rxjq4" podUID="8844a903-e7bc-4d03-902d-981c12d7875e" containerName="heat-api" containerID="cri-o://79931a3b1a2a3de0f75bbddd6835b3e060312b1568e64c19ce7de9f33d285b62" gracePeriod=60 Oct 11 01:13:24 crc kubenswrapper[4743]: I1011 01:13:24.009427 4743 prober.go:107] 
"Probe failed" probeType="Readiness" pod="openstack/heat-api-744f6c8d8b-rxjq4" podUID="8844a903-e7bc-4d03-902d-981c12d7875e" containerName="heat-api" probeResult="failure" output="Get \"http://10.217.0.204:8004/healthcheck\": EOF" Oct 11 01:13:24 crc kubenswrapper[4743]: I1011 01:13:24.030509 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-698b7768c9-bwljp"] Oct 11 01:13:24 crc kubenswrapper[4743]: I1011 01:13:24.032065 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-698b7768c9-bwljp" Oct 11 01:13:24 crc kubenswrapper[4743]: I1011 01:13:24.034626 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-internal-svc" Oct 11 01:13:24 crc kubenswrapper[4743]: I1011 01:13:24.035352 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-public-svc" Oct 11 01:13:24 crc kubenswrapper[4743]: I1011 01:13:24.054386 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-59cbbb87f4-s7qvc"] Oct 11 01:13:24 crc kubenswrapper[4743]: I1011 01:13:24.054776 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-cfnapi-59cbbb87f4-s7qvc" podUID="503af092-8c8f-4bdb-a27c-a76f98794769" containerName="heat-cfnapi" containerID="cri-o://431a075a88c2da6003e5c1ad5469b0301c72d0144d978d2f1c306e7f3284e882" gracePeriod=60 Oct 11 01:13:24 crc kubenswrapper[4743]: I1011 01:13:24.073537 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-698b7768c9-bwljp"] Oct 11 01:13:24 crc kubenswrapper[4743]: I1011 01:13:24.135809 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-fdd7c75fc-rtmvl" event={"ID":"5e39acaa-0992-471f-a015-46714fde82cf","Type":"ContainerStarted","Data":"18f7a239f2e0707135ecd80c3ce36f7686fd8c2c29bfe008d2373815e615ee5c"} Oct 11 01:13:24 crc kubenswrapper[4743]: I1011 01:13:24.135886 4743 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-fdd7c75fc-rtmvl" event={"ID":"5e39acaa-0992-471f-a015-46714fde82cf","Type":"ContainerStarted","Data":"8a42a422b4ec2501bc22ee70b22f3500375cb3084687f408f504310195e4e748"} Oct 11 01:13:24 crc kubenswrapper[4743]: I1011 01:13:24.135903 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-fdd7c75fc-rtmvl" Oct 11 01:13:24 crc kubenswrapper[4743]: I1011 01:13:24.135914 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5dbbddbc6d-2nprx" event={"ID":"e0b5e85f-5e3b-4230-861b-f124e542d8db","Type":"ContainerStarted","Data":"0516aafc275b32bb280cf6d320b07016239d44a16fafa5a2917515720201379c"} Oct 11 01:13:24 crc kubenswrapper[4743]: I1011 01:13:24.135929 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5dc9695786-kqfsr" event={"ID":"af1f59ec-ca6c-4b6b-afb9-c8e6dc21e52d","Type":"ContainerStarted","Data":"286de206a4b61adee28d2f2fa77022c309f5151411c4345701b335ef09eee362"} Oct 11 01:13:24 crc kubenswrapper[4743]: I1011 01:13:24.135944 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-6877c7bb88-shzwl"] Oct 11 01:13:24 crc kubenswrapper[4743]: I1011 01:13:24.137379 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-6877c7bb88-shzwl" Oct 11 01:13:24 crc kubenswrapper[4743]: I1011 01:13:24.144353 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-internal-svc" Oct 11 01:13:24 crc kubenswrapper[4743]: I1011 01:13:24.144717 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-public-svc" Oct 11 01:13:24 crc kubenswrapper[4743]: I1011 01:13:24.151258 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef2598ca-aa73-4d3e-adf8-7f94e68f2838-config-data\") pod \"heat-api-698b7768c9-bwljp\" (UID: \"ef2598ca-aa73-4d3e-adf8-7f94e68f2838\") " pod="openstack/heat-api-698b7768c9-bwljp" Oct 11 01:13:24 crc kubenswrapper[4743]: I1011 01:13:24.151314 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef2598ca-aa73-4d3e-adf8-7f94e68f2838-internal-tls-certs\") pod \"heat-api-698b7768c9-bwljp\" (UID: \"ef2598ca-aa73-4d3e-adf8-7f94e68f2838\") " pod="openstack/heat-api-698b7768c9-bwljp" Oct 11 01:13:24 crc kubenswrapper[4743]: I1011 01:13:24.151340 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef2598ca-aa73-4d3e-adf8-7f94e68f2838-public-tls-certs\") pod \"heat-api-698b7768c9-bwljp\" (UID: \"ef2598ca-aa73-4d3e-adf8-7f94e68f2838\") " pod="openstack/heat-api-698b7768c9-bwljp" Oct 11 01:13:24 crc kubenswrapper[4743]: I1011 01:13:24.151389 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72vn9\" (UniqueName: \"kubernetes.io/projected/ef2598ca-aa73-4d3e-adf8-7f94e68f2838-kube-api-access-72vn9\") pod \"heat-api-698b7768c9-bwljp\" (UID: \"ef2598ca-aa73-4d3e-adf8-7f94e68f2838\") " 
pod="openstack/heat-api-698b7768c9-bwljp" Oct 11 01:13:24 crc kubenswrapper[4743]: I1011 01:13:24.151433 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ef2598ca-aa73-4d3e-adf8-7f94e68f2838-config-data-custom\") pod \"heat-api-698b7768c9-bwljp\" (UID: \"ef2598ca-aa73-4d3e-adf8-7f94e68f2838\") " pod="openstack/heat-api-698b7768c9-bwljp" Oct 11 01:13:24 crc kubenswrapper[4743]: I1011 01:13:24.151455 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef2598ca-aa73-4d3e-adf8-7f94e68f2838-combined-ca-bundle\") pod \"heat-api-698b7768c9-bwljp\" (UID: \"ef2598ca-aa73-4d3e-adf8-7f94e68f2838\") " pod="openstack/heat-api-698b7768c9-bwljp" Oct 11 01:13:24 crc kubenswrapper[4743]: I1011 01:13:24.189119 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-6877c7bb88-shzwl"] Oct 11 01:13:24 crc kubenswrapper[4743]: I1011 01:13:24.196331 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-fdd7c75fc-rtmvl" podStartSLOduration=2.196313022 podStartE2EDuration="2.196313022s" podCreationTimestamp="2025-10-11 01:13:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 01:13:24.134692973 +0000 UTC m=+1298.787673370" watchObservedRunningTime="2025-10-11 01:13:24.196313022 +0000 UTC m=+1298.849293419" Oct 11 01:13:24 crc kubenswrapper[4743]: I1011 01:13:24.253751 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0173521e-a9ee-43e3-9760-f3f12527c84b-config-data\") pod \"heat-cfnapi-6877c7bb88-shzwl\" (UID: \"0173521e-a9ee-43e3-9760-f3f12527c84b\") " pod="openstack/heat-cfnapi-6877c7bb88-shzwl" Oct 11 
01:13:24 crc kubenswrapper[4743]: I1011 01:13:24.254692 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef2598ca-aa73-4d3e-adf8-7f94e68f2838-config-data\") pod \"heat-api-698b7768c9-bwljp\" (UID: \"ef2598ca-aa73-4d3e-adf8-7f94e68f2838\") " pod="openstack/heat-api-698b7768c9-bwljp" Oct 11 01:13:24 crc kubenswrapper[4743]: I1011 01:13:24.255283 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0173521e-a9ee-43e3-9760-f3f12527c84b-public-tls-certs\") pod \"heat-cfnapi-6877c7bb88-shzwl\" (UID: \"0173521e-a9ee-43e3-9760-f3f12527c84b\") " pod="openstack/heat-cfnapi-6877c7bb88-shzwl" Oct 11 01:13:24 crc kubenswrapper[4743]: I1011 01:13:24.255314 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef2598ca-aa73-4d3e-adf8-7f94e68f2838-internal-tls-certs\") pod \"heat-api-698b7768c9-bwljp\" (UID: \"ef2598ca-aa73-4d3e-adf8-7f94e68f2838\") " pod="openstack/heat-api-698b7768c9-bwljp" Oct 11 01:13:24 crc kubenswrapper[4743]: I1011 01:13:24.255330 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef2598ca-aa73-4d3e-adf8-7f94e68f2838-public-tls-certs\") pod \"heat-api-698b7768c9-bwljp\" (UID: \"ef2598ca-aa73-4d3e-adf8-7f94e68f2838\") " pod="openstack/heat-api-698b7768c9-bwljp" Oct 11 01:13:24 crc kubenswrapper[4743]: I1011 01:13:24.255398 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72vn9\" (UniqueName: \"kubernetes.io/projected/ef2598ca-aa73-4d3e-adf8-7f94e68f2838-kube-api-access-72vn9\") pod \"heat-api-698b7768c9-bwljp\" (UID: \"ef2598ca-aa73-4d3e-adf8-7f94e68f2838\") " pod="openstack/heat-api-698b7768c9-bwljp" Oct 11 01:13:24 crc kubenswrapper[4743]: I1011 
01:13:24.255426 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4b87\" (UniqueName: \"kubernetes.io/projected/0173521e-a9ee-43e3-9760-f3f12527c84b-kube-api-access-b4b87\") pod \"heat-cfnapi-6877c7bb88-shzwl\" (UID: \"0173521e-a9ee-43e3-9760-f3f12527c84b\") " pod="openstack/heat-cfnapi-6877c7bb88-shzwl" Oct 11 01:13:24 crc kubenswrapper[4743]: I1011 01:13:24.255464 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ef2598ca-aa73-4d3e-adf8-7f94e68f2838-config-data-custom\") pod \"heat-api-698b7768c9-bwljp\" (UID: \"ef2598ca-aa73-4d3e-adf8-7f94e68f2838\") " pod="openstack/heat-api-698b7768c9-bwljp" Oct 11 01:13:24 crc kubenswrapper[4743]: I1011 01:13:24.255485 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef2598ca-aa73-4d3e-adf8-7f94e68f2838-combined-ca-bundle\") pod \"heat-api-698b7768c9-bwljp\" (UID: \"ef2598ca-aa73-4d3e-adf8-7f94e68f2838\") " pod="openstack/heat-api-698b7768c9-bwljp" Oct 11 01:13:24 crc kubenswrapper[4743]: I1011 01:13:24.255537 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0173521e-a9ee-43e3-9760-f3f12527c84b-combined-ca-bundle\") pod \"heat-cfnapi-6877c7bb88-shzwl\" (UID: \"0173521e-a9ee-43e3-9760-f3f12527c84b\") " pod="openstack/heat-cfnapi-6877c7bb88-shzwl" Oct 11 01:13:24 crc kubenswrapper[4743]: I1011 01:13:24.255582 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0173521e-a9ee-43e3-9760-f3f12527c84b-internal-tls-certs\") pod \"heat-cfnapi-6877c7bb88-shzwl\" (UID: \"0173521e-a9ee-43e3-9760-f3f12527c84b\") " pod="openstack/heat-cfnapi-6877c7bb88-shzwl" Oct 11 01:13:24 crc 
kubenswrapper[4743]: I1011 01:13:24.255689 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0173521e-a9ee-43e3-9760-f3f12527c84b-config-data-custom\") pod \"heat-cfnapi-6877c7bb88-shzwl\" (UID: \"0173521e-a9ee-43e3-9760-f3f12527c84b\") " pod="openstack/heat-cfnapi-6877c7bb88-shzwl" Oct 11 01:13:24 crc kubenswrapper[4743]: I1011 01:13:24.260683 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef2598ca-aa73-4d3e-adf8-7f94e68f2838-public-tls-certs\") pod \"heat-api-698b7768c9-bwljp\" (UID: \"ef2598ca-aa73-4d3e-adf8-7f94e68f2838\") " pod="openstack/heat-api-698b7768c9-bwljp" Oct 11 01:13:24 crc kubenswrapper[4743]: I1011 01:13:24.261634 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ef2598ca-aa73-4d3e-adf8-7f94e68f2838-config-data-custom\") pod \"heat-api-698b7768c9-bwljp\" (UID: \"ef2598ca-aa73-4d3e-adf8-7f94e68f2838\") " pod="openstack/heat-api-698b7768c9-bwljp" Oct 11 01:13:24 crc kubenswrapper[4743]: I1011 01:13:24.264918 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef2598ca-aa73-4d3e-adf8-7f94e68f2838-internal-tls-certs\") pod \"heat-api-698b7768c9-bwljp\" (UID: \"ef2598ca-aa73-4d3e-adf8-7f94e68f2838\") " pod="openstack/heat-api-698b7768c9-bwljp" Oct 11 01:13:24 crc kubenswrapper[4743]: I1011 01:13:24.265392 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef2598ca-aa73-4d3e-adf8-7f94e68f2838-config-data\") pod \"heat-api-698b7768c9-bwljp\" (UID: \"ef2598ca-aa73-4d3e-adf8-7f94e68f2838\") " pod="openstack/heat-api-698b7768c9-bwljp" Oct 11 01:13:24 crc kubenswrapper[4743]: I1011 01:13:24.267017 4743 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef2598ca-aa73-4d3e-adf8-7f94e68f2838-combined-ca-bundle\") pod \"heat-api-698b7768c9-bwljp\" (UID: \"ef2598ca-aa73-4d3e-adf8-7f94e68f2838\") " pod="openstack/heat-api-698b7768c9-bwljp" Oct 11 01:13:24 crc kubenswrapper[4743]: I1011 01:13:24.286227 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72vn9\" (UniqueName: \"kubernetes.io/projected/ef2598ca-aa73-4d3e-adf8-7f94e68f2838-kube-api-access-72vn9\") pod \"heat-api-698b7768c9-bwljp\" (UID: \"ef2598ca-aa73-4d3e-adf8-7f94e68f2838\") " pod="openstack/heat-api-698b7768c9-bwljp" Oct 11 01:13:24 crc kubenswrapper[4743]: I1011 01:13:24.358404 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0173521e-a9ee-43e3-9760-f3f12527c84b-config-data\") pod \"heat-cfnapi-6877c7bb88-shzwl\" (UID: \"0173521e-a9ee-43e3-9760-f3f12527c84b\") " pod="openstack/heat-cfnapi-6877c7bb88-shzwl" Oct 11 01:13:24 crc kubenswrapper[4743]: I1011 01:13:24.358476 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0173521e-a9ee-43e3-9760-f3f12527c84b-public-tls-certs\") pod \"heat-cfnapi-6877c7bb88-shzwl\" (UID: \"0173521e-a9ee-43e3-9760-f3f12527c84b\") " pod="openstack/heat-cfnapi-6877c7bb88-shzwl" Oct 11 01:13:24 crc kubenswrapper[4743]: I1011 01:13:24.358537 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4b87\" (UniqueName: \"kubernetes.io/projected/0173521e-a9ee-43e3-9760-f3f12527c84b-kube-api-access-b4b87\") pod \"heat-cfnapi-6877c7bb88-shzwl\" (UID: \"0173521e-a9ee-43e3-9760-f3f12527c84b\") " pod="openstack/heat-cfnapi-6877c7bb88-shzwl" Oct 11 01:13:24 crc kubenswrapper[4743]: I1011 01:13:24.358585 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0173521e-a9ee-43e3-9760-f3f12527c84b-combined-ca-bundle\") pod \"heat-cfnapi-6877c7bb88-shzwl\" (UID: \"0173521e-a9ee-43e3-9760-f3f12527c84b\") " pod="openstack/heat-cfnapi-6877c7bb88-shzwl" Oct 11 01:13:24 crc kubenswrapper[4743]: I1011 01:13:24.358620 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0173521e-a9ee-43e3-9760-f3f12527c84b-internal-tls-certs\") pod \"heat-cfnapi-6877c7bb88-shzwl\" (UID: \"0173521e-a9ee-43e3-9760-f3f12527c84b\") " pod="openstack/heat-cfnapi-6877c7bb88-shzwl" Oct 11 01:13:24 crc kubenswrapper[4743]: I1011 01:13:24.358677 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0173521e-a9ee-43e3-9760-f3f12527c84b-config-data-custom\") pod \"heat-cfnapi-6877c7bb88-shzwl\" (UID: \"0173521e-a9ee-43e3-9760-f3f12527c84b\") " pod="openstack/heat-cfnapi-6877c7bb88-shzwl" Oct 11 01:13:24 crc kubenswrapper[4743]: I1011 01:13:24.361984 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-698b7768c9-bwljp" Oct 11 01:13:24 crc kubenswrapper[4743]: I1011 01:13:24.362583 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0173521e-a9ee-43e3-9760-f3f12527c84b-public-tls-certs\") pod \"heat-cfnapi-6877c7bb88-shzwl\" (UID: \"0173521e-a9ee-43e3-9760-f3f12527c84b\") " pod="openstack/heat-cfnapi-6877c7bb88-shzwl" Oct 11 01:13:24 crc kubenswrapper[4743]: I1011 01:13:24.372545 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0173521e-a9ee-43e3-9760-f3f12527c84b-combined-ca-bundle\") pod \"heat-cfnapi-6877c7bb88-shzwl\" (UID: \"0173521e-a9ee-43e3-9760-f3f12527c84b\") " pod="openstack/heat-cfnapi-6877c7bb88-shzwl" Oct 11 01:13:24 crc kubenswrapper[4743]: I1011 01:13:24.373200 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0173521e-a9ee-43e3-9760-f3f12527c84b-config-data\") pod \"heat-cfnapi-6877c7bb88-shzwl\" (UID: \"0173521e-a9ee-43e3-9760-f3f12527c84b\") " pod="openstack/heat-cfnapi-6877c7bb88-shzwl" Oct 11 01:13:24 crc kubenswrapper[4743]: I1011 01:13:24.374629 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4b87\" (UniqueName: \"kubernetes.io/projected/0173521e-a9ee-43e3-9760-f3f12527c84b-kube-api-access-b4b87\") pod \"heat-cfnapi-6877c7bb88-shzwl\" (UID: \"0173521e-a9ee-43e3-9760-f3f12527c84b\") " pod="openstack/heat-cfnapi-6877c7bb88-shzwl" Oct 11 01:13:24 crc kubenswrapper[4743]: I1011 01:13:24.378314 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0173521e-a9ee-43e3-9760-f3f12527c84b-internal-tls-certs\") pod \"heat-cfnapi-6877c7bb88-shzwl\" (UID: \"0173521e-a9ee-43e3-9760-f3f12527c84b\") " pod="openstack/heat-cfnapi-6877c7bb88-shzwl" Oct 11 
01:13:24 crc kubenswrapper[4743]: I1011 01:13:24.379786 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0173521e-a9ee-43e3-9760-f3f12527c84b-config-data-custom\") pod \"heat-cfnapi-6877c7bb88-shzwl\" (UID: \"0173521e-a9ee-43e3-9760-f3f12527c84b\") " pod="openstack/heat-cfnapi-6877c7bb88-shzwl" Oct 11 01:13:24 crc kubenswrapper[4743]: I1011 01:13:24.472300 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-6877c7bb88-shzwl" Oct 11 01:13:25 crc kubenswrapper[4743]: I1011 01:13:25.219082 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-688b9f5b49-zn2q6" Oct 11 01:13:25 crc kubenswrapper[4743]: I1011 01:13:25.289885 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-bg4dw"] Oct 11 01:13:25 crc kubenswrapper[4743]: I1011 01:13:25.290120 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6578955fd5-bg4dw" podUID="ef875954-7f31-4d4d-acec-56789e002001" containerName="dnsmasq-dns" containerID="cri-o://efc5feb2db287d79f71abf737483c9a699a58fff5eb0a5bbe557a237d343efd7" gracePeriod=10 Oct 11 01:13:25 crc kubenswrapper[4743]: I1011 01:13:25.785244 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 11 01:13:26 crc kubenswrapper[4743]: I1011 01:13:26.142694 4743 generic.go:334] "Generic (PLEG): container finished" podID="ef875954-7f31-4d4d-acec-56789e002001" containerID="efc5feb2db287d79f71abf737483c9a699a58fff5eb0a5bbe557a237d343efd7" exitCode=0 Oct 11 01:13:26 crc kubenswrapper[4743]: I1011 01:13:26.142767 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-bg4dw" event={"ID":"ef875954-7f31-4d4d-acec-56789e002001","Type":"ContainerDied","Data":"efc5feb2db287d79f71abf737483c9a699a58fff5eb0a5bbe557a237d343efd7"} Oct 11 01:13:26 
crc kubenswrapper[4743]: I1011 01:13:26.484527 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6578955fd5-bg4dw" podUID="ef875954-7f31-4d4d-acec-56789e002001" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.186:5353: connect: connection refused" Oct 11 01:13:27 crc kubenswrapper[4743]: I1011 01:13:27.294254 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-59cbbb87f4-s7qvc" Oct 11 01:13:29 crc kubenswrapper[4743]: I1011 01:13:29.571773 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-bg4dw" Oct 11 01:13:29 crc kubenswrapper[4743]: I1011 01:13:29.681658 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ef875954-7f31-4d4d-acec-56789e002001-ovsdbserver-sb\") pod \"ef875954-7f31-4d4d-acec-56789e002001\" (UID: \"ef875954-7f31-4d4d-acec-56789e002001\") " Oct 11 01:13:29 crc kubenswrapper[4743]: I1011 01:13:29.682557 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ef875954-7f31-4d4d-acec-56789e002001-ovsdbserver-nb\") pod \"ef875954-7f31-4d4d-acec-56789e002001\" (UID: \"ef875954-7f31-4d4d-acec-56789e002001\") " Oct 11 01:13:29 crc kubenswrapper[4743]: I1011 01:13:29.682687 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef875954-7f31-4d4d-acec-56789e002001-dns-svc\") pod \"ef875954-7f31-4d4d-acec-56789e002001\" (UID: \"ef875954-7f31-4d4d-acec-56789e002001\") " Oct 11 01:13:29 crc kubenswrapper[4743]: I1011 01:13:29.682923 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2f2xk\" (UniqueName: 
\"kubernetes.io/projected/ef875954-7f31-4d4d-acec-56789e002001-kube-api-access-2f2xk\") pod \"ef875954-7f31-4d4d-acec-56789e002001\" (UID: \"ef875954-7f31-4d4d-acec-56789e002001\") " Oct 11 01:13:29 crc kubenswrapper[4743]: I1011 01:13:29.683062 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef875954-7f31-4d4d-acec-56789e002001-config\") pod \"ef875954-7f31-4d4d-acec-56789e002001\" (UID: \"ef875954-7f31-4d4d-acec-56789e002001\") " Oct 11 01:13:29 crc kubenswrapper[4743]: I1011 01:13:29.683308 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ef875954-7f31-4d4d-acec-56789e002001-dns-swift-storage-0\") pod \"ef875954-7f31-4d4d-acec-56789e002001\" (UID: \"ef875954-7f31-4d4d-acec-56789e002001\") " Oct 11 01:13:29 crc kubenswrapper[4743]: I1011 01:13:29.693745 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef875954-7f31-4d4d-acec-56789e002001-kube-api-access-2f2xk" (OuterVolumeSpecName: "kube-api-access-2f2xk") pod "ef875954-7f31-4d4d-acec-56789e002001" (UID: "ef875954-7f31-4d4d-acec-56789e002001"). InnerVolumeSpecName "kube-api-access-2f2xk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:13:29 crc kubenswrapper[4743]: I1011 01:13:29.787526 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2f2xk\" (UniqueName: \"kubernetes.io/projected/ef875954-7f31-4d4d-acec-56789e002001-kube-api-access-2f2xk\") on node \"crc\" DevicePath \"\"" Oct 11 01:13:29 crc kubenswrapper[4743]: I1011 01:13:29.888792 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-698b7768c9-bwljp"] Oct 11 01:13:29 crc kubenswrapper[4743]: I1011 01:13:29.894070 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef875954-7f31-4d4d-acec-56789e002001-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ef875954-7f31-4d4d-acec-56789e002001" (UID: "ef875954-7f31-4d4d-acec-56789e002001"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:13:29 crc kubenswrapper[4743]: I1011 01:13:29.925702 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef875954-7f31-4d4d-acec-56789e002001-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ef875954-7f31-4d4d-acec-56789e002001" (UID: "ef875954-7f31-4d4d-acec-56789e002001"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:13:29 crc kubenswrapper[4743]: I1011 01:13:29.926199 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef875954-7f31-4d4d-acec-56789e002001-config" (OuterVolumeSpecName: "config") pod "ef875954-7f31-4d4d-acec-56789e002001" (UID: "ef875954-7f31-4d4d-acec-56789e002001"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:13:29 crc kubenswrapper[4743]: I1011 01:13:29.929801 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef875954-7f31-4d4d-acec-56789e002001-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ef875954-7f31-4d4d-acec-56789e002001" (UID: "ef875954-7f31-4d4d-acec-56789e002001"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:13:29 crc kubenswrapper[4743]: I1011 01:13:29.940081 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef875954-7f31-4d4d-acec-56789e002001-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ef875954-7f31-4d4d-acec-56789e002001" (UID: "ef875954-7f31-4d4d-acec-56789e002001"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:13:29 crc kubenswrapper[4743]: I1011 01:13:29.996145 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ef875954-7f31-4d4d-acec-56789e002001-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 11 01:13:29 crc kubenswrapper[4743]: I1011 01:13:29.996190 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef875954-7f31-4d4d-acec-56789e002001-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 11 01:13:29 crc kubenswrapper[4743]: I1011 01:13:29.996203 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef875954-7f31-4d4d-acec-56789e002001-config\") on node \"crc\" DevicePath \"\"" Oct 11 01:13:29 crc kubenswrapper[4743]: I1011 01:13:29.996214 4743 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ef875954-7f31-4d4d-acec-56789e002001-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 11 
01:13:29 crc kubenswrapper[4743]: I1011 01:13:29.996230 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ef875954-7f31-4d4d-acec-56789e002001-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 11 01:13:30 crc kubenswrapper[4743]: I1011 01:13:30.034696 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-6877c7bb88-shzwl"] Oct 11 01:13:30 crc kubenswrapper[4743]: I1011 01:13:30.221991 4743 generic.go:334] "Generic (PLEG): container finished" podID="af1f59ec-ca6c-4b6b-afb9-c8e6dc21e52d" containerID="b6f83f30b962e225528f4f1b93c798cbefcd41ac823f2c2732c0306b62543140" exitCode=1 Oct 11 01:13:30 crc kubenswrapper[4743]: I1011 01:13:30.222237 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5dc9695786-kqfsr" event={"ID":"af1f59ec-ca6c-4b6b-afb9-c8e6dc21e52d","Type":"ContainerDied","Data":"b6f83f30b962e225528f4f1b93c798cbefcd41ac823f2c2732c0306b62543140"} Oct 11 01:13:30 crc kubenswrapper[4743]: I1011 01:13:30.222682 4743 scope.go:117] "RemoveContainer" containerID="b6f83f30b962e225528f4f1b93c798cbefcd41ac823f2c2732c0306b62543140" Oct 11 01:13:30 crc kubenswrapper[4743]: I1011 01:13:30.229579 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6877c7bb88-shzwl" event={"ID":"0173521e-a9ee-43e3-9760-f3f12527c84b","Type":"ContainerStarted","Data":"f86bb7d938d22431051f103e8eefbd9b1a949d387e1691a794ca3987ebb430c4"} Oct 11 01:13:30 crc kubenswrapper[4743]: I1011 01:13:30.234197 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5dbbddbc6d-2nprx" event={"ID":"e0b5e85f-5e3b-4230-861b-f124e542d8db","Type":"ContainerStarted","Data":"588f770caab600650e088e77d621a5223a675ec40d792a23450230f3aeda1ea8"} Oct 11 01:13:30 crc kubenswrapper[4743]: I1011 01:13:30.235003 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-5dbbddbc6d-2nprx" Oct 11 01:13:30 crc 
kubenswrapper[4743]: I1011 01:13:30.236415 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-698b7768c9-bwljp" event={"ID":"ef2598ca-aa73-4d3e-adf8-7f94e68f2838","Type":"ContainerStarted","Data":"6865f057330b7a2add702a63f04dc551183fed004099d0b45d756a59497c7170"} Oct 11 01:13:30 crc kubenswrapper[4743]: I1011 01:13:30.244835 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f","Type":"ContainerStarted","Data":"02a33364cb7dcca44f2626b9d3c768a1284c1f443c6199ee6a1a01875d1c3528"} Oct 11 01:13:30 crc kubenswrapper[4743]: I1011 01:13:30.255344 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-bg4dw" event={"ID":"ef875954-7f31-4d4d-acec-56789e002001","Type":"ContainerDied","Data":"589f592d9fe67ced8fc084adbbc61f2bfb0afcf430a0e99c36dfaf9fd836c9c4"} Oct 11 01:13:30 crc kubenswrapper[4743]: I1011 01:13:30.255396 4743 scope.go:117] "RemoveContainer" containerID="efc5feb2db287d79f71abf737483c9a699a58fff5eb0a5bbe557a237d343efd7" Oct 11 01:13:30 crc kubenswrapper[4743]: I1011 01:13:30.255519 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-bg4dw" Oct 11 01:13:30 crc kubenswrapper[4743]: I1011 01:13:30.267886 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-89lgh" event={"ID":"69e15d6c-dacd-4466-aac4-050cda6242aa","Type":"ContainerStarted","Data":"6c415f2f966cf8502f34e945ecfacf3c5b60792adcb24042de362843942dc10c"} Oct 11 01:13:30 crc kubenswrapper[4743]: I1011 01:13:30.268811 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-5dbbddbc6d-2nprx" podStartSLOduration=8.268791126 podStartE2EDuration="8.268791126s" podCreationTimestamp="2025-10-11 01:13:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 01:13:30.260867013 +0000 UTC m=+1304.913847410" watchObservedRunningTime="2025-10-11 01:13:30.268791126 +0000 UTC m=+1304.921771533" Oct 11 01:13:30 crc kubenswrapper[4743]: I1011 01:13:30.289920 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-bg4dw"] Oct 11 01:13:30 crc kubenswrapper[4743]: I1011 01:13:30.298493 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-bg4dw"] Oct 11 01:13:30 crc kubenswrapper[4743]: I1011 01:13:30.299232 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-89lgh" podStartSLOduration=1.8938230809999999 podStartE2EDuration="14.299216246s" podCreationTimestamp="2025-10-11 01:13:16 +0000 UTC" firstStartedPulling="2025-10-11 01:13:17.079139737 +0000 UTC m=+1291.732120134" lastFinishedPulling="2025-10-11 01:13:29.484532902 +0000 UTC m=+1304.137513299" observedRunningTime="2025-10-11 01:13:30.298471489 +0000 UTC m=+1304.951451886" watchObservedRunningTime="2025-10-11 01:13:30.299216246 +0000 UTC m=+1304.952196643" Oct 11 01:13:30 crc kubenswrapper[4743]: I1011 01:13:30.320530 4743 
scope.go:117] "RemoveContainer" containerID="e26153410b18f47bcd88287e1c66b5fd354ff6a52689cc62138a3101449d1e81" Oct 11 01:13:30 crc kubenswrapper[4743]: I1011 01:13:30.391686 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-744f6c8d8b-rxjq4" podUID="8844a903-e7bc-4d03-902d-981c12d7875e" containerName="heat-api" probeResult="failure" output="Get \"http://10.217.0.204:8004/healthcheck\": read tcp 10.217.0.2:58974->10.217.0.204:8004: read: connection reset by peer" Oct 11 01:13:30 crc kubenswrapper[4743]: I1011 01:13:30.392107 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-744f6c8d8b-rxjq4" podUID="8844a903-e7bc-4d03-902d-981c12d7875e" containerName="heat-api" probeResult="failure" output="Get \"http://10.217.0.204:8004/healthcheck\": dial tcp 10.217.0.204:8004: connect: connection refused" Oct 11 01:13:30 crc kubenswrapper[4743]: I1011 01:13:30.467001 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-59cbbb87f4-s7qvc" podUID="503af092-8c8f-4bdb-a27c-a76f98794769" containerName="heat-cfnapi" probeResult="failure" output="Get \"http://10.217.0.202:8000/healthcheck\": read tcp 10.217.0.2:32876->10.217.0.202:8000: read: connection reset by peer" Oct 11 01:13:30 crc kubenswrapper[4743]: I1011 01:13:30.467461 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-59cbbb87f4-s7qvc" podUID="503af092-8c8f-4bdb-a27c-a76f98794769" containerName="heat-cfnapi" probeResult="failure" output="Get \"http://10.217.0.202:8000/healthcheck\": dial tcp 10.217.0.202:8000: connect: connection refused" Oct 11 01:13:31 crc kubenswrapper[4743]: I1011 01:13:31.092366 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-744f6c8d8b-rxjq4" Oct 11 01:13:31 crc kubenswrapper[4743]: I1011 01:13:31.112233 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-59cbbb87f4-s7qvc" Oct 11 01:13:31 crc kubenswrapper[4743]: I1011 01:13:31.232245 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/503af092-8c8f-4bdb-a27c-a76f98794769-config-data-custom\") pod \"503af092-8c8f-4bdb-a27c-a76f98794769\" (UID: \"503af092-8c8f-4bdb-a27c-a76f98794769\") " Oct 11 01:13:31 crc kubenswrapper[4743]: I1011 01:13:31.232361 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8844a903-e7bc-4d03-902d-981c12d7875e-config-data\") pod \"8844a903-e7bc-4d03-902d-981c12d7875e\" (UID: \"8844a903-e7bc-4d03-902d-981c12d7875e\") " Oct 11 01:13:31 crc kubenswrapper[4743]: I1011 01:13:31.232438 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4h7nt\" (UniqueName: \"kubernetes.io/projected/503af092-8c8f-4bdb-a27c-a76f98794769-kube-api-access-4h7nt\") pod \"503af092-8c8f-4bdb-a27c-a76f98794769\" (UID: \"503af092-8c8f-4bdb-a27c-a76f98794769\") " Oct 11 01:13:31 crc kubenswrapper[4743]: I1011 01:13:31.232466 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/503af092-8c8f-4bdb-a27c-a76f98794769-config-data\") pod \"503af092-8c8f-4bdb-a27c-a76f98794769\" (UID: \"503af092-8c8f-4bdb-a27c-a76f98794769\") " Oct 11 01:13:31 crc kubenswrapper[4743]: I1011 01:13:31.232488 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mxtp\" (UniqueName: \"kubernetes.io/projected/8844a903-e7bc-4d03-902d-981c12d7875e-kube-api-access-7mxtp\") pod \"8844a903-e7bc-4d03-902d-981c12d7875e\" (UID: \"8844a903-e7bc-4d03-902d-981c12d7875e\") " Oct 11 01:13:31 crc kubenswrapper[4743]: I1011 01:13:31.232545 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8844a903-e7bc-4d03-902d-981c12d7875e-combined-ca-bundle\") pod \"8844a903-e7bc-4d03-902d-981c12d7875e\" (UID: \"8844a903-e7bc-4d03-902d-981c12d7875e\") " Oct 11 01:13:31 crc kubenswrapper[4743]: I1011 01:13:31.232564 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8844a903-e7bc-4d03-902d-981c12d7875e-config-data-custom\") pod \"8844a903-e7bc-4d03-902d-981c12d7875e\" (UID: \"8844a903-e7bc-4d03-902d-981c12d7875e\") " Oct 11 01:13:31 crc kubenswrapper[4743]: I1011 01:13:31.232614 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/503af092-8c8f-4bdb-a27c-a76f98794769-combined-ca-bundle\") pod \"503af092-8c8f-4bdb-a27c-a76f98794769\" (UID: \"503af092-8c8f-4bdb-a27c-a76f98794769\") " Oct 11 01:13:31 crc kubenswrapper[4743]: I1011 01:13:31.238307 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8844a903-e7bc-4d03-902d-981c12d7875e-kube-api-access-7mxtp" (OuterVolumeSpecName: "kube-api-access-7mxtp") pod "8844a903-e7bc-4d03-902d-981c12d7875e" (UID: "8844a903-e7bc-4d03-902d-981c12d7875e"). InnerVolumeSpecName "kube-api-access-7mxtp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:13:31 crc kubenswrapper[4743]: I1011 01:13:31.240011 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/503af092-8c8f-4bdb-a27c-a76f98794769-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "503af092-8c8f-4bdb-a27c-a76f98794769" (UID: "503af092-8c8f-4bdb-a27c-a76f98794769"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:13:31 crc kubenswrapper[4743]: I1011 01:13:31.249696 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8844a903-e7bc-4d03-902d-981c12d7875e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "8844a903-e7bc-4d03-902d-981c12d7875e" (UID: "8844a903-e7bc-4d03-902d-981c12d7875e"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:13:31 crc kubenswrapper[4743]: I1011 01:13:31.257978 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/503af092-8c8f-4bdb-a27c-a76f98794769-kube-api-access-4h7nt" (OuterVolumeSpecName: "kube-api-access-4h7nt") pod "503af092-8c8f-4bdb-a27c-a76f98794769" (UID: "503af092-8c8f-4bdb-a27c-a76f98794769"). InnerVolumeSpecName "kube-api-access-4h7nt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:13:31 crc kubenswrapper[4743]: I1011 01:13:31.295580 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8844a903-e7bc-4d03-902d-981c12d7875e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8844a903-e7bc-4d03-902d-981c12d7875e" (UID: "8844a903-e7bc-4d03-902d-981c12d7875e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:13:31 crc kubenswrapper[4743]: I1011 01:13:31.309964 4743 generic.go:334] "Generic (PLEG): container finished" podID="af1f59ec-ca6c-4b6b-afb9-c8e6dc21e52d" containerID="1df48005e853e115bd2e7909fb83bacabef2cf77bfdeafcbb6bc89a86155b7d6" exitCode=1 Oct 11 01:13:31 crc kubenswrapper[4743]: I1011 01:13:31.310196 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5dc9695786-kqfsr" event={"ID":"af1f59ec-ca6c-4b6b-afb9-c8e6dc21e52d","Type":"ContainerDied","Data":"1df48005e853e115bd2e7909fb83bacabef2cf77bfdeafcbb6bc89a86155b7d6"} Oct 11 01:13:31 crc kubenswrapper[4743]: I1011 01:13:31.310420 4743 scope.go:117] "RemoveContainer" containerID="b6f83f30b962e225528f4f1b93c798cbefcd41ac823f2c2732c0306b62543140" Oct 11 01:13:31 crc kubenswrapper[4743]: I1011 01:13:31.317868 4743 scope.go:117] "RemoveContainer" containerID="1df48005e853e115bd2e7909fb83bacabef2cf77bfdeafcbb6bc89a86155b7d6" Oct 11 01:13:31 crc kubenswrapper[4743]: I1011 01:13:31.318043 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/503af092-8c8f-4bdb-a27c-a76f98794769-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "503af092-8c8f-4bdb-a27c-a76f98794769" (UID: "503af092-8c8f-4bdb-a27c-a76f98794769"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:13:31 crc kubenswrapper[4743]: I1011 01:13:31.318569 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6877c7bb88-shzwl" event={"ID":"0173521e-a9ee-43e3-9760-f3f12527c84b","Type":"ContainerStarted","Data":"6c934096d37c4cdb4c39aaaaca7da99027c6f330c116c0e3d5922b9970621182"} Oct 11 01:13:31 crc kubenswrapper[4743]: I1011 01:13:31.318813 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-6877c7bb88-shzwl" Oct 11 01:13:31 crc kubenswrapper[4743]: I1011 01:13:31.320225 4743 generic.go:334] "Generic (PLEG): container finished" podID="8844a903-e7bc-4d03-902d-981c12d7875e" containerID="79931a3b1a2a3de0f75bbddd6835b3e060312b1568e64c19ce7de9f33d285b62" exitCode=0 Oct 11 01:13:31 crc kubenswrapper[4743]: I1011 01:13:31.320265 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-744f6c8d8b-rxjq4" event={"ID":"8844a903-e7bc-4d03-902d-981c12d7875e","Type":"ContainerDied","Data":"79931a3b1a2a3de0f75bbddd6835b3e060312b1568e64c19ce7de9f33d285b62"} Oct 11 01:13:31 crc kubenswrapper[4743]: I1011 01:13:31.320284 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-744f6c8d8b-rxjq4" event={"ID":"8844a903-e7bc-4d03-902d-981c12d7875e","Type":"ContainerDied","Data":"a667b397d447ac68a7273de136d997ec6b3904a5891a652489326691fcd1ce7a"} Oct 11 01:13:31 crc kubenswrapper[4743]: I1011 01:13:31.320323 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-744f6c8d8b-rxjq4" Oct 11 01:13:31 crc kubenswrapper[4743]: E1011 01:13:31.322974 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-5dc9695786-kqfsr_openstack(af1f59ec-ca6c-4b6b-afb9-c8e6dc21e52d)\"" pod="openstack/heat-api-5dc9695786-kqfsr" podUID="af1f59ec-ca6c-4b6b-afb9-c8e6dc21e52d" Oct 11 01:13:31 crc kubenswrapper[4743]: I1011 01:13:31.329323 4743 generic.go:334] "Generic (PLEG): container finished" podID="e0b5e85f-5e3b-4230-861b-f124e542d8db" containerID="588f770caab600650e088e77d621a5223a675ec40d792a23450230f3aeda1ea8" exitCode=1 Oct 11 01:13:31 crc kubenswrapper[4743]: I1011 01:13:31.329477 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5dbbddbc6d-2nprx" event={"ID":"e0b5e85f-5e3b-4230-861b-f124e542d8db","Type":"ContainerDied","Data":"588f770caab600650e088e77d621a5223a675ec40d792a23450230f3aeda1ea8"} Oct 11 01:13:31 crc kubenswrapper[4743]: I1011 01:13:31.329983 4743 scope.go:117] "RemoveContainer" containerID="588f770caab600650e088e77d621a5223a675ec40d792a23450230f3aeda1ea8" Oct 11 01:13:31 crc kubenswrapper[4743]: I1011 01:13:31.330947 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/503af092-8c8f-4bdb-a27c-a76f98794769-config-data" (OuterVolumeSpecName: "config-data") pod "503af092-8c8f-4bdb-a27c-a76f98794769" (UID: "503af092-8c8f-4bdb-a27c-a76f98794769"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:13:31 crc kubenswrapper[4743]: I1011 01:13:31.347278 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8844a903-e7bc-4d03-902d-981c12d7875e-config-data" (OuterVolumeSpecName: "config-data") pod "8844a903-e7bc-4d03-902d-981c12d7875e" (UID: "8844a903-e7bc-4d03-902d-981c12d7875e"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:13:31 crc kubenswrapper[4743]: I1011 01:13:31.350687 4743 generic.go:334] "Generic (PLEG): container finished" podID="503af092-8c8f-4bdb-a27c-a76f98794769" containerID="431a075a88c2da6003e5c1ad5469b0301c72d0144d978d2f1c306e7f3284e882" exitCode=0 Oct 11 01:13:31 crc kubenswrapper[4743]: I1011 01:13:31.350760 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-59cbbb87f4-s7qvc" event={"ID":"503af092-8c8f-4bdb-a27c-a76f98794769","Type":"ContainerDied","Data":"431a075a88c2da6003e5c1ad5469b0301c72d0144d978d2f1c306e7f3284e882"} Oct 11 01:13:31 crc kubenswrapper[4743]: I1011 01:13:31.350790 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-59cbbb87f4-s7qvc" event={"ID":"503af092-8c8f-4bdb-a27c-a76f98794769","Type":"ContainerDied","Data":"e35cbe6a63ad222f2b8416f90ed48d8cf5026c126ea5ffb8289700c64a6fa046"} Oct 11 01:13:31 crc kubenswrapper[4743]: I1011 01:13:31.350847 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-59cbbb87f4-s7qvc" Oct 11 01:13:31 crc kubenswrapper[4743]: I1011 01:13:31.360415 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8844a903-e7bc-4d03-902d-981c12d7875e-config-data\") on node \"crc\" DevicePath \"\"" Oct 11 01:13:31 crc kubenswrapper[4743]: I1011 01:13:31.360444 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4h7nt\" (UniqueName: \"kubernetes.io/projected/503af092-8c8f-4bdb-a27c-a76f98794769-kube-api-access-4h7nt\") on node \"crc\" DevicePath \"\"" Oct 11 01:13:31 crc kubenswrapper[4743]: I1011 01:13:31.360456 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/503af092-8c8f-4bdb-a27c-a76f98794769-config-data\") on node \"crc\" DevicePath \"\"" Oct 11 01:13:31 crc kubenswrapper[4743]: I1011 01:13:31.360465 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mxtp\" (UniqueName: \"kubernetes.io/projected/8844a903-e7bc-4d03-902d-981c12d7875e-kube-api-access-7mxtp\") on node \"crc\" DevicePath \"\"" Oct 11 01:13:31 crc kubenswrapper[4743]: I1011 01:13:31.360479 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8844a903-e7bc-4d03-902d-981c12d7875e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 01:13:31 crc kubenswrapper[4743]: I1011 01:13:31.360489 4743 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8844a903-e7bc-4d03-902d-981c12d7875e-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 11 01:13:31 crc kubenswrapper[4743]: I1011 01:13:31.360497 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/503af092-8c8f-4bdb-a27c-a76f98794769-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 01:13:31 crc 
kubenswrapper[4743]: I1011 01:13:31.360509 4743 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/503af092-8c8f-4bdb-a27c-a76f98794769-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 11 01:13:31 crc kubenswrapper[4743]: I1011 01:13:31.371566 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-698b7768c9-bwljp" event={"ID":"ef2598ca-aa73-4d3e-adf8-7f94e68f2838","Type":"ContainerStarted","Data":"0f18677d262f1a6f8cfc2af7e7ff348e93a8d1dbda8f2829f94d84d8863d4523"} Oct 11 01:13:31 crc kubenswrapper[4743]: I1011 01:13:31.376079 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-698b7768c9-bwljp" Oct 11 01:13:31 crc kubenswrapper[4743]: I1011 01:13:31.386710 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f","Type":"ContainerStarted","Data":"2bf7ce61f9b0e51face763cac5e2585613a9827439d2b87516169a3ed522be48"} Oct 11 01:13:31 crc kubenswrapper[4743]: I1011 01:13:31.454814 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-6877c7bb88-shzwl" podStartSLOduration=7.454792242 podStartE2EDuration="7.454792242s" podCreationTimestamp="2025-10-11 01:13:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 01:13:31.402483288 +0000 UTC m=+1306.055463695" watchObservedRunningTime="2025-10-11 01:13:31.454792242 +0000 UTC m=+1306.107772639" Oct 11 01:13:31 crc kubenswrapper[4743]: I1011 01:13:31.464505 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-698b7768c9-bwljp" podStartSLOduration=8.464489116 podStartE2EDuration="8.464489116s" podCreationTimestamp="2025-10-11 01:13:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2025-10-11 01:13:31.426983982 +0000 UTC m=+1306.079964389" watchObservedRunningTime="2025-10-11 01:13:31.464489116 +0000 UTC m=+1306.117469513" Oct 11 01:13:31 crc kubenswrapper[4743]: I1011 01:13:31.476293 4743 scope.go:117] "RemoveContainer" containerID="79931a3b1a2a3de0f75bbddd6835b3e060312b1568e64c19ce7de9f33d285b62" Oct 11 01:13:31 crc kubenswrapper[4743]: I1011 01:13:31.483071 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-59cbbb87f4-s7qvc"] Oct 11 01:13:31 crc kubenswrapper[4743]: I1011 01:13:31.501449 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-59cbbb87f4-s7qvc"] Oct 11 01:13:31 crc kubenswrapper[4743]: I1011 01:13:31.502587 4743 scope.go:117] "RemoveContainer" containerID="79931a3b1a2a3de0f75bbddd6835b3e060312b1568e64c19ce7de9f33d285b62" Oct 11 01:13:31 crc kubenswrapper[4743]: E1011 01:13:31.503026 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79931a3b1a2a3de0f75bbddd6835b3e060312b1568e64c19ce7de9f33d285b62\": container with ID starting with 79931a3b1a2a3de0f75bbddd6835b3e060312b1568e64c19ce7de9f33d285b62 not found: ID does not exist" containerID="79931a3b1a2a3de0f75bbddd6835b3e060312b1568e64c19ce7de9f33d285b62" Oct 11 01:13:31 crc kubenswrapper[4743]: I1011 01:13:31.503057 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79931a3b1a2a3de0f75bbddd6835b3e060312b1568e64c19ce7de9f33d285b62"} err="failed to get container status \"79931a3b1a2a3de0f75bbddd6835b3e060312b1568e64c19ce7de9f33d285b62\": rpc error: code = NotFound desc = could not find container \"79931a3b1a2a3de0f75bbddd6835b3e060312b1568e64c19ce7de9f33d285b62\": container with ID starting with 79931a3b1a2a3de0f75bbddd6835b3e060312b1568e64c19ce7de9f33d285b62 not found: ID does not exist" Oct 11 01:13:31 crc kubenswrapper[4743]: I1011 01:13:31.503079 4743 
scope.go:117] "RemoveContainer" containerID="431a075a88c2da6003e5c1ad5469b0301c72d0144d978d2f1c306e7f3284e882" Oct 11 01:13:31 crc kubenswrapper[4743]: I1011 01:13:31.535684 4743 scope.go:117] "RemoveContainer" containerID="431a075a88c2da6003e5c1ad5469b0301c72d0144d978d2f1c306e7f3284e882" Oct 11 01:13:31 crc kubenswrapper[4743]: E1011 01:13:31.536680 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"431a075a88c2da6003e5c1ad5469b0301c72d0144d978d2f1c306e7f3284e882\": container with ID starting with 431a075a88c2da6003e5c1ad5469b0301c72d0144d978d2f1c306e7f3284e882 not found: ID does not exist" containerID="431a075a88c2da6003e5c1ad5469b0301c72d0144d978d2f1c306e7f3284e882" Oct 11 01:13:31 crc kubenswrapper[4743]: I1011 01:13:31.536722 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"431a075a88c2da6003e5c1ad5469b0301c72d0144d978d2f1c306e7f3284e882"} err="failed to get container status \"431a075a88c2da6003e5c1ad5469b0301c72d0144d978d2f1c306e7f3284e882\": rpc error: code = NotFound desc = could not find container \"431a075a88c2da6003e5c1ad5469b0301c72d0144d978d2f1c306e7f3284e882\": container with ID starting with 431a075a88c2da6003e5c1ad5469b0301c72d0144d978d2f1c306e7f3284e882 not found: ID does not exist" Oct 11 01:13:31 crc kubenswrapper[4743]: I1011 01:13:31.649204 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-744f6c8d8b-rxjq4"] Oct 11 01:13:31 crc kubenswrapper[4743]: I1011 01:13:31.660671 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-744f6c8d8b-rxjq4"] Oct 11 01:13:32 crc kubenswrapper[4743]: I1011 01:13:32.105373 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="503af092-8c8f-4bdb-a27c-a76f98794769" path="/var/lib/kubelet/pods/503af092-8c8f-4bdb-a27c-a76f98794769/volumes" Oct 11 01:13:32 crc kubenswrapper[4743]: I1011 01:13:32.106404 4743 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8844a903-e7bc-4d03-902d-981c12d7875e" path="/var/lib/kubelet/pods/8844a903-e7bc-4d03-902d-981c12d7875e/volumes" Oct 11 01:13:32 crc kubenswrapper[4743]: I1011 01:13:32.107437 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef875954-7f31-4d4d-acec-56789e002001" path="/var/lib/kubelet/pods/ef875954-7f31-4d4d-acec-56789e002001/volumes" Oct 11 01:13:32 crc kubenswrapper[4743]: I1011 01:13:32.404226 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5dbbddbc6d-2nprx" event={"ID":"e0b5e85f-5e3b-4230-861b-f124e542d8db","Type":"ContainerStarted","Data":"2a69e7c01f308f32893dd56a634a3b184443f10e672545a177361071da23e647"} Oct 11 01:13:32 crc kubenswrapper[4743]: I1011 01:13:32.408552 4743 scope.go:117] "RemoveContainer" containerID="1df48005e853e115bd2e7909fb83bacabef2cf77bfdeafcbb6bc89a86155b7d6" Oct 11 01:13:32 crc kubenswrapper[4743]: E1011 01:13:32.408803 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-5dc9695786-kqfsr_openstack(af1f59ec-ca6c-4b6b-afb9-c8e6dc21e52d)\"" pod="openstack/heat-api-5dc9695786-kqfsr" podUID="af1f59ec-ca6c-4b6b-afb9-c8e6dc21e52d" Oct 11 01:13:32 crc kubenswrapper[4743]: I1011 01:13:32.633961 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-5dc9695786-kqfsr" Oct 11 01:13:32 crc kubenswrapper[4743]: I1011 01:13:32.634021 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-api-5dc9695786-kqfsr" Oct 11 01:13:32 crc kubenswrapper[4743]: I1011 01:13:32.658670 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-cfnapi-5dbbddbc6d-2nprx" Oct 11 01:13:33 crc kubenswrapper[4743]: I1011 01:13:33.419461 4743 generic.go:334] "Generic (PLEG): container 
finished" podID="e0b5e85f-5e3b-4230-861b-f124e542d8db" containerID="2a69e7c01f308f32893dd56a634a3b184443f10e672545a177361071da23e647" exitCode=1 Oct 11 01:13:33 crc kubenswrapper[4743]: I1011 01:13:33.419564 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5dbbddbc6d-2nprx" event={"ID":"e0b5e85f-5e3b-4230-861b-f124e542d8db","Type":"ContainerDied","Data":"2a69e7c01f308f32893dd56a634a3b184443f10e672545a177361071da23e647"} Oct 11 01:13:33 crc kubenswrapper[4743]: I1011 01:13:33.419655 4743 scope.go:117] "RemoveContainer" containerID="588f770caab600650e088e77d621a5223a675ec40d792a23450230f3aeda1ea8" Oct 11 01:13:33 crc kubenswrapper[4743]: I1011 01:13:33.420068 4743 scope.go:117] "RemoveContainer" containerID="1df48005e853e115bd2e7909fb83bacabef2cf77bfdeafcbb6bc89a86155b7d6" Oct 11 01:13:33 crc kubenswrapper[4743]: E1011 01:13:33.420364 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-5dc9695786-kqfsr_openstack(af1f59ec-ca6c-4b6b-afb9-c8e6dc21e52d)\"" pod="openstack/heat-api-5dc9695786-kqfsr" podUID="af1f59ec-ca6c-4b6b-afb9-c8e6dc21e52d" Oct 11 01:13:33 crc kubenswrapper[4743]: I1011 01:13:33.420543 4743 scope.go:117] "RemoveContainer" containerID="2a69e7c01f308f32893dd56a634a3b184443f10e672545a177361071da23e647" Oct 11 01:13:33 crc kubenswrapper[4743]: E1011 01:13:33.421002 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-5dbbddbc6d-2nprx_openstack(e0b5e85f-5e3b-4230-861b-f124e542d8db)\"" pod="openstack/heat-cfnapi-5dbbddbc6d-2nprx" podUID="e0b5e85f-5e3b-4230-861b-f124e542d8db" Oct 11 01:13:34 crc kubenswrapper[4743]: I1011 01:13:34.446327 4743 scope.go:117] "RemoveContainer" 
containerID="2a69e7c01f308f32893dd56a634a3b184443f10e672545a177361071da23e647" Oct 11 01:13:34 crc kubenswrapper[4743]: E1011 01:13:34.447140 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-5dbbddbc6d-2nprx_openstack(e0b5e85f-5e3b-4230-861b-f124e542d8db)\"" pod="openstack/heat-cfnapi-5dbbddbc6d-2nprx" podUID="e0b5e85f-5e3b-4230-861b-f124e542d8db" Oct 11 01:13:34 crc kubenswrapper[4743]: I1011 01:13:34.452023 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f","Type":"ContainerStarted","Data":"a87a6857e891645b6359c9dd35207ef767024de9d3eec1b0b40667a6c7ac7bd8"} Oct 11 01:13:34 crc kubenswrapper[4743]: I1011 01:13:34.452243 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f" containerName="ceilometer-central-agent" containerID="cri-o://a408fbe7dcffa686789536eab5166c077dac6c7b88aa79bf50345b509e3f9f06" gracePeriod=30 Oct 11 01:13:34 crc kubenswrapper[4743]: I1011 01:13:34.452375 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 11 01:13:34 crc kubenswrapper[4743]: I1011 01:13:34.452447 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f" containerName="proxy-httpd" containerID="cri-o://a87a6857e891645b6359c9dd35207ef767024de9d3eec1b0b40667a6c7ac7bd8" gracePeriod=30 Oct 11 01:13:34 crc kubenswrapper[4743]: I1011 01:13:34.452532 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f" containerName="sg-core" containerID="cri-o://2bf7ce61f9b0e51face763cac5e2585613a9827439d2b87516169a3ed522be48" 
gracePeriod=30 Oct 11 01:13:34 crc kubenswrapper[4743]: I1011 01:13:34.452608 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f" containerName="ceilometer-notification-agent" containerID="cri-o://02a33364cb7dcca44f2626b9d3c768a1284c1f443c6199ee6a1a01875d1c3528" gracePeriod=30 Oct 11 01:13:34 crc kubenswrapper[4743]: I1011 01:13:34.494722 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.357781976 podStartE2EDuration="13.494705539s" podCreationTimestamp="2025-10-11 01:13:21 +0000 UTC" firstStartedPulling="2025-10-11 01:13:21.9415997 +0000 UTC m=+1296.594580097" lastFinishedPulling="2025-10-11 01:13:34.078523223 +0000 UTC m=+1308.731503660" observedRunningTime="2025-10-11 01:13:34.488166108 +0000 UTC m=+1309.141146505" watchObservedRunningTime="2025-10-11 01:13:34.494705539 +0000 UTC m=+1309.147685936" Oct 11 01:13:35 crc kubenswrapper[4743]: I1011 01:13:35.053684 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-6c678b5cf4-bmh48" Oct 11 01:13:35 crc kubenswrapper[4743]: I1011 01:13:35.462260 4743 generic.go:334] "Generic (PLEG): container finished" podID="8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f" containerID="2bf7ce61f9b0e51face763cac5e2585613a9827439d2b87516169a3ed522be48" exitCode=2 Oct 11 01:13:35 crc kubenswrapper[4743]: I1011 01:13:35.462290 4743 generic.go:334] "Generic (PLEG): container finished" podID="8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f" containerID="02a33364cb7dcca44f2626b9d3c768a1284c1f443c6199ee6a1a01875d1c3528" exitCode=0 Oct 11 01:13:35 crc kubenswrapper[4743]: I1011 01:13:35.462297 4743 generic.go:334] "Generic (PLEG): container finished" podID="8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f" containerID="a408fbe7dcffa686789536eab5166c077dac6c7b88aa79bf50345b509e3f9f06" exitCode=0 Oct 11 01:13:35 crc kubenswrapper[4743]: I1011 01:13:35.462317 
4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f","Type":"ContainerDied","Data":"2bf7ce61f9b0e51face763cac5e2585613a9827439d2b87516169a3ed522be48"} Oct 11 01:13:35 crc kubenswrapper[4743]: I1011 01:13:35.462341 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f","Type":"ContainerDied","Data":"02a33364cb7dcca44f2626b9d3c768a1284c1f443c6199ee6a1a01875d1c3528"} Oct 11 01:13:35 crc kubenswrapper[4743]: I1011 01:13:35.462351 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f","Type":"ContainerDied","Data":"a408fbe7dcffa686789536eab5166c077dac6c7b88aa79bf50345b509e3f9f06"} Oct 11 01:13:37 crc kubenswrapper[4743]: I1011 01:13:37.658960 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-cfnapi-5dbbddbc6d-2nprx" Oct 11 01:13:37 crc kubenswrapper[4743]: I1011 01:13:37.659519 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-5dbbddbc6d-2nprx" Oct 11 01:13:37 crc kubenswrapper[4743]: I1011 01:13:37.660417 4743 scope.go:117] "RemoveContainer" containerID="2a69e7c01f308f32893dd56a634a3b184443f10e672545a177361071da23e647" Oct 11 01:13:37 crc kubenswrapper[4743]: E1011 01:13:37.660798 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-5dbbddbc6d-2nprx_openstack(e0b5e85f-5e3b-4230-861b-f124e542d8db)\"" pod="openstack/heat-cfnapi-5dbbddbc6d-2nprx" podUID="e0b5e85f-5e3b-4230-861b-f124e542d8db" Oct 11 01:13:40 crc kubenswrapper[4743]: I1011 01:13:40.711372 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-698b7768c9-bwljp" Oct 11 01:13:40 crc 
kubenswrapper[4743]: I1011 01:13:40.803021 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-6877c7bb88-shzwl" Oct 11 01:13:40 crc kubenswrapper[4743]: I1011 01:13:40.809041 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-5dc9695786-kqfsr"] Oct 11 01:13:40 crc kubenswrapper[4743]: I1011 01:13:40.874032 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-5dbbddbc6d-2nprx"] Oct 11 01:13:41 crc kubenswrapper[4743]: I1011 01:13:41.520036 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5dc9695786-kqfsr" Oct 11 01:13:41 crc kubenswrapper[4743]: I1011 01:13:41.527699 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-5dbbddbc6d-2nprx" Oct 11 01:13:41 crc kubenswrapper[4743]: I1011 01:13:41.529008 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5dbbddbc6d-2nprx" event={"ID":"e0b5e85f-5e3b-4230-861b-f124e542d8db","Type":"ContainerDied","Data":"0516aafc275b32bb280cf6d320b07016239d44a16fafa5a2917515720201379c"} Oct 11 01:13:41 crc kubenswrapper[4743]: I1011 01:13:41.529084 4743 scope.go:117] "RemoveContainer" containerID="2a69e7c01f308f32893dd56a634a3b184443f10e672545a177361071da23e647" Oct 11 01:13:41 crc kubenswrapper[4743]: I1011 01:13:41.531479 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5dc9695786-kqfsr" event={"ID":"af1f59ec-ca6c-4b6b-afb9-c8e6dc21e52d","Type":"ContainerDied","Data":"286de206a4b61adee28d2f2fa77022c309f5151411c4345701b335ef09eee362"} Oct 11 01:13:41 crc kubenswrapper[4743]: I1011 01:13:41.531545 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-5dc9695786-kqfsr" Oct 11 01:13:41 crc kubenswrapper[4743]: I1011 01:13:41.557626 4743 scope.go:117] "RemoveContainer" containerID="1df48005e853e115bd2e7909fb83bacabef2cf77bfdeafcbb6bc89a86155b7d6" Oct 11 01:13:41 crc kubenswrapper[4743]: I1011 01:13:41.687730 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af1f59ec-ca6c-4b6b-afb9-c8e6dc21e52d-combined-ca-bundle\") pod \"af1f59ec-ca6c-4b6b-afb9-c8e6dc21e52d\" (UID: \"af1f59ec-ca6c-4b6b-afb9-c8e6dc21e52d\") " Oct 11 01:13:41 crc kubenswrapper[4743]: I1011 01:13:41.687812 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e0b5e85f-5e3b-4230-861b-f124e542d8db-config-data-custom\") pod \"e0b5e85f-5e3b-4230-861b-f124e542d8db\" (UID: \"e0b5e85f-5e3b-4230-861b-f124e542d8db\") " Oct 11 01:13:41 crc kubenswrapper[4743]: I1011 01:13:41.687907 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5p2q\" (UniqueName: \"kubernetes.io/projected/af1f59ec-ca6c-4b6b-afb9-c8e6dc21e52d-kube-api-access-g5p2q\") pod \"af1f59ec-ca6c-4b6b-afb9-c8e6dc21e52d\" (UID: \"af1f59ec-ca6c-4b6b-afb9-c8e6dc21e52d\") " Oct 11 01:13:41 crc kubenswrapper[4743]: I1011 01:13:41.687941 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af1f59ec-ca6c-4b6b-afb9-c8e6dc21e52d-config-data\") pod \"af1f59ec-ca6c-4b6b-afb9-c8e6dc21e52d\" (UID: \"af1f59ec-ca6c-4b6b-afb9-c8e6dc21e52d\") " Oct 11 01:13:41 crc kubenswrapper[4743]: I1011 01:13:41.688074 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0b5e85f-5e3b-4230-861b-f124e542d8db-config-data\") pod \"e0b5e85f-5e3b-4230-861b-f124e542d8db\" (UID: 
\"e0b5e85f-5e3b-4230-861b-f124e542d8db\") " Oct 11 01:13:41 crc kubenswrapper[4743]: I1011 01:13:41.688103 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/af1f59ec-ca6c-4b6b-afb9-c8e6dc21e52d-config-data-custom\") pod \"af1f59ec-ca6c-4b6b-afb9-c8e6dc21e52d\" (UID: \"af1f59ec-ca6c-4b6b-afb9-c8e6dc21e52d\") " Oct 11 01:13:41 crc kubenswrapper[4743]: I1011 01:13:41.688177 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xvql9\" (UniqueName: \"kubernetes.io/projected/e0b5e85f-5e3b-4230-861b-f124e542d8db-kube-api-access-xvql9\") pod \"e0b5e85f-5e3b-4230-861b-f124e542d8db\" (UID: \"e0b5e85f-5e3b-4230-861b-f124e542d8db\") " Oct 11 01:13:41 crc kubenswrapper[4743]: I1011 01:13:41.688255 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0b5e85f-5e3b-4230-861b-f124e542d8db-combined-ca-bundle\") pod \"e0b5e85f-5e3b-4230-861b-f124e542d8db\" (UID: \"e0b5e85f-5e3b-4230-861b-f124e542d8db\") " Oct 11 01:13:41 crc kubenswrapper[4743]: I1011 01:13:41.694482 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0b5e85f-5e3b-4230-861b-f124e542d8db-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e0b5e85f-5e3b-4230-861b-f124e542d8db" (UID: "e0b5e85f-5e3b-4230-861b-f124e542d8db"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:13:41 crc kubenswrapper[4743]: I1011 01:13:41.694505 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0b5e85f-5e3b-4230-861b-f124e542d8db-kube-api-access-xvql9" (OuterVolumeSpecName: "kube-api-access-xvql9") pod "e0b5e85f-5e3b-4230-861b-f124e542d8db" (UID: "e0b5e85f-5e3b-4230-861b-f124e542d8db"). InnerVolumeSpecName "kube-api-access-xvql9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:13:41 crc kubenswrapper[4743]: I1011 01:13:41.696818 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af1f59ec-ca6c-4b6b-afb9-c8e6dc21e52d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "af1f59ec-ca6c-4b6b-afb9-c8e6dc21e52d" (UID: "af1f59ec-ca6c-4b6b-afb9-c8e6dc21e52d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:13:41 crc kubenswrapper[4743]: I1011 01:13:41.698003 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af1f59ec-ca6c-4b6b-afb9-c8e6dc21e52d-kube-api-access-g5p2q" (OuterVolumeSpecName: "kube-api-access-g5p2q") pod "af1f59ec-ca6c-4b6b-afb9-c8e6dc21e52d" (UID: "af1f59ec-ca6c-4b6b-afb9-c8e6dc21e52d"). InnerVolumeSpecName "kube-api-access-g5p2q". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:13:41 crc kubenswrapper[4743]: I1011 01:13:41.720012 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0b5e85f-5e3b-4230-861b-f124e542d8db-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e0b5e85f-5e3b-4230-861b-f124e542d8db" (UID: "e0b5e85f-5e3b-4230-861b-f124e542d8db"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:13:41 crc kubenswrapper[4743]: I1011 01:13:41.721257 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af1f59ec-ca6c-4b6b-afb9-c8e6dc21e52d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "af1f59ec-ca6c-4b6b-afb9-c8e6dc21e52d" (UID: "af1f59ec-ca6c-4b6b-afb9-c8e6dc21e52d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:13:41 crc kubenswrapper[4743]: I1011 01:13:41.742348 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af1f59ec-ca6c-4b6b-afb9-c8e6dc21e52d-config-data" (OuterVolumeSpecName: "config-data") pod "af1f59ec-ca6c-4b6b-afb9-c8e6dc21e52d" (UID: "af1f59ec-ca6c-4b6b-afb9-c8e6dc21e52d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:13:41 crc kubenswrapper[4743]: I1011 01:13:41.749850 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0b5e85f-5e3b-4230-861b-f124e542d8db-config-data" (OuterVolumeSpecName: "config-data") pod "e0b5e85f-5e3b-4230-861b-f124e542d8db" (UID: "e0b5e85f-5e3b-4230-861b-f124e542d8db"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:13:41 crc kubenswrapper[4743]: I1011 01:13:41.790767 4743 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e0b5e85f-5e3b-4230-861b-f124e542d8db-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 11 01:13:41 crc kubenswrapper[4743]: I1011 01:13:41.790805 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g5p2q\" (UniqueName: \"kubernetes.io/projected/af1f59ec-ca6c-4b6b-afb9-c8e6dc21e52d-kube-api-access-g5p2q\") on node \"crc\" DevicePath \"\"" Oct 11 01:13:41 crc kubenswrapper[4743]: I1011 01:13:41.790844 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af1f59ec-ca6c-4b6b-afb9-c8e6dc21e52d-config-data\") on node \"crc\" DevicePath \"\"" Oct 11 01:13:41 crc kubenswrapper[4743]: I1011 01:13:41.790856 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0b5e85f-5e3b-4230-861b-f124e542d8db-config-data\") on node \"crc\" DevicePath \"\"" Oct 11 01:13:41 crc 
kubenswrapper[4743]: I1011 01:13:41.790883 4743 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/af1f59ec-ca6c-4b6b-afb9-c8e6dc21e52d-config-data-custom\") on node \"crc\" DevicePath \"\""
Oct 11 01:13:41 crc kubenswrapper[4743]: I1011 01:13:41.790896 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xvql9\" (UniqueName: \"kubernetes.io/projected/e0b5e85f-5e3b-4230-861b-f124e542d8db-kube-api-access-xvql9\") on node \"crc\" DevicePath \"\""
Oct 11 01:13:41 crc kubenswrapper[4743]: I1011 01:13:41.790907 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0b5e85f-5e3b-4230-861b-f124e542d8db-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 11 01:13:41 crc kubenswrapper[4743]: I1011 01:13:41.790918 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af1f59ec-ca6c-4b6b-afb9-c8e6dc21e52d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 11 01:13:41 crc kubenswrapper[4743]: I1011 01:13:41.883184 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-5dc9695786-kqfsr"]
Oct 11 01:13:41 crc kubenswrapper[4743]: I1011 01:13:41.885946 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-5dc9695786-kqfsr"]
Oct 11 01:13:42 crc kubenswrapper[4743]: I1011 01:13:42.105241 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af1f59ec-ca6c-4b6b-afb9-c8e6dc21e52d" path="/var/lib/kubelet/pods/af1f59ec-ca6c-4b6b-afb9-c8e6dc21e52d/volumes"
Oct 11 01:13:42 crc kubenswrapper[4743]: I1011 01:13:42.542924 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-5dbbddbc6d-2nprx"
Oct 11 01:13:42 crc kubenswrapper[4743]: I1011 01:13:42.547299 4743 generic.go:334] "Generic (PLEG): container finished" podID="69e15d6c-dacd-4466-aac4-050cda6242aa" containerID="6c415f2f966cf8502f34e945ecfacf3c5b60792adcb24042de362843942dc10c" exitCode=0
Oct 11 01:13:42 crc kubenswrapper[4743]: I1011 01:13:42.547410 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-89lgh" event={"ID":"69e15d6c-dacd-4466-aac4-050cda6242aa","Type":"ContainerDied","Data":"6c415f2f966cf8502f34e945ecfacf3c5b60792adcb24042de362843942dc10c"}
Oct 11 01:13:42 crc kubenswrapper[4743]: I1011 01:13:42.566583 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-5dbbddbc6d-2nprx"]
Oct 11 01:13:42 crc kubenswrapper[4743]: I1011 01:13:42.577523 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-5dbbddbc6d-2nprx"]
Oct 11 01:13:42 crc kubenswrapper[4743]: I1011 01:13:42.621896 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-fdd7c75fc-rtmvl"
Oct 11 01:13:42 crc kubenswrapper[4743]: I1011 01:13:42.669440 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-6c678b5cf4-bmh48"]
Oct 11 01:13:42 crc kubenswrapper[4743]: I1011 01:13:42.669638 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-engine-6c678b5cf4-bmh48" podUID="af4301e6-88e1-4694-85a7-1215badf534d" containerName="heat-engine" containerID="cri-o://8c0dd7350e0076132704f385fde81eb45a4f8bb25fde0b3984cb36baadd16b1b" gracePeriod=60
Oct 11 01:13:44 crc kubenswrapper[4743]: I1011 01:13:44.063632 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-89lgh"
Oct 11 01:13:44 crc kubenswrapper[4743]: I1011 01:13:44.128971 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0b5e85f-5e3b-4230-861b-f124e542d8db" path="/var/lib/kubelet/pods/e0b5e85f-5e3b-4230-861b-f124e542d8db/volumes"
Oct 11 01:13:44 crc kubenswrapper[4743]: I1011 01:13:44.258298 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5wv5\" (UniqueName: \"kubernetes.io/projected/69e15d6c-dacd-4466-aac4-050cda6242aa-kube-api-access-b5wv5\") pod \"69e15d6c-dacd-4466-aac4-050cda6242aa\" (UID: \"69e15d6c-dacd-4466-aac4-050cda6242aa\") "
Oct 11 01:13:44 crc kubenswrapper[4743]: I1011 01:13:44.258468 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69e15d6c-dacd-4466-aac4-050cda6242aa-scripts\") pod \"69e15d6c-dacd-4466-aac4-050cda6242aa\" (UID: \"69e15d6c-dacd-4466-aac4-050cda6242aa\") "
Oct 11 01:13:44 crc kubenswrapper[4743]: I1011 01:13:44.258700 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69e15d6c-dacd-4466-aac4-050cda6242aa-combined-ca-bundle\") pod \"69e15d6c-dacd-4466-aac4-050cda6242aa\" (UID: \"69e15d6c-dacd-4466-aac4-050cda6242aa\") "
Oct 11 01:13:44 crc kubenswrapper[4743]: I1011 01:13:44.259311 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69e15d6c-dacd-4466-aac4-050cda6242aa-config-data\") pod \"69e15d6c-dacd-4466-aac4-050cda6242aa\" (UID: \"69e15d6c-dacd-4466-aac4-050cda6242aa\") "
Oct 11 01:13:44 crc kubenswrapper[4743]: I1011 01:13:44.279041 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69e15d6c-dacd-4466-aac4-050cda6242aa-scripts" (OuterVolumeSpecName: "scripts") pod "69e15d6c-dacd-4466-aac4-050cda6242aa" (UID: "69e15d6c-dacd-4466-aac4-050cda6242aa"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 11 01:13:44 crc kubenswrapper[4743]: I1011 01:13:44.281185 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69e15d6c-dacd-4466-aac4-050cda6242aa-kube-api-access-b5wv5" (OuterVolumeSpecName: "kube-api-access-b5wv5") pod "69e15d6c-dacd-4466-aac4-050cda6242aa" (UID: "69e15d6c-dacd-4466-aac4-050cda6242aa"). InnerVolumeSpecName "kube-api-access-b5wv5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 11 01:13:44 crc kubenswrapper[4743]: I1011 01:13:44.308003 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69e15d6c-dacd-4466-aac4-050cda6242aa-config-data" (OuterVolumeSpecName: "config-data") pod "69e15d6c-dacd-4466-aac4-050cda6242aa" (UID: "69e15d6c-dacd-4466-aac4-050cda6242aa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 11 01:13:44 crc kubenswrapper[4743]: I1011 01:13:44.315214 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69e15d6c-dacd-4466-aac4-050cda6242aa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "69e15d6c-dacd-4466-aac4-050cda6242aa" (UID: "69e15d6c-dacd-4466-aac4-050cda6242aa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 11 01:13:44 crc kubenswrapper[4743]: I1011 01:13:44.365391 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5wv5\" (UniqueName: \"kubernetes.io/projected/69e15d6c-dacd-4466-aac4-050cda6242aa-kube-api-access-b5wv5\") on node \"crc\" DevicePath \"\""
Oct 11 01:13:44 crc kubenswrapper[4743]: I1011 01:13:44.365444 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69e15d6c-dacd-4466-aac4-050cda6242aa-scripts\") on node \"crc\" DevicePath \"\""
Oct 11 01:13:44 crc kubenswrapper[4743]: I1011 01:13:44.365455 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69e15d6c-dacd-4466-aac4-050cda6242aa-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 11 01:13:44 crc kubenswrapper[4743]: I1011 01:13:44.365464 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69e15d6c-dacd-4466-aac4-050cda6242aa-config-data\") on node \"crc\" DevicePath \"\""
Oct 11 01:13:44 crc kubenswrapper[4743]: I1011 01:13:44.458915 4743 patch_prober.go:28] interesting pod/machine-config-daemon-cvm72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 11 01:13:44 crc kubenswrapper[4743]: I1011 01:13:44.458974 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 11 01:13:44 crc kubenswrapper[4743]: I1011 01:13:44.459030 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cvm72"
Oct 11 01:13:44 crc kubenswrapper[4743]: I1011 01:13:44.459797 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bdc2fd3e645a7f36140a058209779fdbf1154f0849a37453796b08adc03a7cc1"} pod="openshift-machine-config-operator/machine-config-daemon-cvm72" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 11 01:13:44 crc kubenswrapper[4743]: I1011 01:13:44.459854 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" containerID="cri-o://bdc2fd3e645a7f36140a058209779fdbf1154f0849a37453796b08adc03a7cc1" gracePeriod=600
Oct 11 01:13:44 crc kubenswrapper[4743]: I1011 01:13:44.572682 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-89lgh" event={"ID":"69e15d6c-dacd-4466-aac4-050cda6242aa","Type":"ContainerDied","Data":"893e2c5dfbe80b1c351c0a5ff5017a062d0333b20e215236b783db626b9b27b9"}
Oct 11 01:13:44 crc kubenswrapper[4743]: I1011 01:13:44.572726 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="893e2c5dfbe80b1c351c0a5ff5017a062d0333b20e215236b783db626b9b27b9"
Oct 11 01:13:44 crc kubenswrapper[4743]: I1011 01:13:44.572748 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-89lgh"
Oct 11 01:13:44 crc kubenswrapper[4743]: I1011 01:13:44.663261 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"]
Oct 11 01:13:44 crc kubenswrapper[4743]: E1011 01:13:44.664006 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0b5e85f-5e3b-4230-861b-f124e542d8db" containerName="heat-cfnapi"
Oct 11 01:13:44 crc kubenswrapper[4743]: I1011 01:13:44.664131 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0b5e85f-5e3b-4230-861b-f124e542d8db" containerName="heat-cfnapi"
Oct 11 01:13:44 crc kubenswrapper[4743]: E1011 01:13:44.664204 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69e15d6c-dacd-4466-aac4-050cda6242aa" containerName="nova-cell0-conductor-db-sync"
Oct 11 01:13:44 crc kubenswrapper[4743]: I1011 01:13:44.664268 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="69e15d6c-dacd-4466-aac4-050cda6242aa" containerName="nova-cell0-conductor-db-sync"
Oct 11 01:13:44 crc kubenswrapper[4743]: E1011 01:13:44.664372 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef875954-7f31-4d4d-acec-56789e002001" containerName="init"
Oct 11 01:13:44 crc kubenswrapper[4743]: I1011 01:13:44.664440 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef875954-7f31-4d4d-acec-56789e002001" containerName="init"
Oct 11 01:13:44 crc kubenswrapper[4743]: E1011 01:13:44.664519 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af1f59ec-ca6c-4b6b-afb9-c8e6dc21e52d" containerName="heat-api"
Oct 11 01:13:44 crc kubenswrapper[4743]: I1011 01:13:44.664588 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="af1f59ec-ca6c-4b6b-afb9-c8e6dc21e52d" containerName="heat-api"
Oct 11 01:13:44 crc kubenswrapper[4743]: E1011 01:13:44.664659 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af1f59ec-ca6c-4b6b-afb9-c8e6dc21e52d" containerName="heat-api"
Oct 11 01:13:44 crc kubenswrapper[4743]: I1011 01:13:44.664728 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="af1f59ec-ca6c-4b6b-afb9-c8e6dc21e52d" containerName="heat-api"
Oct 11 01:13:44 crc kubenswrapper[4743]: E1011 01:13:44.664809 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef875954-7f31-4d4d-acec-56789e002001" containerName="dnsmasq-dns"
Oct 11 01:13:44 crc kubenswrapper[4743]: I1011 01:13:44.664898 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef875954-7f31-4d4d-acec-56789e002001" containerName="dnsmasq-dns"
Oct 11 01:13:44 crc kubenswrapper[4743]: E1011 01:13:44.664993 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8844a903-e7bc-4d03-902d-981c12d7875e" containerName="heat-api"
Oct 11 01:13:44 crc kubenswrapper[4743]: I1011 01:13:44.665063 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="8844a903-e7bc-4d03-902d-981c12d7875e" containerName="heat-api"
Oct 11 01:13:44 crc kubenswrapper[4743]: E1011 01:13:44.665145 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0b5e85f-5e3b-4230-861b-f124e542d8db" containerName="heat-cfnapi"
Oct 11 01:13:44 crc kubenswrapper[4743]: I1011 01:13:44.665214 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0b5e85f-5e3b-4230-861b-f124e542d8db" containerName="heat-cfnapi"
Oct 11 01:13:44 crc kubenswrapper[4743]: E1011 01:13:44.665293 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="503af092-8c8f-4bdb-a27c-a76f98794769" containerName="heat-cfnapi"
Oct 11 01:13:44 crc kubenswrapper[4743]: I1011 01:13:44.665363 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="503af092-8c8f-4bdb-a27c-a76f98794769" containerName="heat-cfnapi"
Oct 11 01:13:44 crc kubenswrapper[4743]: I1011 01:13:44.665666 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef875954-7f31-4d4d-acec-56789e002001" containerName="dnsmasq-dns"
Oct 11 01:13:44 crc kubenswrapper[4743]: I1011 01:13:44.665762 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="af1f59ec-ca6c-4b6b-afb9-c8e6dc21e52d" containerName="heat-api"
Oct 11 01:13:44 crc kubenswrapper[4743]: I1011 01:13:44.665861 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0b5e85f-5e3b-4230-861b-f124e542d8db" containerName="heat-cfnapi"
Oct 11 01:13:44 crc kubenswrapper[4743]: I1011 01:13:44.665970 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0b5e85f-5e3b-4230-861b-f124e542d8db" containerName="heat-cfnapi"
Oct 11 01:13:44 crc kubenswrapper[4743]: I1011 01:13:44.666056 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="503af092-8c8f-4bdb-a27c-a76f98794769" containerName="heat-cfnapi"
Oct 11 01:13:44 crc kubenswrapper[4743]: I1011 01:13:44.666152 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="69e15d6c-dacd-4466-aac4-050cda6242aa" containerName="nova-cell0-conductor-db-sync"
Oct 11 01:13:44 crc kubenswrapper[4743]: I1011 01:13:44.666232 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="8844a903-e7bc-4d03-902d-981c12d7875e" containerName="heat-api"
Oct 11 01:13:44 crc kubenswrapper[4743]: I1011 01:13:44.667217 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Oct 11 01:13:44 crc kubenswrapper[4743]: I1011 01:13:44.672083 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Oct 11 01:13:44 crc kubenswrapper[4743]: I1011 01:13:44.672713 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-7hvcg"
Oct 11 01:13:44 crc kubenswrapper[4743]: I1011 01:13:44.674561 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Oct 11 01:13:44 crc kubenswrapper[4743]: I1011 01:13:44.775344 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2ddaae7-747a-4f05-bc0f-4f69fc15b816-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"f2ddaae7-747a-4f05-bc0f-4f69fc15b816\") " pod="openstack/nova-cell0-conductor-0"
Oct 11 01:13:44 crc kubenswrapper[4743]: I1011 01:13:44.775743 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2ddaae7-747a-4f05-bc0f-4f69fc15b816-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"f2ddaae7-747a-4f05-bc0f-4f69fc15b816\") " pod="openstack/nova-cell0-conductor-0"
Oct 11 01:13:44 crc kubenswrapper[4743]: I1011 01:13:44.775787 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2h6t\" (UniqueName: \"kubernetes.io/projected/f2ddaae7-747a-4f05-bc0f-4f69fc15b816-kube-api-access-j2h6t\") pod \"nova-cell0-conductor-0\" (UID: \"f2ddaae7-747a-4f05-bc0f-4f69fc15b816\") " pod="openstack/nova-cell0-conductor-0"
Oct 11 01:13:44 crc kubenswrapper[4743]: I1011 01:13:44.878114 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2ddaae7-747a-4f05-bc0f-4f69fc15b816-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"f2ddaae7-747a-4f05-bc0f-4f69fc15b816\") " pod="openstack/nova-cell0-conductor-0"
Oct 11 01:13:44 crc kubenswrapper[4743]: I1011 01:13:44.878214 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2ddaae7-747a-4f05-bc0f-4f69fc15b816-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"f2ddaae7-747a-4f05-bc0f-4f69fc15b816\") " pod="openstack/nova-cell0-conductor-0"
Oct 11 01:13:44 crc kubenswrapper[4743]: I1011 01:13:44.878268 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2h6t\" (UniqueName: \"kubernetes.io/projected/f2ddaae7-747a-4f05-bc0f-4f69fc15b816-kube-api-access-j2h6t\") pod \"nova-cell0-conductor-0\" (UID: \"f2ddaae7-747a-4f05-bc0f-4f69fc15b816\") " pod="openstack/nova-cell0-conductor-0"
Oct 11 01:13:44 crc kubenswrapper[4743]: I1011 01:13:44.885898 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2ddaae7-747a-4f05-bc0f-4f69fc15b816-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"f2ddaae7-747a-4f05-bc0f-4f69fc15b816\") " pod="openstack/nova-cell0-conductor-0"
Oct 11 01:13:44 crc kubenswrapper[4743]: I1011 01:13:44.886433 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2ddaae7-747a-4f05-bc0f-4f69fc15b816-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"f2ddaae7-747a-4f05-bc0f-4f69fc15b816\") " pod="openstack/nova-cell0-conductor-0"
Oct 11 01:13:44 crc kubenswrapper[4743]: I1011 01:13:44.893878 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2h6t\" (UniqueName: \"kubernetes.io/projected/f2ddaae7-747a-4f05-bc0f-4f69fc15b816-kube-api-access-j2h6t\") pod \"nova-cell0-conductor-0\" (UID: \"f2ddaae7-747a-4f05-bc0f-4f69fc15b816\") " pod="openstack/nova-cell0-conductor-0"
Oct 11 01:13:45 crc kubenswrapper[4743]: I1011 01:13:45.001215 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Oct 11 01:13:45 crc kubenswrapper[4743]: E1011 01:13:45.007005 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8c0dd7350e0076132704f385fde81eb45a4f8bb25fde0b3984cb36baadd16b1b" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"]
Oct 11 01:13:45 crc kubenswrapper[4743]: E1011 01:13:45.010981 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8c0dd7350e0076132704f385fde81eb45a4f8bb25fde0b3984cb36baadd16b1b" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"]
Oct 11 01:13:45 crc kubenswrapper[4743]: E1011 01:13:45.013377 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8c0dd7350e0076132704f385fde81eb45a4f8bb25fde0b3984cb36baadd16b1b" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"]
Oct 11 01:13:45 crc kubenswrapper[4743]: E1011 01:13:45.013443 4743 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-6c678b5cf4-bmh48" podUID="af4301e6-88e1-4694-85a7-1215badf534d" containerName="heat-engine"
Oct 11 01:13:45 crc kubenswrapper[4743]: I1011 01:13:45.548615 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Oct 11 01:13:45 crc kubenswrapper[4743]: W1011 01:13:45.553538 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf2ddaae7_747a_4f05_bc0f_4f69fc15b816.slice/crio-40bd92adb6ca125f5e8876622b1f4b5d90514e874b405e0b6346def8d43d0c7f WatchSource:0}: Error finding container 40bd92adb6ca125f5e8876622b1f4b5d90514e874b405e0b6346def8d43d0c7f: Status 404 returned error can't find the container with id 40bd92adb6ca125f5e8876622b1f4b5d90514e874b405e0b6346def8d43d0c7f
Oct 11 01:13:45 crc kubenswrapper[4743]: I1011 01:13:45.590193 4743 generic.go:334] "Generic (PLEG): container finished" podID="add92263-e252-446b-95de-092585b4357f" containerID="bdc2fd3e645a7f36140a058209779fdbf1154f0849a37453796b08adc03a7cc1" exitCode=0
Oct 11 01:13:45 crc kubenswrapper[4743]: I1011 01:13:45.590270 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" event={"ID":"add92263-e252-446b-95de-092585b4357f","Type":"ContainerDied","Data":"bdc2fd3e645a7f36140a058209779fdbf1154f0849a37453796b08adc03a7cc1"}
Oct 11 01:13:45 crc kubenswrapper[4743]: I1011 01:13:45.590301 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" event={"ID":"add92263-e252-446b-95de-092585b4357f","Type":"ContainerStarted","Data":"4dd4c92bb1fe51db0d7814650823b2db5bc31ac1ef997677802e7324969fe0d1"}
Oct 11 01:13:45 crc kubenswrapper[4743]: I1011 01:13:45.590329 4743 scope.go:117] "RemoveContainer" containerID="d141abb12335a71090b8204b0a7206f68b485cc9db85f994938ef978a23ae624"
Oct 11 01:13:45 crc kubenswrapper[4743]: I1011 01:13:45.594499 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"f2ddaae7-747a-4f05-bc0f-4f69fc15b816","Type":"ContainerStarted","Data":"40bd92adb6ca125f5e8876622b1f4b5d90514e874b405e0b6346def8d43d0c7f"}
Oct 11 01:13:46 crc kubenswrapper[4743]: I1011 01:13:46.618470 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"f2ddaae7-747a-4f05-bc0f-4f69fc15b816","Type":"ContainerStarted","Data":"944a5b377f96b4d57484d11157244ba3369e1124f0ae1709cbb68b8887080aef"}
Oct 11 01:13:46 crc kubenswrapper[4743]: I1011 01:13:46.620148 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0"
Oct 11 01:13:46 crc kubenswrapper[4743]: I1011 01:13:46.661989 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.661971201 podStartE2EDuration="2.661971201s" podCreationTimestamp="2025-10-11 01:13:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 01:13:46.645737737 +0000 UTC m=+1321.298718144" watchObservedRunningTime="2025-10-11 01:13:46.661971201 +0000 UTC m=+1321.314951618"
Oct 11 01:13:50 crc kubenswrapper[4743]: I1011 01:13:50.033754 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0"
Oct 11 01:13:50 crc kubenswrapper[4743]: I1011 01:13:50.554919 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-jqs8c"]
Oct 11 01:13:50 crc kubenswrapper[4743]: I1011 01:13:50.555538 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="af1f59ec-ca6c-4b6b-afb9-c8e6dc21e52d" containerName="heat-api"
Oct 11 01:13:50 crc kubenswrapper[4743]: I1011 01:13:50.556204 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-jqs8c"
Oct 11 01:13:50 crc kubenswrapper[4743]: I1011 01:13:50.559751 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts"
Oct 11 01:13:50 crc kubenswrapper[4743]: I1011 01:13:50.559945 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data"
Oct 11 01:13:50 crc kubenswrapper[4743]: I1011 01:13:50.569956 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-jqs8c"]
Oct 11 01:13:50 crc kubenswrapper[4743]: I1011 01:13:50.674998 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Oct 11 01:13:50 crc kubenswrapper[4743]: I1011 01:13:50.679027 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 11 01:13:50 crc kubenswrapper[4743]: I1011 01:13:50.686540 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Oct 11 01:13:50 crc kubenswrapper[4743]: I1011 01:13:50.715618 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Oct 11 01:13:50 crc kubenswrapper[4743]: I1011 01:13:50.722255 4743 generic.go:334] "Generic (PLEG): container finished" podID="af4301e6-88e1-4694-85a7-1215badf534d" containerID="8c0dd7350e0076132704f385fde81eb45a4f8bb25fde0b3984cb36baadd16b1b" exitCode=0
Oct 11 01:13:50 crc kubenswrapper[4743]: I1011 01:13:50.722302 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6c678b5cf4-bmh48" event={"ID":"af4301e6-88e1-4694-85a7-1215badf534d","Type":"ContainerDied","Data":"8c0dd7350e0076132704f385fde81eb45a4f8bb25fde0b3984cb36baadd16b1b"}
Oct 11 01:13:50 crc kubenswrapper[4743]: I1011 01:13:50.723657 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70c3253b-404c-4db1-a0e5-f2f112e94c43-config-data\") pod \"nova-cell0-cell-mapping-jqs8c\" (UID: \"70c3253b-404c-4db1-a0e5-f2f112e94c43\") " pod="openstack/nova-cell0-cell-mapping-jqs8c"
Oct 11 01:13:50 crc kubenswrapper[4743]: I1011 01:13:50.723699 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gld77\" (UniqueName: \"kubernetes.io/projected/70c3253b-404c-4db1-a0e5-f2f112e94c43-kube-api-access-gld77\") pod \"nova-cell0-cell-mapping-jqs8c\" (UID: \"70c3253b-404c-4db1-a0e5-f2f112e94c43\") " pod="openstack/nova-cell0-cell-mapping-jqs8c"
Oct 11 01:13:50 crc kubenswrapper[4743]: I1011 01:13:50.723744 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70c3253b-404c-4db1-a0e5-f2f112e94c43-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-jqs8c\" (UID: \"70c3253b-404c-4db1-a0e5-f2f112e94c43\") " pod="openstack/nova-cell0-cell-mapping-jqs8c"
Oct 11 01:13:50 crc kubenswrapper[4743]: I1011 01:13:50.723784 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70c3253b-404c-4db1-a0e5-f2f112e94c43-scripts\") pod \"nova-cell0-cell-mapping-jqs8c\" (UID: \"70c3253b-404c-4db1-a0e5-f2f112e94c43\") " pod="openstack/nova-cell0-cell-mapping-jqs8c"
Oct 11 01:13:50 crc kubenswrapper[4743]: I1011 01:13:50.747353 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Oct 11 01:13:50 crc kubenswrapper[4743]: I1011 01:13:50.748708 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Oct 11 01:13:50 crc kubenswrapper[4743]: I1011 01:13:50.752144 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Oct 11 01:13:50 crc kubenswrapper[4743]: I1011 01:13:50.827116 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54f2bf58-66f4-4f64-9937-6cd0cc80403e-config-data\") pod \"nova-api-0\" (UID: \"54f2bf58-66f4-4f64-9937-6cd0cc80403e\") " pod="openstack/nova-api-0"
Oct 11 01:13:50 crc kubenswrapper[4743]: I1011 01:13:50.827179 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rllkv\" (UniqueName: \"kubernetes.io/projected/54f2bf58-66f4-4f64-9937-6cd0cc80403e-kube-api-access-rllkv\") pod \"nova-api-0\" (UID: \"54f2bf58-66f4-4f64-9937-6cd0cc80403e\") " pod="openstack/nova-api-0"
Oct 11 01:13:50 crc kubenswrapper[4743]: I1011 01:13:50.827231 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70c3253b-404c-4db1-a0e5-f2f112e94c43-config-data\") pod \"nova-cell0-cell-mapping-jqs8c\" (UID: \"70c3253b-404c-4db1-a0e5-f2f112e94c43\") " pod="openstack/nova-cell0-cell-mapping-jqs8c"
Oct 11 01:13:50 crc kubenswrapper[4743]: I1011 01:13:50.827260 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54f2bf58-66f4-4f64-9937-6cd0cc80403e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"54f2bf58-66f4-4f64-9937-6cd0cc80403e\") " pod="openstack/nova-api-0"
Oct 11 01:13:50 crc kubenswrapper[4743]: I1011 01:13:50.831402 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gld77\" (UniqueName: \"kubernetes.io/projected/70c3253b-404c-4db1-a0e5-f2f112e94c43-kube-api-access-gld77\") pod \"nova-cell0-cell-mapping-jqs8c\" (UID: \"70c3253b-404c-4db1-a0e5-f2f112e94c43\") " pod="openstack/nova-cell0-cell-mapping-jqs8c"
Oct 11 01:13:50 crc kubenswrapper[4743]: I1011 01:13:50.831492 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70c3253b-404c-4db1-a0e5-f2f112e94c43-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-jqs8c\" (UID: \"70c3253b-404c-4db1-a0e5-f2f112e94c43\") " pod="openstack/nova-cell0-cell-mapping-jqs8c"
Oct 11 01:13:50 crc kubenswrapper[4743]: I1011 01:13:50.831557 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54f2bf58-66f4-4f64-9937-6cd0cc80403e-logs\") pod \"nova-api-0\" (UID: \"54f2bf58-66f4-4f64-9937-6cd0cc80403e\") " pod="openstack/nova-api-0"
Oct 11 01:13:50 crc kubenswrapper[4743]: I1011 01:13:50.831599 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70c3253b-404c-4db1-a0e5-f2f112e94c43-scripts\") pod \"nova-cell0-cell-mapping-jqs8c\" (UID: \"70c3253b-404c-4db1-a0e5-f2f112e94c43\") " pod="openstack/nova-cell0-cell-mapping-jqs8c"
Oct 11 01:13:50 crc kubenswrapper[4743]: I1011 01:13:50.857323 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70c3253b-404c-4db1-a0e5-f2f112e94c43-scripts\") pod \"nova-cell0-cell-mapping-jqs8c\" (UID: \"70c3253b-404c-4db1-a0e5-f2f112e94c43\") " pod="openstack/nova-cell0-cell-mapping-jqs8c"
Oct 11 01:13:50 crc kubenswrapper[4743]: I1011 01:13:50.868131 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Oct 11 01:13:50 crc kubenswrapper[4743]: I1011 01:13:50.888035 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70c3253b-404c-4db1-a0e5-f2f112e94c43-config-data\") pod \"nova-cell0-cell-mapping-jqs8c\" (UID: \"70c3253b-404c-4db1-a0e5-f2f112e94c43\") " pod="openstack/nova-cell0-cell-mapping-jqs8c"
Oct 11 01:13:50 crc kubenswrapper[4743]: I1011 01:13:50.889420 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70c3253b-404c-4db1-a0e5-f2f112e94c43-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-jqs8c\" (UID: \"70c3253b-404c-4db1-a0e5-f2f112e94c43\") " pod="openstack/nova-cell0-cell-mapping-jqs8c"
Oct 11 01:13:50 crc kubenswrapper[4743]: I1011 01:13:50.891123 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Oct 11 01:13:50 crc kubenswrapper[4743]: I1011 01:13:50.893621 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 11 01:13:50 crc kubenswrapper[4743]: I1011 01:13:50.897228 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Oct 11 01:13:50 crc kubenswrapper[4743]: I1011 01:13:50.898425 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gld77\" (UniqueName: \"kubernetes.io/projected/70c3253b-404c-4db1-a0e5-f2f112e94c43-kube-api-access-gld77\") pod \"nova-cell0-cell-mapping-jqs8c\" (UID: \"70c3253b-404c-4db1-a0e5-f2f112e94c43\") " pod="openstack/nova-cell0-cell-mapping-jqs8c"
Oct 11 01:13:50 crc kubenswrapper[4743]: I1011 01:13:50.924568 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Oct 11 01:13:50 crc kubenswrapper[4743]: I1011 01:13:50.935067 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54f2bf58-66f4-4f64-9937-6cd0cc80403e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"54f2bf58-66f4-4f64-9937-6cd0cc80403e\") " pod="openstack/nova-api-0"
Oct 11 01:13:50 crc kubenswrapper[4743]: I1011 01:13:50.935163 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33029235-8b5f-4193-9760-c14a1f6de702-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"33029235-8b5f-4193-9760-c14a1f6de702\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 11 01:13:50 crc kubenswrapper[4743]: I1011 01:13:50.935196 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54f2bf58-66f4-4f64-9937-6cd0cc80403e-logs\") pod \"nova-api-0\" (UID: \"54f2bf58-66f4-4f64-9937-6cd0cc80403e\") " pod="openstack/nova-api-0"
Oct 11 01:13:50 crc kubenswrapper[4743]: I1011 01:13:50.935226 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6fw8\" (UniqueName: \"kubernetes.io/projected/33029235-8b5f-4193-9760-c14a1f6de702-kube-api-access-h6fw8\") pod \"nova-cell1-novncproxy-0\" (UID: \"33029235-8b5f-4193-9760-c14a1f6de702\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 11 01:13:50 crc kubenswrapper[4743]: I1011 01:13:50.935299 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33029235-8b5f-4193-9760-c14a1f6de702-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"33029235-8b5f-4193-9760-c14a1f6de702\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 11 01:13:50 crc kubenswrapper[4743]: I1011 01:13:50.935321 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54f2bf58-66f4-4f64-9937-6cd0cc80403e-config-data\") pod \"nova-api-0\" (UID: \"54f2bf58-66f4-4f64-9937-6cd0cc80403e\") " pod="openstack/nova-api-0"
Oct 11 01:13:50 crc kubenswrapper[4743]: I1011 01:13:50.935337 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rllkv\" (UniqueName: \"kubernetes.io/projected/54f2bf58-66f4-4f64-9937-6cd0cc80403e-kube-api-access-rllkv\") pod \"nova-api-0\" (UID: \"54f2bf58-66f4-4f64-9937-6cd0cc80403e\") " pod="openstack/nova-api-0"
Oct 11 01:13:50 crc kubenswrapper[4743]: I1011 01:13:50.937265 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54f2bf58-66f4-4f64-9937-6cd0cc80403e-logs\") pod \"nova-api-0\" (UID: \"54f2bf58-66f4-4f64-9937-6cd0cc80403e\") " pod="openstack/nova-api-0"
Oct 11 01:13:50 crc kubenswrapper[4743]: I1011 01:13:50.942462 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54f2bf58-66f4-4f64-9937-6cd0cc80403e-config-data\") pod \"nova-api-0\" (UID: \"54f2bf58-66f4-4f64-9937-6cd0cc80403e\") " pod="openstack/nova-api-0"
Oct 11 01:13:50 crc kubenswrapper[4743]: I1011 01:13:50.952789 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54f2bf58-66f4-4f64-9937-6cd0cc80403e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"54f2bf58-66f4-4f64-9937-6cd0cc80403e\") " pod="openstack/nova-api-0"
Oct 11 01:13:50 crc kubenswrapper[4743]: I1011 01:13:50.972968 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rllkv\" (UniqueName: \"kubernetes.io/projected/54f2bf58-66f4-4f64-9937-6cd0cc80403e-kube-api-access-rllkv\") pod \"nova-api-0\" (UID: \"54f2bf58-66f4-4f64-9937-6cd0cc80403e\") " pod="openstack/nova-api-0"
Oct 11 01:13:50 crc kubenswrapper[4743]: I1011 01:13:50.974215 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Oct 11 01:13:50 crc kubenswrapper[4743]: I1011 01:13:50.975619 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Oct 11 01:13:50 crc kubenswrapper[4743]: I1011 01:13:50.980796 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Oct 11 01:13:51 crc kubenswrapper[4743]: I1011 01:13:51.004662 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Oct 11 01:13:51 crc kubenswrapper[4743]: I1011 01:13:51.025123 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-568d7fd7cf-4jblv"]
Oct 11 01:13:51 crc kubenswrapper[4743]: I1011 01:13:51.030652 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-568d7fd7cf-4jblv"
Oct 11 01:13:51 crc kubenswrapper[4743]: I1011 01:13:51.039384 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33029235-8b5f-4193-9760-c14a1f6de702-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"33029235-8b5f-4193-9760-c14a1f6de702\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 11 01:13:51 crc kubenswrapper[4743]: I1011 01:13:51.039442 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fe9ad7b-9381-4d88-93cf-2ad81eb684d7-config-data\") pod \"nova-metadata-0\" (UID: \"2fe9ad7b-9381-4d88-93cf-2ad81eb684d7\") " pod="openstack/nova-metadata-0"
Oct 11 01:13:51 crc kubenswrapper[4743]: I1011 01:13:51.039466 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6fw8\" (UniqueName: \"kubernetes.io/projected/33029235-8b5f-4193-9760-c14a1f6de702-kube-api-access-h6fw8\") pod \"nova-cell1-novncproxy-0\" (UID: \"33029235-8b5f-4193-9760-c14a1f6de702\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 11 01:13:51 crc kubenswrapper[4743]: I1011 01:13:51.039482 4743 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2fe9ad7b-9381-4d88-93cf-2ad81eb684d7-logs\") pod \"nova-metadata-0\" (UID: \"2fe9ad7b-9381-4d88-93cf-2ad81eb684d7\") " pod="openstack/nova-metadata-0" Oct 11 01:13:51 crc kubenswrapper[4743]: I1011 01:13:51.039553 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33029235-8b5f-4193-9760-c14a1f6de702-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"33029235-8b5f-4193-9760-c14a1f6de702\") " pod="openstack/nova-cell1-novncproxy-0" Oct 11 01:13:51 crc kubenswrapper[4743]: I1011 01:13:51.039617 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fe9ad7b-9381-4d88-93cf-2ad81eb684d7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2fe9ad7b-9381-4d88-93cf-2ad81eb684d7\") " pod="openstack/nova-metadata-0" Oct 11 01:13:51 crc kubenswrapper[4743]: I1011 01:13:51.039699 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t78t9\" (UniqueName: \"kubernetes.io/projected/2fe9ad7b-9381-4d88-93cf-2ad81eb684d7-kube-api-access-t78t9\") pod \"nova-metadata-0\" (UID: \"2fe9ad7b-9381-4d88-93cf-2ad81eb684d7\") " pod="openstack/nova-metadata-0" Oct 11 01:13:51 crc kubenswrapper[4743]: I1011 01:13:51.044188 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33029235-8b5f-4193-9760-c14a1f6de702-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"33029235-8b5f-4193-9760-c14a1f6de702\") " pod="openstack/nova-cell1-novncproxy-0" Oct 11 01:13:51 crc kubenswrapper[4743]: I1011 01:13:51.049301 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 11 01:13:51 crc kubenswrapper[4743]: I1011 01:13:51.054055 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-568d7fd7cf-4jblv"] Oct 11 01:13:51 crc kubenswrapper[4743]: I1011 01:13:51.063504 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33029235-8b5f-4193-9760-c14a1f6de702-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"33029235-8b5f-4193-9760-c14a1f6de702\") " pod="openstack/nova-cell1-novncproxy-0" Oct 11 01:13:51 crc kubenswrapper[4743]: I1011 01:13:51.065419 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6fw8\" (UniqueName: \"kubernetes.io/projected/33029235-8b5f-4193-9760-c14a1f6de702-kube-api-access-h6fw8\") pod \"nova-cell1-novncproxy-0\" (UID: \"33029235-8b5f-4193-9760-c14a1f6de702\") " pod="openstack/nova-cell1-novncproxy-0" Oct 11 01:13:51 crc kubenswrapper[4743]: I1011 01:13:51.111130 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 11 01:13:51 crc kubenswrapper[4743]: I1011 01:13:51.142398 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fe9ad7b-9381-4d88-93cf-2ad81eb684d7-config-data\") pod \"nova-metadata-0\" (UID: \"2fe9ad7b-9381-4d88-93cf-2ad81eb684d7\") " pod="openstack/nova-metadata-0" Oct 11 01:13:51 crc kubenswrapper[4743]: I1011 01:13:51.142434 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2fe9ad7b-9381-4d88-93cf-2ad81eb684d7-logs\") pod \"nova-metadata-0\" (UID: \"2fe9ad7b-9381-4d88-93cf-2ad81eb684d7\") " pod="openstack/nova-metadata-0" Oct 11 01:13:51 crc kubenswrapper[4743]: I1011 01:13:51.142460 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e-dns-swift-storage-0\") pod \"dnsmasq-dns-568d7fd7cf-4jblv\" (UID: \"6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e\") " pod="openstack/dnsmasq-dns-568d7fd7cf-4jblv" Oct 11 01:13:51 crc kubenswrapper[4743]: I1011 01:13:51.142489 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e-ovsdbserver-nb\") pod \"dnsmasq-dns-568d7fd7cf-4jblv\" (UID: \"6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e\") " pod="openstack/dnsmasq-dns-568d7fd7cf-4jblv" Oct 11 01:13:51 crc kubenswrapper[4743]: I1011 01:13:51.142613 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fe9ad7b-9381-4d88-93cf-2ad81eb684d7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2fe9ad7b-9381-4d88-93cf-2ad81eb684d7\") " pod="openstack/nova-metadata-0" Oct 11 01:13:51 crc kubenswrapper[4743]: 
I1011 01:13:51.142661 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d441a96d-dada-477d-aa48-0b467c00d5a0-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d441a96d-dada-477d-aa48-0b467c00d5a0\") " pod="openstack/nova-scheduler-0" Oct 11 01:13:51 crc kubenswrapper[4743]: I1011 01:13:51.142700 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmcql\" (UniqueName: \"kubernetes.io/projected/d441a96d-dada-477d-aa48-0b467c00d5a0-kube-api-access-lmcql\") pod \"nova-scheduler-0\" (UID: \"d441a96d-dada-477d-aa48-0b467c00d5a0\") " pod="openstack/nova-scheduler-0" Oct 11 01:13:51 crc kubenswrapper[4743]: I1011 01:13:51.142731 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t78t9\" (UniqueName: \"kubernetes.io/projected/2fe9ad7b-9381-4d88-93cf-2ad81eb684d7-kube-api-access-t78t9\") pod \"nova-metadata-0\" (UID: \"2fe9ad7b-9381-4d88-93cf-2ad81eb684d7\") " pod="openstack/nova-metadata-0" Oct 11 01:13:51 crc kubenswrapper[4743]: I1011 01:13:51.142757 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e-dns-svc\") pod \"dnsmasq-dns-568d7fd7cf-4jblv\" (UID: \"6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e\") " pod="openstack/dnsmasq-dns-568d7fd7cf-4jblv" Oct 11 01:13:51 crc kubenswrapper[4743]: I1011 01:13:51.142784 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e-ovsdbserver-sb\") pod \"dnsmasq-dns-568d7fd7cf-4jblv\" (UID: \"6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e\") " pod="openstack/dnsmasq-dns-568d7fd7cf-4jblv" Oct 11 01:13:51 crc kubenswrapper[4743]: I1011 01:13:51.142809 4743 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e-config\") pod \"dnsmasq-dns-568d7fd7cf-4jblv\" (UID: \"6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e\") " pod="openstack/dnsmasq-dns-568d7fd7cf-4jblv" Oct 11 01:13:51 crc kubenswrapper[4743]: I1011 01:13:51.142828 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fb9p6\" (UniqueName: \"kubernetes.io/projected/6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e-kube-api-access-fb9p6\") pod \"dnsmasq-dns-568d7fd7cf-4jblv\" (UID: \"6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e\") " pod="openstack/dnsmasq-dns-568d7fd7cf-4jblv" Oct 11 01:13:51 crc kubenswrapper[4743]: I1011 01:13:51.142888 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d441a96d-dada-477d-aa48-0b467c00d5a0-config-data\") pod \"nova-scheduler-0\" (UID: \"d441a96d-dada-477d-aa48-0b467c00d5a0\") " pod="openstack/nova-scheduler-0" Oct 11 01:13:51 crc kubenswrapper[4743]: I1011 01:13:51.143908 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2fe9ad7b-9381-4d88-93cf-2ad81eb684d7-logs\") pod \"nova-metadata-0\" (UID: \"2fe9ad7b-9381-4d88-93cf-2ad81eb684d7\") " pod="openstack/nova-metadata-0" Oct 11 01:13:51 crc kubenswrapper[4743]: I1011 01:13:51.149180 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-6c678b5cf4-bmh48" Oct 11 01:13:51 crc kubenswrapper[4743]: I1011 01:13:51.150383 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fe9ad7b-9381-4d88-93cf-2ad81eb684d7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2fe9ad7b-9381-4d88-93cf-2ad81eb684d7\") " pod="openstack/nova-metadata-0" Oct 11 01:13:51 crc kubenswrapper[4743]: I1011 01:13:51.150990 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fe9ad7b-9381-4d88-93cf-2ad81eb684d7-config-data\") pod \"nova-metadata-0\" (UID: \"2fe9ad7b-9381-4d88-93cf-2ad81eb684d7\") " pod="openstack/nova-metadata-0" Oct 11 01:13:51 crc kubenswrapper[4743]: I1011 01:13:51.164317 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t78t9\" (UniqueName: \"kubernetes.io/projected/2fe9ad7b-9381-4d88-93cf-2ad81eb684d7-kube-api-access-t78t9\") pod \"nova-metadata-0\" (UID: \"2fe9ad7b-9381-4d88-93cf-2ad81eb684d7\") " pod="openstack/nova-metadata-0" Oct 11 01:13:51 crc kubenswrapper[4743]: I1011 01:13:51.178881 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-jqs8c" Oct 11 01:13:51 crc kubenswrapper[4743]: I1011 01:13:51.244696 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af4301e6-88e1-4694-85a7-1215badf534d-combined-ca-bundle\") pod \"af4301e6-88e1-4694-85a7-1215badf534d\" (UID: \"af4301e6-88e1-4694-85a7-1215badf534d\") " Oct 11 01:13:51 crc kubenswrapper[4743]: I1011 01:13:51.244771 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af4301e6-88e1-4694-85a7-1215badf534d-config-data\") pod \"af4301e6-88e1-4694-85a7-1215badf534d\" (UID: \"af4301e6-88e1-4694-85a7-1215badf534d\") " Oct 11 01:13:51 crc kubenswrapper[4743]: I1011 01:13:51.245556 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n8xjx\" (UniqueName: \"kubernetes.io/projected/af4301e6-88e1-4694-85a7-1215badf534d-kube-api-access-n8xjx\") pod \"af4301e6-88e1-4694-85a7-1215badf534d\" (UID: \"af4301e6-88e1-4694-85a7-1215badf534d\") " Oct 11 01:13:51 crc kubenswrapper[4743]: I1011 01:13:51.245699 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/af4301e6-88e1-4694-85a7-1215badf534d-config-data-custom\") pod \"af4301e6-88e1-4694-85a7-1215badf534d\" (UID: \"af4301e6-88e1-4694-85a7-1215badf534d\") " Oct 11 01:13:51 crc kubenswrapper[4743]: I1011 01:13:51.246018 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d441a96d-dada-477d-aa48-0b467c00d5a0-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d441a96d-dada-477d-aa48-0b467c00d5a0\") " pod="openstack/nova-scheduler-0" Oct 11 01:13:51 crc kubenswrapper[4743]: I1011 01:13:51.246070 4743 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-lmcql\" (UniqueName: \"kubernetes.io/projected/d441a96d-dada-477d-aa48-0b467c00d5a0-kube-api-access-lmcql\") pod \"nova-scheduler-0\" (UID: \"d441a96d-dada-477d-aa48-0b467c00d5a0\") " pod="openstack/nova-scheduler-0" Oct 11 01:13:51 crc kubenswrapper[4743]: I1011 01:13:51.246450 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e-dns-svc\") pod \"dnsmasq-dns-568d7fd7cf-4jblv\" (UID: \"6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e\") " pod="openstack/dnsmasq-dns-568d7fd7cf-4jblv" Oct 11 01:13:51 crc kubenswrapper[4743]: I1011 01:13:51.246498 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e-ovsdbserver-sb\") pod \"dnsmasq-dns-568d7fd7cf-4jblv\" (UID: \"6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e\") " pod="openstack/dnsmasq-dns-568d7fd7cf-4jblv" Oct 11 01:13:51 crc kubenswrapper[4743]: I1011 01:13:51.246541 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e-config\") pod \"dnsmasq-dns-568d7fd7cf-4jblv\" (UID: \"6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e\") " pod="openstack/dnsmasq-dns-568d7fd7cf-4jblv" Oct 11 01:13:51 crc kubenswrapper[4743]: I1011 01:13:51.246566 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fb9p6\" (UniqueName: \"kubernetes.io/projected/6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e-kube-api-access-fb9p6\") pod \"dnsmasq-dns-568d7fd7cf-4jblv\" (UID: \"6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e\") " pod="openstack/dnsmasq-dns-568d7fd7cf-4jblv" Oct 11 01:13:51 crc kubenswrapper[4743]: I1011 01:13:51.246604 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d441a96d-dada-477d-aa48-0b467c00d5a0-config-data\") pod \"nova-scheduler-0\" (UID: \"d441a96d-dada-477d-aa48-0b467c00d5a0\") " pod="openstack/nova-scheduler-0" Oct 11 01:13:51 crc kubenswrapper[4743]: I1011 01:13:51.246642 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e-dns-swift-storage-0\") pod \"dnsmasq-dns-568d7fd7cf-4jblv\" (UID: \"6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e\") " pod="openstack/dnsmasq-dns-568d7fd7cf-4jblv" Oct 11 01:13:51 crc kubenswrapper[4743]: I1011 01:13:51.246665 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e-ovsdbserver-nb\") pod \"dnsmasq-dns-568d7fd7cf-4jblv\" (UID: \"6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e\") " pod="openstack/dnsmasq-dns-568d7fd7cf-4jblv" Oct 11 01:13:51 crc kubenswrapper[4743]: I1011 01:13:51.247498 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e-ovsdbserver-nb\") pod \"dnsmasq-dns-568d7fd7cf-4jblv\" (UID: \"6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e\") " pod="openstack/dnsmasq-dns-568d7fd7cf-4jblv" Oct 11 01:13:51 crc kubenswrapper[4743]: I1011 01:13:51.248060 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e-ovsdbserver-sb\") pod \"dnsmasq-dns-568d7fd7cf-4jblv\" (UID: \"6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e\") " pod="openstack/dnsmasq-dns-568d7fd7cf-4jblv" Oct 11 01:13:51 crc kubenswrapper[4743]: I1011 01:13:51.250190 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e-dns-svc\") pod 
\"dnsmasq-dns-568d7fd7cf-4jblv\" (UID: \"6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e\") " pod="openstack/dnsmasq-dns-568d7fd7cf-4jblv" Oct 11 01:13:51 crc kubenswrapper[4743]: I1011 01:13:51.251063 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e-config\") pod \"dnsmasq-dns-568d7fd7cf-4jblv\" (UID: \"6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e\") " pod="openstack/dnsmasq-dns-568d7fd7cf-4jblv" Oct 11 01:13:51 crc kubenswrapper[4743]: I1011 01:13:51.251610 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e-dns-swift-storage-0\") pod \"dnsmasq-dns-568d7fd7cf-4jblv\" (UID: \"6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e\") " pod="openstack/dnsmasq-dns-568d7fd7cf-4jblv" Oct 11 01:13:51 crc kubenswrapper[4743]: I1011 01:13:51.255255 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af4301e6-88e1-4694-85a7-1215badf534d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "af4301e6-88e1-4694-85a7-1215badf534d" (UID: "af4301e6-88e1-4694-85a7-1215badf534d"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:13:51 crc kubenswrapper[4743]: I1011 01:13:51.255956 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d441a96d-dada-477d-aa48-0b467c00d5a0-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d441a96d-dada-477d-aa48-0b467c00d5a0\") " pod="openstack/nova-scheduler-0" Oct 11 01:13:51 crc kubenswrapper[4743]: I1011 01:13:51.260336 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d441a96d-dada-477d-aa48-0b467c00d5a0-config-data\") pod \"nova-scheduler-0\" (UID: \"d441a96d-dada-477d-aa48-0b467c00d5a0\") " pod="openstack/nova-scheduler-0" Oct 11 01:13:51 crc kubenswrapper[4743]: I1011 01:13:51.265075 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fb9p6\" (UniqueName: \"kubernetes.io/projected/6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e-kube-api-access-fb9p6\") pod \"dnsmasq-dns-568d7fd7cf-4jblv\" (UID: \"6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e\") " pod="openstack/dnsmasq-dns-568d7fd7cf-4jblv" Oct 11 01:13:51 crc kubenswrapper[4743]: I1011 01:13:51.271005 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af4301e6-88e1-4694-85a7-1215badf534d-kube-api-access-n8xjx" (OuterVolumeSpecName: "kube-api-access-n8xjx") pod "af4301e6-88e1-4694-85a7-1215badf534d" (UID: "af4301e6-88e1-4694-85a7-1215badf534d"). InnerVolumeSpecName "kube-api-access-n8xjx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:13:51 crc kubenswrapper[4743]: I1011 01:13:51.271583 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmcql\" (UniqueName: \"kubernetes.io/projected/d441a96d-dada-477d-aa48-0b467c00d5a0-kube-api-access-lmcql\") pod \"nova-scheduler-0\" (UID: \"d441a96d-dada-477d-aa48-0b467c00d5a0\") " pod="openstack/nova-scheduler-0" Oct 11 01:13:51 crc kubenswrapper[4743]: I1011 01:13:51.311474 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af4301e6-88e1-4694-85a7-1215badf534d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "af4301e6-88e1-4694-85a7-1215badf534d" (UID: "af4301e6-88e1-4694-85a7-1215badf534d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:13:51 crc kubenswrapper[4743]: I1011 01:13:51.341035 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af4301e6-88e1-4694-85a7-1215badf534d-config-data" (OuterVolumeSpecName: "config-data") pod "af4301e6-88e1-4694-85a7-1215badf534d" (UID: "af4301e6-88e1-4694-85a7-1215badf534d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:13:51 crc kubenswrapper[4743]: I1011 01:13:51.348129 4743 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/af4301e6-88e1-4694-85a7-1215badf534d-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 11 01:13:51 crc kubenswrapper[4743]: I1011 01:13:51.348164 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af4301e6-88e1-4694-85a7-1215badf534d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 01:13:51 crc kubenswrapper[4743]: I1011 01:13:51.348173 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af4301e6-88e1-4694-85a7-1215badf534d-config-data\") on node \"crc\" DevicePath \"\"" Oct 11 01:13:51 crc kubenswrapper[4743]: I1011 01:13:51.348182 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n8xjx\" (UniqueName: \"kubernetes.io/projected/af4301e6-88e1-4694-85a7-1215badf534d-kube-api-access-n8xjx\") on node \"crc\" DevicePath \"\"" Oct 11 01:13:51 crc kubenswrapper[4743]: I1011 01:13:51.401670 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 11 01:13:51 crc kubenswrapper[4743]: I1011 01:13:51.423047 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Oct 11 01:13:51 crc kubenswrapper[4743]: I1011 01:13:51.423454 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 11 01:13:51 crc kubenswrapper[4743]: I1011 01:13:51.447886 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-568d7fd7cf-4jblv" Oct 11 01:13:51 crc kubenswrapper[4743]: I1011 01:13:51.727910 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 11 01:13:51 crc kubenswrapper[4743]: I1011 01:13:51.750637 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-6c678b5cf4-bmh48" Oct 11 01:13:51 crc kubenswrapper[4743]: I1011 01:13:51.752114 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6c678b5cf4-bmh48" event={"ID":"af4301e6-88e1-4694-85a7-1215badf534d","Type":"ContainerDied","Data":"3377de58d0fbfefa741404c142433eaa5a579518b7b24367dcd7f35e82241895"} Oct 11 01:13:51 crc kubenswrapper[4743]: I1011 01:13:51.752223 4743 scope.go:117] "RemoveContainer" containerID="8c0dd7350e0076132704f385fde81eb45a4f8bb25fde0b3984cb36baadd16b1b" Oct 11 01:13:51 crc kubenswrapper[4743]: I1011 01:13:51.802506 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-6c678b5cf4-bmh48"] Oct 11 01:13:51 crc kubenswrapper[4743]: I1011 01:13:51.816784 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-engine-6c678b5cf4-bmh48"] Oct 11 01:13:51 crc kubenswrapper[4743]: I1011 01:13:51.920503 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 11 01:13:52 crc kubenswrapper[4743]: I1011 01:13:52.032681 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-jqs8c"] Oct 11 01:13:52 crc kubenswrapper[4743]: I1011 01:13:52.068375 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wjk7q"] Oct 11 01:13:52 crc kubenswrapper[4743]: E1011 01:13:52.068900 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af4301e6-88e1-4694-85a7-1215badf534d" containerName="heat-engine" Oct 11 01:13:52 crc kubenswrapper[4743]: I1011 01:13:52.068915 4743 
state_mem.go:107] "Deleted CPUSet assignment" podUID="af4301e6-88e1-4694-85a7-1215badf534d" containerName="heat-engine" Oct 11 01:13:52 crc kubenswrapper[4743]: I1011 01:13:52.069105 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="af4301e6-88e1-4694-85a7-1215badf534d" containerName="heat-engine" Oct 11 01:13:52 crc kubenswrapper[4743]: I1011 01:13:52.069777 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-wjk7q" Oct 11 01:13:52 crc kubenswrapper[4743]: I1011 01:13:52.073507 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Oct 11 01:13:52 crc kubenswrapper[4743]: I1011 01:13:52.073682 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 11 01:13:52 crc kubenswrapper[4743]: I1011 01:13:52.095426 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wjk7q"] Oct 11 01:13:52 crc kubenswrapper[4743]: I1011 01:13:52.151091 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af4301e6-88e1-4694-85a7-1215badf534d" path="/var/lib/kubelet/pods/af4301e6-88e1-4694-85a7-1215badf534d/volumes" Oct 11 01:13:52 crc kubenswrapper[4743]: I1011 01:13:52.169492 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec1be3a1-8fa8-4aa6-b61f-dcc2de3f0d16-config-data\") pod \"nova-cell1-conductor-db-sync-wjk7q\" (UID: \"ec1be3a1-8fa8-4aa6-b61f-dcc2de3f0d16\") " pod="openstack/nova-cell1-conductor-db-sync-wjk7q" Oct 11 01:13:52 crc kubenswrapper[4743]: I1011 01:13:52.169695 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec1be3a1-8fa8-4aa6-b61f-dcc2de3f0d16-scripts\") pod \"nova-cell1-conductor-db-sync-wjk7q\" (UID: 
\"ec1be3a1-8fa8-4aa6-b61f-dcc2de3f0d16\") " pod="openstack/nova-cell1-conductor-db-sync-wjk7q" Oct 11 01:13:52 crc kubenswrapper[4743]: I1011 01:13:52.170125 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec1be3a1-8fa8-4aa6-b61f-dcc2de3f0d16-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-wjk7q\" (UID: \"ec1be3a1-8fa8-4aa6-b61f-dcc2de3f0d16\") " pod="openstack/nova-cell1-conductor-db-sync-wjk7q" Oct 11 01:13:52 crc kubenswrapper[4743]: I1011 01:13:52.170186 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8f6k\" (UniqueName: \"kubernetes.io/projected/ec1be3a1-8fa8-4aa6-b61f-dcc2de3f0d16-kube-api-access-f8f6k\") pod \"nova-cell1-conductor-db-sync-wjk7q\" (UID: \"ec1be3a1-8fa8-4aa6-b61f-dcc2de3f0d16\") " pod="openstack/nova-cell1-conductor-db-sync-wjk7q" Oct 11 01:13:52 crc kubenswrapper[4743]: I1011 01:13:52.275648 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec1be3a1-8fa8-4aa6-b61f-dcc2de3f0d16-config-data\") pod \"nova-cell1-conductor-db-sync-wjk7q\" (UID: \"ec1be3a1-8fa8-4aa6-b61f-dcc2de3f0d16\") " pod="openstack/nova-cell1-conductor-db-sync-wjk7q" Oct 11 01:13:52 crc kubenswrapper[4743]: I1011 01:13:52.275715 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec1be3a1-8fa8-4aa6-b61f-dcc2de3f0d16-scripts\") pod \"nova-cell1-conductor-db-sync-wjk7q\" (UID: \"ec1be3a1-8fa8-4aa6-b61f-dcc2de3f0d16\") " pod="openstack/nova-cell1-conductor-db-sync-wjk7q" Oct 11 01:13:52 crc kubenswrapper[4743]: I1011 01:13:52.275811 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec1be3a1-8fa8-4aa6-b61f-dcc2de3f0d16-combined-ca-bundle\") pod 
\"nova-cell1-conductor-db-sync-wjk7q\" (UID: \"ec1be3a1-8fa8-4aa6-b61f-dcc2de3f0d16\") " pod="openstack/nova-cell1-conductor-db-sync-wjk7q" Oct 11 01:13:52 crc kubenswrapper[4743]: I1011 01:13:52.275843 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8f6k\" (UniqueName: \"kubernetes.io/projected/ec1be3a1-8fa8-4aa6-b61f-dcc2de3f0d16-kube-api-access-f8f6k\") pod \"nova-cell1-conductor-db-sync-wjk7q\" (UID: \"ec1be3a1-8fa8-4aa6-b61f-dcc2de3f0d16\") " pod="openstack/nova-cell1-conductor-db-sync-wjk7q" Oct 11 01:13:52 crc kubenswrapper[4743]: I1011 01:13:52.289684 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec1be3a1-8fa8-4aa6-b61f-dcc2de3f0d16-scripts\") pod \"nova-cell1-conductor-db-sync-wjk7q\" (UID: \"ec1be3a1-8fa8-4aa6-b61f-dcc2de3f0d16\") " pod="openstack/nova-cell1-conductor-db-sync-wjk7q" Oct 11 01:13:52 crc kubenswrapper[4743]: I1011 01:13:52.289698 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec1be3a1-8fa8-4aa6-b61f-dcc2de3f0d16-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-wjk7q\" (UID: \"ec1be3a1-8fa8-4aa6-b61f-dcc2de3f0d16\") " pod="openstack/nova-cell1-conductor-db-sync-wjk7q" Oct 11 01:13:52 crc kubenswrapper[4743]: I1011 01:13:52.290735 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec1be3a1-8fa8-4aa6-b61f-dcc2de3f0d16-config-data\") pod \"nova-cell1-conductor-db-sync-wjk7q\" (UID: \"ec1be3a1-8fa8-4aa6-b61f-dcc2de3f0d16\") " pod="openstack/nova-cell1-conductor-db-sync-wjk7q" Oct 11 01:13:52 crc kubenswrapper[4743]: I1011 01:13:52.294958 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8f6k\" (UniqueName: \"kubernetes.io/projected/ec1be3a1-8fa8-4aa6-b61f-dcc2de3f0d16-kube-api-access-f8f6k\") pod 
\"nova-cell1-conductor-db-sync-wjk7q\" (UID: \"ec1be3a1-8fa8-4aa6-b61f-dcc2de3f0d16\") " pod="openstack/nova-cell1-conductor-db-sync-wjk7q" Oct 11 01:13:52 crc kubenswrapper[4743]: I1011 01:13:52.408119 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-wjk7q" Oct 11 01:13:52 crc kubenswrapper[4743]: I1011 01:13:52.417976 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-568d7fd7cf-4jblv"] Oct 11 01:13:52 crc kubenswrapper[4743]: I1011 01:13:52.469568 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 11 01:13:52 crc kubenswrapper[4743]: I1011 01:13:52.511817 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 11 01:13:52 crc kubenswrapper[4743]: I1011 01:13:52.785366 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"54f2bf58-66f4-4f64-9937-6cd0cc80403e","Type":"ContainerStarted","Data":"f8e15f250b304e0d9e4f4f3dc284757c8ec34ebde9a85e1536c91f85a1a61d4a"} Oct 11 01:13:52 crc kubenswrapper[4743]: I1011 01:13:52.787879 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2fe9ad7b-9381-4d88-93cf-2ad81eb684d7","Type":"ContainerStarted","Data":"fd74b8b0019542a12dbc816fc69ed7706542f44480a86ce1d01aba10fa8119f9"} Oct 11 01:13:52 crc kubenswrapper[4743]: I1011 01:13:52.790490 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"33029235-8b5f-4193-9760-c14a1f6de702","Type":"ContainerStarted","Data":"82408a8718163a2ba9073cf721c7b1dc02d688c57b5d63ac339ecca2b2de1468"} Oct 11 01:13:52 crc kubenswrapper[4743]: I1011 01:13:52.791736 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-568d7fd7cf-4jblv" 
event={"ID":"6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e","Type":"ContainerStarted","Data":"aab812d8f91da73c9ce3f9c987b2942b54f867c302cf8ee7930cc507d4bf356b"} Oct 11 01:13:52 crc kubenswrapper[4743]: I1011 01:13:52.791757 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-568d7fd7cf-4jblv" event={"ID":"6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e","Type":"ContainerStarted","Data":"8bb997653ee9e1ec1cadd1d7226ae873b8bdfd6950608a45fe3592f870a6bb9f"} Oct 11 01:13:52 crc kubenswrapper[4743]: I1011 01:13:52.811687 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d441a96d-dada-477d-aa48-0b467c00d5a0","Type":"ContainerStarted","Data":"0095bb124fda810a039dd0477fda2b3c3b7d69174c8a1c712ae980f6a6ba0bef"} Oct 11 01:13:52 crc kubenswrapper[4743]: I1011 01:13:52.829996 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-jqs8c" event={"ID":"70c3253b-404c-4db1-a0e5-f2f112e94c43","Type":"ContainerStarted","Data":"e47943025051f713b963ec70bb6fb2087b75e8c359dc81e61d01a650b30278d8"} Oct 11 01:13:52 crc kubenswrapper[4743]: I1011 01:13:52.830043 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-jqs8c" event={"ID":"70c3253b-404c-4db1-a0e5-f2f112e94c43","Type":"ContainerStarted","Data":"3ce501f781c8c84bca4a55272c863684ea0c32214c1735b649558186a332f583"} Oct 11 01:13:52 crc kubenswrapper[4743]: I1011 01:13:52.857027 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-jqs8c" podStartSLOduration=2.857002847 podStartE2EDuration="2.857002847s" podCreationTimestamp="2025-10-11 01:13:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 01:13:52.848946621 +0000 UTC m=+1327.501927028" watchObservedRunningTime="2025-10-11 01:13:52.857002847 +0000 UTC m=+1327.509983244" Oct 11 01:13:52 crc 
kubenswrapper[4743]: I1011 01:13:52.971702 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wjk7q"] Oct 11 01:13:53 crc kubenswrapper[4743]: I1011 01:13:53.843121 4743 generic.go:334] "Generic (PLEG): container finished" podID="6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e" containerID="aab812d8f91da73c9ce3f9c987b2942b54f867c302cf8ee7930cc507d4bf356b" exitCode=0 Oct 11 01:13:53 crc kubenswrapper[4743]: I1011 01:13:53.843226 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-568d7fd7cf-4jblv" event={"ID":"6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e","Type":"ContainerDied","Data":"aab812d8f91da73c9ce3f9c987b2942b54f867c302cf8ee7930cc507d4bf356b"} Oct 11 01:13:53 crc kubenswrapper[4743]: I1011 01:13:53.848450 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-wjk7q" event={"ID":"ec1be3a1-8fa8-4aa6-b61f-dcc2de3f0d16","Type":"ContainerStarted","Data":"861b7786b13a062b3e3787a761ca7c7dbe1d6edb76d7f580dc6f805b919dda5c"} Oct 11 01:13:53 crc kubenswrapper[4743]: I1011 01:13:53.848490 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-wjk7q" event={"ID":"ec1be3a1-8fa8-4aa6-b61f-dcc2de3f0d16","Type":"ContainerStarted","Data":"49d041f7e323234b0a848fac843aa5f262b773f18ec8a99278a12f558c7c1f0b"} Oct 11 01:13:53 crc kubenswrapper[4743]: I1011 01:13:53.890543 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-wjk7q" podStartSLOduration=1.890519652 podStartE2EDuration="1.890519652s" podCreationTimestamp="2025-10-11 01:13:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 01:13:53.885596368 +0000 UTC m=+1328.538576765" watchObservedRunningTime="2025-10-11 01:13:53.890519652 +0000 UTC m=+1328.543500049" Oct 11 01:13:54 crc kubenswrapper[4743]: I1011 01:13:54.220927 
4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 11 01:13:54 crc kubenswrapper[4743]: I1011 01:13:54.232558 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 11 01:13:54 crc kubenswrapper[4743]: I1011 01:13:54.876324 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-568d7fd7cf-4jblv" event={"ID":"6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e","Type":"ContainerStarted","Data":"74190e0dd8ccc2e83c3d5b116f213c662e79d07e807ce83d49955bd900f89bd7"} Oct 11 01:13:54 crc kubenswrapper[4743]: I1011 01:13:54.877141 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-568d7fd7cf-4jblv" Oct 11 01:13:54 crc kubenswrapper[4743]: I1011 01:13:54.907453 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-568d7fd7cf-4jblv" podStartSLOduration=4.907425553 podStartE2EDuration="4.907425553s" podCreationTimestamp="2025-10-11 01:13:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 01:13:54.905390196 +0000 UTC m=+1329.558370593" watchObservedRunningTime="2025-10-11 01:13:54.907425553 +0000 UTC m=+1329.560405970" Oct 11 01:13:57 crc kubenswrapper[4743]: I1011 01:13:57.919848 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2fe9ad7b-9381-4d88-93cf-2ad81eb684d7","Type":"ContainerStarted","Data":"59b85102c1814cc18dd60a73b29de45cff7f6f85eb488709fb742d80f27764fa"} Oct 11 01:13:57 crc kubenswrapper[4743]: I1011 01:13:57.920346 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2fe9ad7b-9381-4d88-93cf-2ad81eb684d7","Type":"ContainerStarted","Data":"9f4f1f7bcf8d0971ac90a06a9f44f3431755f6e8d864371f368495d377d5fe4a"} Oct 11 01:13:57 crc kubenswrapper[4743]: I1011 01:13:57.920454 4743 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="2fe9ad7b-9381-4d88-93cf-2ad81eb684d7" containerName="nova-metadata-log" containerID="cri-o://9f4f1f7bcf8d0971ac90a06a9f44f3431755f6e8d864371f368495d377d5fe4a" gracePeriod=30 Oct 11 01:13:57 crc kubenswrapper[4743]: I1011 01:13:57.920827 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="2fe9ad7b-9381-4d88-93cf-2ad81eb684d7" containerName="nova-metadata-metadata" containerID="cri-o://59b85102c1814cc18dd60a73b29de45cff7f6f85eb488709fb742d80f27764fa" gracePeriod=30 Oct 11 01:13:57 crc kubenswrapper[4743]: I1011 01:13:57.923076 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"33029235-8b5f-4193-9760-c14a1f6de702","Type":"ContainerStarted","Data":"c8948275b1d35c2b29935cc28850d32ecf3202c63120f5d9810e492a2a782ee8"} Oct 11 01:13:57 crc kubenswrapper[4743]: I1011 01:13:57.923155 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="33029235-8b5f-4193-9760-c14a1f6de702" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://c8948275b1d35c2b29935cc28850d32ecf3202c63120f5d9810e492a2a782ee8" gracePeriod=30 Oct 11 01:13:57 crc kubenswrapper[4743]: I1011 01:13:57.926663 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d441a96d-dada-477d-aa48-0b467c00d5a0","Type":"ContainerStarted","Data":"b087a3f710ec0f6aa5d68b818ddb2a7f5cf3db75204a3abcf924510a40755947"} Oct 11 01:13:57 crc kubenswrapper[4743]: I1011 01:13:57.932203 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"54f2bf58-66f4-4f64-9937-6cd0cc80403e","Type":"ContainerStarted","Data":"513336ce15942b9580a08708701e9a2bee3cb7eb4720e908b228d63fbe74dcd7"} Oct 11 01:13:57 crc kubenswrapper[4743]: I1011 01:13:57.932238 4743 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"54f2bf58-66f4-4f64-9937-6cd0cc80403e","Type":"ContainerStarted","Data":"2f5bd9f394e7190687c58e3063d788de72a8546d59a54993fcf3bd95a41d4891"} Oct 11 01:13:57 crc kubenswrapper[4743]: I1011 01:13:57.951020 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.787731043 podStartE2EDuration="7.950996773s" podCreationTimestamp="2025-10-11 01:13:50 +0000 UTC" firstStartedPulling="2025-10-11 01:13:52.46464218 +0000 UTC m=+1327.117622577" lastFinishedPulling="2025-10-11 01:13:56.6279079 +0000 UTC m=+1331.280888307" observedRunningTime="2025-10-11 01:13:57.93868782 +0000 UTC m=+1332.591668227" watchObservedRunningTime="2025-10-11 01:13:57.950996773 +0000 UTC m=+1332.603977170" Oct 11 01:13:57 crc kubenswrapper[4743]: I1011 01:13:57.958024 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.279894867 podStartE2EDuration="7.958003795s" podCreationTimestamp="2025-10-11 01:13:50 +0000 UTC" firstStartedPulling="2025-10-11 01:13:51.944212223 +0000 UTC m=+1326.597192620" lastFinishedPulling="2025-10-11 01:13:56.622321151 +0000 UTC m=+1331.275301548" observedRunningTime="2025-10-11 01:13:57.955936857 +0000 UTC m=+1332.608917264" watchObservedRunningTime="2025-10-11 01:13:57.958003795 +0000 UTC m=+1332.610984192" Oct 11 01:13:57 crc kubenswrapper[4743]: I1011 01:13:57.980591 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.843133389 podStartE2EDuration="7.980570464s" podCreationTimestamp="2025-10-11 01:13:50 +0000 UTC" firstStartedPulling="2025-10-11 01:13:52.484008486 +0000 UTC m=+1327.136988883" lastFinishedPulling="2025-10-11 01:13:56.621445561 +0000 UTC m=+1331.274425958" observedRunningTime="2025-10-11 01:13:57.978938577 +0000 UTC m=+1332.631918974" 
watchObservedRunningTime="2025-10-11 01:13:57.980570464 +0000 UTC m=+1332.633550881" Oct 11 01:13:58 crc kubenswrapper[4743]: I1011 01:13:58.008031 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.131232092 podStartE2EDuration="8.008013686s" podCreationTimestamp="2025-10-11 01:13:50 +0000 UTC" firstStartedPulling="2025-10-11 01:13:51.745064536 +0000 UTC m=+1326.398044933" lastFinishedPulling="2025-10-11 01:13:56.62184613 +0000 UTC m=+1331.274826527" observedRunningTime="2025-10-11 01:13:57.998923667 +0000 UTC m=+1332.651904064" watchObservedRunningTime="2025-10-11 01:13:58.008013686 +0000 UTC m=+1332.660994083" Oct 11 01:13:58 crc kubenswrapper[4743]: I1011 01:13:58.701799 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 11 01:13:58 crc kubenswrapper[4743]: I1011 01:13:58.847130 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2fe9ad7b-9381-4d88-93cf-2ad81eb684d7-logs\") pod \"2fe9ad7b-9381-4d88-93cf-2ad81eb684d7\" (UID: \"2fe9ad7b-9381-4d88-93cf-2ad81eb684d7\") " Oct 11 01:13:58 crc kubenswrapper[4743]: I1011 01:13:58.847185 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fe9ad7b-9381-4d88-93cf-2ad81eb684d7-config-data\") pod \"2fe9ad7b-9381-4d88-93cf-2ad81eb684d7\" (UID: \"2fe9ad7b-9381-4d88-93cf-2ad81eb684d7\") " Oct 11 01:13:58 crc kubenswrapper[4743]: I1011 01:13:58.847466 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2fe9ad7b-9381-4d88-93cf-2ad81eb684d7-logs" (OuterVolumeSpecName: "logs") pod "2fe9ad7b-9381-4d88-93cf-2ad81eb684d7" (UID: "2fe9ad7b-9381-4d88-93cf-2ad81eb684d7"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:13:58 crc kubenswrapper[4743]: I1011 01:13:58.848093 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t78t9\" (UniqueName: \"kubernetes.io/projected/2fe9ad7b-9381-4d88-93cf-2ad81eb684d7-kube-api-access-t78t9\") pod \"2fe9ad7b-9381-4d88-93cf-2ad81eb684d7\" (UID: \"2fe9ad7b-9381-4d88-93cf-2ad81eb684d7\") " Oct 11 01:13:58 crc kubenswrapper[4743]: I1011 01:13:58.848140 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fe9ad7b-9381-4d88-93cf-2ad81eb684d7-combined-ca-bundle\") pod \"2fe9ad7b-9381-4d88-93cf-2ad81eb684d7\" (UID: \"2fe9ad7b-9381-4d88-93cf-2ad81eb684d7\") " Oct 11 01:13:58 crc kubenswrapper[4743]: I1011 01:13:58.848679 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2fe9ad7b-9381-4d88-93cf-2ad81eb684d7-logs\") on node \"crc\" DevicePath \"\"" Oct 11 01:13:58 crc kubenswrapper[4743]: I1011 01:13:58.854957 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fe9ad7b-9381-4d88-93cf-2ad81eb684d7-kube-api-access-t78t9" (OuterVolumeSpecName: "kube-api-access-t78t9") pod "2fe9ad7b-9381-4d88-93cf-2ad81eb684d7" (UID: "2fe9ad7b-9381-4d88-93cf-2ad81eb684d7"). InnerVolumeSpecName "kube-api-access-t78t9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:13:58 crc kubenswrapper[4743]: I1011 01:13:58.886037 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fe9ad7b-9381-4d88-93cf-2ad81eb684d7-config-data" (OuterVolumeSpecName: "config-data") pod "2fe9ad7b-9381-4d88-93cf-2ad81eb684d7" (UID: "2fe9ad7b-9381-4d88-93cf-2ad81eb684d7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:13:58 crc kubenswrapper[4743]: I1011 01:13:58.889398 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fe9ad7b-9381-4d88-93cf-2ad81eb684d7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2fe9ad7b-9381-4d88-93cf-2ad81eb684d7" (UID: "2fe9ad7b-9381-4d88-93cf-2ad81eb684d7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:13:58 crc kubenswrapper[4743]: I1011 01:13:58.943781 4743 generic.go:334] "Generic (PLEG): container finished" podID="2fe9ad7b-9381-4d88-93cf-2ad81eb684d7" containerID="59b85102c1814cc18dd60a73b29de45cff7f6f85eb488709fb742d80f27764fa" exitCode=0 Oct 11 01:13:58 crc kubenswrapper[4743]: I1011 01:13:58.944047 4743 generic.go:334] "Generic (PLEG): container finished" podID="2fe9ad7b-9381-4d88-93cf-2ad81eb684d7" containerID="9f4f1f7bcf8d0971ac90a06a9f44f3431755f6e8d864371f368495d377d5fe4a" exitCode=143 Oct 11 01:13:58 crc kubenswrapper[4743]: I1011 01:13:58.944061 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2fe9ad7b-9381-4d88-93cf-2ad81eb684d7","Type":"ContainerDied","Data":"59b85102c1814cc18dd60a73b29de45cff7f6f85eb488709fb742d80f27764fa"} Oct 11 01:13:58 crc kubenswrapper[4743]: I1011 01:13:58.944104 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2fe9ad7b-9381-4d88-93cf-2ad81eb684d7","Type":"ContainerDied","Data":"9f4f1f7bcf8d0971ac90a06a9f44f3431755f6e8d864371f368495d377d5fe4a"} Oct 11 01:13:58 crc kubenswrapper[4743]: I1011 01:13:58.944120 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2fe9ad7b-9381-4d88-93cf-2ad81eb684d7","Type":"ContainerDied","Data":"fd74b8b0019542a12dbc816fc69ed7706542f44480a86ce1d01aba10fa8119f9"} Oct 11 01:13:58 crc kubenswrapper[4743]: I1011 01:13:58.944140 4743 scope.go:117] 
"RemoveContainer" containerID="59b85102c1814cc18dd60a73b29de45cff7f6f85eb488709fb742d80f27764fa" Oct 11 01:13:58 crc kubenswrapper[4743]: I1011 01:13:58.944023 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 11 01:13:58 crc kubenswrapper[4743]: I1011 01:13:58.950340 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fe9ad7b-9381-4d88-93cf-2ad81eb684d7-config-data\") on node \"crc\" DevicePath \"\"" Oct 11 01:13:58 crc kubenswrapper[4743]: I1011 01:13:58.950361 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t78t9\" (UniqueName: \"kubernetes.io/projected/2fe9ad7b-9381-4d88-93cf-2ad81eb684d7-kube-api-access-t78t9\") on node \"crc\" DevicePath \"\"" Oct 11 01:13:58 crc kubenswrapper[4743]: I1011 01:13:58.950371 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fe9ad7b-9381-4d88-93cf-2ad81eb684d7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 01:13:58 crc kubenswrapper[4743]: I1011 01:13:58.990975 4743 scope.go:117] "RemoveContainer" containerID="9f4f1f7bcf8d0971ac90a06a9f44f3431755f6e8d864371f368495d377d5fe4a" Oct 11 01:13:58 crc kubenswrapper[4743]: I1011 01:13:58.994016 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 11 01:13:59 crc kubenswrapper[4743]: I1011 01:13:59.017273 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 11 01:13:59 crc kubenswrapper[4743]: I1011 01:13:59.024080 4743 scope.go:117] "RemoveContainer" containerID="59b85102c1814cc18dd60a73b29de45cff7f6f85eb488709fb742d80f27764fa" Oct 11 01:13:59 crc kubenswrapper[4743]: E1011 01:13:59.024499 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"59b85102c1814cc18dd60a73b29de45cff7f6f85eb488709fb742d80f27764fa\": container with ID starting with 59b85102c1814cc18dd60a73b29de45cff7f6f85eb488709fb742d80f27764fa not found: ID does not exist" containerID="59b85102c1814cc18dd60a73b29de45cff7f6f85eb488709fb742d80f27764fa" Oct 11 01:13:59 crc kubenswrapper[4743]: I1011 01:13:59.024536 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59b85102c1814cc18dd60a73b29de45cff7f6f85eb488709fb742d80f27764fa"} err="failed to get container status \"59b85102c1814cc18dd60a73b29de45cff7f6f85eb488709fb742d80f27764fa\": rpc error: code = NotFound desc = could not find container \"59b85102c1814cc18dd60a73b29de45cff7f6f85eb488709fb742d80f27764fa\": container with ID starting with 59b85102c1814cc18dd60a73b29de45cff7f6f85eb488709fb742d80f27764fa not found: ID does not exist" Oct 11 01:13:59 crc kubenswrapper[4743]: I1011 01:13:59.024562 4743 scope.go:117] "RemoveContainer" containerID="9f4f1f7bcf8d0971ac90a06a9f44f3431755f6e8d864371f368495d377d5fe4a" Oct 11 01:13:59 crc kubenswrapper[4743]: E1011 01:13:59.024942 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f4f1f7bcf8d0971ac90a06a9f44f3431755f6e8d864371f368495d377d5fe4a\": container with ID starting with 9f4f1f7bcf8d0971ac90a06a9f44f3431755f6e8d864371f368495d377d5fe4a not found: ID does not exist" containerID="9f4f1f7bcf8d0971ac90a06a9f44f3431755f6e8d864371f368495d377d5fe4a" Oct 11 01:13:59 crc kubenswrapper[4743]: I1011 01:13:59.024964 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f4f1f7bcf8d0971ac90a06a9f44f3431755f6e8d864371f368495d377d5fe4a"} err="failed to get container status \"9f4f1f7bcf8d0971ac90a06a9f44f3431755f6e8d864371f368495d377d5fe4a\": rpc error: code = NotFound desc = could not find container \"9f4f1f7bcf8d0971ac90a06a9f44f3431755f6e8d864371f368495d377d5fe4a\": container with ID 
starting with 9f4f1f7bcf8d0971ac90a06a9f44f3431755f6e8d864371f368495d377d5fe4a not found: ID does not exist" Oct 11 01:13:59 crc kubenswrapper[4743]: I1011 01:13:59.024978 4743 scope.go:117] "RemoveContainer" containerID="59b85102c1814cc18dd60a73b29de45cff7f6f85eb488709fb742d80f27764fa" Oct 11 01:13:59 crc kubenswrapper[4743]: I1011 01:13:59.025192 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59b85102c1814cc18dd60a73b29de45cff7f6f85eb488709fb742d80f27764fa"} err="failed to get container status \"59b85102c1814cc18dd60a73b29de45cff7f6f85eb488709fb742d80f27764fa\": rpc error: code = NotFound desc = could not find container \"59b85102c1814cc18dd60a73b29de45cff7f6f85eb488709fb742d80f27764fa\": container with ID starting with 59b85102c1814cc18dd60a73b29de45cff7f6f85eb488709fb742d80f27764fa not found: ID does not exist" Oct 11 01:13:59 crc kubenswrapper[4743]: I1011 01:13:59.025209 4743 scope.go:117] "RemoveContainer" containerID="9f4f1f7bcf8d0971ac90a06a9f44f3431755f6e8d864371f368495d377d5fe4a" Oct 11 01:13:59 crc kubenswrapper[4743]: I1011 01:13:59.025385 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f4f1f7bcf8d0971ac90a06a9f44f3431755f6e8d864371f368495d377d5fe4a"} err="failed to get container status \"9f4f1f7bcf8d0971ac90a06a9f44f3431755f6e8d864371f368495d377d5fe4a\": rpc error: code = NotFound desc = could not find container \"9f4f1f7bcf8d0971ac90a06a9f44f3431755f6e8d864371f368495d377d5fe4a\": container with ID starting with 9f4f1f7bcf8d0971ac90a06a9f44f3431755f6e8d864371f368495d377d5fe4a not found: ID does not exist" Oct 11 01:13:59 crc kubenswrapper[4743]: I1011 01:13:59.029716 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 11 01:13:59 crc kubenswrapper[4743]: E1011 01:13:59.030125 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fe9ad7b-9381-4d88-93cf-2ad81eb684d7" 
containerName="nova-metadata-log" Oct 11 01:13:59 crc kubenswrapper[4743]: I1011 01:13:59.030140 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fe9ad7b-9381-4d88-93cf-2ad81eb684d7" containerName="nova-metadata-log" Oct 11 01:13:59 crc kubenswrapper[4743]: E1011 01:13:59.030153 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fe9ad7b-9381-4d88-93cf-2ad81eb684d7" containerName="nova-metadata-metadata" Oct 11 01:13:59 crc kubenswrapper[4743]: I1011 01:13:59.030160 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fe9ad7b-9381-4d88-93cf-2ad81eb684d7" containerName="nova-metadata-metadata" Oct 11 01:13:59 crc kubenswrapper[4743]: I1011 01:13:59.030363 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fe9ad7b-9381-4d88-93cf-2ad81eb684d7" containerName="nova-metadata-metadata" Oct 11 01:13:59 crc kubenswrapper[4743]: I1011 01:13:59.030382 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fe9ad7b-9381-4d88-93cf-2ad81eb684d7" containerName="nova-metadata-log" Oct 11 01:13:59 crc kubenswrapper[4743]: I1011 01:13:59.033156 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 11 01:13:59 crc kubenswrapper[4743]: I1011 01:13:59.035753 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 11 01:13:59 crc kubenswrapper[4743]: I1011 01:13:59.036099 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 11 01:13:59 crc kubenswrapper[4743]: I1011 01:13:59.042908 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 11 01:13:59 crc kubenswrapper[4743]: I1011 01:13:59.156202 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9360186-9314-44aa-8203-f09057d76e6a-logs\") pod \"nova-metadata-0\" (UID: \"b9360186-9314-44aa-8203-f09057d76e6a\") " pod="openstack/nova-metadata-0" Oct 11 01:13:59 crc kubenswrapper[4743]: I1011 01:13:59.156310 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9360186-9314-44aa-8203-f09057d76e6a-config-data\") pod \"nova-metadata-0\" (UID: \"b9360186-9314-44aa-8203-f09057d76e6a\") " pod="openstack/nova-metadata-0" Oct 11 01:13:59 crc kubenswrapper[4743]: I1011 01:13:59.156380 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9360186-9314-44aa-8203-f09057d76e6a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b9360186-9314-44aa-8203-f09057d76e6a\") " pod="openstack/nova-metadata-0" Oct 11 01:13:59 crc kubenswrapper[4743]: I1011 01:13:59.157328 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9360186-9314-44aa-8203-f09057d76e6a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: 
\"b9360186-9314-44aa-8203-f09057d76e6a\") " pod="openstack/nova-metadata-0" Oct 11 01:13:59 crc kubenswrapper[4743]: I1011 01:13:59.157426 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v928k\" (UniqueName: \"kubernetes.io/projected/b9360186-9314-44aa-8203-f09057d76e6a-kube-api-access-v928k\") pod \"nova-metadata-0\" (UID: \"b9360186-9314-44aa-8203-f09057d76e6a\") " pod="openstack/nova-metadata-0" Oct 11 01:13:59 crc kubenswrapper[4743]: I1011 01:13:59.259672 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9360186-9314-44aa-8203-f09057d76e6a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b9360186-9314-44aa-8203-f09057d76e6a\") " pod="openstack/nova-metadata-0" Oct 11 01:13:59 crc kubenswrapper[4743]: I1011 01:13:59.260462 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v928k\" (UniqueName: \"kubernetes.io/projected/b9360186-9314-44aa-8203-f09057d76e6a-kube-api-access-v928k\") pod \"nova-metadata-0\" (UID: \"b9360186-9314-44aa-8203-f09057d76e6a\") " pod="openstack/nova-metadata-0" Oct 11 01:13:59 crc kubenswrapper[4743]: I1011 01:13:59.260542 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9360186-9314-44aa-8203-f09057d76e6a-logs\") pod \"nova-metadata-0\" (UID: \"b9360186-9314-44aa-8203-f09057d76e6a\") " pod="openstack/nova-metadata-0" Oct 11 01:13:59 crc kubenswrapper[4743]: I1011 01:13:59.260649 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9360186-9314-44aa-8203-f09057d76e6a-config-data\") pod \"nova-metadata-0\" (UID: \"b9360186-9314-44aa-8203-f09057d76e6a\") " pod="openstack/nova-metadata-0" Oct 11 01:13:59 crc kubenswrapper[4743]: I1011 01:13:59.260688 4743 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9360186-9314-44aa-8203-f09057d76e6a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b9360186-9314-44aa-8203-f09057d76e6a\") " pod="openstack/nova-metadata-0" Oct 11 01:13:59 crc kubenswrapper[4743]: I1011 01:13:59.261276 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9360186-9314-44aa-8203-f09057d76e6a-logs\") pod \"nova-metadata-0\" (UID: \"b9360186-9314-44aa-8203-f09057d76e6a\") " pod="openstack/nova-metadata-0" Oct 11 01:13:59 crc kubenswrapper[4743]: I1011 01:13:59.264270 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9360186-9314-44aa-8203-f09057d76e6a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b9360186-9314-44aa-8203-f09057d76e6a\") " pod="openstack/nova-metadata-0" Oct 11 01:13:59 crc kubenswrapper[4743]: I1011 01:13:59.266026 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9360186-9314-44aa-8203-f09057d76e6a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b9360186-9314-44aa-8203-f09057d76e6a\") " pod="openstack/nova-metadata-0" Oct 11 01:13:59 crc kubenswrapper[4743]: I1011 01:13:59.267498 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9360186-9314-44aa-8203-f09057d76e6a-config-data\") pod \"nova-metadata-0\" (UID: \"b9360186-9314-44aa-8203-f09057d76e6a\") " pod="openstack/nova-metadata-0" Oct 11 01:13:59 crc kubenswrapper[4743]: I1011 01:13:59.281005 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v928k\" (UniqueName: \"kubernetes.io/projected/b9360186-9314-44aa-8203-f09057d76e6a-kube-api-access-v928k\") pod \"nova-metadata-0\" 
(UID: \"b9360186-9314-44aa-8203-f09057d76e6a\") " pod="openstack/nova-metadata-0" Oct 11 01:13:59 crc kubenswrapper[4743]: I1011 01:13:59.358622 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 11 01:13:59 crc kubenswrapper[4743]: I1011 01:13:59.838550 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 11 01:13:59 crc kubenswrapper[4743]: I1011 01:13:59.959087 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b9360186-9314-44aa-8203-f09057d76e6a","Type":"ContainerStarted","Data":"69e88b7b8ad556ede053aa4638f82e07fe468f26e9a2d4287dabddcb7dd05c28"} Oct 11 01:14:00 crc kubenswrapper[4743]: I1011 01:14:00.107252 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fe9ad7b-9381-4d88-93cf-2ad81eb684d7" path="/var/lib/kubelet/pods/2fe9ad7b-9381-4d88-93cf-2ad81eb684d7/volumes" Oct 11 01:14:00 crc kubenswrapper[4743]: I1011 01:14:00.975751 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b9360186-9314-44aa-8203-f09057d76e6a","Type":"ContainerStarted","Data":"d3f58b59a52c8aef8a78ff7083541bb004e80f8d9cb9a233a388614a45c77c3f"} Oct 11 01:14:00 crc kubenswrapper[4743]: I1011 01:14:00.976080 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b9360186-9314-44aa-8203-f09057d76e6a","Type":"ContainerStarted","Data":"a5911da0e5120c192e5b67191329ee08c6f320810a99abe14fddc8e4af19b79d"} Oct 11 01:14:00 crc kubenswrapper[4743]: I1011 01:14:00.998131 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.998105056 podStartE2EDuration="2.998105056s" podCreationTimestamp="2025-10-11 01:13:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 
01:14:00.993171133 +0000 UTC m=+1335.646151540" watchObservedRunningTime="2025-10-11 01:14:00.998105056 +0000 UTC m=+1335.651085493" Oct 11 01:14:01 crc kubenswrapper[4743]: I1011 01:14:01.050014 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 11 01:14:01 crc kubenswrapper[4743]: I1011 01:14:01.050052 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 11 01:14:01 crc kubenswrapper[4743]: I1011 01:14:01.111985 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 11 01:14:01 crc kubenswrapper[4743]: I1011 01:14:01.423936 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 11 01:14:01 crc kubenswrapper[4743]: I1011 01:14:01.423977 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 11 01:14:01 crc kubenswrapper[4743]: I1011 01:14:01.449076 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-568d7fd7cf-4jblv" Oct 11 01:14:01 crc kubenswrapper[4743]: I1011 01:14:01.462303 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 11 01:14:01 crc kubenswrapper[4743]: I1011 01:14:01.522915 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-688b9f5b49-zn2q6"] Oct 11 01:14:01 crc kubenswrapper[4743]: I1011 01:14:01.523186 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-688b9f5b49-zn2q6" podUID="0eee5a3c-bfbc-4975-ae73-2a33d414993d" containerName="dnsmasq-dns" containerID="cri-o://a47dc85cfad70d954a488b40336b8c7e14e47decfbc6af22ee4cd25cfd7faf3a" gracePeriod=10 Oct 11 01:14:02 crc kubenswrapper[4743]: I1011 01:14:02.016009 4743 generic.go:334] "Generic (PLEG): container finished" 
podID="0eee5a3c-bfbc-4975-ae73-2a33d414993d" containerID="a47dc85cfad70d954a488b40336b8c7e14e47decfbc6af22ee4cd25cfd7faf3a" exitCode=0 Oct 11 01:14:02 crc kubenswrapper[4743]: I1011 01:14:02.016093 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688b9f5b49-zn2q6" event={"ID":"0eee5a3c-bfbc-4975-ae73-2a33d414993d","Type":"ContainerDied","Data":"a47dc85cfad70d954a488b40336b8c7e14e47decfbc6af22ee4cd25cfd7faf3a"} Oct 11 01:14:02 crc kubenswrapper[4743]: I1011 01:14:02.019114 4743 generic.go:334] "Generic (PLEG): container finished" podID="ec1be3a1-8fa8-4aa6-b61f-dcc2de3f0d16" containerID="861b7786b13a062b3e3787a761ca7c7dbe1d6edb76d7f580dc6f805b919dda5c" exitCode=0 Oct 11 01:14:02 crc kubenswrapper[4743]: I1011 01:14:02.019188 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-wjk7q" event={"ID":"ec1be3a1-8fa8-4aa6-b61f-dcc2de3f0d16","Type":"ContainerDied","Data":"861b7786b13a062b3e3787a761ca7c7dbe1d6edb76d7f580dc6f805b919dda5c"} Oct 11 01:14:02 crc kubenswrapper[4743]: I1011 01:14:02.025946 4743 generic.go:334] "Generic (PLEG): container finished" podID="70c3253b-404c-4db1-a0e5-f2f112e94c43" containerID="e47943025051f713b963ec70bb6fb2087b75e8c359dc81e61d01a650b30278d8" exitCode=0 Oct 11 01:14:02 crc kubenswrapper[4743]: I1011 01:14:02.026530 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-jqs8c" event={"ID":"70c3253b-404c-4db1-a0e5-f2f112e94c43","Type":"ContainerDied","Data":"e47943025051f713b963ec70bb6fb2087b75e8c359dc81e61d01a650b30278d8"} Oct 11 01:14:02 crc kubenswrapper[4743]: I1011 01:14:02.068968 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 11 01:14:02 crc kubenswrapper[4743]: I1011 01:14:02.099328 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-688b9f5b49-zn2q6" Oct 11 01:14:02 crc kubenswrapper[4743]: I1011 01:14:02.137608 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="54f2bf58-66f4-4f64-9937-6cd0cc80403e" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.214:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 11 01:14:02 crc kubenswrapper[4743]: I1011 01:14:02.137934 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="54f2bf58-66f4-4f64-9937-6cd0cc80403e" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.214:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 11 01:14:02 crc kubenswrapper[4743]: I1011 01:14:02.225917 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0eee5a3c-bfbc-4975-ae73-2a33d414993d-dns-swift-storage-0\") pod \"0eee5a3c-bfbc-4975-ae73-2a33d414993d\" (UID: \"0eee5a3c-bfbc-4975-ae73-2a33d414993d\") " Oct 11 01:14:02 crc kubenswrapper[4743]: I1011 01:14:02.226007 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0eee5a3c-bfbc-4975-ae73-2a33d414993d-dns-svc\") pod \"0eee5a3c-bfbc-4975-ae73-2a33d414993d\" (UID: \"0eee5a3c-bfbc-4975-ae73-2a33d414993d\") " Oct 11 01:14:02 crc kubenswrapper[4743]: I1011 01:14:02.226155 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tljpl\" (UniqueName: \"kubernetes.io/projected/0eee5a3c-bfbc-4975-ae73-2a33d414993d-kube-api-access-tljpl\") pod \"0eee5a3c-bfbc-4975-ae73-2a33d414993d\" (UID: \"0eee5a3c-bfbc-4975-ae73-2a33d414993d\") " Oct 11 01:14:02 crc kubenswrapper[4743]: I1011 01:14:02.226190 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0eee5a3c-bfbc-4975-ae73-2a33d414993d-ovsdbserver-nb\") pod \"0eee5a3c-bfbc-4975-ae73-2a33d414993d\" (UID: \"0eee5a3c-bfbc-4975-ae73-2a33d414993d\") " Oct 11 01:14:02 crc kubenswrapper[4743]: I1011 01:14:02.226229 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0eee5a3c-bfbc-4975-ae73-2a33d414993d-config\") pod \"0eee5a3c-bfbc-4975-ae73-2a33d414993d\" (UID: \"0eee5a3c-bfbc-4975-ae73-2a33d414993d\") " Oct 11 01:14:02 crc kubenswrapper[4743]: I1011 01:14:02.226308 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0eee5a3c-bfbc-4975-ae73-2a33d414993d-ovsdbserver-sb\") pod \"0eee5a3c-bfbc-4975-ae73-2a33d414993d\" (UID: \"0eee5a3c-bfbc-4975-ae73-2a33d414993d\") " Oct 11 01:14:02 crc kubenswrapper[4743]: I1011 01:14:02.241656 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0eee5a3c-bfbc-4975-ae73-2a33d414993d-kube-api-access-tljpl" (OuterVolumeSpecName: "kube-api-access-tljpl") pod "0eee5a3c-bfbc-4975-ae73-2a33d414993d" (UID: "0eee5a3c-bfbc-4975-ae73-2a33d414993d"). InnerVolumeSpecName "kube-api-access-tljpl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:14:02 crc kubenswrapper[4743]: I1011 01:14:02.298646 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0eee5a3c-bfbc-4975-ae73-2a33d414993d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0eee5a3c-bfbc-4975-ae73-2a33d414993d" (UID: "0eee5a3c-bfbc-4975-ae73-2a33d414993d"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:14:02 crc kubenswrapper[4743]: I1011 01:14:02.306745 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0eee5a3c-bfbc-4975-ae73-2a33d414993d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "0eee5a3c-bfbc-4975-ae73-2a33d414993d" (UID: "0eee5a3c-bfbc-4975-ae73-2a33d414993d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:14:02 crc kubenswrapper[4743]: I1011 01:14:02.308043 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0eee5a3c-bfbc-4975-ae73-2a33d414993d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0eee5a3c-bfbc-4975-ae73-2a33d414993d" (UID: "0eee5a3c-bfbc-4975-ae73-2a33d414993d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:14:02 crc kubenswrapper[4743]: I1011 01:14:02.315362 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0eee5a3c-bfbc-4975-ae73-2a33d414993d-config" (OuterVolumeSpecName: "config") pod "0eee5a3c-bfbc-4975-ae73-2a33d414993d" (UID: "0eee5a3c-bfbc-4975-ae73-2a33d414993d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:14:02 crc kubenswrapper[4743]: I1011 01:14:02.322850 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0eee5a3c-bfbc-4975-ae73-2a33d414993d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0eee5a3c-bfbc-4975-ae73-2a33d414993d" (UID: "0eee5a3c-bfbc-4975-ae73-2a33d414993d"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:14:02 crc kubenswrapper[4743]: I1011 01:14:02.329490 4743 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0eee5a3c-bfbc-4975-ae73-2a33d414993d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 11 01:14:02 crc kubenswrapper[4743]: I1011 01:14:02.329524 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0eee5a3c-bfbc-4975-ae73-2a33d414993d-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 11 01:14:02 crc kubenswrapper[4743]: I1011 01:14:02.329536 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tljpl\" (UniqueName: \"kubernetes.io/projected/0eee5a3c-bfbc-4975-ae73-2a33d414993d-kube-api-access-tljpl\") on node \"crc\" DevicePath \"\"" Oct 11 01:14:02 crc kubenswrapper[4743]: I1011 01:14:02.329548 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0eee5a3c-bfbc-4975-ae73-2a33d414993d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 11 01:14:02 crc kubenswrapper[4743]: I1011 01:14:02.329556 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0eee5a3c-bfbc-4975-ae73-2a33d414993d-config\") on node \"crc\" DevicePath \"\"" Oct 11 01:14:02 crc kubenswrapper[4743]: I1011 01:14:02.329564 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0eee5a3c-bfbc-4975-ae73-2a33d414993d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 11 01:14:03 crc kubenswrapper[4743]: I1011 01:14:03.042814 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688b9f5b49-zn2q6" event={"ID":"0eee5a3c-bfbc-4975-ae73-2a33d414993d","Type":"ContainerDied","Data":"d81cc8604a270822001d29705df8201b8f9a2d7fc2322c78dba5267eb154f4ac"} Oct 11 01:14:03 crc 
kubenswrapper[4743]: I1011 01:14:03.043244 4743 scope.go:117] "RemoveContainer" containerID="a47dc85cfad70d954a488b40336b8c7e14e47decfbc6af22ee4cd25cfd7faf3a" Oct 11 01:14:03 crc kubenswrapper[4743]: I1011 01:14:03.042962 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-688b9f5b49-zn2q6" Oct 11 01:14:03 crc kubenswrapper[4743]: I1011 01:14:03.088128 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-688b9f5b49-zn2q6"] Oct 11 01:14:03 crc kubenswrapper[4743]: I1011 01:14:03.091010 4743 scope.go:117] "RemoveContainer" containerID="f7267109be097986e6cf7305fe5daa2a45df167979ef4f092249b0121c3614cb" Oct 11 01:14:03 crc kubenswrapper[4743]: I1011 01:14:03.100637 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-688b9f5b49-zn2q6"] Oct 11 01:14:03 crc kubenswrapper[4743]: I1011 01:14:03.616682 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-wjk7q" Oct 11 01:14:03 crc kubenswrapper[4743]: I1011 01:14:03.623337 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-jqs8c" Oct 11 01:14:03 crc kubenswrapper[4743]: I1011 01:14:03.760743 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70c3253b-404c-4db1-a0e5-f2f112e94c43-scripts\") pod \"70c3253b-404c-4db1-a0e5-f2f112e94c43\" (UID: \"70c3253b-404c-4db1-a0e5-f2f112e94c43\") " Oct 11 01:14:03 crc kubenswrapper[4743]: I1011 01:14:03.760843 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8f6k\" (UniqueName: \"kubernetes.io/projected/ec1be3a1-8fa8-4aa6-b61f-dcc2de3f0d16-kube-api-access-f8f6k\") pod \"ec1be3a1-8fa8-4aa6-b61f-dcc2de3f0d16\" (UID: \"ec1be3a1-8fa8-4aa6-b61f-dcc2de3f0d16\") " Oct 11 01:14:03 crc kubenswrapper[4743]: I1011 01:14:03.760908 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec1be3a1-8fa8-4aa6-b61f-dcc2de3f0d16-config-data\") pod \"ec1be3a1-8fa8-4aa6-b61f-dcc2de3f0d16\" (UID: \"ec1be3a1-8fa8-4aa6-b61f-dcc2de3f0d16\") " Oct 11 01:14:03 crc kubenswrapper[4743]: I1011 01:14:03.761134 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70c3253b-404c-4db1-a0e5-f2f112e94c43-config-data\") pod \"70c3253b-404c-4db1-a0e5-f2f112e94c43\" (UID: \"70c3253b-404c-4db1-a0e5-f2f112e94c43\") " Oct 11 01:14:03 crc kubenswrapper[4743]: I1011 01:14:03.762407 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec1be3a1-8fa8-4aa6-b61f-dcc2de3f0d16-combined-ca-bundle\") pod \"ec1be3a1-8fa8-4aa6-b61f-dcc2de3f0d16\" (UID: \"ec1be3a1-8fa8-4aa6-b61f-dcc2de3f0d16\") " Oct 11 01:14:03 crc kubenswrapper[4743]: I1011 01:14:03.762471 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gld77\" 
(UniqueName: \"kubernetes.io/projected/70c3253b-404c-4db1-a0e5-f2f112e94c43-kube-api-access-gld77\") pod \"70c3253b-404c-4db1-a0e5-f2f112e94c43\" (UID: \"70c3253b-404c-4db1-a0e5-f2f112e94c43\") " Oct 11 01:14:03 crc kubenswrapper[4743]: I1011 01:14:03.762530 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70c3253b-404c-4db1-a0e5-f2f112e94c43-combined-ca-bundle\") pod \"70c3253b-404c-4db1-a0e5-f2f112e94c43\" (UID: \"70c3253b-404c-4db1-a0e5-f2f112e94c43\") " Oct 11 01:14:03 crc kubenswrapper[4743]: I1011 01:14:03.762579 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec1be3a1-8fa8-4aa6-b61f-dcc2de3f0d16-scripts\") pod \"ec1be3a1-8fa8-4aa6-b61f-dcc2de3f0d16\" (UID: \"ec1be3a1-8fa8-4aa6-b61f-dcc2de3f0d16\") " Oct 11 01:14:03 crc kubenswrapper[4743]: I1011 01:14:03.766786 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70c3253b-404c-4db1-a0e5-f2f112e94c43-scripts" (OuterVolumeSpecName: "scripts") pod "70c3253b-404c-4db1-a0e5-f2f112e94c43" (UID: "70c3253b-404c-4db1-a0e5-f2f112e94c43"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:14:03 crc kubenswrapper[4743]: I1011 01:14:03.768909 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec1be3a1-8fa8-4aa6-b61f-dcc2de3f0d16-kube-api-access-f8f6k" (OuterVolumeSpecName: "kube-api-access-f8f6k") pod "ec1be3a1-8fa8-4aa6-b61f-dcc2de3f0d16" (UID: "ec1be3a1-8fa8-4aa6-b61f-dcc2de3f0d16"). InnerVolumeSpecName "kube-api-access-f8f6k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:14:03 crc kubenswrapper[4743]: I1011 01:14:03.769497 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec1be3a1-8fa8-4aa6-b61f-dcc2de3f0d16-scripts" (OuterVolumeSpecName: "scripts") pod "ec1be3a1-8fa8-4aa6-b61f-dcc2de3f0d16" (UID: "ec1be3a1-8fa8-4aa6-b61f-dcc2de3f0d16"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:14:03 crc kubenswrapper[4743]: I1011 01:14:03.769742 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70c3253b-404c-4db1-a0e5-f2f112e94c43-kube-api-access-gld77" (OuterVolumeSpecName: "kube-api-access-gld77") pod "70c3253b-404c-4db1-a0e5-f2f112e94c43" (UID: "70c3253b-404c-4db1-a0e5-f2f112e94c43"). InnerVolumeSpecName "kube-api-access-gld77". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:14:03 crc kubenswrapper[4743]: I1011 01:14:03.793302 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec1be3a1-8fa8-4aa6-b61f-dcc2de3f0d16-config-data" (OuterVolumeSpecName: "config-data") pod "ec1be3a1-8fa8-4aa6-b61f-dcc2de3f0d16" (UID: "ec1be3a1-8fa8-4aa6-b61f-dcc2de3f0d16"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:14:03 crc kubenswrapper[4743]: I1011 01:14:03.795590 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70c3253b-404c-4db1-a0e5-f2f112e94c43-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "70c3253b-404c-4db1-a0e5-f2f112e94c43" (UID: "70c3253b-404c-4db1-a0e5-f2f112e94c43"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:14:03 crc kubenswrapper[4743]: I1011 01:14:03.801416 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70c3253b-404c-4db1-a0e5-f2f112e94c43-config-data" (OuterVolumeSpecName: "config-data") pod "70c3253b-404c-4db1-a0e5-f2f112e94c43" (UID: "70c3253b-404c-4db1-a0e5-f2f112e94c43"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:14:03 crc kubenswrapper[4743]: I1011 01:14:03.820732 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec1be3a1-8fa8-4aa6-b61f-dcc2de3f0d16-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ec1be3a1-8fa8-4aa6-b61f-dcc2de3f0d16" (UID: "ec1be3a1-8fa8-4aa6-b61f-dcc2de3f0d16"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:14:03 crc kubenswrapper[4743]: I1011 01:14:03.865282 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec1be3a1-8fa8-4aa6-b61f-dcc2de3f0d16-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 01:14:03 crc kubenswrapper[4743]: I1011 01:14:03.865318 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gld77\" (UniqueName: \"kubernetes.io/projected/70c3253b-404c-4db1-a0e5-f2f112e94c43-kube-api-access-gld77\") on node \"crc\" DevicePath \"\"" Oct 11 01:14:03 crc kubenswrapper[4743]: I1011 01:14:03.865368 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70c3253b-404c-4db1-a0e5-f2f112e94c43-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 01:14:03 crc kubenswrapper[4743]: I1011 01:14:03.865380 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec1be3a1-8fa8-4aa6-b61f-dcc2de3f0d16-scripts\") on node \"crc\" 
DevicePath \"\"" Oct 11 01:14:03 crc kubenswrapper[4743]: I1011 01:14:03.865392 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70c3253b-404c-4db1-a0e5-f2f112e94c43-scripts\") on node \"crc\" DevicePath \"\"" Oct 11 01:14:03 crc kubenswrapper[4743]: I1011 01:14:03.865400 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8f6k\" (UniqueName: \"kubernetes.io/projected/ec1be3a1-8fa8-4aa6-b61f-dcc2de3f0d16-kube-api-access-f8f6k\") on node \"crc\" DevicePath \"\"" Oct 11 01:14:03 crc kubenswrapper[4743]: I1011 01:14:03.865408 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec1be3a1-8fa8-4aa6-b61f-dcc2de3f0d16-config-data\") on node \"crc\" DevicePath \"\"" Oct 11 01:14:03 crc kubenswrapper[4743]: I1011 01:14:03.865435 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70c3253b-404c-4db1-a0e5-f2f112e94c43-config-data\") on node \"crc\" DevicePath \"\"" Oct 11 01:14:04 crc kubenswrapper[4743]: I1011 01:14:04.060205 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-wjk7q" event={"ID":"ec1be3a1-8fa8-4aa6-b61f-dcc2de3f0d16","Type":"ContainerDied","Data":"49d041f7e323234b0a848fac843aa5f262b773f18ec8a99278a12f558c7c1f0b"} Oct 11 01:14:04 crc kubenswrapper[4743]: I1011 01:14:04.063941 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="49d041f7e323234b0a848fac843aa5f262b773f18ec8a99278a12f558c7c1f0b" Oct 11 01:14:04 crc kubenswrapper[4743]: I1011 01:14:04.064106 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-wjk7q" Oct 11 01:14:04 crc kubenswrapper[4743]: I1011 01:14:04.083095 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-jqs8c" Oct 11 01:14:04 crc kubenswrapper[4743]: I1011 01:14:04.085581 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-jqs8c" event={"ID":"70c3253b-404c-4db1-a0e5-f2f112e94c43","Type":"ContainerDied","Data":"3ce501f781c8c84bca4a55272c863684ea0c32214c1735b649558186a332f583"} Oct 11 01:14:04 crc kubenswrapper[4743]: I1011 01:14:04.085654 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ce501f781c8c84bca4a55272c863684ea0c32214c1735b649558186a332f583" Oct 11 01:14:04 crc kubenswrapper[4743]: I1011 01:14:04.115622 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0eee5a3c-bfbc-4975-ae73-2a33d414993d" path="/var/lib/kubelet/pods/0eee5a3c-bfbc-4975-ae73-2a33d414993d/volumes" Oct 11 01:14:04 crc kubenswrapper[4743]: I1011 01:14:04.164842 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 11 01:14:04 crc kubenswrapper[4743]: E1011 01:14:04.165460 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70c3253b-404c-4db1-a0e5-f2f112e94c43" containerName="nova-manage" Oct 11 01:14:04 crc kubenswrapper[4743]: I1011 01:14:04.165490 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="70c3253b-404c-4db1-a0e5-f2f112e94c43" containerName="nova-manage" Oct 11 01:14:04 crc kubenswrapper[4743]: E1011 01:14:04.165561 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec1be3a1-8fa8-4aa6-b61f-dcc2de3f0d16" containerName="nova-cell1-conductor-db-sync" Oct 11 01:14:04 crc kubenswrapper[4743]: I1011 01:14:04.165575 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec1be3a1-8fa8-4aa6-b61f-dcc2de3f0d16" containerName="nova-cell1-conductor-db-sync" Oct 11 01:14:04 crc kubenswrapper[4743]: E1011 01:14:04.165600 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0eee5a3c-bfbc-4975-ae73-2a33d414993d" 
containerName="init" Oct 11 01:14:04 crc kubenswrapper[4743]: I1011 01:14:04.165613 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="0eee5a3c-bfbc-4975-ae73-2a33d414993d" containerName="init" Oct 11 01:14:04 crc kubenswrapper[4743]: E1011 01:14:04.165631 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0eee5a3c-bfbc-4975-ae73-2a33d414993d" containerName="dnsmasq-dns" Oct 11 01:14:04 crc kubenswrapper[4743]: I1011 01:14:04.165642 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="0eee5a3c-bfbc-4975-ae73-2a33d414993d" containerName="dnsmasq-dns" Oct 11 01:14:04 crc kubenswrapper[4743]: I1011 01:14:04.166021 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="70c3253b-404c-4db1-a0e5-f2f112e94c43" containerName="nova-manage" Oct 11 01:14:04 crc kubenswrapper[4743]: I1011 01:14:04.166047 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec1be3a1-8fa8-4aa6-b61f-dcc2de3f0d16" containerName="nova-cell1-conductor-db-sync" Oct 11 01:14:04 crc kubenswrapper[4743]: I1011 01:14:04.166075 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="0eee5a3c-bfbc-4975-ae73-2a33d414993d" containerName="dnsmasq-dns" Oct 11 01:14:04 crc kubenswrapper[4743]: I1011 01:14:04.167229 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 11 01:14:04 crc kubenswrapper[4743]: I1011 01:14:04.171103 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 11 01:14:04 crc kubenswrapper[4743]: I1011 01:14:04.192920 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 11 01:14:04 crc kubenswrapper[4743]: I1011 01:14:04.241336 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 11 01:14:04 crc kubenswrapper[4743]: I1011 01:14:04.241619 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="54f2bf58-66f4-4f64-9937-6cd0cc80403e" containerName="nova-api-log" containerID="cri-o://2f5bd9f394e7190687c58e3063d788de72a8546d59a54993fcf3bd95a41d4891" gracePeriod=30 Oct 11 01:14:04 crc kubenswrapper[4743]: I1011 01:14:04.241691 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="54f2bf58-66f4-4f64-9937-6cd0cc80403e" containerName="nova-api-api" containerID="cri-o://513336ce15942b9580a08708701e9a2bee3cb7eb4720e908b228d63fbe74dcd7" gracePeriod=30 Oct 11 01:14:04 crc kubenswrapper[4743]: I1011 01:14:04.281684 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f8d6d52-4659-4bde-8eac-469d0008964d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"6f8d6d52-4659-4bde-8eac-469d0008964d\") " pod="openstack/nova-cell1-conductor-0" Oct 11 01:14:04 crc kubenswrapper[4743]: I1011 01:14:04.281899 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f8d6d52-4659-4bde-8eac-469d0008964d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"6f8d6d52-4659-4bde-8eac-469d0008964d\") " 
pod="openstack/nova-cell1-conductor-0" Oct 11 01:14:04 crc kubenswrapper[4743]: I1011 01:14:04.282316 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmrx9\" (UniqueName: \"kubernetes.io/projected/6f8d6d52-4659-4bde-8eac-469d0008964d-kube-api-access-nmrx9\") pod \"nova-cell1-conductor-0\" (UID: \"6f8d6d52-4659-4bde-8eac-469d0008964d\") " pod="openstack/nova-cell1-conductor-0" Oct 11 01:14:04 crc kubenswrapper[4743]: I1011 01:14:04.309220 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 11 01:14:04 crc kubenswrapper[4743]: I1011 01:14:04.309496 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="d441a96d-dada-477d-aa48-0b467c00d5a0" containerName="nova-scheduler-scheduler" containerID="cri-o://b087a3f710ec0f6aa5d68b818ddb2a7f5cf3db75204a3abcf924510a40755947" gracePeriod=30 Oct 11 01:14:04 crc kubenswrapper[4743]: I1011 01:14:04.322651 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 11 01:14:04 crc kubenswrapper[4743]: I1011 01:14:04.322907 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b9360186-9314-44aa-8203-f09057d76e6a" containerName="nova-metadata-log" containerID="cri-o://a5911da0e5120c192e5b67191329ee08c6f320810a99abe14fddc8e4af19b79d" gracePeriod=30 Oct 11 01:14:04 crc kubenswrapper[4743]: I1011 01:14:04.323055 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b9360186-9314-44aa-8203-f09057d76e6a" containerName="nova-metadata-metadata" containerID="cri-o://d3f58b59a52c8aef8a78ff7083541bb004e80f8d9cb9a233a388614a45c77c3f" gracePeriod=30 Oct 11 01:14:04 crc kubenswrapper[4743]: I1011 01:14:04.359415 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 11 
01:14:04 crc kubenswrapper[4743]: I1011 01:14:04.359525 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 11 01:14:04 crc kubenswrapper[4743]: I1011 01:14:04.383975 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f8d6d52-4659-4bde-8eac-469d0008964d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"6f8d6d52-4659-4bde-8eac-469d0008964d\") " pod="openstack/nova-cell1-conductor-0" Oct 11 01:14:04 crc kubenswrapper[4743]: I1011 01:14:04.384099 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f8d6d52-4659-4bde-8eac-469d0008964d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"6f8d6d52-4659-4bde-8eac-469d0008964d\") " pod="openstack/nova-cell1-conductor-0" Oct 11 01:14:04 crc kubenswrapper[4743]: I1011 01:14:04.384201 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmrx9\" (UniqueName: \"kubernetes.io/projected/6f8d6d52-4659-4bde-8eac-469d0008964d-kube-api-access-nmrx9\") pod \"nova-cell1-conductor-0\" (UID: \"6f8d6d52-4659-4bde-8eac-469d0008964d\") " pod="openstack/nova-cell1-conductor-0" Oct 11 01:14:04 crc kubenswrapper[4743]: I1011 01:14:04.388706 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f8d6d52-4659-4bde-8eac-469d0008964d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"6f8d6d52-4659-4bde-8eac-469d0008964d\") " pod="openstack/nova-cell1-conductor-0" Oct 11 01:14:04 crc kubenswrapper[4743]: I1011 01:14:04.392376 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f8d6d52-4659-4bde-8eac-469d0008964d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"6f8d6d52-4659-4bde-8eac-469d0008964d\") " 
pod="openstack/nova-cell1-conductor-0" Oct 11 01:14:04 crc kubenswrapper[4743]: I1011 01:14:04.404588 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmrx9\" (UniqueName: \"kubernetes.io/projected/6f8d6d52-4659-4bde-8eac-469d0008964d-kube-api-access-nmrx9\") pod \"nova-cell1-conductor-0\" (UID: \"6f8d6d52-4659-4bde-8eac-469d0008964d\") " pod="openstack/nova-cell1-conductor-0" Oct 11 01:14:04 crc kubenswrapper[4743]: I1011 01:14:04.502608 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 11 01:14:04 crc kubenswrapper[4743]: W1011 01:14:04.509449 4743 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6fd3ff73_a9d9_493a_b4d2_1a8a73cb8d5e.slice/crio-conmon-aab812d8f91da73c9ce3f9c987b2942b54f867c302cf8ee7930cc507d4bf356b.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6fd3ff73_a9d9_493a_b4d2_1a8a73cb8d5e.slice/crio-conmon-aab812d8f91da73c9ce3f9c987b2942b54f867c302cf8ee7930cc507d4bf356b.scope: no such file or directory Oct 11 01:14:04 crc kubenswrapper[4743]: W1011 01:14:04.509503 4743 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6fd3ff73_a9d9_493a_b4d2_1a8a73cb8d5e.slice/crio-aab812d8f91da73c9ce3f9c987b2942b54f867c302cf8ee7930cc507d4bf356b.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6fd3ff73_a9d9_493a_b4d2_1a8a73cb8d5e.slice/crio-aab812d8f91da73c9ce3f9c987b2942b54f867c302cf8ee7930cc507d4bf356b.scope: no such file or directory Oct 11 01:14:04 crc kubenswrapper[4743]: W1011 01:14:04.509536 4743 watcher.go:93] Error while processing event 
("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec1be3a1_8fa8_4aa6_b61f_dcc2de3f0d16.slice/crio-49d041f7e323234b0a848fac843aa5f262b773f18ec8a99278a12f558c7c1f0b": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec1be3a1_8fa8_4aa6_b61f_dcc2de3f0d16.slice/crio-49d041f7e323234b0a848fac843aa5f262b773f18ec8a99278a12f558c7c1f0b: no such file or directory Oct 11 01:14:04 crc kubenswrapper[4743]: W1011 01:14:04.509557 4743 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec1be3a1_8fa8_4aa6_b61f_dcc2de3f0d16.slice/crio-conmon-861b7786b13a062b3e3787a761ca7c7dbe1d6edb76d7f580dc6f805b919dda5c.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec1be3a1_8fa8_4aa6_b61f_dcc2de3f0d16.slice/crio-conmon-861b7786b13a062b3e3787a761ca7c7dbe1d6edb76d7f580dc6f805b919dda5c.scope: no such file or directory Oct 11 01:14:04 crc kubenswrapper[4743]: W1011 01:14:04.509575 4743 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec1be3a1_8fa8_4aa6_b61f_dcc2de3f0d16.slice/crio-861b7786b13a062b3e3787a761ca7c7dbe1d6edb76d7f580dc6f805b919dda5c.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec1be3a1_8fa8_4aa6_b61f_dcc2de3f0d16.slice/crio-861b7786b13a062b3e3787a761ca7c7dbe1d6edb76d7f580dc6f805b919dda5c.scope: no such file or directory Oct 11 01:14:04 crc kubenswrapper[4743]: W1011 01:14:04.510074 4743 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod54f2bf58_66f4_4f64_9937_6cd0cc80403e.slice/crio-conmon-2f5bd9f394e7190687c58e3063d788de72a8546d59a54993fcf3bd95a41d4891.scope": 0x40000100 
== IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod54f2bf58_66f4_4f64_9937_6cd0cc80403e.slice/crio-conmon-2f5bd9f394e7190687c58e3063d788de72a8546d59a54993fcf3bd95a41d4891.scope: no such file or directory Oct 11 01:14:04 crc kubenswrapper[4743]: W1011 01:14:04.510138 4743 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2fe9ad7b_9381_4d88_93cf_2ad81eb684d7.slice/crio-conmon-9f4f1f7bcf8d0971ac90a06a9f44f3431755f6e8d864371f368495d377d5fe4a.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2fe9ad7b_9381_4d88_93cf_2ad81eb684d7.slice/crio-conmon-9f4f1f7bcf8d0971ac90a06a9f44f3431755f6e8d864371f368495d377d5fe4a.scope: no such file or directory Oct 11 01:14:04 crc kubenswrapper[4743]: W1011 01:14:04.510500 4743 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod54f2bf58_66f4_4f64_9937_6cd0cc80403e.slice/crio-2f5bd9f394e7190687c58e3063d788de72a8546d59a54993fcf3bd95a41d4891.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod54f2bf58_66f4_4f64_9937_6cd0cc80403e.slice/crio-2f5bd9f394e7190687c58e3063d788de72a8546d59a54993fcf3bd95a41d4891.scope: no such file or directory Oct 11 01:14:04 crc kubenswrapper[4743]: W1011 01:14:04.512599 4743 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2fe9ad7b_9381_4d88_93cf_2ad81eb684d7.slice/crio-9f4f1f7bcf8d0971ac90a06a9f44f3431755f6e8d864371f368495d377d5fe4a.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch 
/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2fe9ad7b_9381_4d88_93cf_2ad81eb684d7.slice/crio-9f4f1f7bcf8d0971ac90a06a9f44f3431755f6e8d864371f368495d377d5fe4a.scope: no such file or directory Oct 11 01:14:04 crc kubenswrapper[4743]: W1011 01:14:04.512643 4743 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2fe9ad7b_9381_4d88_93cf_2ad81eb684d7.slice/crio-conmon-59b85102c1814cc18dd60a73b29de45cff7f6f85eb488709fb742d80f27764fa.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2fe9ad7b_9381_4d88_93cf_2ad81eb684d7.slice/crio-conmon-59b85102c1814cc18dd60a73b29de45cff7f6f85eb488709fb742d80f27764fa.scope: no such file or directory Oct 11 01:14:04 crc kubenswrapper[4743]: W1011 01:14:04.512662 4743 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2fe9ad7b_9381_4d88_93cf_2ad81eb684d7.slice/crio-59b85102c1814cc18dd60a73b29de45cff7f6f85eb488709fb742d80f27764fa.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2fe9ad7b_9381_4d88_93cf_2ad81eb684d7.slice/crio-59b85102c1814cc18dd60a73b29de45cff7f6f85eb488709fb742d80f27764fa.scope: no such file or directory Oct 11 01:14:04 crc kubenswrapper[4743]: I1011 01:14:04.725944 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-mtbr2"] Oct 11 01:14:04 crc kubenswrapper[4743]: I1011 01:14:04.728113 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-mtbr2" Oct 11 01:14:04 crc kubenswrapper[4743]: I1011 01:14:04.749938 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-mtbr2"] Oct 11 01:14:04 crc kubenswrapper[4743]: I1011 01:14:04.792109 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2k8vs\" (UniqueName: \"kubernetes.io/projected/6e89352a-0aa8-4c41-ba04-4da0c6e59b3d-kube-api-access-2k8vs\") pod \"aodh-db-create-mtbr2\" (UID: \"6e89352a-0aa8-4c41-ba04-4da0c6e59b3d\") " pod="openstack/aodh-db-create-mtbr2" Oct 11 01:14:04 crc kubenswrapper[4743]: I1011 01:14:04.895309 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2k8vs\" (UniqueName: \"kubernetes.io/projected/6e89352a-0aa8-4c41-ba04-4da0c6e59b3d-kube-api-access-2k8vs\") pod \"aodh-db-create-mtbr2\" (UID: \"6e89352a-0aa8-4c41-ba04-4da0c6e59b3d\") " pod="openstack/aodh-db-create-mtbr2" Oct 11 01:14:04 crc kubenswrapper[4743]: I1011 01:14:04.913223 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2k8vs\" (UniqueName: \"kubernetes.io/projected/6e89352a-0aa8-4c41-ba04-4da0c6e59b3d-kube-api-access-2k8vs\") pod \"aodh-db-create-mtbr2\" (UID: \"6e89352a-0aa8-4c41-ba04-4da0c6e59b3d\") " pod="openstack/aodh-db-create-mtbr2" Oct 11 01:14:05 crc kubenswrapper[4743]: I1011 01:14:05.069335 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-mtbr2" Oct 11 01:14:05 crc kubenswrapper[4743]: I1011 01:14:05.089337 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 11 01:14:05 crc kubenswrapper[4743]: I1011 01:14:05.106596 4743 generic.go:334] "Generic (PLEG): container finished" podID="b9360186-9314-44aa-8203-f09057d76e6a" containerID="d3f58b59a52c8aef8a78ff7083541bb004e80f8d9cb9a233a388614a45c77c3f" exitCode=0 Oct 11 01:14:05 crc kubenswrapper[4743]: I1011 01:14:05.106628 4743 generic.go:334] "Generic (PLEG): container finished" podID="b9360186-9314-44aa-8203-f09057d76e6a" containerID="a5911da0e5120c192e5b67191329ee08c6f320810a99abe14fddc8e4af19b79d" exitCode=143 Oct 11 01:14:05 crc kubenswrapper[4743]: I1011 01:14:05.106669 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b9360186-9314-44aa-8203-f09057d76e6a","Type":"ContainerDied","Data":"d3f58b59a52c8aef8a78ff7083541bb004e80f8d9cb9a233a388614a45c77c3f"} Oct 11 01:14:05 crc kubenswrapper[4743]: I1011 01:14:05.106696 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b9360186-9314-44aa-8203-f09057d76e6a","Type":"ContainerDied","Data":"a5911da0e5120c192e5b67191329ee08c6f320810a99abe14fddc8e4af19b79d"} Oct 11 01:14:05 crc kubenswrapper[4743]: I1011 01:14:05.120080 4743 generic.go:334] "Generic (PLEG): container finished" podID="54f2bf58-66f4-4f64-9937-6cd0cc80403e" containerID="2f5bd9f394e7190687c58e3063d788de72a8546d59a54993fcf3bd95a41d4891" exitCode=143 Oct 11 01:14:05 crc kubenswrapper[4743]: I1011 01:14:05.120145 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"54f2bf58-66f4-4f64-9937-6cd0cc80403e","Type":"ContainerDied","Data":"2f5bd9f394e7190687c58e3063d788de72a8546d59a54993fcf3bd95a41d4891"} Oct 11 01:14:05 crc kubenswrapper[4743]: I1011 01:14:05.126118 4743 generic.go:334] "Generic (PLEG): container finished" podID="8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f" containerID="a87a6857e891645b6359c9dd35207ef767024de9d3eec1b0b40667a6c7ac7bd8" exitCode=137 Oct 11 01:14:05 crc 
kubenswrapper[4743]: I1011 01:14:05.126151 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f","Type":"ContainerDied","Data":"a87a6857e891645b6359c9dd35207ef767024de9d3eec1b0b40667a6c7ac7bd8"} Oct 11 01:14:05 crc kubenswrapper[4743]: I1011 01:14:05.126173 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f","Type":"ContainerDied","Data":"9636c2404d3e927d58328feea44f3e5ec43ec692622c74b9e7c585561b13cd92"} Oct 11 01:14:05 crc kubenswrapper[4743]: I1011 01:14:05.126188 4743 scope.go:117] "RemoveContainer" containerID="a87a6857e891645b6359c9dd35207ef767024de9d3eec1b0b40667a6c7ac7bd8" Oct 11 01:14:05 crc kubenswrapper[4743]: I1011 01:14:05.126232 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 11 01:14:05 crc kubenswrapper[4743]: I1011 01:14:05.158519 4743 scope.go:117] "RemoveContainer" containerID="2bf7ce61f9b0e51face763cac5e2585613a9827439d2b87516169a3ed522be48" Oct 11 01:14:05 crc kubenswrapper[4743]: I1011 01:14:05.201212 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f-sg-core-conf-yaml\") pod \"8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f\" (UID: \"8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f\") " Oct 11 01:14:05 crc kubenswrapper[4743]: I1011 01:14:05.201277 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f-combined-ca-bundle\") pod \"8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f\" (UID: \"8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f\") " Oct 11 01:14:05 crc kubenswrapper[4743]: I1011 01:14:05.201320 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f-config-data\") pod \"8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f\" (UID: \"8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f\") " Oct 11 01:14:05 crc kubenswrapper[4743]: I1011 01:14:05.201364 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f-scripts\") pod \"8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f\" (UID: \"8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f\") " Oct 11 01:14:05 crc kubenswrapper[4743]: I1011 01:14:05.201383 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f-run-httpd\") pod \"8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f\" (UID: \"8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f\") " Oct 11 01:14:05 crc kubenswrapper[4743]: I1011 01:14:05.201524 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f-log-httpd\") pod \"8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f\" (UID: \"8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f\") " Oct 11 01:14:05 crc kubenswrapper[4743]: I1011 01:14:05.201584 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhvs5\" (UniqueName: \"kubernetes.io/projected/8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f-kube-api-access-dhvs5\") pod \"8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f\" (UID: \"8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f\") " Oct 11 01:14:05 crc kubenswrapper[4743]: I1011 01:14:05.202654 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f" (UID: "8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:14:05 crc kubenswrapper[4743]: I1011 01:14:05.203280 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f" (UID: "8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:14:05 crc kubenswrapper[4743]: I1011 01:14:05.205046 4743 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 11 01:14:05 crc kubenswrapper[4743]: I1011 01:14:05.205339 4743 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 11 01:14:05 crc kubenswrapper[4743]: I1011 01:14:05.208380 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f-kube-api-access-dhvs5" (OuterVolumeSpecName: "kube-api-access-dhvs5") pod "8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f" (UID: "8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f"). InnerVolumeSpecName "kube-api-access-dhvs5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:14:05 crc kubenswrapper[4743]: I1011 01:14:05.214009 4743 scope.go:117] "RemoveContainer" containerID="02a33364cb7dcca44f2626b9d3c768a1284c1f443c6199ee6a1a01875d1c3528" Oct 11 01:14:05 crc kubenswrapper[4743]: I1011 01:14:05.216102 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f-scripts" (OuterVolumeSpecName: "scripts") pod "8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f" (UID: "8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:14:05 crc kubenswrapper[4743]: I1011 01:14:05.243041 4743 scope.go:117] "RemoveContainer" containerID="a408fbe7dcffa686789536eab5166c077dac6c7b88aa79bf50345b509e3f9f06" Oct 11 01:14:05 crc kubenswrapper[4743]: I1011 01:14:05.252011 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f" (UID: "8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:14:05 crc kubenswrapper[4743]: I1011 01:14:05.301005 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 11 01:14:05 crc kubenswrapper[4743]: I1011 01:14:05.308309 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dhvs5\" (UniqueName: \"kubernetes.io/projected/8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f-kube-api-access-dhvs5\") on node \"crc\" DevicePath \"\"" Oct 11 01:14:05 crc kubenswrapper[4743]: I1011 01:14:05.308333 4743 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 11 01:14:05 crc kubenswrapper[4743]: I1011 01:14:05.308342 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f-scripts\") on node \"crc\" DevicePath \"\"" Oct 11 01:14:05 crc kubenswrapper[4743]: I1011 01:14:05.313310 4743 scope.go:117] "RemoveContainer" containerID="a87a6857e891645b6359c9dd35207ef767024de9d3eec1b0b40667a6c7ac7bd8" Oct 11 01:14:05 crc kubenswrapper[4743]: E1011 01:14:05.313703 4743 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"a87a6857e891645b6359c9dd35207ef767024de9d3eec1b0b40667a6c7ac7bd8\": container with ID starting with a87a6857e891645b6359c9dd35207ef767024de9d3eec1b0b40667a6c7ac7bd8 not found: ID does not exist" containerID="a87a6857e891645b6359c9dd35207ef767024de9d3eec1b0b40667a6c7ac7bd8" Oct 11 01:14:05 crc kubenswrapper[4743]: I1011 01:14:05.313750 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a87a6857e891645b6359c9dd35207ef767024de9d3eec1b0b40667a6c7ac7bd8"} err="failed to get container status \"a87a6857e891645b6359c9dd35207ef767024de9d3eec1b0b40667a6c7ac7bd8\": rpc error: code = NotFound desc = could not find container \"a87a6857e891645b6359c9dd35207ef767024de9d3eec1b0b40667a6c7ac7bd8\": container with ID starting with a87a6857e891645b6359c9dd35207ef767024de9d3eec1b0b40667a6c7ac7bd8 not found: ID does not exist" Oct 11 01:14:05 crc kubenswrapper[4743]: I1011 01:14:05.313783 4743 scope.go:117] "RemoveContainer" containerID="2bf7ce61f9b0e51face763cac5e2585613a9827439d2b87516169a3ed522be48" Oct 11 01:14:05 crc kubenswrapper[4743]: E1011 01:14:05.318019 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2bf7ce61f9b0e51face763cac5e2585613a9827439d2b87516169a3ed522be48\": container with ID starting with 2bf7ce61f9b0e51face763cac5e2585613a9827439d2b87516169a3ed522be48 not found: ID does not exist" containerID="2bf7ce61f9b0e51face763cac5e2585613a9827439d2b87516169a3ed522be48" Oct 11 01:14:05 crc kubenswrapper[4743]: I1011 01:14:05.318056 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bf7ce61f9b0e51face763cac5e2585613a9827439d2b87516169a3ed522be48"} err="failed to get container status \"2bf7ce61f9b0e51face763cac5e2585613a9827439d2b87516169a3ed522be48\": rpc error: code = NotFound desc = could not find container 
\"2bf7ce61f9b0e51face763cac5e2585613a9827439d2b87516169a3ed522be48\": container with ID starting with 2bf7ce61f9b0e51face763cac5e2585613a9827439d2b87516169a3ed522be48 not found: ID does not exist" Oct 11 01:14:05 crc kubenswrapper[4743]: I1011 01:14:05.318081 4743 scope.go:117] "RemoveContainer" containerID="02a33364cb7dcca44f2626b9d3c768a1284c1f443c6199ee6a1a01875d1c3528" Oct 11 01:14:05 crc kubenswrapper[4743]: E1011 01:14:05.318808 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02a33364cb7dcca44f2626b9d3c768a1284c1f443c6199ee6a1a01875d1c3528\": container with ID starting with 02a33364cb7dcca44f2626b9d3c768a1284c1f443c6199ee6a1a01875d1c3528 not found: ID does not exist" containerID="02a33364cb7dcca44f2626b9d3c768a1284c1f443c6199ee6a1a01875d1c3528" Oct 11 01:14:05 crc kubenswrapper[4743]: I1011 01:14:05.318828 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02a33364cb7dcca44f2626b9d3c768a1284c1f443c6199ee6a1a01875d1c3528"} err="failed to get container status \"02a33364cb7dcca44f2626b9d3c768a1284c1f443c6199ee6a1a01875d1c3528\": rpc error: code = NotFound desc = could not find container \"02a33364cb7dcca44f2626b9d3c768a1284c1f443c6199ee6a1a01875d1c3528\": container with ID starting with 02a33364cb7dcca44f2626b9d3c768a1284c1f443c6199ee6a1a01875d1c3528 not found: ID does not exist" Oct 11 01:14:05 crc kubenswrapper[4743]: I1011 01:14:05.318844 4743 scope.go:117] "RemoveContainer" containerID="a408fbe7dcffa686789536eab5166c077dac6c7b88aa79bf50345b509e3f9f06" Oct 11 01:14:05 crc kubenswrapper[4743]: E1011 01:14:05.320076 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a408fbe7dcffa686789536eab5166c077dac6c7b88aa79bf50345b509e3f9f06\": container with ID starting with a408fbe7dcffa686789536eab5166c077dac6c7b88aa79bf50345b509e3f9f06 not found: ID does not exist" 
containerID="a408fbe7dcffa686789536eab5166c077dac6c7b88aa79bf50345b509e3f9f06" Oct 11 01:14:05 crc kubenswrapper[4743]: I1011 01:14:05.320111 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a408fbe7dcffa686789536eab5166c077dac6c7b88aa79bf50345b509e3f9f06"} err="failed to get container status \"a408fbe7dcffa686789536eab5166c077dac6c7b88aa79bf50345b509e3f9f06\": rpc error: code = NotFound desc = could not find container \"a408fbe7dcffa686789536eab5166c077dac6c7b88aa79bf50345b509e3f9f06\": container with ID starting with a408fbe7dcffa686789536eab5166c077dac6c7b88aa79bf50345b509e3f9f06 not found: ID does not exist" Oct 11 01:14:05 crc kubenswrapper[4743]: I1011 01:14:05.351199 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f" (UID: "8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:14:05 crc kubenswrapper[4743]: I1011 01:14:05.358922 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:05.375060 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f-config-data" (OuterVolumeSpecName: "config-data") pod "8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f" (UID: "8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:05.409976 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9360186-9314-44aa-8203-f09057d76e6a-combined-ca-bundle\") pod \"b9360186-9314-44aa-8203-f09057d76e6a\" (UID: \"b9360186-9314-44aa-8203-f09057d76e6a\") " Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:05.410080 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v928k\" (UniqueName: \"kubernetes.io/projected/b9360186-9314-44aa-8203-f09057d76e6a-kube-api-access-v928k\") pod \"b9360186-9314-44aa-8203-f09057d76e6a\" (UID: \"b9360186-9314-44aa-8203-f09057d76e6a\") " Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:05.410283 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9360186-9314-44aa-8203-f09057d76e6a-logs\") pod \"b9360186-9314-44aa-8203-f09057d76e6a\" (UID: \"b9360186-9314-44aa-8203-f09057d76e6a\") " Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:05.410348 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9360186-9314-44aa-8203-f09057d76e6a-nova-metadata-tls-certs\") pod \"b9360186-9314-44aa-8203-f09057d76e6a\" (UID: \"b9360186-9314-44aa-8203-f09057d76e6a\") " Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:05.410451 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9360186-9314-44aa-8203-f09057d76e6a-config-data\") pod \"b9360186-9314-44aa-8203-f09057d76e6a\" (UID: \"b9360186-9314-44aa-8203-f09057d76e6a\") " Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:05.410552 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/b9360186-9314-44aa-8203-f09057d76e6a-logs" (OuterVolumeSpecName: "logs") pod "b9360186-9314-44aa-8203-f09057d76e6a" (UID: "b9360186-9314-44aa-8203-f09057d76e6a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:05.410863 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:05.410877 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f-config-data\") on node \"crc\" DevicePath \"\"" Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:05.410886 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9360186-9314-44aa-8203-f09057d76e6a-logs\") on node \"crc\" DevicePath \"\"" Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:05.413001 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9360186-9314-44aa-8203-f09057d76e6a-kube-api-access-v928k" (OuterVolumeSpecName: "kube-api-access-v928k") pod "b9360186-9314-44aa-8203-f09057d76e6a" (UID: "b9360186-9314-44aa-8203-f09057d76e6a"). InnerVolumeSpecName "kube-api-access-v928k". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:05.435730 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9360186-9314-44aa-8203-f09057d76e6a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b9360186-9314-44aa-8203-f09057d76e6a" (UID: "b9360186-9314-44aa-8203-f09057d76e6a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:05.459149 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9360186-9314-44aa-8203-f09057d76e6a-config-data" (OuterVolumeSpecName: "config-data") pod "b9360186-9314-44aa-8203-f09057d76e6a" (UID: "b9360186-9314-44aa-8203-f09057d76e6a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:05.479739 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:05.486969 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9360186-9314-44aa-8203-f09057d76e6a-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "b9360186-9314-44aa-8203-f09057d76e6a" (UID: "b9360186-9314-44aa-8203-f09057d76e6a"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:05.509212 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:05.512967 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v928k\" (UniqueName: \"kubernetes.io/projected/b9360186-9314-44aa-8203-f09057d76e6a-kube-api-access-v928k\") on node \"crc\" DevicePath \"\"" Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:05.513000 4743 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9360186-9314-44aa-8203-f09057d76e6a-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:05.513012 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9360186-9314-44aa-8203-f09057d76e6a-config-data\") on node \"crc\" DevicePath \"\"" Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:05.513024 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9360186-9314-44aa-8203-f09057d76e6a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:05.522229 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 11 01:14:06 crc kubenswrapper[4743]: E1011 01:14:05.522639 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9360186-9314-44aa-8203-f09057d76e6a" containerName="nova-metadata-metadata" Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:05.522651 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9360186-9314-44aa-8203-f09057d76e6a" containerName="nova-metadata-metadata" Oct 11 01:14:06 crc kubenswrapper[4743]: E1011 01:14:05.522666 4743 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f" containerName="ceilometer-notification-agent" Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:05.522671 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f" containerName="ceilometer-notification-agent" Oct 11 01:14:06 crc kubenswrapper[4743]: E1011 01:14:05.522682 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9360186-9314-44aa-8203-f09057d76e6a" containerName="nova-metadata-log" Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:05.522688 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9360186-9314-44aa-8203-f09057d76e6a" containerName="nova-metadata-log" Oct 11 01:14:06 crc kubenswrapper[4743]: E1011 01:14:05.522706 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f" containerName="ceilometer-central-agent" Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:05.522712 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f" containerName="ceilometer-central-agent" Oct 11 01:14:06 crc kubenswrapper[4743]: E1011 01:14:05.522726 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f" containerName="proxy-httpd" Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:05.522732 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f" containerName="proxy-httpd" Oct 11 01:14:06 crc kubenswrapper[4743]: E1011 01:14:05.522742 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f" containerName="sg-core" Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:05.522747 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f" containerName="sg-core" Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:05.522959 4743 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="b9360186-9314-44aa-8203-f09057d76e6a" containerName="nova-metadata-metadata" Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:05.522968 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f" containerName="ceilometer-central-agent" Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:05.522979 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f" containerName="proxy-httpd" Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:05.522993 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f" containerName="sg-core" Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:05.523003 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f" containerName="ceilometer-notification-agent" Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:05.523014 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9360186-9314-44aa-8203-f09057d76e6a" containerName="nova-metadata-log" Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:05.524772 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:05.528092 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:05.528164 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:05.529944 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:05.614529 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ebe70c7f-81a8-4b0b-84de-05d3ed8cb584-scripts\") pod \"ceilometer-0\" (UID: \"ebe70c7f-81a8-4b0b-84de-05d3ed8cb584\") " pod="openstack/ceilometer-0" Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:05.614584 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebe70c7f-81a8-4b0b-84de-05d3ed8cb584-config-data\") pod \"ceilometer-0\" (UID: \"ebe70c7f-81a8-4b0b-84de-05d3ed8cb584\") " pod="openstack/ceilometer-0" Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:05.614611 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ebe70c7f-81a8-4b0b-84de-05d3ed8cb584-log-httpd\") pod \"ceilometer-0\" (UID: \"ebe70c7f-81a8-4b0b-84de-05d3ed8cb584\") " pod="openstack/ceilometer-0" Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:05.614873 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-555cd\" (UniqueName: \"kubernetes.io/projected/ebe70c7f-81a8-4b0b-84de-05d3ed8cb584-kube-api-access-555cd\") pod \"ceilometer-0\" (UID: \"ebe70c7f-81a8-4b0b-84de-05d3ed8cb584\") " 
pod="openstack/ceilometer-0" Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:05.615024 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ebe70c7f-81a8-4b0b-84de-05d3ed8cb584-run-httpd\") pod \"ceilometer-0\" (UID: \"ebe70c7f-81a8-4b0b-84de-05d3ed8cb584\") " pod="openstack/ceilometer-0" Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:05.615086 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ebe70c7f-81a8-4b0b-84de-05d3ed8cb584-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ebe70c7f-81a8-4b0b-84de-05d3ed8cb584\") " pod="openstack/ceilometer-0" Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:05.615215 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebe70c7f-81a8-4b0b-84de-05d3ed8cb584-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ebe70c7f-81a8-4b0b-84de-05d3ed8cb584\") " pod="openstack/ceilometer-0" Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:05.661731 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-mtbr2"] Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:05.716950 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebe70c7f-81a8-4b0b-84de-05d3ed8cb584-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ebe70c7f-81a8-4b0b-84de-05d3ed8cb584\") " pod="openstack/ceilometer-0" Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:05.717046 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ebe70c7f-81a8-4b0b-84de-05d3ed8cb584-scripts\") pod \"ceilometer-0\" (UID: \"ebe70c7f-81a8-4b0b-84de-05d3ed8cb584\") " 
pod="openstack/ceilometer-0" Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:05.717065 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebe70c7f-81a8-4b0b-84de-05d3ed8cb584-config-data\") pod \"ceilometer-0\" (UID: \"ebe70c7f-81a8-4b0b-84de-05d3ed8cb584\") " pod="openstack/ceilometer-0" Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:05.717087 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ebe70c7f-81a8-4b0b-84de-05d3ed8cb584-log-httpd\") pod \"ceilometer-0\" (UID: \"ebe70c7f-81a8-4b0b-84de-05d3ed8cb584\") " pod="openstack/ceilometer-0" Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:05.717121 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-555cd\" (UniqueName: \"kubernetes.io/projected/ebe70c7f-81a8-4b0b-84de-05d3ed8cb584-kube-api-access-555cd\") pod \"ceilometer-0\" (UID: \"ebe70c7f-81a8-4b0b-84de-05d3ed8cb584\") " pod="openstack/ceilometer-0" Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:05.717164 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ebe70c7f-81a8-4b0b-84de-05d3ed8cb584-run-httpd\") pod \"ceilometer-0\" (UID: \"ebe70c7f-81a8-4b0b-84de-05d3ed8cb584\") " pod="openstack/ceilometer-0" Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:05.717185 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ebe70c7f-81a8-4b0b-84de-05d3ed8cb584-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ebe70c7f-81a8-4b0b-84de-05d3ed8cb584\") " pod="openstack/ceilometer-0" Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:05.717650 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/ebe70c7f-81a8-4b0b-84de-05d3ed8cb584-run-httpd\") pod \"ceilometer-0\" (UID: \"ebe70c7f-81a8-4b0b-84de-05d3ed8cb584\") " pod="openstack/ceilometer-0" Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:05.717752 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ebe70c7f-81a8-4b0b-84de-05d3ed8cb584-log-httpd\") pod \"ceilometer-0\" (UID: \"ebe70c7f-81a8-4b0b-84de-05d3ed8cb584\") " pod="openstack/ceilometer-0" Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:05.722728 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ebe70c7f-81a8-4b0b-84de-05d3ed8cb584-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ebe70c7f-81a8-4b0b-84de-05d3ed8cb584\") " pod="openstack/ceilometer-0" Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:05.723363 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ebe70c7f-81a8-4b0b-84de-05d3ed8cb584-scripts\") pod \"ceilometer-0\" (UID: \"ebe70c7f-81a8-4b0b-84de-05d3ed8cb584\") " pod="openstack/ceilometer-0" Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:05.724147 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebe70c7f-81a8-4b0b-84de-05d3ed8cb584-config-data\") pod \"ceilometer-0\" (UID: \"ebe70c7f-81a8-4b0b-84de-05d3ed8cb584\") " pod="openstack/ceilometer-0" Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:05.728011 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebe70c7f-81a8-4b0b-84de-05d3ed8cb584-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ebe70c7f-81a8-4b0b-84de-05d3ed8cb584\") " pod="openstack/ceilometer-0" Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:05.736121 4743 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-555cd\" (UniqueName: \"kubernetes.io/projected/ebe70c7f-81a8-4b0b-84de-05d3ed8cb584-kube-api-access-555cd\") pod \"ceilometer-0\" (UID: \"ebe70c7f-81a8-4b0b-84de-05d3ed8cb584\") " pod="openstack/ceilometer-0" Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:05.845995 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:06.116093 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f" path="/var/lib/kubelet/pods/8a5ec0fc-3819-4bbf-81d8-ef9d01e2b96f/volumes" Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:06.145776 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b9360186-9314-44aa-8203-f09057d76e6a","Type":"ContainerDied","Data":"69e88b7b8ad556ede053aa4638f82e07fe468f26e9a2d4287dabddcb7dd05c28"} Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:06.145984 4743 scope.go:117] "RemoveContainer" containerID="d3f58b59a52c8aef8a78ff7083541bb004e80f8d9cb9a233a388614a45c77c3f" Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:06.145787 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:06.149357 4743 generic.go:334] "Generic (PLEG): container finished" podID="6e89352a-0aa8-4c41-ba04-4da0c6e59b3d" containerID="30e71ad943a8da42b9307bce0b943dc03e0fdaee993ac5e9e252bea62c476658" exitCode=0 Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:06.149527 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-mtbr2" event={"ID":"6e89352a-0aa8-4c41-ba04-4da0c6e59b3d","Type":"ContainerDied","Data":"30e71ad943a8da42b9307bce0b943dc03e0fdaee993ac5e9e252bea62c476658"} Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:06.149570 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-mtbr2" event={"ID":"6e89352a-0aa8-4c41-ba04-4da0c6e59b3d","Type":"ContainerStarted","Data":"fd490b56b0646194b2b0548cf5d5fd2376474e19c679a7bc4573da7357dcc673"} Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:06.165214 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"6f8d6d52-4659-4bde-8eac-469d0008964d","Type":"ContainerStarted","Data":"6f6517b762ee4d1e00516a6b54df87635b0e6a526be091108874744ac6b52efc"} Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:06.165401 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"6f8d6d52-4659-4bde-8eac-469d0008964d","Type":"ContainerStarted","Data":"fff6adaa570aa3eded0d591bd27977ca14dea9026716b329da5497d5148d4447"} Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:06.166875 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:06.201641 4743 scope.go:117] "RemoveContainer" containerID="a5911da0e5120c192e5b67191329ee08c6f320810a99abe14fddc8e4af19b79d" Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:06.231911 4743 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/nova-metadata-0"] Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:06.240221 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:06.246425 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.246406907 podStartE2EDuration="2.246406907s" podCreationTimestamp="2025-10-11 01:14:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 01:14:06.201784679 +0000 UTC m=+1340.854765076" watchObservedRunningTime="2025-10-11 01:14:06.246406907 +0000 UTC m=+1340.899387294" Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:06.257822 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:06.263951 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:06.271978 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:06.272090 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:06.298286 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:06.329464 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d793d5fe-737a-4217-a456-da9e894fe4f6-config-data\") pod \"nova-metadata-0\" (UID: \"d793d5fe-737a-4217-a456-da9e894fe4f6\") " pod="openstack/nova-metadata-0" Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:06.329546 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llt4r\" (UniqueName: \"kubernetes.io/projected/d793d5fe-737a-4217-a456-da9e894fe4f6-kube-api-access-llt4r\") pod \"nova-metadata-0\" (UID: \"d793d5fe-737a-4217-a456-da9e894fe4f6\") " pod="openstack/nova-metadata-0" Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:06.329575 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d793d5fe-737a-4217-a456-da9e894fe4f6-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"d793d5fe-737a-4217-a456-da9e894fe4f6\") " pod="openstack/nova-metadata-0" Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:06.329615 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d793d5fe-737a-4217-a456-da9e894fe4f6-logs\") pod \"nova-metadata-0\" 
(UID: \"d793d5fe-737a-4217-a456-da9e894fe4f6\") " pod="openstack/nova-metadata-0" Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:06.329676 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d793d5fe-737a-4217-a456-da9e894fe4f6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d793d5fe-737a-4217-a456-da9e894fe4f6\") " pod="openstack/nova-metadata-0" Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:06.393782 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 11 01:14:06 crc kubenswrapper[4743]: E1011 01:14:06.426502 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b087a3f710ec0f6aa5d68b818ddb2a7f5cf3db75204a3abcf924510a40755947" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 11 01:14:06 crc kubenswrapper[4743]: E1011 01:14:06.428175 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b087a3f710ec0f6aa5d68b818ddb2a7f5cf3db75204a3abcf924510a40755947" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 11 01:14:06 crc kubenswrapper[4743]: E1011 01:14:06.429666 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b087a3f710ec0f6aa5d68b818ddb2a7f5cf3db75204a3abcf924510a40755947" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 11 01:14:06 crc kubenswrapper[4743]: E1011 01:14:06.429698 4743 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, 
stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="d441a96d-dada-477d-aa48-0b467c00d5a0" containerName="nova-scheduler-scheduler" Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:06.431009 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llt4r\" (UniqueName: \"kubernetes.io/projected/d793d5fe-737a-4217-a456-da9e894fe4f6-kube-api-access-llt4r\") pod \"nova-metadata-0\" (UID: \"d793d5fe-737a-4217-a456-da9e894fe4f6\") " pod="openstack/nova-metadata-0" Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:06.431046 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d793d5fe-737a-4217-a456-da9e894fe4f6-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"d793d5fe-737a-4217-a456-da9e894fe4f6\") " pod="openstack/nova-metadata-0" Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:06.431090 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d793d5fe-737a-4217-a456-da9e894fe4f6-logs\") pod \"nova-metadata-0\" (UID: \"d793d5fe-737a-4217-a456-da9e894fe4f6\") " pod="openstack/nova-metadata-0" Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:06.431152 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d793d5fe-737a-4217-a456-da9e894fe4f6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d793d5fe-737a-4217-a456-da9e894fe4f6\") " pod="openstack/nova-metadata-0" Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:06.431216 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d793d5fe-737a-4217-a456-da9e894fe4f6-config-data\") pod \"nova-metadata-0\" (UID: \"d793d5fe-737a-4217-a456-da9e894fe4f6\") " pod="openstack/nova-metadata-0" Oct 
11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:06.432137 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d793d5fe-737a-4217-a456-da9e894fe4f6-logs\") pod \"nova-metadata-0\" (UID: \"d793d5fe-737a-4217-a456-da9e894fe4f6\") " pod="openstack/nova-metadata-0" Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:06.436634 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d793d5fe-737a-4217-a456-da9e894fe4f6-config-data\") pod \"nova-metadata-0\" (UID: \"d793d5fe-737a-4217-a456-da9e894fe4f6\") " pod="openstack/nova-metadata-0" Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:06.436709 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d793d5fe-737a-4217-a456-da9e894fe4f6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d793d5fe-737a-4217-a456-da9e894fe4f6\") " pod="openstack/nova-metadata-0" Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:06.436735 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d793d5fe-737a-4217-a456-da9e894fe4f6-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"d793d5fe-737a-4217-a456-da9e894fe4f6\") " pod="openstack/nova-metadata-0" Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:06.448934 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llt4r\" (UniqueName: \"kubernetes.io/projected/d793d5fe-737a-4217-a456-da9e894fe4f6-kube-api-access-llt4r\") pod \"nova-metadata-0\" (UID: \"d793d5fe-737a-4217-a456-da9e894fe4f6\") " pod="openstack/nova-metadata-0" Oct 11 01:14:06 crc kubenswrapper[4743]: I1011 01:14:06.584036 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 11 01:14:07 crc kubenswrapper[4743]: I1011 01:14:07.105326 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 11 01:14:07 crc kubenswrapper[4743]: I1011 01:14:07.182888 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d793d5fe-737a-4217-a456-da9e894fe4f6","Type":"ContainerStarted","Data":"ddef3838cf5a006593af9ccee9a34115dbc92e3ead78a10dedae00593e259159"} Oct 11 01:14:07 crc kubenswrapper[4743]: I1011 01:14:07.184056 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ebe70c7f-81a8-4b0b-84de-05d3ed8cb584","Type":"ContainerStarted","Data":"d4db77043df5be01297534e501df05b9f20168eda583aee2ef8684072e3019b0"} Oct 11 01:14:07 crc kubenswrapper[4743]: W1011 01:14:07.547674 4743 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod54f2bf58_66f4_4f64_9937_6cd0cc80403e.slice/crio-513336ce15942b9580a08708701e9a2bee3cb7eb4720e908b228d63fbe74dcd7.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod54f2bf58_66f4_4f64_9937_6cd0cc80403e.slice/crio-513336ce15942b9580a08708701e9a2bee3cb7eb4720e908b228d63fbe74dcd7.scope: no such file or directory Oct 11 01:14:07 crc kubenswrapper[4743]: W1011 01:14:07.548480 4743 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb9360186_9314_44aa_8203_f09057d76e6a.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb9360186_9314_44aa_8203_f09057d76e6a.slice: no such file or directory Oct 11 01:14:07 crc kubenswrapper[4743]: I1011 01:14:07.554887 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-mtbr2" Oct 11 01:14:07 crc kubenswrapper[4743]: I1011 01:14:07.653320 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2k8vs\" (UniqueName: \"kubernetes.io/projected/6e89352a-0aa8-4c41-ba04-4da0c6e59b3d-kube-api-access-2k8vs\") pod \"6e89352a-0aa8-4c41-ba04-4da0c6e59b3d\" (UID: \"6e89352a-0aa8-4c41-ba04-4da0c6e59b3d\") " Oct 11 01:14:07 crc kubenswrapper[4743]: I1011 01:14:07.659026 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e89352a-0aa8-4c41-ba04-4da0c6e59b3d-kube-api-access-2k8vs" (OuterVolumeSpecName: "kube-api-access-2k8vs") pod "6e89352a-0aa8-4c41-ba04-4da0c6e59b3d" (UID: "6e89352a-0aa8-4c41-ba04-4da0c6e59b3d"). InnerVolumeSpecName "kube-api-access-2k8vs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:14:07 crc kubenswrapper[4743]: I1011 01:14:07.755891 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2k8vs\" (UniqueName: \"kubernetes.io/projected/6e89352a-0aa8-4c41-ba04-4da0c6e59b3d-kube-api-access-2k8vs\") on node \"crc\" DevicePath \"\"" Oct 11 01:14:08 crc kubenswrapper[4743]: I1011 01:14:08.045063 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 11 01:14:08 crc kubenswrapper[4743]: I1011 01:14:08.067592 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54f2bf58-66f4-4f64-9937-6cd0cc80403e-combined-ca-bundle\") pod \"54f2bf58-66f4-4f64-9937-6cd0cc80403e\" (UID: \"54f2bf58-66f4-4f64-9937-6cd0cc80403e\") " Oct 11 01:14:08 crc kubenswrapper[4743]: I1011 01:14:08.067728 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54f2bf58-66f4-4f64-9937-6cd0cc80403e-logs\") pod \"54f2bf58-66f4-4f64-9937-6cd0cc80403e\" (UID: \"54f2bf58-66f4-4f64-9937-6cd0cc80403e\") " Oct 11 01:14:08 crc kubenswrapper[4743]: I1011 01:14:08.067801 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rllkv\" (UniqueName: \"kubernetes.io/projected/54f2bf58-66f4-4f64-9937-6cd0cc80403e-kube-api-access-rllkv\") pod \"54f2bf58-66f4-4f64-9937-6cd0cc80403e\" (UID: \"54f2bf58-66f4-4f64-9937-6cd0cc80403e\") " Oct 11 01:14:08 crc kubenswrapper[4743]: I1011 01:14:08.067913 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54f2bf58-66f4-4f64-9937-6cd0cc80403e-config-data\") pod \"54f2bf58-66f4-4f64-9937-6cd0cc80403e\" (UID: \"54f2bf58-66f4-4f64-9937-6cd0cc80403e\") " Oct 11 01:14:08 crc kubenswrapper[4743]: I1011 01:14:08.068509 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54f2bf58-66f4-4f64-9937-6cd0cc80403e-logs" (OuterVolumeSpecName: "logs") pod "54f2bf58-66f4-4f64-9937-6cd0cc80403e" (UID: "54f2bf58-66f4-4f64-9937-6cd0cc80403e"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:14:08 crc kubenswrapper[4743]: I1011 01:14:08.070550 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54f2bf58-66f4-4f64-9937-6cd0cc80403e-logs\") on node \"crc\" DevicePath \"\"" Oct 11 01:14:08 crc kubenswrapper[4743]: I1011 01:14:08.073274 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54f2bf58-66f4-4f64-9937-6cd0cc80403e-kube-api-access-rllkv" (OuterVolumeSpecName: "kube-api-access-rllkv") pod "54f2bf58-66f4-4f64-9937-6cd0cc80403e" (UID: "54f2bf58-66f4-4f64-9937-6cd0cc80403e"). InnerVolumeSpecName "kube-api-access-rllkv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:14:08 crc kubenswrapper[4743]: I1011 01:14:08.100956 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54f2bf58-66f4-4f64-9937-6cd0cc80403e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "54f2bf58-66f4-4f64-9937-6cd0cc80403e" (UID: "54f2bf58-66f4-4f64-9937-6cd0cc80403e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:14:08 crc kubenswrapper[4743]: I1011 01:14:08.110697 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9360186-9314-44aa-8203-f09057d76e6a" path="/var/lib/kubelet/pods/b9360186-9314-44aa-8203-f09057d76e6a/volumes" Oct 11 01:14:08 crc kubenswrapper[4743]: I1011 01:14:08.114368 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54f2bf58-66f4-4f64-9937-6cd0cc80403e-config-data" (OuterVolumeSpecName: "config-data") pod "54f2bf58-66f4-4f64-9937-6cd0cc80403e" (UID: "54f2bf58-66f4-4f64-9937-6cd0cc80403e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:14:08 crc kubenswrapper[4743]: I1011 01:14:08.173099 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54f2bf58-66f4-4f64-9937-6cd0cc80403e-config-data\") on node \"crc\" DevicePath \"\"" Oct 11 01:14:08 crc kubenswrapper[4743]: I1011 01:14:08.173135 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54f2bf58-66f4-4f64-9937-6cd0cc80403e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 01:14:08 crc kubenswrapper[4743]: I1011 01:14:08.173149 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rllkv\" (UniqueName: \"kubernetes.io/projected/54f2bf58-66f4-4f64-9937-6cd0cc80403e-kube-api-access-rllkv\") on node \"crc\" DevicePath \"\"" Oct 11 01:14:08 crc kubenswrapper[4743]: I1011 01:14:08.196012 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ebe70c7f-81a8-4b0b-84de-05d3ed8cb584","Type":"ContainerStarted","Data":"162b0eb903378f50a28bf5eac63de80e676a498a708b9ea30bbd6df497856488"} Oct 11 01:14:08 crc kubenswrapper[4743]: I1011 01:14:08.196275 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ebe70c7f-81a8-4b0b-84de-05d3ed8cb584","Type":"ContainerStarted","Data":"e505fbffcce071baae5996b069f7b88114241cd03e0592d7fcdf9347f56fc134"} Oct 11 01:14:08 crc kubenswrapper[4743]: I1011 01:14:08.197589 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d793d5fe-737a-4217-a456-da9e894fe4f6","Type":"ContainerStarted","Data":"ec265beae7c1aed3e04ae57daf470fe63e995730fc233089dfda473fec83b875"} Oct 11 01:14:08 crc kubenswrapper[4743]: I1011 01:14:08.197622 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"d793d5fe-737a-4217-a456-da9e894fe4f6","Type":"ContainerStarted","Data":"ede02981f3e2e33c2b02a8ec26332116f62d787b57e8852eed1b0667c12ef36b"} Oct 11 01:14:08 crc kubenswrapper[4743]: I1011 01:14:08.200475 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-mtbr2" event={"ID":"6e89352a-0aa8-4c41-ba04-4da0c6e59b3d","Type":"ContainerDied","Data":"fd490b56b0646194b2b0548cf5d5fd2376474e19c679a7bc4573da7357dcc673"} Oct 11 01:14:08 crc kubenswrapper[4743]: I1011 01:14:08.200510 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-mtbr2" Oct 11 01:14:08 crc kubenswrapper[4743]: I1011 01:14:08.200525 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd490b56b0646194b2b0548cf5d5fd2376474e19c679a7bc4573da7357dcc673" Oct 11 01:14:08 crc kubenswrapper[4743]: I1011 01:14:08.202120 4743 generic.go:334] "Generic (PLEG): container finished" podID="54f2bf58-66f4-4f64-9937-6cd0cc80403e" containerID="513336ce15942b9580a08708701e9a2bee3cb7eb4720e908b228d63fbe74dcd7" exitCode=0 Oct 11 01:14:08 crc kubenswrapper[4743]: I1011 01:14:08.202151 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 11 01:14:08 crc kubenswrapper[4743]: I1011 01:14:08.202209 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"54f2bf58-66f4-4f64-9937-6cd0cc80403e","Type":"ContainerDied","Data":"513336ce15942b9580a08708701e9a2bee3cb7eb4720e908b228d63fbe74dcd7"} Oct 11 01:14:08 crc kubenswrapper[4743]: I1011 01:14:08.202247 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"54f2bf58-66f4-4f64-9937-6cd0cc80403e","Type":"ContainerDied","Data":"f8e15f250b304e0d9e4f4f3dc284757c8ec34ebde9a85e1536c91f85a1a61d4a"} Oct 11 01:14:08 crc kubenswrapper[4743]: I1011 01:14:08.202355 4743 scope.go:117] "RemoveContainer" containerID="513336ce15942b9580a08708701e9a2bee3cb7eb4720e908b228d63fbe74dcd7" Oct 11 01:14:08 crc kubenswrapper[4743]: I1011 01:14:08.219093 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.219076732 podStartE2EDuration="2.219076732s" podCreationTimestamp="2025-10-11 01:14:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 01:14:08.215143722 +0000 UTC m=+1342.868124139" watchObservedRunningTime="2025-10-11 01:14:08.219076732 +0000 UTC m=+1342.872057129" Oct 11 01:14:08 crc kubenswrapper[4743]: I1011 01:14:08.284643 4743 scope.go:117] "RemoveContainer" containerID="2f5bd9f394e7190687c58e3063d788de72a8546d59a54993fcf3bd95a41d4891" Oct 11 01:14:08 crc kubenswrapper[4743]: I1011 01:14:08.296926 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 11 01:14:08 crc kubenswrapper[4743]: I1011 01:14:08.308090 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 11 01:14:08 crc kubenswrapper[4743]: I1011 01:14:08.317065 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 11 
01:14:08 crc kubenswrapper[4743]: E1011 01:14:08.317683 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54f2bf58-66f4-4f64-9937-6cd0cc80403e" containerName="nova-api-api" Oct 11 01:14:08 crc kubenswrapper[4743]: I1011 01:14:08.317701 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="54f2bf58-66f4-4f64-9937-6cd0cc80403e" containerName="nova-api-api" Oct 11 01:14:08 crc kubenswrapper[4743]: E1011 01:14:08.317729 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e89352a-0aa8-4c41-ba04-4da0c6e59b3d" containerName="mariadb-database-create" Oct 11 01:14:08 crc kubenswrapper[4743]: I1011 01:14:08.317736 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e89352a-0aa8-4c41-ba04-4da0c6e59b3d" containerName="mariadb-database-create" Oct 11 01:14:08 crc kubenswrapper[4743]: E1011 01:14:08.317764 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54f2bf58-66f4-4f64-9937-6cd0cc80403e" containerName="nova-api-log" Oct 11 01:14:08 crc kubenswrapper[4743]: I1011 01:14:08.317770 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="54f2bf58-66f4-4f64-9937-6cd0cc80403e" containerName="nova-api-log" Oct 11 01:14:08 crc kubenswrapper[4743]: I1011 01:14:08.317961 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="54f2bf58-66f4-4f64-9937-6cd0cc80403e" containerName="nova-api-api" Oct 11 01:14:08 crc kubenswrapper[4743]: I1011 01:14:08.317980 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="54f2bf58-66f4-4f64-9937-6cd0cc80403e" containerName="nova-api-log" Oct 11 01:14:08 crc kubenswrapper[4743]: I1011 01:14:08.317997 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e89352a-0aa8-4c41-ba04-4da0c6e59b3d" containerName="mariadb-database-create" Oct 11 01:14:08 crc kubenswrapper[4743]: I1011 01:14:08.319084 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 11 01:14:08 crc kubenswrapper[4743]: I1011 01:14:08.320080 4743 scope.go:117] "RemoveContainer" containerID="513336ce15942b9580a08708701e9a2bee3cb7eb4720e908b228d63fbe74dcd7" Oct 11 01:14:08 crc kubenswrapper[4743]: E1011 01:14:08.320395 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"513336ce15942b9580a08708701e9a2bee3cb7eb4720e908b228d63fbe74dcd7\": container with ID starting with 513336ce15942b9580a08708701e9a2bee3cb7eb4720e908b228d63fbe74dcd7 not found: ID does not exist" containerID="513336ce15942b9580a08708701e9a2bee3cb7eb4720e908b228d63fbe74dcd7" Oct 11 01:14:08 crc kubenswrapper[4743]: I1011 01:14:08.320425 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"513336ce15942b9580a08708701e9a2bee3cb7eb4720e908b228d63fbe74dcd7"} err="failed to get container status \"513336ce15942b9580a08708701e9a2bee3cb7eb4720e908b228d63fbe74dcd7\": rpc error: code = NotFound desc = could not find container \"513336ce15942b9580a08708701e9a2bee3cb7eb4720e908b228d63fbe74dcd7\": container with ID starting with 513336ce15942b9580a08708701e9a2bee3cb7eb4720e908b228d63fbe74dcd7 not found: ID does not exist" Oct 11 01:14:08 crc kubenswrapper[4743]: I1011 01:14:08.320445 4743 scope.go:117] "RemoveContainer" containerID="2f5bd9f394e7190687c58e3063d788de72a8546d59a54993fcf3bd95a41d4891" Oct 11 01:14:08 crc kubenswrapper[4743]: E1011 01:14:08.321120 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f5bd9f394e7190687c58e3063d788de72a8546d59a54993fcf3bd95a41d4891\": container with ID starting with 2f5bd9f394e7190687c58e3063d788de72a8546d59a54993fcf3bd95a41d4891 not found: ID does not exist" containerID="2f5bd9f394e7190687c58e3063d788de72a8546d59a54993fcf3bd95a41d4891" Oct 11 01:14:08 crc kubenswrapper[4743]: I1011 01:14:08.321157 
4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f5bd9f394e7190687c58e3063d788de72a8546d59a54993fcf3bd95a41d4891"} err="failed to get container status \"2f5bd9f394e7190687c58e3063d788de72a8546d59a54993fcf3bd95a41d4891\": rpc error: code = NotFound desc = could not find container \"2f5bd9f394e7190687c58e3063d788de72a8546d59a54993fcf3bd95a41d4891\": container with ID starting with 2f5bd9f394e7190687c58e3063d788de72a8546d59a54993fcf3bd95a41d4891 not found: ID does not exist" Oct 11 01:14:08 crc kubenswrapper[4743]: I1011 01:14:08.321522 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 11 01:14:08 crc kubenswrapper[4743]: I1011 01:14:08.326394 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 11 01:14:08 crc kubenswrapper[4743]: I1011 01:14:08.383533 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed0e997e-fa72-4850-ade3-9dce1c59daab-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ed0e997e-fa72-4850-ade3-9dce1c59daab\") " pod="openstack/nova-api-0" Oct 11 01:14:08 crc kubenswrapper[4743]: I1011 01:14:08.383800 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed0e997e-fa72-4850-ade3-9dce1c59daab-logs\") pod \"nova-api-0\" (UID: \"ed0e997e-fa72-4850-ade3-9dce1c59daab\") " pod="openstack/nova-api-0" Oct 11 01:14:08 crc kubenswrapper[4743]: I1011 01:14:08.384094 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68ns6\" (UniqueName: \"kubernetes.io/projected/ed0e997e-fa72-4850-ade3-9dce1c59daab-kube-api-access-68ns6\") pod \"nova-api-0\" (UID: \"ed0e997e-fa72-4850-ade3-9dce1c59daab\") " pod="openstack/nova-api-0" Oct 11 01:14:08 crc kubenswrapper[4743]: 
I1011 01:14:08.384251 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed0e997e-fa72-4850-ade3-9dce1c59daab-config-data\") pod \"nova-api-0\" (UID: \"ed0e997e-fa72-4850-ade3-9dce1c59daab\") " pod="openstack/nova-api-0" Oct 11 01:14:08 crc kubenswrapper[4743]: I1011 01:14:08.485647 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed0e997e-fa72-4850-ade3-9dce1c59daab-logs\") pod \"nova-api-0\" (UID: \"ed0e997e-fa72-4850-ade3-9dce1c59daab\") " pod="openstack/nova-api-0" Oct 11 01:14:08 crc kubenswrapper[4743]: I1011 01:14:08.486081 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68ns6\" (UniqueName: \"kubernetes.io/projected/ed0e997e-fa72-4850-ade3-9dce1c59daab-kube-api-access-68ns6\") pod \"nova-api-0\" (UID: \"ed0e997e-fa72-4850-ade3-9dce1c59daab\") " pod="openstack/nova-api-0" Oct 11 01:14:08 crc kubenswrapper[4743]: I1011 01:14:08.486169 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed0e997e-fa72-4850-ade3-9dce1c59daab-config-data\") pod \"nova-api-0\" (UID: \"ed0e997e-fa72-4850-ade3-9dce1c59daab\") " pod="openstack/nova-api-0" Oct 11 01:14:08 crc kubenswrapper[4743]: I1011 01:14:08.486214 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed0e997e-fa72-4850-ade3-9dce1c59daab-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ed0e997e-fa72-4850-ade3-9dce1c59daab\") " pod="openstack/nova-api-0" Oct 11 01:14:08 crc kubenswrapper[4743]: I1011 01:14:08.486225 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed0e997e-fa72-4850-ade3-9dce1c59daab-logs\") pod \"nova-api-0\" (UID: 
\"ed0e997e-fa72-4850-ade3-9dce1c59daab\") " pod="openstack/nova-api-0" Oct 11 01:14:08 crc kubenswrapper[4743]: I1011 01:14:08.501841 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed0e997e-fa72-4850-ade3-9dce1c59daab-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ed0e997e-fa72-4850-ade3-9dce1c59daab\") " pod="openstack/nova-api-0" Oct 11 01:14:08 crc kubenswrapper[4743]: I1011 01:14:08.504792 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68ns6\" (UniqueName: \"kubernetes.io/projected/ed0e997e-fa72-4850-ade3-9dce1c59daab-kube-api-access-68ns6\") pod \"nova-api-0\" (UID: \"ed0e997e-fa72-4850-ade3-9dce1c59daab\") " pod="openstack/nova-api-0" Oct 11 01:14:08 crc kubenswrapper[4743]: I1011 01:14:08.504930 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed0e997e-fa72-4850-ade3-9dce1c59daab-config-data\") pod \"nova-api-0\" (UID: \"ed0e997e-fa72-4850-ade3-9dce1c59daab\") " pod="openstack/nova-api-0" Oct 11 01:14:08 crc kubenswrapper[4743]: I1011 01:14:08.642416 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 11 01:14:09 crc kubenswrapper[4743]: I1011 01:14:09.168382 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 11 01:14:09 crc kubenswrapper[4743]: I1011 01:14:09.205743 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d441a96d-dada-477d-aa48-0b467c00d5a0-config-data\") pod \"d441a96d-dada-477d-aa48-0b467c00d5a0\" (UID: \"d441a96d-dada-477d-aa48-0b467c00d5a0\") " Oct 11 01:14:09 crc kubenswrapper[4743]: I1011 01:14:09.206289 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lmcql\" (UniqueName: \"kubernetes.io/projected/d441a96d-dada-477d-aa48-0b467c00d5a0-kube-api-access-lmcql\") pod \"d441a96d-dada-477d-aa48-0b467c00d5a0\" (UID: \"d441a96d-dada-477d-aa48-0b467c00d5a0\") " Oct 11 01:14:09 crc kubenswrapper[4743]: I1011 01:14:09.206398 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d441a96d-dada-477d-aa48-0b467c00d5a0-combined-ca-bundle\") pod \"d441a96d-dada-477d-aa48-0b467c00d5a0\" (UID: \"d441a96d-dada-477d-aa48-0b467c00d5a0\") " Oct 11 01:14:09 crc kubenswrapper[4743]: I1011 01:14:09.217335 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d441a96d-dada-477d-aa48-0b467c00d5a0-kube-api-access-lmcql" (OuterVolumeSpecName: "kube-api-access-lmcql") pod "d441a96d-dada-477d-aa48-0b467c00d5a0" (UID: "d441a96d-dada-477d-aa48-0b467c00d5a0"). InnerVolumeSpecName "kube-api-access-lmcql". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:14:09 crc kubenswrapper[4743]: I1011 01:14:09.243929 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d441a96d-dada-477d-aa48-0b467c00d5a0-config-data" (OuterVolumeSpecName: "config-data") pod "d441a96d-dada-477d-aa48-0b467c00d5a0" (UID: "d441a96d-dada-477d-aa48-0b467c00d5a0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:14:09 crc kubenswrapper[4743]: I1011 01:14:09.244351 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 11 01:14:09 crc kubenswrapper[4743]: I1011 01:14:09.246592 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ebe70c7f-81a8-4b0b-84de-05d3ed8cb584","Type":"ContainerStarted","Data":"a6e09a7bc274890145d0e1f3817133cfa116d0bc162a28ad64d99d13b23408a9"} Oct 11 01:14:09 crc kubenswrapper[4743]: I1011 01:14:09.248845 4743 generic.go:334] "Generic (PLEG): container finished" podID="d441a96d-dada-477d-aa48-0b467c00d5a0" containerID="b087a3f710ec0f6aa5d68b818ddb2a7f5cf3db75204a3abcf924510a40755947" exitCode=0 Oct 11 01:14:09 crc kubenswrapper[4743]: I1011 01:14:09.249186 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d441a96d-dada-477d-aa48-0b467c00d5a0","Type":"ContainerDied","Data":"b087a3f710ec0f6aa5d68b818ddb2a7f5cf3db75204a3abcf924510a40755947"} Oct 11 01:14:09 crc kubenswrapper[4743]: I1011 01:14:09.249238 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d441a96d-dada-477d-aa48-0b467c00d5a0","Type":"ContainerDied","Data":"0095bb124fda810a039dd0477fda2b3c3b7d69174c8a1c712ae980f6a6ba0bef"} Oct 11 01:14:09 crc kubenswrapper[4743]: I1011 01:14:09.249256 4743 scope.go:117] "RemoveContainer" containerID="b087a3f710ec0f6aa5d68b818ddb2a7f5cf3db75204a3abcf924510a40755947" Oct 11 01:14:09 crc kubenswrapper[4743]: I1011 01:14:09.249165 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 11 01:14:09 crc kubenswrapper[4743]: I1011 01:14:09.263978 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d441a96d-dada-477d-aa48-0b467c00d5a0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d441a96d-dada-477d-aa48-0b467c00d5a0" (UID: "d441a96d-dada-477d-aa48-0b467c00d5a0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:14:09 crc kubenswrapper[4743]: I1011 01:14:09.309576 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lmcql\" (UniqueName: \"kubernetes.io/projected/d441a96d-dada-477d-aa48-0b467c00d5a0-kube-api-access-lmcql\") on node \"crc\" DevicePath \"\"" Oct 11 01:14:09 crc kubenswrapper[4743]: I1011 01:14:09.309605 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d441a96d-dada-477d-aa48-0b467c00d5a0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 01:14:09 crc kubenswrapper[4743]: I1011 01:14:09.309614 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d441a96d-dada-477d-aa48-0b467c00d5a0-config-data\") on node \"crc\" DevicePath \"\"" Oct 11 01:14:09 crc kubenswrapper[4743]: I1011 01:14:09.338417 4743 scope.go:117] "RemoveContainer" containerID="b087a3f710ec0f6aa5d68b818ddb2a7f5cf3db75204a3abcf924510a40755947" Oct 11 01:14:09 crc kubenswrapper[4743]: E1011 01:14:09.339233 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b087a3f710ec0f6aa5d68b818ddb2a7f5cf3db75204a3abcf924510a40755947\": container with ID starting with b087a3f710ec0f6aa5d68b818ddb2a7f5cf3db75204a3abcf924510a40755947 not found: ID does not exist" containerID="b087a3f710ec0f6aa5d68b818ddb2a7f5cf3db75204a3abcf924510a40755947" Oct 11 01:14:09 crc 
kubenswrapper[4743]: I1011 01:14:09.339271 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b087a3f710ec0f6aa5d68b818ddb2a7f5cf3db75204a3abcf924510a40755947"} err="failed to get container status \"b087a3f710ec0f6aa5d68b818ddb2a7f5cf3db75204a3abcf924510a40755947\": rpc error: code = NotFound desc = could not find container \"b087a3f710ec0f6aa5d68b818ddb2a7f5cf3db75204a3abcf924510a40755947\": container with ID starting with b087a3f710ec0f6aa5d68b818ddb2a7f5cf3db75204a3abcf924510a40755947 not found: ID does not exist" Oct 11 01:14:09 crc kubenswrapper[4743]: I1011 01:14:09.585289 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 11 01:14:09 crc kubenswrapper[4743]: I1011 01:14:09.593822 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 11 01:14:09 crc kubenswrapper[4743]: I1011 01:14:09.606509 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 11 01:14:09 crc kubenswrapper[4743]: E1011 01:14:09.607171 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d441a96d-dada-477d-aa48-0b467c00d5a0" containerName="nova-scheduler-scheduler" Oct 11 01:14:09 crc kubenswrapper[4743]: I1011 01:14:09.607188 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d441a96d-dada-477d-aa48-0b467c00d5a0" containerName="nova-scheduler-scheduler" Oct 11 01:14:09 crc kubenswrapper[4743]: I1011 01:14:09.607371 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d441a96d-dada-477d-aa48-0b467c00d5a0" containerName="nova-scheduler-scheduler" Oct 11 01:14:09 crc kubenswrapper[4743]: I1011 01:14:09.608425 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 11 01:14:09 crc kubenswrapper[4743]: I1011 01:14:09.610339 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 11 01:14:09 crc kubenswrapper[4743]: I1011 01:14:09.621833 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 11 01:14:09 crc kubenswrapper[4743]: I1011 01:14:09.715576 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmrn4\" (UniqueName: \"kubernetes.io/projected/006c6efe-f2d3-4ce9-9a99-335f3830c7a8-kube-api-access-wmrn4\") pod \"nova-scheduler-0\" (UID: \"006c6efe-f2d3-4ce9-9a99-335f3830c7a8\") " pod="openstack/nova-scheduler-0" Oct 11 01:14:09 crc kubenswrapper[4743]: I1011 01:14:09.715650 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/006c6efe-f2d3-4ce9-9a99-335f3830c7a8-config-data\") pod \"nova-scheduler-0\" (UID: \"006c6efe-f2d3-4ce9-9a99-335f3830c7a8\") " pod="openstack/nova-scheduler-0" Oct 11 01:14:09 crc kubenswrapper[4743]: I1011 01:14:09.715676 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/006c6efe-f2d3-4ce9-9a99-335f3830c7a8-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"006c6efe-f2d3-4ce9-9a99-335f3830c7a8\") " pod="openstack/nova-scheduler-0" Oct 11 01:14:09 crc kubenswrapper[4743]: I1011 01:14:09.817296 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/006c6efe-f2d3-4ce9-9a99-335f3830c7a8-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"006c6efe-f2d3-4ce9-9a99-335f3830c7a8\") " pod="openstack/nova-scheduler-0" Oct 11 01:14:09 crc kubenswrapper[4743]: I1011 01:14:09.817495 4743 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmrn4\" (UniqueName: \"kubernetes.io/projected/006c6efe-f2d3-4ce9-9a99-335f3830c7a8-kube-api-access-wmrn4\") pod \"nova-scheduler-0\" (UID: \"006c6efe-f2d3-4ce9-9a99-335f3830c7a8\") " pod="openstack/nova-scheduler-0" Oct 11 01:14:09 crc kubenswrapper[4743]: I1011 01:14:09.817586 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/006c6efe-f2d3-4ce9-9a99-335f3830c7a8-config-data\") pod \"nova-scheduler-0\" (UID: \"006c6efe-f2d3-4ce9-9a99-335f3830c7a8\") " pod="openstack/nova-scheduler-0" Oct 11 01:14:09 crc kubenswrapper[4743]: I1011 01:14:09.826687 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/006c6efe-f2d3-4ce9-9a99-335f3830c7a8-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"006c6efe-f2d3-4ce9-9a99-335f3830c7a8\") " pod="openstack/nova-scheduler-0" Oct 11 01:14:09 crc kubenswrapper[4743]: I1011 01:14:09.827901 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/006c6efe-f2d3-4ce9-9a99-335f3830c7a8-config-data\") pod \"nova-scheduler-0\" (UID: \"006c6efe-f2d3-4ce9-9a99-335f3830c7a8\") " pod="openstack/nova-scheduler-0" Oct 11 01:14:09 crc kubenswrapper[4743]: I1011 01:14:09.834796 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmrn4\" (UniqueName: \"kubernetes.io/projected/006c6efe-f2d3-4ce9-9a99-335f3830c7a8-kube-api-access-wmrn4\") pod \"nova-scheduler-0\" (UID: \"006c6efe-f2d3-4ce9-9a99-335f3830c7a8\") " pod="openstack/nova-scheduler-0" Oct 11 01:14:09 crc kubenswrapper[4743]: I1011 01:14:09.944139 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 11 01:14:10 crc kubenswrapper[4743]: I1011 01:14:10.109214 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54f2bf58-66f4-4f64-9937-6cd0cc80403e" path="/var/lib/kubelet/pods/54f2bf58-66f4-4f64-9937-6cd0cc80403e/volumes" Oct 11 01:14:10 crc kubenswrapper[4743]: I1011 01:14:10.110448 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d441a96d-dada-477d-aa48-0b467c00d5a0" path="/var/lib/kubelet/pods/d441a96d-dada-477d-aa48-0b467c00d5a0/volumes" Oct 11 01:14:10 crc kubenswrapper[4743]: I1011 01:14:10.267214 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ed0e997e-fa72-4850-ade3-9dce1c59daab","Type":"ContainerStarted","Data":"1920b7f7416acfce69472dcfbc0753817b696901d93355db14578f266a9cf045"} Oct 11 01:14:10 crc kubenswrapper[4743]: I1011 01:14:10.267260 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ed0e997e-fa72-4850-ade3-9dce1c59daab","Type":"ContainerStarted","Data":"27f9016a419154431b54d07aa857f96525ec64907314e71d85aeb3110bd6235b"} Oct 11 01:14:10 crc kubenswrapper[4743]: I1011 01:14:10.267271 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ed0e997e-fa72-4850-ade3-9dce1c59daab","Type":"ContainerStarted","Data":"cba2b4d1a39ce7e144110e60b0e4d7973be3b01555ce3208d23dbf958a1bb66b"} Oct 11 01:14:10 crc kubenswrapper[4743]: I1011 01:14:10.293392 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.293367498 podStartE2EDuration="2.293367498s" podCreationTimestamp="2025-10-11 01:14:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 01:14:10.289245953 +0000 UTC m=+1344.942226370" watchObservedRunningTime="2025-10-11 01:14:10.293367498 +0000 UTC 
m=+1344.946347895" Oct 11 01:14:10 crc kubenswrapper[4743]: I1011 01:14:10.390796 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 11 01:14:11 crc kubenswrapper[4743]: I1011 01:14:11.283471 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ebe70c7f-81a8-4b0b-84de-05d3ed8cb584","Type":"ContainerStarted","Data":"fec04be03bc7a771a73945d71b479279a36a0acb5b3616a85952a72c28440340"} Oct 11 01:14:11 crc kubenswrapper[4743]: I1011 01:14:11.284131 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 11 01:14:11 crc kubenswrapper[4743]: I1011 01:14:11.285818 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"006c6efe-f2d3-4ce9-9a99-335f3830c7a8","Type":"ContainerStarted","Data":"39d8a64c03d232369b7c9a54d892dfc59a3a473201af45f8213daebdfe49e5ee"} Oct 11 01:14:11 crc kubenswrapper[4743]: I1011 01:14:11.285846 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"006c6efe-f2d3-4ce9-9a99-335f3830c7a8","Type":"ContainerStarted","Data":"a7796a5e6da8d551c9538ce9830202a82ac0cf2e9763fedb67956642003c2f81"} Oct 11 01:14:11 crc kubenswrapper[4743]: I1011 01:14:11.324828 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.062994925 podStartE2EDuration="6.324810125s" podCreationTimestamp="2025-10-11 01:14:05 +0000 UTC" firstStartedPulling="2025-10-11 01:14:06.409335729 +0000 UTC m=+1341.062316126" lastFinishedPulling="2025-10-11 01:14:10.671150929 +0000 UTC m=+1345.324131326" observedRunningTime="2025-10-11 01:14:11.317963807 +0000 UTC m=+1345.970944214" watchObservedRunningTime="2025-10-11 01:14:11.324810125 +0000 UTC m=+1345.977790522" Oct 11 01:14:11 crc kubenswrapper[4743]: I1011 01:14:11.346171 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-scheduler-0" podStartSLOduration=2.346149376 podStartE2EDuration="2.346149376s" podCreationTimestamp="2025-10-11 01:14:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 01:14:11.335072331 +0000 UTC m=+1345.988052718" watchObservedRunningTime="2025-10-11 01:14:11.346149376 +0000 UTC m=+1345.999129773" Oct 11 01:14:11 crc kubenswrapper[4743]: I1011 01:14:11.584144 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 11 01:14:11 crc kubenswrapper[4743]: I1011 01:14:11.584189 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 11 01:14:14 crc kubenswrapper[4743]: I1011 01:14:14.533839 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Oct 11 01:14:14 crc kubenswrapper[4743]: I1011 01:14:14.778560 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-ad4e-account-create-b2rl8"] Oct 11 01:14:14 crc kubenswrapper[4743]: I1011 01:14:14.780472 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-ad4e-account-create-b2rl8" Oct 11 01:14:14 crc kubenswrapper[4743]: I1011 01:14:14.785151 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-ad4e-account-create-b2rl8"] Oct 11 01:14:14 crc kubenswrapper[4743]: I1011 01:14:14.841109 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret" Oct 11 01:14:14 crc kubenswrapper[4743]: I1011 01:14:14.933330 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxbnr\" (UniqueName: \"kubernetes.io/projected/d015b239-cdc4-4ddd-8e01-4fdc85cbbbd0-kube-api-access-xxbnr\") pod \"aodh-ad4e-account-create-b2rl8\" (UID: \"d015b239-cdc4-4ddd-8e01-4fdc85cbbbd0\") " pod="openstack/aodh-ad4e-account-create-b2rl8" Oct 11 01:14:14 crc kubenswrapper[4743]: I1011 01:14:14.944801 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 11 01:14:15 crc kubenswrapper[4743]: I1011 01:14:15.035452 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxbnr\" (UniqueName: \"kubernetes.io/projected/d015b239-cdc4-4ddd-8e01-4fdc85cbbbd0-kube-api-access-xxbnr\") pod \"aodh-ad4e-account-create-b2rl8\" (UID: \"d015b239-cdc4-4ddd-8e01-4fdc85cbbbd0\") " pod="openstack/aodh-ad4e-account-create-b2rl8" Oct 11 01:14:15 crc kubenswrapper[4743]: I1011 01:14:15.064492 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxbnr\" (UniqueName: \"kubernetes.io/projected/d015b239-cdc4-4ddd-8e01-4fdc85cbbbd0-kube-api-access-xxbnr\") pod \"aodh-ad4e-account-create-b2rl8\" (UID: \"d015b239-cdc4-4ddd-8e01-4fdc85cbbbd0\") " pod="openstack/aodh-ad4e-account-create-b2rl8" Oct 11 01:14:15 crc kubenswrapper[4743]: I1011 01:14:15.151822 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-ad4e-account-create-b2rl8" Oct 11 01:14:15 crc kubenswrapper[4743]: I1011 01:14:15.587284 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-ad4e-account-create-b2rl8"] Oct 11 01:14:16 crc kubenswrapper[4743]: I1011 01:14:16.377321 4743 generic.go:334] "Generic (PLEG): container finished" podID="d015b239-cdc4-4ddd-8e01-4fdc85cbbbd0" containerID="882c56d4e18948fc28a76693ece959a3753d9d6f994f2f457793a77e9f7f6f3a" exitCode=0 Oct 11 01:14:16 crc kubenswrapper[4743]: I1011 01:14:16.377375 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-ad4e-account-create-b2rl8" event={"ID":"d015b239-cdc4-4ddd-8e01-4fdc85cbbbd0","Type":"ContainerDied","Data":"882c56d4e18948fc28a76693ece959a3753d9d6f994f2f457793a77e9f7f6f3a"} Oct 11 01:14:16 crc kubenswrapper[4743]: I1011 01:14:16.377671 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-ad4e-account-create-b2rl8" event={"ID":"d015b239-cdc4-4ddd-8e01-4fdc85cbbbd0","Type":"ContainerStarted","Data":"74c312e622e6f723fa569a6899aa4c4f540c5909ece75a25203e1d3ab4b0096b"} Oct 11 01:14:16 crc kubenswrapper[4743]: I1011 01:14:16.584284 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 11 01:14:16 crc kubenswrapper[4743]: I1011 01:14:16.584347 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 11 01:14:17 crc kubenswrapper[4743]: E1011 01:14:17.286021 4743 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/5251879e3bf27d10461e69029d9bc7180ccb22d83db6aa529ce76f14152c72a4/diff" to get inode usage: stat /var/lib/containers/storage/overlay/5251879e3bf27d10461e69029d9bc7180ccb22d83db6aa529ce76f14152c72a4/diff: no such file or directory, extraDiskErr: could not stat 
"/var/log/pods/openstack_dnsmasq-dns-688b9f5b49-zn2q6_0eee5a3c-bfbc-4975-ae73-2a33d414993d/dnsmasq-dns/0.log" to get inode usage: stat /var/log/pods/openstack_dnsmasq-dns-688b9f5b49-zn2q6_0eee5a3c-bfbc-4975-ae73-2a33d414993d/dnsmasq-dns/0.log: no such file or directory Oct 11 01:14:17 crc kubenswrapper[4743]: I1011 01:14:17.602073 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="d793d5fe-737a-4217-a456-da9e894fe4f6" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.224:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 11 01:14:17 crc kubenswrapper[4743]: I1011 01:14:17.602111 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="d793d5fe-737a-4217-a456-da9e894fe4f6" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.224:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 11 01:14:17 crc kubenswrapper[4743]: I1011 01:14:17.971290 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-ad4e-account-create-b2rl8" Oct 11 01:14:18 crc kubenswrapper[4743]: I1011 01:14:18.122173 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxbnr\" (UniqueName: \"kubernetes.io/projected/d015b239-cdc4-4ddd-8e01-4fdc85cbbbd0-kube-api-access-xxbnr\") pod \"d015b239-cdc4-4ddd-8e01-4fdc85cbbbd0\" (UID: \"d015b239-cdc4-4ddd-8e01-4fdc85cbbbd0\") " Oct 11 01:14:18 crc kubenswrapper[4743]: I1011 01:14:18.127895 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d015b239-cdc4-4ddd-8e01-4fdc85cbbbd0-kube-api-access-xxbnr" (OuterVolumeSpecName: "kube-api-access-xxbnr") pod "d015b239-cdc4-4ddd-8e01-4fdc85cbbbd0" (UID: "d015b239-cdc4-4ddd-8e01-4fdc85cbbbd0"). InnerVolumeSpecName "kube-api-access-xxbnr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:14:18 crc kubenswrapper[4743]: I1011 01:14:18.225300 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxbnr\" (UniqueName: \"kubernetes.io/projected/d015b239-cdc4-4ddd-8e01-4fdc85cbbbd0-kube-api-access-xxbnr\") on node \"crc\" DevicePath \"\"" Oct 11 01:14:18 crc kubenswrapper[4743]: I1011 01:14:18.399889 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-ad4e-account-create-b2rl8" event={"ID":"d015b239-cdc4-4ddd-8e01-4fdc85cbbbd0","Type":"ContainerDied","Data":"74c312e622e6f723fa569a6899aa4c4f540c5909ece75a25203e1d3ab4b0096b"} Oct 11 01:14:18 crc kubenswrapper[4743]: I1011 01:14:18.399930 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74c312e622e6f723fa569a6899aa4c4f540c5909ece75a25203e1d3ab4b0096b" Oct 11 01:14:18 crc kubenswrapper[4743]: I1011 01:14:18.399949 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-ad4e-account-create-b2rl8" Oct 11 01:14:18 crc kubenswrapper[4743]: I1011 01:14:18.644177 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 11 01:14:18 crc kubenswrapper[4743]: I1011 01:14:18.647153 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 11 01:14:19 crc kubenswrapper[4743]: I1011 01:14:19.727048 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ed0e997e-fa72-4850-ade3-9dce1c59daab" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.225:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 11 01:14:19 crc kubenswrapper[4743]: I1011 01:14:19.727050 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ed0e997e-fa72-4850-ade3-9dce1c59daab" containerName="nova-api-log" probeResult="failure" 
output="Get \"http://10.217.0.225:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 11 01:14:19 crc kubenswrapper[4743]: I1011 01:14:19.944995 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 11 01:14:19 crc kubenswrapper[4743]: I1011 01:14:19.987232 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 11 01:14:20 crc kubenswrapper[4743]: I1011 01:14:20.213471 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-592t9"] Oct 11 01:14:20 crc kubenswrapper[4743]: E1011 01:14:20.214378 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d015b239-cdc4-4ddd-8e01-4fdc85cbbbd0" containerName="mariadb-account-create" Oct 11 01:14:20 crc kubenswrapper[4743]: I1011 01:14:20.214400 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d015b239-cdc4-4ddd-8e01-4fdc85cbbbd0" containerName="mariadb-account-create" Oct 11 01:14:20 crc kubenswrapper[4743]: I1011 01:14:20.214843 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d015b239-cdc4-4ddd-8e01-4fdc85cbbbd0" containerName="mariadb-account-create" Oct 11 01:14:20 crc kubenswrapper[4743]: I1011 01:14:20.215912 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-592t9" Oct 11 01:14:20 crc kubenswrapper[4743]: I1011 01:14:20.238694 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Oct 11 01:14:20 crc kubenswrapper[4743]: I1011 01:14:20.238731 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Oct 11 01:14:20 crc kubenswrapper[4743]: I1011 01:14:20.238832 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-dnc99" Oct 11 01:14:20 crc kubenswrapper[4743]: I1011 01:14:20.250738 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-592t9"] Oct 11 01:14:20 crc kubenswrapper[4743]: I1011 01:14:20.368806 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf81bc47-30a1-4231-a23a-446813610da6-combined-ca-bundle\") pod \"aodh-db-sync-592t9\" (UID: \"cf81bc47-30a1-4231-a23a-446813610da6\") " pod="openstack/aodh-db-sync-592t9" Oct 11 01:14:20 crc kubenswrapper[4743]: I1011 01:14:20.368932 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66bjg\" (UniqueName: \"kubernetes.io/projected/cf81bc47-30a1-4231-a23a-446813610da6-kube-api-access-66bjg\") pod \"aodh-db-sync-592t9\" (UID: \"cf81bc47-30a1-4231-a23a-446813610da6\") " pod="openstack/aodh-db-sync-592t9" Oct 11 01:14:20 crc kubenswrapper[4743]: I1011 01:14:20.369011 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf81bc47-30a1-4231-a23a-446813610da6-scripts\") pod \"aodh-db-sync-592t9\" (UID: \"cf81bc47-30a1-4231-a23a-446813610da6\") " pod="openstack/aodh-db-sync-592t9" Oct 11 01:14:20 crc kubenswrapper[4743]: I1011 01:14:20.369355 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf81bc47-30a1-4231-a23a-446813610da6-config-data\") pod \"aodh-db-sync-592t9\" (UID: \"cf81bc47-30a1-4231-a23a-446813610da6\") " pod="openstack/aodh-db-sync-592t9" Oct 11 01:14:20 crc kubenswrapper[4743]: I1011 01:14:20.449269 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 11 01:14:20 crc kubenswrapper[4743]: I1011 01:14:20.471706 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf81bc47-30a1-4231-a23a-446813610da6-combined-ca-bundle\") pod \"aodh-db-sync-592t9\" (UID: \"cf81bc47-30a1-4231-a23a-446813610da6\") " pod="openstack/aodh-db-sync-592t9" Oct 11 01:14:20 crc kubenswrapper[4743]: I1011 01:14:20.471775 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66bjg\" (UniqueName: \"kubernetes.io/projected/cf81bc47-30a1-4231-a23a-446813610da6-kube-api-access-66bjg\") pod \"aodh-db-sync-592t9\" (UID: \"cf81bc47-30a1-4231-a23a-446813610da6\") " pod="openstack/aodh-db-sync-592t9" Oct 11 01:14:20 crc kubenswrapper[4743]: I1011 01:14:20.471799 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf81bc47-30a1-4231-a23a-446813610da6-scripts\") pod \"aodh-db-sync-592t9\" (UID: \"cf81bc47-30a1-4231-a23a-446813610da6\") " pod="openstack/aodh-db-sync-592t9" Oct 11 01:14:20 crc kubenswrapper[4743]: I1011 01:14:20.471919 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf81bc47-30a1-4231-a23a-446813610da6-config-data\") pod \"aodh-db-sync-592t9\" (UID: \"cf81bc47-30a1-4231-a23a-446813610da6\") " pod="openstack/aodh-db-sync-592t9" Oct 11 01:14:20 crc kubenswrapper[4743]: I1011 01:14:20.480671 4743 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf81bc47-30a1-4231-a23a-446813610da6-scripts\") pod \"aodh-db-sync-592t9\" (UID: \"cf81bc47-30a1-4231-a23a-446813610da6\") " pod="openstack/aodh-db-sync-592t9" Oct 11 01:14:20 crc kubenswrapper[4743]: I1011 01:14:20.487062 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf81bc47-30a1-4231-a23a-446813610da6-combined-ca-bundle\") pod \"aodh-db-sync-592t9\" (UID: \"cf81bc47-30a1-4231-a23a-446813610da6\") " pod="openstack/aodh-db-sync-592t9" Oct 11 01:14:20 crc kubenswrapper[4743]: I1011 01:14:20.487118 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf81bc47-30a1-4231-a23a-446813610da6-config-data\") pod \"aodh-db-sync-592t9\" (UID: \"cf81bc47-30a1-4231-a23a-446813610da6\") " pod="openstack/aodh-db-sync-592t9" Oct 11 01:14:20 crc kubenswrapper[4743]: I1011 01:14:20.492560 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66bjg\" (UniqueName: \"kubernetes.io/projected/cf81bc47-30a1-4231-a23a-446813610da6-kube-api-access-66bjg\") pod \"aodh-db-sync-592t9\" (UID: \"cf81bc47-30a1-4231-a23a-446813610da6\") " pod="openstack/aodh-db-sync-592t9" Oct 11 01:14:20 crc kubenswrapper[4743]: I1011 01:14:20.556087 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-592t9" Oct 11 01:14:21 crc kubenswrapper[4743]: I1011 01:14:21.137179 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-592t9"] Oct 11 01:14:21 crc kubenswrapper[4743]: I1011 01:14:21.424908 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-592t9" event={"ID":"cf81bc47-30a1-4231-a23a-446813610da6","Type":"ContainerStarted","Data":"498c59d00ee590aee0b190b2388b289df775a331684ed71180dc0b486e50efac"} Oct 11 01:14:26 crc kubenswrapper[4743]: I1011 01:14:26.476592 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-592t9" event={"ID":"cf81bc47-30a1-4231-a23a-446813610da6","Type":"ContainerStarted","Data":"9fce45c61e11f7e31457a8d26df0d96937c1c5893b583ce261b36f706cea3973"} Oct 11 01:14:26 crc kubenswrapper[4743]: I1011 01:14:26.492307 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-592t9" podStartSLOduration=1.800720406 podStartE2EDuration="6.492286951s" podCreationTimestamp="2025-10-11 01:14:20 +0000 UTC" firstStartedPulling="2025-10-11 01:14:21.127738729 +0000 UTC m=+1355.780719116" lastFinishedPulling="2025-10-11 01:14:25.819305264 +0000 UTC m=+1360.472285661" observedRunningTime="2025-10-11 01:14:26.49050853 +0000 UTC m=+1361.143488927" watchObservedRunningTime="2025-10-11 01:14:26.492286951 +0000 UTC m=+1361.145267368" Oct 11 01:14:26 crc kubenswrapper[4743]: I1011 01:14:26.594222 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 11 01:14:26 crc kubenswrapper[4743]: I1011 01:14:26.595303 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 11 01:14:26 crc kubenswrapper[4743]: I1011 01:14:26.603165 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 11 01:14:27 crc kubenswrapper[4743]: I1011 
01:14:27.494649 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 11 01:14:27 crc kubenswrapper[4743]: W1011 01:14:27.991042 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd441a96d_dada_477d_aa48_0b467c00d5a0.slice/crio-b087a3f710ec0f6aa5d68b818ddb2a7f5cf3db75204a3abcf924510a40755947.scope WatchSource:0}: Error finding container b087a3f710ec0f6aa5d68b818ddb2a7f5cf3db75204a3abcf924510a40755947: Status 404 returned error can't find the container with id b087a3f710ec0f6aa5d68b818ddb2a7f5cf3db75204a3abcf924510a40755947 Oct 11 01:14:27 crc kubenswrapper[4743]: W1011 01:14:27.992145 4743 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e89352a_0aa8_4c41_ba04_4da0c6e59b3d.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e89352a_0aa8_4c41_ba04_4da0c6e59b3d.slice: no such file or directory Oct 11 01:14:28 crc kubenswrapper[4743]: W1011 01:14:28.029397 4743 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd015b239_cdc4_4ddd_8e01_4fdc85cbbbd0.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd015b239_cdc4_4ddd_8e01_4fdc85cbbbd0.slice: no such file or directory Oct 11 01:14:28 crc kubenswrapper[4743]: E1011 01:14:28.293471 4743 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod70c3253b_404c_4db1_a0e5_f2f112e94c43.slice\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd441a96d_dada_477d_aa48_0b467c00d5a0.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0eee5a3c_bfbc_4975_ae73_2a33d414993d.slice/crio-a47dc85cfad70d954a488b40336b8c7e14e47decfbc6af22ee4cd25cfd7faf3a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a5ec0fc_3819_4bbf_81d8_ef9d01e2b96f.slice/crio-9636c2404d3e927d58328feea44f3e5ec43ec692622c74b9e7c585561b13cd92\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0eee5a3c_bfbc_4975_ae73_2a33d414993d.slice/crio-conmon-a47dc85cfad70d954a488b40336b8c7e14e47decfbc6af22ee4cd25cfd7faf3a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0eee5a3c_bfbc_4975_ae73_2a33d414993d.slice/crio-d81cc8604a270822001d29705df8201b8f9a2d7fc2322c78dba5267eb154f4ac\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a5ec0fc_3819_4bbf_81d8_ef9d01e2b96f.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0eee5a3c_bfbc_4975_ae73_2a33d414993d.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod54f2bf58_66f4_4f64_9937_6cd0cc80403e.slice/crio-f8e15f250b304e0d9e4f4f3dc284757c8ec34ebde9a85e1536c91f85a1a61d4a\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a5ec0fc_3819_4bbf_81d8_ef9d01e2b96f.slice/crio-conmon-a87a6857e891645b6359c9dd35207ef767024de9d3eec1b0b40667a6c7ac7bd8.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2fe9ad7b_9381_4d88_93cf_2ad81eb684d7.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod54f2bf58_66f4_4f64_9937_6cd0cc80403e.slice\": RecentStats: unable to find data in memory cache]" Oct 11 01:14:28 crc kubenswrapper[4743]: I1011 01:14:28.433541 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 11 01:14:28 crc kubenswrapper[4743]: I1011 01:14:28.497611 4743 generic.go:334] "Generic (PLEG): container finished" podID="33029235-8b5f-4193-9760-c14a1f6de702" containerID="c8948275b1d35c2b29935cc28850d32ecf3202c63120f5d9810e492a2a782ee8" exitCode=137 Oct 11 01:14:28 crc kubenswrapper[4743]: I1011 01:14:28.497663 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"33029235-8b5f-4193-9760-c14a1f6de702","Type":"ContainerDied","Data":"c8948275b1d35c2b29935cc28850d32ecf3202c63120f5d9810e492a2a782ee8"} Oct 11 01:14:28 crc kubenswrapper[4743]: I1011 01:14:28.497721 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"33029235-8b5f-4193-9760-c14a1f6de702","Type":"ContainerDied","Data":"82408a8718163a2ba9073cf721c7b1dc02d688c57b5d63ac339ecca2b2de1468"} Oct 11 01:14:28 crc kubenswrapper[4743]: I1011 01:14:28.497733 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 11 01:14:28 crc kubenswrapper[4743]: I1011 01:14:28.497746 4743 scope.go:117] "RemoveContainer" containerID="c8948275b1d35c2b29935cc28850d32ecf3202c63120f5d9810e492a2a782ee8" Oct 11 01:14:28 crc kubenswrapper[4743]: I1011 01:14:28.499537 4743 generic.go:334] "Generic (PLEG): container finished" podID="cf81bc47-30a1-4231-a23a-446813610da6" containerID="9fce45c61e11f7e31457a8d26df0d96937c1c5893b583ce261b36f706cea3973" exitCode=0 Oct 11 01:14:28 crc kubenswrapper[4743]: I1011 01:14:28.500663 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-592t9" event={"ID":"cf81bc47-30a1-4231-a23a-446813610da6","Type":"ContainerDied","Data":"9fce45c61e11f7e31457a8d26df0d96937c1c5893b583ce261b36f706cea3973"} Oct 11 01:14:28 crc kubenswrapper[4743]: I1011 01:14:28.539336 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33029235-8b5f-4193-9760-c14a1f6de702-config-data\") pod \"33029235-8b5f-4193-9760-c14a1f6de702\" (UID: \"33029235-8b5f-4193-9760-c14a1f6de702\") " Oct 11 01:14:28 crc kubenswrapper[4743]: I1011 01:14:28.539736 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6fw8\" (UniqueName: \"kubernetes.io/projected/33029235-8b5f-4193-9760-c14a1f6de702-kube-api-access-h6fw8\") pod \"33029235-8b5f-4193-9760-c14a1f6de702\" (UID: \"33029235-8b5f-4193-9760-c14a1f6de702\") " Oct 11 01:14:28 crc kubenswrapper[4743]: I1011 01:14:28.540035 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33029235-8b5f-4193-9760-c14a1f6de702-combined-ca-bundle\") pod \"33029235-8b5f-4193-9760-c14a1f6de702\" (UID: \"33029235-8b5f-4193-9760-c14a1f6de702\") " Oct 11 01:14:28 crc kubenswrapper[4743]: I1011 01:14:28.541349 4743 scope.go:117] "RemoveContainer" 
containerID="c8948275b1d35c2b29935cc28850d32ecf3202c63120f5d9810e492a2a782ee8" Oct 11 01:14:28 crc kubenswrapper[4743]: E1011 01:14:28.541932 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8948275b1d35c2b29935cc28850d32ecf3202c63120f5d9810e492a2a782ee8\": container with ID starting with c8948275b1d35c2b29935cc28850d32ecf3202c63120f5d9810e492a2a782ee8 not found: ID does not exist" containerID="c8948275b1d35c2b29935cc28850d32ecf3202c63120f5d9810e492a2a782ee8" Oct 11 01:14:28 crc kubenswrapper[4743]: I1011 01:14:28.541970 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8948275b1d35c2b29935cc28850d32ecf3202c63120f5d9810e492a2a782ee8"} err="failed to get container status \"c8948275b1d35c2b29935cc28850d32ecf3202c63120f5d9810e492a2a782ee8\": rpc error: code = NotFound desc = could not find container \"c8948275b1d35c2b29935cc28850d32ecf3202c63120f5d9810e492a2a782ee8\": container with ID starting with c8948275b1d35c2b29935cc28850d32ecf3202c63120f5d9810e492a2a782ee8 not found: ID does not exist" Oct 11 01:14:28 crc kubenswrapper[4743]: I1011 01:14:28.546210 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33029235-8b5f-4193-9760-c14a1f6de702-kube-api-access-h6fw8" (OuterVolumeSpecName: "kube-api-access-h6fw8") pod "33029235-8b5f-4193-9760-c14a1f6de702" (UID: "33029235-8b5f-4193-9760-c14a1f6de702"). InnerVolumeSpecName "kube-api-access-h6fw8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:14:28 crc kubenswrapper[4743]: I1011 01:14:28.578087 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33029235-8b5f-4193-9760-c14a1f6de702-config-data" (OuterVolumeSpecName: "config-data") pod "33029235-8b5f-4193-9760-c14a1f6de702" (UID: "33029235-8b5f-4193-9760-c14a1f6de702"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:14:28 crc kubenswrapper[4743]: I1011 01:14:28.578109 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33029235-8b5f-4193-9760-c14a1f6de702-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "33029235-8b5f-4193-9760-c14a1f6de702" (UID: "33029235-8b5f-4193-9760-c14a1f6de702"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:14:28 crc kubenswrapper[4743]: I1011 01:14:28.642508 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33029235-8b5f-4193-9760-c14a1f6de702-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 01:14:28 crc kubenswrapper[4743]: I1011 01:14:28.642538 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33029235-8b5f-4193-9760-c14a1f6de702-config-data\") on node \"crc\" DevicePath \"\"" Oct 11 01:14:28 crc kubenswrapper[4743]: I1011 01:14:28.642934 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6fw8\" (UniqueName: \"kubernetes.io/projected/33029235-8b5f-4193-9760-c14a1f6de702-kube-api-access-h6fw8\") on node \"crc\" DevicePath \"\"" Oct 11 01:14:28 crc kubenswrapper[4743]: I1011 01:14:28.646038 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 11 01:14:28 crc kubenswrapper[4743]: I1011 01:14:28.646905 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 11 01:14:28 crc kubenswrapper[4743]: I1011 01:14:28.648978 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 11 01:14:28 crc kubenswrapper[4743]: I1011 01:14:28.653282 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 11 01:14:28 crc 
kubenswrapper[4743]: I1011 01:14:28.849845 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 11 01:14:28 crc kubenswrapper[4743]: I1011 01:14:28.858707 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 11 01:14:28 crc kubenswrapper[4743]: I1011 01:14:28.882010 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 11 01:14:28 crc kubenswrapper[4743]: E1011 01:14:28.882474 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33029235-8b5f-4193-9760-c14a1f6de702" containerName="nova-cell1-novncproxy-novncproxy" Oct 11 01:14:28 crc kubenswrapper[4743]: I1011 01:14:28.882490 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="33029235-8b5f-4193-9760-c14a1f6de702" containerName="nova-cell1-novncproxy-novncproxy" Oct 11 01:14:28 crc kubenswrapper[4743]: I1011 01:14:28.882677 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="33029235-8b5f-4193-9760-c14a1f6de702" containerName="nova-cell1-novncproxy-novncproxy" Oct 11 01:14:28 crc kubenswrapper[4743]: I1011 01:14:28.883588 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 11 01:14:28 crc kubenswrapper[4743]: I1011 01:14:28.885336 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Oct 11 01:14:28 crc kubenswrapper[4743]: I1011 01:14:28.885467 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 11 01:14:28 crc kubenswrapper[4743]: I1011 01:14:28.885635 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Oct 11 01:14:28 crc kubenswrapper[4743]: I1011 01:14:28.909217 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 11 01:14:29 crc kubenswrapper[4743]: I1011 01:14:29.058134 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d685eabb-511e-4604-9716-7676177726d6-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d685eabb-511e-4604-9716-7676177726d6\") " pod="openstack/nova-cell1-novncproxy-0" Oct 11 01:14:29 crc kubenswrapper[4743]: I1011 01:14:29.058182 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bld2q\" (UniqueName: \"kubernetes.io/projected/d685eabb-511e-4604-9716-7676177726d6-kube-api-access-bld2q\") pod \"nova-cell1-novncproxy-0\" (UID: \"d685eabb-511e-4604-9716-7676177726d6\") " pod="openstack/nova-cell1-novncproxy-0" Oct 11 01:14:29 crc kubenswrapper[4743]: I1011 01:14:29.058208 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d685eabb-511e-4604-9716-7676177726d6-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d685eabb-511e-4604-9716-7676177726d6\") " pod="openstack/nova-cell1-novncproxy-0" Oct 11 01:14:29 crc 
kubenswrapper[4743]: I1011 01:14:29.058226 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/d685eabb-511e-4604-9716-7676177726d6-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d685eabb-511e-4604-9716-7676177726d6\") " pod="openstack/nova-cell1-novncproxy-0" Oct 11 01:14:29 crc kubenswrapper[4743]: I1011 01:14:29.058261 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/d685eabb-511e-4604-9716-7676177726d6-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d685eabb-511e-4604-9716-7676177726d6\") " pod="openstack/nova-cell1-novncproxy-0" Oct 11 01:14:29 crc kubenswrapper[4743]: I1011 01:14:29.160155 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d685eabb-511e-4604-9716-7676177726d6-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d685eabb-511e-4604-9716-7676177726d6\") " pod="openstack/nova-cell1-novncproxy-0" Oct 11 01:14:29 crc kubenswrapper[4743]: I1011 01:14:29.160202 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bld2q\" (UniqueName: \"kubernetes.io/projected/d685eabb-511e-4604-9716-7676177726d6-kube-api-access-bld2q\") pod \"nova-cell1-novncproxy-0\" (UID: \"d685eabb-511e-4604-9716-7676177726d6\") " pod="openstack/nova-cell1-novncproxy-0" Oct 11 01:14:29 crc kubenswrapper[4743]: I1011 01:14:29.160230 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d685eabb-511e-4604-9716-7676177726d6-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d685eabb-511e-4604-9716-7676177726d6\") " pod="openstack/nova-cell1-novncproxy-0" Oct 11 01:14:29 crc kubenswrapper[4743]: I1011 
01:14:29.160256 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/d685eabb-511e-4604-9716-7676177726d6-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d685eabb-511e-4604-9716-7676177726d6\") " pod="openstack/nova-cell1-novncproxy-0" Oct 11 01:14:29 crc kubenswrapper[4743]: I1011 01:14:29.160307 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/d685eabb-511e-4604-9716-7676177726d6-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d685eabb-511e-4604-9716-7676177726d6\") " pod="openstack/nova-cell1-novncproxy-0" Oct 11 01:14:29 crc kubenswrapper[4743]: I1011 01:14:29.165080 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/d685eabb-511e-4604-9716-7676177726d6-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d685eabb-511e-4604-9716-7676177726d6\") " pod="openstack/nova-cell1-novncproxy-0" Oct 11 01:14:29 crc kubenswrapper[4743]: I1011 01:14:29.166532 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d685eabb-511e-4604-9716-7676177726d6-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d685eabb-511e-4604-9716-7676177726d6\") " pod="openstack/nova-cell1-novncproxy-0" Oct 11 01:14:29 crc kubenswrapper[4743]: I1011 01:14:29.168127 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/d685eabb-511e-4604-9716-7676177726d6-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d685eabb-511e-4604-9716-7676177726d6\") " pod="openstack/nova-cell1-novncproxy-0" Oct 11 01:14:29 crc kubenswrapper[4743]: I1011 01:14:29.172843 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d685eabb-511e-4604-9716-7676177726d6-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d685eabb-511e-4604-9716-7676177726d6\") " pod="openstack/nova-cell1-novncproxy-0" Oct 11 01:14:29 crc kubenswrapper[4743]: I1011 01:14:29.184386 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bld2q\" (UniqueName: \"kubernetes.io/projected/d685eabb-511e-4604-9716-7676177726d6-kube-api-access-bld2q\") pod \"nova-cell1-novncproxy-0\" (UID: \"d685eabb-511e-4604-9716-7676177726d6\") " pod="openstack/nova-cell1-novncproxy-0" Oct 11 01:14:29 crc kubenswrapper[4743]: I1011 01:14:29.241008 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 11 01:14:29 crc kubenswrapper[4743]: I1011 01:14:29.510396 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 11 01:14:29 crc kubenswrapper[4743]: I1011 01:14:29.527914 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 11 01:14:29 crc kubenswrapper[4743]: I1011 01:14:29.703896 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-f84f9ccf-m4xd7"] Oct 11 01:14:29 crc kubenswrapper[4743]: I1011 01:14:29.709312 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f84f9ccf-m4xd7" Oct 11 01:14:29 crc kubenswrapper[4743]: I1011 01:14:29.716586 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f84f9ccf-m4xd7"] Oct 11 01:14:29 crc kubenswrapper[4743]: I1011 01:14:29.735707 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 11 01:14:29 crc kubenswrapper[4743]: I1011 01:14:29.875965 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/229ca9c6-760d-4ea3-9599-bb5cfeeea826-ovsdbserver-nb\") pod \"dnsmasq-dns-f84f9ccf-m4xd7\" (UID: \"229ca9c6-760d-4ea3-9599-bb5cfeeea826\") " pod="openstack/dnsmasq-dns-f84f9ccf-m4xd7" Oct 11 01:14:29 crc kubenswrapper[4743]: I1011 01:14:29.876313 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/229ca9c6-760d-4ea3-9599-bb5cfeeea826-dns-svc\") pod \"dnsmasq-dns-f84f9ccf-m4xd7\" (UID: \"229ca9c6-760d-4ea3-9599-bb5cfeeea826\") " pod="openstack/dnsmasq-dns-f84f9ccf-m4xd7" Oct 11 01:14:29 crc kubenswrapper[4743]: I1011 01:14:29.876356 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/229ca9c6-760d-4ea3-9599-bb5cfeeea826-config\") pod \"dnsmasq-dns-f84f9ccf-m4xd7\" (UID: \"229ca9c6-760d-4ea3-9599-bb5cfeeea826\") " pod="openstack/dnsmasq-dns-f84f9ccf-m4xd7" Oct 11 01:14:29 crc kubenswrapper[4743]: I1011 01:14:29.876395 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/229ca9c6-760d-4ea3-9599-bb5cfeeea826-dns-swift-storage-0\") pod \"dnsmasq-dns-f84f9ccf-m4xd7\" (UID: \"229ca9c6-760d-4ea3-9599-bb5cfeeea826\") " pod="openstack/dnsmasq-dns-f84f9ccf-m4xd7" Oct 11 
01:14:29 crc kubenswrapper[4743]: I1011 01:14:29.876461 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5vxg\" (UniqueName: \"kubernetes.io/projected/229ca9c6-760d-4ea3-9599-bb5cfeeea826-kube-api-access-f5vxg\") pod \"dnsmasq-dns-f84f9ccf-m4xd7\" (UID: \"229ca9c6-760d-4ea3-9599-bb5cfeeea826\") " pod="openstack/dnsmasq-dns-f84f9ccf-m4xd7" Oct 11 01:14:29 crc kubenswrapper[4743]: I1011 01:14:29.876511 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/229ca9c6-760d-4ea3-9599-bb5cfeeea826-ovsdbserver-sb\") pod \"dnsmasq-dns-f84f9ccf-m4xd7\" (UID: \"229ca9c6-760d-4ea3-9599-bb5cfeeea826\") " pod="openstack/dnsmasq-dns-f84f9ccf-m4xd7" Oct 11 01:14:29 crc kubenswrapper[4743]: I1011 01:14:29.963312 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-592t9" Oct 11 01:14:29 crc kubenswrapper[4743]: I1011 01:14:29.979502 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/229ca9c6-760d-4ea3-9599-bb5cfeeea826-ovsdbserver-nb\") pod \"dnsmasq-dns-f84f9ccf-m4xd7\" (UID: \"229ca9c6-760d-4ea3-9599-bb5cfeeea826\") " pod="openstack/dnsmasq-dns-f84f9ccf-m4xd7" Oct 11 01:14:29 crc kubenswrapper[4743]: I1011 01:14:29.979579 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/229ca9c6-760d-4ea3-9599-bb5cfeeea826-dns-svc\") pod \"dnsmasq-dns-f84f9ccf-m4xd7\" (UID: \"229ca9c6-760d-4ea3-9599-bb5cfeeea826\") " pod="openstack/dnsmasq-dns-f84f9ccf-m4xd7" Oct 11 01:14:29 crc kubenswrapper[4743]: I1011 01:14:29.979636 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/229ca9c6-760d-4ea3-9599-bb5cfeeea826-config\") pod 
\"dnsmasq-dns-f84f9ccf-m4xd7\" (UID: \"229ca9c6-760d-4ea3-9599-bb5cfeeea826\") " pod="openstack/dnsmasq-dns-f84f9ccf-m4xd7" Oct 11 01:14:29 crc kubenswrapper[4743]: I1011 01:14:29.979687 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/229ca9c6-760d-4ea3-9599-bb5cfeeea826-dns-swift-storage-0\") pod \"dnsmasq-dns-f84f9ccf-m4xd7\" (UID: \"229ca9c6-760d-4ea3-9599-bb5cfeeea826\") " pod="openstack/dnsmasq-dns-f84f9ccf-m4xd7" Oct 11 01:14:29 crc kubenswrapper[4743]: I1011 01:14:29.979765 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5vxg\" (UniqueName: \"kubernetes.io/projected/229ca9c6-760d-4ea3-9599-bb5cfeeea826-kube-api-access-f5vxg\") pod \"dnsmasq-dns-f84f9ccf-m4xd7\" (UID: \"229ca9c6-760d-4ea3-9599-bb5cfeeea826\") " pod="openstack/dnsmasq-dns-f84f9ccf-m4xd7" Oct 11 01:14:29 crc kubenswrapper[4743]: I1011 01:14:29.979802 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/229ca9c6-760d-4ea3-9599-bb5cfeeea826-ovsdbserver-sb\") pod \"dnsmasq-dns-f84f9ccf-m4xd7\" (UID: \"229ca9c6-760d-4ea3-9599-bb5cfeeea826\") " pod="openstack/dnsmasq-dns-f84f9ccf-m4xd7" Oct 11 01:14:29 crc kubenswrapper[4743]: I1011 01:14:29.981811 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/229ca9c6-760d-4ea3-9599-bb5cfeeea826-ovsdbserver-sb\") pod \"dnsmasq-dns-f84f9ccf-m4xd7\" (UID: \"229ca9c6-760d-4ea3-9599-bb5cfeeea826\") " pod="openstack/dnsmasq-dns-f84f9ccf-m4xd7" Oct 11 01:14:29 crc kubenswrapper[4743]: I1011 01:14:29.982017 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/229ca9c6-760d-4ea3-9599-bb5cfeeea826-dns-swift-storage-0\") pod \"dnsmasq-dns-f84f9ccf-m4xd7\" (UID: 
\"229ca9c6-760d-4ea3-9599-bb5cfeeea826\") " pod="openstack/dnsmasq-dns-f84f9ccf-m4xd7" Oct 11 01:14:29 crc kubenswrapper[4743]: I1011 01:14:29.982017 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/229ca9c6-760d-4ea3-9599-bb5cfeeea826-config\") pod \"dnsmasq-dns-f84f9ccf-m4xd7\" (UID: \"229ca9c6-760d-4ea3-9599-bb5cfeeea826\") " pod="openstack/dnsmasq-dns-f84f9ccf-m4xd7" Oct 11 01:14:29 crc kubenswrapper[4743]: I1011 01:14:29.983017 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/229ca9c6-760d-4ea3-9599-bb5cfeeea826-dns-svc\") pod \"dnsmasq-dns-f84f9ccf-m4xd7\" (UID: \"229ca9c6-760d-4ea3-9599-bb5cfeeea826\") " pod="openstack/dnsmasq-dns-f84f9ccf-m4xd7" Oct 11 01:14:29 crc kubenswrapper[4743]: I1011 01:14:29.983012 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/229ca9c6-760d-4ea3-9599-bb5cfeeea826-ovsdbserver-nb\") pod \"dnsmasq-dns-f84f9ccf-m4xd7\" (UID: \"229ca9c6-760d-4ea3-9599-bb5cfeeea826\") " pod="openstack/dnsmasq-dns-f84f9ccf-m4xd7" Oct 11 01:14:30 crc kubenswrapper[4743]: I1011 01:14:30.011990 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5vxg\" (UniqueName: \"kubernetes.io/projected/229ca9c6-760d-4ea3-9599-bb5cfeeea826-kube-api-access-f5vxg\") pod \"dnsmasq-dns-f84f9ccf-m4xd7\" (UID: \"229ca9c6-760d-4ea3-9599-bb5cfeeea826\") " pod="openstack/dnsmasq-dns-f84f9ccf-m4xd7" Oct 11 01:14:30 crc kubenswrapper[4743]: I1011 01:14:30.081217 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf81bc47-30a1-4231-a23a-446813610da6-config-data\") pod \"cf81bc47-30a1-4231-a23a-446813610da6\" (UID: \"cf81bc47-30a1-4231-a23a-446813610da6\") " Oct 11 01:14:30 crc kubenswrapper[4743]: I1011 01:14:30.081372 4743 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf81bc47-30a1-4231-a23a-446813610da6-combined-ca-bundle\") pod \"cf81bc47-30a1-4231-a23a-446813610da6\" (UID: \"cf81bc47-30a1-4231-a23a-446813610da6\") " Oct 11 01:14:30 crc kubenswrapper[4743]: I1011 01:14:30.081451 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf81bc47-30a1-4231-a23a-446813610da6-scripts\") pod \"cf81bc47-30a1-4231-a23a-446813610da6\" (UID: \"cf81bc47-30a1-4231-a23a-446813610da6\") " Oct 11 01:14:30 crc kubenswrapper[4743]: I1011 01:14:30.081500 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66bjg\" (UniqueName: \"kubernetes.io/projected/cf81bc47-30a1-4231-a23a-446813610da6-kube-api-access-66bjg\") pod \"cf81bc47-30a1-4231-a23a-446813610da6\" (UID: \"cf81bc47-30a1-4231-a23a-446813610da6\") " Oct 11 01:14:30 crc kubenswrapper[4743]: I1011 01:14:30.086053 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf81bc47-30a1-4231-a23a-446813610da6-kube-api-access-66bjg" (OuterVolumeSpecName: "kube-api-access-66bjg") pod "cf81bc47-30a1-4231-a23a-446813610da6" (UID: "cf81bc47-30a1-4231-a23a-446813610da6"). InnerVolumeSpecName "kube-api-access-66bjg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:14:30 crc kubenswrapper[4743]: I1011 01:14:30.086107 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf81bc47-30a1-4231-a23a-446813610da6-scripts" (OuterVolumeSpecName: "scripts") pod "cf81bc47-30a1-4231-a23a-446813610da6" (UID: "cf81bc47-30a1-4231-a23a-446813610da6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:14:30 crc kubenswrapper[4743]: I1011 01:14:30.086529 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f84f9ccf-m4xd7" Oct 11 01:14:30 crc kubenswrapper[4743]: I1011 01:14:30.104987 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33029235-8b5f-4193-9760-c14a1f6de702" path="/var/lib/kubelet/pods/33029235-8b5f-4193-9760-c14a1f6de702/volumes" Oct 11 01:14:30 crc kubenswrapper[4743]: I1011 01:14:30.109972 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf81bc47-30a1-4231-a23a-446813610da6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cf81bc47-30a1-4231-a23a-446813610da6" (UID: "cf81bc47-30a1-4231-a23a-446813610da6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:14:30 crc kubenswrapper[4743]: I1011 01:14:30.121733 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf81bc47-30a1-4231-a23a-446813610da6-config-data" (OuterVolumeSpecName: "config-data") pod "cf81bc47-30a1-4231-a23a-446813610da6" (UID: "cf81bc47-30a1-4231-a23a-446813610da6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:14:30 crc kubenswrapper[4743]: I1011 01:14:30.184438 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf81bc47-30a1-4231-a23a-446813610da6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 01:14:30 crc kubenswrapper[4743]: I1011 01:14:30.184660 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf81bc47-30a1-4231-a23a-446813610da6-scripts\") on node \"crc\" DevicePath \"\"" Oct 11 01:14:30 crc kubenswrapper[4743]: I1011 01:14:30.184670 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66bjg\" (UniqueName: \"kubernetes.io/projected/cf81bc47-30a1-4231-a23a-446813610da6-kube-api-access-66bjg\") on node \"crc\" DevicePath \"\"" Oct 11 01:14:30 crc kubenswrapper[4743]: I1011 01:14:30.184681 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf81bc47-30a1-4231-a23a-446813610da6-config-data\") on node \"crc\" DevicePath \"\"" Oct 11 01:14:30 crc kubenswrapper[4743]: I1011 01:14:30.524182 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d685eabb-511e-4604-9716-7676177726d6","Type":"ContainerStarted","Data":"bd967d3061cd82b6058b0d28c33183f27cdc0c5b540ba69be7ee45e3a8b2922f"} Oct 11 01:14:30 crc kubenswrapper[4743]: I1011 01:14:30.524241 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d685eabb-511e-4604-9716-7676177726d6","Type":"ContainerStarted","Data":"d455b55ece51e4e55ff44181c672a22022c10c42b40e81cb0cc7e84f878a3e8d"} Oct 11 01:14:30 crc kubenswrapper[4743]: I1011 01:14:30.567947 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-592t9" Oct 11 01:14:30 crc kubenswrapper[4743]: I1011 01:14:30.568074 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-592t9" event={"ID":"cf81bc47-30a1-4231-a23a-446813610da6","Type":"ContainerDied","Data":"498c59d00ee590aee0b190b2388b289df775a331684ed71180dc0b486e50efac"} Oct 11 01:14:30 crc kubenswrapper[4743]: I1011 01:14:30.568120 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="498c59d00ee590aee0b190b2388b289df775a331684ed71180dc0b486e50efac" Oct 11 01:14:30 crc kubenswrapper[4743]: I1011 01:14:30.586128 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.586096393 podStartE2EDuration="2.586096393s" podCreationTimestamp="2025-10-11 01:14:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 01:14:30.548430361 +0000 UTC m=+1365.201410758" watchObservedRunningTime="2025-10-11 01:14:30.586096393 +0000 UTC m=+1365.239076790" Oct 11 01:14:30 crc kubenswrapper[4743]: I1011 01:14:30.684314 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f84f9ccf-m4xd7"] Oct 11 01:14:31 crc kubenswrapper[4743]: I1011 01:14:31.577087 4743 generic.go:334] "Generic (PLEG): container finished" podID="229ca9c6-760d-4ea3-9599-bb5cfeeea826" containerID="3712c50e49f77421ca79623016f56cb42e253a0bfdaa9c338c16f927ee50382a" exitCode=0 Oct 11 01:14:31 crc kubenswrapper[4743]: I1011 01:14:31.577177 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f84f9ccf-m4xd7" event={"ID":"229ca9c6-760d-4ea3-9599-bb5cfeeea826","Type":"ContainerDied","Data":"3712c50e49f77421ca79623016f56cb42e253a0bfdaa9c338c16f927ee50382a"} Oct 11 01:14:31 crc kubenswrapper[4743]: I1011 01:14:31.577768 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-f84f9ccf-m4xd7" event={"ID":"229ca9c6-760d-4ea3-9599-bb5cfeeea826","Type":"ContainerStarted","Data":"73a74bb2f1eca33e04276811345bed0cc4307816323e299082df2e988efc3bc9"} Oct 11 01:14:32 crc kubenswrapper[4743]: I1011 01:14:32.212338 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 11 01:14:32 crc kubenswrapper[4743]: I1011 01:14:32.213142 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ebe70c7f-81a8-4b0b-84de-05d3ed8cb584" containerName="ceilometer-central-agent" containerID="cri-o://e505fbffcce071baae5996b069f7b88114241cd03e0592d7fcdf9347f56fc134" gracePeriod=30 Oct 11 01:14:32 crc kubenswrapper[4743]: I1011 01:14:32.213247 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ebe70c7f-81a8-4b0b-84de-05d3ed8cb584" containerName="proxy-httpd" containerID="cri-o://fec04be03bc7a771a73945d71b479279a36a0acb5b3616a85952a72c28440340" gracePeriod=30 Oct 11 01:14:32 crc kubenswrapper[4743]: I1011 01:14:32.213287 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ebe70c7f-81a8-4b0b-84de-05d3ed8cb584" containerName="ceilometer-notification-agent" containerID="cri-o://162b0eb903378f50a28bf5eac63de80e676a498a708b9ea30bbd6df497856488" gracePeriod=30 Oct 11 01:14:32 crc kubenswrapper[4743]: I1011 01:14:32.213254 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ebe70c7f-81a8-4b0b-84de-05d3ed8cb584" containerName="sg-core" containerID="cri-o://a6e09a7bc274890145d0e1f3817133cfa116d0bc162a28ad64d99d13b23408a9" gracePeriod=30 Oct 11 01:14:32 crc kubenswrapper[4743]: I1011 01:14:32.220084 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="ebe70c7f-81a8-4b0b-84de-05d3ed8cb584" containerName="proxy-httpd" probeResult="failure" output="Get 
\"http://10.217.0.223:3000/\": EOF" Oct 11 01:14:32 crc kubenswrapper[4743]: I1011 01:14:32.430376 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 11 01:14:32 crc kubenswrapper[4743]: I1011 01:14:32.631720 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f84f9ccf-m4xd7" event={"ID":"229ca9c6-760d-4ea3-9599-bb5cfeeea826","Type":"ContainerStarted","Data":"95ca4b10ceb0b8f7ea64d770770a6ce3deec6c9a69acefa3ac9ed2b0c87e591f"} Oct 11 01:14:32 crc kubenswrapper[4743]: I1011 01:14:32.631814 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-f84f9ccf-m4xd7" Oct 11 01:14:32 crc kubenswrapper[4743]: I1011 01:14:32.671834 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-f84f9ccf-m4xd7" podStartSLOduration=3.671817001 podStartE2EDuration="3.671817001s" podCreationTimestamp="2025-10-11 01:14:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 01:14:32.664718936 +0000 UTC m=+1367.317699333" watchObservedRunningTime="2025-10-11 01:14:32.671817001 +0000 UTC m=+1367.324797398" Oct 11 01:14:32 crc kubenswrapper[4743]: I1011 01:14:32.684135 4743 generic.go:334] "Generic (PLEG): container finished" podID="ebe70c7f-81a8-4b0b-84de-05d3ed8cb584" containerID="fec04be03bc7a771a73945d71b479279a36a0acb5b3616a85952a72c28440340" exitCode=0 Oct 11 01:14:32 crc kubenswrapper[4743]: I1011 01:14:32.684166 4743 generic.go:334] "Generic (PLEG): container finished" podID="ebe70c7f-81a8-4b0b-84de-05d3ed8cb584" containerID="a6e09a7bc274890145d0e1f3817133cfa116d0bc162a28ad64d99d13b23408a9" exitCode=2 Oct 11 01:14:32 crc kubenswrapper[4743]: I1011 01:14:32.684230 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"ebe70c7f-81a8-4b0b-84de-05d3ed8cb584","Type":"ContainerDied","Data":"fec04be03bc7a771a73945d71b479279a36a0acb5b3616a85952a72c28440340"} Oct 11 01:14:32 crc kubenswrapper[4743]: I1011 01:14:32.684276 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ebe70c7f-81a8-4b0b-84de-05d3ed8cb584","Type":"ContainerDied","Data":"a6e09a7bc274890145d0e1f3817133cfa116d0bc162a28ad64d99d13b23408a9"} Oct 11 01:14:32 crc kubenswrapper[4743]: I1011 01:14:32.684337 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ed0e997e-fa72-4850-ade3-9dce1c59daab" containerName="nova-api-log" containerID="cri-o://27f9016a419154431b54d07aa857f96525ec64907314e71d85aeb3110bd6235b" gracePeriod=30 Oct 11 01:14:32 crc kubenswrapper[4743]: I1011 01:14:32.684399 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ed0e997e-fa72-4850-ade3-9dce1c59daab" containerName="nova-api-api" containerID="cri-o://1920b7f7416acfce69472dcfbc0753817b696901d93355db14578f266a9cf045" gracePeriod=30 Oct 11 01:14:33 crc kubenswrapper[4743]: I1011 01:14:33.699069 4743 generic.go:334] "Generic (PLEG): container finished" podID="ebe70c7f-81a8-4b0b-84de-05d3ed8cb584" containerID="162b0eb903378f50a28bf5eac63de80e676a498a708b9ea30bbd6df497856488" exitCode=0 Oct 11 01:14:33 crc kubenswrapper[4743]: I1011 01:14:33.699330 4743 generic.go:334] "Generic (PLEG): container finished" podID="ebe70c7f-81a8-4b0b-84de-05d3ed8cb584" containerID="e505fbffcce071baae5996b069f7b88114241cd03e0592d7fcdf9347f56fc134" exitCode=0 Oct 11 01:14:33 crc kubenswrapper[4743]: I1011 01:14:33.699142 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ebe70c7f-81a8-4b0b-84de-05d3ed8cb584","Type":"ContainerDied","Data":"162b0eb903378f50a28bf5eac63de80e676a498a708b9ea30bbd6df497856488"} Oct 11 01:14:33 crc kubenswrapper[4743]: I1011 01:14:33.699395 
4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ebe70c7f-81a8-4b0b-84de-05d3ed8cb584","Type":"ContainerDied","Data":"e505fbffcce071baae5996b069f7b88114241cd03e0592d7fcdf9347f56fc134"} Oct 11 01:14:33 crc kubenswrapper[4743]: I1011 01:14:33.702027 4743 generic.go:334] "Generic (PLEG): container finished" podID="ed0e997e-fa72-4850-ade3-9dce1c59daab" containerID="27f9016a419154431b54d07aa857f96525ec64907314e71d85aeb3110bd6235b" exitCode=143 Oct 11 01:14:33 crc kubenswrapper[4743]: I1011 01:14:33.702102 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ed0e997e-fa72-4850-ade3-9dce1c59daab","Type":"ContainerDied","Data":"27f9016a419154431b54d07aa857f96525ec64907314e71d85aeb3110bd6235b"} Oct 11 01:14:34 crc kubenswrapper[4743]: I1011 01:14:34.081547 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 11 01:14:34 crc kubenswrapper[4743]: I1011 01:14:34.173450 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ebe70c7f-81a8-4b0b-84de-05d3ed8cb584-log-httpd\") pod \"ebe70c7f-81a8-4b0b-84de-05d3ed8cb584\" (UID: \"ebe70c7f-81a8-4b0b-84de-05d3ed8cb584\") " Oct 11 01:14:34 crc kubenswrapper[4743]: I1011 01:14:34.173533 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ebe70c7f-81a8-4b0b-84de-05d3ed8cb584-run-httpd\") pod \"ebe70c7f-81a8-4b0b-84de-05d3ed8cb584\" (UID: \"ebe70c7f-81a8-4b0b-84de-05d3ed8cb584\") " Oct 11 01:14:34 crc kubenswrapper[4743]: I1011 01:14:34.173620 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebe70c7f-81a8-4b0b-84de-05d3ed8cb584-combined-ca-bundle\") pod \"ebe70c7f-81a8-4b0b-84de-05d3ed8cb584\" (UID: 
\"ebe70c7f-81a8-4b0b-84de-05d3ed8cb584\") " Oct 11 01:14:34 crc kubenswrapper[4743]: I1011 01:14:34.173716 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebe70c7f-81a8-4b0b-84de-05d3ed8cb584-config-data\") pod \"ebe70c7f-81a8-4b0b-84de-05d3ed8cb584\" (UID: \"ebe70c7f-81a8-4b0b-84de-05d3ed8cb584\") " Oct 11 01:14:34 crc kubenswrapper[4743]: I1011 01:14:34.173768 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-555cd\" (UniqueName: \"kubernetes.io/projected/ebe70c7f-81a8-4b0b-84de-05d3ed8cb584-kube-api-access-555cd\") pod \"ebe70c7f-81a8-4b0b-84de-05d3ed8cb584\" (UID: \"ebe70c7f-81a8-4b0b-84de-05d3ed8cb584\") " Oct 11 01:14:34 crc kubenswrapper[4743]: I1011 01:14:34.173870 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ebe70c7f-81a8-4b0b-84de-05d3ed8cb584-scripts\") pod \"ebe70c7f-81a8-4b0b-84de-05d3ed8cb584\" (UID: \"ebe70c7f-81a8-4b0b-84de-05d3ed8cb584\") " Oct 11 01:14:34 crc kubenswrapper[4743]: I1011 01:14:34.173891 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ebe70c7f-81a8-4b0b-84de-05d3ed8cb584-sg-core-conf-yaml\") pod \"ebe70c7f-81a8-4b0b-84de-05d3ed8cb584\" (UID: \"ebe70c7f-81a8-4b0b-84de-05d3ed8cb584\") " Oct 11 01:14:34 crc kubenswrapper[4743]: I1011 01:14:34.174371 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebe70c7f-81a8-4b0b-84de-05d3ed8cb584-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ebe70c7f-81a8-4b0b-84de-05d3ed8cb584" (UID: "ebe70c7f-81a8-4b0b-84de-05d3ed8cb584"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:14:34 crc kubenswrapper[4743]: I1011 01:14:34.174478 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebe70c7f-81a8-4b0b-84de-05d3ed8cb584-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ebe70c7f-81a8-4b0b-84de-05d3ed8cb584" (UID: "ebe70c7f-81a8-4b0b-84de-05d3ed8cb584"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:14:34 crc kubenswrapper[4743]: I1011 01:14:34.175874 4743 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ebe70c7f-81a8-4b0b-84de-05d3ed8cb584-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 11 01:14:34 crc kubenswrapper[4743]: I1011 01:14:34.175900 4743 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ebe70c7f-81a8-4b0b-84de-05d3ed8cb584-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 11 01:14:34 crc kubenswrapper[4743]: I1011 01:14:34.203572 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebe70c7f-81a8-4b0b-84de-05d3ed8cb584-kube-api-access-555cd" (OuterVolumeSpecName: "kube-api-access-555cd") pod "ebe70c7f-81a8-4b0b-84de-05d3ed8cb584" (UID: "ebe70c7f-81a8-4b0b-84de-05d3ed8cb584"). InnerVolumeSpecName "kube-api-access-555cd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:14:34 crc kubenswrapper[4743]: I1011 01:14:34.210689 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebe70c7f-81a8-4b0b-84de-05d3ed8cb584-scripts" (OuterVolumeSpecName: "scripts") pod "ebe70c7f-81a8-4b0b-84de-05d3ed8cb584" (UID: "ebe70c7f-81a8-4b0b-84de-05d3ed8cb584"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:14:34 crc kubenswrapper[4743]: I1011 01:14:34.230564 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebe70c7f-81a8-4b0b-84de-05d3ed8cb584-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ebe70c7f-81a8-4b0b-84de-05d3ed8cb584" (UID: "ebe70c7f-81a8-4b0b-84de-05d3ed8cb584"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:14:34 crc kubenswrapper[4743]: I1011 01:14:34.241719 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 11 01:14:34 crc kubenswrapper[4743]: I1011 01:14:34.270961 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebe70c7f-81a8-4b0b-84de-05d3ed8cb584-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ebe70c7f-81a8-4b0b-84de-05d3ed8cb584" (UID: "ebe70c7f-81a8-4b0b-84de-05d3ed8cb584"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:14:34 crc kubenswrapper[4743]: I1011 01:14:34.277275 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-555cd\" (UniqueName: \"kubernetes.io/projected/ebe70c7f-81a8-4b0b-84de-05d3ed8cb584-kube-api-access-555cd\") on node \"crc\" DevicePath \"\"" Oct 11 01:14:34 crc kubenswrapper[4743]: I1011 01:14:34.277304 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ebe70c7f-81a8-4b0b-84de-05d3ed8cb584-scripts\") on node \"crc\" DevicePath \"\"" Oct 11 01:14:34 crc kubenswrapper[4743]: I1011 01:14:34.277314 4743 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ebe70c7f-81a8-4b0b-84de-05d3ed8cb584-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 11 01:14:34 crc kubenswrapper[4743]: I1011 01:14:34.277323 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebe70c7f-81a8-4b0b-84de-05d3ed8cb584-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 01:14:34 crc kubenswrapper[4743]: I1011 01:14:34.285077 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebe70c7f-81a8-4b0b-84de-05d3ed8cb584-config-data" (OuterVolumeSpecName: "config-data") pod "ebe70c7f-81a8-4b0b-84de-05d3ed8cb584" (UID: "ebe70c7f-81a8-4b0b-84de-05d3ed8cb584"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:14:34 crc kubenswrapper[4743]: I1011 01:14:34.379551 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebe70c7f-81a8-4b0b-84de-05d3ed8cb584-config-data\") on node \"crc\" DevicePath \"\"" Oct 11 01:14:34 crc kubenswrapper[4743]: I1011 01:14:34.714688 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ebe70c7f-81a8-4b0b-84de-05d3ed8cb584","Type":"ContainerDied","Data":"d4db77043df5be01297534e501df05b9f20168eda583aee2ef8684072e3019b0"} Oct 11 01:14:34 crc kubenswrapper[4743]: I1011 01:14:34.714765 4743 scope.go:117] "RemoveContainer" containerID="fec04be03bc7a771a73945d71b479279a36a0acb5b3616a85952a72c28440340" Oct 11 01:14:34 crc kubenswrapper[4743]: I1011 01:14:34.714797 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 11 01:14:34 crc kubenswrapper[4743]: I1011 01:14:34.737621 4743 scope.go:117] "RemoveContainer" containerID="a6e09a7bc274890145d0e1f3817133cfa116d0bc162a28ad64d99d13b23408a9" Oct 11 01:14:34 crc kubenswrapper[4743]: I1011 01:14:34.769943 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 11 01:14:34 crc kubenswrapper[4743]: I1011 01:14:34.780390 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 11 01:14:34 crc kubenswrapper[4743]: I1011 01:14:34.786535 4743 scope.go:117] "RemoveContainer" containerID="162b0eb903378f50a28bf5eac63de80e676a498a708b9ea30bbd6df497856488" Oct 11 01:14:34 crc kubenswrapper[4743]: I1011 01:14:34.813437 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 11 01:14:34 crc kubenswrapper[4743]: E1011 01:14:34.814046 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf81bc47-30a1-4231-a23a-446813610da6" containerName="aodh-db-sync" Oct 11 01:14:34 crc 
kubenswrapper[4743]: I1011 01:14:34.814070 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf81bc47-30a1-4231-a23a-446813610da6" containerName="aodh-db-sync" Oct 11 01:14:34 crc kubenswrapper[4743]: E1011 01:14:34.814095 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebe70c7f-81a8-4b0b-84de-05d3ed8cb584" containerName="sg-core" Oct 11 01:14:34 crc kubenswrapper[4743]: I1011 01:14:34.814105 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebe70c7f-81a8-4b0b-84de-05d3ed8cb584" containerName="sg-core" Oct 11 01:14:34 crc kubenswrapper[4743]: E1011 01:14:34.814144 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebe70c7f-81a8-4b0b-84de-05d3ed8cb584" containerName="ceilometer-notification-agent" Oct 11 01:14:34 crc kubenswrapper[4743]: I1011 01:14:34.814153 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebe70c7f-81a8-4b0b-84de-05d3ed8cb584" containerName="ceilometer-notification-agent" Oct 11 01:14:34 crc kubenswrapper[4743]: E1011 01:14:34.814163 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebe70c7f-81a8-4b0b-84de-05d3ed8cb584" containerName="proxy-httpd" Oct 11 01:14:34 crc kubenswrapper[4743]: I1011 01:14:34.814174 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebe70c7f-81a8-4b0b-84de-05d3ed8cb584" containerName="proxy-httpd" Oct 11 01:14:34 crc kubenswrapper[4743]: E1011 01:14:34.814189 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebe70c7f-81a8-4b0b-84de-05d3ed8cb584" containerName="ceilometer-central-agent" Oct 11 01:14:34 crc kubenswrapper[4743]: I1011 01:14:34.814197 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebe70c7f-81a8-4b0b-84de-05d3ed8cb584" containerName="ceilometer-central-agent" Oct 11 01:14:34 crc kubenswrapper[4743]: I1011 01:14:34.814508 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebe70c7f-81a8-4b0b-84de-05d3ed8cb584" containerName="proxy-httpd" Oct 11 01:14:34 crc 
kubenswrapper[4743]: I1011 01:14:34.814529 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebe70c7f-81a8-4b0b-84de-05d3ed8cb584" containerName="sg-core" Oct 11 01:14:34 crc kubenswrapper[4743]: I1011 01:14:34.814549 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebe70c7f-81a8-4b0b-84de-05d3ed8cb584" containerName="ceilometer-notification-agent" Oct 11 01:14:34 crc kubenswrapper[4743]: I1011 01:14:34.814571 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebe70c7f-81a8-4b0b-84de-05d3ed8cb584" containerName="ceilometer-central-agent" Oct 11 01:14:34 crc kubenswrapper[4743]: I1011 01:14:34.814587 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf81bc47-30a1-4231-a23a-446813610da6" containerName="aodh-db-sync" Oct 11 01:14:34 crc kubenswrapper[4743]: I1011 01:14:34.817417 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 11 01:14:34 crc kubenswrapper[4743]: I1011 01:14:34.820463 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 11 01:14:34 crc kubenswrapper[4743]: I1011 01:14:34.820634 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 11 01:14:34 crc kubenswrapper[4743]: I1011 01:14:34.844767 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 11 01:14:34 crc kubenswrapper[4743]: I1011 01:14:34.849024 4743 scope.go:117] "RemoveContainer" containerID="e505fbffcce071baae5996b069f7b88114241cd03e0592d7fcdf9347f56fc134" Oct 11 01:14:34 crc kubenswrapper[4743]: I1011 01:14:34.889659 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Oct 11 01:14:34 crc kubenswrapper[4743]: I1011 01:14:34.899357 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Oct 11 01:14:34 crc kubenswrapper[4743]: I1011 01:14:34.901506 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15df0508-ff6a-49b2-a9ca-80728adceeed-log-httpd\") pod \"ceilometer-0\" (UID: \"15df0508-ff6a-49b2-a9ca-80728adceeed\") " pod="openstack/ceilometer-0" Oct 11 01:14:34 crc kubenswrapper[4743]: I1011 01:14:34.901590 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15df0508-ff6a-49b2-a9ca-80728adceeed-run-httpd\") pod \"ceilometer-0\" (UID: \"15df0508-ff6a-49b2-a9ca-80728adceeed\") " pod="openstack/ceilometer-0" Oct 11 01:14:34 crc kubenswrapper[4743]: I1011 01:14:34.901620 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/15df0508-ff6a-49b2-a9ca-80728adceeed-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"15df0508-ff6a-49b2-a9ca-80728adceeed\") " pod="openstack/ceilometer-0" Oct 11 01:14:34 crc kubenswrapper[4743]: I1011 01:14:34.901657 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15df0508-ff6a-49b2-a9ca-80728adceeed-scripts\") pod \"ceilometer-0\" (UID: \"15df0508-ff6a-49b2-a9ca-80728adceeed\") " pod="openstack/ceilometer-0" Oct 11 01:14:34 crc kubenswrapper[4743]: I1011 01:14:34.901679 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15df0508-ff6a-49b2-a9ca-80728adceeed-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"15df0508-ff6a-49b2-a9ca-80728adceeed\") " pod="openstack/ceilometer-0" Oct 11 01:14:34 crc kubenswrapper[4743]: I1011 01:14:34.901736 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtsg5\" (UniqueName: \"kubernetes.io/projected/15df0508-ff6a-49b2-a9ca-80728adceeed-kube-api-access-wtsg5\") pod \"ceilometer-0\" (UID: \"15df0508-ff6a-49b2-a9ca-80728adceeed\") " pod="openstack/ceilometer-0" Oct 11 01:14:34 crc kubenswrapper[4743]: I1011 01:14:34.901756 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15df0508-ff6a-49b2-a9ca-80728adceeed-config-data\") pod \"ceilometer-0\" (UID: \"15df0508-ff6a-49b2-a9ca-80728adceeed\") " pod="openstack/ceilometer-0" Oct 11 01:14:34 crc kubenswrapper[4743]: I1011 01:14:34.902832 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Oct 11 01:14:34 crc kubenswrapper[4743]: I1011 01:14:34.903070 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-dnc99" Oct 11 01:14:34 crc kubenswrapper[4743]: I1011 01:14:34.903420 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Oct 11 01:14:34 crc kubenswrapper[4743]: I1011 01:14:34.912125 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Oct 11 01:14:35 crc kubenswrapper[4743]: I1011 01:14:35.006326 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2500dccd-617d-4164-b9f6-5b675bab6848-scripts\") pod \"aodh-0\" (UID: \"2500dccd-617d-4164-b9f6-5b675bab6848\") " pod="openstack/aodh-0" Oct 11 01:14:35 crc kubenswrapper[4743]: I1011 01:14:35.006637 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2500dccd-617d-4164-b9f6-5b675bab6848-config-data\") pod \"aodh-0\" (UID: \"2500dccd-617d-4164-b9f6-5b675bab6848\") " 
pod="openstack/aodh-0" Oct 11 01:14:35 crc kubenswrapper[4743]: I1011 01:14:35.006688 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15df0508-ff6a-49b2-a9ca-80728adceeed-log-httpd\") pod \"ceilometer-0\" (UID: \"15df0508-ff6a-49b2-a9ca-80728adceeed\") " pod="openstack/ceilometer-0" Oct 11 01:14:35 crc kubenswrapper[4743]: I1011 01:14:35.006755 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15df0508-ff6a-49b2-a9ca-80728adceeed-run-httpd\") pod \"ceilometer-0\" (UID: \"15df0508-ff6a-49b2-a9ca-80728adceeed\") " pod="openstack/ceilometer-0" Oct 11 01:14:35 crc kubenswrapper[4743]: I1011 01:14:35.006786 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/15df0508-ff6a-49b2-a9ca-80728adceeed-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"15df0508-ff6a-49b2-a9ca-80728adceeed\") " pod="openstack/ceilometer-0" Oct 11 01:14:35 crc kubenswrapper[4743]: I1011 01:14:35.006806 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2500dccd-617d-4164-b9f6-5b675bab6848-combined-ca-bundle\") pod \"aodh-0\" (UID: \"2500dccd-617d-4164-b9f6-5b675bab6848\") " pod="openstack/aodh-0" Oct 11 01:14:35 crc kubenswrapper[4743]: I1011 01:14:35.006850 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15df0508-ff6a-49b2-a9ca-80728adceeed-scripts\") pod \"ceilometer-0\" (UID: \"15df0508-ff6a-49b2-a9ca-80728adceeed\") " pod="openstack/ceilometer-0" Oct 11 01:14:35 crc kubenswrapper[4743]: I1011 01:14:35.006913 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/15df0508-ff6a-49b2-a9ca-80728adceeed-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"15df0508-ff6a-49b2-a9ca-80728adceeed\") " pod="openstack/ceilometer-0" Oct 11 01:14:35 crc kubenswrapper[4743]: I1011 01:14:35.006947 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6m2vk\" (UniqueName: \"kubernetes.io/projected/2500dccd-617d-4164-b9f6-5b675bab6848-kube-api-access-6m2vk\") pod \"aodh-0\" (UID: \"2500dccd-617d-4164-b9f6-5b675bab6848\") " pod="openstack/aodh-0" Oct 11 01:14:35 crc kubenswrapper[4743]: I1011 01:14:35.007030 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtsg5\" (UniqueName: \"kubernetes.io/projected/15df0508-ff6a-49b2-a9ca-80728adceeed-kube-api-access-wtsg5\") pod \"ceilometer-0\" (UID: \"15df0508-ff6a-49b2-a9ca-80728adceeed\") " pod="openstack/ceilometer-0" Oct 11 01:14:35 crc kubenswrapper[4743]: I1011 01:14:35.007054 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15df0508-ff6a-49b2-a9ca-80728adceeed-config-data\") pod \"ceilometer-0\" (UID: \"15df0508-ff6a-49b2-a9ca-80728adceeed\") " pod="openstack/ceilometer-0" Oct 11 01:14:35 crc kubenswrapper[4743]: I1011 01:14:35.007624 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15df0508-ff6a-49b2-a9ca-80728adceeed-log-httpd\") pod \"ceilometer-0\" (UID: \"15df0508-ff6a-49b2-a9ca-80728adceeed\") " pod="openstack/ceilometer-0" Oct 11 01:14:35 crc kubenswrapper[4743]: I1011 01:14:35.007815 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15df0508-ff6a-49b2-a9ca-80728adceeed-run-httpd\") pod \"ceilometer-0\" (UID: \"15df0508-ff6a-49b2-a9ca-80728adceeed\") " pod="openstack/ceilometer-0" Oct 11 01:14:35 crc kubenswrapper[4743]: 
I1011 01:14:35.030531 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15df0508-ff6a-49b2-a9ca-80728adceeed-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"15df0508-ff6a-49b2-a9ca-80728adceeed\") " pod="openstack/ceilometer-0" Oct 11 01:14:35 crc kubenswrapper[4743]: I1011 01:14:35.031711 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15df0508-ff6a-49b2-a9ca-80728adceeed-config-data\") pod \"ceilometer-0\" (UID: \"15df0508-ff6a-49b2-a9ca-80728adceeed\") " pod="openstack/ceilometer-0" Oct 11 01:14:35 crc kubenswrapper[4743]: I1011 01:14:35.033193 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/15df0508-ff6a-49b2-a9ca-80728adceeed-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"15df0508-ff6a-49b2-a9ca-80728adceeed\") " pod="openstack/ceilometer-0" Oct 11 01:14:35 crc kubenswrapper[4743]: I1011 01:14:35.042906 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15df0508-ff6a-49b2-a9ca-80728adceeed-scripts\") pod \"ceilometer-0\" (UID: \"15df0508-ff6a-49b2-a9ca-80728adceeed\") " pod="openstack/ceilometer-0" Oct 11 01:14:35 crc kubenswrapper[4743]: I1011 01:14:35.060892 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtsg5\" (UniqueName: \"kubernetes.io/projected/15df0508-ff6a-49b2-a9ca-80728adceeed-kube-api-access-wtsg5\") pod \"ceilometer-0\" (UID: \"15df0508-ff6a-49b2-a9ca-80728adceeed\") " pod="openstack/ceilometer-0" Oct 11 01:14:35 crc kubenswrapper[4743]: I1011 01:14:35.110257 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2500dccd-617d-4164-b9f6-5b675bab6848-scripts\") pod \"aodh-0\" (UID: 
\"2500dccd-617d-4164-b9f6-5b675bab6848\") " pod="openstack/aodh-0" Oct 11 01:14:35 crc kubenswrapper[4743]: I1011 01:14:35.110313 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2500dccd-617d-4164-b9f6-5b675bab6848-config-data\") pod \"aodh-0\" (UID: \"2500dccd-617d-4164-b9f6-5b675bab6848\") " pod="openstack/aodh-0" Oct 11 01:14:35 crc kubenswrapper[4743]: I1011 01:14:35.110411 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2500dccd-617d-4164-b9f6-5b675bab6848-combined-ca-bundle\") pod \"aodh-0\" (UID: \"2500dccd-617d-4164-b9f6-5b675bab6848\") " pod="openstack/aodh-0" Oct 11 01:14:35 crc kubenswrapper[4743]: I1011 01:14:35.110463 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6m2vk\" (UniqueName: \"kubernetes.io/projected/2500dccd-617d-4164-b9f6-5b675bab6848-kube-api-access-6m2vk\") pod \"aodh-0\" (UID: \"2500dccd-617d-4164-b9f6-5b675bab6848\") " pod="openstack/aodh-0" Oct 11 01:14:35 crc kubenswrapper[4743]: I1011 01:14:35.114834 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2500dccd-617d-4164-b9f6-5b675bab6848-scripts\") pod \"aodh-0\" (UID: \"2500dccd-617d-4164-b9f6-5b675bab6848\") " pod="openstack/aodh-0" Oct 11 01:14:35 crc kubenswrapper[4743]: I1011 01:14:35.115550 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2500dccd-617d-4164-b9f6-5b675bab6848-config-data\") pod \"aodh-0\" (UID: \"2500dccd-617d-4164-b9f6-5b675bab6848\") " pod="openstack/aodh-0" Oct 11 01:14:35 crc kubenswrapper[4743]: I1011 01:14:35.117460 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2500dccd-617d-4164-b9f6-5b675bab6848-combined-ca-bundle\") pod \"aodh-0\" (UID: \"2500dccd-617d-4164-b9f6-5b675bab6848\") " pod="openstack/aodh-0" Oct 11 01:14:35 crc kubenswrapper[4743]: I1011 01:14:35.125403 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6m2vk\" (UniqueName: \"kubernetes.io/projected/2500dccd-617d-4164-b9f6-5b675bab6848-kube-api-access-6m2vk\") pod \"aodh-0\" (UID: \"2500dccd-617d-4164-b9f6-5b675bab6848\") " pod="openstack/aodh-0" Oct 11 01:14:35 crc kubenswrapper[4743]: I1011 01:14:35.148874 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 11 01:14:35 crc kubenswrapper[4743]: I1011 01:14:35.225730 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Oct 11 01:14:35 crc kubenswrapper[4743]: I1011 01:14:35.676162 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 11 01:14:35 crc kubenswrapper[4743]: W1011 01:14:35.679445 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod15df0508_ff6a_49b2_a9ca_80728adceeed.slice/crio-85923eef799bdc3c1d0fc3b00c4cfb3a3e61b46d8a52d84a772807a043772c89 WatchSource:0}: Error finding container 85923eef799bdc3c1d0fc3b00c4cfb3a3e61b46d8a52d84a772807a043772c89: Status 404 returned error can't find the container with id 85923eef799bdc3c1d0fc3b00c4cfb3a3e61b46d8a52d84a772807a043772c89 Oct 11 01:14:35 crc kubenswrapper[4743]: I1011 01:14:35.727592 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"15df0508-ff6a-49b2-a9ca-80728adceeed","Type":"ContainerStarted","Data":"85923eef799bdc3c1d0fc3b00c4cfb3a3e61b46d8a52d84a772807a043772c89"} Oct 11 01:14:35 crc kubenswrapper[4743]: W1011 01:14:35.802919 4743 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2500dccd_617d_4164_b9f6_5b675bab6848.slice/crio-1bd255295a1481fa9997871e087787bb0a3dfa4a6a55ed991ea67e819c1ba42e WatchSource:0}: Error finding container 1bd255295a1481fa9997871e087787bb0a3dfa4a6a55ed991ea67e819c1ba42e: Status 404 returned error can't find the container with id 1bd255295a1481fa9997871e087787bb0a3dfa4a6a55ed991ea67e819c1ba42e Oct 11 01:14:35 crc kubenswrapper[4743]: I1011 01:14:35.805300 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Oct 11 01:14:36 crc kubenswrapper[4743]: I1011 01:14:36.155436 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebe70c7f-81a8-4b0b-84de-05d3ed8cb584" path="/var/lib/kubelet/pods/ebe70c7f-81a8-4b0b-84de-05d3ed8cb584/volumes" Oct 11 01:14:36 crc kubenswrapper[4743]: I1011 01:14:36.516399 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 11 01:14:36 crc kubenswrapper[4743]: I1011 01:14:36.665681 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed0e997e-fa72-4850-ade3-9dce1c59daab-combined-ca-bundle\") pod \"ed0e997e-fa72-4850-ade3-9dce1c59daab\" (UID: \"ed0e997e-fa72-4850-ade3-9dce1c59daab\") " Oct 11 01:14:36 crc kubenswrapper[4743]: I1011 01:14:36.666129 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68ns6\" (UniqueName: \"kubernetes.io/projected/ed0e997e-fa72-4850-ade3-9dce1c59daab-kube-api-access-68ns6\") pod \"ed0e997e-fa72-4850-ade3-9dce1c59daab\" (UID: \"ed0e997e-fa72-4850-ade3-9dce1c59daab\") " Oct 11 01:14:36 crc kubenswrapper[4743]: I1011 01:14:36.666172 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed0e997e-fa72-4850-ade3-9dce1c59daab-config-data\") pod \"ed0e997e-fa72-4850-ade3-9dce1c59daab\" 
(UID: \"ed0e997e-fa72-4850-ade3-9dce1c59daab\") " Oct 11 01:14:36 crc kubenswrapper[4743]: I1011 01:14:36.666283 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed0e997e-fa72-4850-ade3-9dce1c59daab-logs\") pod \"ed0e997e-fa72-4850-ade3-9dce1c59daab\" (UID: \"ed0e997e-fa72-4850-ade3-9dce1c59daab\") " Oct 11 01:14:36 crc kubenswrapper[4743]: I1011 01:14:36.671630 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed0e997e-fa72-4850-ade3-9dce1c59daab-kube-api-access-68ns6" (OuterVolumeSpecName: "kube-api-access-68ns6") pod "ed0e997e-fa72-4850-ade3-9dce1c59daab" (UID: "ed0e997e-fa72-4850-ade3-9dce1c59daab"). InnerVolumeSpecName "kube-api-access-68ns6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:14:36 crc kubenswrapper[4743]: I1011 01:14:36.673758 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed0e997e-fa72-4850-ade3-9dce1c59daab-logs" (OuterVolumeSpecName: "logs") pod "ed0e997e-fa72-4850-ade3-9dce1c59daab" (UID: "ed0e997e-fa72-4850-ade3-9dce1c59daab"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:14:36 crc kubenswrapper[4743]: I1011 01:14:36.701541 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed0e997e-fa72-4850-ade3-9dce1c59daab-config-data" (OuterVolumeSpecName: "config-data") pod "ed0e997e-fa72-4850-ade3-9dce1c59daab" (UID: "ed0e997e-fa72-4850-ade3-9dce1c59daab"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:14:36 crc kubenswrapper[4743]: I1011 01:14:36.713516 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed0e997e-fa72-4850-ade3-9dce1c59daab-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ed0e997e-fa72-4850-ade3-9dce1c59daab" (UID: "ed0e997e-fa72-4850-ade3-9dce1c59daab"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:14:36 crc kubenswrapper[4743]: I1011 01:14:36.747767 4743 generic.go:334] "Generic (PLEG): container finished" podID="ed0e997e-fa72-4850-ade3-9dce1c59daab" containerID="1920b7f7416acfce69472dcfbc0753817b696901d93355db14578f266a9cf045" exitCode=0 Oct 11 01:14:36 crc kubenswrapper[4743]: I1011 01:14:36.747835 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ed0e997e-fa72-4850-ade3-9dce1c59daab","Type":"ContainerDied","Data":"1920b7f7416acfce69472dcfbc0753817b696901d93355db14578f266a9cf045"} Oct 11 01:14:36 crc kubenswrapper[4743]: I1011 01:14:36.747878 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ed0e997e-fa72-4850-ade3-9dce1c59daab","Type":"ContainerDied","Data":"cba2b4d1a39ce7e144110e60b0e4d7973be3b01555ce3208d23dbf958a1bb66b"} Oct 11 01:14:36 crc kubenswrapper[4743]: I1011 01:14:36.747896 4743 scope.go:117] "RemoveContainer" containerID="1920b7f7416acfce69472dcfbc0753817b696901d93355db14578f266a9cf045" Oct 11 01:14:36 crc kubenswrapper[4743]: I1011 01:14:36.748006 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 11 01:14:36 crc kubenswrapper[4743]: I1011 01:14:36.756361 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"15df0508-ff6a-49b2-a9ca-80728adceeed","Type":"ContainerStarted","Data":"362c681795501aec8d43ce09e7837eebef382d9b2352f1ae2e7a176e2113b024"} Oct 11 01:14:36 crc kubenswrapper[4743]: I1011 01:14:36.758265 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"2500dccd-617d-4164-b9f6-5b675bab6848","Type":"ContainerStarted","Data":"eb4ccaf3fb26fca9439b311b8cb335c04270a0a3871b4f22cf7c7437ec0e7e74"} Oct 11 01:14:36 crc kubenswrapper[4743]: I1011 01:14:36.758288 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"2500dccd-617d-4164-b9f6-5b675bab6848","Type":"ContainerStarted","Data":"1bd255295a1481fa9997871e087787bb0a3dfa4a6a55ed991ea67e819c1ba42e"} Oct 11 01:14:36 crc kubenswrapper[4743]: I1011 01:14:36.769868 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed0e997e-fa72-4850-ade3-9dce1c59daab-logs\") on node \"crc\" DevicePath \"\"" Oct 11 01:14:36 crc kubenswrapper[4743]: I1011 01:14:36.769895 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed0e997e-fa72-4850-ade3-9dce1c59daab-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 01:14:36 crc kubenswrapper[4743]: I1011 01:14:36.769908 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68ns6\" (UniqueName: \"kubernetes.io/projected/ed0e997e-fa72-4850-ade3-9dce1c59daab-kube-api-access-68ns6\") on node \"crc\" DevicePath \"\"" Oct 11 01:14:36 crc kubenswrapper[4743]: I1011 01:14:36.769920 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed0e997e-fa72-4850-ade3-9dce1c59daab-config-data\") on node \"crc\" DevicePath \"\"" 
Oct 11 01:14:36 crc kubenswrapper[4743]: I1011 01:14:36.785105 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 11 01:14:36 crc kubenswrapper[4743]: I1011 01:14:36.788320 4743 scope.go:117] "RemoveContainer" containerID="27f9016a419154431b54d07aa857f96525ec64907314e71d85aeb3110bd6235b" Oct 11 01:14:36 crc kubenswrapper[4743]: I1011 01:14:36.797961 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 11 01:14:36 crc kubenswrapper[4743]: I1011 01:14:36.805709 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 11 01:14:36 crc kubenswrapper[4743]: E1011 01:14:36.806246 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed0e997e-fa72-4850-ade3-9dce1c59daab" containerName="nova-api-api" Oct 11 01:14:36 crc kubenswrapper[4743]: I1011 01:14:36.806263 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed0e997e-fa72-4850-ade3-9dce1c59daab" containerName="nova-api-api" Oct 11 01:14:36 crc kubenswrapper[4743]: E1011 01:14:36.806286 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed0e997e-fa72-4850-ade3-9dce1c59daab" containerName="nova-api-log" Oct 11 01:14:36 crc kubenswrapper[4743]: I1011 01:14:36.806292 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed0e997e-fa72-4850-ade3-9dce1c59daab" containerName="nova-api-log" Oct 11 01:14:36 crc kubenswrapper[4743]: I1011 01:14:36.806488 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed0e997e-fa72-4850-ade3-9dce1c59daab" containerName="nova-api-log" Oct 11 01:14:36 crc kubenswrapper[4743]: I1011 01:14:36.806508 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed0e997e-fa72-4850-ade3-9dce1c59daab" containerName="nova-api-api" Oct 11 01:14:36 crc kubenswrapper[4743]: I1011 01:14:36.807622 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 11 01:14:36 crc kubenswrapper[4743]: I1011 01:14:36.811450 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 11 01:14:36 crc kubenswrapper[4743]: I1011 01:14:36.811671 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 11 01:14:36 crc kubenswrapper[4743]: I1011 01:14:36.811796 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 11 01:14:36 crc kubenswrapper[4743]: I1011 01:14:36.815152 4743 scope.go:117] "RemoveContainer" containerID="1920b7f7416acfce69472dcfbc0753817b696901d93355db14578f266a9cf045" Oct 11 01:14:36 crc kubenswrapper[4743]: E1011 01:14:36.816030 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1920b7f7416acfce69472dcfbc0753817b696901d93355db14578f266a9cf045\": container with ID starting with 1920b7f7416acfce69472dcfbc0753817b696901d93355db14578f266a9cf045 not found: ID does not exist" containerID="1920b7f7416acfce69472dcfbc0753817b696901d93355db14578f266a9cf045" Oct 11 01:14:36 crc kubenswrapper[4743]: I1011 01:14:36.816065 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1920b7f7416acfce69472dcfbc0753817b696901d93355db14578f266a9cf045"} err="failed to get container status \"1920b7f7416acfce69472dcfbc0753817b696901d93355db14578f266a9cf045\": rpc error: code = NotFound desc = could not find container \"1920b7f7416acfce69472dcfbc0753817b696901d93355db14578f266a9cf045\": container with ID starting with 1920b7f7416acfce69472dcfbc0753817b696901d93355db14578f266a9cf045 not found: ID does not exist" Oct 11 01:14:36 crc kubenswrapper[4743]: I1011 01:14:36.816094 4743 scope.go:117] "RemoveContainer" containerID="27f9016a419154431b54d07aa857f96525ec64907314e71d85aeb3110bd6235b" Oct 11 01:14:36 crc 
kubenswrapper[4743]: I1011 01:14:36.816367 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 11 01:14:36 crc kubenswrapper[4743]: E1011 01:14:36.816532 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27f9016a419154431b54d07aa857f96525ec64907314e71d85aeb3110bd6235b\": container with ID starting with 27f9016a419154431b54d07aa857f96525ec64907314e71d85aeb3110bd6235b not found: ID does not exist" containerID="27f9016a419154431b54d07aa857f96525ec64907314e71d85aeb3110bd6235b" Oct 11 01:14:36 crc kubenswrapper[4743]: I1011 01:14:36.816571 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27f9016a419154431b54d07aa857f96525ec64907314e71d85aeb3110bd6235b"} err="failed to get container status \"27f9016a419154431b54d07aa857f96525ec64907314e71d85aeb3110bd6235b\": rpc error: code = NotFound desc = could not find container \"27f9016a419154431b54d07aa857f96525ec64907314e71d85aeb3110bd6235b\": container with ID starting with 27f9016a419154431b54d07aa857f96525ec64907314e71d85aeb3110bd6235b not found: ID does not exist" Oct 11 01:14:36 crc kubenswrapper[4743]: I1011 01:14:36.973296 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d840543d-ffc6-405a-8e01-e89d6f237820-public-tls-certs\") pod \"nova-api-0\" (UID: \"d840543d-ffc6-405a-8e01-e89d6f237820\") " pod="openstack/nova-api-0" Oct 11 01:14:36 crc kubenswrapper[4743]: I1011 01:14:36.973382 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d840543d-ffc6-405a-8e01-e89d6f237820-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d840543d-ffc6-405a-8e01-e89d6f237820\") " pod="openstack/nova-api-0" Oct 11 01:14:36 crc kubenswrapper[4743]: I1011 01:14:36.973457 
4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d840543d-ffc6-405a-8e01-e89d6f237820-internal-tls-certs\") pod \"nova-api-0\" (UID: \"d840543d-ffc6-405a-8e01-e89d6f237820\") " pod="openstack/nova-api-0" Oct 11 01:14:36 crc kubenswrapper[4743]: I1011 01:14:36.973482 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d840543d-ffc6-405a-8e01-e89d6f237820-logs\") pod \"nova-api-0\" (UID: \"d840543d-ffc6-405a-8e01-e89d6f237820\") " pod="openstack/nova-api-0" Oct 11 01:14:36 crc kubenswrapper[4743]: I1011 01:14:36.973511 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsrb2\" (UniqueName: \"kubernetes.io/projected/d840543d-ffc6-405a-8e01-e89d6f237820-kube-api-access-xsrb2\") pod \"nova-api-0\" (UID: \"d840543d-ffc6-405a-8e01-e89d6f237820\") " pod="openstack/nova-api-0" Oct 11 01:14:36 crc kubenswrapper[4743]: I1011 01:14:36.973539 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d840543d-ffc6-405a-8e01-e89d6f237820-config-data\") pod \"nova-api-0\" (UID: \"d840543d-ffc6-405a-8e01-e89d6f237820\") " pod="openstack/nova-api-0" Oct 11 01:14:37 crc kubenswrapper[4743]: I1011 01:14:37.074832 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsrb2\" (UniqueName: \"kubernetes.io/projected/d840543d-ffc6-405a-8e01-e89d6f237820-kube-api-access-xsrb2\") pod \"nova-api-0\" (UID: \"d840543d-ffc6-405a-8e01-e89d6f237820\") " pod="openstack/nova-api-0" Oct 11 01:14:37 crc kubenswrapper[4743]: I1011 01:14:37.074979 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d840543d-ffc6-405a-8e01-e89d6f237820-config-data\") pod \"nova-api-0\" (UID: \"d840543d-ffc6-405a-8e01-e89d6f237820\") " pod="openstack/nova-api-0" Oct 11 01:14:37 crc kubenswrapper[4743]: I1011 01:14:37.075224 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d840543d-ffc6-405a-8e01-e89d6f237820-public-tls-certs\") pod \"nova-api-0\" (UID: \"d840543d-ffc6-405a-8e01-e89d6f237820\") " pod="openstack/nova-api-0" Oct 11 01:14:37 crc kubenswrapper[4743]: I1011 01:14:37.075434 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d840543d-ffc6-405a-8e01-e89d6f237820-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d840543d-ffc6-405a-8e01-e89d6f237820\") " pod="openstack/nova-api-0" Oct 11 01:14:37 crc kubenswrapper[4743]: I1011 01:14:37.076048 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d840543d-ffc6-405a-8e01-e89d6f237820-internal-tls-certs\") pod \"nova-api-0\" (UID: \"d840543d-ffc6-405a-8e01-e89d6f237820\") " pod="openstack/nova-api-0" Oct 11 01:14:37 crc kubenswrapper[4743]: I1011 01:14:37.076106 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d840543d-ffc6-405a-8e01-e89d6f237820-logs\") pod \"nova-api-0\" (UID: \"d840543d-ffc6-405a-8e01-e89d6f237820\") " pod="openstack/nova-api-0" Oct 11 01:14:37 crc kubenswrapper[4743]: I1011 01:14:37.076487 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d840543d-ffc6-405a-8e01-e89d6f237820-logs\") pod \"nova-api-0\" (UID: \"d840543d-ffc6-405a-8e01-e89d6f237820\") " pod="openstack/nova-api-0" Oct 11 01:14:37 crc kubenswrapper[4743]: I1011 01:14:37.078650 4743 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d840543d-ffc6-405a-8e01-e89d6f237820-config-data\") pod \"nova-api-0\" (UID: \"d840543d-ffc6-405a-8e01-e89d6f237820\") " pod="openstack/nova-api-0" Oct 11 01:14:37 crc kubenswrapper[4743]: I1011 01:14:37.080363 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d840543d-ffc6-405a-8e01-e89d6f237820-public-tls-certs\") pod \"nova-api-0\" (UID: \"d840543d-ffc6-405a-8e01-e89d6f237820\") " pod="openstack/nova-api-0" Oct 11 01:14:37 crc kubenswrapper[4743]: I1011 01:14:37.080472 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d840543d-ffc6-405a-8e01-e89d6f237820-internal-tls-certs\") pod \"nova-api-0\" (UID: \"d840543d-ffc6-405a-8e01-e89d6f237820\") " pod="openstack/nova-api-0" Oct 11 01:14:37 crc kubenswrapper[4743]: I1011 01:14:37.080525 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d840543d-ffc6-405a-8e01-e89d6f237820-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d840543d-ffc6-405a-8e01-e89d6f237820\") " pod="openstack/nova-api-0" Oct 11 01:14:37 crc kubenswrapper[4743]: I1011 01:14:37.093354 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsrb2\" (UniqueName: \"kubernetes.io/projected/d840543d-ffc6-405a-8e01-e89d6f237820-kube-api-access-xsrb2\") pod \"nova-api-0\" (UID: \"d840543d-ffc6-405a-8e01-e89d6f237820\") " pod="openstack/nova-api-0" Oct 11 01:14:37 crc kubenswrapper[4743]: I1011 01:14:37.127073 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 11 01:14:37 crc kubenswrapper[4743]: I1011 01:14:37.630259 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 11 01:14:37 crc kubenswrapper[4743]: I1011 01:14:37.680357 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 11 01:14:37 crc kubenswrapper[4743]: I1011 01:14:37.788808 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"15df0508-ff6a-49b2-a9ca-80728adceeed","Type":"ContainerStarted","Data":"f87f6575ead149a7c51791d8c58d34b33501e2d070690f495eb06f057a97e875"} Oct 11 01:14:37 crc kubenswrapper[4743]: I1011 01:14:37.791323 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d840543d-ffc6-405a-8e01-e89d6f237820","Type":"ContainerStarted","Data":"7909e44e39e7762d01699709adb21703f942be526e8a930e8eb21cdb5db79a66"} Oct 11 01:14:37 crc kubenswrapper[4743]: I1011 01:14:37.834059 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Oct 11 01:14:38 crc kubenswrapper[4743]: I1011 01:14:38.102722 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed0e997e-fa72-4850-ade3-9dce1c59daab" path="/var/lib/kubelet/pods/ed0e997e-fa72-4850-ade3-9dce1c59daab/volumes" Oct 11 01:14:38 crc kubenswrapper[4743]: I1011 01:14:38.802871 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d840543d-ffc6-405a-8e01-e89d6f237820","Type":"ContainerStarted","Data":"37c168db353224216e44b6c5a7a662e54d1f6647d619bdc64991536f270d4d38"} Oct 11 01:14:39 crc kubenswrapper[4743]: I1011 01:14:39.241560 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Oct 11 01:14:39 crc kubenswrapper[4743]: I1011 01:14:39.273290 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Oct 11 
01:14:39 crc kubenswrapper[4743]: I1011 01:14:39.820004 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"15df0508-ff6a-49b2-a9ca-80728adceeed","Type":"ContainerStarted","Data":"7bd09168ae2e684cd0fff7d087d25c45753110e20e21a220d95f12439edb08fb"} Oct 11 01:14:39 crc kubenswrapper[4743]: I1011 01:14:39.824720 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d840543d-ffc6-405a-8e01-e89d6f237820","Type":"ContainerStarted","Data":"7b354e26e1d24c4652be4a6a1cebd923176c80fa9eb40c4ba4ac0b9e0281807d"} Oct 11 01:14:39 crc kubenswrapper[4743]: I1011 01:14:39.831367 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"2500dccd-617d-4164-b9f6-5b675bab6848","Type":"ContainerStarted","Data":"d0b8e39936f9f2d8724ca6b9865d50e050a156eca6c8df578714ef6f7d2f8499"} Oct 11 01:14:39 crc kubenswrapper[4743]: I1011 01:14:39.850823 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.85080211 podStartE2EDuration="3.85080211s" podCreationTimestamp="2025-10-11 01:14:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 01:14:39.847527623 +0000 UTC m=+1374.500508040" watchObservedRunningTime="2025-10-11 01:14:39.85080211 +0000 UTC m=+1374.503782517" Oct 11 01:14:39 crc kubenswrapper[4743]: I1011 01:14:39.877909 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Oct 11 01:14:40 crc kubenswrapper[4743]: I1011 01:14:40.065569 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-jckg7"] Oct 11 01:14:40 crc kubenswrapper[4743]: I1011 01:14:40.067300 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-jckg7" Oct 11 01:14:40 crc kubenswrapper[4743]: I1011 01:14:40.069387 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Oct 11 01:14:40 crc kubenswrapper[4743]: I1011 01:14:40.070038 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Oct 11 01:14:40 crc kubenswrapper[4743]: I1011 01:14:40.075311 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-jckg7"] Oct 11 01:14:40 crc kubenswrapper[4743]: I1011 01:14:40.088160 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-f84f9ccf-m4xd7" Oct 11 01:14:40 crc kubenswrapper[4743]: I1011 01:14:40.147919 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96dd3888-06fd-48f8-a0b2-b320851dd83c-config-data\") pod \"nova-cell1-cell-mapping-jckg7\" (UID: \"96dd3888-06fd-48f8-a0b2-b320851dd83c\") " pod="openstack/nova-cell1-cell-mapping-jckg7" Oct 11 01:14:40 crc kubenswrapper[4743]: I1011 01:14:40.148032 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96dd3888-06fd-48f8-a0b2-b320851dd83c-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-jckg7\" (UID: \"96dd3888-06fd-48f8-a0b2-b320851dd83c\") " pod="openstack/nova-cell1-cell-mapping-jckg7" Oct 11 01:14:40 crc kubenswrapper[4743]: I1011 01:14:40.148089 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmc25\" (UniqueName: \"kubernetes.io/projected/96dd3888-06fd-48f8-a0b2-b320851dd83c-kube-api-access-dmc25\") pod \"nova-cell1-cell-mapping-jckg7\" (UID: \"96dd3888-06fd-48f8-a0b2-b320851dd83c\") " pod="openstack/nova-cell1-cell-mapping-jckg7" 
Oct 11 01:14:40 crc kubenswrapper[4743]: I1011 01:14:40.148127 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/96dd3888-06fd-48f8-a0b2-b320851dd83c-scripts\") pod \"nova-cell1-cell-mapping-jckg7\" (UID: \"96dd3888-06fd-48f8-a0b2-b320851dd83c\") " pod="openstack/nova-cell1-cell-mapping-jckg7" Oct 11 01:14:40 crc kubenswrapper[4743]: I1011 01:14:40.149020 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-568d7fd7cf-4jblv"] Oct 11 01:14:40 crc kubenswrapper[4743]: I1011 01:14:40.149282 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-568d7fd7cf-4jblv" podUID="6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e" containerName="dnsmasq-dns" containerID="cri-o://74190e0dd8ccc2e83c3d5b116f213c662e79d07e807ce83d49955bd900f89bd7" gracePeriod=10 Oct 11 01:14:40 crc kubenswrapper[4743]: I1011 01:14:40.250025 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96dd3888-06fd-48f8-a0b2-b320851dd83c-config-data\") pod \"nova-cell1-cell-mapping-jckg7\" (UID: \"96dd3888-06fd-48f8-a0b2-b320851dd83c\") " pod="openstack/nova-cell1-cell-mapping-jckg7" Oct 11 01:14:40 crc kubenswrapper[4743]: I1011 01:14:40.250125 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96dd3888-06fd-48f8-a0b2-b320851dd83c-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-jckg7\" (UID: \"96dd3888-06fd-48f8-a0b2-b320851dd83c\") " pod="openstack/nova-cell1-cell-mapping-jckg7" Oct 11 01:14:40 crc kubenswrapper[4743]: I1011 01:14:40.250179 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmc25\" (UniqueName: \"kubernetes.io/projected/96dd3888-06fd-48f8-a0b2-b320851dd83c-kube-api-access-dmc25\") pod 
\"nova-cell1-cell-mapping-jckg7\" (UID: \"96dd3888-06fd-48f8-a0b2-b320851dd83c\") " pod="openstack/nova-cell1-cell-mapping-jckg7" Oct 11 01:14:40 crc kubenswrapper[4743]: I1011 01:14:40.250204 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/96dd3888-06fd-48f8-a0b2-b320851dd83c-scripts\") pod \"nova-cell1-cell-mapping-jckg7\" (UID: \"96dd3888-06fd-48f8-a0b2-b320851dd83c\") " pod="openstack/nova-cell1-cell-mapping-jckg7" Oct 11 01:14:40 crc kubenswrapper[4743]: I1011 01:14:40.260751 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96dd3888-06fd-48f8-a0b2-b320851dd83c-config-data\") pod \"nova-cell1-cell-mapping-jckg7\" (UID: \"96dd3888-06fd-48f8-a0b2-b320851dd83c\") " pod="openstack/nova-cell1-cell-mapping-jckg7" Oct 11 01:14:40 crc kubenswrapper[4743]: I1011 01:14:40.261175 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/96dd3888-06fd-48f8-a0b2-b320851dd83c-scripts\") pod \"nova-cell1-cell-mapping-jckg7\" (UID: \"96dd3888-06fd-48f8-a0b2-b320851dd83c\") " pod="openstack/nova-cell1-cell-mapping-jckg7" Oct 11 01:14:40 crc kubenswrapper[4743]: I1011 01:14:40.261211 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96dd3888-06fd-48f8-a0b2-b320851dd83c-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-jckg7\" (UID: \"96dd3888-06fd-48f8-a0b2-b320851dd83c\") " pod="openstack/nova-cell1-cell-mapping-jckg7" Oct 11 01:14:40 crc kubenswrapper[4743]: I1011 01:14:40.276347 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmc25\" (UniqueName: \"kubernetes.io/projected/96dd3888-06fd-48f8-a0b2-b320851dd83c-kube-api-access-dmc25\") pod \"nova-cell1-cell-mapping-jckg7\" (UID: \"96dd3888-06fd-48f8-a0b2-b320851dd83c\") " 
pod="openstack/nova-cell1-cell-mapping-jckg7" Oct 11 01:14:40 crc kubenswrapper[4743]: I1011 01:14:40.383931 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-jckg7" Oct 11 01:14:40 crc kubenswrapper[4743]: I1011 01:14:40.855316 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-568d7fd7cf-4jblv" Oct 11 01:14:40 crc kubenswrapper[4743]: I1011 01:14:40.895777 4743 generic.go:334] "Generic (PLEG): container finished" podID="6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e" containerID="74190e0dd8ccc2e83c3d5b116f213c662e79d07e807ce83d49955bd900f89bd7" exitCode=0 Oct 11 01:14:40 crc kubenswrapper[4743]: I1011 01:14:40.896757 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-568d7fd7cf-4jblv" Oct 11 01:14:40 crc kubenswrapper[4743]: I1011 01:14:40.897229 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-568d7fd7cf-4jblv" event={"ID":"6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e","Type":"ContainerDied","Data":"74190e0dd8ccc2e83c3d5b116f213c662e79d07e807ce83d49955bd900f89bd7"} Oct 11 01:14:40 crc kubenswrapper[4743]: I1011 01:14:40.897262 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-568d7fd7cf-4jblv" event={"ID":"6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e","Type":"ContainerDied","Data":"8bb997653ee9e1ec1cadd1d7226ae873b8bdfd6950608a45fe3592f870a6bb9f"} Oct 11 01:14:40 crc kubenswrapper[4743]: I1011 01:14:40.897278 4743 scope.go:117] "RemoveContainer" containerID="74190e0dd8ccc2e83c3d5b116f213c662e79d07e807ce83d49955bd900f89bd7" Oct 11 01:14:40 crc kubenswrapper[4743]: I1011 01:14:40.941086 4743 scope.go:117] "RemoveContainer" containerID="aab812d8f91da73c9ce3f9c987b2942b54f867c302cf8ee7930cc507d4bf356b" Oct 11 01:14:41 crc kubenswrapper[4743]: I1011 01:14:41.004679 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e-dns-swift-storage-0\") pod \"6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e\" (UID: \"6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e\") " Oct 11 01:14:41 crc kubenswrapper[4743]: I1011 01:14:41.004825 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e-config\") pod \"6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e\" (UID: \"6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e\") " Oct 11 01:14:41 crc kubenswrapper[4743]: I1011 01:14:41.004964 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e-ovsdbserver-nb\") pod \"6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e\" (UID: \"6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e\") " Oct 11 01:14:41 crc kubenswrapper[4743]: I1011 01:14:41.005062 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e-dns-svc\") pod \"6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e\" (UID: \"6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e\") " Oct 11 01:14:41 crc kubenswrapper[4743]: I1011 01:14:41.005082 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fb9p6\" (UniqueName: \"kubernetes.io/projected/6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e-kube-api-access-fb9p6\") pod \"6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e\" (UID: \"6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e\") " Oct 11 01:14:41 crc kubenswrapper[4743]: I1011 01:14:41.005131 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e-ovsdbserver-sb\") pod \"6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e\" (UID: \"6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e\") " Oct 11 01:14:41 crc 
kubenswrapper[4743]: I1011 01:14:41.032132 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e-kube-api-access-fb9p6" (OuterVolumeSpecName: "kube-api-access-fb9p6") pod "6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e" (UID: "6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e"). InnerVolumeSpecName "kube-api-access-fb9p6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:14:41 crc kubenswrapper[4743]: I1011 01:14:41.032612 4743 scope.go:117] "RemoveContainer" containerID="74190e0dd8ccc2e83c3d5b116f213c662e79d07e807ce83d49955bd900f89bd7" Oct 11 01:14:41 crc kubenswrapper[4743]: E1011 01:14:41.038987 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74190e0dd8ccc2e83c3d5b116f213c662e79d07e807ce83d49955bd900f89bd7\": container with ID starting with 74190e0dd8ccc2e83c3d5b116f213c662e79d07e807ce83d49955bd900f89bd7 not found: ID does not exist" containerID="74190e0dd8ccc2e83c3d5b116f213c662e79d07e807ce83d49955bd900f89bd7" Oct 11 01:14:41 crc kubenswrapper[4743]: I1011 01:14:41.039032 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74190e0dd8ccc2e83c3d5b116f213c662e79d07e807ce83d49955bd900f89bd7"} err="failed to get container status \"74190e0dd8ccc2e83c3d5b116f213c662e79d07e807ce83d49955bd900f89bd7\": rpc error: code = NotFound desc = could not find container \"74190e0dd8ccc2e83c3d5b116f213c662e79d07e807ce83d49955bd900f89bd7\": container with ID starting with 74190e0dd8ccc2e83c3d5b116f213c662e79d07e807ce83d49955bd900f89bd7 not found: ID does not exist" Oct 11 01:14:41 crc kubenswrapper[4743]: I1011 01:14:41.039060 4743 scope.go:117] "RemoveContainer" containerID="aab812d8f91da73c9ce3f9c987b2942b54f867c302cf8ee7930cc507d4bf356b" Oct 11 01:14:41 crc kubenswrapper[4743]: E1011 01:14:41.039434 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: 
code = NotFound desc = could not find container \"aab812d8f91da73c9ce3f9c987b2942b54f867c302cf8ee7930cc507d4bf356b\": container with ID starting with aab812d8f91da73c9ce3f9c987b2942b54f867c302cf8ee7930cc507d4bf356b not found: ID does not exist" containerID="aab812d8f91da73c9ce3f9c987b2942b54f867c302cf8ee7930cc507d4bf356b" Oct 11 01:14:41 crc kubenswrapper[4743]: I1011 01:14:41.039493 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aab812d8f91da73c9ce3f9c987b2942b54f867c302cf8ee7930cc507d4bf356b"} err="failed to get container status \"aab812d8f91da73c9ce3f9c987b2942b54f867c302cf8ee7930cc507d4bf356b\": rpc error: code = NotFound desc = could not find container \"aab812d8f91da73c9ce3f9c987b2942b54f867c302cf8ee7930cc507d4bf356b\": container with ID starting with aab812d8f91da73c9ce3f9c987b2942b54f867c302cf8ee7930cc507d4bf356b not found: ID does not exist" Oct 11 01:14:41 crc kubenswrapper[4743]: I1011 01:14:41.114400 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-jckg7"] Oct 11 01:14:41 crc kubenswrapper[4743]: I1011 01:14:41.114593 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fb9p6\" (UniqueName: \"kubernetes.io/projected/6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e-kube-api-access-fb9p6\") on node \"crc\" DevicePath \"\"" Oct 11 01:14:41 crc kubenswrapper[4743]: I1011 01:14:41.134568 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e-config" (OuterVolumeSpecName: "config") pod "6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e" (UID: "6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:14:41 crc kubenswrapper[4743]: I1011 01:14:41.155807 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e" (UID: "6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:14:41 crc kubenswrapper[4743]: W1011 01:14:41.155960 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod96dd3888_06fd_48f8_a0b2_b320851dd83c.slice/crio-6a0128ce8d5e91d2ec7e9c8222a32ad9c8bbe4c16be5d36ea36e1e5a3b7bd8b9 WatchSource:0}: Error finding container 6a0128ce8d5e91d2ec7e9c8222a32ad9c8bbe4c16be5d36ea36e1e5a3b7bd8b9: Status 404 returned error can't find the container with id 6a0128ce8d5e91d2ec7e9c8222a32ad9c8bbe4c16be5d36ea36e1e5a3b7bd8b9 Oct 11 01:14:41 crc kubenswrapper[4743]: I1011 01:14:41.158659 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e" (UID: "6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:14:41 crc kubenswrapper[4743]: I1011 01:14:41.166990 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e" (UID: "6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:14:41 crc kubenswrapper[4743]: I1011 01:14:41.169601 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e" (UID: "6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:14:41 crc kubenswrapper[4743]: I1011 01:14:41.216207 4743 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 11 01:14:41 crc kubenswrapper[4743]: I1011 01:14:41.216237 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e-config\") on node \"crc\" DevicePath \"\"" Oct 11 01:14:41 crc kubenswrapper[4743]: I1011 01:14:41.216246 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 11 01:14:41 crc kubenswrapper[4743]: I1011 01:14:41.216255 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 11 01:14:41 crc kubenswrapper[4743]: I1011 01:14:41.216265 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 11 01:14:41 crc kubenswrapper[4743]: I1011 01:14:41.232617 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-568d7fd7cf-4jblv"] 
Oct 11 01:14:41 crc kubenswrapper[4743]: I1011 01:14:41.267204 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-568d7fd7cf-4jblv"] Oct 11 01:14:41 crc kubenswrapper[4743]: I1011 01:14:41.914394 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-jckg7" event={"ID":"96dd3888-06fd-48f8-a0b2-b320851dd83c","Type":"ContainerStarted","Data":"b2544f0edc3793fa09e9eeafb6b0d684aa074fd3f9ba192b8139c99fb91ed17f"} Oct 11 01:14:41 crc kubenswrapper[4743]: I1011 01:14:41.914727 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-jckg7" event={"ID":"96dd3888-06fd-48f8-a0b2-b320851dd83c","Type":"ContainerStarted","Data":"6a0128ce8d5e91d2ec7e9c8222a32ad9c8bbe4c16be5d36ea36e1e5a3b7bd8b9"} Oct 11 01:14:41 crc kubenswrapper[4743]: I1011 01:14:41.967701 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-jckg7" podStartSLOduration=1.9676838939999999 podStartE2EDuration="1.967683894s" podCreationTimestamp="2025-10-11 01:14:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 01:14:41.953640727 +0000 UTC m=+1376.606621124" watchObservedRunningTime="2025-10-11 01:14:41.967683894 +0000 UTC m=+1376.620664291" Oct 11 01:14:42 crc kubenswrapper[4743]: I1011 01:14:42.108877 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e" path="/var/lib/kubelet/pods/6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e/volumes" Oct 11 01:14:42 crc kubenswrapper[4743]: I1011 01:14:42.930821 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"15df0508-ff6a-49b2-a9ca-80728adceeed","Type":"ContainerStarted","Data":"4eb366535771ea42329ec46200c035fb66a8fd4ca6c5fa3fc1b05abff4bc1fe5"} Oct 11 01:14:42 crc kubenswrapper[4743]: I1011 01:14:42.930999 4743 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="15df0508-ff6a-49b2-a9ca-80728adceeed" containerName="ceilometer-central-agent" containerID="cri-o://362c681795501aec8d43ce09e7837eebef382d9b2352f1ae2e7a176e2113b024" gracePeriod=30 Oct 11 01:14:42 crc kubenswrapper[4743]: I1011 01:14:42.931013 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="15df0508-ff6a-49b2-a9ca-80728adceeed" containerName="ceilometer-notification-agent" containerID="cri-o://f87f6575ead149a7c51791d8c58d34b33501e2d070690f495eb06f057a97e875" gracePeriod=30 Oct 11 01:14:42 crc kubenswrapper[4743]: I1011 01:14:42.931026 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="15df0508-ff6a-49b2-a9ca-80728adceeed" containerName="proxy-httpd" containerID="cri-o://4eb366535771ea42329ec46200c035fb66a8fd4ca6c5fa3fc1b05abff4bc1fe5" gracePeriod=30 Oct 11 01:14:42 crc kubenswrapper[4743]: I1011 01:14:42.931328 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 11 01:14:42 crc kubenswrapper[4743]: I1011 01:14:42.931058 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="15df0508-ff6a-49b2-a9ca-80728adceeed" containerName="sg-core" containerID="cri-o://7bd09168ae2e684cd0fff7d087d25c45753110e20e21a220d95f12439edb08fb" gracePeriod=30 Oct 11 01:14:42 crc kubenswrapper[4743]: I1011 01:14:42.937331 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"2500dccd-617d-4164-b9f6-5b675bab6848","Type":"ContainerStarted","Data":"247c39c4fe75b005beceaba45f57f61a8e12c590e90ab0ff835fc207e3a65893"} Oct 11 01:14:42 crc kubenswrapper[4743]: I1011 01:14:42.970475 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.43582668 
podStartE2EDuration="8.970448194s" podCreationTimestamp="2025-10-11 01:14:34 +0000 UTC" firstStartedPulling="2025-10-11 01:14:35.682101924 +0000 UTC m=+1370.335082321" lastFinishedPulling="2025-10-11 01:14:42.216723438 +0000 UTC m=+1376.869703835" observedRunningTime="2025-10-11 01:14:42.958575445 +0000 UTC m=+1377.611555842" watchObservedRunningTime="2025-10-11 01:14:42.970448194 +0000 UTC m=+1377.623428601" Oct 11 01:14:44 crc kubenswrapper[4743]: I1011 01:14:44.031249 4743 generic.go:334] "Generic (PLEG): container finished" podID="15df0508-ff6a-49b2-a9ca-80728adceeed" containerID="4eb366535771ea42329ec46200c035fb66a8fd4ca6c5fa3fc1b05abff4bc1fe5" exitCode=0 Oct 11 01:14:44 crc kubenswrapper[4743]: I1011 01:14:44.031665 4743 generic.go:334] "Generic (PLEG): container finished" podID="15df0508-ff6a-49b2-a9ca-80728adceeed" containerID="7bd09168ae2e684cd0fff7d087d25c45753110e20e21a220d95f12439edb08fb" exitCode=2 Oct 11 01:14:44 crc kubenswrapper[4743]: I1011 01:14:44.031675 4743 generic.go:334] "Generic (PLEG): container finished" podID="15df0508-ff6a-49b2-a9ca-80728adceeed" containerID="f87f6575ead149a7c51791d8c58d34b33501e2d070690f495eb06f057a97e875" exitCode=0 Oct 11 01:14:44 crc kubenswrapper[4743]: I1011 01:14:44.031681 4743 generic.go:334] "Generic (PLEG): container finished" podID="15df0508-ff6a-49b2-a9ca-80728adceeed" containerID="362c681795501aec8d43ce09e7837eebef382d9b2352f1ae2e7a176e2113b024" exitCode=0 Oct 11 01:14:44 crc kubenswrapper[4743]: I1011 01:14:44.031701 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"15df0508-ff6a-49b2-a9ca-80728adceeed","Type":"ContainerDied","Data":"4eb366535771ea42329ec46200c035fb66a8fd4ca6c5fa3fc1b05abff4bc1fe5"} Oct 11 01:14:44 crc kubenswrapper[4743]: I1011 01:14:44.031725 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"15df0508-ff6a-49b2-a9ca-80728adceeed","Type":"ContainerDied","Data":"7bd09168ae2e684cd0fff7d087d25c45753110e20e21a220d95f12439edb08fb"} Oct 11 01:14:44 crc kubenswrapper[4743]: I1011 01:14:44.031734 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"15df0508-ff6a-49b2-a9ca-80728adceeed","Type":"ContainerDied","Data":"f87f6575ead149a7c51791d8c58d34b33501e2d070690f495eb06f057a97e875"} Oct 11 01:14:44 crc kubenswrapper[4743]: I1011 01:14:44.031745 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"15df0508-ff6a-49b2-a9ca-80728adceeed","Type":"ContainerDied","Data":"362c681795501aec8d43ce09e7837eebef382d9b2352f1ae2e7a176e2113b024"} Oct 11 01:14:44 crc kubenswrapper[4743]: I1011 01:14:44.031753 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"15df0508-ff6a-49b2-a9ca-80728adceeed","Type":"ContainerDied","Data":"85923eef799bdc3c1d0fc3b00c4cfb3a3e61b46d8a52d84a772807a043772c89"} Oct 11 01:14:44 crc kubenswrapper[4743]: I1011 01:14:44.031764 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="85923eef799bdc3c1d0fc3b00c4cfb3a3e61b46d8a52d84a772807a043772c89" Oct 11 01:14:44 crc kubenswrapper[4743]: I1011 01:14:44.063601 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 11 01:14:44 crc kubenswrapper[4743]: I1011 01:14:44.185606 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15df0508-ff6a-49b2-a9ca-80728adceeed-combined-ca-bundle\") pod \"15df0508-ff6a-49b2-a9ca-80728adceeed\" (UID: \"15df0508-ff6a-49b2-a9ca-80728adceeed\") " Oct 11 01:14:44 crc kubenswrapper[4743]: I1011 01:14:44.185715 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15df0508-ff6a-49b2-a9ca-80728adceeed-run-httpd\") pod \"15df0508-ff6a-49b2-a9ca-80728adceeed\" (UID: \"15df0508-ff6a-49b2-a9ca-80728adceeed\") " Oct 11 01:14:44 crc kubenswrapper[4743]: I1011 01:14:44.186089 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15df0508-ff6a-49b2-a9ca-80728adceeed-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "15df0508-ff6a-49b2-a9ca-80728adceeed" (UID: "15df0508-ff6a-49b2-a9ca-80728adceeed"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:14:44 crc kubenswrapper[4743]: I1011 01:14:44.186141 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wtsg5\" (UniqueName: \"kubernetes.io/projected/15df0508-ff6a-49b2-a9ca-80728adceeed-kube-api-access-wtsg5\") pod \"15df0508-ff6a-49b2-a9ca-80728adceeed\" (UID: \"15df0508-ff6a-49b2-a9ca-80728adceeed\") " Oct 11 01:14:44 crc kubenswrapper[4743]: I1011 01:14:44.186165 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15df0508-ff6a-49b2-a9ca-80728adceeed-scripts\") pod \"15df0508-ff6a-49b2-a9ca-80728adceeed\" (UID: \"15df0508-ff6a-49b2-a9ca-80728adceeed\") " Oct 11 01:14:44 crc kubenswrapper[4743]: I1011 01:14:44.186220 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15df0508-ff6a-49b2-a9ca-80728adceeed-log-httpd\") pod \"15df0508-ff6a-49b2-a9ca-80728adceeed\" (UID: \"15df0508-ff6a-49b2-a9ca-80728adceeed\") " Oct 11 01:14:44 crc kubenswrapper[4743]: I1011 01:14:44.186317 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/15df0508-ff6a-49b2-a9ca-80728adceeed-sg-core-conf-yaml\") pod \"15df0508-ff6a-49b2-a9ca-80728adceeed\" (UID: \"15df0508-ff6a-49b2-a9ca-80728adceeed\") " Oct 11 01:14:44 crc kubenswrapper[4743]: I1011 01:14:44.186341 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15df0508-ff6a-49b2-a9ca-80728adceeed-config-data\") pod \"15df0508-ff6a-49b2-a9ca-80728adceeed\" (UID: \"15df0508-ff6a-49b2-a9ca-80728adceeed\") " Oct 11 01:14:44 crc kubenswrapper[4743]: I1011 01:14:44.186803 4743 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/15df0508-ff6a-49b2-a9ca-80728adceeed-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 11 01:14:44 crc kubenswrapper[4743]: I1011 01:14:44.189842 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15df0508-ff6a-49b2-a9ca-80728adceeed-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "15df0508-ff6a-49b2-a9ca-80728adceeed" (UID: "15df0508-ff6a-49b2-a9ca-80728adceeed"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:14:44 crc kubenswrapper[4743]: I1011 01:14:44.196096 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15df0508-ff6a-49b2-a9ca-80728adceeed-kube-api-access-wtsg5" (OuterVolumeSpecName: "kube-api-access-wtsg5") pod "15df0508-ff6a-49b2-a9ca-80728adceeed" (UID: "15df0508-ff6a-49b2-a9ca-80728adceeed"). InnerVolumeSpecName "kube-api-access-wtsg5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:14:44 crc kubenswrapper[4743]: I1011 01:14:44.196184 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15df0508-ff6a-49b2-a9ca-80728adceeed-scripts" (OuterVolumeSpecName: "scripts") pod "15df0508-ff6a-49b2-a9ca-80728adceeed" (UID: "15df0508-ff6a-49b2-a9ca-80728adceeed"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:14:44 crc kubenswrapper[4743]: I1011 01:14:44.242235 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15df0508-ff6a-49b2-a9ca-80728adceeed-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "15df0508-ff6a-49b2-a9ca-80728adceeed" (UID: "15df0508-ff6a-49b2-a9ca-80728adceeed"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:14:44 crc kubenswrapper[4743]: I1011 01:14:44.292315 4743 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15df0508-ff6a-49b2-a9ca-80728adceeed-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 11 01:14:44 crc kubenswrapper[4743]: I1011 01:14:44.292348 4743 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/15df0508-ff6a-49b2-a9ca-80728adceeed-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 11 01:14:44 crc kubenswrapper[4743]: I1011 01:14:44.292359 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wtsg5\" (UniqueName: \"kubernetes.io/projected/15df0508-ff6a-49b2-a9ca-80728adceeed-kube-api-access-wtsg5\") on node \"crc\" DevicePath \"\"" Oct 11 01:14:44 crc kubenswrapper[4743]: I1011 01:14:44.292367 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15df0508-ff6a-49b2-a9ca-80728adceeed-scripts\") on node \"crc\" DevicePath \"\"" Oct 11 01:14:44 crc kubenswrapper[4743]: I1011 01:14:44.314071 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15df0508-ff6a-49b2-a9ca-80728adceeed-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "15df0508-ff6a-49b2-a9ca-80728adceeed" (UID: "15df0508-ff6a-49b2-a9ca-80728adceeed"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:14:44 crc kubenswrapper[4743]: I1011 01:14:44.339069 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15df0508-ff6a-49b2-a9ca-80728adceeed-config-data" (OuterVolumeSpecName: "config-data") pod "15df0508-ff6a-49b2-a9ca-80728adceeed" (UID: "15df0508-ff6a-49b2-a9ca-80728adceeed"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:14:44 crc kubenswrapper[4743]: I1011 01:14:44.394005 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15df0508-ff6a-49b2-a9ca-80728adceeed-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 01:14:44 crc kubenswrapper[4743]: I1011 01:14:44.394294 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15df0508-ff6a-49b2-a9ca-80728adceeed-config-data\") on node \"crc\" DevicePath \"\"" Oct 11 01:14:45 crc kubenswrapper[4743]: I1011 01:14:45.050459 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 11 01:14:45 crc kubenswrapper[4743]: I1011 01:14:45.050515 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"2500dccd-617d-4164-b9f6-5b675bab6848","Type":"ContainerStarted","Data":"f4979c1b2742d60f73ba7e21f5e233e8c8299e13cf264183d5b33228e648515e"} Oct 11 01:14:45 crc kubenswrapper[4743]: I1011 01:14:45.050984 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="2500dccd-617d-4164-b9f6-5b675bab6848" containerName="aodh-api" containerID="cri-o://eb4ccaf3fb26fca9439b311b8cb335c04270a0a3871b4f22cf7c7437ec0e7e74" gracePeriod=30 Oct 11 01:14:45 crc kubenswrapper[4743]: I1011 01:14:45.051017 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="2500dccd-617d-4164-b9f6-5b675bab6848" containerName="aodh-notifier" containerID="cri-o://247c39c4fe75b005beceaba45f57f61a8e12c590e90ab0ff835fc207e3a65893" gracePeriod=30 Oct 11 01:14:45 crc kubenswrapper[4743]: I1011 01:14:45.051049 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="2500dccd-617d-4164-b9f6-5b675bab6848" containerName="aodh-evaluator" 
containerID="cri-o://d0b8e39936f9f2d8724ca6b9865d50e050a156eca6c8df578714ef6f7d2f8499" gracePeriod=30 Oct 11 01:14:45 crc kubenswrapper[4743]: I1011 01:14:45.051117 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="2500dccd-617d-4164-b9f6-5b675bab6848" containerName="aodh-listener" containerID="cri-o://f4979c1b2742d60f73ba7e21f5e233e8c8299e13cf264183d5b33228e648515e" gracePeriod=30 Oct 11 01:14:45 crc kubenswrapper[4743]: I1011 01:14:45.114427 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=3.003978366 podStartE2EDuration="11.114407593s" podCreationTimestamp="2025-10-11 01:14:34 +0000 UTC" firstStartedPulling="2025-10-11 01:14:35.806301725 +0000 UTC m=+1370.459282122" lastFinishedPulling="2025-10-11 01:14:43.916730952 +0000 UTC m=+1378.569711349" observedRunningTime="2025-10-11 01:14:45.081500875 +0000 UTC m=+1379.734481272" watchObservedRunningTime="2025-10-11 01:14:45.114407593 +0000 UTC m=+1379.767387990" Oct 11 01:14:45 crc kubenswrapper[4743]: I1011 01:14:45.159487 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 11 01:14:45 crc kubenswrapper[4743]: I1011 01:14:45.179362 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 11 01:14:45 crc kubenswrapper[4743]: I1011 01:14:45.193901 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 11 01:14:45 crc kubenswrapper[4743]: E1011 01:14:45.194374 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e" containerName="dnsmasq-dns" Oct 11 01:14:45 crc kubenswrapper[4743]: I1011 01:14:45.194390 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e" containerName="dnsmasq-dns" Oct 11 01:14:45 crc kubenswrapper[4743]: E1011 01:14:45.194410 4743 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="15df0508-ff6a-49b2-a9ca-80728adceeed" containerName="ceilometer-central-agent" Oct 11 01:14:45 crc kubenswrapper[4743]: I1011 01:14:45.194416 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="15df0508-ff6a-49b2-a9ca-80728adceeed" containerName="ceilometer-central-agent" Oct 11 01:14:45 crc kubenswrapper[4743]: E1011 01:14:45.194425 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15df0508-ff6a-49b2-a9ca-80728adceeed" containerName="ceilometer-notification-agent" Oct 11 01:14:45 crc kubenswrapper[4743]: I1011 01:14:45.194431 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="15df0508-ff6a-49b2-a9ca-80728adceeed" containerName="ceilometer-notification-agent" Oct 11 01:14:45 crc kubenswrapper[4743]: E1011 01:14:45.194449 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15df0508-ff6a-49b2-a9ca-80728adceeed" containerName="proxy-httpd" Oct 11 01:14:45 crc kubenswrapper[4743]: I1011 01:14:45.194455 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="15df0508-ff6a-49b2-a9ca-80728adceeed" containerName="proxy-httpd" Oct 11 01:14:45 crc kubenswrapper[4743]: E1011 01:14:45.194466 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e" containerName="init" Oct 11 01:14:45 crc kubenswrapper[4743]: I1011 01:14:45.194472 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e" containerName="init" Oct 11 01:14:45 crc kubenswrapper[4743]: E1011 01:14:45.194481 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15df0508-ff6a-49b2-a9ca-80728adceeed" containerName="sg-core" Oct 11 01:14:45 crc kubenswrapper[4743]: I1011 01:14:45.194487 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="15df0508-ff6a-49b2-a9ca-80728adceeed" containerName="sg-core" Oct 11 01:14:45 crc kubenswrapper[4743]: I1011 01:14:45.195636 4743 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="6fd3ff73-a9d9-493a-b4d2-1a8a73cb8d5e" containerName="dnsmasq-dns" Oct 11 01:14:45 crc kubenswrapper[4743]: I1011 01:14:45.195679 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="15df0508-ff6a-49b2-a9ca-80728adceeed" containerName="ceilometer-notification-agent" Oct 11 01:14:45 crc kubenswrapper[4743]: I1011 01:14:45.195699 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="15df0508-ff6a-49b2-a9ca-80728adceeed" containerName="proxy-httpd" Oct 11 01:14:45 crc kubenswrapper[4743]: I1011 01:14:45.195711 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="15df0508-ff6a-49b2-a9ca-80728adceeed" containerName="sg-core" Oct 11 01:14:45 crc kubenswrapper[4743]: I1011 01:14:45.195718 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="15df0508-ff6a-49b2-a9ca-80728adceeed" containerName="ceilometer-central-agent" Oct 11 01:14:45 crc kubenswrapper[4743]: I1011 01:14:45.197748 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 11 01:14:45 crc kubenswrapper[4743]: I1011 01:14:45.200066 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 11 01:14:45 crc kubenswrapper[4743]: I1011 01:14:45.202315 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 11 01:14:45 crc kubenswrapper[4743]: I1011 01:14:45.205625 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 11 01:14:45 crc kubenswrapper[4743]: I1011 01:14:45.332210 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e83a1d4-6bd8-42a3-bb28-e287346008eb-config-data\") pod \"ceilometer-0\" (UID: \"6e83a1d4-6bd8-42a3-bb28-e287346008eb\") " pod="openstack/ceilometer-0" Oct 11 01:14:45 crc kubenswrapper[4743]: I1011 01:14:45.332301 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6e83a1d4-6bd8-42a3-bb28-e287346008eb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6e83a1d4-6bd8-42a3-bb28-e287346008eb\") " pod="openstack/ceilometer-0" Oct 11 01:14:45 crc kubenswrapper[4743]: I1011 01:14:45.332348 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxkgt\" (UniqueName: \"kubernetes.io/projected/6e83a1d4-6bd8-42a3-bb28-e287346008eb-kube-api-access-xxkgt\") pod \"ceilometer-0\" (UID: \"6e83a1d4-6bd8-42a3-bb28-e287346008eb\") " pod="openstack/ceilometer-0" Oct 11 01:14:45 crc kubenswrapper[4743]: I1011 01:14:45.332379 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e83a1d4-6bd8-42a3-bb28-e287346008eb-run-httpd\") pod \"ceilometer-0\" (UID: 
\"6e83a1d4-6bd8-42a3-bb28-e287346008eb\") " pod="openstack/ceilometer-0" Oct 11 01:14:45 crc kubenswrapper[4743]: I1011 01:14:45.332402 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e83a1d4-6bd8-42a3-bb28-e287346008eb-log-httpd\") pod \"ceilometer-0\" (UID: \"6e83a1d4-6bd8-42a3-bb28-e287346008eb\") " pod="openstack/ceilometer-0" Oct 11 01:14:45 crc kubenswrapper[4743]: I1011 01:14:45.332427 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e83a1d4-6bd8-42a3-bb28-e287346008eb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6e83a1d4-6bd8-42a3-bb28-e287346008eb\") " pod="openstack/ceilometer-0" Oct 11 01:14:45 crc kubenswrapper[4743]: I1011 01:14:45.332445 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e83a1d4-6bd8-42a3-bb28-e287346008eb-scripts\") pod \"ceilometer-0\" (UID: \"6e83a1d4-6bd8-42a3-bb28-e287346008eb\") " pod="openstack/ceilometer-0" Oct 11 01:14:45 crc kubenswrapper[4743]: I1011 01:14:45.434233 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e83a1d4-6bd8-42a3-bb28-e287346008eb-run-httpd\") pod \"ceilometer-0\" (UID: \"6e83a1d4-6bd8-42a3-bb28-e287346008eb\") " pod="openstack/ceilometer-0" Oct 11 01:14:45 crc kubenswrapper[4743]: I1011 01:14:45.434290 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e83a1d4-6bd8-42a3-bb28-e287346008eb-log-httpd\") pod \"ceilometer-0\" (UID: \"6e83a1d4-6bd8-42a3-bb28-e287346008eb\") " pod="openstack/ceilometer-0" Oct 11 01:14:45 crc kubenswrapper[4743]: I1011 01:14:45.434330 4743 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e83a1d4-6bd8-42a3-bb28-e287346008eb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6e83a1d4-6bd8-42a3-bb28-e287346008eb\") " pod="openstack/ceilometer-0" Oct 11 01:14:45 crc kubenswrapper[4743]: I1011 01:14:45.434353 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e83a1d4-6bd8-42a3-bb28-e287346008eb-scripts\") pod \"ceilometer-0\" (UID: \"6e83a1d4-6bd8-42a3-bb28-e287346008eb\") " pod="openstack/ceilometer-0" Oct 11 01:14:45 crc kubenswrapper[4743]: I1011 01:14:45.434401 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e83a1d4-6bd8-42a3-bb28-e287346008eb-config-data\") pod \"ceilometer-0\" (UID: \"6e83a1d4-6bd8-42a3-bb28-e287346008eb\") " pod="openstack/ceilometer-0" Oct 11 01:14:45 crc kubenswrapper[4743]: I1011 01:14:45.434469 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6e83a1d4-6bd8-42a3-bb28-e287346008eb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6e83a1d4-6bd8-42a3-bb28-e287346008eb\") " pod="openstack/ceilometer-0" Oct 11 01:14:45 crc kubenswrapper[4743]: I1011 01:14:45.434516 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxkgt\" (UniqueName: \"kubernetes.io/projected/6e83a1d4-6bd8-42a3-bb28-e287346008eb-kube-api-access-xxkgt\") pod \"ceilometer-0\" (UID: \"6e83a1d4-6bd8-42a3-bb28-e287346008eb\") " pod="openstack/ceilometer-0" Oct 11 01:14:45 crc kubenswrapper[4743]: I1011 01:14:45.434699 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e83a1d4-6bd8-42a3-bb28-e287346008eb-run-httpd\") pod \"ceilometer-0\" (UID: \"6e83a1d4-6bd8-42a3-bb28-e287346008eb\") " pod="openstack/ceilometer-0" Oct 
11 01:14:45 crc kubenswrapper[4743]: I1011 01:14:45.434809 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e83a1d4-6bd8-42a3-bb28-e287346008eb-log-httpd\") pod \"ceilometer-0\" (UID: \"6e83a1d4-6bd8-42a3-bb28-e287346008eb\") " pod="openstack/ceilometer-0" Oct 11 01:14:45 crc kubenswrapper[4743]: I1011 01:14:45.440034 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e83a1d4-6bd8-42a3-bb28-e287346008eb-scripts\") pod \"ceilometer-0\" (UID: \"6e83a1d4-6bd8-42a3-bb28-e287346008eb\") " pod="openstack/ceilometer-0" Oct 11 01:14:45 crc kubenswrapper[4743]: I1011 01:14:45.440677 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e83a1d4-6bd8-42a3-bb28-e287346008eb-config-data\") pod \"ceilometer-0\" (UID: \"6e83a1d4-6bd8-42a3-bb28-e287346008eb\") " pod="openstack/ceilometer-0" Oct 11 01:14:45 crc kubenswrapper[4743]: I1011 01:14:45.441010 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e83a1d4-6bd8-42a3-bb28-e287346008eb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6e83a1d4-6bd8-42a3-bb28-e287346008eb\") " pod="openstack/ceilometer-0" Oct 11 01:14:45 crc kubenswrapper[4743]: I1011 01:14:45.441137 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6e83a1d4-6bd8-42a3-bb28-e287346008eb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6e83a1d4-6bd8-42a3-bb28-e287346008eb\") " pod="openstack/ceilometer-0" Oct 11 01:14:45 crc kubenswrapper[4743]: I1011 01:14:45.450467 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxkgt\" (UniqueName: \"kubernetes.io/projected/6e83a1d4-6bd8-42a3-bb28-e287346008eb-kube-api-access-xxkgt\") pod \"ceilometer-0\" 
(UID: \"6e83a1d4-6bd8-42a3-bb28-e287346008eb\") " pod="openstack/ceilometer-0" Oct 11 01:14:45 crc kubenswrapper[4743]: I1011 01:14:45.520457 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 11 01:14:46 crc kubenswrapper[4743]: I1011 01:14:46.059940 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 11 01:14:46 crc kubenswrapper[4743]: I1011 01:14:46.069049 4743 generic.go:334] "Generic (PLEG): container finished" podID="2500dccd-617d-4164-b9f6-5b675bab6848" containerID="247c39c4fe75b005beceaba45f57f61a8e12c590e90ab0ff835fc207e3a65893" exitCode=0 Oct 11 01:14:46 crc kubenswrapper[4743]: I1011 01:14:46.069075 4743 generic.go:334] "Generic (PLEG): container finished" podID="2500dccd-617d-4164-b9f6-5b675bab6848" containerID="d0b8e39936f9f2d8724ca6b9865d50e050a156eca6c8df578714ef6f7d2f8499" exitCode=0 Oct 11 01:14:46 crc kubenswrapper[4743]: I1011 01:14:46.069084 4743 generic.go:334] "Generic (PLEG): container finished" podID="2500dccd-617d-4164-b9f6-5b675bab6848" containerID="eb4ccaf3fb26fca9439b311b8cb335c04270a0a3871b4f22cf7c7437ec0e7e74" exitCode=0 Oct 11 01:14:46 crc kubenswrapper[4743]: I1011 01:14:46.069106 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"2500dccd-617d-4164-b9f6-5b675bab6848","Type":"ContainerDied","Data":"247c39c4fe75b005beceaba45f57f61a8e12c590e90ab0ff835fc207e3a65893"} Oct 11 01:14:46 crc kubenswrapper[4743]: I1011 01:14:46.069133 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"2500dccd-617d-4164-b9f6-5b675bab6848","Type":"ContainerDied","Data":"d0b8e39936f9f2d8724ca6b9865d50e050a156eca6c8df578714ef6f7d2f8499"} Oct 11 01:14:46 crc kubenswrapper[4743]: I1011 01:14:46.069143 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" 
event={"ID":"2500dccd-617d-4164-b9f6-5b675bab6848","Type":"ContainerDied","Data":"eb4ccaf3fb26fca9439b311b8cb335c04270a0a3871b4f22cf7c7437ec0e7e74"} Oct 11 01:14:46 crc kubenswrapper[4743]: W1011 01:14:46.072672 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e83a1d4_6bd8_42a3_bb28_e287346008eb.slice/crio-c544add559f535a3bac8c2f15320fac6c07420bc9bf9cab1f9632f240d303984 WatchSource:0}: Error finding container c544add559f535a3bac8c2f15320fac6c07420bc9bf9cab1f9632f240d303984: Status 404 returned error can't find the container with id c544add559f535a3bac8c2f15320fac6c07420bc9bf9cab1f9632f240d303984 Oct 11 01:14:46 crc kubenswrapper[4743]: I1011 01:14:46.103509 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15df0508-ff6a-49b2-a9ca-80728adceeed" path="/var/lib/kubelet/pods/15df0508-ff6a-49b2-a9ca-80728adceeed/volumes" Oct 11 01:14:47 crc kubenswrapper[4743]: I1011 01:14:47.089626 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e83a1d4-6bd8-42a3-bb28-e287346008eb","Type":"ContainerStarted","Data":"07c51e8b3fe3126872d6ed9d8747571270d6daa2db06adcb8ed6c8bebb399abe"} Oct 11 01:14:47 crc kubenswrapper[4743]: I1011 01:14:47.090378 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e83a1d4-6bd8-42a3-bb28-e287346008eb","Type":"ContainerStarted","Data":"c544add559f535a3bac8c2f15320fac6c07420bc9bf9cab1f9632f240d303984"} Oct 11 01:14:47 crc kubenswrapper[4743]: I1011 01:14:47.092541 4743 generic.go:334] "Generic (PLEG): container finished" podID="96dd3888-06fd-48f8-a0b2-b320851dd83c" containerID="b2544f0edc3793fa09e9eeafb6b0d684aa074fd3f9ba192b8139c99fb91ed17f" exitCode=0 Oct 11 01:14:47 crc kubenswrapper[4743]: I1011 01:14:47.092603 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-jckg7" 
event={"ID":"96dd3888-06fd-48f8-a0b2-b320851dd83c","Type":"ContainerDied","Data":"b2544f0edc3793fa09e9eeafb6b0d684aa074fd3f9ba192b8139c99fb91ed17f"} Oct 11 01:14:47 crc kubenswrapper[4743]: I1011 01:14:47.130255 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 11 01:14:47 crc kubenswrapper[4743]: I1011 01:14:47.130309 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 11 01:14:48 crc kubenswrapper[4743]: I1011 01:14:48.121085 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e83a1d4-6bd8-42a3-bb28-e287346008eb","Type":"ContainerStarted","Data":"796327c276dad74f5ee54a09941f46c4f158e69a7744a237b421738162fb00ef"} Oct 11 01:14:48 crc kubenswrapper[4743]: I1011 01:14:48.148178 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d840543d-ffc6-405a-8e01-e89d6f237820" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.233:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 11 01:14:48 crc kubenswrapper[4743]: I1011 01:14:48.149462 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d840543d-ffc6-405a-8e01-e89d6f237820" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.233:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 11 01:14:48 crc kubenswrapper[4743]: I1011 01:14:48.640316 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-jckg7" Oct 11 01:14:48 crc kubenswrapper[4743]: I1011 01:14:48.699323 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dmc25\" (UniqueName: \"kubernetes.io/projected/96dd3888-06fd-48f8-a0b2-b320851dd83c-kube-api-access-dmc25\") pod \"96dd3888-06fd-48f8-a0b2-b320851dd83c\" (UID: \"96dd3888-06fd-48f8-a0b2-b320851dd83c\") " Oct 11 01:14:48 crc kubenswrapper[4743]: I1011 01:14:48.699391 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96dd3888-06fd-48f8-a0b2-b320851dd83c-combined-ca-bundle\") pod \"96dd3888-06fd-48f8-a0b2-b320851dd83c\" (UID: \"96dd3888-06fd-48f8-a0b2-b320851dd83c\") " Oct 11 01:14:48 crc kubenswrapper[4743]: I1011 01:14:48.699426 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96dd3888-06fd-48f8-a0b2-b320851dd83c-config-data\") pod \"96dd3888-06fd-48f8-a0b2-b320851dd83c\" (UID: \"96dd3888-06fd-48f8-a0b2-b320851dd83c\") " Oct 11 01:14:48 crc kubenswrapper[4743]: I1011 01:14:48.700162 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/96dd3888-06fd-48f8-a0b2-b320851dd83c-scripts\") pod \"96dd3888-06fd-48f8-a0b2-b320851dd83c\" (UID: \"96dd3888-06fd-48f8-a0b2-b320851dd83c\") " Oct 11 01:14:48 crc kubenswrapper[4743]: I1011 01:14:48.705467 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96dd3888-06fd-48f8-a0b2-b320851dd83c-scripts" (OuterVolumeSpecName: "scripts") pod "96dd3888-06fd-48f8-a0b2-b320851dd83c" (UID: "96dd3888-06fd-48f8-a0b2-b320851dd83c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:14:48 crc kubenswrapper[4743]: I1011 01:14:48.705922 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96dd3888-06fd-48f8-a0b2-b320851dd83c-kube-api-access-dmc25" (OuterVolumeSpecName: "kube-api-access-dmc25") pod "96dd3888-06fd-48f8-a0b2-b320851dd83c" (UID: "96dd3888-06fd-48f8-a0b2-b320851dd83c"). InnerVolumeSpecName "kube-api-access-dmc25". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:14:48 crc kubenswrapper[4743]: I1011 01:14:48.728529 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96dd3888-06fd-48f8-a0b2-b320851dd83c-config-data" (OuterVolumeSpecName: "config-data") pod "96dd3888-06fd-48f8-a0b2-b320851dd83c" (UID: "96dd3888-06fd-48f8-a0b2-b320851dd83c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:14:48 crc kubenswrapper[4743]: I1011 01:14:48.741538 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96dd3888-06fd-48f8-a0b2-b320851dd83c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "96dd3888-06fd-48f8-a0b2-b320851dd83c" (UID: "96dd3888-06fd-48f8-a0b2-b320851dd83c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:14:48 crc kubenswrapper[4743]: I1011 01:14:48.803111 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dmc25\" (UniqueName: \"kubernetes.io/projected/96dd3888-06fd-48f8-a0b2-b320851dd83c-kube-api-access-dmc25\") on node \"crc\" DevicePath \"\"" Oct 11 01:14:48 crc kubenswrapper[4743]: I1011 01:14:48.803159 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96dd3888-06fd-48f8-a0b2-b320851dd83c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 01:14:48 crc kubenswrapper[4743]: I1011 01:14:48.803176 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96dd3888-06fd-48f8-a0b2-b320851dd83c-config-data\") on node \"crc\" DevicePath \"\"" Oct 11 01:14:48 crc kubenswrapper[4743]: I1011 01:14:48.803189 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/96dd3888-06fd-48f8-a0b2-b320851dd83c-scripts\") on node \"crc\" DevicePath \"\"" Oct 11 01:14:49 crc kubenswrapper[4743]: I1011 01:14:49.134471 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-jckg7" event={"ID":"96dd3888-06fd-48f8-a0b2-b320851dd83c","Type":"ContainerDied","Data":"6a0128ce8d5e91d2ec7e9c8222a32ad9c8bbe4c16be5d36ea36e1e5a3b7bd8b9"} Oct 11 01:14:49 crc kubenswrapper[4743]: I1011 01:14:49.134506 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a0128ce8d5e91d2ec7e9c8222a32ad9c8bbe4c16be5d36ea36e1e5a3b7bd8b9" Oct 11 01:14:49 crc kubenswrapper[4743]: I1011 01:14:49.135912 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-jckg7" Oct 11 01:14:49 crc kubenswrapper[4743]: I1011 01:14:49.136788 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e83a1d4-6bd8-42a3-bb28-e287346008eb","Type":"ContainerStarted","Data":"52bbd5b0da23de52947433753c9d905103f50933a78d3511cb78abff8c33ad6c"} Oct 11 01:14:49 crc kubenswrapper[4743]: I1011 01:14:49.307977 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 11 01:14:49 crc kubenswrapper[4743]: I1011 01:14:49.308196 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d840543d-ffc6-405a-8e01-e89d6f237820" containerName="nova-api-log" containerID="cri-o://37c168db353224216e44b6c5a7a662e54d1f6647d619bdc64991536f270d4d38" gracePeriod=30 Oct 11 01:14:49 crc kubenswrapper[4743]: I1011 01:14:49.308605 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d840543d-ffc6-405a-8e01-e89d6f237820" containerName="nova-api-api" containerID="cri-o://7b354e26e1d24c4652be4a6a1cebd923176c80fa9eb40c4ba4ac0b9e0281807d" gracePeriod=30 Oct 11 01:14:49 crc kubenswrapper[4743]: I1011 01:14:49.343902 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 11 01:14:49 crc kubenswrapper[4743]: I1011 01:14:49.344124 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="006c6efe-f2d3-4ce9-9a99-335f3830c7a8" containerName="nova-scheduler-scheduler" containerID="cri-o://39d8a64c03d232369b7c9a54d892dfc59a3a473201af45f8213daebdfe49e5ee" gracePeriod=30 Oct 11 01:14:49 crc kubenswrapper[4743]: I1011 01:14:49.365841 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 11 01:14:49 crc kubenswrapper[4743]: I1011 01:14:49.366051 4743 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/nova-metadata-0" podUID="d793d5fe-737a-4217-a456-da9e894fe4f6" containerName="nova-metadata-log" containerID="cri-o://ede02981f3e2e33c2b02a8ec26332116f62d787b57e8852eed1b0667c12ef36b" gracePeriod=30 Oct 11 01:14:49 crc kubenswrapper[4743]: I1011 01:14:49.366470 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d793d5fe-737a-4217-a456-da9e894fe4f6" containerName="nova-metadata-metadata" containerID="cri-o://ec265beae7c1aed3e04ae57daf470fe63e995730fc233089dfda473fec83b875" gracePeriod=30 Oct 11 01:14:49 crc kubenswrapper[4743]: E1011 01:14:49.946486 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="39d8a64c03d232369b7c9a54d892dfc59a3a473201af45f8213daebdfe49e5ee" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 11 01:14:49 crc kubenswrapper[4743]: E1011 01:14:49.955291 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="39d8a64c03d232369b7c9a54d892dfc59a3a473201af45f8213daebdfe49e5ee" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 11 01:14:49 crc kubenswrapper[4743]: E1011 01:14:49.961948 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="39d8a64c03d232369b7c9a54d892dfc59a3a473201af45f8213daebdfe49e5ee" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 11 01:14:49 crc kubenswrapper[4743]: E1011 01:14:49.962040 4743 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="006c6efe-f2d3-4ce9-9a99-335f3830c7a8" containerName="nova-scheduler-scheduler" Oct 11 01:14:50 crc kubenswrapper[4743]: I1011 01:14:50.152887 4743 generic.go:334] "Generic (PLEG): container finished" podID="d793d5fe-737a-4217-a456-da9e894fe4f6" containerID="ede02981f3e2e33c2b02a8ec26332116f62d787b57e8852eed1b0667c12ef36b" exitCode=143 Oct 11 01:14:50 crc kubenswrapper[4743]: I1011 01:14:50.152973 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d793d5fe-737a-4217-a456-da9e894fe4f6","Type":"ContainerDied","Data":"ede02981f3e2e33c2b02a8ec26332116f62d787b57e8852eed1b0667c12ef36b"} Oct 11 01:14:50 crc kubenswrapper[4743]: I1011 01:14:50.155308 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e83a1d4-6bd8-42a3-bb28-e287346008eb","Type":"ContainerStarted","Data":"9d5adba1ea7452bbfa5bb0da625a06575600b71d527247e283031273dfb30694"} Oct 11 01:14:50 crc kubenswrapper[4743]: I1011 01:14:50.155455 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 11 01:14:50 crc kubenswrapper[4743]: I1011 01:14:50.158327 4743 generic.go:334] "Generic (PLEG): container finished" podID="d840543d-ffc6-405a-8e01-e89d6f237820" containerID="37c168db353224216e44b6c5a7a662e54d1f6647d619bdc64991536f270d4d38" exitCode=143 Oct 11 01:14:50 crc kubenswrapper[4743]: I1011 01:14:50.158404 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d840543d-ffc6-405a-8e01-e89d6f237820","Type":"ContainerDied","Data":"37c168db353224216e44b6c5a7a662e54d1f6647d619bdc64991536f270d4d38"} Oct 11 01:14:50 crc kubenswrapper[4743]: I1011 01:14:50.184021 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.7985657590000002 podStartE2EDuration="5.18399114s" podCreationTimestamp="2025-10-11 01:14:45 +0000 
UTC" firstStartedPulling="2025-10-11 01:14:46.074905861 +0000 UTC m=+1380.727886258" lastFinishedPulling="2025-10-11 01:14:49.460331242 +0000 UTC m=+1384.113311639" observedRunningTime="2025-10-11 01:14:50.172472988 +0000 UTC m=+1384.825453395" watchObservedRunningTime="2025-10-11 01:14:50.18399114 +0000 UTC m=+1384.836971547" Oct 11 01:14:52 crc kubenswrapper[4743]: I1011 01:14:52.531576 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="d793d5fe-737a-4217-a456-da9e894fe4f6" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.224:8775/\": read tcp 10.217.0.2:53368->10.217.0.224:8775: read: connection reset by peer" Oct 11 01:14:52 crc kubenswrapper[4743]: I1011 01:14:52.532180 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="d793d5fe-737a-4217-a456-da9e894fe4f6" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.224:8775/\": read tcp 10.217.0.2:53376->10.217.0.224:8775: read: connection reset by peer" Oct 11 01:14:53 crc kubenswrapper[4743]: I1011 01:14:53.161489 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 11 01:14:53 crc kubenswrapper[4743]: I1011 01:14:53.216034 4743 generic.go:334] "Generic (PLEG): container finished" podID="d793d5fe-737a-4217-a456-da9e894fe4f6" containerID="ec265beae7c1aed3e04ae57daf470fe63e995730fc233089dfda473fec83b875" exitCode=0 Oct 11 01:14:53 crc kubenswrapper[4743]: I1011 01:14:53.216086 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d793d5fe-737a-4217-a456-da9e894fe4f6","Type":"ContainerDied","Data":"ec265beae7c1aed3e04ae57daf470fe63e995730fc233089dfda473fec83b875"} Oct 11 01:14:53 crc kubenswrapper[4743]: I1011 01:14:53.216128 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d793d5fe-737a-4217-a456-da9e894fe4f6","Type":"ContainerDied","Data":"ddef3838cf5a006593af9ccee9a34115dbc92e3ead78a10dedae00593e259159"} Oct 11 01:14:53 crc kubenswrapper[4743]: I1011 01:14:53.216178 4743 scope.go:117] "RemoveContainer" containerID="ec265beae7c1aed3e04ae57daf470fe63e995730fc233089dfda473fec83b875" Oct 11 01:14:53 crc kubenswrapper[4743]: I1011 01:14:53.216305 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 11 01:14:53 crc kubenswrapper[4743]: I1011 01:14:53.247089 4743 scope.go:117] "RemoveContainer" containerID="ede02981f3e2e33c2b02a8ec26332116f62d787b57e8852eed1b0667c12ef36b" Oct 11 01:14:53 crc kubenswrapper[4743]: I1011 01:14:53.281236 4743 scope.go:117] "RemoveContainer" containerID="ec265beae7c1aed3e04ae57daf470fe63e995730fc233089dfda473fec83b875" Oct 11 01:14:53 crc kubenswrapper[4743]: E1011 01:14:53.282194 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec265beae7c1aed3e04ae57daf470fe63e995730fc233089dfda473fec83b875\": container with ID starting with ec265beae7c1aed3e04ae57daf470fe63e995730fc233089dfda473fec83b875 not found: ID does not exist" containerID="ec265beae7c1aed3e04ae57daf470fe63e995730fc233089dfda473fec83b875" Oct 11 01:14:53 crc kubenswrapper[4743]: I1011 01:14:53.282227 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec265beae7c1aed3e04ae57daf470fe63e995730fc233089dfda473fec83b875"} err="failed to get container status \"ec265beae7c1aed3e04ae57daf470fe63e995730fc233089dfda473fec83b875\": rpc error: code = NotFound desc = could not find container \"ec265beae7c1aed3e04ae57daf470fe63e995730fc233089dfda473fec83b875\": container with ID starting with ec265beae7c1aed3e04ae57daf470fe63e995730fc233089dfda473fec83b875 not found: ID does not exist" Oct 11 01:14:53 crc kubenswrapper[4743]: I1011 01:14:53.282249 4743 scope.go:117] "RemoveContainer" containerID="ede02981f3e2e33c2b02a8ec26332116f62d787b57e8852eed1b0667c12ef36b" Oct 11 01:14:53 crc kubenswrapper[4743]: E1011 01:14:53.282714 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ede02981f3e2e33c2b02a8ec26332116f62d787b57e8852eed1b0667c12ef36b\": container with ID starting with 
ede02981f3e2e33c2b02a8ec26332116f62d787b57e8852eed1b0667c12ef36b not found: ID does not exist" containerID="ede02981f3e2e33c2b02a8ec26332116f62d787b57e8852eed1b0667c12ef36b" Oct 11 01:14:53 crc kubenswrapper[4743]: I1011 01:14:53.282753 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ede02981f3e2e33c2b02a8ec26332116f62d787b57e8852eed1b0667c12ef36b"} err="failed to get container status \"ede02981f3e2e33c2b02a8ec26332116f62d787b57e8852eed1b0667c12ef36b\": rpc error: code = NotFound desc = could not find container \"ede02981f3e2e33c2b02a8ec26332116f62d787b57e8852eed1b0667c12ef36b\": container with ID starting with ede02981f3e2e33c2b02a8ec26332116f62d787b57e8852eed1b0667c12ef36b not found: ID does not exist" Oct 11 01:14:53 crc kubenswrapper[4743]: I1011 01:14:53.301612 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-llt4r\" (UniqueName: \"kubernetes.io/projected/d793d5fe-737a-4217-a456-da9e894fe4f6-kube-api-access-llt4r\") pod \"d793d5fe-737a-4217-a456-da9e894fe4f6\" (UID: \"d793d5fe-737a-4217-a456-da9e894fe4f6\") " Oct 11 01:14:53 crc kubenswrapper[4743]: I1011 01:14:53.301765 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d793d5fe-737a-4217-a456-da9e894fe4f6-nova-metadata-tls-certs\") pod \"d793d5fe-737a-4217-a456-da9e894fe4f6\" (UID: \"d793d5fe-737a-4217-a456-da9e894fe4f6\") " Oct 11 01:14:53 crc kubenswrapper[4743]: I1011 01:14:53.301843 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d793d5fe-737a-4217-a456-da9e894fe4f6-config-data\") pod \"d793d5fe-737a-4217-a456-da9e894fe4f6\" (UID: \"d793d5fe-737a-4217-a456-da9e894fe4f6\") " Oct 11 01:14:53 crc kubenswrapper[4743]: I1011 01:14:53.301875 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/d793d5fe-737a-4217-a456-da9e894fe4f6-logs\") pod \"d793d5fe-737a-4217-a456-da9e894fe4f6\" (UID: \"d793d5fe-737a-4217-a456-da9e894fe4f6\") " Oct 11 01:14:53 crc kubenswrapper[4743]: I1011 01:14:53.302025 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d793d5fe-737a-4217-a456-da9e894fe4f6-combined-ca-bundle\") pod \"d793d5fe-737a-4217-a456-da9e894fe4f6\" (UID: \"d793d5fe-737a-4217-a456-da9e894fe4f6\") " Oct 11 01:14:53 crc kubenswrapper[4743]: I1011 01:14:53.308506 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d793d5fe-737a-4217-a456-da9e894fe4f6-kube-api-access-llt4r" (OuterVolumeSpecName: "kube-api-access-llt4r") pod "d793d5fe-737a-4217-a456-da9e894fe4f6" (UID: "d793d5fe-737a-4217-a456-da9e894fe4f6"). InnerVolumeSpecName "kube-api-access-llt4r". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:14:53 crc kubenswrapper[4743]: I1011 01:14:53.310015 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d793d5fe-737a-4217-a456-da9e894fe4f6-logs" (OuterVolumeSpecName: "logs") pod "d793d5fe-737a-4217-a456-da9e894fe4f6" (UID: "d793d5fe-737a-4217-a456-da9e894fe4f6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:14:53 crc kubenswrapper[4743]: I1011 01:14:53.345431 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d793d5fe-737a-4217-a456-da9e894fe4f6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d793d5fe-737a-4217-a456-da9e894fe4f6" (UID: "d793d5fe-737a-4217-a456-da9e894fe4f6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:14:53 crc kubenswrapper[4743]: I1011 01:14:53.348404 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d793d5fe-737a-4217-a456-da9e894fe4f6-config-data" (OuterVolumeSpecName: "config-data") pod "d793d5fe-737a-4217-a456-da9e894fe4f6" (UID: "d793d5fe-737a-4217-a456-da9e894fe4f6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:14:53 crc kubenswrapper[4743]: I1011 01:14:53.391607 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d793d5fe-737a-4217-a456-da9e894fe4f6-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "d793d5fe-737a-4217-a456-da9e894fe4f6" (UID: "d793d5fe-737a-4217-a456-da9e894fe4f6"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:14:53 crc kubenswrapper[4743]: I1011 01:14:53.403990 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d793d5fe-737a-4217-a456-da9e894fe4f6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 01:14:53 crc kubenswrapper[4743]: I1011 01:14:53.404035 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-llt4r\" (UniqueName: \"kubernetes.io/projected/d793d5fe-737a-4217-a456-da9e894fe4f6-kube-api-access-llt4r\") on node \"crc\" DevicePath \"\"" Oct 11 01:14:53 crc kubenswrapper[4743]: I1011 01:14:53.404049 4743 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d793d5fe-737a-4217-a456-da9e894fe4f6-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 11 01:14:53 crc kubenswrapper[4743]: I1011 01:14:53.404058 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d793d5fe-737a-4217-a456-da9e894fe4f6-config-data\") on node \"crc\" DevicePath \"\"" Oct 11 01:14:53 crc kubenswrapper[4743]: I1011 01:14:53.404066 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d793d5fe-737a-4217-a456-da9e894fe4f6-logs\") on node \"crc\" DevicePath \"\"" Oct 11 01:14:53 crc kubenswrapper[4743]: I1011 01:14:53.552109 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 11 01:14:53 crc kubenswrapper[4743]: I1011 01:14:53.560241 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 11 01:14:53 crc kubenswrapper[4743]: I1011 01:14:53.619106 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 11 01:14:53 crc kubenswrapper[4743]: E1011 01:14:53.619590 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96dd3888-06fd-48f8-a0b2-b320851dd83c" containerName="nova-manage" Oct 11 01:14:53 crc kubenswrapper[4743]: I1011 01:14:53.619605 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="96dd3888-06fd-48f8-a0b2-b320851dd83c" containerName="nova-manage" Oct 11 01:14:53 crc kubenswrapper[4743]: E1011 01:14:53.619630 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d793d5fe-737a-4217-a456-da9e894fe4f6" containerName="nova-metadata-log" Oct 11 01:14:53 crc kubenswrapper[4743]: I1011 01:14:53.619637 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d793d5fe-737a-4217-a456-da9e894fe4f6" containerName="nova-metadata-log" Oct 11 01:14:53 crc kubenswrapper[4743]: E1011 01:14:53.619657 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d793d5fe-737a-4217-a456-da9e894fe4f6" containerName="nova-metadata-metadata" Oct 11 01:14:53 crc kubenswrapper[4743]: I1011 01:14:53.619662 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d793d5fe-737a-4217-a456-da9e894fe4f6" 
containerName="nova-metadata-metadata" Oct 11 01:14:53 crc kubenswrapper[4743]: I1011 01:14:53.619841 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="96dd3888-06fd-48f8-a0b2-b320851dd83c" containerName="nova-manage" Oct 11 01:14:53 crc kubenswrapper[4743]: I1011 01:14:53.619873 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d793d5fe-737a-4217-a456-da9e894fe4f6" containerName="nova-metadata-metadata" Oct 11 01:14:53 crc kubenswrapper[4743]: I1011 01:14:53.619885 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d793d5fe-737a-4217-a456-da9e894fe4f6" containerName="nova-metadata-log" Oct 11 01:14:53 crc kubenswrapper[4743]: I1011 01:14:53.621020 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 11 01:14:53 crc kubenswrapper[4743]: I1011 01:14:53.623669 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 11 01:14:53 crc kubenswrapper[4743]: I1011 01:14:53.623823 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 11 01:14:53 crc kubenswrapper[4743]: I1011 01:14:53.637568 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 11 01:14:53 crc kubenswrapper[4743]: I1011 01:14:53.812228 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10e4a6f0-05ef-4f39-96f6-1e44cd3753d4-config-data\") pod \"nova-metadata-0\" (UID: \"10e4a6f0-05ef-4f39-96f6-1e44cd3753d4\") " pod="openstack/nova-metadata-0" Oct 11 01:14:53 crc kubenswrapper[4743]: I1011 01:14:53.812343 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/10e4a6f0-05ef-4f39-96f6-1e44cd3753d4-nova-metadata-tls-certs\") pod 
\"nova-metadata-0\" (UID: \"10e4a6f0-05ef-4f39-96f6-1e44cd3753d4\") " pod="openstack/nova-metadata-0" Oct 11 01:14:53 crc kubenswrapper[4743]: I1011 01:14:53.812394 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f46np\" (UniqueName: \"kubernetes.io/projected/10e4a6f0-05ef-4f39-96f6-1e44cd3753d4-kube-api-access-f46np\") pod \"nova-metadata-0\" (UID: \"10e4a6f0-05ef-4f39-96f6-1e44cd3753d4\") " pod="openstack/nova-metadata-0" Oct 11 01:14:53 crc kubenswrapper[4743]: I1011 01:14:53.812602 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10e4a6f0-05ef-4f39-96f6-1e44cd3753d4-logs\") pod \"nova-metadata-0\" (UID: \"10e4a6f0-05ef-4f39-96f6-1e44cd3753d4\") " pod="openstack/nova-metadata-0" Oct 11 01:14:53 crc kubenswrapper[4743]: I1011 01:14:53.812786 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10e4a6f0-05ef-4f39-96f6-1e44cd3753d4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"10e4a6f0-05ef-4f39-96f6-1e44cd3753d4\") " pod="openstack/nova-metadata-0" Oct 11 01:14:53 crc kubenswrapper[4743]: I1011 01:14:53.914421 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/10e4a6f0-05ef-4f39-96f6-1e44cd3753d4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"10e4a6f0-05ef-4f39-96f6-1e44cd3753d4\") " pod="openstack/nova-metadata-0" Oct 11 01:14:53 crc kubenswrapper[4743]: I1011 01:14:53.914497 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f46np\" (UniqueName: \"kubernetes.io/projected/10e4a6f0-05ef-4f39-96f6-1e44cd3753d4-kube-api-access-f46np\") pod \"nova-metadata-0\" (UID: \"10e4a6f0-05ef-4f39-96f6-1e44cd3753d4\") " 
pod="openstack/nova-metadata-0" Oct 11 01:14:53 crc kubenswrapper[4743]: I1011 01:14:53.914578 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10e4a6f0-05ef-4f39-96f6-1e44cd3753d4-logs\") pod \"nova-metadata-0\" (UID: \"10e4a6f0-05ef-4f39-96f6-1e44cd3753d4\") " pod="openstack/nova-metadata-0" Oct 11 01:14:53 crc kubenswrapper[4743]: I1011 01:14:53.914641 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10e4a6f0-05ef-4f39-96f6-1e44cd3753d4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"10e4a6f0-05ef-4f39-96f6-1e44cd3753d4\") " pod="openstack/nova-metadata-0" Oct 11 01:14:53 crc kubenswrapper[4743]: I1011 01:14:53.914698 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10e4a6f0-05ef-4f39-96f6-1e44cd3753d4-config-data\") pod \"nova-metadata-0\" (UID: \"10e4a6f0-05ef-4f39-96f6-1e44cd3753d4\") " pod="openstack/nova-metadata-0" Oct 11 01:14:53 crc kubenswrapper[4743]: I1011 01:14:53.915241 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10e4a6f0-05ef-4f39-96f6-1e44cd3753d4-logs\") pod \"nova-metadata-0\" (UID: \"10e4a6f0-05ef-4f39-96f6-1e44cd3753d4\") " pod="openstack/nova-metadata-0" Oct 11 01:14:53 crc kubenswrapper[4743]: I1011 01:14:53.919230 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10e4a6f0-05ef-4f39-96f6-1e44cd3753d4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"10e4a6f0-05ef-4f39-96f6-1e44cd3753d4\") " pod="openstack/nova-metadata-0" Oct 11 01:14:53 crc kubenswrapper[4743]: I1011 01:14:53.919391 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/10e4a6f0-05ef-4f39-96f6-1e44cd3753d4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"10e4a6f0-05ef-4f39-96f6-1e44cd3753d4\") " pod="openstack/nova-metadata-0" Oct 11 01:14:53 crc kubenswrapper[4743]: I1011 01:14:53.919802 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10e4a6f0-05ef-4f39-96f6-1e44cd3753d4-config-data\") pod \"nova-metadata-0\" (UID: \"10e4a6f0-05ef-4f39-96f6-1e44cd3753d4\") " pod="openstack/nova-metadata-0" Oct 11 01:14:53 crc kubenswrapper[4743]: I1011 01:14:53.939309 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f46np\" (UniqueName: \"kubernetes.io/projected/10e4a6f0-05ef-4f39-96f6-1e44cd3753d4-kube-api-access-f46np\") pod \"nova-metadata-0\" (UID: \"10e4a6f0-05ef-4f39-96f6-1e44cd3753d4\") " pod="openstack/nova-metadata-0" Oct 11 01:14:53 crc kubenswrapper[4743]: I1011 01:14:53.955497 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 11 01:14:54 crc kubenswrapper[4743]: I1011 01:14:54.097601 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 11 01:14:54 crc kubenswrapper[4743]: I1011 01:14:54.116054 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d793d5fe-737a-4217-a456-da9e894fe4f6" path="/var/lib/kubelet/pods/d793d5fe-737a-4217-a456-da9e894fe4f6/volumes" Oct 11 01:14:54 crc kubenswrapper[4743]: I1011 01:14:54.222963 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/006c6efe-f2d3-4ce9-9a99-335f3830c7a8-config-data\") pod \"006c6efe-f2d3-4ce9-9a99-335f3830c7a8\" (UID: \"006c6efe-f2d3-4ce9-9a99-335f3830c7a8\") " Oct 11 01:14:54 crc kubenswrapper[4743]: I1011 01:14:54.223125 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/006c6efe-f2d3-4ce9-9a99-335f3830c7a8-combined-ca-bundle\") pod \"006c6efe-f2d3-4ce9-9a99-335f3830c7a8\" (UID: \"006c6efe-f2d3-4ce9-9a99-335f3830c7a8\") " Oct 11 01:14:54 crc kubenswrapper[4743]: I1011 01:14:54.223237 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmrn4\" (UniqueName: \"kubernetes.io/projected/006c6efe-f2d3-4ce9-9a99-335f3830c7a8-kube-api-access-wmrn4\") pod \"006c6efe-f2d3-4ce9-9a99-335f3830c7a8\" (UID: \"006c6efe-f2d3-4ce9-9a99-335f3830c7a8\") " Oct 11 01:14:54 crc kubenswrapper[4743]: I1011 01:14:54.226877 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/006c6efe-f2d3-4ce9-9a99-335f3830c7a8-kube-api-access-wmrn4" (OuterVolumeSpecName: "kube-api-access-wmrn4") pod "006c6efe-f2d3-4ce9-9a99-335f3830c7a8" (UID: "006c6efe-f2d3-4ce9-9a99-335f3830c7a8"). InnerVolumeSpecName "kube-api-access-wmrn4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:14:54 crc kubenswrapper[4743]: I1011 01:14:54.229543 4743 generic.go:334] "Generic (PLEG): container finished" podID="d840543d-ffc6-405a-8e01-e89d6f237820" containerID="7b354e26e1d24c4652be4a6a1cebd923176c80fa9eb40c4ba4ac0b9e0281807d" exitCode=0 Oct 11 01:14:54 crc kubenswrapper[4743]: I1011 01:14:54.229603 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d840543d-ffc6-405a-8e01-e89d6f237820","Type":"ContainerDied","Data":"7b354e26e1d24c4652be4a6a1cebd923176c80fa9eb40c4ba4ac0b9e0281807d"} Oct 11 01:14:54 crc kubenswrapper[4743]: I1011 01:14:54.231346 4743 generic.go:334] "Generic (PLEG): container finished" podID="006c6efe-f2d3-4ce9-9a99-335f3830c7a8" containerID="39d8a64c03d232369b7c9a54d892dfc59a3a473201af45f8213daebdfe49e5ee" exitCode=0 Oct 11 01:14:54 crc kubenswrapper[4743]: I1011 01:14:54.231458 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 11 01:14:54 crc kubenswrapper[4743]: I1011 01:14:54.231588 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"006c6efe-f2d3-4ce9-9a99-335f3830c7a8","Type":"ContainerDied","Data":"39d8a64c03d232369b7c9a54d892dfc59a3a473201af45f8213daebdfe49e5ee"} Oct 11 01:14:54 crc kubenswrapper[4743]: I1011 01:14:54.231623 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"006c6efe-f2d3-4ce9-9a99-335f3830c7a8","Type":"ContainerDied","Data":"a7796a5e6da8d551c9538ce9830202a82ac0cf2e9763fedb67956642003c2f81"} Oct 11 01:14:54 crc kubenswrapper[4743]: I1011 01:14:54.231645 4743 scope.go:117] "RemoveContainer" containerID="39d8a64c03d232369b7c9a54d892dfc59a3a473201af45f8213daebdfe49e5ee" Oct 11 01:14:54 crc kubenswrapper[4743]: I1011 01:14:54.238085 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 11 01:14:54 crc kubenswrapper[4743]: I1011 01:14:54.255810 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/006c6efe-f2d3-4ce9-9a99-335f3830c7a8-config-data" (OuterVolumeSpecName: "config-data") pod "006c6efe-f2d3-4ce9-9a99-335f3830c7a8" (UID: "006c6efe-f2d3-4ce9-9a99-335f3830c7a8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:14:54 crc kubenswrapper[4743]: I1011 01:14:54.284575 4743 scope.go:117] "RemoveContainer" containerID="39d8a64c03d232369b7c9a54d892dfc59a3a473201af45f8213daebdfe49e5ee" Oct 11 01:14:54 crc kubenswrapper[4743]: I1011 01:14:54.284562 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/006c6efe-f2d3-4ce9-9a99-335f3830c7a8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "006c6efe-f2d3-4ce9-9a99-335f3830c7a8" (UID: "006c6efe-f2d3-4ce9-9a99-335f3830c7a8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:14:54 crc kubenswrapper[4743]: E1011 01:14:54.285635 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39d8a64c03d232369b7c9a54d892dfc59a3a473201af45f8213daebdfe49e5ee\": container with ID starting with 39d8a64c03d232369b7c9a54d892dfc59a3a473201af45f8213daebdfe49e5ee not found: ID does not exist" containerID="39d8a64c03d232369b7c9a54d892dfc59a3a473201af45f8213daebdfe49e5ee" Oct 11 01:14:54 crc kubenswrapper[4743]: I1011 01:14:54.285673 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39d8a64c03d232369b7c9a54d892dfc59a3a473201af45f8213daebdfe49e5ee"} err="failed to get container status \"39d8a64c03d232369b7c9a54d892dfc59a3a473201af45f8213daebdfe49e5ee\": rpc error: code = NotFound desc = could not find container \"39d8a64c03d232369b7c9a54d892dfc59a3a473201af45f8213daebdfe49e5ee\": container with ID starting with 39d8a64c03d232369b7c9a54d892dfc59a3a473201af45f8213daebdfe49e5ee not found: ID does not exist" Oct 11 01:14:54 crc kubenswrapper[4743]: I1011 01:14:54.326031 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/006c6efe-f2d3-4ce9-9a99-335f3830c7a8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 01:14:54 crc kubenswrapper[4743]: I1011 01:14:54.326068 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wmrn4\" (UniqueName: \"kubernetes.io/projected/006c6efe-f2d3-4ce9-9a99-335f3830c7a8-kube-api-access-wmrn4\") on node \"crc\" DevicePath \"\"" Oct 11 01:14:54 crc kubenswrapper[4743]: I1011 01:14:54.326111 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/006c6efe-f2d3-4ce9-9a99-335f3830c7a8-config-data\") on node \"crc\" DevicePath \"\"" Oct 11 01:14:54 crc kubenswrapper[4743]: I1011 01:14:54.426996 4743 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d840543d-ffc6-405a-8e01-e89d6f237820-combined-ca-bundle\") pod \"d840543d-ffc6-405a-8e01-e89d6f237820\" (UID: \"d840543d-ffc6-405a-8e01-e89d6f237820\") " Oct 11 01:14:54 crc kubenswrapper[4743]: I1011 01:14:54.427382 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d840543d-ffc6-405a-8e01-e89d6f237820-logs\") pod \"d840543d-ffc6-405a-8e01-e89d6f237820\" (UID: \"d840543d-ffc6-405a-8e01-e89d6f237820\") " Oct 11 01:14:54 crc kubenswrapper[4743]: I1011 01:14:54.427457 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xsrb2\" (UniqueName: \"kubernetes.io/projected/d840543d-ffc6-405a-8e01-e89d6f237820-kube-api-access-xsrb2\") pod \"d840543d-ffc6-405a-8e01-e89d6f237820\" (UID: \"d840543d-ffc6-405a-8e01-e89d6f237820\") " Oct 11 01:14:54 crc kubenswrapper[4743]: I1011 01:14:54.427802 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d840543d-ffc6-405a-8e01-e89d6f237820-internal-tls-certs\") pod \"d840543d-ffc6-405a-8e01-e89d6f237820\" (UID: \"d840543d-ffc6-405a-8e01-e89d6f237820\") " Oct 11 01:14:54 crc kubenswrapper[4743]: I1011 01:14:54.427895 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d840543d-ffc6-405a-8e01-e89d6f237820-public-tls-certs\") pod \"d840543d-ffc6-405a-8e01-e89d6f237820\" (UID: \"d840543d-ffc6-405a-8e01-e89d6f237820\") " Oct 11 01:14:54 crc kubenswrapper[4743]: I1011 01:14:54.427901 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d840543d-ffc6-405a-8e01-e89d6f237820-logs" (OuterVolumeSpecName: "logs") pod "d840543d-ffc6-405a-8e01-e89d6f237820" (UID: 
"d840543d-ffc6-405a-8e01-e89d6f237820"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:14:54 crc kubenswrapper[4743]: I1011 01:14:54.427946 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d840543d-ffc6-405a-8e01-e89d6f237820-config-data\") pod \"d840543d-ffc6-405a-8e01-e89d6f237820\" (UID: \"d840543d-ffc6-405a-8e01-e89d6f237820\") " Oct 11 01:14:54 crc kubenswrapper[4743]: I1011 01:14:54.428537 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d840543d-ffc6-405a-8e01-e89d6f237820-logs\") on node \"crc\" DevicePath \"\"" Oct 11 01:14:54 crc kubenswrapper[4743]: I1011 01:14:54.457489 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d840543d-ffc6-405a-8e01-e89d6f237820-kube-api-access-xsrb2" (OuterVolumeSpecName: "kube-api-access-xsrb2") pod "d840543d-ffc6-405a-8e01-e89d6f237820" (UID: "d840543d-ffc6-405a-8e01-e89d6f237820"). InnerVolumeSpecName "kube-api-access-xsrb2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:14:54 crc kubenswrapper[4743]: I1011 01:14:54.463506 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d840543d-ffc6-405a-8e01-e89d6f237820-config-data" (OuterVolumeSpecName: "config-data") pod "d840543d-ffc6-405a-8e01-e89d6f237820" (UID: "d840543d-ffc6-405a-8e01-e89d6f237820"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:14:54 crc kubenswrapper[4743]: I1011 01:14:54.463599 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d840543d-ffc6-405a-8e01-e89d6f237820-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d840543d-ffc6-405a-8e01-e89d6f237820" (UID: "d840543d-ffc6-405a-8e01-e89d6f237820"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:14:54 crc kubenswrapper[4743]: I1011 01:14:54.495107 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d840543d-ffc6-405a-8e01-e89d6f237820-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "d840543d-ffc6-405a-8e01-e89d6f237820" (UID: "d840543d-ffc6-405a-8e01-e89d6f237820"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:14:54 crc kubenswrapper[4743]: I1011 01:14:54.498080 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d840543d-ffc6-405a-8e01-e89d6f237820-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "d840543d-ffc6-405a-8e01-e89d6f237820" (UID: "d840543d-ffc6-405a-8e01-e89d6f237820"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:14:54 crc kubenswrapper[4743]: I1011 01:14:54.501575 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 11 01:14:54 crc kubenswrapper[4743]: I1011 01:14:54.530085 4743 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d840543d-ffc6-405a-8e01-e89d6f237820-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 11 01:14:54 crc kubenswrapper[4743]: I1011 01:14:54.530133 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d840543d-ffc6-405a-8e01-e89d6f237820-config-data\") on node \"crc\" DevicePath \"\"" Oct 11 01:14:54 crc kubenswrapper[4743]: I1011 01:14:54.530146 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d840543d-ffc6-405a-8e01-e89d6f237820-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 01:14:54 crc kubenswrapper[4743]: I1011 01:14:54.530159 
4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xsrb2\" (UniqueName: \"kubernetes.io/projected/d840543d-ffc6-405a-8e01-e89d6f237820-kube-api-access-xsrb2\") on node \"crc\" DevicePath \"\"" Oct 11 01:14:54 crc kubenswrapper[4743]: I1011 01:14:54.530173 4743 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d840543d-ffc6-405a-8e01-e89d6f237820-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 11 01:14:54 crc kubenswrapper[4743]: I1011 01:14:54.577932 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 11 01:14:54 crc kubenswrapper[4743]: I1011 01:14:54.591343 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 11 01:14:54 crc kubenswrapper[4743]: I1011 01:14:54.621274 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 11 01:14:54 crc kubenswrapper[4743]: E1011 01:14:54.660895 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d840543d-ffc6-405a-8e01-e89d6f237820" containerName="nova-api-log" Oct 11 01:14:54 crc kubenswrapper[4743]: I1011 01:14:54.660936 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d840543d-ffc6-405a-8e01-e89d6f237820" containerName="nova-api-log" Oct 11 01:14:54 crc kubenswrapper[4743]: E1011 01:14:54.660967 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="006c6efe-f2d3-4ce9-9a99-335f3830c7a8" containerName="nova-scheduler-scheduler" Oct 11 01:14:54 crc kubenswrapper[4743]: I1011 01:14:54.660974 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="006c6efe-f2d3-4ce9-9a99-335f3830c7a8" containerName="nova-scheduler-scheduler" Oct 11 01:14:54 crc kubenswrapper[4743]: E1011 01:14:54.660994 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d840543d-ffc6-405a-8e01-e89d6f237820" containerName="nova-api-api" Oct 11 01:14:54 crc kubenswrapper[4743]: I1011 
01:14:54.661001 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d840543d-ffc6-405a-8e01-e89d6f237820" containerName="nova-api-api" Oct 11 01:14:54 crc kubenswrapper[4743]: I1011 01:14:54.662197 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d840543d-ffc6-405a-8e01-e89d6f237820" containerName="nova-api-log" Oct 11 01:14:54 crc kubenswrapper[4743]: I1011 01:14:54.662264 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="006c6efe-f2d3-4ce9-9a99-335f3830c7a8" containerName="nova-scheduler-scheduler" Oct 11 01:14:54 crc kubenswrapper[4743]: I1011 01:14:54.662324 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d840543d-ffc6-405a-8e01-e89d6f237820" containerName="nova-api-api" Oct 11 01:14:54 crc kubenswrapper[4743]: I1011 01:14:54.665168 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 11 01:14:54 crc kubenswrapper[4743]: I1011 01:14:54.668449 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 11 01:14:54 crc kubenswrapper[4743]: I1011 01:14:54.673035 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 11 01:14:54 crc kubenswrapper[4743]: I1011 01:14:54.845670 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whn6q\" (UniqueName: \"kubernetes.io/projected/449c91f5-e998-4889-b148-30f334b03bc8-kube-api-access-whn6q\") pod \"nova-scheduler-0\" (UID: \"449c91f5-e998-4889-b148-30f334b03bc8\") " pod="openstack/nova-scheduler-0" Oct 11 01:14:54 crc kubenswrapper[4743]: I1011 01:14:54.846072 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/449c91f5-e998-4889-b148-30f334b03bc8-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: 
\"449c91f5-e998-4889-b148-30f334b03bc8\") " pod="openstack/nova-scheduler-0" Oct 11 01:14:54 crc kubenswrapper[4743]: I1011 01:14:54.846123 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/449c91f5-e998-4889-b148-30f334b03bc8-config-data\") pod \"nova-scheduler-0\" (UID: \"449c91f5-e998-4889-b148-30f334b03bc8\") " pod="openstack/nova-scheduler-0" Oct 11 01:14:54 crc kubenswrapper[4743]: I1011 01:14:54.948542 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/449c91f5-e998-4889-b148-30f334b03bc8-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"449c91f5-e998-4889-b148-30f334b03bc8\") " pod="openstack/nova-scheduler-0" Oct 11 01:14:54 crc kubenswrapper[4743]: I1011 01:14:54.948629 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/449c91f5-e998-4889-b148-30f334b03bc8-config-data\") pod \"nova-scheduler-0\" (UID: \"449c91f5-e998-4889-b148-30f334b03bc8\") " pod="openstack/nova-scheduler-0" Oct 11 01:14:54 crc kubenswrapper[4743]: I1011 01:14:54.948769 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whn6q\" (UniqueName: \"kubernetes.io/projected/449c91f5-e998-4889-b148-30f334b03bc8-kube-api-access-whn6q\") pod \"nova-scheduler-0\" (UID: \"449c91f5-e998-4889-b148-30f334b03bc8\") " pod="openstack/nova-scheduler-0" Oct 11 01:14:54 crc kubenswrapper[4743]: I1011 01:14:54.954492 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/449c91f5-e998-4889-b148-30f334b03bc8-config-data\") pod \"nova-scheduler-0\" (UID: \"449c91f5-e998-4889-b148-30f334b03bc8\") " pod="openstack/nova-scheduler-0" Oct 11 01:14:54 crc kubenswrapper[4743]: I1011 01:14:54.954947 4743 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/449c91f5-e998-4889-b148-30f334b03bc8-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"449c91f5-e998-4889-b148-30f334b03bc8\") " pod="openstack/nova-scheduler-0" Oct 11 01:14:54 crc kubenswrapper[4743]: I1011 01:14:54.972327 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whn6q\" (UniqueName: \"kubernetes.io/projected/449c91f5-e998-4889-b148-30f334b03bc8-kube-api-access-whn6q\") pod \"nova-scheduler-0\" (UID: \"449c91f5-e998-4889-b148-30f334b03bc8\") " pod="openstack/nova-scheduler-0" Oct 11 01:14:55 crc kubenswrapper[4743]: I1011 01:14:55.006298 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 11 01:14:55 crc kubenswrapper[4743]: I1011 01:14:55.260095 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 11 01:14:55 crc kubenswrapper[4743]: I1011 01:14:55.260094 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d840543d-ffc6-405a-8e01-e89d6f237820","Type":"ContainerDied","Data":"7909e44e39e7762d01699709adb21703f942be526e8a930e8eb21cdb5db79a66"} Oct 11 01:14:55 crc kubenswrapper[4743]: I1011 01:14:55.260522 4743 scope.go:117] "RemoveContainer" containerID="7b354e26e1d24c4652be4a6a1cebd923176c80fa9eb40c4ba4ac0b9e0281807d" Oct 11 01:14:55 crc kubenswrapper[4743]: I1011 01:14:55.264334 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"10e4a6f0-05ef-4f39-96f6-1e44cd3753d4","Type":"ContainerStarted","Data":"8095ef6ad89637e89a5f55d7f2ffb5872d9de1c67b786bd12d4e482c27276508"} Oct 11 01:14:55 crc kubenswrapper[4743]: I1011 01:14:55.264449 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"10e4a6f0-05ef-4f39-96f6-1e44cd3753d4","Type":"ContainerStarted","Data":"c888d13c17c7efd87f4373fa4dca52d6f55a255aea9e0c5c0afa132159b1cb22"} Oct 11 01:14:55 crc kubenswrapper[4743]: I1011 01:14:55.264514 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"10e4a6f0-05ef-4f39-96f6-1e44cd3753d4","Type":"ContainerStarted","Data":"7340aea165cd5d8c190a57db197b5aef66e5c4b05371835b6d3ecd8ebd3190b1"} Oct 11 01:14:55 crc kubenswrapper[4743]: I1011 01:14:55.286808 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.286790069 podStartE2EDuration="2.286790069s" podCreationTimestamp="2025-10-11 01:14:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 01:14:55.281392535 +0000 UTC m=+1389.934372932" watchObservedRunningTime="2025-10-11 01:14:55.286790069 +0000 UTC m=+1389.939770466" Oct 11 01:14:55 crc kubenswrapper[4743]: I1011 01:14:55.296063 4743 scope.go:117] "RemoveContainer" containerID="37c168db353224216e44b6c5a7a662e54d1f6647d619bdc64991536f270d4d38" Oct 11 01:14:55 crc kubenswrapper[4743]: I1011 01:14:55.322940 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 11 01:14:55 crc kubenswrapper[4743]: I1011 01:14:55.331881 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 11 01:14:55 crc kubenswrapper[4743]: I1011 01:14:55.339538 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 11 01:14:55 crc kubenswrapper[4743]: I1011 01:14:55.341457 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 11 01:14:55 crc kubenswrapper[4743]: I1011 01:14:55.343826 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 11 01:14:55 crc kubenswrapper[4743]: I1011 01:14:55.343995 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 11 01:14:55 crc kubenswrapper[4743]: I1011 01:14:55.344096 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 11 01:14:55 crc kubenswrapper[4743]: I1011 01:14:55.349441 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 11 01:14:55 crc kubenswrapper[4743]: I1011 01:14:55.457562 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/41ea9288-7c98-4c3b-a903-76053391426e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"41ea9288-7c98-4c3b-a903-76053391426e\") " pod="openstack/nova-api-0" Oct 11 01:14:55 crc kubenswrapper[4743]: I1011 01:14:55.457665 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41ea9288-7c98-4c3b-a903-76053391426e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"41ea9288-7c98-4c3b-a903-76053391426e\") " pod="openstack/nova-api-0" Oct 11 01:14:55 crc kubenswrapper[4743]: I1011 01:14:55.457732 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41ea9288-7c98-4c3b-a903-76053391426e-logs\") pod \"nova-api-0\" (UID: \"41ea9288-7c98-4c3b-a903-76053391426e\") " pod="openstack/nova-api-0" Oct 11 01:14:55 crc kubenswrapper[4743]: I1011 01:14:55.457824 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/41ea9288-7c98-4c3b-a903-76053391426e-config-data\") pod \"nova-api-0\" (UID: \"41ea9288-7c98-4c3b-a903-76053391426e\") " pod="openstack/nova-api-0" Oct 11 01:14:55 crc kubenswrapper[4743]: I1011 01:14:55.457921 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2spqg\" (UniqueName: \"kubernetes.io/projected/41ea9288-7c98-4c3b-a903-76053391426e-kube-api-access-2spqg\") pod \"nova-api-0\" (UID: \"41ea9288-7c98-4c3b-a903-76053391426e\") " pod="openstack/nova-api-0" Oct 11 01:14:55 crc kubenswrapper[4743]: I1011 01:14:55.458146 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/41ea9288-7c98-4c3b-a903-76053391426e-public-tls-certs\") pod \"nova-api-0\" (UID: \"41ea9288-7c98-4c3b-a903-76053391426e\") " pod="openstack/nova-api-0" Oct 11 01:14:55 crc kubenswrapper[4743]: I1011 01:14:55.460050 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 11 01:14:55 crc kubenswrapper[4743]: I1011 01:14:55.560422 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/41ea9288-7c98-4c3b-a903-76053391426e-public-tls-certs\") pod \"nova-api-0\" (UID: \"41ea9288-7c98-4c3b-a903-76053391426e\") " pod="openstack/nova-api-0" Oct 11 01:14:55 crc kubenswrapper[4743]: I1011 01:14:55.560516 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/41ea9288-7c98-4c3b-a903-76053391426e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"41ea9288-7c98-4c3b-a903-76053391426e\") " pod="openstack/nova-api-0" Oct 11 01:14:55 crc kubenswrapper[4743]: I1011 01:14:55.560540 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/41ea9288-7c98-4c3b-a903-76053391426e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"41ea9288-7c98-4c3b-a903-76053391426e\") " pod="openstack/nova-api-0" Oct 11 01:14:55 crc kubenswrapper[4743]: I1011 01:14:55.560574 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41ea9288-7c98-4c3b-a903-76053391426e-logs\") pod \"nova-api-0\" (UID: \"41ea9288-7c98-4c3b-a903-76053391426e\") " pod="openstack/nova-api-0" Oct 11 01:14:55 crc kubenswrapper[4743]: I1011 01:14:55.560613 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41ea9288-7c98-4c3b-a903-76053391426e-config-data\") pod \"nova-api-0\" (UID: \"41ea9288-7c98-4c3b-a903-76053391426e\") " pod="openstack/nova-api-0" Oct 11 01:14:55 crc kubenswrapper[4743]: I1011 01:14:55.560667 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2spqg\" (UniqueName: \"kubernetes.io/projected/41ea9288-7c98-4c3b-a903-76053391426e-kube-api-access-2spqg\") pod \"nova-api-0\" (UID: \"41ea9288-7c98-4c3b-a903-76053391426e\") " pod="openstack/nova-api-0" Oct 11 01:14:55 crc kubenswrapper[4743]: I1011 01:14:55.561970 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41ea9288-7c98-4c3b-a903-76053391426e-logs\") pod \"nova-api-0\" (UID: \"41ea9288-7c98-4c3b-a903-76053391426e\") " pod="openstack/nova-api-0" Oct 11 01:14:55 crc kubenswrapper[4743]: I1011 01:14:55.565594 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41ea9288-7c98-4c3b-a903-76053391426e-config-data\") pod \"nova-api-0\" (UID: \"41ea9288-7c98-4c3b-a903-76053391426e\") " pod="openstack/nova-api-0" Oct 11 01:14:55 crc kubenswrapper[4743]: I1011 01:14:55.566211 4743 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41ea9288-7c98-4c3b-a903-76053391426e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"41ea9288-7c98-4c3b-a903-76053391426e\") " pod="openstack/nova-api-0" Oct 11 01:14:55 crc kubenswrapper[4743]: I1011 01:14:55.566551 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/41ea9288-7c98-4c3b-a903-76053391426e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"41ea9288-7c98-4c3b-a903-76053391426e\") " pod="openstack/nova-api-0" Oct 11 01:14:55 crc kubenswrapper[4743]: I1011 01:14:55.568077 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/41ea9288-7c98-4c3b-a903-76053391426e-public-tls-certs\") pod \"nova-api-0\" (UID: \"41ea9288-7c98-4c3b-a903-76053391426e\") " pod="openstack/nova-api-0" Oct 11 01:14:55 crc kubenswrapper[4743]: I1011 01:14:55.577811 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2spqg\" (UniqueName: \"kubernetes.io/projected/41ea9288-7c98-4c3b-a903-76053391426e-kube-api-access-2spqg\") pod \"nova-api-0\" (UID: \"41ea9288-7c98-4c3b-a903-76053391426e\") " pod="openstack/nova-api-0" Oct 11 01:14:55 crc kubenswrapper[4743]: I1011 01:14:55.659580 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 11 01:14:56 crc kubenswrapper[4743]: I1011 01:14:56.112014 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="006c6efe-f2d3-4ce9-9a99-335f3830c7a8" path="/var/lib/kubelet/pods/006c6efe-f2d3-4ce9-9a99-335f3830c7a8/volumes" Oct 11 01:14:56 crc kubenswrapper[4743]: I1011 01:14:56.114172 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d840543d-ffc6-405a-8e01-e89d6f237820" path="/var/lib/kubelet/pods/d840543d-ffc6-405a-8e01-e89d6f237820/volumes" Oct 11 01:14:56 crc kubenswrapper[4743]: I1011 01:14:56.166358 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 11 01:14:56 crc kubenswrapper[4743]: I1011 01:14:56.284344 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"41ea9288-7c98-4c3b-a903-76053391426e","Type":"ContainerStarted","Data":"d19ceb12149e0dfce4bdd8daf763401927dcfe64a342ae5da36bd0fb1823a657"} Oct 11 01:14:56 crc kubenswrapper[4743]: I1011 01:14:56.286517 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"449c91f5-e998-4889-b148-30f334b03bc8","Type":"ContainerStarted","Data":"8068ce9b55d9fe17140e91fadeae3d7e3f64026473c7969e26115597bb4a8c9b"} Oct 11 01:14:56 crc kubenswrapper[4743]: I1011 01:14:56.286559 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"449c91f5-e998-4889-b148-30f334b03bc8","Type":"ContainerStarted","Data":"e0bef0cfc418e1658c38458be568447edef0cf10827bc458ba1e6cf08c9349d5"} Oct 11 01:14:57 crc kubenswrapper[4743]: I1011 01:14:57.330529 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"41ea9288-7c98-4c3b-a903-76053391426e","Type":"ContainerStarted","Data":"888ebc9a381f74a058bd5a6e61ff1d3d2528b071410fa200f2e428141eaabcec"} Oct 11 01:14:57 crc kubenswrapper[4743]: I1011 01:14:57.331781 4743 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"41ea9288-7c98-4c3b-a903-76053391426e","Type":"ContainerStarted","Data":"6dbdb45c7cb66a33d8761fc9b7f1b5081918122c2a6f0659fc20a864eca146a4"} Oct 11 01:14:57 crc kubenswrapper[4743]: I1011 01:14:57.358899 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.358878118 podStartE2EDuration="2.358878118s" podCreationTimestamp="2025-10-11 01:14:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 01:14:57.352500116 +0000 UTC m=+1392.005480523" watchObservedRunningTime="2025-10-11 01:14:57.358878118 +0000 UTC m=+1392.011858535" Oct 11 01:14:57 crc kubenswrapper[4743]: I1011 01:14:57.360886 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.360877163 podStartE2EDuration="3.360877163s" podCreationTimestamp="2025-10-11 01:14:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 01:14:56.303380932 +0000 UTC m=+1390.956361409" watchObservedRunningTime="2025-10-11 01:14:57.360877163 +0000 UTC m=+1392.013857580" Oct 11 01:14:58 crc kubenswrapper[4743]: I1011 01:14:58.957123 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 11 01:14:58 crc kubenswrapper[4743]: I1011 01:14:58.957384 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 11 01:15:00 crc kubenswrapper[4743]: I1011 01:15:00.006849 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 11 01:15:00 crc kubenswrapper[4743]: I1011 01:15:00.154701 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29335755-r9b8t"] Oct 
11 01:15:00 crc kubenswrapper[4743]: I1011 01:15:00.156648 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29335755-r9b8t" Oct 11 01:15:00 crc kubenswrapper[4743]: I1011 01:15:00.159397 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 11 01:15:00 crc kubenswrapper[4743]: I1011 01:15:00.163963 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 11 01:15:00 crc kubenswrapper[4743]: I1011 01:15:00.171118 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29335755-r9b8t"] Oct 11 01:15:00 crc kubenswrapper[4743]: I1011 01:15:00.303045 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/13d28b94-fc19-4f99-98f4-5e0891a1a7a7-secret-volume\") pod \"collect-profiles-29335755-r9b8t\" (UID: \"13d28b94-fc19-4f99-98f4-5e0891a1a7a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335755-r9b8t" Oct 11 01:15:00 crc kubenswrapper[4743]: I1011 01:15:00.303129 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/13d28b94-fc19-4f99-98f4-5e0891a1a7a7-config-volume\") pod \"collect-profiles-29335755-r9b8t\" (UID: \"13d28b94-fc19-4f99-98f4-5e0891a1a7a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335755-r9b8t" Oct 11 01:15:00 crc kubenswrapper[4743]: I1011 01:15:00.303248 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fs74j\" (UniqueName: \"kubernetes.io/projected/13d28b94-fc19-4f99-98f4-5e0891a1a7a7-kube-api-access-fs74j\") pod 
\"collect-profiles-29335755-r9b8t\" (UID: \"13d28b94-fc19-4f99-98f4-5e0891a1a7a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335755-r9b8t" Oct 11 01:15:00 crc kubenswrapper[4743]: I1011 01:15:00.406016 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/13d28b94-fc19-4f99-98f4-5e0891a1a7a7-secret-volume\") pod \"collect-profiles-29335755-r9b8t\" (UID: \"13d28b94-fc19-4f99-98f4-5e0891a1a7a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335755-r9b8t" Oct 11 01:15:00 crc kubenswrapper[4743]: I1011 01:15:00.406102 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/13d28b94-fc19-4f99-98f4-5e0891a1a7a7-config-volume\") pod \"collect-profiles-29335755-r9b8t\" (UID: \"13d28b94-fc19-4f99-98f4-5e0891a1a7a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335755-r9b8t" Oct 11 01:15:00 crc kubenswrapper[4743]: I1011 01:15:00.406195 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fs74j\" (UniqueName: \"kubernetes.io/projected/13d28b94-fc19-4f99-98f4-5e0891a1a7a7-kube-api-access-fs74j\") pod \"collect-profiles-29335755-r9b8t\" (UID: \"13d28b94-fc19-4f99-98f4-5e0891a1a7a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335755-r9b8t" Oct 11 01:15:00 crc kubenswrapper[4743]: I1011 01:15:00.407894 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/13d28b94-fc19-4f99-98f4-5e0891a1a7a7-config-volume\") pod \"collect-profiles-29335755-r9b8t\" (UID: \"13d28b94-fc19-4f99-98f4-5e0891a1a7a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335755-r9b8t" Oct 11 01:15:00 crc kubenswrapper[4743]: I1011 01:15:00.415786 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" 
(UniqueName: \"kubernetes.io/secret/13d28b94-fc19-4f99-98f4-5e0891a1a7a7-secret-volume\") pod \"collect-profiles-29335755-r9b8t\" (UID: \"13d28b94-fc19-4f99-98f4-5e0891a1a7a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335755-r9b8t" Oct 11 01:15:00 crc kubenswrapper[4743]: I1011 01:15:00.423590 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fs74j\" (UniqueName: \"kubernetes.io/projected/13d28b94-fc19-4f99-98f4-5e0891a1a7a7-kube-api-access-fs74j\") pod \"collect-profiles-29335755-r9b8t\" (UID: \"13d28b94-fc19-4f99-98f4-5e0891a1a7a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335755-r9b8t" Oct 11 01:15:00 crc kubenswrapper[4743]: I1011 01:15:00.489270 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29335755-r9b8t" Oct 11 01:15:01 crc kubenswrapper[4743]: I1011 01:15:01.005487 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29335755-r9b8t"] Oct 11 01:15:01 crc kubenswrapper[4743]: I1011 01:15:01.388160 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29335755-r9b8t" event={"ID":"13d28b94-fc19-4f99-98f4-5e0891a1a7a7","Type":"ContainerStarted","Data":"a658a83b250b03c861c4f9d2ba2ecf815fade25c78b3eb625204af21d38a1ca6"} Oct 11 01:15:01 crc kubenswrapper[4743]: I1011 01:15:01.388530 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29335755-r9b8t" event={"ID":"13d28b94-fc19-4f99-98f4-5e0891a1a7a7","Type":"ContainerStarted","Data":"661a9ab9a604e8b89f84332fd63eaa3111fec2c7e623113ee4ee6166f6b0a656"} Oct 11 01:15:01 crc kubenswrapper[4743]: I1011 01:15:01.416804 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29335755-r9b8t" 
podStartSLOduration=1.416781268 podStartE2EDuration="1.416781268s" podCreationTimestamp="2025-10-11 01:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 01:15:01.404489222 +0000 UTC m=+1396.057469659" watchObservedRunningTime="2025-10-11 01:15:01.416781268 +0000 UTC m=+1396.069761685" Oct 11 01:15:02 crc kubenswrapper[4743]: I1011 01:15:02.404315 4743 generic.go:334] "Generic (PLEG): container finished" podID="13d28b94-fc19-4f99-98f4-5e0891a1a7a7" containerID="a658a83b250b03c861c4f9d2ba2ecf815fade25c78b3eb625204af21d38a1ca6" exitCode=0 Oct 11 01:15:02 crc kubenswrapper[4743]: I1011 01:15:02.404660 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29335755-r9b8t" event={"ID":"13d28b94-fc19-4f99-98f4-5e0891a1a7a7","Type":"ContainerDied","Data":"a658a83b250b03c861c4f9d2ba2ecf815fade25c78b3eb625204af21d38a1ca6"} Oct 11 01:15:03 crc kubenswrapper[4743]: I1011 01:15:03.956258 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 11 01:15:03 crc kubenswrapper[4743]: I1011 01:15:03.956557 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 11 01:15:03 crc kubenswrapper[4743]: I1011 01:15:03.981850 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29335755-r9b8t" Oct 11 01:15:04 crc kubenswrapper[4743]: I1011 01:15:04.096439 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/13d28b94-fc19-4f99-98f4-5e0891a1a7a7-config-volume\") pod \"13d28b94-fc19-4f99-98f4-5e0891a1a7a7\" (UID: \"13d28b94-fc19-4f99-98f4-5e0891a1a7a7\") " Oct 11 01:15:04 crc kubenswrapper[4743]: I1011 01:15:04.097247 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13d28b94-fc19-4f99-98f4-5e0891a1a7a7-config-volume" (OuterVolumeSpecName: "config-volume") pod "13d28b94-fc19-4f99-98f4-5e0891a1a7a7" (UID: "13d28b94-fc19-4f99-98f4-5e0891a1a7a7"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:15:04 crc kubenswrapper[4743]: I1011 01:15:04.097599 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/13d28b94-fc19-4f99-98f4-5e0891a1a7a7-secret-volume\") pod \"13d28b94-fc19-4f99-98f4-5e0891a1a7a7\" (UID: \"13d28b94-fc19-4f99-98f4-5e0891a1a7a7\") " Oct 11 01:15:04 crc kubenswrapper[4743]: I1011 01:15:04.097766 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fs74j\" (UniqueName: \"kubernetes.io/projected/13d28b94-fc19-4f99-98f4-5e0891a1a7a7-kube-api-access-fs74j\") pod \"13d28b94-fc19-4f99-98f4-5e0891a1a7a7\" (UID: \"13d28b94-fc19-4f99-98f4-5e0891a1a7a7\") " Oct 11 01:15:04 crc kubenswrapper[4743]: I1011 01:15:04.099338 4743 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/13d28b94-fc19-4f99-98f4-5e0891a1a7a7-config-volume\") on node \"crc\" DevicePath \"\"" Oct 11 01:15:04 crc kubenswrapper[4743]: I1011 01:15:04.105006 4743 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/projected/13d28b94-fc19-4f99-98f4-5e0891a1a7a7-kube-api-access-fs74j" (OuterVolumeSpecName: "kube-api-access-fs74j") pod "13d28b94-fc19-4f99-98f4-5e0891a1a7a7" (UID: "13d28b94-fc19-4f99-98f4-5e0891a1a7a7"). InnerVolumeSpecName "kube-api-access-fs74j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:15:04 crc kubenswrapper[4743]: I1011 01:15:04.109093 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13d28b94-fc19-4f99-98f4-5e0891a1a7a7-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "13d28b94-fc19-4f99-98f4-5e0891a1a7a7" (UID: "13d28b94-fc19-4f99-98f4-5e0891a1a7a7"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:15:04 crc kubenswrapper[4743]: I1011 01:15:04.202631 4743 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/13d28b94-fc19-4f99-98f4-5e0891a1a7a7-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 11 01:15:04 crc kubenswrapper[4743]: I1011 01:15:04.202780 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fs74j\" (UniqueName: \"kubernetes.io/projected/13d28b94-fc19-4f99-98f4-5e0891a1a7a7-kube-api-access-fs74j\") on node \"crc\" DevicePath \"\"" Oct 11 01:15:04 crc kubenswrapper[4743]: I1011 01:15:04.427414 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29335755-r9b8t" event={"ID":"13d28b94-fc19-4f99-98f4-5e0891a1a7a7","Type":"ContainerDied","Data":"661a9ab9a604e8b89f84332fd63eaa3111fec2c7e623113ee4ee6166f6b0a656"} Oct 11 01:15:04 crc kubenswrapper[4743]: I1011 01:15:04.427699 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="661a9ab9a604e8b89f84332fd63eaa3111fec2c7e623113ee4ee6166f6b0a656" Oct 11 01:15:04 crc kubenswrapper[4743]: I1011 01:15:04.427707 4743 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29335755-r9b8t" Oct 11 01:15:04 crc kubenswrapper[4743]: I1011 01:15:04.973136 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="10e4a6f0-05ef-4f39-96f6-1e44cd3753d4" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.236:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 11 01:15:04 crc kubenswrapper[4743]: I1011 01:15:04.973159 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="10e4a6f0-05ef-4f39-96f6-1e44cd3753d4" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.236:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 11 01:15:05 crc kubenswrapper[4743]: I1011 01:15:05.008364 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 11 01:15:05 crc kubenswrapper[4743]: I1011 01:15:05.063792 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 11 01:15:05 crc kubenswrapper[4743]: I1011 01:15:05.489388 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 11 01:15:05 crc kubenswrapper[4743]: I1011 01:15:05.661070 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 11 01:15:05 crc kubenswrapper[4743]: I1011 01:15:05.661132 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 11 01:15:06 crc kubenswrapper[4743]: I1011 01:15:06.673015 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="41ea9288-7c98-4c3b-a903-76053391426e" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.238:8774/\": net/http: 
request canceled (Client.Timeout exceeded while awaiting headers)" Oct 11 01:15:06 crc kubenswrapper[4743]: I1011 01:15:06.673035 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="41ea9288-7c98-4c3b-a903-76053391426e" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.238:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 11 01:15:13 crc kubenswrapper[4743]: I1011 01:15:13.965429 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 11 01:15:13 crc kubenswrapper[4743]: I1011 01:15:13.965988 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 11 01:15:13 crc kubenswrapper[4743]: I1011 01:15:13.971406 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 11 01:15:13 crc kubenswrapper[4743]: I1011 01:15:13.973493 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 11 01:15:15 crc kubenswrapper[4743]: W1011 01:15:15.139214 4743 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13d28b94_fc19_4f99_98f4_5e0891a1a7a7.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13d28b94_fc19_4f99_98f4_5e0891a1a7a7.slice: no such file or directory Oct 11 01:15:15 crc kubenswrapper[4743]: E1011 01:15:15.431054 4743 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd840543d_ffc6_405a_8e01_e89d6f237820.slice/crio-7b354e26e1d24c4652be4a6a1cebd923176c80fa9eb40c4ba4ac0b9e0281807d.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd840543d_ffc6_405a_8e01_e89d6f237820.slice/crio-conmon-7b354e26e1d24c4652be4a6a1cebd923176c80fa9eb40c4ba4ac0b9e0281807d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd840543d_ffc6_405a_8e01_e89d6f237820.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2500dccd_617d_4164_b9f6_5b675bab6848.slice/crio-conmon-f4979c1b2742d60f73ba7e21f5e233e8c8299e13cf264183d5b33228e648515e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod006c6efe_f2d3_4ce9_9a99_335f3830c7a8.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2500dccd_617d_4164_b9f6_5b675bab6848.slice/crio-f4979c1b2742d60f73ba7e21f5e233e8c8299e13cf264183d5b33228e648515e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod006c6efe_f2d3_4ce9_9a99_335f3830c7a8.slice/crio-a7796a5e6da8d551c9538ce9830202a82ac0cf2e9763fedb67956642003c2f81\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd840543d_ffc6_405a_8e01_e89d6f237820.slice/crio-7909e44e39e7762d01699709adb21703f942be526e8a930e8eb21cdb5db79a66\": RecentStats: unable to find data in memory cache]" Oct 11 01:15:15 crc kubenswrapper[4743]: I1011 01:15:15.535522 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 11 01:15:15 crc kubenswrapper[4743]: I1011 01:15:15.545061 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Oct 11 01:15:15 crc kubenswrapper[4743]: I1011 01:15:15.565207 4743 generic.go:334] "Generic (PLEG): container finished" podID="2500dccd-617d-4164-b9f6-5b675bab6848" containerID="f4979c1b2742d60f73ba7e21f5e233e8c8299e13cf264183d5b33228e648515e" exitCode=137 Oct 11 01:15:15 crc kubenswrapper[4743]: I1011 01:15:15.565398 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Oct 11 01:15:15 crc kubenswrapper[4743]: I1011 01:15:15.566577 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"2500dccd-617d-4164-b9f6-5b675bab6848","Type":"ContainerDied","Data":"f4979c1b2742d60f73ba7e21f5e233e8c8299e13cf264183d5b33228e648515e"} Oct 11 01:15:15 crc kubenswrapper[4743]: I1011 01:15:15.566707 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"2500dccd-617d-4164-b9f6-5b675bab6848","Type":"ContainerDied","Data":"1bd255295a1481fa9997871e087787bb0a3dfa4a6a55ed991ea67e819c1ba42e"} Oct 11 01:15:15 crc kubenswrapper[4743]: I1011 01:15:15.566787 4743 scope.go:117] "RemoveContainer" containerID="f4979c1b2742d60f73ba7e21f5e233e8c8299e13cf264183d5b33228e648515e" Oct 11 01:15:15 crc kubenswrapper[4743]: I1011 01:15:15.615789 4743 scope.go:117] "RemoveContainer" containerID="247c39c4fe75b005beceaba45f57f61a8e12c590e90ab0ff835fc207e3a65893" Oct 11 01:15:15 crc kubenswrapper[4743]: I1011 01:15:15.641142 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2500dccd-617d-4164-b9f6-5b675bab6848-combined-ca-bundle\") pod \"2500dccd-617d-4164-b9f6-5b675bab6848\" (UID: \"2500dccd-617d-4164-b9f6-5b675bab6848\") " Oct 11 01:15:15 crc kubenswrapper[4743]: I1011 01:15:15.641385 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/2500dccd-617d-4164-b9f6-5b675bab6848-scripts\") pod \"2500dccd-617d-4164-b9f6-5b675bab6848\" (UID: \"2500dccd-617d-4164-b9f6-5b675bab6848\") " Oct 11 01:15:15 crc kubenswrapper[4743]: I1011 01:15:15.641512 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2500dccd-617d-4164-b9f6-5b675bab6848-config-data\") pod \"2500dccd-617d-4164-b9f6-5b675bab6848\" (UID: \"2500dccd-617d-4164-b9f6-5b675bab6848\") " Oct 11 01:15:15 crc kubenswrapper[4743]: I1011 01:15:15.641575 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6m2vk\" (UniqueName: \"kubernetes.io/projected/2500dccd-617d-4164-b9f6-5b675bab6848-kube-api-access-6m2vk\") pod \"2500dccd-617d-4164-b9f6-5b675bab6848\" (UID: \"2500dccd-617d-4164-b9f6-5b675bab6848\") " Oct 11 01:15:15 crc kubenswrapper[4743]: I1011 01:15:15.648386 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2500dccd-617d-4164-b9f6-5b675bab6848-scripts" (OuterVolumeSpecName: "scripts") pod "2500dccd-617d-4164-b9f6-5b675bab6848" (UID: "2500dccd-617d-4164-b9f6-5b675bab6848"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:15:15 crc kubenswrapper[4743]: I1011 01:15:15.660506 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2500dccd-617d-4164-b9f6-5b675bab6848-kube-api-access-6m2vk" (OuterVolumeSpecName: "kube-api-access-6m2vk") pod "2500dccd-617d-4164-b9f6-5b675bab6848" (UID: "2500dccd-617d-4164-b9f6-5b675bab6848"). InnerVolumeSpecName "kube-api-access-6m2vk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:15:15 crc kubenswrapper[4743]: I1011 01:15:15.674567 4743 scope.go:117] "RemoveContainer" containerID="d0b8e39936f9f2d8724ca6b9865d50e050a156eca6c8df578714ef6f7d2f8499" Oct 11 01:15:15 crc kubenswrapper[4743]: I1011 01:15:15.675828 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 11 01:15:15 crc kubenswrapper[4743]: I1011 01:15:15.677257 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 11 01:15:15 crc kubenswrapper[4743]: I1011 01:15:15.677732 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 11 01:15:15 crc kubenswrapper[4743]: I1011 01:15:15.702506 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 11 01:15:15 crc kubenswrapper[4743]: I1011 01:15:15.743743 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2500dccd-617d-4164-b9f6-5b675bab6848-scripts\") on node \"crc\" DevicePath \"\"" Oct 11 01:15:15 crc kubenswrapper[4743]: I1011 01:15:15.743777 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6m2vk\" (UniqueName: \"kubernetes.io/projected/2500dccd-617d-4164-b9f6-5b675bab6848-kube-api-access-6m2vk\") on node \"crc\" DevicePath \"\"" Oct 11 01:15:15 crc kubenswrapper[4743]: I1011 01:15:15.784722 4743 scope.go:117] "RemoveContainer" containerID="eb4ccaf3fb26fca9439b311b8cb335c04270a0a3871b4f22cf7c7437ec0e7e74" Oct 11 01:15:15 crc kubenswrapper[4743]: I1011 01:15:15.801098 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2500dccd-617d-4164-b9f6-5b675bab6848-config-data" (OuterVolumeSpecName: "config-data") pod "2500dccd-617d-4164-b9f6-5b675bab6848" (UID: "2500dccd-617d-4164-b9f6-5b675bab6848"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:15:15 crc kubenswrapper[4743]: I1011 01:15:15.802254 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2500dccd-617d-4164-b9f6-5b675bab6848-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2500dccd-617d-4164-b9f6-5b675bab6848" (UID: "2500dccd-617d-4164-b9f6-5b675bab6848"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:15:15 crc kubenswrapper[4743]: I1011 01:15:15.809965 4743 scope.go:117] "RemoveContainer" containerID="f4979c1b2742d60f73ba7e21f5e233e8c8299e13cf264183d5b33228e648515e" Oct 11 01:15:15 crc kubenswrapper[4743]: E1011 01:15:15.818530 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4979c1b2742d60f73ba7e21f5e233e8c8299e13cf264183d5b33228e648515e\": container with ID starting with f4979c1b2742d60f73ba7e21f5e233e8c8299e13cf264183d5b33228e648515e not found: ID does not exist" containerID="f4979c1b2742d60f73ba7e21f5e233e8c8299e13cf264183d5b33228e648515e" Oct 11 01:15:15 crc kubenswrapper[4743]: I1011 01:15:15.818578 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4979c1b2742d60f73ba7e21f5e233e8c8299e13cf264183d5b33228e648515e"} err="failed to get container status \"f4979c1b2742d60f73ba7e21f5e233e8c8299e13cf264183d5b33228e648515e\": rpc error: code = NotFound desc = could not find container \"f4979c1b2742d60f73ba7e21f5e233e8c8299e13cf264183d5b33228e648515e\": container with ID starting with f4979c1b2742d60f73ba7e21f5e233e8c8299e13cf264183d5b33228e648515e not found: ID does not exist" Oct 11 01:15:15 crc kubenswrapper[4743]: I1011 01:15:15.818606 4743 scope.go:117] "RemoveContainer" containerID="247c39c4fe75b005beceaba45f57f61a8e12c590e90ab0ff835fc207e3a65893" Oct 11 01:15:15 crc kubenswrapper[4743]: E1011 01:15:15.819722 4743 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"247c39c4fe75b005beceaba45f57f61a8e12c590e90ab0ff835fc207e3a65893\": container with ID starting with 247c39c4fe75b005beceaba45f57f61a8e12c590e90ab0ff835fc207e3a65893 not found: ID does not exist" containerID="247c39c4fe75b005beceaba45f57f61a8e12c590e90ab0ff835fc207e3a65893" Oct 11 01:15:15 crc kubenswrapper[4743]: I1011 01:15:15.819775 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"247c39c4fe75b005beceaba45f57f61a8e12c590e90ab0ff835fc207e3a65893"} err="failed to get container status \"247c39c4fe75b005beceaba45f57f61a8e12c590e90ab0ff835fc207e3a65893\": rpc error: code = NotFound desc = could not find container \"247c39c4fe75b005beceaba45f57f61a8e12c590e90ab0ff835fc207e3a65893\": container with ID starting with 247c39c4fe75b005beceaba45f57f61a8e12c590e90ab0ff835fc207e3a65893 not found: ID does not exist" Oct 11 01:15:15 crc kubenswrapper[4743]: I1011 01:15:15.819802 4743 scope.go:117] "RemoveContainer" containerID="d0b8e39936f9f2d8724ca6b9865d50e050a156eca6c8df578714ef6f7d2f8499" Oct 11 01:15:15 crc kubenswrapper[4743]: E1011 01:15:15.820290 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0b8e39936f9f2d8724ca6b9865d50e050a156eca6c8df578714ef6f7d2f8499\": container with ID starting with d0b8e39936f9f2d8724ca6b9865d50e050a156eca6c8df578714ef6f7d2f8499 not found: ID does not exist" containerID="d0b8e39936f9f2d8724ca6b9865d50e050a156eca6c8df578714ef6f7d2f8499" Oct 11 01:15:15 crc kubenswrapper[4743]: I1011 01:15:15.820319 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0b8e39936f9f2d8724ca6b9865d50e050a156eca6c8df578714ef6f7d2f8499"} err="failed to get container status \"d0b8e39936f9f2d8724ca6b9865d50e050a156eca6c8df578714ef6f7d2f8499\": rpc error: code = NotFound desc = could 
not find container \"d0b8e39936f9f2d8724ca6b9865d50e050a156eca6c8df578714ef6f7d2f8499\": container with ID starting with d0b8e39936f9f2d8724ca6b9865d50e050a156eca6c8df578714ef6f7d2f8499 not found: ID does not exist" Oct 11 01:15:15 crc kubenswrapper[4743]: I1011 01:15:15.820338 4743 scope.go:117] "RemoveContainer" containerID="eb4ccaf3fb26fca9439b311b8cb335c04270a0a3871b4f22cf7c7437ec0e7e74" Oct 11 01:15:15 crc kubenswrapper[4743]: E1011 01:15:15.821232 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb4ccaf3fb26fca9439b311b8cb335c04270a0a3871b4f22cf7c7437ec0e7e74\": container with ID starting with eb4ccaf3fb26fca9439b311b8cb335c04270a0a3871b4f22cf7c7437ec0e7e74 not found: ID does not exist" containerID="eb4ccaf3fb26fca9439b311b8cb335c04270a0a3871b4f22cf7c7437ec0e7e74" Oct 11 01:15:15 crc kubenswrapper[4743]: I1011 01:15:15.821259 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb4ccaf3fb26fca9439b311b8cb335c04270a0a3871b4f22cf7c7437ec0e7e74"} err="failed to get container status \"eb4ccaf3fb26fca9439b311b8cb335c04270a0a3871b4f22cf7c7437ec0e7e74\": rpc error: code = NotFound desc = could not find container \"eb4ccaf3fb26fca9439b311b8cb335c04270a0a3871b4f22cf7c7437ec0e7e74\": container with ID starting with eb4ccaf3fb26fca9439b311b8cb335c04270a0a3871b4f22cf7c7437ec0e7e74 not found: ID does not exist" Oct 11 01:15:15 crc kubenswrapper[4743]: I1011 01:15:15.846016 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2500dccd-617d-4164-b9f6-5b675bab6848-config-data\") on node \"crc\" DevicePath \"\"" Oct 11 01:15:15 crc kubenswrapper[4743]: I1011 01:15:15.846049 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2500dccd-617d-4164-b9f6-5b675bab6848-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 
01:15:15 crc kubenswrapper[4743]: I1011 01:15:15.902840 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Oct 11 01:15:15 crc kubenswrapper[4743]: I1011 01:15:15.914303 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"] Oct 11 01:15:15 crc kubenswrapper[4743]: I1011 01:15:15.927786 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Oct 11 01:15:15 crc kubenswrapper[4743]: E1011 01:15:15.928279 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2500dccd-617d-4164-b9f6-5b675bab6848" containerName="aodh-api" Oct 11 01:15:15 crc kubenswrapper[4743]: I1011 01:15:15.928295 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="2500dccd-617d-4164-b9f6-5b675bab6848" containerName="aodh-api" Oct 11 01:15:15 crc kubenswrapper[4743]: E1011 01:15:15.928314 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2500dccd-617d-4164-b9f6-5b675bab6848" containerName="aodh-listener" Oct 11 01:15:15 crc kubenswrapper[4743]: I1011 01:15:15.928321 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="2500dccd-617d-4164-b9f6-5b675bab6848" containerName="aodh-listener" Oct 11 01:15:15 crc kubenswrapper[4743]: E1011 01:15:15.928343 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2500dccd-617d-4164-b9f6-5b675bab6848" containerName="aodh-notifier" Oct 11 01:15:15 crc kubenswrapper[4743]: I1011 01:15:15.928348 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="2500dccd-617d-4164-b9f6-5b675bab6848" containerName="aodh-notifier" Oct 11 01:15:15 crc kubenswrapper[4743]: E1011 01:15:15.928357 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2500dccd-617d-4164-b9f6-5b675bab6848" containerName="aodh-evaluator" Oct 11 01:15:15 crc kubenswrapper[4743]: I1011 01:15:15.928362 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="2500dccd-617d-4164-b9f6-5b675bab6848" containerName="aodh-evaluator" Oct 11 01:15:15 crc 
kubenswrapper[4743]: E1011 01:15:15.928379 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13d28b94-fc19-4f99-98f4-5e0891a1a7a7" containerName="collect-profiles" Oct 11 01:15:15 crc kubenswrapper[4743]: I1011 01:15:15.928385 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="13d28b94-fc19-4f99-98f4-5e0891a1a7a7" containerName="collect-profiles" Oct 11 01:15:15 crc kubenswrapper[4743]: I1011 01:15:15.928558 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="2500dccd-617d-4164-b9f6-5b675bab6848" containerName="aodh-api" Oct 11 01:15:15 crc kubenswrapper[4743]: I1011 01:15:15.928576 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="2500dccd-617d-4164-b9f6-5b675bab6848" containerName="aodh-evaluator" Oct 11 01:15:15 crc kubenswrapper[4743]: I1011 01:15:15.928584 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="13d28b94-fc19-4f99-98f4-5e0891a1a7a7" containerName="collect-profiles" Oct 11 01:15:15 crc kubenswrapper[4743]: I1011 01:15:15.928593 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="2500dccd-617d-4164-b9f6-5b675bab6848" containerName="aodh-notifier" Oct 11 01:15:15 crc kubenswrapper[4743]: I1011 01:15:15.928611 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="2500dccd-617d-4164-b9f6-5b675bab6848" containerName="aodh-listener" Oct 11 01:15:15 crc kubenswrapper[4743]: I1011 01:15:15.931122 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Oct 11 01:15:15 crc kubenswrapper[4743]: I1011 01:15:15.934135 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Oct 11 01:15:15 crc kubenswrapper[4743]: I1011 01:15:15.934152 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-internal-svc" Oct 11 01:15:15 crc kubenswrapper[4743]: I1011 01:15:15.934300 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Oct 11 01:15:15 crc kubenswrapper[4743]: I1011 01:15:15.934624 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-public-svc" Oct 11 01:15:15 crc kubenswrapper[4743]: I1011 01:15:15.936240 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-dnc99" Oct 11 01:15:15 crc kubenswrapper[4743]: I1011 01:15:15.944579 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Oct 11 01:15:16 crc kubenswrapper[4743]: I1011 01:15:16.049955 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mh2h\" (UniqueName: \"kubernetes.io/projected/254da181-6fe9-4682-bd43-816f813ba12e-kube-api-access-8mh2h\") pod \"aodh-0\" (UID: \"254da181-6fe9-4682-bd43-816f813ba12e\") " pod="openstack/aodh-0" Oct 11 01:15:16 crc kubenswrapper[4743]: I1011 01:15:16.050055 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/254da181-6fe9-4682-bd43-816f813ba12e-public-tls-certs\") pod \"aodh-0\" (UID: \"254da181-6fe9-4682-bd43-816f813ba12e\") " pod="openstack/aodh-0" Oct 11 01:15:16 crc kubenswrapper[4743]: I1011 01:15:16.050146 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/254da181-6fe9-4682-bd43-816f813ba12e-scripts\") pod \"aodh-0\" (UID: \"254da181-6fe9-4682-bd43-816f813ba12e\") " pod="openstack/aodh-0" Oct 11 01:15:16 crc kubenswrapper[4743]: I1011 01:15:16.050241 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/254da181-6fe9-4682-bd43-816f813ba12e-combined-ca-bundle\") pod \"aodh-0\" (UID: \"254da181-6fe9-4682-bd43-816f813ba12e\") " pod="openstack/aodh-0" Oct 11 01:15:16 crc kubenswrapper[4743]: I1011 01:15:16.050311 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/254da181-6fe9-4682-bd43-816f813ba12e-internal-tls-certs\") pod \"aodh-0\" (UID: \"254da181-6fe9-4682-bd43-816f813ba12e\") " pod="openstack/aodh-0" Oct 11 01:15:16 crc kubenswrapper[4743]: I1011 01:15:16.050364 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/254da181-6fe9-4682-bd43-816f813ba12e-config-data\") pod \"aodh-0\" (UID: \"254da181-6fe9-4682-bd43-816f813ba12e\") " pod="openstack/aodh-0" Oct 11 01:15:16 crc kubenswrapper[4743]: I1011 01:15:16.105919 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2500dccd-617d-4164-b9f6-5b675bab6848" path="/var/lib/kubelet/pods/2500dccd-617d-4164-b9f6-5b675bab6848/volumes" Oct 11 01:15:16 crc kubenswrapper[4743]: I1011 01:15:16.152839 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/254da181-6fe9-4682-bd43-816f813ba12e-scripts\") pod \"aodh-0\" (UID: \"254da181-6fe9-4682-bd43-816f813ba12e\") " pod="openstack/aodh-0" Oct 11 01:15:16 crc kubenswrapper[4743]: I1011 01:15:16.152941 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/254da181-6fe9-4682-bd43-816f813ba12e-combined-ca-bundle\") pod \"aodh-0\" (UID: \"254da181-6fe9-4682-bd43-816f813ba12e\") " pod="openstack/aodh-0" Oct 11 01:15:16 crc kubenswrapper[4743]: I1011 01:15:16.152977 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/254da181-6fe9-4682-bd43-816f813ba12e-internal-tls-certs\") pod \"aodh-0\" (UID: \"254da181-6fe9-4682-bd43-816f813ba12e\") " pod="openstack/aodh-0" Oct 11 01:15:16 crc kubenswrapper[4743]: I1011 01:15:16.153003 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/254da181-6fe9-4682-bd43-816f813ba12e-config-data\") pod \"aodh-0\" (UID: \"254da181-6fe9-4682-bd43-816f813ba12e\") " pod="openstack/aodh-0" Oct 11 01:15:16 crc kubenswrapper[4743]: I1011 01:15:16.153070 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mh2h\" (UniqueName: \"kubernetes.io/projected/254da181-6fe9-4682-bd43-816f813ba12e-kube-api-access-8mh2h\") pod \"aodh-0\" (UID: \"254da181-6fe9-4682-bd43-816f813ba12e\") " pod="openstack/aodh-0" Oct 11 01:15:16 crc kubenswrapper[4743]: I1011 01:15:16.153146 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/254da181-6fe9-4682-bd43-816f813ba12e-public-tls-certs\") pod \"aodh-0\" (UID: \"254da181-6fe9-4682-bd43-816f813ba12e\") " pod="openstack/aodh-0" Oct 11 01:15:16 crc kubenswrapper[4743]: I1011 01:15:16.157381 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/254da181-6fe9-4682-bd43-816f813ba12e-scripts\") pod \"aodh-0\" (UID: \"254da181-6fe9-4682-bd43-816f813ba12e\") " pod="openstack/aodh-0" Oct 11 01:15:16 crc kubenswrapper[4743]: I1011 01:15:16.157876 4743 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/254da181-6fe9-4682-bd43-816f813ba12e-config-data\") pod \"aodh-0\" (UID: \"254da181-6fe9-4682-bd43-816f813ba12e\") " pod="openstack/aodh-0" Oct 11 01:15:16 crc kubenswrapper[4743]: I1011 01:15:16.158012 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/254da181-6fe9-4682-bd43-816f813ba12e-public-tls-certs\") pod \"aodh-0\" (UID: \"254da181-6fe9-4682-bd43-816f813ba12e\") " pod="openstack/aodh-0" Oct 11 01:15:16 crc kubenswrapper[4743]: I1011 01:15:16.158484 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/254da181-6fe9-4682-bd43-816f813ba12e-internal-tls-certs\") pod \"aodh-0\" (UID: \"254da181-6fe9-4682-bd43-816f813ba12e\") " pod="openstack/aodh-0" Oct 11 01:15:16 crc kubenswrapper[4743]: I1011 01:15:16.158654 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/254da181-6fe9-4682-bd43-816f813ba12e-combined-ca-bundle\") pod \"aodh-0\" (UID: \"254da181-6fe9-4682-bd43-816f813ba12e\") " pod="openstack/aodh-0" Oct 11 01:15:16 crc kubenswrapper[4743]: I1011 01:15:16.174542 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mh2h\" (UniqueName: \"kubernetes.io/projected/254da181-6fe9-4682-bd43-816f813ba12e-kube-api-access-8mh2h\") pod \"aodh-0\" (UID: \"254da181-6fe9-4682-bd43-816f813ba12e\") " pod="openstack/aodh-0" Oct 11 01:15:16 crc kubenswrapper[4743]: I1011 01:15:16.299953 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Oct 11 01:15:16 crc kubenswrapper[4743]: I1011 01:15:16.585003 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 11 01:15:16 crc kubenswrapper[4743]: I1011 01:15:16.616374 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 11 01:15:16 crc kubenswrapper[4743]: I1011 01:15:16.810654 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Oct 11 01:15:16 crc kubenswrapper[4743]: W1011 01:15:16.811611 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod254da181_6fe9_4682_bd43_816f813ba12e.slice/crio-7099b2a1fa41c98fb35217617f117fd1dd2403a154d2738f63140eddafa32b82 WatchSource:0}: Error finding container 7099b2a1fa41c98fb35217617f117fd1dd2403a154d2738f63140eddafa32b82: Status 404 returned error can't find the container with id 7099b2a1fa41c98fb35217617f117fd1dd2403a154d2738f63140eddafa32b82 Oct 11 01:15:17 crc kubenswrapper[4743]: I1011 01:15:17.594547 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"254da181-6fe9-4682-bd43-816f813ba12e","Type":"ContainerStarted","Data":"d128ad37459eb5634a29cf39a7494af9679bd6902c3f6ee6b2216d3be40b3078"} Oct 11 01:15:17 crc kubenswrapper[4743]: I1011 01:15:17.594883 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"254da181-6fe9-4682-bd43-816f813ba12e","Type":"ContainerStarted","Data":"7099b2a1fa41c98fb35217617f117fd1dd2403a154d2738f63140eddafa32b82"} Oct 11 01:15:18 crc kubenswrapper[4743]: I1011 01:15:18.604975 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"254da181-6fe9-4682-bd43-816f813ba12e","Type":"ContainerStarted","Data":"033e2e558ddb6cb699a9662bc357732dd4a800294d81a4c8770f67688fb25170"} Oct 11 01:15:19 crc kubenswrapper[4743]: I1011 01:15:19.621763 
4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"254da181-6fe9-4682-bd43-816f813ba12e","Type":"ContainerStarted","Data":"da047fb9ecb7be96c74cda03df8686e154b6765b8ccdc04985e5f900ae0f7bea"} Oct 11 01:15:20 crc kubenswrapper[4743]: I1011 01:15:20.636851 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"254da181-6fe9-4682-bd43-816f813ba12e","Type":"ContainerStarted","Data":"e2821ec74f463bf28d96387c01c02ec061d3a17ef4b5482121c5f82f1a340d29"} Oct 11 01:15:20 crc kubenswrapper[4743]: I1011 01:15:20.685971 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=3.165513009 podStartE2EDuration="5.685950971s" podCreationTimestamp="2025-10-11 01:15:15 +0000 UTC" firstStartedPulling="2025-10-11 01:15:16.815026044 +0000 UTC m=+1411.468006441" lastFinishedPulling="2025-10-11 01:15:19.335464006 +0000 UTC m=+1413.988444403" observedRunningTime="2025-10-11 01:15:20.682119454 +0000 UTC m=+1415.335099851" watchObservedRunningTime="2025-10-11 01:15:20.685950971 +0000 UTC m=+1415.338931368" Oct 11 01:15:20 crc kubenswrapper[4743]: I1011 01:15:20.694942 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 11 01:15:20 crc kubenswrapper[4743]: I1011 01:15:20.695165 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="5cbb33ea-578f-4987-94cf-d6bf069a2953" containerName="kube-state-metrics" containerID="cri-o://45f7e650696ec5856a270b233a567caa087f6fd7ff31b79a6e5744eae8f8752c" gracePeriod=30 Oct 11 01:15:20 crc kubenswrapper[4743]: I1011 01:15:20.840268 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-0"] Oct 11 01:15:20 crc kubenswrapper[4743]: I1011 01:15:20.840656 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mysqld-exporter-0" 
podUID="4981c3d3-04e6-4e36-8a2c-fa34b65d8621" containerName="mysqld-exporter" containerID="cri-o://56e76dde75b702f858326d4fbaf27c11f7664ed304128b414db4e2e2b05e6fe4" gracePeriod=30 Oct 11 01:15:21 crc kubenswrapper[4743]: I1011 01:15:21.390275 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 11 01:15:21 crc kubenswrapper[4743]: I1011 01:15:21.470393 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45tn7\" (UniqueName: \"kubernetes.io/projected/5cbb33ea-578f-4987-94cf-d6bf069a2953-kube-api-access-45tn7\") pod \"5cbb33ea-578f-4987-94cf-d6bf069a2953\" (UID: \"5cbb33ea-578f-4987-94cf-d6bf069a2953\") " Oct 11 01:15:21 crc kubenswrapper[4743]: I1011 01:15:21.479748 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5cbb33ea-578f-4987-94cf-d6bf069a2953-kube-api-access-45tn7" (OuterVolumeSpecName: "kube-api-access-45tn7") pod "5cbb33ea-578f-4987-94cf-d6bf069a2953" (UID: "5cbb33ea-578f-4987-94cf-d6bf069a2953"). InnerVolumeSpecName "kube-api-access-45tn7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:15:21 crc kubenswrapper[4743]: I1011 01:15:21.572758 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45tn7\" (UniqueName: \"kubernetes.io/projected/5cbb33ea-578f-4987-94cf-d6bf069a2953-kube-api-access-45tn7\") on node \"crc\" DevicePath \"\"" Oct 11 01:15:21 crc kubenswrapper[4743]: I1011 01:15:21.653881 4743 generic.go:334] "Generic (PLEG): container finished" podID="5cbb33ea-578f-4987-94cf-d6bf069a2953" containerID="45f7e650696ec5856a270b233a567caa087f6fd7ff31b79a6e5744eae8f8752c" exitCode=2 Oct 11 01:15:21 crc kubenswrapper[4743]: I1011 01:15:21.654003 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5cbb33ea-578f-4987-94cf-d6bf069a2953","Type":"ContainerDied","Data":"45f7e650696ec5856a270b233a567caa087f6fd7ff31b79a6e5744eae8f8752c"} Oct 11 01:15:21 crc kubenswrapper[4743]: I1011 01:15:21.654063 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5cbb33ea-578f-4987-94cf-d6bf069a2953","Type":"ContainerDied","Data":"9291f4f91ed2ee5c893ac2b941c2e074511507b55442ae5f2bebd97bc81a9c4d"} Oct 11 01:15:21 crc kubenswrapper[4743]: I1011 01:15:21.654085 4743 scope.go:117] "RemoveContainer" containerID="45f7e650696ec5856a270b233a567caa087f6fd7ff31b79a6e5744eae8f8752c" Oct 11 01:15:21 crc kubenswrapper[4743]: I1011 01:15:21.654323 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 11 01:15:21 crc kubenswrapper[4743]: I1011 01:15:21.658744 4743 generic.go:334] "Generic (PLEG): container finished" podID="4981c3d3-04e6-4e36-8a2c-fa34b65d8621" containerID="56e76dde75b702f858326d4fbaf27c11f7664ed304128b414db4e2e2b05e6fe4" exitCode=2 Oct 11 01:15:21 crc kubenswrapper[4743]: I1011 01:15:21.663942 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"4981c3d3-04e6-4e36-8a2c-fa34b65d8621","Type":"ContainerDied","Data":"56e76dde75b702f858326d4fbaf27c11f7664ed304128b414db4e2e2b05e6fe4"} Oct 11 01:15:21 crc kubenswrapper[4743]: I1011 01:15:21.700104 4743 scope.go:117] "RemoveContainer" containerID="45f7e650696ec5856a270b233a567caa087f6fd7ff31b79a6e5744eae8f8752c" Oct 11 01:15:21 crc kubenswrapper[4743]: E1011 01:15:21.700707 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45f7e650696ec5856a270b233a567caa087f6fd7ff31b79a6e5744eae8f8752c\": container with ID starting with 45f7e650696ec5856a270b233a567caa087f6fd7ff31b79a6e5744eae8f8752c not found: ID does not exist" containerID="45f7e650696ec5856a270b233a567caa087f6fd7ff31b79a6e5744eae8f8752c" Oct 11 01:15:21 crc kubenswrapper[4743]: I1011 01:15:21.700770 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45f7e650696ec5856a270b233a567caa087f6fd7ff31b79a6e5744eae8f8752c"} err="failed to get container status \"45f7e650696ec5856a270b233a567caa087f6fd7ff31b79a6e5744eae8f8752c\": rpc error: code = NotFound desc = could not find container \"45f7e650696ec5856a270b233a567caa087f6fd7ff31b79a6e5744eae8f8752c\": container with ID starting with 45f7e650696ec5856a270b233a567caa087f6fd7ff31b79a6e5744eae8f8752c not found: ID does not exist" Oct 11 01:15:21 crc kubenswrapper[4743]: I1011 01:15:21.753934 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/kube-state-metrics-0"] Oct 11 01:15:21 crc kubenswrapper[4743]: I1011 01:15:21.798430 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 11 01:15:21 crc kubenswrapper[4743]: I1011 01:15:21.824330 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 11 01:15:21 crc kubenswrapper[4743]: E1011 01:15:21.825221 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cbb33ea-578f-4987-94cf-d6bf069a2953" containerName="kube-state-metrics" Oct 11 01:15:21 crc kubenswrapper[4743]: I1011 01:15:21.825239 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cbb33ea-578f-4987-94cf-d6bf069a2953" containerName="kube-state-metrics" Oct 11 01:15:21 crc kubenswrapper[4743]: I1011 01:15:21.825493 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="5cbb33ea-578f-4987-94cf-d6bf069a2953" containerName="kube-state-metrics" Oct 11 01:15:21 crc kubenswrapper[4743]: I1011 01:15:21.826621 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 11 01:15:21 crc kubenswrapper[4743]: I1011 01:15:21.829563 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Oct 11 01:15:21 crc kubenswrapper[4743]: I1011 01:15:21.829581 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Oct 11 01:15:21 crc kubenswrapper[4743]: I1011 01:15:21.834426 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 11 01:15:21 crc kubenswrapper[4743]: I1011 01:15:21.880177 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bbvj\" (UniqueName: \"kubernetes.io/projected/923b0fb7-1d93-491e-a1e0-73614b302fdb-kube-api-access-9bbvj\") pod \"kube-state-metrics-0\" (UID: \"923b0fb7-1d93-491e-a1e0-73614b302fdb\") " pod="openstack/kube-state-metrics-0" Oct 11 01:15:21 crc kubenswrapper[4743]: I1011 01:15:21.880315 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/923b0fb7-1d93-491e-a1e0-73614b302fdb-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"923b0fb7-1d93-491e-a1e0-73614b302fdb\") " pod="openstack/kube-state-metrics-0" Oct 11 01:15:21 crc kubenswrapper[4743]: I1011 01:15:21.880343 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/923b0fb7-1d93-491e-a1e0-73614b302fdb-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"923b0fb7-1d93-491e-a1e0-73614b302fdb\") " pod="openstack/kube-state-metrics-0" Oct 11 01:15:21 crc kubenswrapper[4743]: I1011 01:15:21.880395 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/923b0fb7-1d93-491e-a1e0-73614b302fdb-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"923b0fb7-1d93-491e-a1e0-73614b302fdb\") " pod="openstack/kube-state-metrics-0" Oct 11 01:15:21 crc kubenswrapper[4743]: I1011 01:15:21.983616 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bbvj\" (UniqueName: \"kubernetes.io/projected/923b0fb7-1d93-491e-a1e0-73614b302fdb-kube-api-access-9bbvj\") pod \"kube-state-metrics-0\" (UID: \"923b0fb7-1d93-491e-a1e0-73614b302fdb\") " pod="openstack/kube-state-metrics-0" Oct 11 01:15:21 crc kubenswrapper[4743]: I1011 01:15:21.984041 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/923b0fb7-1d93-491e-a1e0-73614b302fdb-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"923b0fb7-1d93-491e-a1e0-73614b302fdb\") " pod="openstack/kube-state-metrics-0" Oct 11 01:15:21 crc kubenswrapper[4743]: I1011 01:15:21.984166 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/923b0fb7-1d93-491e-a1e0-73614b302fdb-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"923b0fb7-1d93-491e-a1e0-73614b302fdb\") " pod="openstack/kube-state-metrics-0" Oct 11 01:15:21 crc kubenswrapper[4743]: I1011 01:15:21.984294 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/923b0fb7-1d93-491e-a1e0-73614b302fdb-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"923b0fb7-1d93-491e-a1e0-73614b302fdb\") " pod="openstack/kube-state-metrics-0" Oct 11 01:15:21 crc kubenswrapper[4743]: I1011 01:15:21.991336 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: 
\"kubernetes.io/secret/923b0fb7-1d93-491e-a1e0-73614b302fdb-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"923b0fb7-1d93-491e-a1e0-73614b302fdb\") " pod="openstack/kube-state-metrics-0" Oct 11 01:15:21 crc kubenswrapper[4743]: I1011 01:15:21.993442 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/923b0fb7-1d93-491e-a1e0-73614b302fdb-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"923b0fb7-1d93-491e-a1e0-73614b302fdb\") " pod="openstack/kube-state-metrics-0" Oct 11 01:15:21 crc kubenswrapper[4743]: I1011 01:15:21.997349 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/923b0fb7-1d93-491e-a1e0-73614b302fdb-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"923b0fb7-1d93-491e-a1e0-73614b302fdb\") " pod="openstack/kube-state-metrics-0" Oct 11 01:15:22 crc kubenswrapper[4743]: I1011 01:15:22.000981 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bbvj\" (UniqueName: \"kubernetes.io/projected/923b0fb7-1d93-491e-a1e0-73614b302fdb-kube-api-access-9bbvj\") pod \"kube-state-metrics-0\" (UID: \"923b0fb7-1d93-491e-a1e0-73614b302fdb\") " pod="openstack/kube-state-metrics-0" Oct 11 01:15:22 crc kubenswrapper[4743]: I1011 01:15:22.084954 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Oct 11 01:15:22 crc kubenswrapper[4743]: I1011 01:15:22.108436 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5cbb33ea-578f-4987-94cf-d6bf069a2953" path="/var/lib/kubelet/pods/5cbb33ea-578f-4987-94cf-d6bf069a2953/volumes" Oct 11 01:15:22 crc kubenswrapper[4743]: I1011 01:15:22.153660 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 11 01:15:22 crc kubenswrapper[4743]: I1011 01:15:22.186766 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4981c3d3-04e6-4e36-8a2c-fa34b65d8621-config-data\") pod \"4981c3d3-04e6-4e36-8a2c-fa34b65d8621\" (UID: \"4981c3d3-04e6-4e36-8a2c-fa34b65d8621\") " Oct 11 01:15:22 crc kubenswrapper[4743]: I1011 01:15:22.187069 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4981c3d3-04e6-4e36-8a2c-fa34b65d8621-combined-ca-bundle\") pod \"4981c3d3-04e6-4e36-8a2c-fa34b65d8621\" (UID: \"4981c3d3-04e6-4e36-8a2c-fa34b65d8621\") " Oct 11 01:15:22 crc kubenswrapper[4743]: I1011 01:15:22.187115 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wtd5s\" (UniqueName: \"kubernetes.io/projected/4981c3d3-04e6-4e36-8a2c-fa34b65d8621-kube-api-access-wtd5s\") pod \"4981c3d3-04e6-4e36-8a2c-fa34b65d8621\" (UID: \"4981c3d3-04e6-4e36-8a2c-fa34b65d8621\") " Oct 11 01:15:22 crc kubenswrapper[4743]: I1011 01:15:22.191639 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4981c3d3-04e6-4e36-8a2c-fa34b65d8621-kube-api-access-wtd5s" (OuterVolumeSpecName: "kube-api-access-wtd5s") pod "4981c3d3-04e6-4e36-8a2c-fa34b65d8621" (UID: "4981c3d3-04e6-4e36-8a2c-fa34b65d8621"). InnerVolumeSpecName "kube-api-access-wtd5s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:15:22 crc kubenswrapper[4743]: I1011 01:15:22.265965 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4981c3d3-04e6-4e36-8a2c-fa34b65d8621-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4981c3d3-04e6-4e36-8a2c-fa34b65d8621" (UID: "4981c3d3-04e6-4e36-8a2c-fa34b65d8621"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:15:22 crc kubenswrapper[4743]: I1011 01:15:22.289998 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4981c3d3-04e6-4e36-8a2c-fa34b65d8621-config-data" (OuterVolumeSpecName: "config-data") pod "4981c3d3-04e6-4e36-8a2c-fa34b65d8621" (UID: "4981c3d3-04e6-4e36-8a2c-fa34b65d8621"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:15:22 crc kubenswrapper[4743]: I1011 01:15:22.298469 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4981c3d3-04e6-4e36-8a2c-fa34b65d8621-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 01:15:22 crc kubenswrapper[4743]: I1011 01:15:22.298500 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wtd5s\" (UniqueName: \"kubernetes.io/projected/4981c3d3-04e6-4e36-8a2c-fa34b65d8621-kube-api-access-wtd5s\") on node \"crc\" DevicePath \"\"" Oct 11 01:15:22 crc kubenswrapper[4743]: I1011 01:15:22.298523 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4981c3d3-04e6-4e36-8a2c-fa34b65d8621-config-data\") on node \"crc\" DevicePath \"\"" Oct 11 01:15:22 crc kubenswrapper[4743]: I1011 01:15:22.670931 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"4981c3d3-04e6-4e36-8a2c-fa34b65d8621","Type":"ContainerDied","Data":"6eebfb975ac36d7a6d0cbcbb162a2e9d833635fcd05ceaba60ac7a13d30de7f5"} Oct 11 01:15:22 crc kubenswrapper[4743]: I1011 01:15:22.670992 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-0" Oct 11 01:15:22 crc kubenswrapper[4743]: I1011 01:15:22.671025 4743 scope.go:117] "RemoveContainer" containerID="56e76dde75b702f858326d4fbaf27c11f7664ed304128b414db4e2e2b05e6fe4" Oct 11 01:15:22 crc kubenswrapper[4743]: I1011 01:15:22.709920 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-0"] Oct 11 01:15:22 crc kubenswrapper[4743]: I1011 01:15:22.732850 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-0"] Oct 11 01:15:22 crc kubenswrapper[4743]: I1011 01:15:22.744523 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-0"] Oct 11 01:15:22 crc kubenswrapper[4743]: E1011 01:15:22.745219 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4981c3d3-04e6-4e36-8a2c-fa34b65d8621" containerName="mysqld-exporter" Oct 11 01:15:22 crc kubenswrapper[4743]: I1011 01:15:22.745240 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="4981c3d3-04e6-4e36-8a2c-fa34b65d8621" containerName="mysqld-exporter" Oct 11 01:15:22 crc kubenswrapper[4743]: I1011 01:15:22.745489 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="4981c3d3-04e6-4e36-8a2c-fa34b65d8621" containerName="mysqld-exporter" Oct 11 01:15:22 crc kubenswrapper[4743]: I1011 01:15:22.746499 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-0" Oct 11 01:15:22 crc kubenswrapper[4743]: I1011 01:15:22.750592 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-config-data" Oct 11 01:15:22 crc kubenswrapper[4743]: I1011 01:15:22.750775 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-mysqld-exporter-svc" Oct 11 01:15:22 crc kubenswrapper[4743]: I1011 01:15:22.759060 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Oct 11 01:15:22 crc kubenswrapper[4743]: W1011 01:15:22.790617 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod923b0fb7_1d93_491e_a1e0_73614b302fdb.slice/crio-c17e3e4cb6f70aa003898cd8df9516cfeda93555531348727c744d1ece4b27c2 WatchSource:0}: Error finding container c17e3e4cb6f70aa003898cd8df9516cfeda93555531348727c744d1ece4b27c2: Status 404 returned error can't find the container with id c17e3e4cb6f70aa003898cd8df9516cfeda93555531348727c744d1ece4b27c2 Oct 11 01:15:22 crc kubenswrapper[4743]: I1011 01:15:22.791702 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 11 01:15:22 crc kubenswrapper[4743]: I1011 01:15:22.817374 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5t75m\" (UniqueName: \"kubernetes.io/projected/50798e93-52c7-4ee3-b94a-295fbcc7eeba-kube-api-access-5t75m\") pod \"mysqld-exporter-0\" (UID: \"50798e93-52c7-4ee3-b94a-295fbcc7eeba\") " pod="openstack/mysqld-exporter-0" Oct 11 01:15:22 crc kubenswrapper[4743]: I1011 01:15:22.817844 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50798e93-52c7-4ee3-b94a-295fbcc7eeba-config-data\") pod \"mysqld-exporter-0\" (UID: \"50798e93-52c7-4ee3-b94a-295fbcc7eeba\") " 
pod="openstack/mysqld-exporter-0" Oct 11 01:15:22 crc kubenswrapper[4743]: I1011 01:15:22.817908 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mysqld-exporter-tls-certs\" (UniqueName: \"kubernetes.io/secret/50798e93-52c7-4ee3-b94a-295fbcc7eeba-mysqld-exporter-tls-certs\") pod \"mysqld-exporter-0\" (UID: \"50798e93-52c7-4ee3-b94a-295fbcc7eeba\") " pod="openstack/mysqld-exporter-0" Oct 11 01:15:22 crc kubenswrapper[4743]: I1011 01:15:22.818067 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50798e93-52c7-4ee3-b94a-295fbcc7eeba-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"50798e93-52c7-4ee3-b94a-295fbcc7eeba\") " pod="openstack/mysqld-exporter-0" Oct 11 01:15:22 crc kubenswrapper[4743]: I1011 01:15:22.920036 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50798e93-52c7-4ee3-b94a-295fbcc7eeba-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"50798e93-52c7-4ee3-b94a-295fbcc7eeba\") " pod="openstack/mysqld-exporter-0" Oct 11 01:15:22 crc kubenswrapper[4743]: I1011 01:15:22.920250 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5t75m\" (UniqueName: \"kubernetes.io/projected/50798e93-52c7-4ee3-b94a-295fbcc7eeba-kube-api-access-5t75m\") pod \"mysqld-exporter-0\" (UID: \"50798e93-52c7-4ee3-b94a-295fbcc7eeba\") " pod="openstack/mysqld-exporter-0" Oct 11 01:15:22 crc kubenswrapper[4743]: I1011 01:15:22.920296 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50798e93-52c7-4ee3-b94a-295fbcc7eeba-config-data\") pod \"mysqld-exporter-0\" (UID: \"50798e93-52c7-4ee3-b94a-295fbcc7eeba\") " pod="openstack/mysqld-exporter-0" Oct 11 01:15:22 crc kubenswrapper[4743]: I1011 
01:15:22.920321 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mysqld-exporter-tls-certs\" (UniqueName: \"kubernetes.io/secret/50798e93-52c7-4ee3-b94a-295fbcc7eeba-mysqld-exporter-tls-certs\") pod \"mysqld-exporter-0\" (UID: \"50798e93-52c7-4ee3-b94a-295fbcc7eeba\") " pod="openstack/mysqld-exporter-0" Oct 11 01:15:22 crc kubenswrapper[4743]: I1011 01:15:22.930108 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50798e93-52c7-4ee3-b94a-295fbcc7eeba-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"50798e93-52c7-4ee3-b94a-295fbcc7eeba\") " pod="openstack/mysqld-exporter-0" Oct 11 01:15:22 crc kubenswrapper[4743]: I1011 01:15:22.930250 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50798e93-52c7-4ee3-b94a-295fbcc7eeba-config-data\") pod \"mysqld-exporter-0\" (UID: \"50798e93-52c7-4ee3-b94a-295fbcc7eeba\") " pod="openstack/mysqld-exporter-0" Oct 11 01:15:22 crc kubenswrapper[4743]: I1011 01:15:22.943571 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5t75m\" (UniqueName: \"kubernetes.io/projected/50798e93-52c7-4ee3-b94a-295fbcc7eeba-kube-api-access-5t75m\") pod \"mysqld-exporter-0\" (UID: \"50798e93-52c7-4ee3-b94a-295fbcc7eeba\") " pod="openstack/mysqld-exporter-0" Oct 11 01:15:22 crc kubenswrapper[4743]: I1011 01:15:22.951460 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mysqld-exporter-tls-certs\" (UniqueName: \"kubernetes.io/secret/50798e93-52c7-4ee3-b94a-295fbcc7eeba-mysqld-exporter-tls-certs\") pod \"mysqld-exporter-0\" (UID: \"50798e93-52c7-4ee3-b94a-295fbcc7eeba\") " pod="openstack/mysqld-exporter-0" Oct 11 01:15:23 crc kubenswrapper[4743]: I1011 01:15:23.068501 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-0" Oct 11 01:15:23 crc kubenswrapper[4743]: I1011 01:15:23.295832 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 11 01:15:23 crc kubenswrapper[4743]: I1011 01:15:23.296647 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6e83a1d4-6bd8-42a3-bb28-e287346008eb" containerName="ceilometer-central-agent" containerID="cri-o://07c51e8b3fe3126872d6ed9d8747571270d6daa2db06adcb8ed6c8bebb399abe" gracePeriod=30 Oct 11 01:15:23 crc kubenswrapper[4743]: I1011 01:15:23.297835 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6e83a1d4-6bd8-42a3-bb28-e287346008eb" containerName="proxy-httpd" containerID="cri-o://9d5adba1ea7452bbfa5bb0da625a06575600b71d527247e283031273dfb30694" gracePeriod=30 Oct 11 01:15:23 crc kubenswrapper[4743]: I1011 01:15:23.297968 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6e83a1d4-6bd8-42a3-bb28-e287346008eb" containerName="ceilometer-notification-agent" containerID="cri-o://796327c276dad74f5ee54a09941f46c4f158e69a7744a237b421738162fb00ef" gracePeriod=30 Oct 11 01:15:23 crc kubenswrapper[4743]: I1011 01:15:23.298035 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6e83a1d4-6bd8-42a3-bb28-e287346008eb" containerName="sg-core" containerID="cri-o://52bbd5b0da23de52947433753c9d905103f50933a78d3511cb78abff8c33ad6c" gracePeriod=30 Oct 11 01:15:23 crc kubenswrapper[4743]: I1011 01:15:23.635145 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Oct 11 01:15:23 crc kubenswrapper[4743]: W1011 01:15:23.644552 4743 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50798e93_52c7_4ee3_b94a_295fbcc7eeba.slice/crio-6d0ac1587bf2f83db00919740e57a31b9e27797c465ecf9153f6bedd29494715 WatchSource:0}: Error finding container 6d0ac1587bf2f83db00919740e57a31b9e27797c465ecf9153f6bedd29494715: Status 404 returned error can't find the container with id 6d0ac1587bf2f83db00919740e57a31b9e27797c465ecf9153f6bedd29494715 Oct 11 01:15:23 crc kubenswrapper[4743]: I1011 01:15:23.687309 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"923b0fb7-1d93-491e-a1e0-73614b302fdb","Type":"ContainerStarted","Data":"3aea738772b33060b27f60b45723453809767cc02cd6ad4fd971c7b11e838f51"} Oct 11 01:15:23 crc kubenswrapper[4743]: I1011 01:15:23.687375 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"923b0fb7-1d93-491e-a1e0-73614b302fdb","Type":"ContainerStarted","Data":"c17e3e4cb6f70aa003898cd8df9516cfeda93555531348727c744d1ece4b27c2"} Oct 11 01:15:23 crc kubenswrapper[4743]: I1011 01:15:23.687433 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 11 01:15:23 crc kubenswrapper[4743]: I1011 01:15:23.688613 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"50798e93-52c7-4ee3-b94a-295fbcc7eeba","Type":"ContainerStarted","Data":"6d0ac1587bf2f83db00919740e57a31b9e27797c465ecf9153f6bedd29494715"} Oct 11 01:15:23 crc kubenswrapper[4743]: I1011 01:15:23.691606 4743 generic.go:334] "Generic (PLEG): container finished" podID="6e83a1d4-6bd8-42a3-bb28-e287346008eb" containerID="9d5adba1ea7452bbfa5bb0da625a06575600b71d527247e283031273dfb30694" exitCode=0 Oct 11 01:15:23 crc kubenswrapper[4743]: I1011 01:15:23.691640 4743 generic.go:334] "Generic (PLEG): container finished" podID="6e83a1d4-6bd8-42a3-bb28-e287346008eb" 
containerID="52bbd5b0da23de52947433753c9d905103f50933a78d3511cb78abff8c33ad6c" exitCode=2 Oct 11 01:15:23 crc kubenswrapper[4743]: I1011 01:15:23.691659 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e83a1d4-6bd8-42a3-bb28-e287346008eb","Type":"ContainerDied","Data":"9d5adba1ea7452bbfa5bb0da625a06575600b71d527247e283031273dfb30694"} Oct 11 01:15:23 crc kubenswrapper[4743]: I1011 01:15:23.691691 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e83a1d4-6bd8-42a3-bb28-e287346008eb","Type":"ContainerDied","Data":"52bbd5b0da23de52947433753c9d905103f50933a78d3511cb78abff8c33ad6c"} Oct 11 01:15:23 crc kubenswrapper[4743]: I1011 01:15:23.713686 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.33720667 podStartE2EDuration="2.713662711s" podCreationTimestamp="2025-10-11 01:15:21 +0000 UTC" firstStartedPulling="2025-10-11 01:15:22.793781158 +0000 UTC m=+1417.446761555" lastFinishedPulling="2025-10-11 01:15:23.170237199 +0000 UTC m=+1417.823217596" observedRunningTime="2025-10-11 01:15:23.701259073 +0000 UTC m=+1418.354239480" watchObservedRunningTime="2025-10-11 01:15:23.713662711 +0000 UTC m=+1418.366643118" Oct 11 01:15:24 crc kubenswrapper[4743]: I1011 01:15:24.146561 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4981c3d3-04e6-4e36-8a2c-fa34b65d8621" path="/var/lib/kubelet/pods/4981c3d3-04e6-4e36-8a2c-fa34b65d8621/volumes" Oct 11 01:15:24 crc kubenswrapper[4743]: I1011 01:15:24.705169 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"50798e93-52c7-4ee3-b94a-295fbcc7eeba","Type":"ContainerStarted","Data":"23c69e1098648067ec292ef57a00d160e5dd7fa665d5c28ec9f8e966363b98da"} Oct 11 01:15:24 crc kubenswrapper[4743]: I1011 01:15:24.707613 4743 generic.go:334] "Generic (PLEG): container finished" 
podID="6e83a1d4-6bd8-42a3-bb28-e287346008eb" containerID="07c51e8b3fe3126872d6ed9d8747571270d6daa2db06adcb8ed6c8bebb399abe" exitCode=0 Oct 11 01:15:24 crc kubenswrapper[4743]: I1011 01:15:24.707673 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e83a1d4-6bd8-42a3-bb28-e287346008eb","Type":"ContainerDied","Data":"07c51e8b3fe3126872d6ed9d8747571270d6daa2db06adcb8ed6c8bebb399abe"} Oct 11 01:15:24 crc kubenswrapper[4743]: I1011 01:15:24.719506 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-0" podStartSLOduration=2.175774067 podStartE2EDuration="2.719495385s" podCreationTimestamp="2025-10-11 01:15:22 +0000 UTC" firstStartedPulling="2025-10-11 01:15:23.650267558 +0000 UTC m=+1418.303247955" lastFinishedPulling="2025-10-11 01:15:24.193988856 +0000 UTC m=+1418.846969273" observedRunningTime="2025-10-11 01:15:24.718365685 +0000 UTC m=+1419.371346102" watchObservedRunningTime="2025-10-11 01:15:24.719495385 +0000 UTC m=+1419.372475782" Oct 11 01:15:30 crc kubenswrapper[4743]: I1011 01:15:30.768885 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 11 01:15:30 crc kubenswrapper[4743]: I1011 01:15:30.824725 4743 generic.go:334] "Generic (PLEG): container finished" podID="6e83a1d4-6bd8-42a3-bb28-e287346008eb" containerID="796327c276dad74f5ee54a09941f46c4f158e69a7744a237b421738162fb00ef" exitCode=0 Oct 11 01:15:30 crc kubenswrapper[4743]: I1011 01:15:30.824770 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e83a1d4-6bd8-42a3-bb28-e287346008eb","Type":"ContainerDied","Data":"796327c276dad74f5ee54a09941f46c4f158e69a7744a237b421738162fb00ef"} Oct 11 01:15:30 crc kubenswrapper[4743]: I1011 01:15:30.824797 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e83a1d4-6bd8-42a3-bb28-e287346008eb","Type":"ContainerDied","Data":"c544add559f535a3bac8c2f15320fac6c07420bc9bf9cab1f9632f240d303984"} Oct 11 01:15:30 crc kubenswrapper[4743]: I1011 01:15:30.824814 4743 scope.go:117] "RemoveContainer" containerID="9d5adba1ea7452bbfa5bb0da625a06575600b71d527247e283031273dfb30694" Oct 11 01:15:30 crc kubenswrapper[4743]: I1011 01:15:30.824970 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 11 01:15:30 crc kubenswrapper[4743]: I1011 01:15:30.859520 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e83a1d4-6bd8-42a3-bb28-e287346008eb-config-data\") pod \"6e83a1d4-6bd8-42a3-bb28-e287346008eb\" (UID: \"6e83a1d4-6bd8-42a3-bb28-e287346008eb\") " Oct 11 01:15:30 crc kubenswrapper[4743]: I1011 01:15:30.859624 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxkgt\" (UniqueName: \"kubernetes.io/projected/6e83a1d4-6bd8-42a3-bb28-e287346008eb-kube-api-access-xxkgt\") pod \"6e83a1d4-6bd8-42a3-bb28-e287346008eb\" (UID: \"6e83a1d4-6bd8-42a3-bb28-e287346008eb\") " Oct 11 01:15:30 crc kubenswrapper[4743]: I1011 01:15:30.859647 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e83a1d4-6bd8-42a3-bb28-e287346008eb-log-httpd\") pod \"6e83a1d4-6bd8-42a3-bb28-e287346008eb\" (UID: \"6e83a1d4-6bd8-42a3-bb28-e287346008eb\") " Oct 11 01:15:30 crc kubenswrapper[4743]: I1011 01:15:30.859687 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e83a1d4-6bd8-42a3-bb28-e287346008eb-run-httpd\") pod \"6e83a1d4-6bd8-42a3-bb28-e287346008eb\" (UID: \"6e83a1d4-6bd8-42a3-bb28-e287346008eb\") " Oct 11 01:15:30 crc kubenswrapper[4743]: I1011 01:15:30.859745 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6e83a1d4-6bd8-42a3-bb28-e287346008eb-sg-core-conf-yaml\") pod \"6e83a1d4-6bd8-42a3-bb28-e287346008eb\" (UID: \"6e83a1d4-6bd8-42a3-bb28-e287346008eb\") " Oct 11 01:15:30 crc kubenswrapper[4743]: I1011 01:15:30.859863 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6e83a1d4-6bd8-42a3-bb28-e287346008eb-combined-ca-bundle\") pod \"6e83a1d4-6bd8-42a3-bb28-e287346008eb\" (UID: \"6e83a1d4-6bd8-42a3-bb28-e287346008eb\") " Oct 11 01:15:30 crc kubenswrapper[4743]: I1011 01:15:30.859910 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e83a1d4-6bd8-42a3-bb28-e287346008eb-scripts\") pod \"6e83a1d4-6bd8-42a3-bb28-e287346008eb\" (UID: \"6e83a1d4-6bd8-42a3-bb28-e287346008eb\") " Oct 11 01:15:30 crc kubenswrapper[4743]: I1011 01:15:30.862173 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e83a1d4-6bd8-42a3-bb28-e287346008eb-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6e83a1d4-6bd8-42a3-bb28-e287346008eb" (UID: "6e83a1d4-6bd8-42a3-bb28-e287346008eb"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:15:30 crc kubenswrapper[4743]: I1011 01:15:30.862540 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e83a1d4-6bd8-42a3-bb28-e287346008eb-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6e83a1d4-6bd8-42a3-bb28-e287346008eb" (UID: "6e83a1d4-6bd8-42a3-bb28-e287346008eb"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:15:30 crc kubenswrapper[4743]: I1011 01:15:30.874516 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e83a1d4-6bd8-42a3-bb28-e287346008eb-scripts" (OuterVolumeSpecName: "scripts") pod "6e83a1d4-6bd8-42a3-bb28-e287346008eb" (UID: "6e83a1d4-6bd8-42a3-bb28-e287346008eb"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:15:30 crc kubenswrapper[4743]: I1011 01:15:30.875032 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e83a1d4-6bd8-42a3-bb28-e287346008eb-kube-api-access-xxkgt" (OuterVolumeSpecName: "kube-api-access-xxkgt") pod "6e83a1d4-6bd8-42a3-bb28-e287346008eb" (UID: "6e83a1d4-6bd8-42a3-bb28-e287346008eb"). InnerVolumeSpecName "kube-api-access-xxkgt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:15:30 crc kubenswrapper[4743]: I1011 01:15:30.898185 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cxgk5"] Oct 11 01:15:30 crc kubenswrapper[4743]: I1011 01:15:30.898216 4743 scope.go:117] "RemoveContainer" containerID="52bbd5b0da23de52947433753c9d905103f50933a78d3511cb78abff8c33ad6c" Oct 11 01:15:30 crc kubenswrapper[4743]: E1011 01:15:30.898791 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e83a1d4-6bd8-42a3-bb28-e287346008eb" containerName="proxy-httpd" Oct 11 01:15:30 crc kubenswrapper[4743]: I1011 01:15:30.899315 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e83a1d4-6bd8-42a3-bb28-e287346008eb" containerName="proxy-httpd" Oct 11 01:15:30 crc kubenswrapper[4743]: E1011 01:15:30.899393 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e83a1d4-6bd8-42a3-bb28-e287346008eb" containerName="ceilometer-notification-agent" Oct 11 01:15:30 crc kubenswrapper[4743]: I1011 01:15:30.899445 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e83a1d4-6bd8-42a3-bb28-e287346008eb" containerName="ceilometer-notification-agent" Oct 11 01:15:30 crc kubenswrapper[4743]: E1011 01:15:30.899497 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e83a1d4-6bd8-42a3-bb28-e287346008eb" containerName="sg-core" Oct 11 01:15:30 crc kubenswrapper[4743]: I1011 01:15:30.899555 4743 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="6e83a1d4-6bd8-42a3-bb28-e287346008eb" containerName="sg-core" Oct 11 01:15:30 crc kubenswrapper[4743]: E1011 01:15:30.899639 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e83a1d4-6bd8-42a3-bb28-e287346008eb" containerName="ceilometer-central-agent" Oct 11 01:15:30 crc kubenswrapper[4743]: I1011 01:15:30.899690 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e83a1d4-6bd8-42a3-bb28-e287346008eb" containerName="ceilometer-central-agent" Oct 11 01:15:30 crc kubenswrapper[4743]: I1011 01:15:30.900021 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e83a1d4-6bd8-42a3-bb28-e287346008eb" containerName="ceilometer-central-agent" Oct 11 01:15:30 crc kubenswrapper[4743]: I1011 01:15:30.900115 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e83a1d4-6bd8-42a3-bb28-e287346008eb" containerName="proxy-httpd" Oct 11 01:15:30 crc kubenswrapper[4743]: I1011 01:15:30.900189 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e83a1d4-6bd8-42a3-bb28-e287346008eb" containerName="ceilometer-notification-agent" Oct 11 01:15:30 crc kubenswrapper[4743]: I1011 01:15:30.900259 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e83a1d4-6bd8-42a3-bb28-e287346008eb" containerName="sg-core" Oct 11 01:15:30 crc kubenswrapper[4743]: I1011 01:15:30.901958 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cxgk5" Oct 11 01:15:30 crc kubenswrapper[4743]: I1011 01:15:30.917448 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cxgk5"] Oct 11 01:15:30 crc kubenswrapper[4743]: I1011 01:15:30.929168 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e83a1d4-6bd8-42a3-bb28-e287346008eb-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6e83a1d4-6bd8-42a3-bb28-e287346008eb" (UID: "6e83a1d4-6bd8-42a3-bb28-e287346008eb"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:15:30 crc kubenswrapper[4743]: I1011 01:15:30.961752 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d980b77a-a725-400f-a2e8-d60c517ea3ca-catalog-content\") pod \"redhat-operators-cxgk5\" (UID: \"d980b77a-a725-400f-a2e8-d60c517ea3ca\") " pod="openshift-marketplace/redhat-operators-cxgk5" Oct 11 01:15:30 crc kubenswrapper[4743]: I1011 01:15:30.961987 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d980b77a-a725-400f-a2e8-d60c517ea3ca-utilities\") pod \"redhat-operators-cxgk5\" (UID: \"d980b77a-a725-400f-a2e8-d60c517ea3ca\") " pod="openshift-marketplace/redhat-operators-cxgk5" Oct 11 01:15:30 crc kubenswrapper[4743]: I1011 01:15:30.962150 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cx9bm\" (UniqueName: \"kubernetes.io/projected/d980b77a-a725-400f-a2e8-d60c517ea3ca-kube-api-access-cx9bm\") pod \"redhat-operators-cxgk5\" (UID: \"d980b77a-a725-400f-a2e8-d60c517ea3ca\") " pod="openshift-marketplace/redhat-operators-cxgk5" Oct 11 01:15:30 crc kubenswrapper[4743]: I1011 01:15:30.962326 4743 
reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6e83a1d4-6bd8-42a3-bb28-e287346008eb-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 11 01:15:30 crc kubenswrapper[4743]: I1011 01:15:30.962398 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e83a1d4-6bd8-42a3-bb28-e287346008eb-scripts\") on node \"crc\" DevicePath \"\"" Oct 11 01:15:30 crc kubenswrapper[4743]: I1011 01:15:30.962452 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxkgt\" (UniqueName: \"kubernetes.io/projected/6e83a1d4-6bd8-42a3-bb28-e287346008eb-kube-api-access-xxkgt\") on node \"crc\" DevicePath \"\"" Oct 11 01:15:30 crc kubenswrapper[4743]: I1011 01:15:30.962508 4743 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e83a1d4-6bd8-42a3-bb28-e287346008eb-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 11 01:15:30 crc kubenswrapper[4743]: I1011 01:15:30.962559 4743 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e83a1d4-6bd8-42a3-bb28-e287346008eb-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 11 01:15:30 crc kubenswrapper[4743]: I1011 01:15:30.983099 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e83a1d4-6bd8-42a3-bb28-e287346008eb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6e83a1d4-6bd8-42a3-bb28-e287346008eb" (UID: "6e83a1d4-6bd8-42a3-bb28-e287346008eb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:15:31 crc kubenswrapper[4743]: I1011 01:15:31.001589 4743 scope.go:117] "RemoveContainer" containerID="796327c276dad74f5ee54a09941f46c4f158e69a7744a237b421738162fb00ef" Oct 11 01:15:31 crc kubenswrapper[4743]: I1011 01:15:31.040078 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e83a1d4-6bd8-42a3-bb28-e287346008eb-config-data" (OuterVolumeSpecName: "config-data") pod "6e83a1d4-6bd8-42a3-bb28-e287346008eb" (UID: "6e83a1d4-6bd8-42a3-bb28-e287346008eb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:15:31 crc kubenswrapper[4743]: I1011 01:15:31.045688 4743 scope.go:117] "RemoveContainer" containerID="07c51e8b3fe3126872d6ed9d8747571270d6daa2db06adcb8ed6c8bebb399abe" Oct 11 01:15:31 crc kubenswrapper[4743]: I1011 01:15:31.063582 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cx9bm\" (UniqueName: \"kubernetes.io/projected/d980b77a-a725-400f-a2e8-d60c517ea3ca-kube-api-access-cx9bm\") pod \"redhat-operators-cxgk5\" (UID: \"d980b77a-a725-400f-a2e8-d60c517ea3ca\") " pod="openshift-marketplace/redhat-operators-cxgk5" Oct 11 01:15:31 crc kubenswrapper[4743]: I1011 01:15:31.063749 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d980b77a-a725-400f-a2e8-d60c517ea3ca-catalog-content\") pod \"redhat-operators-cxgk5\" (UID: \"d980b77a-a725-400f-a2e8-d60c517ea3ca\") " pod="openshift-marketplace/redhat-operators-cxgk5" Oct 11 01:15:31 crc kubenswrapper[4743]: I1011 01:15:31.063788 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d980b77a-a725-400f-a2e8-d60c517ea3ca-utilities\") pod \"redhat-operators-cxgk5\" (UID: \"d980b77a-a725-400f-a2e8-d60c517ea3ca\") " 
pod="openshift-marketplace/redhat-operators-cxgk5" Oct 11 01:15:31 crc kubenswrapper[4743]: I1011 01:15:31.063839 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e83a1d4-6bd8-42a3-bb28-e287346008eb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 01:15:31 crc kubenswrapper[4743]: I1011 01:15:31.063871 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e83a1d4-6bd8-42a3-bb28-e287346008eb-config-data\") on node \"crc\" DevicePath \"\"" Oct 11 01:15:31 crc kubenswrapper[4743]: I1011 01:15:31.064231 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d980b77a-a725-400f-a2e8-d60c517ea3ca-catalog-content\") pod \"redhat-operators-cxgk5\" (UID: \"d980b77a-a725-400f-a2e8-d60c517ea3ca\") " pod="openshift-marketplace/redhat-operators-cxgk5" Oct 11 01:15:31 crc kubenswrapper[4743]: I1011 01:15:31.064373 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d980b77a-a725-400f-a2e8-d60c517ea3ca-utilities\") pod \"redhat-operators-cxgk5\" (UID: \"d980b77a-a725-400f-a2e8-d60c517ea3ca\") " pod="openshift-marketplace/redhat-operators-cxgk5" Oct 11 01:15:31 crc kubenswrapper[4743]: I1011 01:15:31.069683 4743 scope.go:117] "RemoveContainer" containerID="9d5adba1ea7452bbfa5bb0da625a06575600b71d527247e283031273dfb30694" Oct 11 01:15:31 crc kubenswrapper[4743]: E1011 01:15:31.070294 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d5adba1ea7452bbfa5bb0da625a06575600b71d527247e283031273dfb30694\": container with ID starting with 9d5adba1ea7452bbfa5bb0da625a06575600b71d527247e283031273dfb30694 not found: ID does not exist" containerID="9d5adba1ea7452bbfa5bb0da625a06575600b71d527247e283031273dfb30694" Oct 11 01:15:31 
crc kubenswrapper[4743]: I1011 01:15:31.070335 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d5adba1ea7452bbfa5bb0da625a06575600b71d527247e283031273dfb30694"} err="failed to get container status \"9d5adba1ea7452bbfa5bb0da625a06575600b71d527247e283031273dfb30694\": rpc error: code = NotFound desc = could not find container \"9d5adba1ea7452bbfa5bb0da625a06575600b71d527247e283031273dfb30694\": container with ID starting with 9d5adba1ea7452bbfa5bb0da625a06575600b71d527247e283031273dfb30694 not found: ID does not exist" Oct 11 01:15:31 crc kubenswrapper[4743]: I1011 01:15:31.070362 4743 scope.go:117] "RemoveContainer" containerID="52bbd5b0da23de52947433753c9d905103f50933a78d3511cb78abff8c33ad6c" Oct 11 01:15:31 crc kubenswrapper[4743]: E1011 01:15:31.070769 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52bbd5b0da23de52947433753c9d905103f50933a78d3511cb78abff8c33ad6c\": container with ID starting with 52bbd5b0da23de52947433753c9d905103f50933a78d3511cb78abff8c33ad6c not found: ID does not exist" containerID="52bbd5b0da23de52947433753c9d905103f50933a78d3511cb78abff8c33ad6c" Oct 11 01:15:31 crc kubenswrapper[4743]: I1011 01:15:31.070822 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52bbd5b0da23de52947433753c9d905103f50933a78d3511cb78abff8c33ad6c"} err="failed to get container status \"52bbd5b0da23de52947433753c9d905103f50933a78d3511cb78abff8c33ad6c\": rpc error: code = NotFound desc = could not find container \"52bbd5b0da23de52947433753c9d905103f50933a78d3511cb78abff8c33ad6c\": container with ID starting with 52bbd5b0da23de52947433753c9d905103f50933a78d3511cb78abff8c33ad6c not found: ID does not exist" Oct 11 01:15:31 crc kubenswrapper[4743]: I1011 01:15:31.070866 4743 scope.go:117] "RemoveContainer" containerID="796327c276dad74f5ee54a09941f46c4f158e69a7744a237b421738162fb00ef" Oct 11 
01:15:31 crc kubenswrapper[4743]: E1011 01:15:31.071366 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"796327c276dad74f5ee54a09941f46c4f158e69a7744a237b421738162fb00ef\": container with ID starting with 796327c276dad74f5ee54a09941f46c4f158e69a7744a237b421738162fb00ef not found: ID does not exist" containerID="796327c276dad74f5ee54a09941f46c4f158e69a7744a237b421738162fb00ef" Oct 11 01:15:31 crc kubenswrapper[4743]: I1011 01:15:31.071402 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"796327c276dad74f5ee54a09941f46c4f158e69a7744a237b421738162fb00ef"} err="failed to get container status \"796327c276dad74f5ee54a09941f46c4f158e69a7744a237b421738162fb00ef\": rpc error: code = NotFound desc = could not find container \"796327c276dad74f5ee54a09941f46c4f158e69a7744a237b421738162fb00ef\": container with ID starting with 796327c276dad74f5ee54a09941f46c4f158e69a7744a237b421738162fb00ef not found: ID does not exist" Oct 11 01:15:31 crc kubenswrapper[4743]: I1011 01:15:31.071429 4743 scope.go:117] "RemoveContainer" containerID="07c51e8b3fe3126872d6ed9d8747571270d6daa2db06adcb8ed6c8bebb399abe" Oct 11 01:15:31 crc kubenswrapper[4743]: E1011 01:15:31.071742 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07c51e8b3fe3126872d6ed9d8747571270d6daa2db06adcb8ed6c8bebb399abe\": container with ID starting with 07c51e8b3fe3126872d6ed9d8747571270d6daa2db06adcb8ed6c8bebb399abe not found: ID does not exist" containerID="07c51e8b3fe3126872d6ed9d8747571270d6daa2db06adcb8ed6c8bebb399abe" Oct 11 01:15:31 crc kubenswrapper[4743]: I1011 01:15:31.071767 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07c51e8b3fe3126872d6ed9d8747571270d6daa2db06adcb8ed6c8bebb399abe"} err="failed to get container status 
\"07c51e8b3fe3126872d6ed9d8747571270d6daa2db06adcb8ed6c8bebb399abe\": rpc error: code = NotFound desc = could not find container \"07c51e8b3fe3126872d6ed9d8747571270d6daa2db06adcb8ed6c8bebb399abe\": container with ID starting with 07c51e8b3fe3126872d6ed9d8747571270d6daa2db06adcb8ed6c8bebb399abe not found: ID does not exist" Oct 11 01:15:31 crc kubenswrapper[4743]: I1011 01:15:31.080101 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cx9bm\" (UniqueName: \"kubernetes.io/projected/d980b77a-a725-400f-a2e8-d60c517ea3ca-kube-api-access-cx9bm\") pod \"redhat-operators-cxgk5\" (UID: \"d980b77a-a725-400f-a2e8-d60c517ea3ca\") " pod="openshift-marketplace/redhat-operators-cxgk5" Oct 11 01:15:31 crc kubenswrapper[4743]: I1011 01:15:31.166004 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 11 01:15:31 crc kubenswrapper[4743]: I1011 01:15:31.178912 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 11 01:15:31 crc kubenswrapper[4743]: I1011 01:15:31.191823 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 11 01:15:31 crc kubenswrapper[4743]: I1011 01:15:31.199534 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 11 01:15:31 crc kubenswrapper[4743]: I1011 01:15:31.208162 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 11 01:15:31 crc kubenswrapper[4743]: I1011 01:15:31.208562 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 11 01:15:31 crc kubenswrapper[4743]: I1011 01:15:31.209721 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 11 01:15:31 crc kubenswrapper[4743]: I1011 01:15:31.234608 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 11 01:15:31 crc kubenswrapper[4743]: I1011 01:15:31.304051 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cxgk5" Oct 11 01:15:31 crc kubenswrapper[4743]: I1011 01:15:31.369230 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/451bf9e6-fdc6-48ee-bf15-71e945ce936b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"451bf9e6-fdc6-48ee-bf15-71e945ce936b\") " pod="openstack/ceilometer-0" Oct 11 01:15:31 crc kubenswrapper[4743]: I1011 01:15:31.369307 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/451bf9e6-fdc6-48ee-bf15-71e945ce936b-log-httpd\") pod \"ceilometer-0\" (UID: \"451bf9e6-fdc6-48ee-bf15-71e945ce936b\") " pod="openstack/ceilometer-0" Oct 11 01:15:31 crc kubenswrapper[4743]: I1011 01:15:31.369378 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/451bf9e6-fdc6-48ee-bf15-71e945ce936b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"451bf9e6-fdc6-48ee-bf15-71e945ce936b\") " 
pod="openstack/ceilometer-0" Oct 11 01:15:31 crc kubenswrapper[4743]: I1011 01:15:31.369430 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/451bf9e6-fdc6-48ee-bf15-71e945ce936b-scripts\") pod \"ceilometer-0\" (UID: \"451bf9e6-fdc6-48ee-bf15-71e945ce936b\") " pod="openstack/ceilometer-0" Oct 11 01:15:31 crc kubenswrapper[4743]: I1011 01:15:31.369576 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/451bf9e6-fdc6-48ee-bf15-71e945ce936b-config-data\") pod \"ceilometer-0\" (UID: \"451bf9e6-fdc6-48ee-bf15-71e945ce936b\") " pod="openstack/ceilometer-0" Oct 11 01:15:31 crc kubenswrapper[4743]: I1011 01:15:31.369659 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5z2p\" (UniqueName: \"kubernetes.io/projected/451bf9e6-fdc6-48ee-bf15-71e945ce936b-kube-api-access-z5z2p\") pod \"ceilometer-0\" (UID: \"451bf9e6-fdc6-48ee-bf15-71e945ce936b\") " pod="openstack/ceilometer-0" Oct 11 01:15:31 crc kubenswrapper[4743]: I1011 01:15:31.369744 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/451bf9e6-fdc6-48ee-bf15-71e945ce936b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"451bf9e6-fdc6-48ee-bf15-71e945ce936b\") " pod="openstack/ceilometer-0" Oct 11 01:15:31 crc kubenswrapper[4743]: I1011 01:15:31.369800 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/451bf9e6-fdc6-48ee-bf15-71e945ce936b-run-httpd\") pod \"ceilometer-0\" (UID: \"451bf9e6-fdc6-48ee-bf15-71e945ce936b\") " pod="openstack/ceilometer-0" Oct 11 01:15:31 crc kubenswrapper[4743]: I1011 01:15:31.471330 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/451bf9e6-fdc6-48ee-bf15-71e945ce936b-log-httpd\") pod \"ceilometer-0\" (UID: \"451bf9e6-fdc6-48ee-bf15-71e945ce936b\") " pod="openstack/ceilometer-0" Oct 11 01:15:31 crc kubenswrapper[4743]: I1011 01:15:31.471421 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/451bf9e6-fdc6-48ee-bf15-71e945ce936b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"451bf9e6-fdc6-48ee-bf15-71e945ce936b\") " pod="openstack/ceilometer-0" Oct 11 01:15:31 crc kubenswrapper[4743]: I1011 01:15:31.471451 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/451bf9e6-fdc6-48ee-bf15-71e945ce936b-scripts\") pod \"ceilometer-0\" (UID: \"451bf9e6-fdc6-48ee-bf15-71e945ce936b\") " pod="openstack/ceilometer-0" Oct 11 01:15:31 crc kubenswrapper[4743]: I1011 01:15:31.471519 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/451bf9e6-fdc6-48ee-bf15-71e945ce936b-config-data\") pod \"ceilometer-0\" (UID: \"451bf9e6-fdc6-48ee-bf15-71e945ce936b\") " pod="openstack/ceilometer-0" Oct 11 01:15:31 crc kubenswrapper[4743]: I1011 01:15:31.471556 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5z2p\" (UniqueName: \"kubernetes.io/projected/451bf9e6-fdc6-48ee-bf15-71e945ce936b-kube-api-access-z5z2p\") pod \"ceilometer-0\" (UID: \"451bf9e6-fdc6-48ee-bf15-71e945ce936b\") " pod="openstack/ceilometer-0" Oct 11 01:15:31 crc kubenswrapper[4743]: I1011 01:15:31.471614 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/451bf9e6-fdc6-48ee-bf15-71e945ce936b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"451bf9e6-fdc6-48ee-bf15-71e945ce936b\") " pod="openstack/ceilometer-0" Oct 11 01:15:31 crc kubenswrapper[4743]: I1011 01:15:31.471650 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/451bf9e6-fdc6-48ee-bf15-71e945ce936b-run-httpd\") pod \"ceilometer-0\" (UID: \"451bf9e6-fdc6-48ee-bf15-71e945ce936b\") " pod="openstack/ceilometer-0" Oct 11 01:15:31 crc kubenswrapper[4743]: I1011 01:15:31.471676 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/451bf9e6-fdc6-48ee-bf15-71e945ce936b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"451bf9e6-fdc6-48ee-bf15-71e945ce936b\") " pod="openstack/ceilometer-0" Oct 11 01:15:31 crc kubenswrapper[4743]: I1011 01:15:31.472513 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/451bf9e6-fdc6-48ee-bf15-71e945ce936b-log-httpd\") pod \"ceilometer-0\" (UID: \"451bf9e6-fdc6-48ee-bf15-71e945ce936b\") " pod="openstack/ceilometer-0" Oct 11 01:15:31 crc kubenswrapper[4743]: I1011 01:15:31.473151 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/451bf9e6-fdc6-48ee-bf15-71e945ce936b-run-httpd\") pod \"ceilometer-0\" (UID: \"451bf9e6-fdc6-48ee-bf15-71e945ce936b\") " pod="openstack/ceilometer-0" Oct 11 01:15:31 crc kubenswrapper[4743]: I1011 01:15:31.478194 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/451bf9e6-fdc6-48ee-bf15-71e945ce936b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"451bf9e6-fdc6-48ee-bf15-71e945ce936b\") " pod="openstack/ceilometer-0" Oct 11 01:15:31 crc kubenswrapper[4743]: I1011 01:15:31.478476 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/451bf9e6-fdc6-48ee-bf15-71e945ce936b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"451bf9e6-fdc6-48ee-bf15-71e945ce936b\") " pod="openstack/ceilometer-0" Oct 11 01:15:31 crc kubenswrapper[4743]: I1011 01:15:31.479416 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/451bf9e6-fdc6-48ee-bf15-71e945ce936b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"451bf9e6-fdc6-48ee-bf15-71e945ce936b\") " pod="openstack/ceilometer-0" Oct 11 01:15:31 crc kubenswrapper[4743]: I1011 01:15:31.486126 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/451bf9e6-fdc6-48ee-bf15-71e945ce936b-scripts\") pod \"ceilometer-0\" (UID: \"451bf9e6-fdc6-48ee-bf15-71e945ce936b\") " pod="openstack/ceilometer-0" Oct 11 01:15:31 crc kubenswrapper[4743]: I1011 01:15:31.494435 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/451bf9e6-fdc6-48ee-bf15-71e945ce936b-config-data\") pod \"ceilometer-0\" (UID: \"451bf9e6-fdc6-48ee-bf15-71e945ce936b\") " pod="openstack/ceilometer-0" Oct 11 01:15:31 crc kubenswrapper[4743]: I1011 01:15:31.499933 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5z2p\" (UniqueName: \"kubernetes.io/projected/451bf9e6-fdc6-48ee-bf15-71e945ce936b-kube-api-access-z5z2p\") pod \"ceilometer-0\" (UID: \"451bf9e6-fdc6-48ee-bf15-71e945ce936b\") " pod="openstack/ceilometer-0" Oct 11 01:15:31 crc kubenswrapper[4743]: I1011 01:15:31.525504 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 11 01:15:31 crc kubenswrapper[4743]: I1011 01:15:31.774545 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cxgk5"] Oct 11 01:15:31 crc kubenswrapper[4743]: W1011 01:15:31.776629 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd980b77a_a725_400f_a2e8_d60c517ea3ca.slice/crio-bcdd80a5aef129db880a03825e2ef8fe29d60a107ed901113f32206c0a8ebb54 WatchSource:0}: Error finding container bcdd80a5aef129db880a03825e2ef8fe29d60a107ed901113f32206c0a8ebb54: Status 404 returned error can't find the container with id bcdd80a5aef129db880a03825e2ef8fe29d60a107ed901113f32206c0a8ebb54 Oct 11 01:15:31 crc kubenswrapper[4743]: I1011 01:15:31.838636 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cxgk5" event={"ID":"d980b77a-a725-400f-a2e8-d60c517ea3ca","Type":"ContainerStarted","Data":"bcdd80a5aef129db880a03825e2ef8fe29d60a107ed901113f32206c0a8ebb54"} Oct 11 01:15:31 crc kubenswrapper[4743]: I1011 01:15:31.984486 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 11 01:15:32 crc kubenswrapper[4743]: W1011 01:15:32.025885 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod451bf9e6_fdc6_48ee_bf15_71e945ce936b.slice/crio-2547a0d6b03010e05c27d65146ffe19d6e7d1ac0112a42b4d5e2a7b0deda7f86 WatchSource:0}: Error finding container 2547a0d6b03010e05c27d65146ffe19d6e7d1ac0112a42b4d5e2a7b0deda7f86: Status 404 returned error can't find the container with id 2547a0d6b03010e05c27d65146ffe19d6e7d1ac0112a42b4d5e2a7b0deda7f86 Oct 11 01:15:32 crc kubenswrapper[4743]: I1011 01:15:32.115021 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e83a1d4-6bd8-42a3-bb28-e287346008eb" 
path="/var/lib/kubelet/pods/6e83a1d4-6bd8-42a3-bb28-e287346008eb/volumes" Oct 11 01:15:32 crc kubenswrapper[4743]: I1011 01:15:32.171253 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 11 01:15:32 crc kubenswrapper[4743]: I1011 01:15:32.901076 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"451bf9e6-fdc6-48ee-bf15-71e945ce936b","Type":"ContainerStarted","Data":"4cbaa0837680d45cfc67e473eaabcfae066158bf679d1de6a06a1ff42b945fe6"} Oct 11 01:15:32 crc kubenswrapper[4743]: I1011 01:15:32.901376 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"451bf9e6-fdc6-48ee-bf15-71e945ce936b","Type":"ContainerStarted","Data":"2547a0d6b03010e05c27d65146ffe19d6e7d1ac0112a42b4d5e2a7b0deda7f86"} Oct 11 01:15:32 crc kubenswrapper[4743]: I1011 01:15:32.943084 4743 generic.go:334] "Generic (PLEG): container finished" podID="d980b77a-a725-400f-a2e8-d60c517ea3ca" containerID="f01b3051a8591422ff46c2b2cfb31412128be5d8a5410a924f01ab311d5da1e6" exitCode=0 Oct 11 01:15:32 crc kubenswrapper[4743]: I1011 01:15:32.943128 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cxgk5" event={"ID":"d980b77a-a725-400f-a2e8-d60c517ea3ca","Type":"ContainerDied","Data":"f01b3051a8591422ff46c2b2cfb31412128be5d8a5410a924f01ab311d5da1e6"} Oct 11 01:15:33 crc kubenswrapper[4743]: I1011 01:15:33.959027 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cxgk5" event={"ID":"d980b77a-a725-400f-a2e8-d60c517ea3ca","Type":"ContainerStarted","Data":"ca9164ff3a59aefe61956b76d578e6cc58d75e1693cc5756d75a28352edf2605"} Oct 11 01:15:33 crc kubenswrapper[4743]: I1011 01:15:33.962902 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"451bf9e6-fdc6-48ee-bf15-71e945ce936b","Type":"ContainerStarted","Data":"84afac31df94ab945bf4321becb2ff9117d6213dbc89596c3a3150a8260b8ec2"} Oct 11 01:15:34 crc kubenswrapper[4743]: I1011 01:15:34.974841 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"451bf9e6-fdc6-48ee-bf15-71e945ce936b","Type":"ContainerStarted","Data":"d4636ca20154e3137bff423b88ca33d5c7cbfe4c57e05ccab5aa82ba31b0b869"} Oct 11 01:15:35 crc kubenswrapper[4743]: I1011 01:15:35.994626 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"451bf9e6-fdc6-48ee-bf15-71e945ce936b","Type":"ContainerStarted","Data":"ea0feab6f6c5b8e7f1db337e829961370cdd326e109deb30439e46749f2bf849"} Oct 11 01:15:35 crc kubenswrapper[4743]: I1011 01:15:35.995259 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 11 01:15:36 crc kubenswrapper[4743]: I1011 01:15:36.028723 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.574061919 podStartE2EDuration="5.028696415s" podCreationTimestamp="2025-10-11 01:15:31 +0000 UTC" firstStartedPulling="2025-10-11 01:15:32.02864076 +0000 UTC m=+1426.681621157" lastFinishedPulling="2025-10-11 01:15:35.483275246 +0000 UTC m=+1430.136255653" observedRunningTime="2025-10-11 01:15:36.016947818 +0000 UTC m=+1430.669928225" watchObservedRunningTime="2025-10-11 01:15:36.028696415 +0000 UTC m=+1430.681676802" Oct 11 01:15:37 crc kubenswrapper[4743]: I1011 01:15:37.017725 4743 generic.go:334] "Generic (PLEG): container finished" podID="d980b77a-a725-400f-a2e8-d60c517ea3ca" containerID="ca9164ff3a59aefe61956b76d578e6cc58d75e1693cc5756d75a28352edf2605" exitCode=0 Oct 11 01:15:37 crc kubenswrapper[4743]: I1011 01:15:37.017809 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cxgk5" 
event={"ID":"d980b77a-a725-400f-a2e8-d60c517ea3ca","Type":"ContainerDied","Data":"ca9164ff3a59aefe61956b76d578e6cc58d75e1693cc5756d75a28352edf2605"} Oct 11 01:15:38 crc kubenswrapper[4743]: I1011 01:15:38.038921 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cxgk5" event={"ID":"d980b77a-a725-400f-a2e8-d60c517ea3ca","Type":"ContainerStarted","Data":"5e750b56c8875612cce6a4455dd3c8714330cedd85d46ad2a8812029767aab7f"} Oct 11 01:15:38 crc kubenswrapper[4743]: I1011 01:15:38.062615 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cxgk5" podStartSLOduration=3.581067081 podStartE2EDuration="8.062591801s" podCreationTimestamp="2025-10-11 01:15:30 +0000 UTC" firstStartedPulling="2025-10-11 01:15:32.947058708 +0000 UTC m=+1427.600039105" lastFinishedPulling="2025-10-11 01:15:37.428583428 +0000 UTC m=+1432.081563825" observedRunningTime="2025-10-11 01:15:38.05683867 +0000 UTC m=+1432.709819107" watchObservedRunningTime="2025-10-11 01:15:38.062591801 +0000 UTC m=+1432.715572208" Oct 11 01:15:41 crc kubenswrapper[4743]: I1011 01:15:41.304983 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cxgk5" Oct 11 01:15:41 crc kubenswrapper[4743]: I1011 01:15:41.305313 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cxgk5" Oct 11 01:15:42 crc kubenswrapper[4743]: I1011 01:15:42.358997 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-cxgk5" podUID="d980b77a-a725-400f-a2e8-d60c517ea3ca" containerName="registry-server" probeResult="failure" output=< Oct 11 01:15:42 crc kubenswrapper[4743]: timeout: failed to connect service ":50051" within 1s Oct 11 01:15:42 crc kubenswrapper[4743]: > Oct 11 01:15:44 crc kubenswrapper[4743]: I1011 01:15:44.458153 4743 patch_prober.go:28] interesting 
pod/machine-config-daemon-cvm72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 01:15:44 crc kubenswrapper[4743]: I1011 01:15:44.458433 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 01:15:47 crc kubenswrapper[4743]: I1011 01:15:47.446359 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bhl5k"] Oct 11 01:15:47 crc kubenswrapper[4743]: I1011 01:15:47.449903 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bhl5k" Oct 11 01:15:47 crc kubenswrapper[4743]: I1011 01:15:47.460047 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bhl5k"] Oct 11 01:15:47 crc kubenswrapper[4743]: I1011 01:15:47.617187 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pl4nw\" (UniqueName: \"kubernetes.io/projected/98e0a647-d40a-4208-b8cc-a4032d075d9f-kube-api-access-pl4nw\") pod \"redhat-marketplace-bhl5k\" (UID: \"98e0a647-d40a-4208-b8cc-a4032d075d9f\") " pod="openshift-marketplace/redhat-marketplace-bhl5k" Oct 11 01:15:47 crc kubenswrapper[4743]: I1011 01:15:47.617485 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98e0a647-d40a-4208-b8cc-a4032d075d9f-utilities\") pod \"redhat-marketplace-bhl5k\" (UID: \"98e0a647-d40a-4208-b8cc-a4032d075d9f\") " 
pod="openshift-marketplace/redhat-marketplace-bhl5k" Oct 11 01:15:47 crc kubenswrapper[4743]: I1011 01:15:47.617569 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98e0a647-d40a-4208-b8cc-a4032d075d9f-catalog-content\") pod \"redhat-marketplace-bhl5k\" (UID: \"98e0a647-d40a-4208-b8cc-a4032d075d9f\") " pod="openshift-marketplace/redhat-marketplace-bhl5k" Oct 11 01:15:47 crc kubenswrapper[4743]: I1011 01:15:47.720599 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98e0a647-d40a-4208-b8cc-a4032d075d9f-utilities\") pod \"redhat-marketplace-bhl5k\" (UID: \"98e0a647-d40a-4208-b8cc-a4032d075d9f\") " pod="openshift-marketplace/redhat-marketplace-bhl5k" Oct 11 01:15:47 crc kubenswrapper[4743]: I1011 01:15:47.720754 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98e0a647-d40a-4208-b8cc-a4032d075d9f-catalog-content\") pod \"redhat-marketplace-bhl5k\" (UID: \"98e0a647-d40a-4208-b8cc-a4032d075d9f\") " pod="openshift-marketplace/redhat-marketplace-bhl5k" Oct 11 01:15:47 crc kubenswrapper[4743]: I1011 01:15:47.721103 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pl4nw\" (UniqueName: \"kubernetes.io/projected/98e0a647-d40a-4208-b8cc-a4032d075d9f-kube-api-access-pl4nw\") pod \"redhat-marketplace-bhl5k\" (UID: \"98e0a647-d40a-4208-b8cc-a4032d075d9f\") " pod="openshift-marketplace/redhat-marketplace-bhl5k" Oct 11 01:15:47 crc kubenswrapper[4743]: I1011 01:15:47.721240 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98e0a647-d40a-4208-b8cc-a4032d075d9f-catalog-content\") pod \"redhat-marketplace-bhl5k\" (UID: \"98e0a647-d40a-4208-b8cc-a4032d075d9f\") " 
pod="openshift-marketplace/redhat-marketplace-bhl5k" Oct 11 01:15:47 crc kubenswrapper[4743]: I1011 01:15:47.721472 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98e0a647-d40a-4208-b8cc-a4032d075d9f-utilities\") pod \"redhat-marketplace-bhl5k\" (UID: \"98e0a647-d40a-4208-b8cc-a4032d075d9f\") " pod="openshift-marketplace/redhat-marketplace-bhl5k" Oct 11 01:15:47 crc kubenswrapper[4743]: I1011 01:15:47.740399 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pl4nw\" (UniqueName: \"kubernetes.io/projected/98e0a647-d40a-4208-b8cc-a4032d075d9f-kube-api-access-pl4nw\") pod \"redhat-marketplace-bhl5k\" (UID: \"98e0a647-d40a-4208-b8cc-a4032d075d9f\") " pod="openshift-marketplace/redhat-marketplace-bhl5k" Oct 11 01:15:47 crc kubenswrapper[4743]: I1011 01:15:47.804132 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bhl5k" Oct 11 01:15:48 crc kubenswrapper[4743]: I1011 01:15:48.288934 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bhl5k"] Oct 11 01:15:49 crc kubenswrapper[4743]: I1011 01:15:49.153919 4743 generic.go:334] "Generic (PLEG): container finished" podID="98e0a647-d40a-4208-b8cc-a4032d075d9f" containerID="9e3163b061b40b6e30959d703bc59df02acb7488b977d1662418c8c552ada894" exitCode=0 Oct 11 01:15:49 crc kubenswrapper[4743]: I1011 01:15:49.154167 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bhl5k" event={"ID":"98e0a647-d40a-4208-b8cc-a4032d075d9f","Type":"ContainerDied","Data":"9e3163b061b40b6e30959d703bc59df02acb7488b977d1662418c8c552ada894"} Oct 11 01:15:49 crc kubenswrapper[4743]: I1011 01:15:49.154594 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bhl5k" 
event={"ID":"98e0a647-d40a-4208-b8cc-a4032d075d9f","Type":"ContainerStarted","Data":"b733066c414d9e63930df4ecc9266cd71b9597365a7951f3d2df80c0e83ca01d"} Oct 11 01:15:50 crc kubenswrapper[4743]: I1011 01:15:50.165138 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bhl5k" event={"ID":"98e0a647-d40a-4208-b8cc-a4032d075d9f","Type":"ContainerStarted","Data":"763f93a0183ee3f44bfc66a2dc3ebc45f4f4655f3214145adb240ccb942df813"} Oct 11 01:15:51 crc kubenswrapper[4743]: I1011 01:15:51.176682 4743 generic.go:334] "Generic (PLEG): container finished" podID="98e0a647-d40a-4208-b8cc-a4032d075d9f" containerID="763f93a0183ee3f44bfc66a2dc3ebc45f4f4655f3214145adb240ccb942df813" exitCode=0 Oct 11 01:15:51 crc kubenswrapper[4743]: I1011 01:15:51.176743 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bhl5k" event={"ID":"98e0a647-d40a-4208-b8cc-a4032d075d9f","Type":"ContainerDied","Data":"763f93a0183ee3f44bfc66a2dc3ebc45f4f4655f3214145adb240ccb942df813"} Oct 11 01:15:52 crc kubenswrapper[4743]: I1011 01:15:52.191823 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bhl5k" event={"ID":"98e0a647-d40a-4208-b8cc-a4032d075d9f","Type":"ContainerStarted","Data":"c8f5912315432244a04ea4bc11df4eecbd7f213cb06fa403430e00fa9adc1642"} Oct 11 01:15:52 crc kubenswrapper[4743]: I1011 01:15:52.217502 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bhl5k" podStartSLOduration=2.753677735 podStartE2EDuration="5.217487099s" podCreationTimestamp="2025-10-11 01:15:47 +0000 UTC" firstStartedPulling="2025-10-11 01:15:49.15581937 +0000 UTC m=+1443.808799777" lastFinishedPulling="2025-10-11 01:15:51.619628744 +0000 UTC m=+1446.272609141" observedRunningTime="2025-10-11 01:15:52.213956641 +0000 UTC m=+1446.866937058" watchObservedRunningTime="2025-10-11 01:15:52.217487099 +0000 UTC 
m=+1446.870467496" Oct 11 01:15:52 crc kubenswrapper[4743]: I1011 01:15:52.368628 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-cxgk5" podUID="d980b77a-a725-400f-a2e8-d60c517ea3ca" containerName="registry-server" probeResult="failure" output=< Oct 11 01:15:52 crc kubenswrapper[4743]: timeout: failed to connect service ":50051" within 1s Oct 11 01:15:52 crc kubenswrapper[4743]: > Oct 11 01:15:57 crc kubenswrapper[4743]: I1011 01:15:57.804903 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bhl5k" Oct 11 01:15:57 crc kubenswrapper[4743]: I1011 01:15:57.805312 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bhl5k" Oct 11 01:15:57 crc kubenswrapper[4743]: I1011 01:15:57.889842 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bhl5k" Oct 11 01:15:58 crc kubenswrapper[4743]: I1011 01:15:58.316472 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bhl5k" Oct 11 01:15:58 crc kubenswrapper[4743]: I1011 01:15:58.376425 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bhl5k"] Oct 11 01:16:00 crc kubenswrapper[4743]: I1011 01:16:00.291185 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bhl5k" podUID="98e0a647-d40a-4208-b8cc-a4032d075d9f" containerName="registry-server" containerID="cri-o://c8f5912315432244a04ea4bc11df4eecbd7f213cb06fa403430e00fa9adc1642" gracePeriod=2 Oct 11 01:16:00 crc kubenswrapper[4743]: I1011 01:16:00.952525 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bhl5k" Oct 11 01:16:01 crc kubenswrapper[4743]: I1011 01:16:01.113301 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pl4nw\" (UniqueName: \"kubernetes.io/projected/98e0a647-d40a-4208-b8cc-a4032d075d9f-kube-api-access-pl4nw\") pod \"98e0a647-d40a-4208-b8cc-a4032d075d9f\" (UID: \"98e0a647-d40a-4208-b8cc-a4032d075d9f\") " Oct 11 01:16:01 crc kubenswrapper[4743]: I1011 01:16:01.113353 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98e0a647-d40a-4208-b8cc-a4032d075d9f-catalog-content\") pod \"98e0a647-d40a-4208-b8cc-a4032d075d9f\" (UID: \"98e0a647-d40a-4208-b8cc-a4032d075d9f\") " Oct 11 01:16:01 crc kubenswrapper[4743]: I1011 01:16:01.113444 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98e0a647-d40a-4208-b8cc-a4032d075d9f-utilities\") pod \"98e0a647-d40a-4208-b8cc-a4032d075d9f\" (UID: \"98e0a647-d40a-4208-b8cc-a4032d075d9f\") " Oct 11 01:16:01 crc kubenswrapper[4743]: I1011 01:16:01.114387 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98e0a647-d40a-4208-b8cc-a4032d075d9f-utilities" (OuterVolumeSpecName: "utilities") pod "98e0a647-d40a-4208-b8cc-a4032d075d9f" (UID: "98e0a647-d40a-4208-b8cc-a4032d075d9f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:16:01 crc kubenswrapper[4743]: I1011 01:16:01.123688 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98e0a647-d40a-4208-b8cc-a4032d075d9f-kube-api-access-pl4nw" (OuterVolumeSpecName: "kube-api-access-pl4nw") pod "98e0a647-d40a-4208-b8cc-a4032d075d9f" (UID: "98e0a647-d40a-4208-b8cc-a4032d075d9f"). InnerVolumeSpecName "kube-api-access-pl4nw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:16:01 crc kubenswrapper[4743]: I1011 01:16:01.139105 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98e0a647-d40a-4208-b8cc-a4032d075d9f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "98e0a647-d40a-4208-b8cc-a4032d075d9f" (UID: "98e0a647-d40a-4208-b8cc-a4032d075d9f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:16:01 crc kubenswrapper[4743]: I1011 01:16:01.215663 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pl4nw\" (UniqueName: \"kubernetes.io/projected/98e0a647-d40a-4208-b8cc-a4032d075d9f-kube-api-access-pl4nw\") on node \"crc\" DevicePath \"\"" Oct 11 01:16:01 crc kubenswrapper[4743]: I1011 01:16:01.215717 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98e0a647-d40a-4208-b8cc-a4032d075d9f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 11 01:16:01 crc kubenswrapper[4743]: I1011 01:16:01.215730 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98e0a647-d40a-4208-b8cc-a4032d075d9f-utilities\") on node \"crc\" DevicePath \"\"" Oct 11 01:16:01 crc kubenswrapper[4743]: I1011 01:16:01.305228 4743 generic.go:334] "Generic (PLEG): container finished" podID="98e0a647-d40a-4208-b8cc-a4032d075d9f" containerID="c8f5912315432244a04ea4bc11df4eecbd7f213cb06fa403430e00fa9adc1642" exitCode=0 Oct 11 01:16:01 crc kubenswrapper[4743]: I1011 01:16:01.305301 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bhl5k" event={"ID":"98e0a647-d40a-4208-b8cc-a4032d075d9f","Type":"ContainerDied","Data":"c8f5912315432244a04ea4bc11df4eecbd7f213cb06fa403430e00fa9adc1642"} Oct 11 01:16:01 crc kubenswrapper[4743]: I1011 01:16:01.305364 4743 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-bhl5k" event={"ID":"98e0a647-d40a-4208-b8cc-a4032d075d9f","Type":"ContainerDied","Data":"b733066c414d9e63930df4ecc9266cd71b9597365a7951f3d2df80c0e83ca01d"} Oct 11 01:16:01 crc kubenswrapper[4743]: I1011 01:16:01.305413 4743 scope.go:117] "RemoveContainer" containerID="c8f5912315432244a04ea4bc11df4eecbd7f213cb06fa403430e00fa9adc1642" Oct 11 01:16:01 crc kubenswrapper[4743]: I1011 01:16:01.305416 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bhl5k" Oct 11 01:16:01 crc kubenswrapper[4743]: I1011 01:16:01.335749 4743 scope.go:117] "RemoveContainer" containerID="763f93a0183ee3f44bfc66a2dc3ebc45f4f4655f3214145adb240ccb942df813" Oct 11 01:16:01 crc kubenswrapper[4743]: I1011 01:16:01.387675 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bhl5k"] Oct 11 01:16:01 crc kubenswrapper[4743]: I1011 01:16:01.397798 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bhl5k"] Oct 11 01:16:01 crc kubenswrapper[4743]: I1011 01:16:01.415108 4743 scope.go:117] "RemoveContainer" containerID="9e3163b061b40b6e30959d703bc59df02acb7488b977d1662418c8c552ada894" Oct 11 01:16:01 crc kubenswrapper[4743]: I1011 01:16:01.415339 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cxgk5" Oct 11 01:16:01 crc kubenswrapper[4743]: I1011 01:16:01.511016 4743 scope.go:117] "RemoveContainer" containerID="c8f5912315432244a04ea4bc11df4eecbd7f213cb06fa403430e00fa9adc1642" Oct 11 01:16:01 crc kubenswrapper[4743]: E1011 01:16:01.516232 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8f5912315432244a04ea4bc11df4eecbd7f213cb06fa403430e00fa9adc1642\": container with ID starting with 
c8f5912315432244a04ea4bc11df4eecbd7f213cb06fa403430e00fa9adc1642 not found: ID does not exist" containerID="c8f5912315432244a04ea4bc11df4eecbd7f213cb06fa403430e00fa9adc1642" Oct 11 01:16:01 crc kubenswrapper[4743]: I1011 01:16:01.516268 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8f5912315432244a04ea4bc11df4eecbd7f213cb06fa403430e00fa9adc1642"} err="failed to get container status \"c8f5912315432244a04ea4bc11df4eecbd7f213cb06fa403430e00fa9adc1642\": rpc error: code = NotFound desc = could not find container \"c8f5912315432244a04ea4bc11df4eecbd7f213cb06fa403430e00fa9adc1642\": container with ID starting with c8f5912315432244a04ea4bc11df4eecbd7f213cb06fa403430e00fa9adc1642 not found: ID does not exist" Oct 11 01:16:01 crc kubenswrapper[4743]: I1011 01:16:01.516291 4743 scope.go:117] "RemoveContainer" containerID="763f93a0183ee3f44bfc66a2dc3ebc45f4f4655f3214145adb240ccb942df813" Oct 11 01:16:01 crc kubenswrapper[4743]: E1011 01:16:01.522333 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"763f93a0183ee3f44bfc66a2dc3ebc45f4f4655f3214145adb240ccb942df813\": container with ID starting with 763f93a0183ee3f44bfc66a2dc3ebc45f4f4655f3214145adb240ccb942df813 not found: ID does not exist" containerID="763f93a0183ee3f44bfc66a2dc3ebc45f4f4655f3214145adb240ccb942df813" Oct 11 01:16:01 crc kubenswrapper[4743]: I1011 01:16:01.522464 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"763f93a0183ee3f44bfc66a2dc3ebc45f4f4655f3214145adb240ccb942df813"} err="failed to get container status \"763f93a0183ee3f44bfc66a2dc3ebc45f4f4655f3214145adb240ccb942df813\": rpc error: code = NotFound desc = could not find container \"763f93a0183ee3f44bfc66a2dc3ebc45f4f4655f3214145adb240ccb942df813\": container with ID starting with 763f93a0183ee3f44bfc66a2dc3ebc45f4f4655f3214145adb240ccb942df813 not found: ID does not 
exist" Oct 11 01:16:01 crc kubenswrapper[4743]: I1011 01:16:01.522649 4743 scope.go:117] "RemoveContainer" containerID="9e3163b061b40b6e30959d703bc59df02acb7488b977d1662418c8c552ada894" Oct 11 01:16:01 crc kubenswrapper[4743]: E1011 01:16:01.524938 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e3163b061b40b6e30959d703bc59df02acb7488b977d1662418c8c552ada894\": container with ID starting with 9e3163b061b40b6e30959d703bc59df02acb7488b977d1662418c8c552ada894 not found: ID does not exist" containerID="9e3163b061b40b6e30959d703bc59df02acb7488b977d1662418c8c552ada894" Oct 11 01:16:01 crc kubenswrapper[4743]: I1011 01:16:01.524981 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e3163b061b40b6e30959d703bc59df02acb7488b977d1662418c8c552ada894"} err="failed to get container status \"9e3163b061b40b6e30959d703bc59df02acb7488b977d1662418c8c552ada894\": rpc error: code = NotFound desc = could not find container \"9e3163b061b40b6e30959d703bc59df02acb7488b977d1662418c8c552ada894\": container with ID starting with 9e3163b061b40b6e30959d703bc59df02acb7488b977d1662418c8c552ada894 not found: ID does not exist" Oct 11 01:16:01 crc kubenswrapper[4743]: I1011 01:16:01.555070 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cxgk5" Oct 11 01:16:01 crc kubenswrapper[4743]: I1011 01:16:01.607608 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 11 01:16:02 crc kubenswrapper[4743]: I1011 01:16:02.106253 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98e0a647-d40a-4208-b8cc-a4032d075d9f" path="/var/lib/kubelet/pods/98e0a647-d40a-4208-b8cc-a4032d075d9f/volumes" Oct 11 01:16:03 crc kubenswrapper[4743]: I1011 01:16:03.732768 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-cxgk5"] Oct 11 01:16:03 crc kubenswrapper[4743]: I1011 01:16:03.733310 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cxgk5" podUID="d980b77a-a725-400f-a2e8-d60c517ea3ca" containerName="registry-server" containerID="cri-o://5e750b56c8875612cce6a4455dd3c8714330cedd85d46ad2a8812029767aab7f" gracePeriod=2 Oct 11 01:16:04 crc kubenswrapper[4743]: I1011 01:16:04.291149 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cxgk5" Oct 11 01:16:04 crc kubenswrapper[4743]: I1011 01:16:04.338424 4743 generic.go:334] "Generic (PLEG): container finished" podID="d980b77a-a725-400f-a2e8-d60c517ea3ca" containerID="5e750b56c8875612cce6a4455dd3c8714330cedd85d46ad2a8812029767aab7f" exitCode=0 Oct 11 01:16:04 crc kubenswrapper[4743]: I1011 01:16:04.338491 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cxgk5" Oct 11 01:16:04 crc kubenswrapper[4743]: I1011 01:16:04.338511 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cxgk5" event={"ID":"d980b77a-a725-400f-a2e8-d60c517ea3ca","Type":"ContainerDied","Data":"5e750b56c8875612cce6a4455dd3c8714330cedd85d46ad2a8812029767aab7f"} Oct 11 01:16:04 crc kubenswrapper[4743]: I1011 01:16:04.338791 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cxgk5" event={"ID":"d980b77a-a725-400f-a2e8-d60c517ea3ca","Type":"ContainerDied","Data":"bcdd80a5aef129db880a03825e2ef8fe29d60a107ed901113f32206c0a8ebb54"} Oct 11 01:16:04 crc kubenswrapper[4743]: I1011 01:16:04.338812 4743 scope.go:117] "RemoveContainer" containerID="5e750b56c8875612cce6a4455dd3c8714330cedd85d46ad2a8812029767aab7f" Oct 11 01:16:04 crc kubenswrapper[4743]: I1011 01:16:04.378237 4743 scope.go:117] "RemoveContainer" 
containerID="ca9164ff3a59aefe61956b76d578e6cc58d75e1693cc5756d75a28352edf2605" Oct 11 01:16:04 crc kubenswrapper[4743]: I1011 01:16:04.384609 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d980b77a-a725-400f-a2e8-d60c517ea3ca-catalog-content\") pod \"d980b77a-a725-400f-a2e8-d60c517ea3ca\" (UID: \"d980b77a-a725-400f-a2e8-d60c517ea3ca\") " Oct 11 01:16:04 crc kubenswrapper[4743]: I1011 01:16:04.384763 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cx9bm\" (UniqueName: \"kubernetes.io/projected/d980b77a-a725-400f-a2e8-d60c517ea3ca-kube-api-access-cx9bm\") pod \"d980b77a-a725-400f-a2e8-d60c517ea3ca\" (UID: \"d980b77a-a725-400f-a2e8-d60c517ea3ca\") " Oct 11 01:16:04 crc kubenswrapper[4743]: I1011 01:16:04.384787 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d980b77a-a725-400f-a2e8-d60c517ea3ca-utilities\") pod \"d980b77a-a725-400f-a2e8-d60c517ea3ca\" (UID: \"d980b77a-a725-400f-a2e8-d60c517ea3ca\") " Oct 11 01:16:04 crc kubenswrapper[4743]: I1011 01:16:04.385663 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d980b77a-a725-400f-a2e8-d60c517ea3ca-utilities" (OuterVolumeSpecName: "utilities") pod "d980b77a-a725-400f-a2e8-d60c517ea3ca" (UID: "d980b77a-a725-400f-a2e8-d60c517ea3ca"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:16:04 crc kubenswrapper[4743]: I1011 01:16:04.393422 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d980b77a-a725-400f-a2e8-d60c517ea3ca-kube-api-access-cx9bm" (OuterVolumeSpecName: "kube-api-access-cx9bm") pod "d980b77a-a725-400f-a2e8-d60c517ea3ca" (UID: "d980b77a-a725-400f-a2e8-d60c517ea3ca"). InnerVolumeSpecName "kube-api-access-cx9bm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:16:04 crc kubenswrapper[4743]: I1011 01:16:04.424384 4743 scope.go:117] "RemoveContainer" containerID="f01b3051a8591422ff46c2b2cfb31412128be5d8a5410a924f01ab311d5da1e6" Oct 11 01:16:04 crc kubenswrapper[4743]: I1011 01:16:04.465094 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d980b77a-a725-400f-a2e8-d60c517ea3ca-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d980b77a-a725-400f-a2e8-d60c517ea3ca" (UID: "d980b77a-a725-400f-a2e8-d60c517ea3ca"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:16:04 crc kubenswrapper[4743]: I1011 01:16:04.487715 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cx9bm\" (UniqueName: \"kubernetes.io/projected/d980b77a-a725-400f-a2e8-d60c517ea3ca-kube-api-access-cx9bm\") on node \"crc\" DevicePath \"\"" Oct 11 01:16:04 crc kubenswrapper[4743]: I1011 01:16:04.487748 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d980b77a-a725-400f-a2e8-d60c517ea3ca-utilities\") on node \"crc\" DevicePath \"\"" Oct 11 01:16:04 crc kubenswrapper[4743]: I1011 01:16:04.487759 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d980b77a-a725-400f-a2e8-d60c517ea3ca-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 11 01:16:04 crc kubenswrapper[4743]: I1011 01:16:04.503284 4743 scope.go:117] "RemoveContainer" containerID="5e750b56c8875612cce6a4455dd3c8714330cedd85d46ad2a8812029767aab7f" Oct 11 01:16:04 crc kubenswrapper[4743]: E1011 01:16:04.503600 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e750b56c8875612cce6a4455dd3c8714330cedd85d46ad2a8812029767aab7f\": container with ID starting with 
5e750b56c8875612cce6a4455dd3c8714330cedd85d46ad2a8812029767aab7f not found: ID does not exist" containerID="5e750b56c8875612cce6a4455dd3c8714330cedd85d46ad2a8812029767aab7f" Oct 11 01:16:04 crc kubenswrapper[4743]: I1011 01:16:04.503629 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e750b56c8875612cce6a4455dd3c8714330cedd85d46ad2a8812029767aab7f"} err="failed to get container status \"5e750b56c8875612cce6a4455dd3c8714330cedd85d46ad2a8812029767aab7f\": rpc error: code = NotFound desc = could not find container \"5e750b56c8875612cce6a4455dd3c8714330cedd85d46ad2a8812029767aab7f\": container with ID starting with 5e750b56c8875612cce6a4455dd3c8714330cedd85d46ad2a8812029767aab7f not found: ID does not exist" Oct 11 01:16:04 crc kubenswrapper[4743]: I1011 01:16:04.503648 4743 scope.go:117] "RemoveContainer" containerID="ca9164ff3a59aefe61956b76d578e6cc58d75e1693cc5756d75a28352edf2605" Oct 11 01:16:04 crc kubenswrapper[4743]: E1011 01:16:04.504180 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca9164ff3a59aefe61956b76d578e6cc58d75e1693cc5756d75a28352edf2605\": container with ID starting with ca9164ff3a59aefe61956b76d578e6cc58d75e1693cc5756d75a28352edf2605 not found: ID does not exist" containerID="ca9164ff3a59aefe61956b76d578e6cc58d75e1693cc5756d75a28352edf2605" Oct 11 01:16:04 crc kubenswrapper[4743]: I1011 01:16:04.504201 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca9164ff3a59aefe61956b76d578e6cc58d75e1693cc5756d75a28352edf2605"} err="failed to get container status \"ca9164ff3a59aefe61956b76d578e6cc58d75e1693cc5756d75a28352edf2605\": rpc error: code = NotFound desc = could not find container \"ca9164ff3a59aefe61956b76d578e6cc58d75e1693cc5756d75a28352edf2605\": container with ID starting with ca9164ff3a59aefe61956b76d578e6cc58d75e1693cc5756d75a28352edf2605 not found: ID does not 
exist" Oct 11 01:16:04 crc kubenswrapper[4743]: I1011 01:16:04.504213 4743 scope.go:117] "RemoveContainer" containerID="f01b3051a8591422ff46c2b2cfb31412128be5d8a5410a924f01ab311d5da1e6" Oct 11 01:16:04 crc kubenswrapper[4743]: E1011 01:16:04.504761 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f01b3051a8591422ff46c2b2cfb31412128be5d8a5410a924f01ab311d5da1e6\": container with ID starting with f01b3051a8591422ff46c2b2cfb31412128be5d8a5410a924f01ab311d5da1e6 not found: ID does not exist" containerID="f01b3051a8591422ff46c2b2cfb31412128be5d8a5410a924f01ab311d5da1e6" Oct 11 01:16:04 crc kubenswrapper[4743]: I1011 01:16:04.504809 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f01b3051a8591422ff46c2b2cfb31412128be5d8a5410a924f01ab311d5da1e6"} err="failed to get container status \"f01b3051a8591422ff46c2b2cfb31412128be5d8a5410a924f01ab311d5da1e6\": rpc error: code = NotFound desc = could not find container \"f01b3051a8591422ff46c2b2cfb31412128be5d8a5410a924f01ab311d5da1e6\": container with ID starting with f01b3051a8591422ff46c2b2cfb31412128be5d8a5410a924f01ab311d5da1e6 not found: ID does not exist" Oct 11 01:16:04 crc kubenswrapper[4743]: I1011 01:16:04.686952 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cxgk5"] Oct 11 01:16:04 crc kubenswrapper[4743]: I1011 01:16:04.701758 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cxgk5"] Oct 11 01:16:06 crc kubenswrapper[4743]: I1011 01:16:06.108148 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d980b77a-a725-400f-a2e8-d60c517ea3ca" path="/var/lib/kubelet/pods/d980b77a-a725-400f-a2e8-d60c517ea3ca/volumes" Oct 11 01:16:13 crc kubenswrapper[4743]: I1011 01:16:13.499939 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-f4rpt"] Oct 11 
01:16:13 crc kubenswrapper[4743]: I1011 01:16:13.508892 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-f4rpt"] Oct 11 01:16:13 crc kubenswrapper[4743]: I1011 01:16:13.609808 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-lc4jr"] Oct 11 01:16:13 crc kubenswrapper[4743]: E1011 01:16:13.610320 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d980b77a-a725-400f-a2e8-d60c517ea3ca" containerName="extract-utilities" Oct 11 01:16:13 crc kubenswrapper[4743]: I1011 01:16:13.610337 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d980b77a-a725-400f-a2e8-d60c517ea3ca" containerName="extract-utilities" Oct 11 01:16:13 crc kubenswrapper[4743]: E1011 01:16:13.610372 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d980b77a-a725-400f-a2e8-d60c517ea3ca" containerName="extract-content" Oct 11 01:16:13 crc kubenswrapper[4743]: I1011 01:16:13.610377 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d980b77a-a725-400f-a2e8-d60c517ea3ca" containerName="extract-content" Oct 11 01:16:13 crc kubenswrapper[4743]: E1011 01:16:13.610385 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98e0a647-d40a-4208-b8cc-a4032d075d9f" containerName="extract-utilities" Oct 11 01:16:13 crc kubenswrapper[4743]: I1011 01:16:13.610391 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="98e0a647-d40a-4208-b8cc-a4032d075d9f" containerName="extract-utilities" Oct 11 01:16:13 crc kubenswrapper[4743]: E1011 01:16:13.610402 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d980b77a-a725-400f-a2e8-d60c517ea3ca" containerName="registry-server" Oct 11 01:16:13 crc kubenswrapper[4743]: I1011 01:16:13.610409 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d980b77a-a725-400f-a2e8-d60c517ea3ca" containerName="registry-server" Oct 11 01:16:13 crc kubenswrapper[4743]: E1011 01:16:13.610415 4743 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="98e0a647-d40a-4208-b8cc-a4032d075d9f" containerName="extract-content" Oct 11 01:16:13 crc kubenswrapper[4743]: I1011 01:16:13.610421 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="98e0a647-d40a-4208-b8cc-a4032d075d9f" containerName="extract-content" Oct 11 01:16:13 crc kubenswrapper[4743]: E1011 01:16:13.610433 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98e0a647-d40a-4208-b8cc-a4032d075d9f" containerName="registry-server" Oct 11 01:16:13 crc kubenswrapper[4743]: I1011 01:16:13.610439 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="98e0a647-d40a-4208-b8cc-a4032d075d9f" containerName="registry-server" Oct 11 01:16:13 crc kubenswrapper[4743]: I1011 01:16:13.610772 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d980b77a-a725-400f-a2e8-d60c517ea3ca" containerName="registry-server" Oct 11 01:16:13 crc kubenswrapper[4743]: I1011 01:16:13.610797 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="98e0a647-d40a-4208-b8cc-a4032d075d9f" containerName="registry-server" Oct 11 01:16:13 crc kubenswrapper[4743]: I1011 01:16:13.611603 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-lc4jr" Oct 11 01:16:13 crc kubenswrapper[4743]: I1011 01:16:13.626185 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-lc4jr"] Oct 11 01:16:13 crc kubenswrapper[4743]: I1011 01:16:13.729375 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5a1515f-1a64-44da-b8e1-b41ff6aa1c8d-combined-ca-bundle\") pod \"heat-db-sync-lc4jr\" (UID: \"b5a1515f-1a64-44da-b8e1-b41ff6aa1c8d\") " pod="openstack/heat-db-sync-lc4jr" Oct 11 01:16:13 crc kubenswrapper[4743]: I1011 01:16:13.729596 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thcvg\" (UniqueName: \"kubernetes.io/projected/b5a1515f-1a64-44da-b8e1-b41ff6aa1c8d-kube-api-access-thcvg\") pod \"heat-db-sync-lc4jr\" (UID: \"b5a1515f-1a64-44da-b8e1-b41ff6aa1c8d\") " pod="openstack/heat-db-sync-lc4jr" Oct 11 01:16:13 crc kubenswrapper[4743]: I1011 01:16:13.729644 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5a1515f-1a64-44da-b8e1-b41ff6aa1c8d-config-data\") pod \"heat-db-sync-lc4jr\" (UID: \"b5a1515f-1a64-44da-b8e1-b41ff6aa1c8d\") " pod="openstack/heat-db-sync-lc4jr" Oct 11 01:16:13 crc kubenswrapper[4743]: I1011 01:16:13.831535 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5a1515f-1a64-44da-b8e1-b41ff6aa1c8d-combined-ca-bundle\") pod \"heat-db-sync-lc4jr\" (UID: \"b5a1515f-1a64-44da-b8e1-b41ff6aa1c8d\") " pod="openstack/heat-db-sync-lc4jr" Oct 11 01:16:13 crc kubenswrapper[4743]: I1011 01:16:13.831604 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thcvg\" (UniqueName: 
\"kubernetes.io/projected/b5a1515f-1a64-44da-b8e1-b41ff6aa1c8d-kube-api-access-thcvg\") pod \"heat-db-sync-lc4jr\" (UID: \"b5a1515f-1a64-44da-b8e1-b41ff6aa1c8d\") " pod="openstack/heat-db-sync-lc4jr" Oct 11 01:16:13 crc kubenswrapper[4743]: I1011 01:16:13.831624 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5a1515f-1a64-44da-b8e1-b41ff6aa1c8d-config-data\") pod \"heat-db-sync-lc4jr\" (UID: \"b5a1515f-1a64-44da-b8e1-b41ff6aa1c8d\") " pod="openstack/heat-db-sync-lc4jr" Oct 11 01:16:13 crc kubenswrapper[4743]: I1011 01:16:13.837562 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5a1515f-1a64-44da-b8e1-b41ff6aa1c8d-config-data\") pod \"heat-db-sync-lc4jr\" (UID: \"b5a1515f-1a64-44da-b8e1-b41ff6aa1c8d\") " pod="openstack/heat-db-sync-lc4jr" Oct 11 01:16:13 crc kubenswrapper[4743]: I1011 01:16:13.839126 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5a1515f-1a64-44da-b8e1-b41ff6aa1c8d-combined-ca-bundle\") pod \"heat-db-sync-lc4jr\" (UID: \"b5a1515f-1a64-44da-b8e1-b41ff6aa1c8d\") " pod="openstack/heat-db-sync-lc4jr" Oct 11 01:16:13 crc kubenswrapper[4743]: I1011 01:16:13.851690 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thcvg\" (UniqueName: \"kubernetes.io/projected/b5a1515f-1a64-44da-b8e1-b41ff6aa1c8d-kube-api-access-thcvg\") pod \"heat-db-sync-lc4jr\" (UID: \"b5a1515f-1a64-44da-b8e1-b41ff6aa1c8d\") " pod="openstack/heat-db-sync-lc4jr" Oct 11 01:16:13 crc kubenswrapper[4743]: I1011 01:16:13.937499 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-lc4jr" Oct 11 01:16:14 crc kubenswrapper[4743]: I1011 01:16:14.107494 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="476a4c6e-ddae-4974-a899-78a8f1ee973d" path="/var/lib/kubelet/pods/476a4c6e-ddae-4974-a899-78a8f1ee973d/volumes" Oct 11 01:16:14 crc kubenswrapper[4743]: I1011 01:16:14.426104 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-lc4jr"] Oct 11 01:16:14 crc kubenswrapper[4743]: W1011 01:16:14.432312 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5a1515f_1a64_44da_b8e1_b41ff6aa1c8d.slice/crio-2817ba897455312bafa7b3314c446473f99b59005703c0ed14fabf4509c93d7a WatchSource:0}: Error finding container 2817ba897455312bafa7b3314c446473f99b59005703c0ed14fabf4509c93d7a: Status 404 returned error can't find the container with id 2817ba897455312bafa7b3314c446473f99b59005703c0ed14fabf4509c93d7a Oct 11 01:16:14 crc kubenswrapper[4743]: I1011 01:16:14.452487 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-lc4jr" event={"ID":"b5a1515f-1a64-44da-b8e1-b41ff6aa1c8d","Type":"ContainerStarted","Data":"2817ba897455312bafa7b3314c446473f99b59005703c0ed14fabf4509c93d7a"} Oct 11 01:16:14 crc kubenswrapper[4743]: I1011 01:16:14.457913 4743 patch_prober.go:28] interesting pod/machine-config-daemon-cvm72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 01:16:14 crc kubenswrapper[4743]: I1011 01:16:14.457956 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 01:16:15 crc kubenswrapper[4743]: I1011 01:16:15.565548 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 11 01:16:15 crc kubenswrapper[4743]: I1011 01:16:15.936927 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 11 01:16:15 crc kubenswrapper[4743]: I1011 01:16:15.937501 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="451bf9e6-fdc6-48ee-bf15-71e945ce936b" containerName="ceilometer-central-agent" containerID="cri-o://4cbaa0837680d45cfc67e473eaabcfae066158bf679d1de6a06a1ff42b945fe6" gracePeriod=30 Oct 11 01:16:15 crc kubenswrapper[4743]: I1011 01:16:15.937964 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="451bf9e6-fdc6-48ee-bf15-71e945ce936b" containerName="proxy-httpd" containerID="cri-o://ea0feab6f6c5b8e7f1db337e829961370cdd326e109deb30439e46749f2bf849" gracePeriod=30 Oct 11 01:16:15 crc kubenswrapper[4743]: I1011 01:16:15.938019 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="451bf9e6-fdc6-48ee-bf15-71e945ce936b" containerName="sg-core" containerID="cri-o://d4636ca20154e3137bff423b88ca33d5c7cbfe4c57e05ccab5aa82ba31b0b869" gracePeriod=30 Oct 11 01:16:15 crc kubenswrapper[4743]: I1011 01:16:15.938052 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="451bf9e6-fdc6-48ee-bf15-71e945ce936b" containerName="ceilometer-notification-agent" containerID="cri-o://84afac31df94ab945bf4321becb2ff9117d6213dbc89596c3a3150a8260b8ec2" gracePeriod=30 Oct 11 01:16:16 crc kubenswrapper[4743]: I1011 01:16:16.430844 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 11 01:16:16 crc kubenswrapper[4743]: I1011 
01:16:16.482234 4743 generic.go:334] "Generic (PLEG): container finished" podID="451bf9e6-fdc6-48ee-bf15-71e945ce936b" containerID="ea0feab6f6c5b8e7f1db337e829961370cdd326e109deb30439e46749f2bf849" exitCode=0 Oct 11 01:16:16 crc kubenswrapper[4743]: I1011 01:16:16.482263 4743 generic.go:334] "Generic (PLEG): container finished" podID="451bf9e6-fdc6-48ee-bf15-71e945ce936b" containerID="d4636ca20154e3137bff423b88ca33d5c7cbfe4c57e05ccab5aa82ba31b0b869" exitCode=2 Oct 11 01:16:16 crc kubenswrapper[4743]: I1011 01:16:16.482271 4743 generic.go:334] "Generic (PLEG): container finished" podID="451bf9e6-fdc6-48ee-bf15-71e945ce936b" containerID="4cbaa0837680d45cfc67e473eaabcfae066158bf679d1de6a06a1ff42b945fe6" exitCode=0 Oct 11 01:16:16 crc kubenswrapper[4743]: I1011 01:16:16.482298 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"451bf9e6-fdc6-48ee-bf15-71e945ce936b","Type":"ContainerDied","Data":"ea0feab6f6c5b8e7f1db337e829961370cdd326e109deb30439e46749f2bf849"} Oct 11 01:16:16 crc kubenswrapper[4743]: I1011 01:16:16.482322 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"451bf9e6-fdc6-48ee-bf15-71e945ce936b","Type":"ContainerDied","Data":"d4636ca20154e3137bff423b88ca33d5c7cbfe4c57e05ccab5aa82ba31b0b869"} Oct 11 01:16:16 crc kubenswrapper[4743]: I1011 01:16:16.482332 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"451bf9e6-fdc6-48ee-bf15-71e945ce936b","Type":"ContainerDied","Data":"4cbaa0837680d45cfc67e473eaabcfae066158bf679d1de6a06a1ff42b945fe6"} Oct 11 01:16:19 crc kubenswrapper[4743]: I1011 01:16:19.989289 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="73d7bbf0-dc76-4572-857e-fd0fb59d95cc" containerName="rabbitmq" containerID="cri-o://18d5924ee91371fd1ad1e224761869e30787b5d666608ba26d05d7cefcfe9f7b" gracePeriod=604796 Oct 11 01:16:20 crc kubenswrapper[4743]: 
I1011 01:16:20.842663 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="9f596550-b88a-49d7-9cff-cbc2d4149a2e" containerName="rabbitmq" containerID="cri-o://5088c25b9ecada27037e94ce63bb12f86d1d169720a14623b47acf2816b9127b" gracePeriod=604796 Oct 11 01:16:23 crc kubenswrapper[4743]: I1011 01:16:23.558108 4743 generic.go:334] "Generic (PLEG): container finished" podID="451bf9e6-fdc6-48ee-bf15-71e945ce936b" containerID="84afac31df94ab945bf4321becb2ff9117d6213dbc89596c3a3150a8260b8ec2" exitCode=0 Oct 11 01:16:23 crc kubenswrapper[4743]: I1011 01:16:23.558200 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"451bf9e6-fdc6-48ee-bf15-71e945ce936b","Type":"ContainerDied","Data":"84afac31df94ab945bf4321becb2ff9117d6213dbc89596c3a3150a8260b8ec2"} Oct 11 01:16:26 crc kubenswrapper[4743]: I1011 01:16:26.591474 4743 generic.go:334] "Generic (PLEG): container finished" podID="73d7bbf0-dc76-4572-857e-fd0fb59d95cc" containerID="18d5924ee91371fd1ad1e224761869e30787b5d666608ba26d05d7cefcfe9f7b" exitCode=0 Oct 11 01:16:26 crc kubenswrapper[4743]: I1011 01:16:26.591601 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"73d7bbf0-dc76-4572-857e-fd0fb59d95cc","Type":"ContainerDied","Data":"18d5924ee91371fd1ad1e224761869e30787b5d666608ba26d05d7cefcfe9f7b"} Oct 11 01:16:26 crc kubenswrapper[4743]: I1011 01:16:26.976769 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 11 01:16:27 crc kubenswrapper[4743]: I1011 01:16:27.165984 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/451bf9e6-fdc6-48ee-bf15-71e945ce936b-sg-core-conf-yaml\") pod \"451bf9e6-fdc6-48ee-bf15-71e945ce936b\" (UID: \"451bf9e6-fdc6-48ee-bf15-71e945ce936b\") " Oct 11 01:16:27 crc kubenswrapper[4743]: I1011 01:16:27.166066 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/451bf9e6-fdc6-48ee-bf15-71e945ce936b-scripts\") pod \"451bf9e6-fdc6-48ee-bf15-71e945ce936b\" (UID: \"451bf9e6-fdc6-48ee-bf15-71e945ce936b\") " Oct 11 01:16:27 crc kubenswrapper[4743]: I1011 01:16:27.166121 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/451bf9e6-fdc6-48ee-bf15-71e945ce936b-combined-ca-bundle\") pod \"451bf9e6-fdc6-48ee-bf15-71e945ce936b\" (UID: \"451bf9e6-fdc6-48ee-bf15-71e945ce936b\") " Oct 11 01:16:27 crc kubenswrapper[4743]: I1011 01:16:27.166163 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z5z2p\" (UniqueName: \"kubernetes.io/projected/451bf9e6-fdc6-48ee-bf15-71e945ce936b-kube-api-access-z5z2p\") pod \"451bf9e6-fdc6-48ee-bf15-71e945ce936b\" (UID: \"451bf9e6-fdc6-48ee-bf15-71e945ce936b\") " Oct 11 01:16:27 crc kubenswrapper[4743]: I1011 01:16:27.166202 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/451bf9e6-fdc6-48ee-bf15-71e945ce936b-log-httpd\") pod \"451bf9e6-fdc6-48ee-bf15-71e945ce936b\" (UID: \"451bf9e6-fdc6-48ee-bf15-71e945ce936b\") " Oct 11 01:16:27 crc kubenswrapper[4743]: I1011 01:16:27.166257 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/451bf9e6-fdc6-48ee-bf15-71e945ce936b-run-httpd\") pod \"451bf9e6-fdc6-48ee-bf15-71e945ce936b\" (UID: \"451bf9e6-fdc6-48ee-bf15-71e945ce936b\") " Oct 11 01:16:27 crc kubenswrapper[4743]: I1011 01:16:27.166374 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/451bf9e6-fdc6-48ee-bf15-71e945ce936b-config-data\") pod \"451bf9e6-fdc6-48ee-bf15-71e945ce936b\" (UID: \"451bf9e6-fdc6-48ee-bf15-71e945ce936b\") " Oct 11 01:16:27 crc kubenswrapper[4743]: I1011 01:16:27.166396 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/451bf9e6-fdc6-48ee-bf15-71e945ce936b-ceilometer-tls-certs\") pod \"451bf9e6-fdc6-48ee-bf15-71e945ce936b\" (UID: \"451bf9e6-fdc6-48ee-bf15-71e945ce936b\") " Oct 11 01:16:27 crc kubenswrapper[4743]: I1011 01:16:27.166904 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/451bf9e6-fdc6-48ee-bf15-71e945ce936b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "451bf9e6-fdc6-48ee-bf15-71e945ce936b" (UID: "451bf9e6-fdc6-48ee-bf15-71e945ce936b"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:16:27 crc kubenswrapper[4743]: I1011 01:16:27.167080 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/451bf9e6-fdc6-48ee-bf15-71e945ce936b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "451bf9e6-fdc6-48ee-bf15-71e945ce936b" (UID: "451bf9e6-fdc6-48ee-bf15-71e945ce936b"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:16:27 crc kubenswrapper[4743]: I1011 01:16:27.172671 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/451bf9e6-fdc6-48ee-bf15-71e945ce936b-kube-api-access-z5z2p" (OuterVolumeSpecName: "kube-api-access-z5z2p") pod "451bf9e6-fdc6-48ee-bf15-71e945ce936b" (UID: "451bf9e6-fdc6-48ee-bf15-71e945ce936b"). InnerVolumeSpecName "kube-api-access-z5z2p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:16:27 crc kubenswrapper[4743]: I1011 01:16:27.172769 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/451bf9e6-fdc6-48ee-bf15-71e945ce936b-scripts" (OuterVolumeSpecName: "scripts") pod "451bf9e6-fdc6-48ee-bf15-71e945ce936b" (UID: "451bf9e6-fdc6-48ee-bf15-71e945ce936b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:16:27 crc kubenswrapper[4743]: I1011 01:16:27.200013 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/451bf9e6-fdc6-48ee-bf15-71e945ce936b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "451bf9e6-fdc6-48ee-bf15-71e945ce936b" (UID: "451bf9e6-fdc6-48ee-bf15-71e945ce936b"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:16:27 crc kubenswrapper[4743]: I1011 01:16:27.226039 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/451bf9e6-fdc6-48ee-bf15-71e945ce936b-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "451bf9e6-fdc6-48ee-bf15-71e945ce936b" (UID: "451bf9e6-fdc6-48ee-bf15-71e945ce936b"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:16:27 crc kubenswrapper[4743]: I1011 01:16:27.264327 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/451bf9e6-fdc6-48ee-bf15-71e945ce936b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "451bf9e6-fdc6-48ee-bf15-71e945ce936b" (UID: "451bf9e6-fdc6-48ee-bf15-71e945ce936b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:16:27 crc kubenswrapper[4743]: I1011 01:16:27.268364 4743 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/451bf9e6-fdc6-48ee-bf15-71e945ce936b-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 11 01:16:27 crc kubenswrapper[4743]: I1011 01:16:27.268395 4743 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/451bf9e6-fdc6-48ee-bf15-71e945ce936b-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 11 01:16:27 crc kubenswrapper[4743]: I1011 01:16:27.268407 4743 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/451bf9e6-fdc6-48ee-bf15-71e945ce936b-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 11 01:16:27 crc kubenswrapper[4743]: I1011 01:16:27.268423 4743 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/451bf9e6-fdc6-48ee-bf15-71e945ce936b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 11 01:16:27 crc kubenswrapper[4743]: I1011 01:16:27.268433 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/451bf9e6-fdc6-48ee-bf15-71e945ce936b-scripts\") on node \"crc\" DevicePath \"\"" Oct 11 01:16:27 crc kubenswrapper[4743]: I1011 01:16:27.268444 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/451bf9e6-fdc6-48ee-bf15-71e945ce936b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 01:16:27 crc kubenswrapper[4743]: I1011 01:16:27.268455 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z5z2p\" (UniqueName: \"kubernetes.io/projected/451bf9e6-fdc6-48ee-bf15-71e945ce936b-kube-api-access-z5z2p\") on node \"crc\" DevicePath \"\"" Oct 11 01:16:27 crc kubenswrapper[4743]: I1011 01:16:27.311034 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/451bf9e6-fdc6-48ee-bf15-71e945ce936b-config-data" (OuterVolumeSpecName: "config-data") pod "451bf9e6-fdc6-48ee-bf15-71e945ce936b" (UID: "451bf9e6-fdc6-48ee-bf15-71e945ce936b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:16:27 crc kubenswrapper[4743]: I1011 01:16:27.375696 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/451bf9e6-fdc6-48ee-bf15-71e945ce936b-config-data\") on node \"crc\" DevicePath \"\"" Oct 11 01:16:27 crc kubenswrapper[4743]: I1011 01:16:27.602919 4743 generic.go:334] "Generic (PLEG): container finished" podID="9f596550-b88a-49d7-9cff-cbc2d4149a2e" containerID="5088c25b9ecada27037e94ce63bb12f86d1d169720a14623b47acf2816b9127b" exitCode=0 Oct 11 01:16:27 crc kubenswrapper[4743]: I1011 01:16:27.602996 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9f596550-b88a-49d7-9cff-cbc2d4149a2e","Type":"ContainerDied","Data":"5088c25b9ecada27037e94ce63bb12f86d1d169720a14623b47acf2816b9127b"} Oct 11 01:16:27 crc kubenswrapper[4743]: I1011 01:16:27.605782 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"451bf9e6-fdc6-48ee-bf15-71e945ce936b","Type":"ContainerDied","Data":"2547a0d6b03010e05c27d65146ffe19d6e7d1ac0112a42b4d5e2a7b0deda7f86"} Oct 11 01:16:27 crc kubenswrapper[4743]: I1011 
01:16:27.605825 4743 scope.go:117] "RemoveContainer" containerID="ea0feab6f6c5b8e7f1db337e829961370cdd326e109deb30439e46749f2bf849" Oct 11 01:16:27 crc kubenswrapper[4743]: I1011 01:16:27.606015 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 11 01:16:27 crc kubenswrapper[4743]: I1011 01:16:27.643871 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 11 01:16:27 crc kubenswrapper[4743]: I1011 01:16:27.659245 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 11 01:16:27 crc kubenswrapper[4743]: I1011 01:16:27.671843 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 11 01:16:27 crc kubenswrapper[4743]: E1011 01:16:27.672283 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="451bf9e6-fdc6-48ee-bf15-71e945ce936b" containerName="proxy-httpd" Oct 11 01:16:27 crc kubenswrapper[4743]: I1011 01:16:27.672299 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="451bf9e6-fdc6-48ee-bf15-71e945ce936b" containerName="proxy-httpd" Oct 11 01:16:27 crc kubenswrapper[4743]: E1011 01:16:27.672324 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="451bf9e6-fdc6-48ee-bf15-71e945ce936b" containerName="ceilometer-central-agent" Oct 11 01:16:27 crc kubenswrapper[4743]: I1011 01:16:27.672331 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="451bf9e6-fdc6-48ee-bf15-71e945ce936b" containerName="ceilometer-central-agent" Oct 11 01:16:27 crc kubenswrapper[4743]: E1011 01:16:27.672382 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="451bf9e6-fdc6-48ee-bf15-71e945ce936b" containerName="sg-core" Oct 11 01:16:27 crc kubenswrapper[4743]: I1011 01:16:27.672388 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="451bf9e6-fdc6-48ee-bf15-71e945ce936b" containerName="sg-core" Oct 11 01:16:27 crc kubenswrapper[4743]: E1011 01:16:27.672406 4743 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="451bf9e6-fdc6-48ee-bf15-71e945ce936b" containerName="ceilometer-notification-agent" Oct 11 01:16:27 crc kubenswrapper[4743]: I1011 01:16:27.672412 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="451bf9e6-fdc6-48ee-bf15-71e945ce936b" containerName="ceilometer-notification-agent" Oct 11 01:16:27 crc kubenswrapper[4743]: I1011 01:16:27.672621 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="451bf9e6-fdc6-48ee-bf15-71e945ce936b" containerName="ceilometer-notification-agent" Oct 11 01:16:27 crc kubenswrapper[4743]: I1011 01:16:27.672632 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="451bf9e6-fdc6-48ee-bf15-71e945ce936b" containerName="proxy-httpd" Oct 11 01:16:27 crc kubenswrapper[4743]: I1011 01:16:27.672646 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="451bf9e6-fdc6-48ee-bf15-71e945ce936b" containerName="ceilometer-central-agent" Oct 11 01:16:27 crc kubenswrapper[4743]: I1011 01:16:27.672656 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="451bf9e6-fdc6-48ee-bf15-71e945ce936b" containerName="sg-core" Oct 11 01:16:27 crc kubenswrapper[4743]: I1011 01:16:27.674392 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 11 01:16:27 crc kubenswrapper[4743]: I1011 01:16:27.677600 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 11 01:16:27 crc kubenswrapper[4743]: I1011 01:16:27.678537 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 11 01:16:27 crc kubenswrapper[4743]: I1011 01:16:27.679588 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 11 01:16:27 crc kubenswrapper[4743]: I1011 01:16:27.684036 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 11 01:16:27 crc kubenswrapper[4743]: I1011 01:16:27.786246 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8z4p\" (UniqueName: \"kubernetes.io/projected/d54c922a-5193-47c9-9148-7fe897065885-kube-api-access-p8z4p\") pod \"ceilometer-0\" (UID: \"d54c922a-5193-47c9-9148-7fe897065885\") " pod="openstack/ceilometer-0" Oct 11 01:16:27 crc kubenswrapper[4743]: I1011 01:16:27.786289 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d54c922a-5193-47c9-9148-7fe897065885-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d54c922a-5193-47c9-9148-7fe897065885\") " pod="openstack/ceilometer-0" Oct 11 01:16:27 crc kubenswrapper[4743]: I1011 01:16:27.786674 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d54c922a-5193-47c9-9148-7fe897065885-config-data\") pod \"ceilometer-0\" (UID: \"d54c922a-5193-47c9-9148-7fe897065885\") " pod="openstack/ceilometer-0" Oct 11 01:16:27 crc kubenswrapper[4743]: I1011 01:16:27.786824 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d54c922a-5193-47c9-9148-7fe897065885-scripts\") pod \"ceilometer-0\" (UID: \"d54c922a-5193-47c9-9148-7fe897065885\") " pod="openstack/ceilometer-0" Oct 11 01:16:27 crc kubenswrapper[4743]: I1011 01:16:27.786961 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d54c922a-5193-47c9-9148-7fe897065885-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d54c922a-5193-47c9-9148-7fe897065885\") " pod="openstack/ceilometer-0" Oct 11 01:16:27 crc kubenswrapper[4743]: I1011 01:16:27.787117 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d54c922a-5193-47c9-9148-7fe897065885-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d54c922a-5193-47c9-9148-7fe897065885\") " pod="openstack/ceilometer-0" Oct 11 01:16:27 crc kubenswrapper[4743]: I1011 01:16:27.787191 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d54c922a-5193-47c9-9148-7fe897065885-run-httpd\") pod \"ceilometer-0\" (UID: \"d54c922a-5193-47c9-9148-7fe897065885\") " pod="openstack/ceilometer-0" Oct 11 01:16:27 crc kubenswrapper[4743]: I1011 01:16:27.787445 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d54c922a-5193-47c9-9148-7fe897065885-log-httpd\") pod \"ceilometer-0\" (UID: \"d54c922a-5193-47c9-9148-7fe897065885\") " pod="openstack/ceilometer-0" Oct 11 01:16:27 crc kubenswrapper[4743]: I1011 01:16:27.889254 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d54c922a-5193-47c9-9148-7fe897065885-config-data\") pod \"ceilometer-0\" (UID: 
\"d54c922a-5193-47c9-9148-7fe897065885\") " pod="openstack/ceilometer-0" Oct 11 01:16:27 crc kubenswrapper[4743]: I1011 01:16:27.889952 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d54c922a-5193-47c9-9148-7fe897065885-scripts\") pod \"ceilometer-0\" (UID: \"d54c922a-5193-47c9-9148-7fe897065885\") " pod="openstack/ceilometer-0" Oct 11 01:16:27 crc kubenswrapper[4743]: I1011 01:16:27.889988 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d54c922a-5193-47c9-9148-7fe897065885-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d54c922a-5193-47c9-9148-7fe897065885\") " pod="openstack/ceilometer-0" Oct 11 01:16:27 crc kubenswrapper[4743]: I1011 01:16:27.890029 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d54c922a-5193-47c9-9148-7fe897065885-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d54c922a-5193-47c9-9148-7fe897065885\") " pod="openstack/ceilometer-0" Oct 11 01:16:27 crc kubenswrapper[4743]: I1011 01:16:27.890043 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d54c922a-5193-47c9-9148-7fe897065885-run-httpd\") pod \"ceilometer-0\" (UID: \"d54c922a-5193-47c9-9148-7fe897065885\") " pod="openstack/ceilometer-0" Oct 11 01:16:27 crc kubenswrapper[4743]: I1011 01:16:27.890096 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d54c922a-5193-47c9-9148-7fe897065885-log-httpd\") pod \"ceilometer-0\" (UID: \"d54c922a-5193-47c9-9148-7fe897065885\") " pod="openstack/ceilometer-0" Oct 11 01:16:27 crc kubenswrapper[4743]: I1011 01:16:27.890190 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-p8z4p\" (UniqueName: \"kubernetes.io/projected/d54c922a-5193-47c9-9148-7fe897065885-kube-api-access-p8z4p\") pod \"ceilometer-0\" (UID: \"d54c922a-5193-47c9-9148-7fe897065885\") " pod="openstack/ceilometer-0" Oct 11 01:16:27 crc kubenswrapper[4743]: I1011 01:16:27.890208 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d54c922a-5193-47c9-9148-7fe897065885-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d54c922a-5193-47c9-9148-7fe897065885\") " pod="openstack/ceilometer-0" Oct 11 01:16:27 crc kubenswrapper[4743]: I1011 01:16:27.891700 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d54c922a-5193-47c9-9148-7fe897065885-log-httpd\") pod \"ceilometer-0\" (UID: \"d54c922a-5193-47c9-9148-7fe897065885\") " pod="openstack/ceilometer-0" Oct 11 01:16:27 crc kubenswrapper[4743]: I1011 01:16:27.891940 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d54c922a-5193-47c9-9148-7fe897065885-run-httpd\") pod \"ceilometer-0\" (UID: \"d54c922a-5193-47c9-9148-7fe897065885\") " pod="openstack/ceilometer-0" Oct 11 01:16:27 crc kubenswrapper[4743]: I1011 01:16:27.896091 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d54c922a-5193-47c9-9148-7fe897065885-config-data\") pod \"ceilometer-0\" (UID: \"d54c922a-5193-47c9-9148-7fe897065885\") " pod="openstack/ceilometer-0" Oct 11 01:16:27 crc kubenswrapper[4743]: I1011 01:16:27.899813 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d54c922a-5193-47c9-9148-7fe897065885-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d54c922a-5193-47c9-9148-7fe897065885\") " pod="openstack/ceilometer-0" Oct 11 01:16:27 crc 
kubenswrapper[4743]: I1011 01:16:27.900241 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d54c922a-5193-47c9-9148-7fe897065885-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d54c922a-5193-47c9-9148-7fe897065885\") " pod="openstack/ceilometer-0" Oct 11 01:16:27 crc kubenswrapper[4743]: I1011 01:16:27.907376 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d54c922a-5193-47c9-9148-7fe897065885-scripts\") pod \"ceilometer-0\" (UID: \"d54c922a-5193-47c9-9148-7fe897065885\") " pod="openstack/ceilometer-0" Oct 11 01:16:27 crc kubenswrapper[4743]: I1011 01:16:27.909245 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d54c922a-5193-47c9-9148-7fe897065885-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d54c922a-5193-47c9-9148-7fe897065885\") " pod="openstack/ceilometer-0" Oct 11 01:16:27 crc kubenswrapper[4743]: I1011 01:16:27.910197 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8z4p\" (UniqueName: \"kubernetes.io/projected/d54c922a-5193-47c9-9148-7fe897065885-kube-api-access-p8z4p\") pod \"ceilometer-0\" (UID: \"d54c922a-5193-47c9-9148-7fe897065885\") " pod="openstack/ceilometer-0" Oct 11 01:16:28 crc kubenswrapper[4743]: I1011 01:16:28.004114 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 11 01:16:28 crc kubenswrapper[4743]: I1011 01:16:28.109189 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="451bf9e6-fdc6-48ee-bf15-71e945ce936b" path="/var/lib/kubelet/pods/451bf9e6-fdc6-48ee-bf15-71e945ce936b/volumes" Oct 11 01:16:28 crc kubenswrapper[4743]: I1011 01:16:28.127348 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="73d7bbf0-dc76-4572-857e-fd0fb59d95cc" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.119:5671: connect: connection refused" Oct 11 01:16:28 crc kubenswrapper[4743]: I1011 01:16:28.438976 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="9f596550-b88a-49d7-9cff-cbc2d4149a2e" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.120:5671: connect: connection refused" Oct 11 01:16:31 crc kubenswrapper[4743]: I1011 01:16:31.192694 4743 scope.go:117] "RemoveContainer" containerID="d4636ca20154e3137bff423b88ca33d5c7cbfe4c57e05ccab5aa82ba31b0b869" Oct 11 01:16:31 crc kubenswrapper[4743]: E1011 01:16:31.230181 4743 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested" Oct 11 01:16:31 crc kubenswrapper[4743]: E1011 01:16:31.230233 4743 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested" Oct 11 01:16:31 crc kubenswrapper[4743]: E1011 01:16:31.230345 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:heat-db-sync,Image:quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested,Command:[/bin/bash],Args:[-c /usr/bin/heat-manage --config-dir 
/etc/heat/heat.conf.d db_sync],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/heat/heat.conf.d/00-default.conf,SubPath:00-default.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/heat/heat.conf.d/01-custom.conf,SubPath:01-custom.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-thcvg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42418,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42418,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-db-sync-lc4jr_openstack(b5a1515f-1a64-44da-b8e1-b41ff6aa1c8d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" 
logger="UnhandledError" Oct 11 01:16:31 crc kubenswrapper[4743]: E1011 01:16:31.231637 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/heat-db-sync-lc4jr" podUID="b5a1515f-1a64-44da-b8e1-b41ff6aa1c8d" Oct 11 01:16:31 crc kubenswrapper[4743]: I1011 01:16:31.323280 4743 scope.go:117] "RemoveContainer" containerID="84afac31df94ab945bf4321becb2ff9117d6213dbc89596c3a3150a8260b8ec2" Oct 11 01:16:31 crc kubenswrapper[4743]: I1011 01:16:31.360933 4743 scope.go:117] "RemoveContainer" containerID="4cbaa0837680d45cfc67e473eaabcfae066158bf679d1de6a06a1ff42b945fe6" Oct 11 01:16:31 crc kubenswrapper[4743]: I1011 01:16:31.662002 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9f596550-b88a-49d7-9cff-cbc2d4149a2e","Type":"ContainerDied","Data":"ea76cbc044cc60d25786766100c9f98b838a44f7b163f230250f6c671a788183"} Oct 11 01:16:31 crc kubenswrapper[4743]: I1011 01:16:31.662048 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea76cbc044cc60d25786766100c9f98b838a44f7b163f230250f6c671a788183" Oct 11 01:16:31 crc kubenswrapper[4743]: I1011 01:16:31.664754 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"73d7bbf0-dc76-4572-857e-fd0fb59d95cc","Type":"ContainerDied","Data":"bc12e6fe570e1b9fed4f696e94239d3b186f62e7adf30d9c2e4c008c659037a6"} Oct 11 01:16:31 crc kubenswrapper[4743]: I1011 01:16:31.664779 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc12e6fe570e1b9fed4f696e94239d3b186f62e7adf30d9c2e4c008c659037a6" Oct 11 01:16:31 crc kubenswrapper[4743]: E1011 01:16:31.665540 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested\\\"\"" pod="openstack/heat-db-sync-lc4jr" podUID="b5a1515f-1a64-44da-b8e1-b41ff6aa1c8d" Oct 11 01:16:31 crc kubenswrapper[4743]: I1011 01:16:31.690110 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 11 01:16:31 crc kubenswrapper[4743]: I1011 01:16:31.693421 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 11 01:16:31 crc kubenswrapper[4743]: I1011 01:16:31.774118 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/73d7bbf0-dc76-4572-857e-fd0fb59d95cc-rabbitmq-confd\") pod \"73d7bbf0-dc76-4572-857e-fd0fb59d95cc\" (UID: \"73d7bbf0-dc76-4572-857e-fd0fb59d95cc\") " Oct 11 01:16:31 crc kubenswrapper[4743]: I1011 01:16:31.774364 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"73d7bbf0-dc76-4572-857e-fd0fb59d95cc\" (UID: \"73d7bbf0-dc76-4572-857e-fd0fb59d95cc\") " Oct 11 01:16:31 crc kubenswrapper[4743]: I1011 01:16:31.774450 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/73d7bbf0-dc76-4572-857e-fd0fb59d95cc-config-data\") pod \"73d7bbf0-dc76-4572-857e-fd0fb59d95cc\" (UID: \"73d7bbf0-dc76-4572-857e-fd0fb59d95cc\") " Oct 11 01:16:31 crc kubenswrapper[4743]: I1011 01:16:31.774519 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/73d7bbf0-dc76-4572-857e-fd0fb59d95cc-plugins-conf\") pod \"73d7bbf0-dc76-4572-857e-fd0fb59d95cc\" (UID: \"73d7bbf0-dc76-4572-857e-fd0fb59d95cc\") " Oct 11 01:16:31 crc kubenswrapper[4743]: I1011 01:16:31.774599 4743 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/73d7bbf0-dc76-4572-857e-fd0fb59d95cc-server-conf\") pod \"73d7bbf0-dc76-4572-857e-fd0fb59d95cc\" (UID: \"73d7bbf0-dc76-4572-857e-fd0fb59d95cc\") " Oct 11 01:16:31 crc kubenswrapper[4743]: I1011 01:16:31.774684 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/73d7bbf0-dc76-4572-857e-fd0fb59d95cc-pod-info\") pod \"73d7bbf0-dc76-4572-857e-fd0fb59d95cc\" (UID: \"73d7bbf0-dc76-4572-857e-fd0fb59d95cc\") " Oct 11 01:16:31 crc kubenswrapper[4743]: I1011 01:16:31.774770 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mklbt\" (UniqueName: \"kubernetes.io/projected/73d7bbf0-dc76-4572-857e-fd0fb59d95cc-kube-api-access-mklbt\") pod \"73d7bbf0-dc76-4572-857e-fd0fb59d95cc\" (UID: \"73d7bbf0-dc76-4572-857e-fd0fb59d95cc\") " Oct 11 01:16:31 crc kubenswrapper[4743]: I1011 01:16:31.774888 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/73d7bbf0-dc76-4572-857e-fd0fb59d95cc-erlang-cookie-secret\") pod \"73d7bbf0-dc76-4572-857e-fd0fb59d95cc\" (UID: \"73d7bbf0-dc76-4572-857e-fd0fb59d95cc\") " Oct 11 01:16:31 crc kubenswrapper[4743]: I1011 01:16:31.774991 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/73d7bbf0-dc76-4572-857e-fd0fb59d95cc-rabbitmq-plugins\") pod \"73d7bbf0-dc76-4572-857e-fd0fb59d95cc\" (UID: \"73d7bbf0-dc76-4572-857e-fd0fb59d95cc\") " Oct 11 01:16:31 crc kubenswrapper[4743]: I1011 01:16:31.775080 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/73d7bbf0-dc76-4572-857e-fd0fb59d95cc-rabbitmq-tls\") pod 
\"73d7bbf0-dc76-4572-857e-fd0fb59d95cc\" (UID: \"73d7bbf0-dc76-4572-857e-fd0fb59d95cc\") " Oct 11 01:16:31 crc kubenswrapper[4743]: I1011 01:16:31.775151 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/73d7bbf0-dc76-4572-857e-fd0fb59d95cc-rabbitmq-erlang-cookie\") pod \"73d7bbf0-dc76-4572-857e-fd0fb59d95cc\" (UID: \"73d7bbf0-dc76-4572-857e-fd0fb59d95cc\") " Oct 11 01:16:31 crc kubenswrapper[4743]: I1011 01:16:31.776716 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73d7bbf0-dc76-4572-857e-fd0fb59d95cc-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "73d7bbf0-dc76-4572-857e-fd0fb59d95cc" (UID: "73d7bbf0-dc76-4572-857e-fd0fb59d95cc"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:16:31 crc kubenswrapper[4743]: I1011 01:16:31.779269 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73d7bbf0-dc76-4572-857e-fd0fb59d95cc-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "73d7bbf0-dc76-4572-857e-fd0fb59d95cc" (UID: "73d7bbf0-dc76-4572-857e-fd0fb59d95cc"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:16:31 crc kubenswrapper[4743]: I1011 01:16:31.781622 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73d7bbf0-dc76-4572-857e-fd0fb59d95cc-kube-api-access-mklbt" (OuterVolumeSpecName: "kube-api-access-mklbt") pod "73d7bbf0-dc76-4572-857e-fd0fb59d95cc" (UID: "73d7bbf0-dc76-4572-857e-fd0fb59d95cc"). InnerVolumeSpecName "kube-api-access-mklbt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:16:31 crc kubenswrapper[4743]: I1011 01:16:31.782717 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73d7bbf0-dc76-4572-857e-fd0fb59d95cc-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "73d7bbf0-dc76-4572-857e-fd0fb59d95cc" (UID: "73d7bbf0-dc76-4572-857e-fd0fb59d95cc"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:16:31 crc kubenswrapper[4743]: I1011 01:16:31.784011 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "73d7bbf0-dc76-4572-857e-fd0fb59d95cc" (UID: "73d7bbf0-dc76-4572-857e-fd0fb59d95cc"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 11 01:16:31 crc kubenswrapper[4743]: I1011 01:16:31.790518 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73d7bbf0-dc76-4572-857e-fd0fb59d95cc-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "73d7bbf0-dc76-4572-857e-fd0fb59d95cc" (UID: "73d7bbf0-dc76-4572-857e-fd0fb59d95cc"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:16:31 crc kubenswrapper[4743]: I1011 01:16:31.791332 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73d7bbf0-dc76-4572-857e-fd0fb59d95cc-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "73d7bbf0-dc76-4572-857e-fd0fb59d95cc" (UID: "73d7bbf0-dc76-4572-857e-fd0fb59d95cc"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:16:31 crc kubenswrapper[4743]: I1011 01:16:31.791728 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/73d7bbf0-dc76-4572-857e-fd0fb59d95cc-pod-info" (OuterVolumeSpecName: "pod-info") pod "73d7bbf0-dc76-4572-857e-fd0fb59d95cc" (UID: "73d7bbf0-dc76-4572-857e-fd0fb59d95cc"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 11 01:16:31 crc kubenswrapper[4743]: I1011 01:16:31.820739 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73d7bbf0-dc76-4572-857e-fd0fb59d95cc-config-data" (OuterVolumeSpecName: "config-data") pod "73d7bbf0-dc76-4572-857e-fd0fb59d95cc" (UID: "73d7bbf0-dc76-4572-857e-fd0fb59d95cc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:16:31 crc kubenswrapper[4743]: I1011 01:16:31.865793 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 11 01:16:31 crc kubenswrapper[4743]: I1011 01:16:31.867576 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73d7bbf0-dc76-4572-857e-fd0fb59d95cc-server-conf" (OuterVolumeSpecName: "server-conf") pod "73d7bbf0-dc76-4572-857e-fd0fb59d95cc" (UID: "73d7bbf0-dc76-4572-857e-fd0fb59d95cc"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:16:31 crc kubenswrapper[4743]: I1011 01:16:31.871569 4743 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 11 01:16:31 crc kubenswrapper[4743]: I1011 01:16:31.884445 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9f596550-b88a-49d7-9cff-cbc2d4149a2e-rabbitmq-tls\") pod \"9f596550-b88a-49d7-9cff-cbc2d4149a2e\" (UID: \"9f596550-b88a-49d7-9cff-cbc2d4149a2e\") " Oct 11 01:16:31 crc kubenswrapper[4743]: I1011 01:16:31.884500 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9f596550-b88a-49d7-9cff-cbc2d4149a2e-pod-info\") pod \"9f596550-b88a-49d7-9cff-cbc2d4149a2e\" (UID: \"9f596550-b88a-49d7-9cff-cbc2d4149a2e\") " Oct 11 01:16:31 crc kubenswrapper[4743]: I1011 01:16:31.884526 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8px2z\" (UniqueName: \"kubernetes.io/projected/9f596550-b88a-49d7-9cff-cbc2d4149a2e-kube-api-access-8px2z\") pod \"9f596550-b88a-49d7-9cff-cbc2d4149a2e\" (UID: \"9f596550-b88a-49d7-9cff-cbc2d4149a2e\") " Oct 11 01:16:31 crc kubenswrapper[4743]: I1011 01:16:31.884582 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9f596550-b88a-49d7-9cff-cbc2d4149a2e-rabbitmq-confd\") pod \"9f596550-b88a-49d7-9cff-cbc2d4149a2e\" (UID: \"9f596550-b88a-49d7-9cff-cbc2d4149a2e\") " Oct 11 01:16:31 crc kubenswrapper[4743]: I1011 01:16:31.884611 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9f596550-b88a-49d7-9cff-cbc2d4149a2e-erlang-cookie-secret\") pod \"9f596550-b88a-49d7-9cff-cbc2d4149a2e\" (UID: \"9f596550-b88a-49d7-9cff-cbc2d4149a2e\") " 
Oct 11 01:16:31 crc kubenswrapper[4743]: I1011 01:16:31.884672 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9f596550-b88a-49d7-9cff-cbc2d4149a2e-rabbitmq-erlang-cookie\") pod \"9f596550-b88a-49d7-9cff-cbc2d4149a2e\" (UID: \"9f596550-b88a-49d7-9cff-cbc2d4149a2e\") " Oct 11 01:16:31 crc kubenswrapper[4743]: I1011 01:16:31.884721 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9f596550-b88a-49d7-9cff-cbc2d4149a2e-plugins-conf\") pod \"9f596550-b88a-49d7-9cff-cbc2d4149a2e\" (UID: \"9f596550-b88a-49d7-9cff-cbc2d4149a2e\") " Oct 11 01:16:31 crc kubenswrapper[4743]: I1011 01:16:31.884745 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9f596550-b88a-49d7-9cff-cbc2d4149a2e-rabbitmq-plugins\") pod \"9f596550-b88a-49d7-9cff-cbc2d4149a2e\" (UID: \"9f596550-b88a-49d7-9cff-cbc2d4149a2e\") " Oct 11 01:16:31 crc kubenswrapper[4743]: I1011 01:16:31.884775 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9f596550-b88a-49d7-9cff-cbc2d4149a2e-config-data\") pod \"9f596550-b88a-49d7-9cff-cbc2d4149a2e\" (UID: \"9f596550-b88a-49d7-9cff-cbc2d4149a2e\") " Oct 11 01:16:31 crc kubenswrapper[4743]: I1011 01:16:31.884943 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"9f596550-b88a-49d7-9cff-cbc2d4149a2e\" (UID: \"9f596550-b88a-49d7-9cff-cbc2d4149a2e\") " Oct 11 01:16:31 crc kubenswrapper[4743]: I1011 01:16:31.884975 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/9f596550-b88a-49d7-9cff-cbc2d4149a2e-server-conf\") pod \"9f596550-b88a-49d7-9cff-cbc2d4149a2e\" (UID: \"9f596550-b88a-49d7-9cff-cbc2d4149a2e\") " Oct 11 01:16:31 crc kubenswrapper[4743]: I1011 01:16:31.885663 4743 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Oct 11 01:16:31 crc kubenswrapper[4743]: I1011 01:16:31.885681 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/73d7bbf0-dc76-4572-857e-fd0fb59d95cc-config-data\") on node \"crc\" DevicePath \"\"" Oct 11 01:16:31 crc kubenswrapper[4743]: I1011 01:16:31.885690 4743 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/73d7bbf0-dc76-4572-857e-fd0fb59d95cc-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 11 01:16:31 crc kubenswrapper[4743]: I1011 01:16:31.885700 4743 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/73d7bbf0-dc76-4572-857e-fd0fb59d95cc-server-conf\") on node \"crc\" DevicePath \"\"" Oct 11 01:16:31 crc kubenswrapper[4743]: I1011 01:16:31.885708 4743 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/73d7bbf0-dc76-4572-857e-fd0fb59d95cc-pod-info\") on node \"crc\" DevicePath \"\"" Oct 11 01:16:31 crc kubenswrapper[4743]: I1011 01:16:31.885716 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mklbt\" (UniqueName: \"kubernetes.io/projected/73d7bbf0-dc76-4572-857e-fd0fb59d95cc-kube-api-access-mklbt\") on node \"crc\" DevicePath \"\"" Oct 11 01:16:31 crc kubenswrapper[4743]: I1011 01:16:31.885725 4743 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/73d7bbf0-dc76-4572-857e-fd0fb59d95cc-erlang-cookie-secret\") 
on node \"crc\" DevicePath \"\"" Oct 11 01:16:31 crc kubenswrapper[4743]: I1011 01:16:31.889319 4743 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/73d7bbf0-dc76-4572-857e-fd0fb59d95cc-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 11 01:16:31 crc kubenswrapper[4743]: I1011 01:16:31.889341 4743 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/73d7bbf0-dc76-4572-857e-fd0fb59d95cc-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 11 01:16:31 crc kubenswrapper[4743]: I1011 01:16:31.889350 4743 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/73d7bbf0-dc76-4572-857e-fd0fb59d95cc-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 11 01:16:31 crc kubenswrapper[4743]: I1011 01:16:31.889329 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f596550-b88a-49d7-9cff-cbc2d4149a2e-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "9f596550-b88a-49d7-9cff-cbc2d4149a2e" (UID: "9f596550-b88a-49d7-9cff-cbc2d4149a2e"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:16:31 crc kubenswrapper[4743]: I1011 01:16:31.890161 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f596550-b88a-49d7-9cff-cbc2d4149a2e-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "9f596550-b88a-49d7-9cff-cbc2d4149a2e" (UID: "9f596550-b88a-49d7-9cff-cbc2d4149a2e"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:16:31 crc kubenswrapper[4743]: I1011 01:16:31.891169 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "persistence") pod "9f596550-b88a-49d7-9cff-cbc2d4149a2e" (UID: "9f596550-b88a-49d7-9cff-cbc2d4149a2e"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 11 01:16:31 crc kubenswrapper[4743]: I1011 01:16:31.891315 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f596550-b88a-49d7-9cff-cbc2d4149a2e-kube-api-access-8px2z" (OuterVolumeSpecName: "kube-api-access-8px2z") pod "9f596550-b88a-49d7-9cff-cbc2d4149a2e" (UID: "9f596550-b88a-49d7-9cff-cbc2d4149a2e"). InnerVolumeSpecName "kube-api-access-8px2z". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:16:31 crc kubenswrapper[4743]: I1011 01:16:31.892560 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f596550-b88a-49d7-9cff-cbc2d4149a2e-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "9f596550-b88a-49d7-9cff-cbc2d4149a2e" (UID: "9f596550-b88a-49d7-9cff-cbc2d4149a2e"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:16:31 crc kubenswrapper[4743]: I1011 01:16:31.896025 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f596550-b88a-49d7-9cff-cbc2d4149a2e-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "9f596550-b88a-49d7-9cff-cbc2d4149a2e" (UID: "9f596550-b88a-49d7-9cff-cbc2d4149a2e"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:16:31 crc kubenswrapper[4743]: I1011 01:16:31.908558 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f596550-b88a-49d7-9cff-cbc2d4149a2e-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "9f596550-b88a-49d7-9cff-cbc2d4149a2e" (UID: "9f596550-b88a-49d7-9cff-cbc2d4149a2e"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:16:31 crc kubenswrapper[4743]: I1011 01:16:31.918060 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/9f596550-b88a-49d7-9cff-cbc2d4149a2e-pod-info" (OuterVolumeSpecName: "pod-info") pod "9f596550-b88a-49d7-9cff-cbc2d4149a2e" (UID: "9f596550-b88a-49d7-9cff-cbc2d4149a2e"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 11 01:16:31 crc kubenswrapper[4743]: I1011 01:16:31.919702 4743 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Oct 11 01:16:31 crc kubenswrapper[4743]: I1011 01:16:31.929777 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f596550-b88a-49d7-9cff-cbc2d4149a2e-config-data" (OuterVolumeSpecName: "config-data") pod "9f596550-b88a-49d7-9cff-cbc2d4149a2e" (UID: "9f596550-b88a-49d7-9cff-cbc2d4149a2e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:16:31 crc kubenswrapper[4743]: I1011 01:16:31.951834 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73d7bbf0-dc76-4572-857e-fd0fb59d95cc-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "73d7bbf0-dc76-4572-857e-fd0fb59d95cc" (UID: "73d7bbf0-dc76-4572-857e-fd0fb59d95cc"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:16:31 crc kubenswrapper[4743]: I1011 01:16:31.965996 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f596550-b88a-49d7-9cff-cbc2d4149a2e-server-conf" (OuterVolumeSpecName: "server-conf") pod "9f596550-b88a-49d7-9cff-cbc2d4149a2e" (UID: "9f596550-b88a-49d7-9cff-cbc2d4149a2e"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:16:31 crc kubenswrapper[4743]: I1011 01:16:31.990604 4743 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9f596550-b88a-49d7-9cff-cbc2d4149a2e-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 11 01:16:31 crc kubenswrapper[4743]: I1011 01:16:31.990635 4743 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9f596550-b88a-49d7-9cff-cbc2d4149a2e-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 11 01:16:31 crc kubenswrapper[4743]: I1011 01:16:31.990646 4743 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9f596550-b88a-49d7-9cff-cbc2d4149a2e-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 11 01:16:31 crc kubenswrapper[4743]: I1011 01:16:31.990654 4743 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9f596550-b88a-49d7-9cff-cbc2d4149a2e-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 11 01:16:31 crc kubenswrapper[4743]: I1011 01:16:31.990663 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9f596550-b88a-49d7-9cff-cbc2d4149a2e-config-data\") on node \"crc\" DevicePath \"\"" Oct 11 01:16:31 crc kubenswrapper[4743]: I1011 01:16:31.990672 4743 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" 
(UniqueName: \"kubernetes.io/projected/73d7bbf0-dc76-4572-857e-fd0fb59d95cc-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 11 01:16:31 crc kubenswrapper[4743]: I1011 01:16:31.990697 4743 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Oct 11 01:16:31 crc kubenswrapper[4743]: I1011 01:16:31.990706 4743 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9f596550-b88a-49d7-9cff-cbc2d4149a2e-server-conf\") on node \"crc\" DevicePath \"\"" Oct 11 01:16:31 crc kubenswrapper[4743]: I1011 01:16:31.990716 4743 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9f596550-b88a-49d7-9cff-cbc2d4149a2e-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 11 01:16:31 crc kubenswrapper[4743]: I1011 01:16:31.990724 4743 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9f596550-b88a-49d7-9cff-cbc2d4149a2e-pod-info\") on node \"crc\" DevicePath \"\"" Oct 11 01:16:31 crc kubenswrapper[4743]: I1011 01:16:31.990733 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8px2z\" (UniqueName: \"kubernetes.io/projected/9f596550-b88a-49d7-9cff-cbc2d4149a2e-kube-api-access-8px2z\") on node \"crc\" DevicePath \"\"" Oct 11 01:16:31 crc kubenswrapper[4743]: I1011 01:16:31.990741 4743 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Oct 11 01:16:31 crc kubenswrapper[4743]: I1011 01:16:31.999924 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f596550-b88a-49d7-9cff-cbc2d4149a2e-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "9f596550-b88a-49d7-9cff-cbc2d4149a2e" (UID: 
"9f596550-b88a-49d7-9cff-cbc2d4149a2e"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:16:32 crc kubenswrapper[4743]: I1011 01:16:32.016778 4743 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Oct 11 01:16:32 crc kubenswrapper[4743]: I1011 01:16:32.094915 4743 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Oct 11 01:16:32 crc kubenswrapper[4743]: I1011 01:16:32.094967 4743 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9f596550-b88a-49d7-9cff-cbc2d4149a2e-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 11 01:16:32 crc kubenswrapper[4743]: I1011 01:16:32.679029 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 11 01:16:32 crc kubenswrapper[4743]: I1011 01:16:32.680133 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d54c922a-5193-47c9-9148-7fe897065885","Type":"ContainerStarted","Data":"515415f7e3246ac39bf40a24d7a66fa8a689384792eede987f63a33b3c55a763"} Oct 11 01:16:32 crc kubenswrapper[4743]: I1011 01:16:32.680273 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 11 01:16:32 crc kubenswrapper[4743]: I1011 01:16:32.707453 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 11 01:16:32 crc kubenswrapper[4743]: I1011 01:16:32.719699 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 11 01:16:32 crc kubenswrapper[4743]: I1011 01:16:32.727182 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 11 01:16:32 crc kubenswrapper[4743]: I1011 01:16:32.737317 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 11 01:16:32 crc kubenswrapper[4743]: I1011 01:16:32.747764 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 11 01:16:32 crc kubenswrapper[4743]: E1011 01:16:32.748207 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f596550-b88a-49d7-9cff-cbc2d4149a2e" containerName="rabbitmq" Oct 11 01:16:32 crc kubenswrapper[4743]: I1011 01:16:32.748223 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f596550-b88a-49d7-9cff-cbc2d4149a2e" containerName="rabbitmq" Oct 11 01:16:32 crc kubenswrapper[4743]: E1011 01:16:32.748242 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73d7bbf0-dc76-4572-857e-fd0fb59d95cc" containerName="setup-container" Oct 11 01:16:32 crc kubenswrapper[4743]: I1011 01:16:32.748250 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="73d7bbf0-dc76-4572-857e-fd0fb59d95cc" containerName="setup-container" Oct 11 01:16:32 crc kubenswrapper[4743]: E1011 01:16:32.748268 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73d7bbf0-dc76-4572-857e-fd0fb59d95cc" containerName="rabbitmq" Oct 11 01:16:32 crc kubenswrapper[4743]: I1011 01:16:32.748275 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="73d7bbf0-dc76-4572-857e-fd0fb59d95cc" 
containerName="rabbitmq" Oct 11 01:16:32 crc kubenswrapper[4743]: E1011 01:16:32.748299 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f596550-b88a-49d7-9cff-cbc2d4149a2e" containerName="setup-container" Oct 11 01:16:32 crc kubenswrapper[4743]: I1011 01:16:32.748308 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f596550-b88a-49d7-9cff-cbc2d4149a2e" containerName="setup-container" Oct 11 01:16:32 crc kubenswrapper[4743]: I1011 01:16:32.748563 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f596550-b88a-49d7-9cff-cbc2d4149a2e" containerName="rabbitmq" Oct 11 01:16:32 crc kubenswrapper[4743]: I1011 01:16:32.748579 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="73d7bbf0-dc76-4572-857e-fd0fb59d95cc" containerName="rabbitmq" Oct 11 01:16:32 crc kubenswrapper[4743]: I1011 01:16:32.749796 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 11 01:16:32 crc kubenswrapper[4743]: I1011 01:16:32.754709 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 11 01:16:32 crc kubenswrapper[4743]: I1011 01:16:32.754828 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-5sk4m" Oct 11 01:16:32 crc kubenswrapper[4743]: I1011 01:16:32.754872 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 11 01:16:32 crc kubenswrapper[4743]: I1011 01:16:32.754834 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 11 01:16:32 crc kubenswrapper[4743]: I1011 01:16:32.754781 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 11 01:16:32 crc kubenswrapper[4743]: I1011 01:16:32.754715 4743 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 11 01:16:32 crc kubenswrapper[4743]: I1011 01:16:32.756301 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 11 01:16:32 crc kubenswrapper[4743]: I1011 01:16:32.759279 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 11 01:16:32 crc kubenswrapper[4743]: I1011 01:16:32.767570 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 11 01:16:32 crc kubenswrapper[4743]: I1011 01:16:32.791172 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 11 01:16:32 crc kubenswrapper[4743]: I1011 01:16:32.794332 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 11 01:16:32 crc kubenswrapper[4743]: I1011 01:16:32.794416 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 11 01:16:32 crc kubenswrapper[4743]: I1011 01:16:32.794617 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 11 01:16:32 crc kubenswrapper[4743]: I1011 01:16:32.794739 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 11 01:16:32 crc kubenswrapper[4743]: I1011 01:16:32.794996 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 11 01:16:32 crc kubenswrapper[4743]: I1011 01:16:32.795112 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 11 01:16:32 crc kubenswrapper[4743]: I1011 01:16:32.795170 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-qgqm2" Oct 11 01:16:32 crc kubenswrapper[4743]: I1011 01:16:32.863907 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/rabbitmq-server-0"] Oct 11 01:16:32 crc kubenswrapper[4743]: I1011 01:16:32.917877 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/de84c29c-4168-4383-aadc-0d5cc0ba56f8-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"de84c29c-4168-4383-aadc-0d5cc0ba56f8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 11 01:16:32 crc kubenswrapper[4743]: I1011 01:16:32.917920 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/de84c29c-4168-4383-aadc-0d5cc0ba56f8-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"de84c29c-4168-4383-aadc-0d5cc0ba56f8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 11 01:16:32 crc kubenswrapper[4743]: I1011 01:16:32.917956 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/de84c29c-4168-4383-aadc-0d5cc0ba56f8-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"de84c29c-4168-4383-aadc-0d5cc0ba56f8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 11 01:16:32 crc kubenswrapper[4743]: I1011 01:16:32.917987 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/de84c29c-4168-4383-aadc-0d5cc0ba56f8-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"de84c29c-4168-4383-aadc-0d5cc0ba56f8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 11 01:16:32 crc kubenswrapper[4743]: I1011 01:16:32.918006 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/38225901-8300-41cc-8e32-748b754660dc-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"38225901-8300-41cc-8e32-748b754660dc\") " 
pod="openstack/rabbitmq-server-0" Oct 11 01:16:32 crc kubenswrapper[4743]: I1011 01:16:32.918059 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/de84c29c-4168-4383-aadc-0d5cc0ba56f8-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"de84c29c-4168-4383-aadc-0d5cc0ba56f8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 11 01:16:32 crc kubenswrapper[4743]: I1011 01:16:32.918082 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/de84c29c-4168-4383-aadc-0d5cc0ba56f8-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"de84c29c-4168-4383-aadc-0d5cc0ba56f8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 11 01:16:32 crc kubenswrapper[4743]: I1011 01:16:32.918134 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/38225901-8300-41cc-8e32-748b754660dc-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"38225901-8300-41cc-8e32-748b754660dc\") " pod="openstack/rabbitmq-server-0" Oct 11 01:16:32 crc kubenswrapper[4743]: I1011 01:16:32.918157 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/38225901-8300-41cc-8e32-748b754660dc-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"38225901-8300-41cc-8e32-748b754660dc\") " pod="openstack/rabbitmq-server-0" Oct 11 01:16:32 crc kubenswrapper[4743]: I1011 01:16:32.918192 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdf5b\" (UniqueName: \"kubernetes.io/projected/de84c29c-4168-4383-aadc-0d5cc0ba56f8-kube-api-access-bdf5b\") pod \"rabbitmq-cell1-server-0\" (UID: \"de84c29c-4168-4383-aadc-0d5cc0ba56f8\") " 
pod="openstack/rabbitmq-cell1-server-0" Oct 11 01:16:32 crc kubenswrapper[4743]: I1011 01:16:32.918214 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/38225901-8300-41cc-8e32-748b754660dc-pod-info\") pod \"rabbitmq-server-0\" (UID: \"38225901-8300-41cc-8e32-748b754660dc\") " pod="openstack/rabbitmq-server-0" Oct 11 01:16:32 crc kubenswrapper[4743]: I1011 01:16:32.918229 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/38225901-8300-41cc-8e32-748b754660dc-config-data\") pod \"rabbitmq-server-0\" (UID: \"38225901-8300-41cc-8e32-748b754660dc\") " pod="openstack/rabbitmq-server-0" Oct 11 01:16:32 crc kubenswrapper[4743]: I1011 01:16:32.918261 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/de84c29c-4168-4383-aadc-0d5cc0ba56f8-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"de84c29c-4168-4383-aadc-0d5cc0ba56f8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 11 01:16:32 crc kubenswrapper[4743]: I1011 01:16:32.918307 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/de84c29c-4168-4383-aadc-0d5cc0ba56f8-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"de84c29c-4168-4383-aadc-0d5cc0ba56f8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 11 01:16:32 crc kubenswrapper[4743]: I1011 01:16:32.918342 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/38225901-8300-41cc-8e32-748b754660dc-server-conf\") pod \"rabbitmq-server-0\" (UID: \"38225901-8300-41cc-8e32-748b754660dc\") " pod="openstack/rabbitmq-server-0" Oct 11 01:16:32 crc 
kubenswrapper[4743]: I1011 01:16:32.918373 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"38225901-8300-41cc-8e32-748b754660dc\") " pod="openstack/rabbitmq-server-0" Oct 11 01:16:32 crc kubenswrapper[4743]: I1011 01:16:32.918424 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/38225901-8300-41cc-8e32-748b754660dc-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"38225901-8300-41cc-8e32-748b754660dc\") " pod="openstack/rabbitmq-server-0" Oct 11 01:16:32 crc kubenswrapper[4743]: I1011 01:16:32.918445 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m59k2\" (UniqueName: \"kubernetes.io/projected/38225901-8300-41cc-8e32-748b754660dc-kube-api-access-m59k2\") pod \"rabbitmq-server-0\" (UID: \"38225901-8300-41cc-8e32-748b754660dc\") " pod="openstack/rabbitmq-server-0" Oct 11 01:16:32 crc kubenswrapper[4743]: I1011 01:16:32.918463 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/38225901-8300-41cc-8e32-748b754660dc-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"38225901-8300-41cc-8e32-748b754660dc\") " pod="openstack/rabbitmq-server-0" Oct 11 01:16:32 crc kubenswrapper[4743]: I1011 01:16:32.918502 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/38225901-8300-41cc-8e32-748b754660dc-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"38225901-8300-41cc-8e32-748b754660dc\") " pod="openstack/rabbitmq-server-0" Oct 11 01:16:32 crc kubenswrapper[4743]: I1011 01:16:32.918538 4743 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/de84c29c-4168-4383-aadc-0d5cc0ba56f8-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"de84c29c-4168-4383-aadc-0d5cc0ba56f8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 11 01:16:32 crc kubenswrapper[4743]: I1011 01:16:32.918568 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"de84c29c-4168-4383-aadc-0d5cc0ba56f8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 11 01:16:33 crc kubenswrapper[4743]: I1011 01:16:33.020072 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/38225901-8300-41cc-8e32-748b754660dc-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"38225901-8300-41cc-8e32-748b754660dc\") " pod="openstack/rabbitmq-server-0" Oct 11 01:16:33 crc kubenswrapper[4743]: I1011 01:16:33.020116 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m59k2\" (UniqueName: \"kubernetes.io/projected/38225901-8300-41cc-8e32-748b754660dc-kube-api-access-m59k2\") pod \"rabbitmq-server-0\" (UID: \"38225901-8300-41cc-8e32-748b754660dc\") " pod="openstack/rabbitmq-server-0" Oct 11 01:16:33 crc kubenswrapper[4743]: I1011 01:16:33.020143 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/38225901-8300-41cc-8e32-748b754660dc-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"38225901-8300-41cc-8e32-748b754660dc\") " pod="openstack/rabbitmq-server-0" Oct 11 01:16:33 crc kubenswrapper[4743]: I1011 01:16:33.020166 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/38225901-8300-41cc-8e32-748b754660dc-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"38225901-8300-41cc-8e32-748b754660dc\") " pod="openstack/rabbitmq-server-0" Oct 11 01:16:33 crc kubenswrapper[4743]: I1011 01:16:33.020202 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/de84c29c-4168-4383-aadc-0d5cc0ba56f8-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"de84c29c-4168-4383-aadc-0d5cc0ba56f8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 11 01:16:33 crc kubenswrapper[4743]: I1011 01:16:33.020220 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"de84c29c-4168-4383-aadc-0d5cc0ba56f8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 11 01:16:33 crc kubenswrapper[4743]: I1011 01:16:33.020244 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/de84c29c-4168-4383-aadc-0d5cc0ba56f8-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"de84c29c-4168-4383-aadc-0d5cc0ba56f8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 11 01:16:33 crc kubenswrapper[4743]: I1011 01:16:33.020261 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/de84c29c-4168-4383-aadc-0d5cc0ba56f8-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"de84c29c-4168-4383-aadc-0d5cc0ba56f8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 11 01:16:33 crc kubenswrapper[4743]: I1011 01:16:33.020281 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/de84c29c-4168-4383-aadc-0d5cc0ba56f8-rabbitmq-confd\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"de84c29c-4168-4383-aadc-0d5cc0ba56f8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 11 01:16:33 crc kubenswrapper[4743]: I1011 01:16:33.020300 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/de84c29c-4168-4383-aadc-0d5cc0ba56f8-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"de84c29c-4168-4383-aadc-0d5cc0ba56f8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 11 01:16:33 crc kubenswrapper[4743]: I1011 01:16:33.020315 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/38225901-8300-41cc-8e32-748b754660dc-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"38225901-8300-41cc-8e32-748b754660dc\") " pod="openstack/rabbitmq-server-0" Oct 11 01:16:33 crc kubenswrapper[4743]: I1011 01:16:33.020349 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/de84c29c-4168-4383-aadc-0d5cc0ba56f8-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"de84c29c-4168-4383-aadc-0d5cc0ba56f8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 11 01:16:33 crc kubenswrapper[4743]: I1011 01:16:33.020371 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/de84c29c-4168-4383-aadc-0d5cc0ba56f8-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"de84c29c-4168-4383-aadc-0d5cc0ba56f8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 11 01:16:33 crc kubenswrapper[4743]: I1011 01:16:33.020406 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/38225901-8300-41cc-8e32-748b754660dc-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"38225901-8300-41cc-8e32-748b754660dc\") " 
pod="openstack/rabbitmq-server-0" Oct 11 01:16:33 crc kubenswrapper[4743]: I1011 01:16:33.020427 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/38225901-8300-41cc-8e32-748b754660dc-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"38225901-8300-41cc-8e32-748b754660dc\") " pod="openstack/rabbitmq-server-0" Oct 11 01:16:33 crc kubenswrapper[4743]: I1011 01:16:33.020444 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdf5b\" (UniqueName: \"kubernetes.io/projected/de84c29c-4168-4383-aadc-0d5cc0ba56f8-kube-api-access-bdf5b\") pod \"rabbitmq-cell1-server-0\" (UID: \"de84c29c-4168-4383-aadc-0d5cc0ba56f8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 11 01:16:33 crc kubenswrapper[4743]: I1011 01:16:33.020461 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/38225901-8300-41cc-8e32-748b754660dc-pod-info\") pod \"rabbitmq-server-0\" (UID: \"38225901-8300-41cc-8e32-748b754660dc\") " pod="openstack/rabbitmq-server-0" Oct 11 01:16:33 crc kubenswrapper[4743]: I1011 01:16:33.020475 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/38225901-8300-41cc-8e32-748b754660dc-config-data\") pod \"rabbitmq-server-0\" (UID: \"38225901-8300-41cc-8e32-748b754660dc\") " pod="openstack/rabbitmq-server-0" Oct 11 01:16:33 crc kubenswrapper[4743]: I1011 01:16:33.020490 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/de84c29c-4168-4383-aadc-0d5cc0ba56f8-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"de84c29c-4168-4383-aadc-0d5cc0ba56f8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 11 01:16:33 crc kubenswrapper[4743]: I1011 01:16:33.020506 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/de84c29c-4168-4383-aadc-0d5cc0ba56f8-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"de84c29c-4168-4383-aadc-0d5cc0ba56f8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 11 01:16:33 crc kubenswrapper[4743]: I1011 01:16:33.020522 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/38225901-8300-41cc-8e32-748b754660dc-server-conf\") pod \"rabbitmq-server-0\" (UID: \"38225901-8300-41cc-8e32-748b754660dc\") " pod="openstack/rabbitmq-server-0" Oct 11 01:16:33 crc kubenswrapper[4743]: I1011 01:16:33.020548 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"38225901-8300-41cc-8e32-748b754660dc\") " pod="openstack/rabbitmq-server-0" Oct 11 01:16:33 crc kubenswrapper[4743]: I1011 01:16:33.020638 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/38225901-8300-41cc-8e32-748b754660dc-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"38225901-8300-41cc-8e32-748b754660dc\") " pod="openstack/rabbitmq-server-0" Oct 11 01:16:33 crc kubenswrapper[4743]: I1011 01:16:33.021505 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"de84c29c-4168-4383-aadc-0d5cc0ba56f8\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/rabbitmq-cell1-server-0" Oct 11 01:16:33 crc kubenswrapper[4743]: I1011 01:16:33.022184 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/de84c29c-4168-4383-aadc-0d5cc0ba56f8-plugins-conf\") 
pod \"rabbitmq-cell1-server-0\" (UID: \"de84c29c-4168-4383-aadc-0d5cc0ba56f8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 11 01:16:33 crc kubenswrapper[4743]: I1011 01:16:33.023735 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/de84c29c-4168-4383-aadc-0d5cc0ba56f8-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"de84c29c-4168-4383-aadc-0d5cc0ba56f8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 11 01:16:33 crc kubenswrapper[4743]: I1011 01:16:33.024100 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/38225901-8300-41cc-8e32-748b754660dc-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"38225901-8300-41cc-8e32-748b754660dc\") " pod="openstack/rabbitmq-server-0" Oct 11 01:16:33 crc kubenswrapper[4743]: I1011 01:16:33.024163 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/38225901-8300-41cc-8e32-748b754660dc-config-data\") pod \"rabbitmq-server-0\" (UID: \"38225901-8300-41cc-8e32-748b754660dc\") " pod="openstack/rabbitmq-server-0" Oct 11 01:16:33 crc kubenswrapper[4743]: I1011 01:16:33.024323 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/38225901-8300-41cc-8e32-748b754660dc-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"38225901-8300-41cc-8e32-748b754660dc\") " pod="openstack/rabbitmq-server-0" Oct 11 01:16:33 crc kubenswrapper[4743]: I1011 01:16:33.024549 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"38225901-8300-41cc-8e32-748b754660dc\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0" Oct 11 01:16:33 crc kubenswrapper[4743]: I1011 
01:16:33.024673 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/de84c29c-4168-4383-aadc-0d5cc0ba56f8-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"de84c29c-4168-4383-aadc-0d5cc0ba56f8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 11 01:16:33 crc kubenswrapper[4743]: I1011 01:16:33.026395 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/de84c29c-4168-4383-aadc-0d5cc0ba56f8-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"de84c29c-4168-4383-aadc-0d5cc0ba56f8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 11 01:16:33 crc kubenswrapper[4743]: I1011 01:16:33.026686 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/de84c29c-4168-4383-aadc-0d5cc0ba56f8-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"de84c29c-4168-4383-aadc-0d5cc0ba56f8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 11 01:16:33 crc kubenswrapper[4743]: I1011 01:16:33.028177 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/38225901-8300-41cc-8e32-748b754660dc-server-conf\") pod \"rabbitmq-server-0\" (UID: \"38225901-8300-41cc-8e32-748b754660dc\") " pod="openstack/rabbitmq-server-0" Oct 11 01:16:33 crc kubenswrapper[4743]: I1011 01:16:33.028201 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/de84c29c-4168-4383-aadc-0d5cc0ba56f8-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"de84c29c-4168-4383-aadc-0d5cc0ba56f8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 11 01:16:33 crc kubenswrapper[4743]: I1011 01:16:33.028606 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/de84c29c-4168-4383-aadc-0d5cc0ba56f8-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"de84c29c-4168-4383-aadc-0d5cc0ba56f8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 11 01:16:33 crc kubenswrapper[4743]: I1011 01:16:33.028646 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/de84c29c-4168-4383-aadc-0d5cc0ba56f8-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"de84c29c-4168-4383-aadc-0d5cc0ba56f8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 11 01:16:33 crc kubenswrapper[4743]: I1011 01:16:33.029230 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/38225901-8300-41cc-8e32-748b754660dc-pod-info\") pod \"rabbitmq-server-0\" (UID: \"38225901-8300-41cc-8e32-748b754660dc\") " pod="openstack/rabbitmq-server-0" Oct 11 01:16:33 crc kubenswrapper[4743]: I1011 01:16:33.040587 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/de84c29c-4168-4383-aadc-0d5cc0ba56f8-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"de84c29c-4168-4383-aadc-0d5cc0ba56f8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 11 01:16:33 crc kubenswrapper[4743]: I1011 01:16:33.041438 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m59k2\" (UniqueName: \"kubernetes.io/projected/38225901-8300-41cc-8e32-748b754660dc-kube-api-access-m59k2\") pod \"rabbitmq-server-0\" (UID: \"38225901-8300-41cc-8e32-748b754660dc\") " pod="openstack/rabbitmq-server-0" Oct 11 01:16:33 crc kubenswrapper[4743]: I1011 01:16:33.043485 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/38225901-8300-41cc-8e32-748b754660dc-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"38225901-8300-41cc-8e32-748b754660dc\") " 
pod="openstack/rabbitmq-server-0" Oct 11 01:16:33 crc kubenswrapper[4743]: I1011 01:16:33.045618 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/38225901-8300-41cc-8e32-748b754660dc-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"38225901-8300-41cc-8e32-748b754660dc\") " pod="openstack/rabbitmq-server-0" Oct 11 01:16:33 crc kubenswrapper[4743]: I1011 01:16:33.046220 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/38225901-8300-41cc-8e32-748b754660dc-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"38225901-8300-41cc-8e32-748b754660dc\") " pod="openstack/rabbitmq-server-0" Oct 11 01:16:33 crc kubenswrapper[4743]: I1011 01:16:33.054380 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdf5b\" (UniqueName: \"kubernetes.io/projected/de84c29c-4168-4383-aadc-0d5cc0ba56f8-kube-api-access-bdf5b\") pod \"rabbitmq-cell1-server-0\" (UID: \"de84c29c-4168-4383-aadc-0d5cc0ba56f8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 11 01:16:33 crc kubenswrapper[4743]: I1011 01:16:33.086513 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"38225901-8300-41cc-8e32-748b754660dc\") " pod="openstack/rabbitmq-server-0" Oct 11 01:16:33 crc kubenswrapper[4743]: I1011 01:16:33.087009 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"de84c29c-4168-4383-aadc-0d5cc0ba56f8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 11 01:16:33 crc kubenswrapper[4743]: I1011 01:16:33.123819 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 11 01:16:33 crc kubenswrapper[4743]: I1011 01:16:33.141140 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 11 01:16:33 crc kubenswrapper[4743]: I1011 01:16:33.638281 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 11 01:16:33 crc kubenswrapper[4743]: W1011 01:16:33.653181 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde84c29c_4168_4383_aadc_0d5cc0ba56f8.slice/crio-774fc1c25472a7027e59c22f8e5bb5eac0220e4ad236789038f8c3f27f1ecbbf WatchSource:0}: Error finding container 774fc1c25472a7027e59c22f8e5bb5eac0220e4ad236789038f8c3f27f1ecbbf: Status 404 returned error can't find the container with id 774fc1c25472a7027e59c22f8e5bb5eac0220e4ad236789038f8c3f27f1ecbbf Oct 11 01:16:33 crc kubenswrapper[4743]: I1011 01:16:33.678292 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 11 01:16:33 crc kubenswrapper[4743]: I1011 01:16:33.690744 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"de84c29c-4168-4383-aadc-0d5cc0ba56f8","Type":"ContainerStarted","Data":"774fc1c25472a7027e59c22f8e5bb5eac0220e4ad236789038f8c3f27f1ecbbf"} Oct 11 01:16:33 crc kubenswrapper[4743]: I1011 01:16:33.692032 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"38225901-8300-41cc-8e32-748b754660dc","Type":"ContainerStarted","Data":"ca66cc77ca565fff3fcece9d062086d82468afbc017cec4c310dd4b659d785b3"} Oct 11 01:16:34 crc kubenswrapper[4743]: I1011 01:16:34.025390 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b75489c6f-b6r4b"] Oct 11 01:16:34 crc kubenswrapper[4743]: I1011 01:16:34.027128 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b75489c6f-b6r4b" Oct 11 01:16:34 crc kubenswrapper[4743]: I1011 01:16:34.029437 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Oct 11 01:16:34 crc kubenswrapper[4743]: I1011 01:16:34.043805 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b75489c6f-b6r4b"] Oct 11 01:16:34 crc kubenswrapper[4743]: I1011 01:16:34.103562 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73d7bbf0-dc76-4572-857e-fd0fb59d95cc" path="/var/lib/kubelet/pods/73d7bbf0-dc76-4572-857e-fd0fb59d95cc/volumes" Oct 11 01:16:34 crc kubenswrapper[4743]: I1011 01:16:34.107021 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f596550-b88a-49d7-9cff-cbc2d4149a2e" path="/var/lib/kubelet/pods/9f596550-b88a-49d7-9cff-cbc2d4149a2e/volumes" Oct 11 01:16:34 crc kubenswrapper[4743]: I1011 01:16:34.157060 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nn6tj\" (UniqueName: \"kubernetes.io/projected/05b1ed8b-8016-4523-afc2-b00ce65b4042-kube-api-access-nn6tj\") pod \"dnsmasq-dns-5b75489c6f-b6r4b\" (UID: \"05b1ed8b-8016-4523-afc2-b00ce65b4042\") " pod="openstack/dnsmasq-dns-5b75489c6f-b6r4b" Oct 11 01:16:34 crc kubenswrapper[4743]: I1011 01:16:34.157152 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/05b1ed8b-8016-4523-afc2-b00ce65b4042-dns-svc\") pod \"dnsmasq-dns-5b75489c6f-b6r4b\" (UID: \"05b1ed8b-8016-4523-afc2-b00ce65b4042\") " pod="openstack/dnsmasq-dns-5b75489c6f-b6r4b" Oct 11 01:16:34 crc kubenswrapper[4743]: I1011 01:16:34.157206 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/configmap/05b1ed8b-8016-4523-afc2-b00ce65b4042-openstack-edpm-ipam\") pod \"dnsmasq-dns-5b75489c6f-b6r4b\" (UID: \"05b1ed8b-8016-4523-afc2-b00ce65b4042\") " pod="openstack/dnsmasq-dns-5b75489c6f-b6r4b" Oct 11 01:16:34 crc kubenswrapper[4743]: I1011 01:16:34.157261 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/05b1ed8b-8016-4523-afc2-b00ce65b4042-dns-swift-storage-0\") pod \"dnsmasq-dns-5b75489c6f-b6r4b\" (UID: \"05b1ed8b-8016-4523-afc2-b00ce65b4042\") " pod="openstack/dnsmasq-dns-5b75489c6f-b6r4b" Oct 11 01:16:34 crc kubenswrapper[4743]: I1011 01:16:34.157317 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/05b1ed8b-8016-4523-afc2-b00ce65b4042-ovsdbserver-nb\") pod \"dnsmasq-dns-5b75489c6f-b6r4b\" (UID: \"05b1ed8b-8016-4523-afc2-b00ce65b4042\") " pod="openstack/dnsmasq-dns-5b75489c6f-b6r4b" Oct 11 01:16:34 crc kubenswrapper[4743]: I1011 01:16:34.157406 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05b1ed8b-8016-4523-afc2-b00ce65b4042-config\") pod \"dnsmasq-dns-5b75489c6f-b6r4b\" (UID: \"05b1ed8b-8016-4523-afc2-b00ce65b4042\") " pod="openstack/dnsmasq-dns-5b75489c6f-b6r4b" Oct 11 01:16:34 crc kubenswrapper[4743]: I1011 01:16:34.157421 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/05b1ed8b-8016-4523-afc2-b00ce65b4042-ovsdbserver-sb\") pod \"dnsmasq-dns-5b75489c6f-b6r4b\" (UID: \"05b1ed8b-8016-4523-afc2-b00ce65b4042\") " pod="openstack/dnsmasq-dns-5b75489c6f-b6r4b" Oct 11 01:16:34 crc kubenswrapper[4743]: I1011 01:16:34.259388 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/05b1ed8b-8016-4523-afc2-b00ce65b4042-ovsdbserver-sb\") pod \"dnsmasq-dns-5b75489c6f-b6r4b\" (UID: \"05b1ed8b-8016-4523-afc2-b00ce65b4042\") " pod="openstack/dnsmasq-dns-5b75489c6f-b6r4b" Oct 11 01:16:34 crc kubenswrapper[4743]: I1011 01:16:34.259441 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05b1ed8b-8016-4523-afc2-b00ce65b4042-config\") pod \"dnsmasq-dns-5b75489c6f-b6r4b\" (UID: \"05b1ed8b-8016-4523-afc2-b00ce65b4042\") " pod="openstack/dnsmasq-dns-5b75489c6f-b6r4b" Oct 11 01:16:34 crc kubenswrapper[4743]: I1011 01:16:34.259512 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nn6tj\" (UniqueName: \"kubernetes.io/projected/05b1ed8b-8016-4523-afc2-b00ce65b4042-kube-api-access-nn6tj\") pod \"dnsmasq-dns-5b75489c6f-b6r4b\" (UID: \"05b1ed8b-8016-4523-afc2-b00ce65b4042\") " pod="openstack/dnsmasq-dns-5b75489c6f-b6r4b" Oct 11 01:16:34 crc kubenswrapper[4743]: I1011 01:16:34.259550 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/05b1ed8b-8016-4523-afc2-b00ce65b4042-dns-svc\") pod \"dnsmasq-dns-5b75489c6f-b6r4b\" (UID: \"05b1ed8b-8016-4523-afc2-b00ce65b4042\") " pod="openstack/dnsmasq-dns-5b75489c6f-b6r4b" Oct 11 01:16:34 crc kubenswrapper[4743]: I1011 01:16:34.259590 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/05b1ed8b-8016-4523-afc2-b00ce65b4042-openstack-edpm-ipam\") pod \"dnsmasq-dns-5b75489c6f-b6r4b\" (UID: \"05b1ed8b-8016-4523-afc2-b00ce65b4042\") " pod="openstack/dnsmasq-dns-5b75489c6f-b6r4b" Oct 11 01:16:34 crc kubenswrapper[4743]: I1011 01:16:34.259684 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/05b1ed8b-8016-4523-afc2-b00ce65b4042-dns-swift-storage-0\") pod \"dnsmasq-dns-5b75489c6f-b6r4b\" (UID: \"05b1ed8b-8016-4523-afc2-b00ce65b4042\") " pod="openstack/dnsmasq-dns-5b75489c6f-b6r4b" Oct 11 01:16:34 crc kubenswrapper[4743]: I1011 01:16:34.259740 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/05b1ed8b-8016-4523-afc2-b00ce65b4042-ovsdbserver-nb\") pod \"dnsmasq-dns-5b75489c6f-b6r4b\" (UID: \"05b1ed8b-8016-4523-afc2-b00ce65b4042\") " pod="openstack/dnsmasq-dns-5b75489c6f-b6r4b" Oct 11 01:16:34 crc kubenswrapper[4743]: I1011 01:16:34.260646 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/05b1ed8b-8016-4523-afc2-b00ce65b4042-ovsdbserver-nb\") pod \"dnsmasq-dns-5b75489c6f-b6r4b\" (UID: \"05b1ed8b-8016-4523-afc2-b00ce65b4042\") " pod="openstack/dnsmasq-dns-5b75489c6f-b6r4b" Oct 11 01:16:34 crc kubenswrapper[4743]: I1011 01:16:34.261267 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/05b1ed8b-8016-4523-afc2-b00ce65b4042-ovsdbserver-sb\") pod \"dnsmasq-dns-5b75489c6f-b6r4b\" (UID: \"05b1ed8b-8016-4523-afc2-b00ce65b4042\") " pod="openstack/dnsmasq-dns-5b75489c6f-b6r4b" Oct 11 01:16:34 crc kubenswrapper[4743]: I1011 01:16:34.261800 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05b1ed8b-8016-4523-afc2-b00ce65b4042-config\") pod \"dnsmasq-dns-5b75489c6f-b6r4b\" (UID: \"05b1ed8b-8016-4523-afc2-b00ce65b4042\") " pod="openstack/dnsmasq-dns-5b75489c6f-b6r4b" Oct 11 01:16:34 crc kubenswrapper[4743]: I1011 01:16:34.262696 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/05b1ed8b-8016-4523-afc2-b00ce65b4042-openstack-edpm-ipam\") pod 
\"dnsmasq-dns-5b75489c6f-b6r4b\" (UID: \"05b1ed8b-8016-4523-afc2-b00ce65b4042\") " pod="openstack/dnsmasq-dns-5b75489c6f-b6r4b" Oct 11 01:16:34 crc kubenswrapper[4743]: I1011 01:16:34.262878 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/05b1ed8b-8016-4523-afc2-b00ce65b4042-dns-swift-storage-0\") pod \"dnsmasq-dns-5b75489c6f-b6r4b\" (UID: \"05b1ed8b-8016-4523-afc2-b00ce65b4042\") " pod="openstack/dnsmasq-dns-5b75489c6f-b6r4b" Oct 11 01:16:34 crc kubenswrapper[4743]: I1011 01:16:34.263192 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/05b1ed8b-8016-4523-afc2-b00ce65b4042-dns-svc\") pod \"dnsmasq-dns-5b75489c6f-b6r4b\" (UID: \"05b1ed8b-8016-4523-afc2-b00ce65b4042\") " pod="openstack/dnsmasq-dns-5b75489c6f-b6r4b" Oct 11 01:16:34 crc kubenswrapper[4743]: I1011 01:16:34.294837 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nn6tj\" (UniqueName: \"kubernetes.io/projected/05b1ed8b-8016-4523-afc2-b00ce65b4042-kube-api-access-nn6tj\") pod \"dnsmasq-dns-5b75489c6f-b6r4b\" (UID: \"05b1ed8b-8016-4523-afc2-b00ce65b4042\") " pod="openstack/dnsmasq-dns-5b75489c6f-b6r4b" Oct 11 01:16:34 crc kubenswrapper[4743]: I1011 01:16:34.380336 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b75489c6f-b6r4b" Oct 11 01:16:35 crc kubenswrapper[4743]: I1011 01:16:35.724234 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"38225901-8300-41cc-8e32-748b754660dc","Type":"ContainerStarted","Data":"8a1b6c38209d238277b415eeff99d3ed1a9aba732fcb003e750ae774044b4cf9"} Oct 11 01:16:35 crc kubenswrapper[4743]: I1011 01:16:35.729201 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"de84c29c-4168-4383-aadc-0d5cc0ba56f8","Type":"ContainerStarted","Data":"e2f210d4e9eb1cc201a1f88d953711eae2090cc32d56c1edae66e81284391db4"} Oct 11 01:16:36 crc kubenswrapper[4743]: I1011 01:16:36.898546 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b75489c6f-b6r4b"] Oct 11 01:16:37 crc kubenswrapper[4743]: I1011 01:16:37.758972 4743 generic.go:334] "Generic (PLEG): container finished" podID="05b1ed8b-8016-4523-afc2-b00ce65b4042" containerID="c96b328d1eca40b616c1c88123a83d0fb09e9dab9f4a1f3de6020583e562b969" exitCode=0 Oct 11 01:16:37 crc kubenswrapper[4743]: I1011 01:16:37.759544 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b75489c6f-b6r4b" event={"ID":"05b1ed8b-8016-4523-afc2-b00ce65b4042","Type":"ContainerDied","Data":"c96b328d1eca40b616c1c88123a83d0fb09e9dab9f4a1f3de6020583e562b969"} Oct 11 01:16:37 crc kubenswrapper[4743]: I1011 01:16:37.759915 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b75489c6f-b6r4b" event={"ID":"05b1ed8b-8016-4523-afc2-b00ce65b4042","Type":"ContainerStarted","Data":"a8c7ffd007b5d591e438ca0fdd46c9dd2185a875b3c59f00875e8043dd0ed405"} Oct 11 01:16:37 crc kubenswrapper[4743]: I1011 01:16:37.766313 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"d54c922a-5193-47c9-9148-7fe897065885","Type":"ContainerStarted","Data":"eaa60174b8128222df5518c4c906a810bde6bf618a4dc1f13b7192b8d1bb7381"} Oct 11 01:16:37 crc kubenswrapper[4743]: I1011 01:16:37.766369 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d54c922a-5193-47c9-9148-7fe897065885","Type":"ContainerStarted","Data":"b5df9e25f81c30a9e8d87999631a1284a92d5a9f28aa2a96a6785f0d24640fde"} Oct 11 01:16:38 crc kubenswrapper[4743]: I1011 01:16:38.786966 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b75489c6f-b6r4b" event={"ID":"05b1ed8b-8016-4523-afc2-b00ce65b4042","Type":"ContainerStarted","Data":"30548d23cccafe964a4952eb3531005db4eb2292508d6b72c658aeb0b2e37b26"} Oct 11 01:16:38 crc kubenswrapper[4743]: I1011 01:16:38.787595 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b75489c6f-b6r4b" Oct 11 01:16:38 crc kubenswrapper[4743]: I1011 01:16:38.791241 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d54c922a-5193-47c9-9148-7fe897065885","Type":"ContainerStarted","Data":"611e40249e0b290add3d2eec42c4b75c490cb1a3909644d026855afd4c4cd9a1"} Oct 11 01:16:38 crc kubenswrapper[4743]: I1011 01:16:38.822217 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b75489c6f-b6r4b" podStartSLOduration=4.822188105 podStartE2EDuration="4.822188105s" podCreationTimestamp="2025-10-11 01:16:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 01:16:38.819427788 +0000 UTC m=+1493.472408175" watchObservedRunningTime="2025-10-11 01:16:38.822188105 +0000 UTC m=+1493.475168522" Oct 11 01:16:40 crc kubenswrapper[4743]: I1011 01:16:40.821749 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"d54c922a-5193-47c9-9148-7fe897065885","Type":"ContainerStarted","Data":"974cbe5c248e09ea2b0b16ba354ab112eba57fa4bbda53a9ab34bddc530fc71f"} Oct 11 01:16:40 crc kubenswrapper[4743]: I1011 01:16:40.822512 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 11 01:16:40 crc kubenswrapper[4743]: I1011 01:16:40.862253 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=5.624679681 podStartE2EDuration="13.862234552s" podCreationTimestamp="2025-10-11 01:16:27 +0000 UTC" firstStartedPulling="2025-10-11 01:16:31.871380505 +0000 UTC m=+1486.524360892" lastFinishedPulling="2025-10-11 01:16:40.108935356 +0000 UTC m=+1494.761915763" observedRunningTime="2025-10-11 01:16:40.845984127 +0000 UTC m=+1495.498964524" watchObservedRunningTime="2025-10-11 01:16:40.862234552 +0000 UTC m=+1495.515214949" Oct 11 01:16:44 crc kubenswrapper[4743]: I1011 01:16:44.383810 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5b75489c6f-b6r4b" Oct 11 01:16:44 crc kubenswrapper[4743]: I1011 01:16:44.458216 4743 patch_prober.go:28] interesting pod/machine-config-daemon-cvm72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 01:16:44 crc kubenswrapper[4743]: I1011 01:16:44.458300 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 01:16:44 crc kubenswrapper[4743]: I1011 01:16:44.458353 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-cvm72" Oct 11 01:16:44 crc kubenswrapper[4743]: I1011 01:16:44.459329 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4dd4c92bb1fe51db0d7814650823b2db5bc31ac1ef997677802e7324969fe0d1"} pod="openshift-machine-config-operator/machine-config-daemon-cvm72" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 11 01:16:44 crc kubenswrapper[4743]: I1011 01:16:44.459427 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" containerID="cri-o://4dd4c92bb1fe51db0d7814650823b2db5bc31ac1ef997677802e7324969fe0d1" gracePeriod=600 Oct 11 01:16:44 crc kubenswrapper[4743]: I1011 01:16:44.482457 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f84f9ccf-m4xd7"] Oct 11 01:16:44 crc kubenswrapper[4743]: I1011 01:16:44.487442 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-f84f9ccf-m4xd7" podUID="229ca9c6-760d-4ea3-9599-bb5cfeeea826" containerName="dnsmasq-dns" containerID="cri-o://95ca4b10ceb0b8f7ea64d770770a6ce3deec6c9a69acefa3ac9ed2b0c87e591f" gracePeriod=10 Oct 11 01:16:44 crc kubenswrapper[4743]: E1011 01:16:44.611715 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:16:44 crc kubenswrapper[4743]: I1011 01:16:44.689656 4743 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5cf7b6cbf7-7wrx8"] Oct 11 01:16:44 crc kubenswrapper[4743]: I1011 01:16:44.693675 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cf7b6cbf7-7wrx8" Oct 11 01:16:44 crc kubenswrapper[4743]: I1011 01:16:44.701092 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5cf7b6cbf7-7wrx8"] Oct 11 01:16:44 crc kubenswrapper[4743]: I1011 01:16:44.762653 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69b6d455-26e7-498f-9dd4-4a96b7327f62-config\") pod \"dnsmasq-dns-5cf7b6cbf7-7wrx8\" (UID: \"69b6d455-26e7-498f-9dd4-4a96b7327f62\") " pod="openstack/dnsmasq-dns-5cf7b6cbf7-7wrx8" Oct 11 01:16:44 crc kubenswrapper[4743]: I1011 01:16:44.762704 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69b6d455-26e7-498f-9dd4-4a96b7327f62-dns-svc\") pod \"dnsmasq-dns-5cf7b6cbf7-7wrx8\" (UID: \"69b6d455-26e7-498f-9dd4-4a96b7327f62\") " pod="openstack/dnsmasq-dns-5cf7b6cbf7-7wrx8" Oct 11 01:16:44 crc kubenswrapper[4743]: I1011 01:16:44.762737 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/69b6d455-26e7-498f-9dd4-4a96b7327f62-ovsdbserver-nb\") pod \"dnsmasq-dns-5cf7b6cbf7-7wrx8\" (UID: \"69b6d455-26e7-498f-9dd4-4a96b7327f62\") " pod="openstack/dnsmasq-dns-5cf7b6cbf7-7wrx8" Oct 11 01:16:44 crc kubenswrapper[4743]: I1011 01:16:44.762787 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbjhh\" (UniqueName: \"kubernetes.io/projected/69b6d455-26e7-498f-9dd4-4a96b7327f62-kube-api-access-gbjhh\") pod \"dnsmasq-dns-5cf7b6cbf7-7wrx8\" (UID: \"69b6d455-26e7-498f-9dd4-4a96b7327f62\") " 
pod="openstack/dnsmasq-dns-5cf7b6cbf7-7wrx8" Oct 11 01:16:44 crc kubenswrapper[4743]: I1011 01:16:44.762834 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/69b6d455-26e7-498f-9dd4-4a96b7327f62-openstack-edpm-ipam\") pod \"dnsmasq-dns-5cf7b6cbf7-7wrx8\" (UID: \"69b6d455-26e7-498f-9dd4-4a96b7327f62\") " pod="openstack/dnsmasq-dns-5cf7b6cbf7-7wrx8" Oct 11 01:16:44 crc kubenswrapper[4743]: I1011 01:16:44.762922 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/69b6d455-26e7-498f-9dd4-4a96b7327f62-dns-swift-storage-0\") pod \"dnsmasq-dns-5cf7b6cbf7-7wrx8\" (UID: \"69b6d455-26e7-498f-9dd4-4a96b7327f62\") " pod="openstack/dnsmasq-dns-5cf7b6cbf7-7wrx8" Oct 11 01:16:44 crc kubenswrapper[4743]: I1011 01:16:44.762971 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/69b6d455-26e7-498f-9dd4-4a96b7327f62-ovsdbserver-sb\") pod \"dnsmasq-dns-5cf7b6cbf7-7wrx8\" (UID: \"69b6d455-26e7-498f-9dd4-4a96b7327f62\") " pod="openstack/dnsmasq-dns-5cf7b6cbf7-7wrx8" Oct 11 01:16:44 crc kubenswrapper[4743]: I1011 01:16:44.864479 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbjhh\" (UniqueName: \"kubernetes.io/projected/69b6d455-26e7-498f-9dd4-4a96b7327f62-kube-api-access-gbjhh\") pod \"dnsmasq-dns-5cf7b6cbf7-7wrx8\" (UID: \"69b6d455-26e7-498f-9dd4-4a96b7327f62\") " pod="openstack/dnsmasq-dns-5cf7b6cbf7-7wrx8" Oct 11 01:16:44 crc kubenswrapper[4743]: I1011 01:16:44.864556 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/69b6d455-26e7-498f-9dd4-4a96b7327f62-openstack-edpm-ipam\") pod \"dnsmasq-dns-5cf7b6cbf7-7wrx8\" 
(UID: \"69b6d455-26e7-498f-9dd4-4a96b7327f62\") " pod="openstack/dnsmasq-dns-5cf7b6cbf7-7wrx8" Oct 11 01:16:44 crc kubenswrapper[4743]: I1011 01:16:44.864597 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/69b6d455-26e7-498f-9dd4-4a96b7327f62-dns-swift-storage-0\") pod \"dnsmasq-dns-5cf7b6cbf7-7wrx8\" (UID: \"69b6d455-26e7-498f-9dd4-4a96b7327f62\") " pod="openstack/dnsmasq-dns-5cf7b6cbf7-7wrx8" Oct 11 01:16:44 crc kubenswrapper[4743]: I1011 01:16:44.864651 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/69b6d455-26e7-498f-9dd4-4a96b7327f62-ovsdbserver-sb\") pod \"dnsmasq-dns-5cf7b6cbf7-7wrx8\" (UID: \"69b6d455-26e7-498f-9dd4-4a96b7327f62\") " pod="openstack/dnsmasq-dns-5cf7b6cbf7-7wrx8" Oct 11 01:16:44 crc kubenswrapper[4743]: I1011 01:16:44.864700 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69b6d455-26e7-498f-9dd4-4a96b7327f62-config\") pod \"dnsmasq-dns-5cf7b6cbf7-7wrx8\" (UID: \"69b6d455-26e7-498f-9dd4-4a96b7327f62\") " pod="openstack/dnsmasq-dns-5cf7b6cbf7-7wrx8" Oct 11 01:16:44 crc kubenswrapper[4743]: I1011 01:16:44.864726 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69b6d455-26e7-498f-9dd4-4a96b7327f62-dns-svc\") pod \"dnsmasq-dns-5cf7b6cbf7-7wrx8\" (UID: \"69b6d455-26e7-498f-9dd4-4a96b7327f62\") " pod="openstack/dnsmasq-dns-5cf7b6cbf7-7wrx8" Oct 11 01:16:44 crc kubenswrapper[4743]: I1011 01:16:44.864757 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/69b6d455-26e7-498f-9dd4-4a96b7327f62-ovsdbserver-nb\") pod \"dnsmasq-dns-5cf7b6cbf7-7wrx8\" (UID: \"69b6d455-26e7-498f-9dd4-4a96b7327f62\") " 
pod="openstack/dnsmasq-dns-5cf7b6cbf7-7wrx8" Oct 11 01:16:44 crc kubenswrapper[4743]: I1011 01:16:44.865758 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/69b6d455-26e7-498f-9dd4-4a96b7327f62-ovsdbserver-sb\") pod \"dnsmasq-dns-5cf7b6cbf7-7wrx8\" (UID: \"69b6d455-26e7-498f-9dd4-4a96b7327f62\") " pod="openstack/dnsmasq-dns-5cf7b6cbf7-7wrx8" Oct 11 01:16:44 crc kubenswrapper[4743]: I1011 01:16:44.865961 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69b6d455-26e7-498f-9dd4-4a96b7327f62-config\") pod \"dnsmasq-dns-5cf7b6cbf7-7wrx8\" (UID: \"69b6d455-26e7-498f-9dd4-4a96b7327f62\") " pod="openstack/dnsmasq-dns-5cf7b6cbf7-7wrx8" Oct 11 01:16:44 crc kubenswrapper[4743]: I1011 01:16:44.866647 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/69b6d455-26e7-498f-9dd4-4a96b7327f62-dns-swift-storage-0\") pod \"dnsmasq-dns-5cf7b6cbf7-7wrx8\" (UID: \"69b6d455-26e7-498f-9dd4-4a96b7327f62\") " pod="openstack/dnsmasq-dns-5cf7b6cbf7-7wrx8" Oct 11 01:16:44 crc kubenswrapper[4743]: I1011 01:16:44.866912 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69b6d455-26e7-498f-9dd4-4a96b7327f62-dns-svc\") pod \"dnsmasq-dns-5cf7b6cbf7-7wrx8\" (UID: \"69b6d455-26e7-498f-9dd4-4a96b7327f62\") " pod="openstack/dnsmasq-dns-5cf7b6cbf7-7wrx8" Oct 11 01:16:44 crc kubenswrapper[4743]: I1011 01:16:44.867070 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/69b6d455-26e7-498f-9dd4-4a96b7327f62-openstack-edpm-ipam\") pod \"dnsmasq-dns-5cf7b6cbf7-7wrx8\" (UID: \"69b6d455-26e7-498f-9dd4-4a96b7327f62\") " pod="openstack/dnsmasq-dns-5cf7b6cbf7-7wrx8" Oct 11 01:16:44 crc kubenswrapper[4743]: I1011 
01:16:44.867555 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/69b6d455-26e7-498f-9dd4-4a96b7327f62-ovsdbserver-nb\") pod \"dnsmasq-dns-5cf7b6cbf7-7wrx8\" (UID: \"69b6d455-26e7-498f-9dd4-4a96b7327f62\") " pod="openstack/dnsmasq-dns-5cf7b6cbf7-7wrx8" Oct 11 01:16:44 crc kubenswrapper[4743]: I1011 01:16:44.887203 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbjhh\" (UniqueName: \"kubernetes.io/projected/69b6d455-26e7-498f-9dd4-4a96b7327f62-kube-api-access-gbjhh\") pod \"dnsmasq-dns-5cf7b6cbf7-7wrx8\" (UID: \"69b6d455-26e7-498f-9dd4-4a96b7327f62\") " pod="openstack/dnsmasq-dns-5cf7b6cbf7-7wrx8" Oct 11 01:16:44 crc kubenswrapper[4743]: I1011 01:16:44.908401 4743 generic.go:334] "Generic (PLEG): container finished" podID="add92263-e252-446b-95de-092585b4357f" containerID="4dd4c92bb1fe51db0d7814650823b2db5bc31ac1ef997677802e7324969fe0d1" exitCode=0 Oct 11 01:16:44 crc kubenswrapper[4743]: I1011 01:16:44.908620 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" event={"ID":"add92263-e252-446b-95de-092585b4357f","Type":"ContainerDied","Data":"4dd4c92bb1fe51db0d7814650823b2db5bc31ac1ef997677802e7324969fe0d1"} Oct 11 01:16:44 crc kubenswrapper[4743]: I1011 01:16:44.908829 4743 scope.go:117] "RemoveContainer" containerID="bdc2fd3e645a7f36140a058209779fdbf1154f0849a37453796b08adc03a7cc1" Oct 11 01:16:44 crc kubenswrapper[4743]: I1011 01:16:44.909658 4743 scope.go:117] "RemoveContainer" containerID="4dd4c92bb1fe51db0d7814650823b2db5bc31ac1ef997677802e7324969fe0d1" Oct 11 01:16:44 crc kubenswrapper[4743]: E1011 01:16:44.910309 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:16:44 crc kubenswrapper[4743]: I1011 01:16:44.913487 4743 generic.go:334] "Generic (PLEG): container finished" podID="229ca9c6-760d-4ea3-9599-bb5cfeeea826" containerID="95ca4b10ceb0b8f7ea64d770770a6ce3deec6c9a69acefa3ac9ed2b0c87e591f" exitCode=0 Oct 11 01:16:44 crc kubenswrapper[4743]: I1011 01:16:44.913520 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f84f9ccf-m4xd7" event={"ID":"229ca9c6-760d-4ea3-9599-bb5cfeeea826","Type":"ContainerDied","Data":"95ca4b10ceb0b8f7ea64d770770a6ce3deec6c9a69acefa3ac9ed2b0c87e591f"} Oct 11 01:16:45 crc kubenswrapper[4743]: I1011 01:16:45.018316 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cf7b6cbf7-7wrx8" Oct 11 01:16:45 crc kubenswrapper[4743]: I1011 01:16:45.136934 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f84f9ccf-m4xd7" Oct 11 01:16:45 crc kubenswrapper[4743]: I1011 01:16:45.171466 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/229ca9c6-760d-4ea3-9599-bb5cfeeea826-ovsdbserver-sb\") pod \"229ca9c6-760d-4ea3-9599-bb5cfeeea826\" (UID: \"229ca9c6-760d-4ea3-9599-bb5cfeeea826\") " Oct 11 01:16:45 crc kubenswrapper[4743]: I1011 01:16:45.171528 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/229ca9c6-760d-4ea3-9599-bb5cfeeea826-ovsdbserver-nb\") pod \"229ca9c6-760d-4ea3-9599-bb5cfeeea826\" (UID: \"229ca9c6-760d-4ea3-9599-bb5cfeeea826\") " Oct 11 01:16:45 crc kubenswrapper[4743]: I1011 01:16:45.171651 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/229ca9c6-760d-4ea3-9599-bb5cfeeea826-dns-swift-storage-0\") pod \"229ca9c6-760d-4ea3-9599-bb5cfeeea826\" (UID: \"229ca9c6-760d-4ea3-9599-bb5cfeeea826\") " Oct 11 01:16:45 crc kubenswrapper[4743]: I1011 01:16:45.171689 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/229ca9c6-760d-4ea3-9599-bb5cfeeea826-dns-svc\") pod \"229ca9c6-760d-4ea3-9599-bb5cfeeea826\" (UID: \"229ca9c6-760d-4ea3-9599-bb5cfeeea826\") " Oct 11 01:16:45 crc kubenswrapper[4743]: I1011 01:16:45.171720 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/229ca9c6-760d-4ea3-9599-bb5cfeeea826-config\") pod \"229ca9c6-760d-4ea3-9599-bb5cfeeea826\" (UID: \"229ca9c6-760d-4ea3-9599-bb5cfeeea826\") " Oct 11 01:16:45 crc kubenswrapper[4743]: I1011 01:16:45.171746 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5vxg\" 
(UniqueName: \"kubernetes.io/projected/229ca9c6-760d-4ea3-9599-bb5cfeeea826-kube-api-access-f5vxg\") pod \"229ca9c6-760d-4ea3-9599-bb5cfeeea826\" (UID: \"229ca9c6-760d-4ea3-9599-bb5cfeeea826\") " Oct 11 01:16:45 crc kubenswrapper[4743]: I1011 01:16:45.182002 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/229ca9c6-760d-4ea3-9599-bb5cfeeea826-kube-api-access-f5vxg" (OuterVolumeSpecName: "kube-api-access-f5vxg") pod "229ca9c6-760d-4ea3-9599-bb5cfeeea826" (UID: "229ca9c6-760d-4ea3-9599-bb5cfeeea826"). InnerVolumeSpecName "kube-api-access-f5vxg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:16:45 crc kubenswrapper[4743]: I1011 01:16:45.226064 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/229ca9c6-760d-4ea3-9599-bb5cfeeea826-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "229ca9c6-760d-4ea3-9599-bb5cfeeea826" (UID: "229ca9c6-760d-4ea3-9599-bb5cfeeea826"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:16:45 crc kubenswrapper[4743]: I1011 01:16:45.227374 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/229ca9c6-760d-4ea3-9599-bb5cfeeea826-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "229ca9c6-760d-4ea3-9599-bb5cfeeea826" (UID: "229ca9c6-760d-4ea3-9599-bb5cfeeea826"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:16:45 crc kubenswrapper[4743]: I1011 01:16:45.232261 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/229ca9c6-760d-4ea3-9599-bb5cfeeea826-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "229ca9c6-760d-4ea3-9599-bb5cfeeea826" (UID: "229ca9c6-760d-4ea3-9599-bb5cfeeea826"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:16:45 crc kubenswrapper[4743]: I1011 01:16:45.239800 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/229ca9c6-760d-4ea3-9599-bb5cfeeea826-config" (OuterVolumeSpecName: "config") pod "229ca9c6-760d-4ea3-9599-bb5cfeeea826" (UID: "229ca9c6-760d-4ea3-9599-bb5cfeeea826"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:16:45 crc kubenswrapper[4743]: I1011 01:16:45.249134 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/229ca9c6-760d-4ea3-9599-bb5cfeeea826-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "229ca9c6-760d-4ea3-9599-bb5cfeeea826" (UID: "229ca9c6-760d-4ea3-9599-bb5cfeeea826"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:16:45 crc kubenswrapper[4743]: I1011 01:16:45.274377 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/229ca9c6-760d-4ea3-9599-bb5cfeeea826-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 11 01:16:45 crc kubenswrapper[4743]: I1011 01:16:45.274416 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/229ca9c6-760d-4ea3-9599-bb5cfeeea826-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 11 01:16:45 crc kubenswrapper[4743]: I1011 01:16:45.274426 4743 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/229ca9c6-760d-4ea3-9599-bb5cfeeea826-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 11 01:16:45 crc kubenswrapper[4743]: I1011 01:16:45.274439 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/229ca9c6-760d-4ea3-9599-bb5cfeeea826-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 11 01:16:45 crc 
kubenswrapper[4743]: I1011 01:16:45.274453 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/229ca9c6-760d-4ea3-9599-bb5cfeeea826-config\") on node \"crc\" DevicePath \"\"" Oct 11 01:16:45 crc kubenswrapper[4743]: I1011 01:16:45.274464 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f5vxg\" (UniqueName: \"kubernetes.io/projected/229ca9c6-760d-4ea3-9599-bb5cfeeea826-kube-api-access-f5vxg\") on node \"crc\" DevicePath \"\"" Oct 11 01:16:45 crc kubenswrapper[4743]: I1011 01:16:45.554830 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5cf7b6cbf7-7wrx8"] Oct 11 01:16:45 crc kubenswrapper[4743]: W1011 01:16:45.554890 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69b6d455_26e7_498f_9dd4_4a96b7327f62.slice/crio-359512a022bcb9e36283cdc14cfa3263bfa9fe3d8d214f86241513a54d3f81b3 WatchSource:0}: Error finding container 359512a022bcb9e36283cdc14cfa3263bfa9fe3d8d214f86241513a54d3f81b3: Status 404 returned error can't find the container with id 359512a022bcb9e36283cdc14cfa3263bfa9fe3d8d214f86241513a54d3f81b3 Oct 11 01:16:45 crc kubenswrapper[4743]: I1011 01:16:45.931663 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f84f9ccf-m4xd7" Oct 11 01:16:45 crc kubenswrapper[4743]: I1011 01:16:45.931657 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f84f9ccf-m4xd7" event={"ID":"229ca9c6-760d-4ea3-9599-bb5cfeeea826","Type":"ContainerDied","Data":"73a74bb2f1eca33e04276811345bed0cc4307816323e299082df2e988efc3bc9"} Oct 11 01:16:45 crc kubenswrapper[4743]: I1011 01:16:45.932085 4743 scope.go:117] "RemoveContainer" containerID="95ca4b10ceb0b8f7ea64d770770a6ce3deec6c9a69acefa3ac9ed2b0c87e591f" Oct 11 01:16:45 crc kubenswrapper[4743]: I1011 01:16:45.935263 4743 generic.go:334] "Generic (PLEG): container finished" podID="69b6d455-26e7-498f-9dd4-4a96b7327f62" containerID="49fabfdcfab5ad7e3997023f0c6601548d101517477040c8268fbaf7110775a4" exitCode=0 Oct 11 01:16:45 crc kubenswrapper[4743]: I1011 01:16:45.935294 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cf7b6cbf7-7wrx8" event={"ID":"69b6d455-26e7-498f-9dd4-4a96b7327f62","Type":"ContainerDied","Data":"49fabfdcfab5ad7e3997023f0c6601548d101517477040c8268fbaf7110775a4"} Oct 11 01:16:45 crc kubenswrapper[4743]: I1011 01:16:45.935316 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cf7b6cbf7-7wrx8" event={"ID":"69b6d455-26e7-498f-9dd4-4a96b7327f62","Type":"ContainerStarted","Data":"359512a022bcb9e36283cdc14cfa3263bfa9fe3d8d214f86241513a54d3f81b3"} Oct 11 01:16:45 crc kubenswrapper[4743]: I1011 01:16:45.983341 4743 scope.go:117] "RemoveContainer" containerID="3712c50e49f77421ca79623016f56cb42e253a0bfdaa9c338c16f927ee50382a" Oct 11 01:16:46 crc kubenswrapper[4743]: I1011 01:16:46.002461 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f84f9ccf-m4xd7"] Oct 11 01:16:46 crc kubenswrapper[4743]: I1011 01:16:46.016414 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-f84f9ccf-m4xd7"] Oct 11 01:16:46 crc kubenswrapper[4743]: I1011 
01:16:46.103466 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="229ca9c6-760d-4ea3-9599-bb5cfeeea826" path="/var/lib/kubelet/pods/229ca9c6-760d-4ea3-9599-bb5cfeeea826/volumes" Oct 11 01:16:46 crc kubenswrapper[4743]: I1011 01:16:46.950557 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cf7b6cbf7-7wrx8" event={"ID":"69b6d455-26e7-498f-9dd4-4a96b7327f62","Type":"ContainerStarted","Data":"d00168936a54cc20e1f81b72604436656d7eb0bcfadcddde09b963478e9239e9"} Oct 11 01:16:46 crc kubenswrapper[4743]: I1011 01:16:46.950939 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5cf7b6cbf7-7wrx8" Oct 11 01:16:48 crc kubenswrapper[4743]: I1011 01:16:48.702002 4743 scope.go:117] "RemoveContainer" containerID="481a1258a6422d0adb2d8063cebd0064a3027d185956e75d1ac262e1e7ae0dbb" Oct 11 01:16:48 crc kubenswrapper[4743]: I1011 01:16:48.743058 4743 scope.go:117] "RemoveContainer" containerID="71bd50ca24278810cfdfc1d2d9fe23b63e4ca59dbc1c1013a7228649e54927a6" Oct 11 01:16:48 crc kubenswrapper[4743]: I1011 01:16:48.976780 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-lc4jr" event={"ID":"b5a1515f-1a64-44da-b8e1-b41ff6aa1c8d","Type":"ContainerStarted","Data":"636e1e422bcdceb11e9f438441f678aff8edfed5bd734d0cc4bdc66334a276fe"} Oct 11 01:16:48 crc kubenswrapper[4743]: I1011 01:16:48.999696 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-lc4jr" podStartSLOduration=2.169100801 podStartE2EDuration="35.999679599s" podCreationTimestamp="2025-10-11 01:16:13 +0000 UTC" firstStartedPulling="2025-10-11 01:16:14.434174164 +0000 UTC m=+1469.087154551" lastFinishedPulling="2025-10-11 01:16:48.264752932 +0000 UTC m=+1502.917733349" observedRunningTime="2025-10-11 01:16:48.995793849 +0000 UTC m=+1503.648774246" watchObservedRunningTime="2025-10-11 01:16:48.999679599 +0000 UTC m=+1503.652659996" Oct 11 01:16:49 crc 
kubenswrapper[4743]: I1011 01:16:49.000409 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5cf7b6cbf7-7wrx8" podStartSLOduration=5.000403844 podStartE2EDuration="5.000403844s" podCreationTimestamp="2025-10-11 01:16:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 01:16:47.016390062 +0000 UTC m=+1501.669370469" watchObservedRunningTime="2025-10-11 01:16:49.000403844 +0000 UTC m=+1503.653384231" Oct 11 01:16:50 crc kubenswrapper[4743]: I1011 01:16:50.087748 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-f84f9ccf-m4xd7" podUID="229ca9c6-760d-4ea3-9599-bb5cfeeea826" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.230:5353: i/o timeout" Oct 11 01:16:51 crc kubenswrapper[4743]: I1011 01:16:51.019942 4743 generic.go:334] "Generic (PLEG): container finished" podID="b5a1515f-1a64-44da-b8e1-b41ff6aa1c8d" containerID="636e1e422bcdceb11e9f438441f678aff8edfed5bd734d0cc4bdc66334a276fe" exitCode=0 Oct 11 01:16:51 crc kubenswrapper[4743]: I1011 01:16:51.020075 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-lc4jr" event={"ID":"b5a1515f-1a64-44da-b8e1-b41ff6aa1c8d","Type":"ContainerDied","Data":"636e1e422bcdceb11e9f438441f678aff8edfed5bd734d0cc4bdc66334a276fe"} Oct 11 01:16:52 crc kubenswrapper[4743]: I1011 01:16:52.592019 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-lc4jr" Oct 11 01:16:52 crc kubenswrapper[4743]: I1011 01:16:52.753660 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thcvg\" (UniqueName: \"kubernetes.io/projected/b5a1515f-1a64-44da-b8e1-b41ff6aa1c8d-kube-api-access-thcvg\") pod \"b5a1515f-1a64-44da-b8e1-b41ff6aa1c8d\" (UID: \"b5a1515f-1a64-44da-b8e1-b41ff6aa1c8d\") " Oct 11 01:16:52 crc kubenswrapper[4743]: I1011 01:16:52.753939 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5a1515f-1a64-44da-b8e1-b41ff6aa1c8d-combined-ca-bundle\") pod \"b5a1515f-1a64-44da-b8e1-b41ff6aa1c8d\" (UID: \"b5a1515f-1a64-44da-b8e1-b41ff6aa1c8d\") " Oct 11 01:16:52 crc kubenswrapper[4743]: I1011 01:16:52.754016 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5a1515f-1a64-44da-b8e1-b41ff6aa1c8d-config-data\") pod \"b5a1515f-1a64-44da-b8e1-b41ff6aa1c8d\" (UID: \"b5a1515f-1a64-44da-b8e1-b41ff6aa1c8d\") " Oct 11 01:16:52 crc kubenswrapper[4743]: I1011 01:16:52.785143 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5a1515f-1a64-44da-b8e1-b41ff6aa1c8d-kube-api-access-thcvg" (OuterVolumeSpecName: "kube-api-access-thcvg") pod "b5a1515f-1a64-44da-b8e1-b41ff6aa1c8d" (UID: "b5a1515f-1a64-44da-b8e1-b41ff6aa1c8d"). InnerVolumeSpecName "kube-api-access-thcvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:16:52 crc kubenswrapper[4743]: I1011 01:16:52.863838 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thcvg\" (UniqueName: \"kubernetes.io/projected/b5a1515f-1a64-44da-b8e1-b41ff6aa1c8d-kube-api-access-thcvg\") on node \"crc\" DevicePath \"\"" Oct 11 01:16:52 crc kubenswrapper[4743]: I1011 01:16:52.909037 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5a1515f-1a64-44da-b8e1-b41ff6aa1c8d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b5a1515f-1a64-44da-b8e1-b41ff6aa1c8d" (UID: "b5a1515f-1a64-44da-b8e1-b41ff6aa1c8d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:16:52 crc kubenswrapper[4743]: I1011 01:16:52.979008 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5a1515f-1a64-44da-b8e1-b41ff6aa1c8d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 01:16:52 crc kubenswrapper[4743]: I1011 01:16:52.984969 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5a1515f-1a64-44da-b8e1-b41ff6aa1c8d-config-data" (OuterVolumeSpecName: "config-data") pod "b5a1515f-1a64-44da-b8e1-b41ff6aa1c8d" (UID: "b5a1515f-1a64-44da-b8e1-b41ff6aa1c8d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:16:53 crc kubenswrapper[4743]: I1011 01:16:53.045171 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-lc4jr" event={"ID":"b5a1515f-1a64-44da-b8e1-b41ff6aa1c8d","Type":"ContainerDied","Data":"2817ba897455312bafa7b3314c446473f99b59005703c0ed14fabf4509c93d7a"} Oct 11 01:16:53 crc kubenswrapper[4743]: I1011 01:16:53.045210 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2817ba897455312bafa7b3314c446473f99b59005703c0ed14fabf4509c93d7a" Oct 11 01:16:53 crc kubenswrapper[4743]: I1011 01:16:53.045266 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-lc4jr" Oct 11 01:16:53 crc kubenswrapper[4743]: I1011 01:16:53.081387 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5a1515f-1a64-44da-b8e1-b41ff6aa1c8d-config-data\") on node \"crc\" DevicePath \"\"" Oct 11 01:16:53 crc kubenswrapper[4743]: I1011 01:16:53.963196 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-57fbf8bd-mp4f8"] Oct 11 01:16:53 crc kubenswrapper[4743]: E1011 01:16:53.963697 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="229ca9c6-760d-4ea3-9599-bb5cfeeea826" containerName="dnsmasq-dns" Oct 11 01:16:53 crc kubenswrapper[4743]: I1011 01:16:53.963709 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="229ca9c6-760d-4ea3-9599-bb5cfeeea826" containerName="dnsmasq-dns" Oct 11 01:16:53 crc kubenswrapper[4743]: E1011 01:16:53.963725 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5a1515f-1a64-44da-b8e1-b41ff6aa1c8d" containerName="heat-db-sync" Oct 11 01:16:53 crc kubenswrapper[4743]: I1011 01:16:53.963731 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5a1515f-1a64-44da-b8e1-b41ff6aa1c8d" containerName="heat-db-sync" Oct 11 01:16:53 crc kubenswrapper[4743]: E1011 
01:16:53.963742 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="229ca9c6-760d-4ea3-9599-bb5cfeeea826" containerName="init" Oct 11 01:16:53 crc kubenswrapper[4743]: I1011 01:16:53.963748 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="229ca9c6-760d-4ea3-9599-bb5cfeeea826" containerName="init" Oct 11 01:16:53 crc kubenswrapper[4743]: I1011 01:16:53.963961 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5a1515f-1a64-44da-b8e1-b41ff6aa1c8d" containerName="heat-db-sync" Oct 11 01:16:53 crc kubenswrapper[4743]: I1011 01:16:53.963982 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="229ca9c6-760d-4ea3-9599-bb5cfeeea826" containerName="dnsmasq-dns" Oct 11 01:16:53 crc kubenswrapper[4743]: I1011 01:16:53.964750 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-57fbf8bd-mp4f8" Oct 11 01:16:53 crc kubenswrapper[4743]: I1011 01:16:53.983760 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-57fbf8bd-mp4f8"] Oct 11 01:16:54 crc kubenswrapper[4743]: I1011 01:16:54.001661 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ae21bf76-1584-4681-b679-29abbf1ef22a-config-data-custom\") pod \"heat-engine-57fbf8bd-mp4f8\" (UID: \"ae21bf76-1584-4681-b679-29abbf1ef22a\") " pod="openstack/heat-engine-57fbf8bd-mp4f8" Oct 11 01:16:54 crc kubenswrapper[4743]: I1011 01:16:54.001770 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4z77n\" (UniqueName: \"kubernetes.io/projected/ae21bf76-1584-4681-b679-29abbf1ef22a-kube-api-access-4z77n\") pod \"heat-engine-57fbf8bd-mp4f8\" (UID: \"ae21bf76-1584-4681-b679-29abbf1ef22a\") " pod="openstack/heat-engine-57fbf8bd-mp4f8" Oct 11 01:16:54 crc kubenswrapper[4743]: I1011 01:16:54.001840 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae21bf76-1584-4681-b679-29abbf1ef22a-combined-ca-bundle\") pod \"heat-engine-57fbf8bd-mp4f8\" (UID: \"ae21bf76-1584-4681-b679-29abbf1ef22a\") " pod="openstack/heat-engine-57fbf8bd-mp4f8" Oct 11 01:16:54 crc kubenswrapper[4743]: I1011 01:16:54.002114 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae21bf76-1584-4681-b679-29abbf1ef22a-config-data\") pod \"heat-engine-57fbf8bd-mp4f8\" (UID: \"ae21bf76-1584-4681-b679-29abbf1ef22a\") " pod="openstack/heat-engine-57fbf8bd-mp4f8" Oct 11 01:16:54 crc kubenswrapper[4743]: I1011 01:16:54.044972 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-657c9dff6b-rgphg"] Oct 11 01:16:54 crc kubenswrapper[4743]: I1011 01:16:54.047533 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-657c9dff6b-rgphg" Oct 11 01:16:54 crc kubenswrapper[4743]: I1011 01:16:54.058263 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-657c9dff6b-rgphg"] Oct 11 01:16:54 crc kubenswrapper[4743]: I1011 01:16:54.086811 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-7b7546fb69-4lvd7"] Oct 11 01:16:54 crc kubenswrapper[4743]: I1011 01:16:54.091183 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-7b7546fb69-4lvd7" Oct 11 01:16:54 crc kubenswrapper[4743]: I1011 01:16:54.104180 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wr4vh\" (UniqueName: \"kubernetes.io/projected/de562f4f-80d8-407b-bf5a-9b584e013294-kube-api-access-wr4vh\") pod \"heat-api-657c9dff6b-rgphg\" (UID: \"de562f4f-80d8-407b-bf5a-9b584e013294\") " pod="openstack/heat-api-657c9dff6b-rgphg" Oct 11 01:16:54 crc kubenswrapper[4743]: I1011 01:16:54.104239 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de562f4f-80d8-407b-bf5a-9b584e013294-combined-ca-bundle\") pod \"heat-api-657c9dff6b-rgphg\" (UID: \"de562f4f-80d8-407b-bf5a-9b584e013294\") " pod="openstack/heat-api-657c9dff6b-rgphg" Oct 11 01:16:54 crc kubenswrapper[4743]: I1011 01:16:54.104309 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de562f4f-80d8-407b-bf5a-9b584e013294-config-data\") pod \"heat-api-657c9dff6b-rgphg\" (UID: \"de562f4f-80d8-407b-bf5a-9b584e013294\") " pod="openstack/heat-api-657c9dff6b-rgphg" Oct 11 01:16:54 crc kubenswrapper[4743]: I1011 01:16:54.104339 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae21bf76-1584-4681-b679-29abbf1ef22a-config-data\") pod \"heat-engine-57fbf8bd-mp4f8\" (UID: \"ae21bf76-1584-4681-b679-29abbf1ef22a\") " pod="openstack/heat-engine-57fbf8bd-mp4f8" Oct 11 01:16:54 crc kubenswrapper[4743]: I1011 01:16:54.104388 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/db0cfd47-4287-4526-8e6b-0fd5bd770a1c-internal-tls-certs\") pod \"heat-cfnapi-7b7546fb69-4lvd7\" (UID: 
\"db0cfd47-4287-4526-8e6b-0fd5bd770a1c\") " pod="openstack/heat-cfnapi-7b7546fb69-4lvd7" Oct 11 01:16:54 crc kubenswrapper[4743]: I1011 01:16:54.104442 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/de562f4f-80d8-407b-bf5a-9b584e013294-internal-tls-certs\") pod \"heat-api-657c9dff6b-rgphg\" (UID: \"de562f4f-80d8-407b-bf5a-9b584e013294\") " pod="openstack/heat-api-657c9dff6b-rgphg" Oct 11 01:16:54 crc kubenswrapper[4743]: I1011 01:16:54.104472 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ae21bf76-1584-4681-b679-29abbf1ef22a-config-data-custom\") pod \"heat-engine-57fbf8bd-mp4f8\" (UID: \"ae21bf76-1584-4681-b679-29abbf1ef22a\") " pod="openstack/heat-engine-57fbf8bd-mp4f8" Oct 11 01:16:54 crc kubenswrapper[4743]: I1011 01:16:54.104496 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/db0cfd47-4287-4526-8e6b-0fd5bd770a1c-config-data-custom\") pod \"heat-cfnapi-7b7546fb69-4lvd7\" (UID: \"db0cfd47-4287-4526-8e6b-0fd5bd770a1c\") " pod="openstack/heat-cfnapi-7b7546fb69-4lvd7" Oct 11 01:16:54 crc kubenswrapper[4743]: I1011 01:16:54.104517 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/db0cfd47-4287-4526-8e6b-0fd5bd770a1c-public-tls-certs\") pod \"heat-cfnapi-7b7546fb69-4lvd7\" (UID: \"db0cfd47-4287-4526-8e6b-0fd5bd770a1c\") " pod="openstack/heat-cfnapi-7b7546fb69-4lvd7" Oct 11 01:16:54 crc kubenswrapper[4743]: I1011 01:16:54.104538 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4z77n\" (UniqueName: \"kubernetes.io/projected/ae21bf76-1584-4681-b679-29abbf1ef22a-kube-api-access-4z77n\") pod 
\"heat-engine-57fbf8bd-mp4f8\" (UID: \"ae21bf76-1584-4681-b679-29abbf1ef22a\") " pod="openstack/heat-engine-57fbf8bd-mp4f8" Oct 11 01:16:54 crc kubenswrapper[4743]: I1011 01:16:54.104564 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae21bf76-1584-4681-b679-29abbf1ef22a-combined-ca-bundle\") pod \"heat-engine-57fbf8bd-mp4f8\" (UID: \"ae21bf76-1584-4681-b679-29abbf1ef22a\") " pod="openstack/heat-engine-57fbf8bd-mp4f8" Oct 11 01:16:54 crc kubenswrapper[4743]: I1011 01:16:54.104589 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/de562f4f-80d8-407b-bf5a-9b584e013294-config-data-custom\") pod \"heat-api-657c9dff6b-rgphg\" (UID: \"de562f4f-80d8-407b-bf5a-9b584e013294\") " pod="openstack/heat-api-657c9dff6b-rgphg" Oct 11 01:16:54 crc kubenswrapper[4743]: I1011 01:16:54.104613 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db0cfd47-4287-4526-8e6b-0fd5bd770a1c-config-data\") pod \"heat-cfnapi-7b7546fb69-4lvd7\" (UID: \"db0cfd47-4287-4526-8e6b-0fd5bd770a1c\") " pod="openstack/heat-cfnapi-7b7546fb69-4lvd7" Oct 11 01:16:54 crc kubenswrapper[4743]: I1011 01:16:54.104664 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db0cfd47-4287-4526-8e6b-0fd5bd770a1c-combined-ca-bundle\") pod \"heat-cfnapi-7b7546fb69-4lvd7\" (UID: \"db0cfd47-4287-4526-8e6b-0fd5bd770a1c\") " pod="openstack/heat-cfnapi-7b7546fb69-4lvd7" Oct 11 01:16:54 crc kubenswrapper[4743]: I1011 01:16:54.104701 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/de562f4f-80d8-407b-bf5a-9b584e013294-public-tls-certs\") pod \"heat-api-657c9dff6b-rgphg\" (UID: \"de562f4f-80d8-407b-bf5a-9b584e013294\") " pod="openstack/heat-api-657c9dff6b-rgphg" Oct 11 01:16:54 crc kubenswrapper[4743]: I1011 01:16:54.104721 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btq4w\" (UniqueName: \"kubernetes.io/projected/db0cfd47-4287-4526-8e6b-0fd5bd770a1c-kube-api-access-btq4w\") pod \"heat-cfnapi-7b7546fb69-4lvd7\" (UID: \"db0cfd47-4287-4526-8e6b-0fd5bd770a1c\") " pod="openstack/heat-cfnapi-7b7546fb69-4lvd7" Oct 11 01:16:54 crc kubenswrapper[4743]: I1011 01:16:54.112713 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ae21bf76-1584-4681-b679-29abbf1ef22a-config-data-custom\") pod \"heat-engine-57fbf8bd-mp4f8\" (UID: \"ae21bf76-1584-4681-b679-29abbf1ef22a\") " pod="openstack/heat-engine-57fbf8bd-mp4f8" Oct 11 01:16:54 crc kubenswrapper[4743]: I1011 01:16:54.122894 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae21bf76-1584-4681-b679-29abbf1ef22a-combined-ca-bundle\") pod \"heat-engine-57fbf8bd-mp4f8\" (UID: \"ae21bf76-1584-4681-b679-29abbf1ef22a\") " pod="openstack/heat-engine-57fbf8bd-mp4f8" Oct 11 01:16:54 crc kubenswrapper[4743]: I1011 01:16:54.124314 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae21bf76-1584-4681-b679-29abbf1ef22a-config-data\") pod \"heat-engine-57fbf8bd-mp4f8\" (UID: \"ae21bf76-1584-4681-b679-29abbf1ef22a\") " pod="openstack/heat-engine-57fbf8bd-mp4f8" Oct 11 01:16:54 crc kubenswrapper[4743]: I1011 01:16:54.134905 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4z77n\" (UniqueName: 
\"kubernetes.io/projected/ae21bf76-1584-4681-b679-29abbf1ef22a-kube-api-access-4z77n\") pod \"heat-engine-57fbf8bd-mp4f8\" (UID: \"ae21bf76-1584-4681-b679-29abbf1ef22a\") " pod="openstack/heat-engine-57fbf8bd-mp4f8" Oct 11 01:16:54 crc kubenswrapper[4743]: I1011 01:16:54.144425 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-7b7546fb69-4lvd7"] Oct 11 01:16:54 crc kubenswrapper[4743]: I1011 01:16:54.211701 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/de562f4f-80d8-407b-bf5a-9b584e013294-public-tls-certs\") pod \"heat-api-657c9dff6b-rgphg\" (UID: \"de562f4f-80d8-407b-bf5a-9b584e013294\") " pod="openstack/heat-api-657c9dff6b-rgphg" Oct 11 01:16:54 crc kubenswrapper[4743]: I1011 01:16:54.212005 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btq4w\" (UniqueName: \"kubernetes.io/projected/db0cfd47-4287-4526-8e6b-0fd5bd770a1c-kube-api-access-btq4w\") pod \"heat-cfnapi-7b7546fb69-4lvd7\" (UID: \"db0cfd47-4287-4526-8e6b-0fd5bd770a1c\") " pod="openstack/heat-cfnapi-7b7546fb69-4lvd7" Oct 11 01:16:54 crc kubenswrapper[4743]: I1011 01:16:54.212046 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wr4vh\" (UniqueName: \"kubernetes.io/projected/de562f4f-80d8-407b-bf5a-9b584e013294-kube-api-access-wr4vh\") pod \"heat-api-657c9dff6b-rgphg\" (UID: \"de562f4f-80d8-407b-bf5a-9b584e013294\") " pod="openstack/heat-api-657c9dff6b-rgphg" Oct 11 01:16:54 crc kubenswrapper[4743]: I1011 01:16:54.212066 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de562f4f-80d8-407b-bf5a-9b584e013294-combined-ca-bundle\") pod \"heat-api-657c9dff6b-rgphg\" (UID: \"de562f4f-80d8-407b-bf5a-9b584e013294\") " pod="openstack/heat-api-657c9dff6b-rgphg" Oct 11 01:16:54 crc kubenswrapper[4743]: I1011 
01:16:54.212115 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de562f4f-80d8-407b-bf5a-9b584e013294-config-data\") pod \"heat-api-657c9dff6b-rgphg\" (UID: \"de562f4f-80d8-407b-bf5a-9b584e013294\") " pod="openstack/heat-api-657c9dff6b-rgphg" Oct 11 01:16:54 crc kubenswrapper[4743]: I1011 01:16:54.212146 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/db0cfd47-4287-4526-8e6b-0fd5bd770a1c-internal-tls-certs\") pod \"heat-cfnapi-7b7546fb69-4lvd7\" (UID: \"db0cfd47-4287-4526-8e6b-0fd5bd770a1c\") " pod="openstack/heat-cfnapi-7b7546fb69-4lvd7" Oct 11 01:16:54 crc kubenswrapper[4743]: I1011 01:16:54.212185 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/de562f4f-80d8-407b-bf5a-9b584e013294-internal-tls-certs\") pod \"heat-api-657c9dff6b-rgphg\" (UID: \"de562f4f-80d8-407b-bf5a-9b584e013294\") " pod="openstack/heat-api-657c9dff6b-rgphg" Oct 11 01:16:54 crc kubenswrapper[4743]: I1011 01:16:54.212211 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/db0cfd47-4287-4526-8e6b-0fd5bd770a1c-config-data-custom\") pod \"heat-cfnapi-7b7546fb69-4lvd7\" (UID: \"db0cfd47-4287-4526-8e6b-0fd5bd770a1c\") " pod="openstack/heat-cfnapi-7b7546fb69-4lvd7" Oct 11 01:16:54 crc kubenswrapper[4743]: I1011 01:16:54.212231 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/db0cfd47-4287-4526-8e6b-0fd5bd770a1c-public-tls-certs\") pod \"heat-cfnapi-7b7546fb69-4lvd7\" (UID: \"db0cfd47-4287-4526-8e6b-0fd5bd770a1c\") " pod="openstack/heat-cfnapi-7b7546fb69-4lvd7" Oct 11 01:16:54 crc kubenswrapper[4743]: I1011 01:16:54.212285 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/de562f4f-80d8-407b-bf5a-9b584e013294-config-data-custom\") pod \"heat-api-657c9dff6b-rgphg\" (UID: \"de562f4f-80d8-407b-bf5a-9b584e013294\") " pod="openstack/heat-api-657c9dff6b-rgphg" Oct 11 01:16:54 crc kubenswrapper[4743]: I1011 01:16:54.212305 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db0cfd47-4287-4526-8e6b-0fd5bd770a1c-config-data\") pod \"heat-cfnapi-7b7546fb69-4lvd7\" (UID: \"db0cfd47-4287-4526-8e6b-0fd5bd770a1c\") " pod="openstack/heat-cfnapi-7b7546fb69-4lvd7" Oct 11 01:16:54 crc kubenswrapper[4743]: I1011 01:16:54.212376 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db0cfd47-4287-4526-8e6b-0fd5bd770a1c-combined-ca-bundle\") pod \"heat-cfnapi-7b7546fb69-4lvd7\" (UID: \"db0cfd47-4287-4526-8e6b-0fd5bd770a1c\") " pod="openstack/heat-cfnapi-7b7546fb69-4lvd7" Oct 11 01:16:54 crc kubenswrapper[4743]: I1011 01:16:54.215895 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/de562f4f-80d8-407b-bf5a-9b584e013294-public-tls-certs\") pod \"heat-api-657c9dff6b-rgphg\" (UID: \"de562f4f-80d8-407b-bf5a-9b584e013294\") " pod="openstack/heat-api-657c9dff6b-rgphg" Oct 11 01:16:54 crc kubenswrapper[4743]: I1011 01:16:54.218023 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de562f4f-80d8-407b-bf5a-9b584e013294-combined-ca-bundle\") pod \"heat-api-657c9dff6b-rgphg\" (UID: \"de562f4f-80d8-407b-bf5a-9b584e013294\") " pod="openstack/heat-api-657c9dff6b-rgphg" Oct 11 01:16:54 crc kubenswrapper[4743]: I1011 01:16:54.219053 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/db0cfd47-4287-4526-8e6b-0fd5bd770a1c-internal-tls-certs\") pod \"heat-cfnapi-7b7546fb69-4lvd7\" (UID: \"db0cfd47-4287-4526-8e6b-0fd5bd770a1c\") " pod="openstack/heat-cfnapi-7b7546fb69-4lvd7" Oct 11 01:16:54 crc kubenswrapper[4743]: I1011 01:16:54.219087 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db0cfd47-4287-4526-8e6b-0fd5bd770a1c-combined-ca-bundle\") pod \"heat-cfnapi-7b7546fb69-4lvd7\" (UID: \"db0cfd47-4287-4526-8e6b-0fd5bd770a1c\") " pod="openstack/heat-cfnapi-7b7546fb69-4lvd7" Oct 11 01:16:54 crc kubenswrapper[4743]: I1011 01:16:54.219279 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/db0cfd47-4287-4526-8e6b-0fd5bd770a1c-public-tls-certs\") pod \"heat-cfnapi-7b7546fb69-4lvd7\" (UID: \"db0cfd47-4287-4526-8e6b-0fd5bd770a1c\") " pod="openstack/heat-cfnapi-7b7546fb69-4lvd7" Oct 11 01:16:54 crc kubenswrapper[4743]: I1011 01:16:54.219611 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/de562f4f-80d8-407b-bf5a-9b584e013294-config-data-custom\") pod \"heat-api-657c9dff6b-rgphg\" (UID: \"de562f4f-80d8-407b-bf5a-9b584e013294\") " pod="openstack/heat-api-657c9dff6b-rgphg" Oct 11 01:16:54 crc kubenswrapper[4743]: I1011 01:16:54.219759 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de562f4f-80d8-407b-bf5a-9b584e013294-config-data\") pod \"heat-api-657c9dff6b-rgphg\" (UID: \"de562f4f-80d8-407b-bf5a-9b584e013294\") " pod="openstack/heat-api-657c9dff6b-rgphg" Oct 11 01:16:54 crc kubenswrapper[4743]: I1011 01:16:54.228708 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/de562f4f-80d8-407b-bf5a-9b584e013294-internal-tls-certs\") pod 
\"heat-api-657c9dff6b-rgphg\" (UID: \"de562f4f-80d8-407b-bf5a-9b584e013294\") " pod="openstack/heat-api-657c9dff6b-rgphg" Oct 11 01:16:54 crc kubenswrapper[4743]: I1011 01:16:54.228999 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db0cfd47-4287-4526-8e6b-0fd5bd770a1c-config-data\") pod \"heat-cfnapi-7b7546fb69-4lvd7\" (UID: \"db0cfd47-4287-4526-8e6b-0fd5bd770a1c\") " pod="openstack/heat-cfnapi-7b7546fb69-4lvd7" Oct 11 01:16:54 crc kubenswrapper[4743]: I1011 01:16:54.229006 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/db0cfd47-4287-4526-8e6b-0fd5bd770a1c-config-data-custom\") pod \"heat-cfnapi-7b7546fb69-4lvd7\" (UID: \"db0cfd47-4287-4526-8e6b-0fd5bd770a1c\") " pod="openstack/heat-cfnapi-7b7546fb69-4lvd7" Oct 11 01:16:54 crc kubenswrapper[4743]: I1011 01:16:54.232069 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btq4w\" (UniqueName: \"kubernetes.io/projected/db0cfd47-4287-4526-8e6b-0fd5bd770a1c-kube-api-access-btq4w\") pod \"heat-cfnapi-7b7546fb69-4lvd7\" (UID: \"db0cfd47-4287-4526-8e6b-0fd5bd770a1c\") " pod="openstack/heat-cfnapi-7b7546fb69-4lvd7" Oct 11 01:16:54 crc kubenswrapper[4743]: I1011 01:16:54.232876 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wr4vh\" (UniqueName: \"kubernetes.io/projected/de562f4f-80d8-407b-bf5a-9b584e013294-kube-api-access-wr4vh\") pod \"heat-api-657c9dff6b-rgphg\" (UID: \"de562f4f-80d8-407b-bf5a-9b584e013294\") " pod="openstack/heat-api-657c9dff6b-rgphg" Oct 11 01:16:54 crc kubenswrapper[4743]: I1011 01:16:54.306792 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-57fbf8bd-mp4f8" Oct 11 01:16:54 crc kubenswrapper[4743]: I1011 01:16:54.372737 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-657c9dff6b-rgphg" Oct 11 01:16:54 crc kubenswrapper[4743]: I1011 01:16:54.516630 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-7b7546fb69-4lvd7" Oct 11 01:16:54 crc kubenswrapper[4743]: I1011 01:16:54.914558 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-57fbf8bd-mp4f8"] Oct 11 01:16:55 crc kubenswrapper[4743]: I1011 01:16:55.017920 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-27lqn"] Oct 11 01:16:55 crc kubenswrapper[4743]: I1011 01:16:55.034247 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5cf7b6cbf7-7wrx8" Oct 11 01:16:55 crc kubenswrapper[4743]: I1011 01:16:55.034814 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-27lqn"] Oct 11 01:16:55 crc kubenswrapper[4743]: I1011 01:16:55.034455 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-27lqn" Oct 11 01:16:55 crc kubenswrapper[4743]: I1011 01:16:55.113374 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-57fbf8bd-mp4f8" event={"ID":"ae21bf76-1584-4681-b679-29abbf1ef22a","Type":"ContainerStarted","Data":"3dcc80b4b88a30d68c96879c8ba2b77018c348c83810edd9af2754a7dbbf9960"} Oct 11 01:16:55 crc kubenswrapper[4743]: I1011 01:16:55.127830 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-657c9dff6b-rgphg"] Oct 11 01:16:55 crc kubenswrapper[4743]: I1011 01:16:55.142198 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff506255-3f2c-4edd-8c13-dea86ef463ae-catalog-content\") pod \"community-operators-27lqn\" (UID: \"ff506255-3f2c-4edd-8c13-dea86ef463ae\") " pod="openshift-marketplace/community-operators-27lqn" Oct 11 01:16:55 crc kubenswrapper[4743]: I1011 01:16:55.142278 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff506255-3f2c-4edd-8c13-dea86ef463ae-utilities\") pod \"community-operators-27lqn\" (UID: \"ff506255-3f2c-4edd-8c13-dea86ef463ae\") " pod="openshift-marketplace/community-operators-27lqn" Oct 11 01:16:55 crc kubenswrapper[4743]: I1011 01:16:55.142534 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fndg\" (UniqueName: \"kubernetes.io/projected/ff506255-3f2c-4edd-8c13-dea86ef463ae-kube-api-access-5fndg\") pod \"community-operators-27lqn\" (UID: \"ff506255-3f2c-4edd-8c13-dea86ef463ae\") " pod="openshift-marketplace/community-operators-27lqn" Oct 11 01:16:55 crc kubenswrapper[4743]: I1011 01:16:55.167935 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b75489c6f-b6r4b"] Oct 11 01:16:55 crc 
kubenswrapper[4743]: I1011 01:16:55.168297 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b75489c6f-b6r4b" podUID="05b1ed8b-8016-4523-afc2-b00ce65b4042" containerName="dnsmasq-dns" containerID="cri-o://30548d23cccafe964a4952eb3531005db4eb2292508d6b72c658aeb0b2e37b26" gracePeriod=10 Oct 11 01:16:55 crc kubenswrapper[4743]: I1011 01:16:55.245555 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff506255-3f2c-4edd-8c13-dea86ef463ae-catalog-content\") pod \"community-operators-27lqn\" (UID: \"ff506255-3f2c-4edd-8c13-dea86ef463ae\") " pod="openshift-marketplace/community-operators-27lqn" Oct 11 01:16:55 crc kubenswrapper[4743]: I1011 01:16:55.245664 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff506255-3f2c-4edd-8c13-dea86ef463ae-utilities\") pod \"community-operators-27lqn\" (UID: \"ff506255-3f2c-4edd-8c13-dea86ef463ae\") " pod="openshift-marketplace/community-operators-27lqn" Oct 11 01:16:55 crc kubenswrapper[4743]: I1011 01:16:55.245875 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fndg\" (UniqueName: \"kubernetes.io/projected/ff506255-3f2c-4edd-8c13-dea86ef463ae-kube-api-access-5fndg\") pod \"community-operators-27lqn\" (UID: \"ff506255-3f2c-4edd-8c13-dea86ef463ae\") " pod="openshift-marketplace/community-operators-27lqn" Oct 11 01:16:55 crc kubenswrapper[4743]: I1011 01:16:55.246822 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff506255-3f2c-4edd-8c13-dea86ef463ae-catalog-content\") pod \"community-operators-27lqn\" (UID: \"ff506255-3f2c-4edd-8c13-dea86ef463ae\") " pod="openshift-marketplace/community-operators-27lqn" Oct 11 01:16:55 crc kubenswrapper[4743]: I1011 01:16:55.246892 4743 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-7b7546fb69-4lvd7"] Oct 11 01:16:55 crc kubenswrapper[4743]: I1011 01:16:55.247203 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff506255-3f2c-4edd-8c13-dea86ef463ae-utilities\") pod \"community-operators-27lqn\" (UID: \"ff506255-3f2c-4edd-8c13-dea86ef463ae\") " pod="openshift-marketplace/community-operators-27lqn" Oct 11 01:16:55 crc kubenswrapper[4743]: W1011 01:16:55.262401 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddb0cfd47_4287_4526_8e6b_0fd5bd770a1c.slice/crio-e963afea429a265f02fe3c0a88f048dbfad22308618ad2a6e0612c56bf59b264 WatchSource:0}: Error finding container e963afea429a265f02fe3c0a88f048dbfad22308618ad2a6e0612c56bf59b264: Status 404 returned error can't find the container with id e963afea429a265f02fe3c0a88f048dbfad22308618ad2a6e0612c56bf59b264 Oct 11 01:16:55 crc kubenswrapper[4743]: I1011 01:16:55.266958 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fndg\" (UniqueName: \"kubernetes.io/projected/ff506255-3f2c-4edd-8c13-dea86ef463ae-kube-api-access-5fndg\") pod \"community-operators-27lqn\" (UID: \"ff506255-3f2c-4edd-8c13-dea86ef463ae\") " pod="openshift-marketplace/community-operators-27lqn" Oct 11 01:16:55 crc kubenswrapper[4743]: I1011 01:16:55.387285 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-27lqn" Oct 11 01:16:56 crc kubenswrapper[4743]: I1011 01:16:56.003061 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b75489c6f-b6r4b" Oct 11 01:16:56 crc kubenswrapper[4743]: I1011 01:16:56.041433 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-27lqn"] Oct 11 01:16:56 crc kubenswrapper[4743]: I1011 01:16:56.069335 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/05b1ed8b-8016-4523-afc2-b00ce65b4042-ovsdbserver-sb\") pod \"05b1ed8b-8016-4523-afc2-b00ce65b4042\" (UID: \"05b1ed8b-8016-4523-afc2-b00ce65b4042\") " Oct 11 01:16:56 crc kubenswrapper[4743]: I1011 01:16:56.069395 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05b1ed8b-8016-4523-afc2-b00ce65b4042-config\") pod \"05b1ed8b-8016-4523-afc2-b00ce65b4042\" (UID: \"05b1ed8b-8016-4523-afc2-b00ce65b4042\") " Oct 11 01:16:56 crc kubenswrapper[4743]: I1011 01:16:56.069568 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nn6tj\" (UniqueName: \"kubernetes.io/projected/05b1ed8b-8016-4523-afc2-b00ce65b4042-kube-api-access-nn6tj\") pod \"05b1ed8b-8016-4523-afc2-b00ce65b4042\" (UID: \"05b1ed8b-8016-4523-afc2-b00ce65b4042\") " Oct 11 01:16:56 crc kubenswrapper[4743]: I1011 01:16:56.069615 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/05b1ed8b-8016-4523-afc2-b00ce65b4042-dns-swift-storage-0\") pod \"05b1ed8b-8016-4523-afc2-b00ce65b4042\" (UID: \"05b1ed8b-8016-4523-afc2-b00ce65b4042\") " Oct 11 01:16:56 crc kubenswrapper[4743]: I1011 01:16:56.069645 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/05b1ed8b-8016-4523-afc2-b00ce65b4042-ovsdbserver-nb\") pod \"05b1ed8b-8016-4523-afc2-b00ce65b4042\" (UID: 
\"05b1ed8b-8016-4523-afc2-b00ce65b4042\") " Oct 11 01:16:56 crc kubenswrapper[4743]: I1011 01:16:56.069724 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/05b1ed8b-8016-4523-afc2-b00ce65b4042-dns-svc\") pod \"05b1ed8b-8016-4523-afc2-b00ce65b4042\" (UID: \"05b1ed8b-8016-4523-afc2-b00ce65b4042\") " Oct 11 01:16:56 crc kubenswrapper[4743]: I1011 01:16:56.069822 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/05b1ed8b-8016-4523-afc2-b00ce65b4042-openstack-edpm-ipam\") pod \"05b1ed8b-8016-4523-afc2-b00ce65b4042\" (UID: \"05b1ed8b-8016-4523-afc2-b00ce65b4042\") " Oct 11 01:16:56 crc kubenswrapper[4743]: I1011 01:16:56.088102 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05b1ed8b-8016-4523-afc2-b00ce65b4042-kube-api-access-nn6tj" (OuterVolumeSpecName: "kube-api-access-nn6tj") pod "05b1ed8b-8016-4523-afc2-b00ce65b4042" (UID: "05b1ed8b-8016-4523-afc2-b00ce65b4042"). InnerVolumeSpecName "kube-api-access-nn6tj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:16:56 crc kubenswrapper[4743]: I1011 01:16:56.172540 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nn6tj\" (UniqueName: \"kubernetes.io/projected/05b1ed8b-8016-4523-afc2-b00ce65b4042-kube-api-access-nn6tj\") on node \"crc\" DevicePath \"\"" Oct 11 01:16:56 crc kubenswrapper[4743]: I1011 01:16:56.182355 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05b1ed8b-8016-4523-afc2-b00ce65b4042-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "05b1ed8b-8016-4523-afc2-b00ce65b4042" (UID: "05b1ed8b-8016-4523-afc2-b00ce65b4042"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:16:56 crc kubenswrapper[4743]: I1011 01:16:56.183478 4743 generic.go:334] "Generic (PLEG): container finished" podID="05b1ed8b-8016-4523-afc2-b00ce65b4042" containerID="30548d23cccafe964a4952eb3531005db4eb2292508d6b72c658aeb0b2e37b26" exitCode=0 Oct 11 01:16:56 crc kubenswrapper[4743]: I1011 01:16:56.183783 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b75489c6f-b6r4b" Oct 11 01:16:56 crc kubenswrapper[4743]: I1011 01:16:56.187467 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05b1ed8b-8016-4523-afc2-b00ce65b4042-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "05b1ed8b-8016-4523-afc2-b00ce65b4042" (UID: "05b1ed8b-8016-4523-afc2-b00ce65b4042"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:16:56 crc kubenswrapper[4743]: I1011 01:16:56.191958 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-7b7546fb69-4lvd7" event={"ID":"db0cfd47-4287-4526-8e6b-0fd5bd770a1c","Type":"ContainerStarted","Data":"e963afea429a265f02fe3c0a88f048dbfad22308618ad2a6e0612c56bf59b264"} Oct 11 01:16:56 crc kubenswrapper[4743]: I1011 01:16:56.192969 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b75489c6f-b6r4b" event={"ID":"05b1ed8b-8016-4523-afc2-b00ce65b4042","Type":"ContainerDied","Data":"30548d23cccafe964a4952eb3531005db4eb2292508d6b72c658aeb0b2e37b26"} Oct 11 01:16:56 crc kubenswrapper[4743]: I1011 01:16:56.193112 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b75489c6f-b6r4b" event={"ID":"05b1ed8b-8016-4523-afc2-b00ce65b4042","Type":"ContainerDied","Data":"a8c7ffd007b5d591e438ca0fdd46c9dd2185a875b3c59f00875e8043dd0ed405"} Oct 11 01:16:56 crc kubenswrapper[4743]: I1011 01:16:56.193463 4743 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/community-operators-27lqn" event={"ID":"ff506255-3f2c-4edd-8c13-dea86ef463ae","Type":"ContainerStarted","Data":"f838c96f69fba687d486f397cc5a2df38229bf90b7cedfea9f901cf9763eda4a"} Oct 11 01:16:56 crc kubenswrapper[4743]: I1011 01:16:56.193573 4743 scope.go:117] "RemoveContainer" containerID="30548d23cccafe964a4952eb3531005db4eb2292508d6b72c658aeb0b2e37b26" Oct 11 01:16:56 crc kubenswrapper[4743]: I1011 01:16:56.193575 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05b1ed8b-8016-4523-afc2-b00ce65b4042-config" (OuterVolumeSpecName: "config") pod "05b1ed8b-8016-4523-afc2-b00ce65b4042" (UID: "05b1ed8b-8016-4523-afc2-b00ce65b4042"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:16:56 crc kubenswrapper[4743]: I1011 01:16:56.193987 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-657c9dff6b-rgphg" event={"ID":"de562f4f-80d8-407b-bf5a-9b584e013294","Type":"ContainerStarted","Data":"53597b7e6cb037c107eda05a2ef67d8c748224ef6264cc1b987b6b3751ca0187"} Oct 11 01:16:56 crc kubenswrapper[4743]: I1011 01:16:56.204407 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-57fbf8bd-mp4f8" event={"ID":"ae21bf76-1584-4681-b679-29abbf1ef22a","Type":"ContainerStarted","Data":"74bda7d5af2d862caec7aff6b0adf7862205ef57fbeb1de9023266cc399e94a2"} Oct 11 01:16:56 crc kubenswrapper[4743]: I1011 01:16:56.204456 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-57fbf8bd-mp4f8" Oct 11 01:16:56 crc kubenswrapper[4743]: I1011 01:16:56.205007 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05b1ed8b-8016-4523-afc2-b00ce65b4042-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "05b1ed8b-8016-4523-afc2-b00ce65b4042" (UID: "05b1ed8b-8016-4523-afc2-b00ce65b4042"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:16:56 crc kubenswrapper[4743]: I1011 01:16:56.236725 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05b1ed8b-8016-4523-afc2-b00ce65b4042-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "05b1ed8b-8016-4523-afc2-b00ce65b4042" (UID: "05b1ed8b-8016-4523-afc2-b00ce65b4042"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:16:56 crc kubenswrapper[4743]: I1011 01:16:56.246703 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-57fbf8bd-mp4f8" podStartSLOduration=3.246674384 podStartE2EDuration="3.246674384s" podCreationTimestamp="2025-10-11 01:16:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 01:16:56.222311301 +0000 UTC m=+1510.875291698" watchObservedRunningTime="2025-10-11 01:16:56.246674384 +0000 UTC m=+1510.899654781" Oct 11 01:16:56 crc kubenswrapper[4743]: I1011 01:16:56.247624 4743 scope.go:117] "RemoveContainer" containerID="c96b328d1eca40b616c1c88123a83d0fb09e9dab9f4a1f3de6020583e562b969" Oct 11 01:16:56 crc kubenswrapper[4743]: I1011 01:16:56.253800 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05b1ed8b-8016-4523-afc2-b00ce65b4042-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "05b1ed8b-8016-4523-afc2-b00ce65b4042" (UID: "05b1ed8b-8016-4523-afc2-b00ce65b4042"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:16:56 crc kubenswrapper[4743]: I1011 01:16:56.275343 4743 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/05b1ed8b-8016-4523-afc2-b00ce65b4042-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 11 01:16:56 crc kubenswrapper[4743]: I1011 01:16:56.275383 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/05b1ed8b-8016-4523-afc2-b00ce65b4042-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 11 01:16:56 crc kubenswrapper[4743]: I1011 01:16:56.275394 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05b1ed8b-8016-4523-afc2-b00ce65b4042-config\") on node \"crc\" DevicePath \"\"" Oct 11 01:16:56 crc kubenswrapper[4743]: I1011 01:16:56.275403 4743 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/05b1ed8b-8016-4523-afc2-b00ce65b4042-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 11 01:16:56 crc kubenswrapper[4743]: I1011 01:16:56.275411 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/05b1ed8b-8016-4523-afc2-b00ce65b4042-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 11 01:16:56 crc kubenswrapper[4743]: I1011 01:16:56.275419 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/05b1ed8b-8016-4523-afc2-b00ce65b4042-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 11 01:16:56 crc kubenswrapper[4743]: I1011 01:16:56.295304 4743 scope.go:117] "RemoveContainer" containerID="30548d23cccafe964a4952eb3531005db4eb2292508d6b72c658aeb0b2e37b26" Oct 11 01:16:56 crc kubenswrapper[4743]: E1011 01:16:56.296115 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"30548d23cccafe964a4952eb3531005db4eb2292508d6b72c658aeb0b2e37b26\": container with ID starting with 30548d23cccafe964a4952eb3531005db4eb2292508d6b72c658aeb0b2e37b26 not found: ID does not exist" containerID="30548d23cccafe964a4952eb3531005db4eb2292508d6b72c658aeb0b2e37b26" Oct 11 01:16:56 crc kubenswrapper[4743]: I1011 01:16:56.296201 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30548d23cccafe964a4952eb3531005db4eb2292508d6b72c658aeb0b2e37b26"} err="failed to get container status \"30548d23cccafe964a4952eb3531005db4eb2292508d6b72c658aeb0b2e37b26\": rpc error: code = NotFound desc = could not find container \"30548d23cccafe964a4952eb3531005db4eb2292508d6b72c658aeb0b2e37b26\": container with ID starting with 30548d23cccafe964a4952eb3531005db4eb2292508d6b72c658aeb0b2e37b26 not found: ID does not exist" Oct 11 01:16:56 crc kubenswrapper[4743]: I1011 01:16:56.296222 4743 scope.go:117] "RemoveContainer" containerID="c96b328d1eca40b616c1c88123a83d0fb09e9dab9f4a1f3de6020583e562b969" Oct 11 01:16:56 crc kubenswrapper[4743]: E1011 01:16:56.296513 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c96b328d1eca40b616c1c88123a83d0fb09e9dab9f4a1f3de6020583e562b969\": container with ID starting with c96b328d1eca40b616c1c88123a83d0fb09e9dab9f4a1f3de6020583e562b969 not found: ID does not exist" containerID="c96b328d1eca40b616c1c88123a83d0fb09e9dab9f4a1f3de6020583e562b969" Oct 11 01:16:56 crc kubenswrapper[4743]: I1011 01:16:56.296537 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c96b328d1eca40b616c1c88123a83d0fb09e9dab9f4a1f3de6020583e562b969"} err="failed to get container status \"c96b328d1eca40b616c1c88123a83d0fb09e9dab9f4a1f3de6020583e562b969\": rpc error: code = NotFound desc = could not find container 
\"c96b328d1eca40b616c1c88123a83d0fb09e9dab9f4a1f3de6020583e562b969\": container with ID starting with c96b328d1eca40b616c1c88123a83d0fb09e9dab9f4a1f3de6020583e562b969 not found: ID does not exist" Oct 11 01:16:56 crc kubenswrapper[4743]: I1011 01:16:56.614600 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b75489c6f-b6r4b"] Oct 11 01:16:56 crc kubenswrapper[4743]: I1011 01:16:56.637718 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b75489c6f-b6r4b"] Oct 11 01:16:57 crc kubenswrapper[4743]: I1011 01:16:57.092666 4743 scope.go:117] "RemoveContainer" containerID="4dd4c92bb1fe51db0d7814650823b2db5bc31ac1ef997677802e7324969fe0d1" Oct 11 01:16:57 crc kubenswrapper[4743]: E1011 01:16:57.093733 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:16:57 crc kubenswrapper[4743]: I1011 01:16:57.217916 4743 generic.go:334] "Generic (PLEG): container finished" podID="ff506255-3f2c-4edd-8c13-dea86ef463ae" containerID="e7c911d39d74a7db4879e8a2eb39090bd153508a17835e774b998ced89e7c39c" exitCode=0 Oct 11 01:16:57 crc kubenswrapper[4743]: I1011 01:16:57.218186 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-27lqn" event={"ID":"ff506255-3f2c-4edd-8c13-dea86ef463ae","Type":"ContainerDied","Data":"e7c911d39d74a7db4879e8a2eb39090bd153508a17835e774b998ced89e7c39c"} Oct 11 01:16:58 crc kubenswrapper[4743]: I1011 01:16:58.020211 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 11 01:16:58 crc kubenswrapper[4743]: I1011 
01:16:58.114253 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05b1ed8b-8016-4523-afc2-b00ce65b4042" path="/var/lib/kubelet/pods/05b1ed8b-8016-4523-afc2-b00ce65b4042/volumes" Oct 11 01:16:58 crc kubenswrapper[4743]: I1011 01:16:58.240893 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-7b7546fb69-4lvd7" event={"ID":"db0cfd47-4287-4526-8e6b-0fd5bd770a1c","Type":"ContainerStarted","Data":"b14a685ff46e2cac23acbbfb3398b6f4f79bf84b7dca744a2c1c9d5c55185298"} Oct 11 01:16:58 crc kubenswrapper[4743]: I1011 01:16:58.242129 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-7b7546fb69-4lvd7" Oct 11 01:16:58 crc kubenswrapper[4743]: I1011 01:16:58.245001 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-657c9dff6b-rgphg" event={"ID":"de562f4f-80d8-407b-bf5a-9b584e013294","Type":"ContainerStarted","Data":"47cf8a11b59ed8effe319f9691d9c64431f5af5856f62d313a5f3ca8b4b05e74"} Oct 11 01:16:58 crc kubenswrapper[4743]: I1011 01:16:58.245212 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-657c9dff6b-rgphg" Oct 11 01:16:58 crc kubenswrapper[4743]: I1011 01:16:58.262749 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-7b7546fb69-4lvd7" podStartSLOduration=1.842030424 podStartE2EDuration="4.262731355s" podCreationTimestamp="2025-10-11 01:16:54 +0000 UTC" firstStartedPulling="2025-10-11 01:16:55.271035645 +0000 UTC m=+1509.924016042" lastFinishedPulling="2025-10-11 01:16:57.691736576 +0000 UTC m=+1512.344716973" observedRunningTime="2025-10-11 01:16:58.260207673 +0000 UTC m=+1512.913188080" watchObservedRunningTime="2025-10-11 01:16:58.262731355 +0000 UTC m=+1512.915711752" Oct 11 01:16:58 crc kubenswrapper[4743]: I1011 01:16:58.285113 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-657c9dff6b-rgphg" 
podStartSLOduration=1.701883575 podStartE2EDuration="4.285097036s" podCreationTimestamp="2025-10-11 01:16:54 +0000 UTC" firstStartedPulling="2025-10-11 01:16:55.110910695 +0000 UTC m=+1509.763891092" lastFinishedPulling="2025-10-11 01:16:57.694124156 +0000 UTC m=+1512.347104553" observedRunningTime="2025-10-11 01:16:58.283295479 +0000 UTC m=+1512.936275876" watchObservedRunningTime="2025-10-11 01:16:58.285097036 +0000 UTC m=+1512.938077433" Oct 11 01:16:59 crc kubenswrapper[4743]: I1011 01:16:59.264576 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-27lqn" event={"ID":"ff506255-3f2c-4edd-8c13-dea86ef463ae","Type":"ContainerStarted","Data":"ec1f5c3e267ec111693f38276f3e033df949bb4d597eb7405d539464bcfb8b86"} Oct 11 01:17:00 crc kubenswrapper[4743]: I1011 01:17:00.276573 4743 generic.go:334] "Generic (PLEG): container finished" podID="ff506255-3f2c-4edd-8c13-dea86ef463ae" containerID="ec1f5c3e267ec111693f38276f3e033df949bb4d597eb7405d539464bcfb8b86" exitCode=0 Oct 11 01:17:00 crc kubenswrapper[4743]: I1011 01:17:00.276635 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-27lqn" event={"ID":"ff506255-3f2c-4edd-8c13-dea86ef463ae","Type":"ContainerDied","Data":"ec1f5c3e267ec111693f38276f3e033df949bb4d597eb7405d539464bcfb8b86"} Oct 11 01:17:01 crc kubenswrapper[4743]: I1011 01:17:01.287631 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-27lqn" event={"ID":"ff506255-3f2c-4edd-8c13-dea86ef463ae","Type":"ContainerStarted","Data":"4e0379136c88c05a6f695df4ba66cd33c8e9873217a91afc32bd824921e843f4"} Oct 11 01:17:01 crc kubenswrapper[4743]: I1011 01:17:01.315299 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-27lqn" podStartSLOduration=4.268599187 podStartE2EDuration="7.31528118s" podCreationTimestamp="2025-10-11 01:16:54 +0000 UTC" 
firstStartedPulling="2025-10-11 01:16:57.616060317 +0000 UTC m=+1512.269040714" lastFinishedPulling="2025-10-11 01:17:00.6627423 +0000 UTC m=+1515.315722707" observedRunningTime="2025-10-11 01:17:01.305907066 +0000 UTC m=+1515.958887463" watchObservedRunningTime="2025-10-11 01:17:01.31528118 +0000 UTC m=+1515.968261567" Oct 11 01:17:05 crc kubenswrapper[4743]: I1011 01:17:05.388331 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-27lqn" Oct 11 01:17:05 crc kubenswrapper[4743]: I1011 01:17:05.388806 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-27lqn" Oct 11 01:17:05 crc kubenswrapper[4743]: I1011 01:17:05.456883 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-27lqn" Oct 11 01:17:05 crc kubenswrapper[4743]: I1011 01:17:05.793492 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-657c9dff6b-rgphg" Oct 11 01:17:05 crc kubenswrapper[4743]: I1011 01:17:05.853463 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-698b7768c9-bwljp"] Oct 11 01:17:05 crc kubenswrapper[4743]: I1011 01:17:05.853985 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-api-698b7768c9-bwljp" podUID="ef2598ca-aa73-4d3e-adf8-7f94e68f2838" containerName="heat-api" containerID="cri-o://0f18677d262f1a6f8cfc2af7e7ff348e93a8d1dbda8f2829f94d84d8863d4523" gracePeriod=60 Oct 11 01:17:06 crc kubenswrapper[4743]: I1011 01:17:06.185848 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-7b7546fb69-4lvd7" Oct 11 01:17:06 crc kubenswrapper[4743]: I1011 01:17:06.270514 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-6877c7bb88-shzwl"] Oct 11 01:17:06 crc kubenswrapper[4743]: I1011 01:17:06.271021 4743 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-cfnapi-6877c7bb88-shzwl" podUID="0173521e-a9ee-43e3-9760-f3f12527c84b" containerName="heat-cfnapi" containerID="cri-o://6c934096d37c4cdb4c39aaaaca7da99027c6f330c116c0e3d5922b9970621182" gracePeriod=60 Oct 11 01:17:06 crc kubenswrapper[4743]: I1011 01:17:06.400765 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-27lqn" Oct 11 01:17:06 crc kubenswrapper[4743]: I1011 01:17:06.447453 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-27lqn"] Oct 11 01:17:08 crc kubenswrapper[4743]: I1011 01:17:08.176455 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2x9z6"] Oct 11 01:17:08 crc kubenswrapper[4743]: E1011 01:17:08.177248 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05b1ed8b-8016-4523-afc2-b00ce65b4042" containerName="init" Oct 11 01:17:08 crc kubenswrapper[4743]: I1011 01:17:08.177266 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="05b1ed8b-8016-4523-afc2-b00ce65b4042" containerName="init" Oct 11 01:17:08 crc kubenswrapper[4743]: E1011 01:17:08.177308 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05b1ed8b-8016-4523-afc2-b00ce65b4042" containerName="dnsmasq-dns" Oct 11 01:17:08 crc kubenswrapper[4743]: I1011 01:17:08.177316 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="05b1ed8b-8016-4523-afc2-b00ce65b4042" containerName="dnsmasq-dns" Oct 11 01:17:08 crc kubenswrapper[4743]: I1011 01:17:08.177561 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="05b1ed8b-8016-4523-afc2-b00ce65b4042" containerName="dnsmasq-dns" Oct 11 01:17:08 crc kubenswrapper[4743]: I1011 01:17:08.178511 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2x9z6" Oct 11 01:17:08 crc kubenswrapper[4743]: I1011 01:17:08.186644 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8vmmn" Oct 11 01:17:08 crc kubenswrapper[4743]: I1011 01:17:08.187290 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 11 01:17:08 crc kubenswrapper[4743]: I1011 01:17:08.187547 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 11 01:17:08 crc kubenswrapper[4743]: I1011 01:17:08.187747 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 11 01:17:08 crc kubenswrapper[4743]: I1011 01:17:08.196214 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2x9z6"] Oct 11 01:17:08 crc kubenswrapper[4743]: I1011 01:17:08.250989 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fc7b43c-1b31-4510-bb3e-a3e3017bd93e-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-2x9z6\" (UID: \"1fc7b43c-1b31-4510-bb3e-a3e3017bd93e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2x9z6" Oct 11 01:17:08 crc kubenswrapper[4743]: I1011 01:17:08.251102 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1fc7b43c-1b31-4510-bb3e-a3e3017bd93e-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-2x9z6\" (UID: \"1fc7b43c-1b31-4510-bb3e-a3e3017bd93e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2x9z6" Oct 11 01:17:08 crc kubenswrapper[4743]: I1011 01:17:08.251293 4743 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1fc7b43c-1b31-4510-bb3e-a3e3017bd93e-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-2x9z6\" (UID: \"1fc7b43c-1b31-4510-bb3e-a3e3017bd93e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2x9z6" Oct 11 01:17:08 crc kubenswrapper[4743]: I1011 01:17:08.251536 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfjcd\" (UniqueName: \"kubernetes.io/projected/1fc7b43c-1b31-4510-bb3e-a3e3017bd93e-kube-api-access-cfjcd\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-2x9z6\" (UID: \"1fc7b43c-1b31-4510-bb3e-a3e3017bd93e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2x9z6" Oct 11 01:17:08 crc kubenswrapper[4743]: I1011 01:17:08.353468 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1fc7b43c-1b31-4510-bb3e-a3e3017bd93e-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-2x9z6\" (UID: \"1fc7b43c-1b31-4510-bb3e-a3e3017bd93e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2x9z6" Oct 11 01:17:08 crc kubenswrapper[4743]: I1011 01:17:08.353559 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfjcd\" (UniqueName: \"kubernetes.io/projected/1fc7b43c-1b31-4510-bb3e-a3e3017bd93e-kube-api-access-cfjcd\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-2x9z6\" (UID: \"1fc7b43c-1b31-4510-bb3e-a3e3017bd93e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2x9z6" Oct 11 01:17:08 crc kubenswrapper[4743]: I1011 01:17:08.353684 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1fc7b43c-1b31-4510-bb3e-a3e3017bd93e-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-2x9z6\" (UID: \"1fc7b43c-1b31-4510-bb3e-a3e3017bd93e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2x9z6" Oct 11 01:17:08 crc kubenswrapper[4743]: I1011 01:17:08.353710 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1fc7b43c-1b31-4510-bb3e-a3e3017bd93e-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-2x9z6\" (UID: \"1fc7b43c-1b31-4510-bb3e-a3e3017bd93e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2x9z6" Oct 11 01:17:08 crc kubenswrapper[4743]: I1011 01:17:08.359249 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1fc7b43c-1b31-4510-bb3e-a3e3017bd93e-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-2x9z6\" (UID: \"1fc7b43c-1b31-4510-bb3e-a3e3017bd93e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2x9z6" Oct 11 01:17:08 crc kubenswrapper[4743]: I1011 01:17:08.363511 4743 generic.go:334] "Generic (PLEG): container finished" podID="38225901-8300-41cc-8e32-748b754660dc" containerID="8a1b6c38209d238277b415eeff99d3ed1a9aba732fcb003e750ae774044b4cf9" exitCode=0 Oct 11 01:17:08 crc kubenswrapper[4743]: I1011 01:17:08.363608 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"38225901-8300-41cc-8e32-748b754660dc","Type":"ContainerDied","Data":"8a1b6c38209d238277b415eeff99d3ed1a9aba732fcb003e750ae774044b4cf9"} Oct 11 01:17:08 crc kubenswrapper[4743]: I1011 01:17:08.364797 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fc7b43c-1b31-4510-bb3e-a3e3017bd93e-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-2x9z6\" 
(UID: \"1fc7b43c-1b31-4510-bb3e-a3e3017bd93e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2x9z6" Oct 11 01:17:08 crc kubenswrapper[4743]: I1011 01:17:08.365992 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1fc7b43c-1b31-4510-bb3e-a3e3017bd93e-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-2x9z6\" (UID: \"1fc7b43c-1b31-4510-bb3e-a3e3017bd93e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2x9z6" Oct 11 01:17:08 crc kubenswrapper[4743]: I1011 01:17:08.369005 4743 generic.go:334] "Generic (PLEG): container finished" podID="de84c29c-4168-4383-aadc-0d5cc0ba56f8" containerID="e2f210d4e9eb1cc201a1f88d953711eae2090cc32d56c1edae66e81284391db4" exitCode=0 Oct 11 01:17:08 crc kubenswrapper[4743]: I1011 01:17:08.369110 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"de84c29c-4168-4383-aadc-0d5cc0ba56f8","Type":"ContainerDied","Data":"e2f210d4e9eb1cc201a1f88d953711eae2090cc32d56c1edae66e81284391db4"} Oct 11 01:17:08 crc kubenswrapper[4743]: I1011 01:17:08.369310 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-27lqn" podUID="ff506255-3f2c-4edd-8c13-dea86ef463ae" containerName="registry-server" containerID="cri-o://4e0379136c88c05a6f695df4ba66cd33c8e9873217a91afc32bd824921e843f4" gracePeriod=2 Oct 11 01:17:08 crc kubenswrapper[4743]: I1011 01:17:08.389727 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfjcd\" (UniqueName: \"kubernetes.io/projected/1fc7b43c-1b31-4510-bb3e-a3e3017bd93e-kube-api-access-cfjcd\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-2x9z6\" (UID: \"1fc7b43c-1b31-4510-bb3e-a3e3017bd93e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2x9z6" Oct 11 01:17:08 crc kubenswrapper[4743]: I1011 01:17:08.510457 4743 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2x9z6" Oct 11 01:17:08 crc kubenswrapper[4743]: I1011 01:17:08.938814 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-27lqn" Oct 11 01:17:08 crc kubenswrapper[4743]: I1011 01:17:08.974464 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff506255-3f2c-4edd-8c13-dea86ef463ae-catalog-content\") pod \"ff506255-3f2c-4edd-8c13-dea86ef463ae\" (UID: \"ff506255-3f2c-4edd-8c13-dea86ef463ae\") " Oct 11 01:17:08 crc kubenswrapper[4743]: I1011 01:17:08.987180 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff506255-3f2c-4edd-8c13-dea86ef463ae-utilities\") pod \"ff506255-3f2c-4edd-8c13-dea86ef463ae\" (UID: \"ff506255-3f2c-4edd-8c13-dea86ef463ae\") " Oct 11 01:17:08 crc kubenswrapper[4743]: I1011 01:17:08.987231 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5fndg\" (UniqueName: \"kubernetes.io/projected/ff506255-3f2c-4edd-8c13-dea86ef463ae-kube-api-access-5fndg\") pod \"ff506255-3f2c-4edd-8c13-dea86ef463ae\" (UID: \"ff506255-3f2c-4edd-8c13-dea86ef463ae\") " Oct 11 01:17:08 crc kubenswrapper[4743]: I1011 01:17:08.988001 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff506255-3f2c-4edd-8c13-dea86ef463ae-utilities" (OuterVolumeSpecName: "utilities") pod "ff506255-3f2c-4edd-8c13-dea86ef463ae" (UID: "ff506255-3f2c-4edd-8c13-dea86ef463ae"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:17:08 crc kubenswrapper[4743]: I1011 01:17:08.988326 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff506255-3f2c-4edd-8c13-dea86ef463ae-utilities\") on node \"crc\" DevicePath \"\"" Oct 11 01:17:08 crc kubenswrapper[4743]: I1011 01:17:08.999144 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff506255-3f2c-4edd-8c13-dea86ef463ae-kube-api-access-5fndg" (OuterVolumeSpecName: "kube-api-access-5fndg") pod "ff506255-3f2c-4edd-8c13-dea86ef463ae" (UID: "ff506255-3f2c-4edd-8c13-dea86ef463ae"). InnerVolumeSpecName "kube-api-access-5fndg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:17:09 crc kubenswrapper[4743]: I1011 01:17:09.018113 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff506255-3f2c-4edd-8c13-dea86ef463ae-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ff506255-3f2c-4edd-8c13-dea86ef463ae" (UID: "ff506255-3f2c-4edd-8c13-dea86ef463ae"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:17:09 crc kubenswrapper[4743]: I1011 01:17:09.090639 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff506255-3f2c-4edd-8c13-dea86ef463ae-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 11 01:17:09 crc kubenswrapper[4743]: I1011 01:17:09.090681 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5fndg\" (UniqueName: \"kubernetes.io/projected/ff506255-3f2c-4edd-8c13-dea86ef463ae-kube-api-access-5fndg\") on node \"crc\" DevicePath \"\"" Oct 11 01:17:09 crc kubenswrapper[4743]: I1011 01:17:09.224340 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2x9z6"] Oct 11 01:17:09 crc kubenswrapper[4743]: W1011 01:17:09.228826 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1fc7b43c_1b31_4510_bb3e_a3e3017bd93e.slice/crio-ac8f756f2e2e2d89b60e6437564ba9001174f0b05676e0c848a65e76f2c51e21 WatchSource:0}: Error finding container ac8f756f2e2e2d89b60e6437564ba9001174f0b05676e0c848a65e76f2c51e21: Status 404 returned error can't find the container with id ac8f756f2e2e2d89b60e6437564ba9001174f0b05676e0c848a65e76f2c51e21 Oct 11 01:17:09 crc kubenswrapper[4743]: I1011 01:17:09.379280 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2x9z6" event={"ID":"1fc7b43c-1b31-4510-bb3e-a3e3017bd93e","Type":"ContainerStarted","Data":"ac8f756f2e2e2d89b60e6437564ba9001174f0b05676e0c848a65e76f2c51e21"} Oct 11 01:17:09 crc kubenswrapper[4743]: I1011 01:17:09.383954 4743 generic.go:334] "Generic (PLEG): container finished" podID="ff506255-3f2c-4edd-8c13-dea86ef463ae" containerID="4e0379136c88c05a6f695df4ba66cd33c8e9873217a91afc32bd824921e843f4" exitCode=0 Oct 11 01:17:09 crc kubenswrapper[4743]: I1011 01:17:09.384012 4743 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-27lqn" Oct 11 01:17:09 crc kubenswrapper[4743]: I1011 01:17:09.384057 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-27lqn" event={"ID":"ff506255-3f2c-4edd-8c13-dea86ef463ae","Type":"ContainerDied","Data":"4e0379136c88c05a6f695df4ba66cd33c8e9873217a91afc32bd824921e843f4"} Oct 11 01:17:09 crc kubenswrapper[4743]: I1011 01:17:09.384116 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-27lqn" event={"ID":"ff506255-3f2c-4edd-8c13-dea86ef463ae","Type":"ContainerDied","Data":"f838c96f69fba687d486f397cc5a2df38229bf90b7cedfea9f901cf9763eda4a"} Oct 11 01:17:09 crc kubenswrapper[4743]: I1011 01:17:09.384137 4743 scope.go:117] "RemoveContainer" containerID="4e0379136c88c05a6f695df4ba66cd33c8e9873217a91afc32bd824921e843f4" Oct 11 01:17:09 crc kubenswrapper[4743]: I1011 01:17:09.387145 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"de84c29c-4168-4383-aadc-0d5cc0ba56f8","Type":"ContainerStarted","Data":"ee1f887c6400732ebf101f8251f5ea1d823acdcea465e7a881ce41be49844e51"} Oct 11 01:17:09 crc kubenswrapper[4743]: I1011 01:17:09.387470 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 11 01:17:09 crc kubenswrapper[4743]: I1011 01:17:09.391395 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"38225901-8300-41cc-8e32-748b754660dc","Type":"ContainerStarted","Data":"82926559fc5bdbf39409c37f7c2e4d070226a9ffcd0d04b2e71b3cef2fdafa03"} Oct 11 01:17:09 crc kubenswrapper[4743]: I1011 01:17:09.392307 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 11 01:17:09 crc kubenswrapper[4743]: I1011 01:17:09.444775 4743 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.444751362 podStartE2EDuration="37.444751362s" podCreationTimestamp="2025-10-11 01:16:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 01:17:09.415324206 +0000 UTC m=+1524.068304603" watchObservedRunningTime="2025-10-11 01:17:09.444751362 +0000 UTC m=+1524.097731769" Oct 11 01:17:09 crc kubenswrapper[4743]: I1011 01:17:09.453957 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.453936262 podStartE2EDuration="37.453936262s" podCreationTimestamp="2025-10-11 01:16:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 01:17:09.451517402 +0000 UTC m=+1524.104497809" watchObservedRunningTime="2025-10-11 01:17:09.453936262 +0000 UTC m=+1524.106916659" Oct 11 01:17:09 crc kubenswrapper[4743]: I1011 01:17:09.455209 4743 scope.go:117] "RemoveContainer" containerID="ec1f5c3e267ec111693f38276f3e033df949bb4d597eb7405d539464bcfb8b86" Oct 11 01:17:09 crc kubenswrapper[4743]: I1011 01:17:09.482341 4743 scope.go:117] "RemoveContainer" containerID="e7c911d39d74a7db4879e8a2eb39090bd153508a17835e774b998ced89e7c39c" Oct 11 01:17:09 crc kubenswrapper[4743]: I1011 01:17:09.484151 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-27lqn"] Oct 11 01:17:09 crc kubenswrapper[4743]: I1011 01:17:09.497133 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-27lqn"] Oct 11 01:17:09 crc kubenswrapper[4743]: I1011 01:17:09.497223 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-6877c7bb88-shzwl" podUID="0173521e-a9ee-43e3-9760-f3f12527c84b" 
containerName="heat-cfnapi" probeResult="failure" output="Get \"https://10.217.0.211:8000/healthcheck\": read tcp 10.217.0.2:49272->10.217.0.211:8000: read: connection reset by peer" Oct 11 01:17:09 crc kubenswrapper[4743]: I1011 01:17:09.546143 4743 scope.go:117] "RemoveContainer" containerID="4e0379136c88c05a6f695df4ba66cd33c8e9873217a91afc32bd824921e843f4" Oct 11 01:17:09 crc kubenswrapper[4743]: E1011 01:17:09.546580 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e0379136c88c05a6f695df4ba66cd33c8e9873217a91afc32bd824921e843f4\": container with ID starting with 4e0379136c88c05a6f695df4ba66cd33c8e9873217a91afc32bd824921e843f4 not found: ID does not exist" containerID="4e0379136c88c05a6f695df4ba66cd33c8e9873217a91afc32bd824921e843f4" Oct 11 01:17:09 crc kubenswrapper[4743]: I1011 01:17:09.546609 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e0379136c88c05a6f695df4ba66cd33c8e9873217a91afc32bd824921e843f4"} err="failed to get container status \"4e0379136c88c05a6f695df4ba66cd33c8e9873217a91afc32bd824921e843f4\": rpc error: code = NotFound desc = could not find container \"4e0379136c88c05a6f695df4ba66cd33c8e9873217a91afc32bd824921e843f4\": container with ID starting with 4e0379136c88c05a6f695df4ba66cd33c8e9873217a91afc32bd824921e843f4 not found: ID does not exist" Oct 11 01:17:09 crc kubenswrapper[4743]: I1011 01:17:09.546636 4743 scope.go:117] "RemoveContainer" containerID="ec1f5c3e267ec111693f38276f3e033df949bb4d597eb7405d539464bcfb8b86" Oct 11 01:17:09 crc kubenswrapper[4743]: E1011 01:17:09.547134 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec1f5c3e267ec111693f38276f3e033df949bb4d597eb7405d539464bcfb8b86\": container with ID starting with ec1f5c3e267ec111693f38276f3e033df949bb4d597eb7405d539464bcfb8b86 not found: ID does not exist" 
containerID="ec1f5c3e267ec111693f38276f3e033df949bb4d597eb7405d539464bcfb8b86" Oct 11 01:17:09 crc kubenswrapper[4743]: I1011 01:17:09.547162 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec1f5c3e267ec111693f38276f3e033df949bb4d597eb7405d539464bcfb8b86"} err="failed to get container status \"ec1f5c3e267ec111693f38276f3e033df949bb4d597eb7405d539464bcfb8b86\": rpc error: code = NotFound desc = could not find container \"ec1f5c3e267ec111693f38276f3e033df949bb4d597eb7405d539464bcfb8b86\": container with ID starting with ec1f5c3e267ec111693f38276f3e033df949bb4d597eb7405d539464bcfb8b86 not found: ID does not exist" Oct 11 01:17:09 crc kubenswrapper[4743]: I1011 01:17:09.547177 4743 scope.go:117] "RemoveContainer" containerID="e7c911d39d74a7db4879e8a2eb39090bd153508a17835e774b998ced89e7c39c" Oct 11 01:17:09 crc kubenswrapper[4743]: E1011 01:17:09.547704 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7c911d39d74a7db4879e8a2eb39090bd153508a17835e774b998ced89e7c39c\": container with ID starting with e7c911d39d74a7db4879e8a2eb39090bd153508a17835e774b998ced89e7c39c not found: ID does not exist" containerID="e7c911d39d74a7db4879e8a2eb39090bd153508a17835e774b998ced89e7c39c" Oct 11 01:17:09 crc kubenswrapper[4743]: I1011 01:17:09.547724 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7c911d39d74a7db4879e8a2eb39090bd153508a17835e774b998ced89e7c39c"} err="failed to get container status \"e7c911d39d74a7db4879e8a2eb39090bd153508a17835e774b998ced89e7c39c\": rpc error: code = NotFound desc = could not find container \"e7c911d39d74a7db4879e8a2eb39090bd153508a17835e774b998ced89e7c39c\": container with ID starting with e7c911d39d74a7db4879e8a2eb39090bd153508a17835e774b998ced89e7c39c not found: ID does not exist" Oct 11 01:17:09 crc kubenswrapper[4743]: I1011 01:17:09.997894 4743 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-6877c7bb88-shzwl" Oct 11 01:17:10 crc kubenswrapper[4743]: I1011 01:17:10.052355 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0173521e-a9ee-43e3-9760-f3f12527c84b-config-data-custom\") pod \"0173521e-a9ee-43e3-9760-f3f12527c84b\" (UID: \"0173521e-a9ee-43e3-9760-f3f12527c84b\") " Oct 11 01:17:10 crc kubenswrapper[4743]: I1011 01:17:10.052408 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0173521e-a9ee-43e3-9760-f3f12527c84b-config-data\") pod \"0173521e-a9ee-43e3-9760-f3f12527c84b\" (UID: \"0173521e-a9ee-43e3-9760-f3f12527c84b\") " Oct 11 01:17:10 crc kubenswrapper[4743]: I1011 01:17:10.052428 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0173521e-a9ee-43e3-9760-f3f12527c84b-public-tls-certs\") pod \"0173521e-a9ee-43e3-9760-f3f12527c84b\" (UID: \"0173521e-a9ee-43e3-9760-f3f12527c84b\") " Oct 11 01:17:10 crc kubenswrapper[4743]: I1011 01:17:10.052482 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b4b87\" (UniqueName: \"kubernetes.io/projected/0173521e-a9ee-43e3-9760-f3f12527c84b-kube-api-access-b4b87\") pod \"0173521e-a9ee-43e3-9760-f3f12527c84b\" (UID: \"0173521e-a9ee-43e3-9760-f3f12527c84b\") " Oct 11 01:17:10 crc kubenswrapper[4743]: I1011 01:17:10.052665 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0173521e-a9ee-43e3-9760-f3f12527c84b-combined-ca-bundle\") pod \"0173521e-a9ee-43e3-9760-f3f12527c84b\" (UID: \"0173521e-a9ee-43e3-9760-f3f12527c84b\") " Oct 11 01:17:10 crc kubenswrapper[4743]: I1011 01:17:10.052683 4743 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0173521e-a9ee-43e3-9760-f3f12527c84b-internal-tls-certs\") pod \"0173521e-a9ee-43e3-9760-f3f12527c84b\" (UID: \"0173521e-a9ee-43e3-9760-f3f12527c84b\") " Oct 11 01:17:10 crc kubenswrapper[4743]: I1011 01:17:10.063818 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0173521e-a9ee-43e3-9760-f3f12527c84b-kube-api-access-b4b87" (OuterVolumeSpecName: "kube-api-access-b4b87") pod "0173521e-a9ee-43e3-9760-f3f12527c84b" (UID: "0173521e-a9ee-43e3-9760-f3f12527c84b"). InnerVolumeSpecName "kube-api-access-b4b87". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:17:10 crc kubenswrapper[4743]: I1011 01:17:10.065041 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0173521e-a9ee-43e3-9760-f3f12527c84b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "0173521e-a9ee-43e3-9760-f3f12527c84b" (UID: "0173521e-a9ee-43e3-9760-f3f12527c84b"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:17:10 crc kubenswrapper[4743]: I1011 01:17:10.105164 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0173521e-a9ee-43e3-9760-f3f12527c84b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0173521e-a9ee-43e3-9760-f3f12527c84b" (UID: "0173521e-a9ee-43e3-9760-f3f12527c84b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:17:10 crc kubenswrapper[4743]: I1011 01:17:10.109210 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff506255-3f2c-4edd-8c13-dea86ef463ae" path="/var/lib/kubelet/pods/ff506255-3f2c-4edd-8c13-dea86ef463ae/volumes" Oct 11 01:17:10 crc kubenswrapper[4743]: I1011 01:17:10.128373 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0173521e-a9ee-43e3-9760-f3f12527c84b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "0173521e-a9ee-43e3-9760-f3f12527c84b" (UID: "0173521e-a9ee-43e3-9760-f3f12527c84b"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:17:10 crc kubenswrapper[4743]: I1011 01:17:10.144410 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0173521e-a9ee-43e3-9760-f3f12527c84b-config-data" (OuterVolumeSpecName: "config-data") pod "0173521e-a9ee-43e3-9760-f3f12527c84b" (UID: "0173521e-a9ee-43e3-9760-f3f12527c84b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:17:10 crc kubenswrapper[4743]: I1011 01:17:10.151586 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0173521e-a9ee-43e3-9760-f3f12527c84b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "0173521e-a9ee-43e3-9760-f3f12527c84b" (UID: "0173521e-a9ee-43e3-9760-f3f12527c84b"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:17:10 crc kubenswrapper[4743]: I1011 01:17:10.156076 4743 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0173521e-a9ee-43e3-9760-f3f12527c84b-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 11 01:17:10 crc kubenswrapper[4743]: I1011 01:17:10.156121 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0173521e-a9ee-43e3-9760-f3f12527c84b-config-data\") on node \"crc\" DevicePath \"\"" Oct 11 01:17:10 crc kubenswrapper[4743]: I1011 01:17:10.156136 4743 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0173521e-a9ee-43e3-9760-f3f12527c84b-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 11 01:17:10 crc kubenswrapper[4743]: I1011 01:17:10.156148 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b4b87\" (UniqueName: \"kubernetes.io/projected/0173521e-a9ee-43e3-9760-f3f12527c84b-kube-api-access-b4b87\") on node \"crc\" DevicePath \"\"" Oct 11 01:17:10 crc kubenswrapper[4743]: I1011 01:17:10.156160 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0173521e-a9ee-43e3-9760-f3f12527c84b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 01:17:10 crc kubenswrapper[4743]: I1011 01:17:10.156170 4743 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0173521e-a9ee-43e3-9760-f3f12527c84b-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 11 01:17:10 crc kubenswrapper[4743]: I1011 01:17:10.338277 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-698b7768c9-bwljp" podUID="ef2598ca-aa73-4d3e-adf8-7f94e68f2838" containerName="heat-api" probeResult="failure" output="Get \"https://10.217.0.210:8004/healthcheck\": read 
tcp 10.217.0.2:36582->10.217.0.210:8004: read: connection reset by peer" Oct 11 01:17:10 crc kubenswrapper[4743]: I1011 01:17:10.410384 4743 generic.go:334] "Generic (PLEG): container finished" podID="0173521e-a9ee-43e3-9760-f3f12527c84b" containerID="6c934096d37c4cdb4c39aaaaca7da99027c6f330c116c0e3d5922b9970621182" exitCode=0 Oct 11 01:17:10 crc kubenswrapper[4743]: I1011 01:17:10.410440 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-6877c7bb88-shzwl" Oct 11 01:17:10 crc kubenswrapper[4743]: I1011 01:17:10.410459 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6877c7bb88-shzwl" event={"ID":"0173521e-a9ee-43e3-9760-f3f12527c84b","Type":"ContainerDied","Data":"6c934096d37c4cdb4c39aaaaca7da99027c6f330c116c0e3d5922b9970621182"} Oct 11 01:17:10 crc kubenswrapper[4743]: I1011 01:17:10.411158 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6877c7bb88-shzwl" event={"ID":"0173521e-a9ee-43e3-9760-f3f12527c84b","Type":"ContainerDied","Data":"f86bb7d938d22431051f103e8eefbd9b1a949d387e1691a794ca3987ebb430c4"} Oct 11 01:17:10 crc kubenswrapper[4743]: I1011 01:17:10.411180 4743 scope.go:117] "RemoveContainer" containerID="6c934096d37c4cdb4c39aaaaca7da99027c6f330c116c0e3d5922b9970621182" Oct 11 01:17:10 crc kubenswrapper[4743]: I1011 01:17:10.436642 4743 generic.go:334] "Generic (PLEG): container finished" podID="ef2598ca-aa73-4d3e-adf8-7f94e68f2838" containerID="0f18677d262f1a6f8cfc2af7e7ff348e93a8d1dbda8f2829f94d84d8863d4523" exitCode=0 Oct 11 01:17:10 crc kubenswrapper[4743]: I1011 01:17:10.437681 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-698b7768c9-bwljp" event={"ID":"ef2598ca-aa73-4d3e-adf8-7f94e68f2838","Type":"ContainerDied","Data":"0f18677d262f1a6f8cfc2af7e7ff348e93a8d1dbda8f2829f94d84d8863d4523"} Oct 11 01:17:10 crc kubenswrapper[4743]: I1011 01:17:10.490794 4743 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/heat-cfnapi-6877c7bb88-shzwl"] Oct 11 01:17:10 crc kubenswrapper[4743]: I1011 01:17:10.503477 4743 scope.go:117] "RemoveContainer" containerID="6c934096d37c4cdb4c39aaaaca7da99027c6f330c116c0e3d5922b9970621182" Oct 11 01:17:10 crc kubenswrapper[4743]: I1011 01:17:10.505626 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-6877c7bb88-shzwl"] Oct 11 01:17:10 crc kubenswrapper[4743]: E1011 01:17:10.519242 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c934096d37c4cdb4c39aaaaca7da99027c6f330c116c0e3d5922b9970621182\": container with ID starting with 6c934096d37c4cdb4c39aaaaca7da99027c6f330c116c0e3d5922b9970621182 not found: ID does not exist" containerID="6c934096d37c4cdb4c39aaaaca7da99027c6f330c116c0e3d5922b9970621182" Oct 11 01:17:10 crc kubenswrapper[4743]: I1011 01:17:10.519279 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c934096d37c4cdb4c39aaaaca7da99027c6f330c116c0e3d5922b9970621182"} err="failed to get container status \"6c934096d37c4cdb4c39aaaaca7da99027c6f330c116c0e3d5922b9970621182\": rpc error: code = NotFound desc = could not find container \"6c934096d37c4cdb4c39aaaaca7da99027c6f330c116c0e3d5922b9970621182\": container with ID starting with 6c934096d37c4cdb4c39aaaaca7da99027c6f330c116c0e3d5922b9970621182 not found: ID does not exist" Oct 11 01:17:10 crc kubenswrapper[4743]: I1011 01:17:10.901261 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-698b7768c9-bwljp" Oct 11 01:17:10 crc kubenswrapper[4743]: I1011 01:17:10.976758 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72vn9\" (UniqueName: \"kubernetes.io/projected/ef2598ca-aa73-4d3e-adf8-7f94e68f2838-kube-api-access-72vn9\") pod \"ef2598ca-aa73-4d3e-adf8-7f94e68f2838\" (UID: \"ef2598ca-aa73-4d3e-adf8-7f94e68f2838\") " Oct 11 01:17:10 crc kubenswrapper[4743]: I1011 01:17:10.976844 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef2598ca-aa73-4d3e-adf8-7f94e68f2838-combined-ca-bundle\") pod \"ef2598ca-aa73-4d3e-adf8-7f94e68f2838\" (UID: \"ef2598ca-aa73-4d3e-adf8-7f94e68f2838\") " Oct 11 01:17:10 crc kubenswrapper[4743]: I1011 01:17:10.976946 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ef2598ca-aa73-4d3e-adf8-7f94e68f2838-config-data-custom\") pod \"ef2598ca-aa73-4d3e-adf8-7f94e68f2838\" (UID: \"ef2598ca-aa73-4d3e-adf8-7f94e68f2838\") " Oct 11 01:17:10 crc kubenswrapper[4743]: I1011 01:17:10.977059 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef2598ca-aa73-4d3e-adf8-7f94e68f2838-config-data\") pod \"ef2598ca-aa73-4d3e-adf8-7f94e68f2838\" (UID: \"ef2598ca-aa73-4d3e-adf8-7f94e68f2838\") " Oct 11 01:17:10 crc kubenswrapper[4743]: I1011 01:17:10.977096 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef2598ca-aa73-4d3e-adf8-7f94e68f2838-internal-tls-certs\") pod \"ef2598ca-aa73-4d3e-adf8-7f94e68f2838\" (UID: \"ef2598ca-aa73-4d3e-adf8-7f94e68f2838\") " Oct 11 01:17:10 crc kubenswrapper[4743]: I1011 01:17:10.977116 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef2598ca-aa73-4d3e-adf8-7f94e68f2838-public-tls-certs\") pod \"ef2598ca-aa73-4d3e-adf8-7f94e68f2838\" (UID: \"ef2598ca-aa73-4d3e-adf8-7f94e68f2838\") " Oct 11 01:17:10 crc kubenswrapper[4743]: I1011 01:17:10.982175 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef2598ca-aa73-4d3e-adf8-7f94e68f2838-kube-api-access-72vn9" (OuterVolumeSpecName: "kube-api-access-72vn9") pod "ef2598ca-aa73-4d3e-adf8-7f94e68f2838" (UID: "ef2598ca-aa73-4d3e-adf8-7f94e68f2838"). InnerVolumeSpecName "kube-api-access-72vn9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:17:10 crc kubenswrapper[4743]: I1011 01:17:10.995683 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef2598ca-aa73-4d3e-adf8-7f94e68f2838-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ef2598ca-aa73-4d3e-adf8-7f94e68f2838" (UID: "ef2598ca-aa73-4d3e-adf8-7f94e68f2838"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:17:11 crc kubenswrapper[4743]: I1011 01:17:11.030587 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef2598ca-aa73-4d3e-adf8-7f94e68f2838-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ef2598ca-aa73-4d3e-adf8-7f94e68f2838" (UID: "ef2598ca-aa73-4d3e-adf8-7f94e68f2838"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:17:11 crc kubenswrapper[4743]: I1011 01:17:11.035108 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef2598ca-aa73-4d3e-adf8-7f94e68f2838-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ef2598ca-aa73-4d3e-adf8-7f94e68f2838" (UID: "ef2598ca-aa73-4d3e-adf8-7f94e68f2838"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:17:11 crc kubenswrapper[4743]: I1011 01:17:11.057306 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef2598ca-aa73-4d3e-adf8-7f94e68f2838-config-data" (OuterVolumeSpecName: "config-data") pod "ef2598ca-aa73-4d3e-adf8-7f94e68f2838" (UID: "ef2598ca-aa73-4d3e-adf8-7f94e68f2838"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:17:11 crc kubenswrapper[4743]: I1011 01:17:11.060841 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef2598ca-aa73-4d3e-adf8-7f94e68f2838-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ef2598ca-aa73-4d3e-adf8-7f94e68f2838" (UID: "ef2598ca-aa73-4d3e-adf8-7f94e68f2838"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:17:11 crc kubenswrapper[4743]: I1011 01:17:11.079790 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-72vn9\" (UniqueName: \"kubernetes.io/projected/ef2598ca-aa73-4d3e-adf8-7f94e68f2838-kube-api-access-72vn9\") on node \"crc\" DevicePath \"\"" Oct 11 01:17:11 crc kubenswrapper[4743]: I1011 01:17:11.079822 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef2598ca-aa73-4d3e-adf8-7f94e68f2838-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 01:17:11 crc kubenswrapper[4743]: I1011 01:17:11.079831 4743 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ef2598ca-aa73-4d3e-adf8-7f94e68f2838-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 11 01:17:11 crc kubenswrapper[4743]: I1011 01:17:11.079841 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef2598ca-aa73-4d3e-adf8-7f94e68f2838-config-data\") on node \"crc\" 
DevicePath \"\"" Oct 11 01:17:11 crc kubenswrapper[4743]: I1011 01:17:11.079850 4743 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef2598ca-aa73-4d3e-adf8-7f94e68f2838-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 11 01:17:11 crc kubenswrapper[4743]: I1011 01:17:11.079869 4743 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef2598ca-aa73-4d3e-adf8-7f94e68f2838-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 11 01:17:11 crc kubenswrapper[4743]: I1011 01:17:11.453792 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-698b7768c9-bwljp" event={"ID":"ef2598ca-aa73-4d3e-adf8-7f94e68f2838","Type":"ContainerDied","Data":"6865f057330b7a2add702a63f04dc551183fed004099d0b45d756a59497c7170"} Oct 11 01:17:11 crc kubenswrapper[4743]: I1011 01:17:11.453825 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-698b7768c9-bwljp" Oct 11 01:17:11 crc kubenswrapper[4743]: I1011 01:17:11.453841 4743 scope.go:117] "RemoveContainer" containerID="0f18677d262f1a6f8cfc2af7e7ff348e93a8d1dbda8f2829f94d84d8863d4523" Oct 11 01:17:11 crc kubenswrapper[4743]: I1011 01:17:11.493165 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-698b7768c9-bwljp"] Oct 11 01:17:11 crc kubenswrapper[4743]: I1011 01:17:11.501529 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-698b7768c9-bwljp"] Oct 11 01:17:12 crc kubenswrapper[4743]: I1011 01:17:12.092060 4743 scope.go:117] "RemoveContainer" containerID="4dd4c92bb1fe51db0d7814650823b2db5bc31ac1ef997677802e7324969fe0d1" Oct 11 01:17:12 crc kubenswrapper[4743]: E1011 01:17:12.092757 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:17:12 crc kubenswrapper[4743]: I1011 01:17:12.104272 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0173521e-a9ee-43e3-9760-f3f12527c84b" path="/var/lib/kubelet/pods/0173521e-a9ee-43e3-9760-f3f12527c84b/volumes" Oct 11 01:17:12 crc kubenswrapper[4743]: I1011 01:17:12.105574 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef2598ca-aa73-4d3e-adf8-7f94e68f2838" path="/var/lib/kubelet/pods/ef2598ca-aa73-4d3e-adf8-7f94e68f2838/volumes" Oct 11 01:17:14 crc kubenswrapper[4743]: I1011 01:17:14.355610 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-57fbf8bd-mp4f8" Oct 11 01:17:14 crc kubenswrapper[4743]: I1011 01:17:14.404055 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-fdd7c75fc-rtmvl"] Oct 11 01:17:14 crc kubenswrapper[4743]: I1011 01:17:14.404259 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-engine-fdd7c75fc-rtmvl" podUID="5e39acaa-0992-471f-a015-46714fde82cf" containerName="heat-engine" containerID="cri-o://18f7a239f2e0707135ecd80c3ce36f7686fd8c2c29bfe008d2373815e615ee5c" gracePeriod=60 Oct 11 01:17:14 crc kubenswrapper[4743]: I1011 01:17:14.487738 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/swift-proxy-744b8cd687-p7lgl" podUID="68219217-d875-4eb2-9611-b9afb0f64c45" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Oct 11 01:17:19 crc kubenswrapper[4743]: I1011 01:17:19.566286 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2x9z6" 
event={"ID":"1fc7b43c-1b31-4510-bb3e-a3e3017bd93e","Type":"ContainerStarted","Data":"0a4accac8eb229d5f59e86320d47a84e76855054259ad92d5660aecd9d182161"} Oct 11 01:17:19 crc kubenswrapper[4743]: I1011 01:17:19.597998 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2x9z6" podStartSLOduration=2.006872969 podStartE2EDuration="11.597973756s" podCreationTimestamp="2025-10-11 01:17:08 +0000 UTC" firstStartedPulling="2025-10-11 01:17:09.231836304 +0000 UTC m=+1523.884816701" lastFinishedPulling="2025-10-11 01:17:18.822937091 +0000 UTC m=+1533.475917488" observedRunningTime="2025-10-11 01:17:19.590702626 +0000 UTC m=+1534.243683033" watchObservedRunningTime="2025-10-11 01:17:19.597973756 +0000 UTC m=+1534.250954173" Oct 11 01:17:22 crc kubenswrapper[4743]: E1011 01:17:22.593811 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="18f7a239f2e0707135ecd80c3ce36f7686fd8c2c29bfe008d2373815e615ee5c" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Oct 11 01:17:22 crc kubenswrapper[4743]: E1011 01:17:22.599335 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="18f7a239f2e0707135ecd80c3ce36f7686fd8c2c29bfe008d2373815e615ee5c" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Oct 11 01:17:22 crc kubenswrapper[4743]: E1011 01:17:22.601263 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="18f7a239f2e0707135ecd80c3ce36f7686fd8c2c29bfe008d2373815e615ee5c" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Oct 11 01:17:22 crc 
kubenswrapper[4743]: E1011 01:17:22.601310 4743 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-fdd7c75fc-rtmvl" podUID="5e39acaa-0992-471f-a015-46714fde82cf" containerName="heat-engine" Oct 11 01:17:23 crc kubenswrapper[4743]: I1011 01:17:23.127223 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 11 01:17:23 crc kubenswrapper[4743]: I1011 01:17:23.145139 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 11 01:17:23 crc kubenswrapper[4743]: I1011 01:17:23.519829 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-592t9"] Oct 11 01:17:23 crc kubenswrapper[4743]: I1011 01:17:23.531544 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-592t9"] Oct 11 01:17:23 crc kubenswrapper[4743]: I1011 01:17:23.619108 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-47nj5"] Oct 11 01:17:23 crc kubenswrapper[4743]: E1011 01:17:23.619621 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0173521e-a9ee-43e3-9760-f3f12527c84b" containerName="heat-cfnapi" Oct 11 01:17:23 crc kubenswrapper[4743]: I1011 01:17:23.619634 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="0173521e-a9ee-43e3-9760-f3f12527c84b" containerName="heat-cfnapi" Oct 11 01:17:23 crc kubenswrapper[4743]: E1011 01:17:23.619644 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff506255-3f2c-4edd-8c13-dea86ef463ae" containerName="extract-content" Oct 11 01:17:23 crc kubenswrapper[4743]: I1011 01:17:23.619650 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff506255-3f2c-4edd-8c13-dea86ef463ae" containerName="extract-content" Oct 11 01:17:23 crc kubenswrapper[4743]: E1011 
01:17:23.619669 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff506255-3f2c-4edd-8c13-dea86ef463ae" containerName="registry-server" Oct 11 01:17:23 crc kubenswrapper[4743]: I1011 01:17:23.619675 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff506255-3f2c-4edd-8c13-dea86ef463ae" containerName="registry-server" Oct 11 01:17:23 crc kubenswrapper[4743]: E1011 01:17:23.619709 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff506255-3f2c-4edd-8c13-dea86ef463ae" containerName="extract-utilities" Oct 11 01:17:23 crc kubenswrapper[4743]: I1011 01:17:23.619717 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff506255-3f2c-4edd-8c13-dea86ef463ae" containerName="extract-utilities" Oct 11 01:17:23 crc kubenswrapper[4743]: E1011 01:17:23.619731 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef2598ca-aa73-4d3e-adf8-7f94e68f2838" containerName="heat-api" Oct 11 01:17:23 crc kubenswrapper[4743]: I1011 01:17:23.619738 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef2598ca-aa73-4d3e-adf8-7f94e68f2838" containerName="heat-api" Oct 11 01:17:23 crc kubenswrapper[4743]: I1011 01:17:23.619947 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef2598ca-aa73-4d3e-adf8-7f94e68f2838" containerName="heat-api" Oct 11 01:17:23 crc kubenswrapper[4743]: I1011 01:17:23.619975 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="0173521e-a9ee-43e3-9760-f3f12527c84b" containerName="heat-cfnapi" Oct 11 01:17:23 crc kubenswrapper[4743]: I1011 01:17:23.619990 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff506255-3f2c-4edd-8c13-dea86ef463ae" containerName="registry-server" Oct 11 01:17:23 crc kubenswrapper[4743]: I1011 01:17:23.620726 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-47nj5" Oct 11 01:17:23 crc kubenswrapper[4743]: I1011 01:17:23.630588 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-47nj5"] Oct 11 01:17:23 crc kubenswrapper[4743]: I1011 01:17:23.768643 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60648a98-998f-41d9-aceb-3ad66a1d2b04-scripts\") pod \"aodh-db-sync-47nj5\" (UID: \"60648a98-998f-41d9-aceb-3ad66a1d2b04\") " pod="openstack/aodh-db-sync-47nj5" Oct 11 01:17:23 crc kubenswrapper[4743]: I1011 01:17:23.768721 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60648a98-998f-41d9-aceb-3ad66a1d2b04-combined-ca-bundle\") pod \"aodh-db-sync-47nj5\" (UID: \"60648a98-998f-41d9-aceb-3ad66a1d2b04\") " pod="openstack/aodh-db-sync-47nj5" Oct 11 01:17:23 crc kubenswrapper[4743]: I1011 01:17:23.768762 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60648a98-998f-41d9-aceb-3ad66a1d2b04-config-data\") pod \"aodh-db-sync-47nj5\" (UID: \"60648a98-998f-41d9-aceb-3ad66a1d2b04\") " pod="openstack/aodh-db-sync-47nj5" Oct 11 01:17:23 crc kubenswrapper[4743]: I1011 01:17:23.769068 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64s5k\" (UniqueName: \"kubernetes.io/projected/60648a98-998f-41d9-aceb-3ad66a1d2b04-kube-api-access-64s5k\") pod \"aodh-db-sync-47nj5\" (UID: \"60648a98-998f-41d9-aceb-3ad66a1d2b04\") " pod="openstack/aodh-db-sync-47nj5" Oct 11 01:17:23 crc kubenswrapper[4743]: I1011 01:17:23.870792 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60648a98-998f-41d9-aceb-3ad66a1d2b04-scripts\") 
pod \"aodh-db-sync-47nj5\" (UID: \"60648a98-998f-41d9-aceb-3ad66a1d2b04\") " pod="openstack/aodh-db-sync-47nj5" Oct 11 01:17:23 crc kubenswrapper[4743]: I1011 01:17:23.870923 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60648a98-998f-41d9-aceb-3ad66a1d2b04-combined-ca-bundle\") pod \"aodh-db-sync-47nj5\" (UID: \"60648a98-998f-41d9-aceb-3ad66a1d2b04\") " pod="openstack/aodh-db-sync-47nj5" Oct 11 01:17:23 crc kubenswrapper[4743]: I1011 01:17:23.870967 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60648a98-998f-41d9-aceb-3ad66a1d2b04-config-data\") pod \"aodh-db-sync-47nj5\" (UID: \"60648a98-998f-41d9-aceb-3ad66a1d2b04\") " pod="openstack/aodh-db-sync-47nj5" Oct 11 01:17:23 crc kubenswrapper[4743]: I1011 01:17:23.871056 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64s5k\" (UniqueName: \"kubernetes.io/projected/60648a98-998f-41d9-aceb-3ad66a1d2b04-kube-api-access-64s5k\") pod \"aodh-db-sync-47nj5\" (UID: \"60648a98-998f-41d9-aceb-3ad66a1d2b04\") " pod="openstack/aodh-db-sync-47nj5" Oct 11 01:17:23 crc kubenswrapper[4743]: I1011 01:17:23.877438 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60648a98-998f-41d9-aceb-3ad66a1d2b04-combined-ca-bundle\") pod \"aodh-db-sync-47nj5\" (UID: \"60648a98-998f-41d9-aceb-3ad66a1d2b04\") " pod="openstack/aodh-db-sync-47nj5" Oct 11 01:17:23 crc kubenswrapper[4743]: I1011 01:17:23.877536 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60648a98-998f-41d9-aceb-3ad66a1d2b04-config-data\") pod \"aodh-db-sync-47nj5\" (UID: \"60648a98-998f-41d9-aceb-3ad66a1d2b04\") " pod="openstack/aodh-db-sync-47nj5" Oct 11 01:17:23 crc kubenswrapper[4743]: I1011 
01:17:23.878168 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60648a98-998f-41d9-aceb-3ad66a1d2b04-scripts\") pod \"aodh-db-sync-47nj5\" (UID: \"60648a98-998f-41d9-aceb-3ad66a1d2b04\") " pod="openstack/aodh-db-sync-47nj5" Oct 11 01:17:23 crc kubenswrapper[4743]: I1011 01:17:23.891551 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64s5k\" (UniqueName: \"kubernetes.io/projected/60648a98-998f-41d9-aceb-3ad66a1d2b04-kube-api-access-64s5k\") pod \"aodh-db-sync-47nj5\" (UID: \"60648a98-998f-41d9-aceb-3ad66a1d2b04\") " pod="openstack/aodh-db-sync-47nj5" Oct 11 01:17:23 crc kubenswrapper[4743]: I1011 01:17:23.941915 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-47nj5" Oct 11 01:17:24 crc kubenswrapper[4743]: I1011 01:17:24.103438 4743 scope.go:117] "RemoveContainer" containerID="4dd4c92bb1fe51db0d7814650823b2db5bc31ac1ef997677802e7324969fe0d1" Oct 11 01:17:24 crc kubenswrapper[4743]: E1011 01:17:24.103963 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:17:24 crc kubenswrapper[4743]: I1011 01:17:24.122662 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf81bc47-30a1-4231-a23a-446813610da6" path="/var/lib/kubelet/pods/cf81bc47-30a1-4231-a23a-446813610da6/volumes" Oct 11 01:17:24 crc kubenswrapper[4743]: I1011 01:17:24.469372 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-47nj5"] Oct 11 01:17:24 crc kubenswrapper[4743]: W1011 01:17:24.472152 4743 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60648a98_998f_41d9_aceb_3ad66a1d2b04.slice/crio-5341ff3ca1f47f185cbb598406aa7309fba68c65c27bd5db3bb53ba61d849e7b WatchSource:0}: Error finding container 5341ff3ca1f47f185cbb598406aa7309fba68c65c27bd5db3bb53ba61d849e7b: Status 404 returned error can't find the container with id 5341ff3ca1f47f185cbb598406aa7309fba68c65c27bd5db3bb53ba61d849e7b Oct 11 01:17:24 crc kubenswrapper[4743]: I1011 01:17:24.632264 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-47nj5" event={"ID":"60648a98-998f-41d9-aceb-3ad66a1d2b04","Type":"ContainerStarted","Data":"5341ff3ca1f47f185cbb598406aa7309fba68c65c27bd5db3bb53ba61d849e7b"} Oct 11 01:17:29 crc kubenswrapper[4743]: I1011 01:17:29.688203 4743 generic.go:334] "Generic (PLEG): container finished" podID="5e39acaa-0992-471f-a015-46714fde82cf" containerID="18f7a239f2e0707135ecd80c3ce36f7686fd8c2c29bfe008d2373815e615ee5c" exitCode=0 Oct 11 01:17:29 crc kubenswrapper[4743]: I1011 01:17:29.688395 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-fdd7c75fc-rtmvl" event={"ID":"5e39acaa-0992-471f-a015-46714fde82cf","Type":"ContainerDied","Data":"18f7a239f2e0707135ecd80c3ce36f7686fd8c2c29bfe008d2373815e615ee5c"} Oct 11 01:17:29 crc kubenswrapper[4743]: I1011 01:17:29.690760 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-47nj5" event={"ID":"60648a98-998f-41d9-aceb-3ad66a1d2b04","Type":"ContainerStarted","Data":"ec47936c8ea8e3016e46f32eef56df713af2a9c92346709a02bfa557442bb542"} Oct 11 01:17:29 crc kubenswrapper[4743]: I1011 01:17:29.746014 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-47nj5" podStartSLOduration=2.302581661 podStartE2EDuration="6.745988951s" podCreationTimestamp="2025-10-11 01:17:23 +0000 UTC" firstStartedPulling="2025-10-11 01:17:24.475102566 +0000 
UTC m=+1539.128082963" lastFinishedPulling="2025-10-11 01:17:28.918509856 +0000 UTC m=+1543.571490253" observedRunningTime="2025-10-11 01:17:29.716131136 +0000 UTC m=+1544.369111563" watchObservedRunningTime="2025-10-11 01:17:29.745988951 +0000 UTC m=+1544.398969358" Oct 11 01:17:30 crc kubenswrapper[4743]: I1011 01:17:30.023806 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-fdd7c75fc-rtmvl" Oct 11 01:17:30 crc kubenswrapper[4743]: I1011 01:17:30.097328 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e39acaa-0992-471f-a015-46714fde82cf-combined-ca-bundle\") pod \"5e39acaa-0992-471f-a015-46714fde82cf\" (UID: \"5e39acaa-0992-471f-a015-46714fde82cf\") " Oct 11 01:17:30 crc kubenswrapper[4743]: I1011 01:17:30.097487 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qd48r\" (UniqueName: \"kubernetes.io/projected/5e39acaa-0992-471f-a015-46714fde82cf-kube-api-access-qd48r\") pod \"5e39acaa-0992-471f-a015-46714fde82cf\" (UID: \"5e39acaa-0992-471f-a015-46714fde82cf\") " Oct 11 01:17:30 crc kubenswrapper[4743]: I1011 01:17:30.097563 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5e39acaa-0992-471f-a015-46714fde82cf-config-data-custom\") pod \"5e39acaa-0992-471f-a015-46714fde82cf\" (UID: \"5e39acaa-0992-471f-a015-46714fde82cf\") " Oct 11 01:17:30 crc kubenswrapper[4743]: I1011 01:17:30.097639 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e39acaa-0992-471f-a015-46714fde82cf-config-data\") pod \"5e39acaa-0992-471f-a015-46714fde82cf\" (UID: \"5e39acaa-0992-471f-a015-46714fde82cf\") " Oct 11 01:17:30 crc kubenswrapper[4743]: I1011 01:17:30.104881 4743 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e39acaa-0992-471f-a015-46714fde82cf-kube-api-access-qd48r" (OuterVolumeSpecName: "kube-api-access-qd48r") pod "5e39acaa-0992-471f-a015-46714fde82cf" (UID: "5e39acaa-0992-471f-a015-46714fde82cf"). InnerVolumeSpecName "kube-api-access-qd48r". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:17:30 crc kubenswrapper[4743]: I1011 01:17:30.108077 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e39acaa-0992-471f-a015-46714fde82cf-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "5e39acaa-0992-471f-a015-46714fde82cf" (UID: "5e39acaa-0992-471f-a015-46714fde82cf"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:17:30 crc kubenswrapper[4743]: I1011 01:17:30.135592 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e39acaa-0992-471f-a015-46714fde82cf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5e39acaa-0992-471f-a015-46714fde82cf" (UID: "5e39acaa-0992-471f-a015-46714fde82cf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:17:30 crc kubenswrapper[4743]: I1011 01:17:30.155377 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e39acaa-0992-471f-a015-46714fde82cf-config-data" (OuterVolumeSpecName: "config-data") pod "5e39acaa-0992-471f-a015-46714fde82cf" (UID: "5e39acaa-0992-471f-a015-46714fde82cf"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:17:30 crc kubenswrapper[4743]: I1011 01:17:30.202692 4743 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5e39acaa-0992-471f-a015-46714fde82cf-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 11 01:17:30 crc kubenswrapper[4743]: I1011 01:17:30.202732 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e39acaa-0992-471f-a015-46714fde82cf-config-data\") on node \"crc\" DevicePath \"\"" Oct 11 01:17:30 crc kubenswrapper[4743]: I1011 01:17:30.202747 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e39acaa-0992-471f-a015-46714fde82cf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 01:17:30 crc kubenswrapper[4743]: I1011 01:17:30.202762 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qd48r\" (UniqueName: \"kubernetes.io/projected/5e39acaa-0992-471f-a015-46714fde82cf-kube-api-access-qd48r\") on node \"crc\" DevicePath \"\"" Oct 11 01:17:30 crc kubenswrapper[4743]: I1011 01:17:30.708584 4743 generic.go:334] "Generic (PLEG): container finished" podID="1fc7b43c-1b31-4510-bb3e-a3e3017bd93e" containerID="0a4accac8eb229d5f59e86320d47a84e76855054259ad92d5660aecd9d182161" exitCode=0 Oct 11 01:17:30 crc kubenswrapper[4743]: I1011 01:17:30.708639 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2x9z6" event={"ID":"1fc7b43c-1b31-4510-bb3e-a3e3017bd93e","Type":"ContainerDied","Data":"0a4accac8eb229d5f59e86320d47a84e76855054259ad92d5660aecd9d182161"} Oct 11 01:17:30 crc kubenswrapper[4743]: I1011 01:17:30.711560 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-fdd7c75fc-rtmvl" Oct 11 01:17:30 crc kubenswrapper[4743]: I1011 01:17:30.711618 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-fdd7c75fc-rtmvl" event={"ID":"5e39acaa-0992-471f-a015-46714fde82cf","Type":"ContainerDied","Data":"8a42a422b4ec2501bc22ee70b22f3500375cb3084687f408f504310195e4e748"} Oct 11 01:17:30 crc kubenswrapper[4743]: I1011 01:17:30.711693 4743 scope.go:117] "RemoveContainer" containerID="18f7a239f2e0707135ecd80c3ce36f7686fd8c2c29bfe008d2373815e615ee5c" Oct 11 01:17:30 crc kubenswrapper[4743]: I1011 01:17:30.794121 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-fdd7c75fc-rtmvl"] Oct 11 01:17:30 crc kubenswrapper[4743]: I1011 01:17:30.807919 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-engine-fdd7c75fc-rtmvl"] Oct 11 01:17:31 crc kubenswrapper[4743]: I1011 01:17:31.729687 4743 generic.go:334] "Generic (PLEG): container finished" podID="60648a98-998f-41d9-aceb-3ad66a1d2b04" containerID="ec47936c8ea8e3016e46f32eef56df713af2a9c92346709a02bfa557442bb542" exitCode=0 Oct 11 01:17:31 crc kubenswrapper[4743]: I1011 01:17:31.729767 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-47nj5" event={"ID":"60648a98-998f-41d9-aceb-3ad66a1d2b04","Type":"ContainerDied","Data":"ec47936c8ea8e3016e46f32eef56df713af2a9c92346709a02bfa557442bb542"} Oct 11 01:17:32 crc kubenswrapper[4743]: I1011 01:17:32.119215 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e39acaa-0992-471f-a015-46714fde82cf" path="/var/lib/kubelet/pods/5e39acaa-0992-471f-a015-46714fde82cf/volumes" Oct 11 01:17:32 crc kubenswrapper[4743]: I1011 01:17:32.293167 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2x9z6" Oct 11 01:17:32 crc kubenswrapper[4743]: I1011 01:17:32.354177 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1fc7b43c-1b31-4510-bb3e-a3e3017bd93e-inventory\") pod \"1fc7b43c-1b31-4510-bb3e-a3e3017bd93e\" (UID: \"1fc7b43c-1b31-4510-bb3e-a3e3017bd93e\") " Oct 11 01:17:32 crc kubenswrapper[4743]: I1011 01:17:32.354686 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fc7b43c-1b31-4510-bb3e-a3e3017bd93e-repo-setup-combined-ca-bundle\") pod \"1fc7b43c-1b31-4510-bb3e-a3e3017bd93e\" (UID: \"1fc7b43c-1b31-4510-bb3e-a3e3017bd93e\") " Oct 11 01:17:32 crc kubenswrapper[4743]: I1011 01:17:32.354916 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1fc7b43c-1b31-4510-bb3e-a3e3017bd93e-ssh-key\") pod \"1fc7b43c-1b31-4510-bb3e-a3e3017bd93e\" (UID: \"1fc7b43c-1b31-4510-bb3e-a3e3017bd93e\") " Oct 11 01:17:32 crc kubenswrapper[4743]: I1011 01:17:32.355233 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfjcd\" (UniqueName: \"kubernetes.io/projected/1fc7b43c-1b31-4510-bb3e-a3e3017bd93e-kube-api-access-cfjcd\") pod \"1fc7b43c-1b31-4510-bb3e-a3e3017bd93e\" (UID: \"1fc7b43c-1b31-4510-bb3e-a3e3017bd93e\") " Oct 11 01:17:32 crc kubenswrapper[4743]: I1011 01:17:32.364385 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fc7b43c-1b31-4510-bb3e-a3e3017bd93e-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "1fc7b43c-1b31-4510-bb3e-a3e3017bd93e" (UID: "1fc7b43c-1b31-4510-bb3e-a3e3017bd93e"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:17:32 crc kubenswrapper[4743]: I1011 01:17:32.369713 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fc7b43c-1b31-4510-bb3e-a3e3017bd93e-kube-api-access-cfjcd" (OuterVolumeSpecName: "kube-api-access-cfjcd") pod "1fc7b43c-1b31-4510-bb3e-a3e3017bd93e" (UID: "1fc7b43c-1b31-4510-bb3e-a3e3017bd93e"). InnerVolumeSpecName "kube-api-access-cfjcd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:17:32 crc kubenswrapper[4743]: I1011 01:17:32.398951 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fc7b43c-1b31-4510-bb3e-a3e3017bd93e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1fc7b43c-1b31-4510-bb3e-a3e3017bd93e" (UID: "1fc7b43c-1b31-4510-bb3e-a3e3017bd93e"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:17:32 crc kubenswrapper[4743]: I1011 01:17:32.411008 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fc7b43c-1b31-4510-bb3e-a3e3017bd93e-inventory" (OuterVolumeSpecName: "inventory") pod "1fc7b43c-1b31-4510-bb3e-a3e3017bd93e" (UID: "1fc7b43c-1b31-4510-bb3e-a3e3017bd93e"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:17:32 crc kubenswrapper[4743]: I1011 01:17:32.458062 4743 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fc7b43c-1b31-4510-bb3e-a3e3017bd93e-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 01:17:32 crc kubenswrapper[4743]: I1011 01:17:32.458099 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1fc7b43c-1b31-4510-bb3e-a3e3017bd93e-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 11 01:17:32 crc kubenswrapper[4743]: I1011 01:17:32.458114 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfjcd\" (UniqueName: \"kubernetes.io/projected/1fc7b43c-1b31-4510-bb3e-a3e3017bd93e-kube-api-access-cfjcd\") on node \"crc\" DevicePath \"\"" Oct 11 01:17:32 crc kubenswrapper[4743]: I1011 01:17:32.458127 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1fc7b43c-1b31-4510-bb3e-a3e3017bd93e-inventory\") on node \"crc\" DevicePath \"\"" Oct 11 01:17:32 crc kubenswrapper[4743]: I1011 01:17:32.747823 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2x9z6" event={"ID":"1fc7b43c-1b31-4510-bb3e-a3e3017bd93e","Type":"ContainerDied","Data":"ac8f756f2e2e2d89b60e6437564ba9001174f0b05676e0c848a65e76f2c51e21"} Oct 11 01:17:32 crc kubenswrapper[4743]: I1011 01:17:32.750701 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac8f756f2e2e2d89b60e6437564ba9001174f0b05676e0c848a65e76f2c51e21" Oct 11 01:17:32 crc kubenswrapper[4743]: I1011 01:17:32.747924 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2x9z6" Oct 11 01:17:32 crc kubenswrapper[4743]: I1011 01:17:32.859951 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k8lss"] Oct 11 01:17:32 crc kubenswrapper[4743]: E1011 01:17:32.860582 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fc7b43c-1b31-4510-bb3e-a3e3017bd93e" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 11 01:17:32 crc kubenswrapper[4743]: I1011 01:17:32.860615 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fc7b43c-1b31-4510-bb3e-a3e3017bd93e" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 11 01:17:32 crc kubenswrapper[4743]: E1011 01:17:32.860662 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e39acaa-0992-471f-a015-46714fde82cf" containerName="heat-engine" Oct 11 01:17:32 crc kubenswrapper[4743]: I1011 01:17:32.860674 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e39acaa-0992-471f-a015-46714fde82cf" containerName="heat-engine" Oct 11 01:17:32 crc kubenswrapper[4743]: I1011 01:17:32.861059 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fc7b43c-1b31-4510-bb3e-a3e3017bd93e" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 11 01:17:32 crc kubenswrapper[4743]: I1011 01:17:32.861102 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e39acaa-0992-471f-a015-46714fde82cf" containerName="heat-engine" Oct 11 01:17:32 crc kubenswrapper[4743]: I1011 01:17:32.862401 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k8lss" Oct 11 01:17:32 crc kubenswrapper[4743]: I1011 01:17:32.867799 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8vmmn" Oct 11 01:17:32 crc kubenswrapper[4743]: I1011 01:17:32.868011 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 11 01:17:32 crc kubenswrapper[4743]: I1011 01:17:32.867813 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 11 01:17:32 crc kubenswrapper[4743]: I1011 01:17:32.868720 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 11 01:17:32 crc kubenswrapper[4743]: I1011 01:17:32.899818 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k8lss"] Oct 11 01:17:32 crc kubenswrapper[4743]: I1011 01:17:32.973563 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4lgm\" (UniqueName: \"kubernetes.io/projected/15ec1ee2-7130-47d3-8156-4352228590a6-kube-api-access-v4lgm\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k8lss\" (UID: \"15ec1ee2-7130-47d3-8156-4352228590a6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k8lss" Oct 11 01:17:32 crc kubenswrapper[4743]: I1011 01:17:32.973616 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15ec1ee2-7130-47d3-8156-4352228590a6-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k8lss\" (UID: \"15ec1ee2-7130-47d3-8156-4352228590a6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k8lss" Oct 11 01:17:32 crc kubenswrapper[4743]: I1011 
01:17:32.973689 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/15ec1ee2-7130-47d3-8156-4352228590a6-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k8lss\" (UID: \"15ec1ee2-7130-47d3-8156-4352228590a6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k8lss" Oct 11 01:17:32 crc kubenswrapper[4743]: I1011 01:17:32.973755 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/15ec1ee2-7130-47d3-8156-4352228590a6-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k8lss\" (UID: \"15ec1ee2-7130-47d3-8156-4352228590a6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k8lss" Oct 11 01:17:33 crc kubenswrapper[4743]: I1011 01:17:33.076056 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/15ec1ee2-7130-47d3-8156-4352228590a6-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k8lss\" (UID: \"15ec1ee2-7130-47d3-8156-4352228590a6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k8lss" Oct 11 01:17:33 crc kubenswrapper[4743]: I1011 01:17:33.076491 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4lgm\" (UniqueName: \"kubernetes.io/projected/15ec1ee2-7130-47d3-8156-4352228590a6-kube-api-access-v4lgm\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k8lss\" (UID: \"15ec1ee2-7130-47d3-8156-4352228590a6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k8lss" Oct 11 01:17:33 crc kubenswrapper[4743]: I1011 01:17:33.076535 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15ec1ee2-7130-47d3-8156-4352228590a6-bootstrap-combined-ca-bundle\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-k8lss\" (UID: \"15ec1ee2-7130-47d3-8156-4352228590a6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k8lss" Oct 11 01:17:33 crc kubenswrapper[4743]: I1011 01:17:33.076629 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/15ec1ee2-7130-47d3-8156-4352228590a6-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k8lss\" (UID: \"15ec1ee2-7130-47d3-8156-4352228590a6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k8lss" Oct 11 01:17:33 crc kubenswrapper[4743]: I1011 01:17:33.084538 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/15ec1ee2-7130-47d3-8156-4352228590a6-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k8lss\" (UID: \"15ec1ee2-7130-47d3-8156-4352228590a6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k8lss" Oct 11 01:17:33 crc kubenswrapper[4743]: I1011 01:17:33.090374 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15ec1ee2-7130-47d3-8156-4352228590a6-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k8lss\" (UID: \"15ec1ee2-7130-47d3-8156-4352228590a6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k8lss" Oct 11 01:17:33 crc kubenswrapper[4743]: I1011 01:17:33.093564 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/15ec1ee2-7130-47d3-8156-4352228590a6-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k8lss\" (UID: \"15ec1ee2-7130-47d3-8156-4352228590a6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k8lss" Oct 11 01:17:33 crc kubenswrapper[4743]: I1011 01:17:33.096327 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-v4lgm\" (UniqueName: \"kubernetes.io/projected/15ec1ee2-7130-47d3-8156-4352228590a6-kube-api-access-v4lgm\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k8lss\" (UID: \"15ec1ee2-7130-47d3-8156-4352228590a6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k8lss" Oct 11 01:17:33 crc kubenswrapper[4743]: I1011 01:17:33.158328 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-47nj5" Oct 11 01:17:33 crc kubenswrapper[4743]: I1011 01:17:33.214136 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k8lss" Oct 11 01:17:33 crc kubenswrapper[4743]: I1011 01:17:33.284801 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-64s5k\" (UniqueName: \"kubernetes.io/projected/60648a98-998f-41d9-aceb-3ad66a1d2b04-kube-api-access-64s5k\") pod \"60648a98-998f-41d9-aceb-3ad66a1d2b04\" (UID: \"60648a98-998f-41d9-aceb-3ad66a1d2b04\") " Oct 11 01:17:33 crc kubenswrapper[4743]: I1011 01:17:33.284898 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60648a98-998f-41d9-aceb-3ad66a1d2b04-combined-ca-bundle\") pod \"60648a98-998f-41d9-aceb-3ad66a1d2b04\" (UID: \"60648a98-998f-41d9-aceb-3ad66a1d2b04\") " Oct 11 01:17:33 crc kubenswrapper[4743]: I1011 01:17:33.284978 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60648a98-998f-41d9-aceb-3ad66a1d2b04-scripts\") pod \"60648a98-998f-41d9-aceb-3ad66a1d2b04\" (UID: \"60648a98-998f-41d9-aceb-3ad66a1d2b04\") " Oct 11 01:17:33 crc kubenswrapper[4743]: I1011 01:17:33.285143 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/60648a98-998f-41d9-aceb-3ad66a1d2b04-config-data\") pod \"60648a98-998f-41d9-aceb-3ad66a1d2b04\" (UID: \"60648a98-998f-41d9-aceb-3ad66a1d2b04\") " Oct 11 01:17:33 crc kubenswrapper[4743]: I1011 01:17:33.289764 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60648a98-998f-41d9-aceb-3ad66a1d2b04-scripts" (OuterVolumeSpecName: "scripts") pod "60648a98-998f-41d9-aceb-3ad66a1d2b04" (UID: "60648a98-998f-41d9-aceb-3ad66a1d2b04"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:17:33 crc kubenswrapper[4743]: I1011 01:17:33.289968 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60648a98-998f-41d9-aceb-3ad66a1d2b04-kube-api-access-64s5k" (OuterVolumeSpecName: "kube-api-access-64s5k") pod "60648a98-998f-41d9-aceb-3ad66a1d2b04" (UID: "60648a98-998f-41d9-aceb-3ad66a1d2b04"). InnerVolumeSpecName "kube-api-access-64s5k". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:17:33 crc kubenswrapper[4743]: I1011 01:17:33.320148 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60648a98-998f-41d9-aceb-3ad66a1d2b04-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "60648a98-998f-41d9-aceb-3ad66a1d2b04" (UID: "60648a98-998f-41d9-aceb-3ad66a1d2b04"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:17:33 crc kubenswrapper[4743]: I1011 01:17:33.330803 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60648a98-998f-41d9-aceb-3ad66a1d2b04-config-data" (OuterVolumeSpecName: "config-data") pod "60648a98-998f-41d9-aceb-3ad66a1d2b04" (UID: "60648a98-998f-41d9-aceb-3ad66a1d2b04"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:17:33 crc kubenswrapper[4743]: I1011 01:17:33.388288 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60648a98-998f-41d9-aceb-3ad66a1d2b04-config-data\") on node \"crc\" DevicePath \"\"" Oct 11 01:17:33 crc kubenswrapper[4743]: I1011 01:17:33.388334 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-64s5k\" (UniqueName: \"kubernetes.io/projected/60648a98-998f-41d9-aceb-3ad66a1d2b04-kube-api-access-64s5k\") on node \"crc\" DevicePath \"\"" Oct 11 01:17:33 crc kubenswrapper[4743]: I1011 01:17:33.388348 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60648a98-998f-41d9-aceb-3ad66a1d2b04-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 01:17:33 crc kubenswrapper[4743]: I1011 01:17:33.388358 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60648a98-998f-41d9-aceb-3ad66a1d2b04-scripts\") on node \"crc\" DevicePath \"\"" Oct 11 01:17:33 crc kubenswrapper[4743]: I1011 01:17:33.748148 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k8lss"] Oct 11 01:17:33 crc kubenswrapper[4743]: W1011 01:17:33.752075 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod15ec1ee2_7130_47d3_8156_4352228590a6.slice/crio-b079c1de47a29cdd89738cffe15cd79a950e3434f0ea59c53647b74babd254e6 WatchSource:0}: Error finding container b079c1de47a29cdd89738cffe15cd79a950e3434f0ea59c53647b74babd254e6: Status 404 returned error can't find the container with id b079c1de47a29cdd89738cffe15cd79a950e3434f0ea59c53647b74babd254e6 Oct 11 01:17:33 crc kubenswrapper[4743]: I1011 01:17:33.763216 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-47nj5" 
event={"ID":"60648a98-998f-41d9-aceb-3ad66a1d2b04","Type":"ContainerDied","Data":"5341ff3ca1f47f185cbb598406aa7309fba68c65c27bd5db3bb53ba61d849e7b"} Oct 11 01:17:33 crc kubenswrapper[4743]: I1011 01:17:33.763252 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5341ff3ca1f47f185cbb598406aa7309fba68c65c27bd5db3bb53ba61d849e7b" Oct 11 01:17:33 crc kubenswrapper[4743]: I1011 01:17:33.763324 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-47nj5" Oct 11 01:17:33 crc kubenswrapper[4743]: I1011 01:17:33.893834 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Oct 11 01:17:33 crc kubenswrapper[4743]: I1011 01:17:33.894122 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="254da181-6fe9-4682-bd43-816f813ba12e" containerName="aodh-api" containerID="cri-o://d128ad37459eb5634a29cf39a7494af9679bd6902c3f6ee6b2216d3be40b3078" gracePeriod=30 Oct 11 01:17:33 crc kubenswrapper[4743]: I1011 01:17:33.894475 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="254da181-6fe9-4682-bd43-816f813ba12e" containerName="aodh-notifier" containerID="cri-o://da047fb9ecb7be96c74cda03df8686e154b6765b8ccdc04985e5f900ae0f7bea" gracePeriod=30 Oct 11 01:17:33 crc kubenswrapper[4743]: I1011 01:17:33.894499 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="254da181-6fe9-4682-bd43-816f813ba12e" containerName="aodh-evaluator" containerID="cri-o://033e2e558ddb6cb699a9662bc357732dd4a800294d81a4c8770f67688fb25170" gracePeriod=30 Oct 11 01:17:33 crc kubenswrapper[4743]: I1011 01:17:33.894545 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="254da181-6fe9-4682-bd43-816f813ba12e" containerName="aodh-listener" 
containerID="cri-o://e2821ec74f463bf28d96387c01c02ec061d3a17ef4b5482121c5f82f1a340d29" gracePeriod=30 Oct 11 01:17:34 crc kubenswrapper[4743]: I1011 01:17:34.786293 4743 generic.go:334] "Generic (PLEG): container finished" podID="254da181-6fe9-4682-bd43-816f813ba12e" containerID="033e2e558ddb6cb699a9662bc357732dd4a800294d81a4c8770f67688fb25170" exitCode=0 Oct 11 01:17:34 crc kubenswrapper[4743]: I1011 01:17:34.786880 4743 generic.go:334] "Generic (PLEG): container finished" podID="254da181-6fe9-4682-bd43-816f813ba12e" containerID="d128ad37459eb5634a29cf39a7494af9679bd6902c3f6ee6b2216d3be40b3078" exitCode=0 Oct 11 01:17:34 crc kubenswrapper[4743]: I1011 01:17:34.786325 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"254da181-6fe9-4682-bd43-816f813ba12e","Type":"ContainerDied","Data":"033e2e558ddb6cb699a9662bc357732dd4a800294d81a4c8770f67688fb25170"} Oct 11 01:17:34 crc kubenswrapper[4743]: I1011 01:17:34.786937 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"254da181-6fe9-4682-bd43-816f813ba12e","Type":"ContainerDied","Data":"d128ad37459eb5634a29cf39a7494af9679bd6902c3f6ee6b2216d3be40b3078"} Oct 11 01:17:34 crc kubenswrapper[4743]: I1011 01:17:34.789914 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k8lss" event={"ID":"15ec1ee2-7130-47d3-8156-4352228590a6","Type":"ContainerStarted","Data":"1f55586e20dac03f58a2e80d1f7ceea8f344878d3d78dc137749715f51568cce"} Oct 11 01:17:34 crc kubenswrapper[4743]: I1011 01:17:34.789944 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k8lss" event={"ID":"15ec1ee2-7130-47d3-8156-4352228590a6","Type":"ContainerStarted","Data":"b079c1de47a29cdd89738cffe15cd79a950e3434f0ea59c53647b74babd254e6"} Oct 11 01:17:34 crc kubenswrapper[4743]: I1011 01:17:34.820049 4743 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k8lss" podStartSLOduration=2.186568534 podStartE2EDuration="2.8200219s" podCreationTimestamp="2025-10-11 01:17:32 +0000 UTC" firstStartedPulling="2025-10-11 01:17:33.759775378 +0000 UTC m=+1548.412755775" lastFinishedPulling="2025-10-11 01:17:34.393228744 +0000 UTC m=+1549.046209141" observedRunningTime="2025-10-11 01:17:34.80498616 +0000 UTC m=+1549.457966577" watchObservedRunningTime="2025-10-11 01:17:34.8200219 +0000 UTC m=+1549.473002327" Oct 11 01:17:35 crc kubenswrapper[4743]: I1011 01:17:35.092001 4743 scope.go:117] "RemoveContainer" containerID="4dd4c92bb1fe51db0d7814650823b2db5bc31ac1ef997677802e7324969fe0d1" Oct 11 01:17:35 crc kubenswrapper[4743]: E1011 01:17:35.092262 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:17:35 crc kubenswrapper[4743]: I1011 01:17:35.809221 4743 generic.go:334] "Generic (PLEG): container finished" podID="254da181-6fe9-4682-bd43-816f813ba12e" containerID="da047fb9ecb7be96c74cda03df8686e154b6765b8ccdc04985e5f900ae0f7bea" exitCode=0 Oct 11 01:17:35 crc kubenswrapper[4743]: I1011 01:17:35.809294 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"254da181-6fe9-4682-bd43-816f813ba12e","Type":"ContainerDied","Data":"da047fb9ecb7be96c74cda03df8686e154b6765b8ccdc04985e5f900ae0f7bea"} Oct 11 01:17:36 crc kubenswrapper[4743]: I1011 01:17:36.827841 4743 generic.go:334] "Generic (PLEG): container finished" podID="254da181-6fe9-4682-bd43-816f813ba12e" containerID="e2821ec74f463bf28d96387c01c02ec061d3a17ef4b5482121c5f82f1a340d29" exitCode=0 Oct 11 
01:17:36 crc kubenswrapper[4743]: I1011 01:17:36.827919 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"254da181-6fe9-4682-bd43-816f813ba12e","Type":"ContainerDied","Data":"e2821ec74f463bf28d96387c01c02ec061d3a17ef4b5482121c5f82f1a340d29"} Oct 11 01:17:37 crc kubenswrapper[4743]: I1011 01:17:37.152496 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Oct 11 01:17:37 crc kubenswrapper[4743]: I1011 01:17:37.266730 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mh2h\" (UniqueName: \"kubernetes.io/projected/254da181-6fe9-4682-bd43-816f813ba12e-kube-api-access-8mh2h\") pod \"254da181-6fe9-4682-bd43-816f813ba12e\" (UID: \"254da181-6fe9-4682-bd43-816f813ba12e\") " Oct 11 01:17:37 crc kubenswrapper[4743]: I1011 01:17:37.266790 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/254da181-6fe9-4682-bd43-816f813ba12e-config-data\") pod \"254da181-6fe9-4682-bd43-816f813ba12e\" (UID: \"254da181-6fe9-4682-bd43-816f813ba12e\") " Oct 11 01:17:37 crc kubenswrapper[4743]: I1011 01:17:37.266840 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/254da181-6fe9-4682-bd43-816f813ba12e-scripts\") pod \"254da181-6fe9-4682-bd43-816f813ba12e\" (UID: \"254da181-6fe9-4682-bd43-816f813ba12e\") " Oct 11 01:17:37 crc kubenswrapper[4743]: I1011 01:17:37.266875 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/254da181-6fe9-4682-bd43-816f813ba12e-public-tls-certs\") pod \"254da181-6fe9-4682-bd43-816f813ba12e\" (UID: \"254da181-6fe9-4682-bd43-816f813ba12e\") " Oct 11 01:17:37 crc kubenswrapper[4743]: I1011 01:17:37.266969 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/254da181-6fe9-4682-bd43-816f813ba12e-internal-tls-certs\") pod \"254da181-6fe9-4682-bd43-816f813ba12e\" (UID: \"254da181-6fe9-4682-bd43-816f813ba12e\") " Oct 11 01:17:37 crc kubenswrapper[4743]: I1011 01:17:37.267020 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/254da181-6fe9-4682-bd43-816f813ba12e-combined-ca-bundle\") pod \"254da181-6fe9-4682-bd43-816f813ba12e\" (UID: \"254da181-6fe9-4682-bd43-816f813ba12e\") " Oct 11 01:17:37 crc kubenswrapper[4743]: I1011 01:17:37.274193 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/254da181-6fe9-4682-bd43-816f813ba12e-scripts" (OuterVolumeSpecName: "scripts") pod "254da181-6fe9-4682-bd43-816f813ba12e" (UID: "254da181-6fe9-4682-bd43-816f813ba12e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:17:37 crc kubenswrapper[4743]: I1011 01:17:37.275995 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/254da181-6fe9-4682-bd43-816f813ba12e-kube-api-access-8mh2h" (OuterVolumeSpecName: "kube-api-access-8mh2h") pod "254da181-6fe9-4682-bd43-816f813ba12e" (UID: "254da181-6fe9-4682-bd43-816f813ba12e"). InnerVolumeSpecName "kube-api-access-8mh2h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:17:37 crc kubenswrapper[4743]: I1011 01:17:37.341049 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/254da181-6fe9-4682-bd43-816f813ba12e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "254da181-6fe9-4682-bd43-816f813ba12e" (UID: "254da181-6fe9-4682-bd43-816f813ba12e"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:17:37 crc kubenswrapper[4743]: I1011 01:17:37.341575 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/254da181-6fe9-4682-bd43-816f813ba12e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "254da181-6fe9-4682-bd43-816f813ba12e" (UID: "254da181-6fe9-4682-bd43-816f813ba12e"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:17:37 crc kubenswrapper[4743]: I1011 01:17:37.375726 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/254da181-6fe9-4682-bd43-816f813ba12e-scripts\") on node \"crc\" DevicePath \"\"" Oct 11 01:17:37 crc kubenswrapper[4743]: I1011 01:17:37.375764 4743 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/254da181-6fe9-4682-bd43-816f813ba12e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 11 01:17:37 crc kubenswrapper[4743]: I1011 01:17:37.375776 4743 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/254da181-6fe9-4682-bd43-816f813ba12e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 11 01:17:37 crc kubenswrapper[4743]: I1011 01:17:37.375786 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mh2h\" (UniqueName: \"kubernetes.io/projected/254da181-6fe9-4682-bd43-816f813ba12e-kube-api-access-8mh2h\") on node \"crc\" DevicePath \"\"" Oct 11 01:17:37 crc kubenswrapper[4743]: I1011 01:17:37.398956 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/254da181-6fe9-4682-bd43-816f813ba12e-config-data" (OuterVolumeSpecName: "config-data") pod "254da181-6fe9-4682-bd43-816f813ba12e" (UID: "254da181-6fe9-4682-bd43-816f813ba12e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:17:37 crc kubenswrapper[4743]: I1011 01:17:37.430901 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/254da181-6fe9-4682-bd43-816f813ba12e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "254da181-6fe9-4682-bd43-816f813ba12e" (UID: "254da181-6fe9-4682-bd43-816f813ba12e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:17:37 crc kubenswrapper[4743]: I1011 01:17:37.477348 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/254da181-6fe9-4682-bd43-816f813ba12e-config-data\") on node \"crc\" DevicePath \"\"" Oct 11 01:17:37 crc kubenswrapper[4743]: I1011 01:17:37.477576 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/254da181-6fe9-4682-bd43-816f813ba12e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 01:17:37 crc kubenswrapper[4743]: I1011 01:17:37.843723 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"254da181-6fe9-4682-bd43-816f813ba12e","Type":"ContainerDied","Data":"7099b2a1fa41c98fb35217617f117fd1dd2403a154d2738f63140eddafa32b82"} Oct 11 01:17:37 crc kubenswrapper[4743]: I1011 01:17:37.844104 4743 scope.go:117] "RemoveContainer" containerID="e2821ec74f463bf28d96387c01c02ec061d3a17ef4b5482121c5f82f1a340d29" Oct 11 01:17:37 crc kubenswrapper[4743]: I1011 01:17:37.843930 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Oct 11 01:17:37 crc kubenswrapper[4743]: I1011 01:17:37.882239 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Oct 11 01:17:37 crc kubenswrapper[4743]: I1011 01:17:37.886012 4743 scope.go:117] "RemoveContainer" containerID="da047fb9ecb7be96c74cda03df8686e154b6765b8ccdc04985e5f900ae0f7bea" Oct 11 01:17:37 crc kubenswrapper[4743]: I1011 01:17:37.894175 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"] Oct 11 01:17:37 crc kubenswrapper[4743]: I1011 01:17:37.913294 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Oct 11 01:17:37 crc kubenswrapper[4743]: E1011 01:17:37.913695 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="254da181-6fe9-4682-bd43-816f813ba12e" containerName="aodh-evaluator" Oct 11 01:17:37 crc kubenswrapper[4743]: I1011 01:17:37.913710 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="254da181-6fe9-4682-bd43-816f813ba12e" containerName="aodh-evaluator" Oct 11 01:17:37 crc kubenswrapper[4743]: E1011 01:17:37.913719 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60648a98-998f-41d9-aceb-3ad66a1d2b04" containerName="aodh-db-sync" Oct 11 01:17:37 crc kubenswrapper[4743]: I1011 01:17:37.913725 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="60648a98-998f-41d9-aceb-3ad66a1d2b04" containerName="aodh-db-sync" Oct 11 01:17:37 crc kubenswrapper[4743]: E1011 01:17:37.913744 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="254da181-6fe9-4682-bd43-816f813ba12e" containerName="aodh-listener" Oct 11 01:17:37 crc kubenswrapper[4743]: I1011 01:17:37.913752 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="254da181-6fe9-4682-bd43-816f813ba12e" containerName="aodh-listener" Oct 11 01:17:37 crc kubenswrapper[4743]: E1011 01:17:37.913771 4743 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="254da181-6fe9-4682-bd43-816f813ba12e" containerName="aodh-notifier" Oct 11 01:17:37 crc kubenswrapper[4743]: I1011 01:17:37.913777 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="254da181-6fe9-4682-bd43-816f813ba12e" containerName="aodh-notifier" Oct 11 01:17:37 crc kubenswrapper[4743]: E1011 01:17:37.913790 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="254da181-6fe9-4682-bd43-816f813ba12e" containerName="aodh-api" Oct 11 01:17:37 crc kubenswrapper[4743]: I1011 01:17:37.913795 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="254da181-6fe9-4682-bd43-816f813ba12e" containerName="aodh-api" Oct 11 01:17:37 crc kubenswrapper[4743]: I1011 01:17:37.913984 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="254da181-6fe9-4682-bd43-816f813ba12e" containerName="aodh-notifier" Oct 11 01:17:37 crc kubenswrapper[4743]: I1011 01:17:37.914003 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="254da181-6fe9-4682-bd43-816f813ba12e" containerName="aodh-api" Oct 11 01:17:37 crc kubenswrapper[4743]: I1011 01:17:37.914016 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="254da181-6fe9-4682-bd43-816f813ba12e" containerName="aodh-evaluator" Oct 11 01:17:37 crc kubenswrapper[4743]: I1011 01:17:37.914032 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="60648a98-998f-41d9-aceb-3ad66a1d2b04" containerName="aodh-db-sync" Oct 11 01:17:37 crc kubenswrapper[4743]: I1011 01:17:37.914044 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="254da181-6fe9-4682-bd43-816f813ba12e" containerName="aodh-listener" Oct 11 01:17:37 crc kubenswrapper[4743]: I1011 01:17:37.916627 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Oct 11 01:17:37 crc kubenswrapper[4743]: I1011 01:17:37.919007 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Oct 11 01:17:37 crc kubenswrapper[4743]: I1011 01:17:37.919851 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-dnc99" Oct 11 01:17:37 crc kubenswrapper[4743]: I1011 01:17:37.920198 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-public-svc" Oct 11 01:17:37 crc kubenswrapper[4743]: I1011 01:17:37.920216 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Oct 11 01:17:37 crc kubenswrapper[4743]: I1011 01:17:37.920053 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-internal-svc" Oct 11 01:17:37 crc kubenswrapper[4743]: I1011 01:17:37.932056 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Oct 11 01:17:37 crc kubenswrapper[4743]: I1011 01:17:37.978530 4743 scope.go:117] "RemoveContainer" containerID="033e2e558ddb6cb699a9662bc357732dd4a800294d81a4c8770f67688fb25170" Oct 11 01:17:38 crc kubenswrapper[4743]: I1011 01:17:38.000752 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zd9v6\" (UniqueName: \"kubernetes.io/projected/b61fc8c0-014e-481a-b189-e554dced0696-kube-api-access-zd9v6\") pod \"aodh-0\" (UID: \"b61fc8c0-014e-481a-b189-e554dced0696\") " pod="openstack/aodh-0" Oct 11 01:17:38 crc kubenswrapper[4743]: I1011 01:17:38.004491 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b61fc8c0-014e-481a-b189-e554dced0696-scripts\") pod \"aodh-0\" (UID: \"b61fc8c0-014e-481a-b189-e554dced0696\") " pod="openstack/aodh-0" Oct 11 01:17:38 crc kubenswrapper[4743]: I1011 01:17:38.004674 4743 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b61fc8c0-014e-481a-b189-e554dced0696-public-tls-certs\") pod \"aodh-0\" (UID: \"b61fc8c0-014e-481a-b189-e554dced0696\") " pod="openstack/aodh-0" Oct 11 01:17:38 crc kubenswrapper[4743]: I1011 01:17:38.004826 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b61fc8c0-014e-481a-b189-e554dced0696-internal-tls-certs\") pod \"aodh-0\" (UID: \"b61fc8c0-014e-481a-b189-e554dced0696\") " pod="openstack/aodh-0" Oct 11 01:17:38 crc kubenswrapper[4743]: I1011 01:17:38.004948 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b61fc8c0-014e-481a-b189-e554dced0696-config-data\") pod \"aodh-0\" (UID: \"b61fc8c0-014e-481a-b189-e554dced0696\") " pod="openstack/aodh-0" Oct 11 01:17:38 crc kubenswrapper[4743]: I1011 01:17:38.005040 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b61fc8c0-014e-481a-b189-e554dced0696-combined-ca-bundle\") pod \"aodh-0\" (UID: \"b61fc8c0-014e-481a-b189-e554dced0696\") " pod="openstack/aodh-0" Oct 11 01:17:38 crc kubenswrapper[4743]: I1011 01:17:38.024676 4743 scope.go:117] "RemoveContainer" containerID="d128ad37459eb5634a29cf39a7494af9679bd6902c3f6ee6b2216d3be40b3078" Oct 11 01:17:38 crc kubenswrapper[4743]: I1011 01:17:38.104106 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="254da181-6fe9-4682-bd43-816f813ba12e" path="/var/lib/kubelet/pods/254da181-6fe9-4682-bd43-816f813ba12e/volumes" Oct 11 01:17:38 crc kubenswrapper[4743]: I1011 01:17:38.107456 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b61fc8c0-014e-481a-b189-e554dced0696-config-data\") pod \"aodh-0\" (UID: \"b61fc8c0-014e-481a-b189-e554dced0696\") " pod="openstack/aodh-0" Oct 11 01:17:38 crc kubenswrapper[4743]: I1011 01:17:38.107513 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b61fc8c0-014e-481a-b189-e554dced0696-combined-ca-bundle\") pod \"aodh-0\" (UID: \"b61fc8c0-014e-481a-b189-e554dced0696\") " pod="openstack/aodh-0" Oct 11 01:17:38 crc kubenswrapper[4743]: I1011 01:17:38.108069 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zd9v6\" (UniqueName: \"kubernetes.io/projected/b61fc8c0-014e-481a-b189-e554dced0696-kube-api-access-zd9v6\") pod \"aodh-0\" (UID: \"b61fc8c0-014e-481a-b189-e554dced0696\") " pod="openstack/aodh-0" Oct 11 01:17:38 crc kubenswrapper[4743]: I1011 01:17:38.108197 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b61fc8c0-014e-481a-b189-e554dced0696-scripts\") pod \"aodh-0\" (UID: \"b61fc8c0-014e-481a-b189-e554dced0696\") " pod="openstack/aodh-0" Oct 11 01:17:38 crc kubenswrapper[4743]: I1011 01:17:38.108308 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b61fc8c0-014e-481a-b189-e554dced0696-public-tls-certs\") pod \"aodh-0\" (UID: \"b61fc8c0-014e-481a-b189-e554dced0696\") " pod="openstack/aodh-0" Oct 11 01:17:38 crc kubenswrapper[4743]: I1011 01:17:38.108389 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b61fc8c0-014e-481a-b189-e554dced0696-internal-tls-certs\") pod \"aodh-0\" (UID: \"b61fc8c0-014e-481a-b189-e554dced0696\") " pod="openstack/aodh-0" Oct 11 01:17:38 crc kubenswrapper[4743]: I1011 01:17:38.111475 4743 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b61fc8c0-014e-481a-b189-e554dced0696-combined-ca-bundle\") pod \"aodh-0\" (UID: \"b61fc8c0-014e-481a-b189-e554dced0696\") " pod="openstack/aodh-0" Oct 11 01:17:38 crc kubenswrapper[4743]: I1011 01:17:38.111979 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b61fc8c0-014e-481a-b189-e554dced0696-public-tls-certs\") pod \"aodh-0\" (UID: \"b61fc8c0-014e-481a-b189-e554dced0696\") " pod="openstack/aodh-0" Oct 11 01:17:38 crc kubenswrapper[4743]: I1011 01:17:38.112887 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b61fc8c0-014e-481a-b189-e554dced0696-internal-tls-certs\") pod \"aodh-0\" (UID: \"b61fc8c0-014e-481a-b189-e554dced0696\") " pod="openstack/aodh-0" Oct 11 01:17:38 crc kubenswrapper[4743]: I1011 01:17:38.113083 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b61fc8c0-014e-481a-b189-e554dced0696-config-data\") pod \"aodh-0\" (UID: \"b61fc8c0-014e-481a-b189-e554dced0696\") " pod="openstack/aodh-0" Oct 11 01:17:38 crc kubenswrapper[4743]: I1011 01:17:38.115111 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b61fc8c0-014e-481a-b189-e554dced0696-scripts\") pod \"aodh-0\" (UID: \"b61fc8c0-014e-481a-b189-e554dced0696\") " pod="openstack/aodh-0" Oct 11 01:17:38 crc kubenswrapper[4743]: I1011 01:17:38.125386 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zd9v6\" (UniqueName: \"kubernetes.io/projected/b61fc8c0-014e-481a-b189-e554dced0696-kube-api-access-zd9v6\") pod \"aodh-0\" (UID: \"b61fc8c0-014e-481a-b189-e554dced0696\") " pod="openstack/aodh-0" Oct 11 01:17:38 crc kubenswrapper[4743]: I1011 01:17:38.310726 4743 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Oct 11 01:17:38 crc kubenswrapper[4743]: I1011 01:17:38.802492 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Oct 11 01:17:38 crc kubenswrapper[4743]: I1011 01:17:38.864395 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"b61fc8c0-014e-481a-b189-e554dced0696","Type":"ContainerStarted","Data":"152cf9fc82d119650c60420f43e3c5a4dcbf88ac830b833c0048fde0de4104d2"} Oct 11 01:17:39 crc kubenswrapper[4743]: I1011 01:17:39.880632 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"b61fc8c0-014e-481a-b189-e554dced0696","Type":"ContainerStarted","Data":"82f7fcb2aed870abe2deca1b662265b4d81b0105a8dbb5ebf59dd7e1a5b1d094"} Oct 11 01:17:40 crc kubenswrapper[4743]: I1011 01:17:40.893376 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"b61fc8c0-014e-481a-b189-e554dced0696","Type":"ContainerStarted","Data":"cb06f36fccbe5f389df950dafe58051d534a72dc9c254fb9526a634d341fdb22"} Oct 11 01:17:41 crc kubenswrapper[4743]: I1011 01:17:41.906453 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"b61fc8c0-014e-481a-b189-e554dced0696","Type":"ContainerStarted","Data":"70ea74cda14f1c5945a2b761abb800cc52dc0cd9dce0aded4f5e7a09abdbcde9"} Oct 11 01:17:42 crc kubenswrapper[4743]: I1011 01:17:42.920379 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"b61fc8c0-014e-481a-b189-e554dced0696","Type":"ContainerStarted","Data":"ff2dd51c02fbe1582c877a69a6d7189ec366b2bc65acfa67e0cb4eb9308a3fee"} Oct 11 01:17:42 crc kubenswrapper[4743]: I1011 01:17:42.944808 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.33279257 podStartE2EDuration="5.944789366s" podCreationTimestamp="2025-10-11 01:17:37 +0000 UTC" 
firstStartedPulling="2025-10-11 01:17:38.800256925 +0000 UTC m=+1553.453237322" lastFinishedPulling="2025-10-11 01:17:42.412253721 +0000 UTC m=+1557.065234118" observedRunningTime="2025-10-11 01:17:42.937162129 +0000 UTC m=+1557.590142536" watchObservedRunningTime="2025-10-11 01:17:42.944789366 +0000 UTC m=+1557.597769763" Oct 11 01:17:47 crc kubenswrapper[4743]: I1011 01:17:47.092598 4743 scope.go:117] "RemoveContainer" containerID="4dd4c92bb1fe51db0d7814650823b2db5bc31ac1ef997677802e7324969fe0d1" Oct 11 01:17:47 crc kubenswrapper[4743]: E1011 01:17:47.093818 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:17:48 crc kubenswrapper[4743]: I1011 01:17:48.886371 4743 scope.go:117] "RemoveContainer" containerID="77945c6fa9449bd57597a49f83be1a34d5a578743bf8d18b705387926f910e3d" Oct 11 01:17:48 crc kubenswrapper[4743]: I1011 01:17:48.922144 4743 scope.go:117] "RemoveContainer" containerID="18d5924ee91371fd1ad1e224761869e30787b5d666608ba26d05d7cefcfe9f7b" Oct 11 01:17:49 crc kubenswrapper[4743]: I1011 01:17:49.000997 4743 scope.go:117] "RemoveContainer" containerID="b160b4394e8b0c73d683003f5311518dd3780e0891042cfe320546e6b9556f90" Oct 11 01:17:49 crc kubenswrapper[4743]: I1011 01:17:49.034629 4743 scope.go:117] "RemoveContainer" containerID="5088c25b9ecada27037e94ce63bb12f86d1d169720a14623b47acf2816b9127b" Oct 11 01:17:49 crc kubenswrapper[4743]: I1011 01:17:49.086090 4743 scope.go:117] "RemoveContainer" containerID="90c860b4a98badaee54fc3acd4233190ce617e4e2decbf77a26d43c826a2f5e0" Oct 11 01:17:58 crc kubenswrapper[4743]: I1011 01:17:58.092055 4743 scope.go:117] "RemoveContainer" 
containerID="4dd4c92bb1fe51db0d7814650823b2db5bc31ac1ef997677802e7324969fe0d1" Oct 11 01:17:58 crc kubenswrapper[4743]: E1011 01:17:58.093110 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:18:09 crc kubenswrapper[4743]: I1011 01:18:09.093113 4743 scope.go:117] "RemoveContainer" containerID="4dd4c92bb1fe51db0d7814650823b2db5bc31ac1ef997677802e7324969fe0d1" Oct 11 01:18:09 crc kubenswrapper[4743]: E1011 01:18:09.094477 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:18:23 crc kubenswrapper[4743]: I1011 01:18:23.092151 4743 scope.go:117] "RemoveContainer" containerID="4dd4c92bb1fe51db0d7814650823b2db5bc31ac1ef997677802e7324969fe0d1" Oct 11 01:18:23 crc kubenswrapper[4743]: E1011 01:18:23.093357 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:18:38 crc kubenswrapper[4743]: I1011 01:18:38.092577 4743 scope.go:117] 
"RemoveContainer" containerID="4dd4c92bb1fe51db0d7814650823b2db5bc31ac1ef997677802e7324969fe0d1" Oct 11 01:18:38 crc kubenswrapper[4743]: E1011 01:18:38.093521 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:18:49 crc kubenswrapper[4743]: I1011 01:18:49.092731 4743 scope.go:117] "RemoveContainer" containerID="4dd4c92bb1fe51db0d7814650823b2db5bc31ac1ef997677802e7324969fe0d1" Oct 11 01:18:49 crc kubenswrapper[4743]: E1011 01:18:49.095481 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:18:49 crc kubenswrapper[4743]: I1011 01:18:49.389633 4743 scope.go:117] "RemoveContainer" containerID="acca0d906d588645941325e81b8e7c694904bf1ff868d3295899384509bd4526" Oct 11 01:19:00 crc kubenswrapper[4743]: I1011 01:19:00.092952 4743 scope.go:117] "RemoveContainer" containerID="4dd4c92bb1fe51db0d7814650823b2db5bc31ac1ef997677802e7324969fe0d1" Oct 11 01:19:00 crc kubenswrapper[4743]: E1011 01:19:00.093834 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:19:13 crc kubenswrapper[4743]: I1011 01:19:13.091674 4743 scope.go:117] "RemoveContainer" containerID="4dd4c92bb1fe51db0d7814650823b2db5bc31ac1ef997677802e7324969fe0d1" Oct 11 01:19:13 crc kubenswrapper[4743]: E1011 01:19:13.092505 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:19:27 crc kubenswrapper[4743]: I1011 01:19:27.093607 4743 scope.go:117] "RemoveContainer" containerID="4dd4c92bb1fe51db0d7814650823b2db5bc31ac1ef997677802e7324969fe0d1" Oct 11 01:19:27 crc kubenswrapper[4743]: E1011 01:19:27.096304 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:19:38 crc kubenswrapper[4743]: I1011 01:19:38.092875 4743 scope.go:117] "RemoveContainer" containerID="4dd4c92bb1fe51db0d7814650823b2db5bc31ac1ef997677802e7324969fe0d1" Oct 11 01:19:38 crc kubenswrapper[4743]: E1011 01:19:38.093617 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:19:50 crc kubenswrapper[4743]: I1011 01:19:50.092342 4743 scope.go:117] "RemoveContainer" containerID="4dd4c92bb1fe51db0d7814650823b2db5bc31ac1ef997677802e7324969fe0d1" Oct 11 01:19:50 crc kubenswrapper[4743]: E1011 01:19:50.093575 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:20:01 crc kubenswrapper[4743]: I1011 01:20:01.091839 4743 scope.go:117] "RemoveContainer" containerID="4dd4c92bb1fe51db0d7814650823b2db5bc31ac1ef997677802e7324969fe0d1" Oct 11 01:20:01 crc kubenswrapper[4743]: E1011 01:20:01.093033 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:20:14 crc kubenswrapper[4743]: I1011 01:20:14.091679 4743 scope.go:117] "RemoveContainer" containerID="4dd4c92bb1fe51db0d7814650823b2db5bc31ac1ef997677802e7324969fe0d1" Oct 11 01:20:14 crc kubenswrapper[4743]: E1011 01:20:14.092745 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:20:25 crc kubenswrapper[4743]: I1011 01:20:25.093117 4743 scope.go:117] "RemoveContainer" containerID="4dd4c92bb1fe51db0d7814650823b2db5bc31ac1ef997677802e7324969fe0d1" Oct 11 01:20:25 crc kubenswrapper[4743]: E1011 01:20:25.094273 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:20:39 crc kubenswrapper[4743]: I1011 01:20:39.091489 4743 scope.go:117] "RemoveContainer" containerID="4dd4c92bb1fe51db0d7814650823b2db5bc31ac1ef997677802e7324969fe0d1" Oct 11 01:20:39 crc kubenswrapper[4743]: E1011 01:20:39.092220 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:20:47 crc kubenswrapper[4743]: I1011 01:20:47.269171 4743 generic.go:334] "Generic (PLEG): container finished" podID="15ec1ee2-7130-47d3-8156-4352228590a6" containerID="1f55586e20dac03f58a2e80d1f7ceea8f344878d3d78dc137749715f51568cce" exitCode=0 Oct 11 01:20:47 crc kubenswrapper[4743]: I1011 01:20:47.269263 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k8lss" event={"ID":"15ec1ee2-7130-47d3-8156-4352228590a6","Type":"ContainerDied","Data":"1f55586e20dac03f58a2e80d1f7ceea8f344878d3d78dc137749715f51568cce"} Oct 11 01:20:48 crc kubenswrapper[4743]: I1011 01:20:48.769393 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k8lss" Oct 11 01:20:48 crc kubenswrapper[4743]: I1011 01:20:48.913040 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15ec1ee2-7130-47d3-8156-4352228590a6-bootstrap-combined-ca-bundle\") pod \"15ec1ee2-7130-47d3-8156-4352228590a6\" (UID: \"15ec1ee2-7130-47d3-8156-4352228590a6\") " Oct 11 01:20:48 crc kubenswrapper[4743]: I1011 01:20:48.913646 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/15ec1ee2-7130-47d3-8156-4352228590a6-ssh-key\") pod \"15ec1ee2-7130-47d3-8156-4352228590a6\" (UID: \"15ec1ee2-7130-47d3-8156-4352228590a6\") " Oct 11 01:20:48 crc kubenswrapper[4743]: I1011 01:20:48.913747 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4lgm\" (UniqueName: \"kubernetes.io/projected/15ec1ee2-7130-47d3-8156-4352228590a6-kube-api-access-v4lgm\") pod \"15ec1ee2-7130-47d3-8156-4352228590a6\" (UID: \"15ec1ee2-7130-47d3-8156-4352228590a6\") " Oct 11 01:20:48 crc kubenswrapper[4743]: I1011 01:20:48.913916 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/15ec1ee2-7130-47d3-8156-4352228590a6-inventory\") pod \"15ec1ee2-7130-47d3-8156-4352228590a6\" (UID: \"15ec1ee2-7130-47d3-8156-4352228590a6\") " Oct 11 01:20:48 crc kubenswrapper[4743]: I1011 01:20:48.918793 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/15ec1ee2-7130-47d3-8156-4352228590a6-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "15ec1ee2-7130-47d3-8156-4352228590a6" (UID: "15ec1ee2-7130-47d3-8156-4352228590a6"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:20:48 crc kubenswrapper[4743]: I1011 01:20:48.927073 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15ec1ee2-7130-47d3-8156-4352228590a6-kube-api-access-v4lgm" (OuterVolumeSpecName: "kube-api-access-v4lgm") pod "15ec1ee2-7130-47d3-8156-4352228590a6" (UID: "15ec1ee2-7130-47d3-8156-4352228590a6"). InnerVolumeSpecName "kube-api-access-v4lgm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:20:48 crc kubenswrapper[4743]: I1011 01:20:48.946273 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15ec1ee2-7130-47d3-8156-4352228590a6-inventory" (OuterVolumeSpecName: "inventory") pod "15ec1ee2-7130-47d3-8156-4352228590a6" (UID: "15ec1ee2-7130-47d3-8156-4352228590a6"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:20:48 crc kubenswrapper[4743]: I1011 01:20:48.958988 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15ec1ee2-7130-47d3-8156-4352228590a6-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "15ec1ee2-7130-47d3-8156-4352228590a6" (UID: "15ec1ee2-7130-47d3-8156-4352228590a6"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:20:49 crc kubenswrapper[4743]: I1011 01:20:49.016958 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/15ec1ee2-7130-47d3-8156-4352228590a6-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 11 01:20:49 crc kubenswrapper[4743]: I1011 01:20:49.016993 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4lgm\" (UniqueName: \"kubernetes.io/projected/15ec1ee2-7130-47d3-8156-4352228590a6-kube-api-access-v4lgm\") on node \"crc\" DevicePath \"\"" Oct 11 01:20:49 crc kubenswrapper[4743]: I1011 01:20:49.017007 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/15ec1ee2-7130-47d3-8156-4352228590a6-inventory\") on node \"crc\" DevicePath \"\"" Oct 11 01:20:49 crc kubenswrapper[4743]: I1011 01:20:49.017016 4743 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15ec1ee2-7130-47d3-8156-4352228590a6-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 01:20:49 crc kubenswrapper[4743]: I1011 01:20:49.302312 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k8lss" event={"ID":"15ec1ee2-7130-47d3-8156-4352228590a6","Type":"ContainerDied","Data":"b079c1de47a29cdd89738cffe15cd79a950e3434f0ea59c53647b74babd254e6"} Oct 11 01:20:49 crc kubenswrapper[4743]: I1011 01:20:49.302363 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b079c1de47a29cdd89738cffe15cd79a950e3434f0ea59c53647b74babd254e6" Oct 11 01:20:49 crc kubenswrapper[4743]: I1011 01:20:49.302387 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k8lss" Oct 11 01:20:49 crc kubenswrapper[4743]: I1011 01:20:49.407700 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9qtvs"] Oct 11 01:20:49 crc kubenswrapper[4743]: E1011 01:20:49.408283 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15ec1ee2-7130-47d3-8156-4352228590a6" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 11 01:20:49 crc kubenswrapper[4743]: I1011 01:20:49.408304 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="15ec1ee2-7130-47d3-8156-4352228590a6" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 11 01:20:49 crc kubenswrapper[4743]: I1011 01:20:49.408563 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="15ec1ee2-7130-47d3-8156-4352228590a6" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 11 01:20:49 crc kubenswrapper[4743]: I1011 01:20:49.409429 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9qtvs" Oct 11 01:20:49 crc kubenswrapper[4743]: I1011 01:20:49.411335 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 11 01:20:49 crc kubenswrapper[4743]: I1011 01:20:49.411958 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 11 01:20:49 crc kubenswrapper[4743]: I1011 01:20:49.411968 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8vmmn" Oct 11 01:20:49 crc kubenswrapper[4743]: I1011 01:20:49.412817 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 11 01:20:49 crc kubenswrapper[4743]: I1011 01:20:49.426468 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9qtvs"] Oct 11 01:20:49 crc kubenswrapper[4743]: I1011 01:20:49.498706 4743 scope.go:117] "RemoveContainer" containerID="362c681795501aec8d43ce09e7837eebef382d9b2352f1ae2e7a176e2113b024" Oct 11 01:20:49 crc kubenswrapper[4743]: I1011 01:20:49.530604 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/18dd49f1-d2cb-4e7b-b427-971bda666f14-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-9qtvs\" (UID: \"18dd49f1-d2cb-4e7b-b427-971bda666f14\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9qtvs" Oct 11 01:20:49 crc kubenswrapper[4743]: I1011 01:20:49.530771 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/18dd49f1-d2cb-4e7b-b427-971bda666f14-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-9qtvs\" (UID: 
\"18dd49f1-d2cb-4e7b-b427-971bda666f14\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9qtvs" Oct 11 01:20:49 crc kubenswrapper[4743]: I1011 01:20:49.530951 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rflh5\" (UniqueName: \"kubernetes.io/projected/18dd49f1-d2cb-4e7b-b427-971bda666f14-kube-api-access-rflh5\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-9qtvs\" (UID: \"18dd49f1-d2cb-4e7b-b427-971bda666f14\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9qtvs" Oct 11 01:20:49 crc kubenswrapper[4743]: I1011 01:20:49.533729 4743 scope.go:117] "RemoveContainer" containerID="9fce45c61e11f7e31457a8d26df0d96937c1c5893b583ce261b36f706cea3973" Oct 11 01:20:49 crc kubenswrapper[4743]: I1011 01:20:49.566364 4743 scope.go:117] "RemoveContainer" containerID="7bd09168ae2e684cd0fff7d087d25c45753110e20e21a220d95f12439edb08fb" Oct 11 01:20:49 crc kubenswrapper[4743]: I1011 01:20:49.613475 4743 scope.go:117] "RemoveContainer" containerID="4eb366535771ea42329ec46200c035fb66a8fd4ca6c5fa3fc1b05abff4bc1fe5" Oct 11 01:20:49 crc kubenswrapper[4743]: I1011 01:20:49.633273 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/18dd49f1-d2cb-4e7b-b427-971bda666f14-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-9qtvs\" (UID: \"18dd49f1-d2cb-4e7b-b427-971bda666f14\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9qtvs" Oct 11 01:20:49 crc kubenswrapper[4743]: I1011 01:20:49.633409 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/18dd49f1-d2cb-4e7b-b427-971bda666f14-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-9qtvs\" (UID: \"18dd49f1-d2cb-4e7b-b427-971bda666f14\") " 
pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9qtvs" Oct 11 01:20:49 crc kubenswrapper[4743]: I1011 01:20:49.633515 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rflh5\" (UniqueName: \"kubernetes.io/projected/18dd49f1-d2cb-4e7b-b427-971bda666f14-kube-api-access-rflh5\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-9qtvs\" (UID: \"18dd49f1-d2cb-4e7b-b427-971bda666f14\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9qtvs" Oct 11 01:20:49 crc kubenswrapper[4743]: I1011 01:20:49.637406 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/18dd49f1-d2cb-4e7b-b427-971bda666f14-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-9qtvs\" (UID: \"18dd49f1-d2cb-4e7b-b427-971bda666f14\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9qtvs" Oct 11 01:20:49 crc kubenswrapper[4743]: I1011 01:20:49.638568 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/18dd49f1-d2cb-4e7b-b427-971bda666f14-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-9qtvs\" (UID: \"18dd49f1-d2cb-4e7b-b427-971bda666f14\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9qtvs" Oct 11 01:20:49 crc kubenswrapper[4743]: I1011 01:20:49.642632 4743 scope.go:117] "RemoveContainer" containerID="f87f6575ead149a7c51791d8c58d34b33501e2d070690f495eb06f057a97e875" Oct 11 01:20:49 crc kubenswrapper[4743]: I1011 01:20:49.662292 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rflh5\" (UniqueName: \"kubernetes.io/projected/18dd49f1-d2cb-4e7b-b427-971bda666f14-kube-api-access-rflh5\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-9qtvs\" (UID: \"18dd49f1-d2cb-4e7b-b427-971bda666f14\") " 
pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9qtvs" Oct 11 01:20:49 crc kubenswrapper[4743]: I1011 01:20:49.731251 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9qtvs" Oct 11 01:20:50 crc kubenswrapper[4743]: I1011 01:20:50.091876 4743 scope.go:117] "RemoveContainer" containerID="4dd4c92bb1fe51db0d7814650823b2db5bc31ac1ef997677802e7324969fe0d1" Oct 11 01:20:50 crc kubenswrapper[4743]: E1011 01:20:50.092612 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:20:50 crc kubenswrapper[4743]: I1011 01:20:50.362981 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9qtvs"] Oct 11 01:20:51 crc kubenswrapper[4743]: I1011 01:20:51.330086 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9qtvs" event={"ID":"18dd49f1-d2cb-4e7b-b427-971bda666f14","Type":"ContainerStarted","Data":"d2877a6772eb59c85b29003056f035db5d713ac4adeec746228f98972180eddb"} Oct 11 01:20:52 crc kubenswrapper[4743]: I1011 01:20:52.342006 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9qtvs" event={"ID":"18dd49f1-d2cb-4e7b-b427-971bda666f14","Type":"ContainerStarted","Data":"319cc7abdc442387fa721383536d6b4b2212dab24aec428ede031df238824c38"} Oct 11 01:20:52 crc kubenswrapper[4743]: I1011 01:20:52.372269 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9qtvs" podStartSLOduration=2.3705893319999998 podStartE2EDuration="3.372240868s" podCreationTimestamp="2025-10-11 01:20:49 +0000 UTC" firstStartedPulling="2025-10-11 01:20:50.362043772 +0000 UTC m=+1745.015024169" lastFinishedPulling="2025-10-11 01:20:51.363695308 +0000 UTC m=+1746.016675705" observedRunningTime="2025-10-11 01:20:52.366167096 +0000 UTC m=+1747.019147513" watchObservedRunningTime="2025-10-11 01:20:52.372240868 +0000 UTC m=+1747.025221295" Oct 11 01:21:03 crc kubenswrapper[4743]: I1011 01:21:03.045287 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-tf857"] Oct 11 01:21:03 crc kubenswrapper[4743]: I1011 01:21:03.062900 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-r8cj7"] Oct 11 01:21:03 crc kubenswrapper[4743]: I1011 01:21:03.079165 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-tf857"] Oct 11 01:21:03 crc kubenswrapper[4743]: I1011 01:21:03.088052 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-jdrfb"] Oct 11 01:21:03 crc kubenswrapper[4743]: I1011 01:21:03.092049 4743 scope.go:117] "RemoveContainer" containerID="4dd4c92bb1fe51db0d7814650823b2db5bc31ac1ef997677802e7324969fe0d1" Oct 11 01:21:03 crc kubenswrapper[4743]: E1011 01:21:03.092339 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:21:03 crc kubenswrapper[4743]: I1011 01:21:03.095995 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/glance-db-create-r8cj7"] Oct 11 01:21:03 crc kubenswrapper[4743]: I1011 01:21:03.107411 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-wjqbw"] Oct 11 01:21:03 crc kubenswrapper[4743]: I1011 01:21:03.118152 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-jdrfb"] Oct 11 01:21:03 crc kubenswrapper[4743]: I1011 01:21:03.127640 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-wjqbw"] Oct 11 01:21:04 crc kubenswrapper[4743]: I1011 01:21:04.144815 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1561cfd4-e0e2-4f2b-ae94-195bc9061df5" path="/var/lib/kubelet/pods/1561cfd4-e0e2-4f2b-ae94-195bc9061df5/volumes" Oct 11 01:21:04 crc kubenswrapper[4743]: I1011 01:21:04.146261 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4aacb3f4-8c2a-4f30-b51a-c31b9c89cd28" path="/var/lib/kubelet/pods/4aacb3f4-8c2a-4f30-b51a-c31b9c89cd28/volumes" Oct 11 01:21:04 crc kubenswrapper[4743]: I1011 01:21:04.152733 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6ae2ef5-9e7a-43c5-9772-7bcd53053fb6" path="/var/lib/kubelet/pods/c6ae2ef5-9e7a-43c5-9772-7bcd53053fb6/volumes" Oct 11 01:21:04 crc kubenswrapper[4743]: I1011 01:21:04.165292 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cad5491d-5b39-42df-b4d5-1d5be912561b" path="/var/lib/kubelet/pods/cad5491d-5b39-42df-b4d5-1d5be912561b/volumes" Oct 11 01:21:15 crc kubenswrapper[4743]: I1011 01:21:15.044587 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-5f82-account-create-gq8c9"] Oct 11 01:21:15 crc kubenswrapper[4743]: I1011 01:21:15.058513 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-5f82-account-create-gq8c9"] Oct 11 01:21:16 crc kubenswrapper[4743]: I1011 01:21:16.033098 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/glance-d5a7-account-create-z86xz"] Oct 11 01:21:16 crc kubenswrapper[4743]: I1011 01:21:16.044770 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-2a5c-account-create-6nzhk"] Oct 11 01:21:16 crc kubenswrapper[4743]: I1011 01:21:16.054623 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-d5a7-account-create-z86xz"] Oct 11 01:21:16 crc kubenswrapper[4743]: I1011 01:21:16.063310 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-2a5c-account-create-6nzhk"] Oct 11 01:21:16 crc kubenswrapper[4743]: I1011 01:21:16.100220 4743 scope.go:117] "RemoveContainer" containerID="4dd4c92bb1fe51db0d7814650823b2db5bc31ac1ef997677802e7324969fe0d1" Oct 11 01:21:16 crc kubenswrapper[4743]: E1011 01:21:16.100638 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:21:16 crc kubenswrapper[4743]: I1011 01:21:16.105707 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="421a65dd-2c9a-4807-bffa-c292f25a8263" path="/var/lib/kubelet/pods/421a65dd-2c9a-4807-bffa-c292f25a8263/volumes" Oct 11 01:21:16 crc kubenswrapper[4743]: I1011 01:21:16.106604 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b01c2590-9783-4392-8a81-a4a0ec37e88d" path="/var/lib/kubelet/pods/b01c2590-9783-4392-8a81-a4a0ec37e88d/volumes" Oct 11 01:21:16 crc kubenswrapper[4743]: I1011 01:21:16.107394 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5dabefa-327b-4e8b-9a7f-81517a52c01b" path="/var/lib/kubelet/pods/c5dabefa-327b-4e8b-9a7f-81517a52c01b/volumes" Oct 11 01:21:19 
crc kubenswrapper[4743]: I1011 01:21:19.025258 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-0c59-account-create-gzvmt"] Oct 11 01:21:19 crc kubenswrapper[4743]: I1011 01:21:19.034013 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-0c59-account-create-gzvmt"] Oct 11 01:21:20 crc kubenswrapper[4743]: I1011 01:21:20.106138 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d22dfc6b-30a2-4d39-82d0-70bb32452261" path="/var/lib/kubelet/pods/d22dfc6b-30a2-4d39-82d0-70bb32452261/volumes" Oct 11 01:21:24 crc kubenswrapper[4743]: I1011 01:21:24.047249 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-crf47"] Oct 11 01:21:24 crc kubenswrapper[4743]: I1011 01:21:24.058428 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-crf47"] Oct 11 01:21:24 crc kubenswrapper[4743]: I1011 01:21:24.106239 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae3a2cd6-d036-4c48-aaed-fc9750d6c0d0" path="/var/lib/kubelet/pods/ae3a2cd6-d036-4c48-aaed-fc9750d6c0d0/volumes" Oct 11 01:21:27 crc kubenswrapper[4743]: I1011 01:21:27.091579 4743 scope.go:117] "RemoveContainer" containerID="4dd4c92bb1fe51db0d7814650823b2db5bc31ac1ef997677802e7324969fe0d1" Oct 11 01:21:27 crc kubenswrapper[4743]: E1011 01:21:27.092352 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:21:38 crc kubenswrapper[4743]: I1011 01:21:38.045830 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/cinder-db-create-rvpwq"] Oct 11 01:21:38 crc kubenswrapper[4743]: I1011 01:21:38.057995 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-5027-account-create-9tvlm"] Oct 11 01:21:38 crc kubenswrapper[4743]: I1011 01:21:38.070068 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-rvpwq"] Oct 11 01:21:38 crc kubenswrapper[4743]: I1011 01:21:38.084151 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-5027-account-create-9tvlm"] Oct 11 01:21:38 crc kubenswrapper[4743]: I1011 01:21:38.091790 4743 scope.go:117] "RemoveContainer" containerID="4dd4c92bb1fe51db0d7814650823b2db5bc31ac1ef997677802e7324969fe0d1" Oct 11 01:21:38 crc kubenswrapper[4743]: E1011 01:21:38.092228 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:21:38 crc kubenswrapper[4743]: I1011 01:21:38.108173 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3147b11a-05a1-4e5b-93b9-14748977e08e" path="/var/lib/kubelet/pods/3147b11a-05a1-4e5b-93b9-14748977e08e/volumes" Oct 11 01:21:38 crc kubenswrapper[4743]: I1011 01:21:38.109188 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2bc9888-ce44-44fc-84b0-747f726ec516" path="/var/lib/kubelet/pods/a2bc9888-ce44-44fc-84b0-747f726ec516/volumes" Oct 11 01:21:41 crc kubenswrapper[4743]: I1011 01:21:41.053596 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-nmtm5"] Oct 11 01:21:41 crc kubenswrapper[4743]: I1011 01:21:41.064826 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/neutron-db-create-srz27"] Oct 11 01:21:41 crc kubenswrapper[4743]: I1011 01:21:41.076048 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-nmtm5"] Oct 11 01:21:41 crc kubenswrapper[4743]: I1011 01:21:41.086704 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-srz27"] Oct 11 01:21:41 crc kubenswrapper[4743]: I1011 01:21:41.094636 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-4wts2"] Oct 11 01:21:41 crc kubenswrapper[4743]: I1011 01:21:41.102981 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-4wts2"] Oct 11 01:21:42 crc kubenswrapper[4743]: I1011 01:21:42.114903 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="583eac1b-7b00-44ab-8f94-59b016a1d635" path="/var/lib/kubelet/pods/583eac1b-7b00-44ab-8f94-59b016a1d635/volumes" Oct 11 01:21:42 crc kubenswrapper[4743]: I1011 01:21:42.116158 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85d3caa5-89e1-4b4b-a4e2-5ec2dcb90002" path="/var/lib/kubelet/pods/85d3caa5-89e1-4b4b-a4e2-5ec2dcb90002/volumes" Oct 11 01:21:42 crc kubenswrapper[4743]: I1011 01:21:42.117394 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="860e062d-6883-4ea9-8e44-8b2f4e9bae60" path="/var/lib/kubelet/pods/860e062d-6883-4ea9-8e44-8b2f4e9bae60/volumes" Oct 11 01:21:44 crc kubenswrapper[4743]: I1011 01:21:44.036502 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-dznrn"] Oct 11 01:21:44 crc kubenswrapper[4743]: I1011 01:21:44.070636 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-dznrn"] Oct 11 01:21:44 crc kubenswrapper[4743]: I1011 01:21:44.115604 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f60a76bb-fc23-4e3a-b7ad-123f65747952" path="/var/lib/kubelet/pods/f60a76bb-fc23-4e3a-b7ad-123f65747952/volumes" Oct 11 
01:21:45 crc kubenswrapper[4743]: I1011 01:21:45.048831 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-zdrsb"] Oct 11 01:21:45 crc kubenswrapper[4743]: I1011 01:21:45.070278 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-zdrsb"] Oct 11 01:21:46 crc kubenswrapper[4743]: I1011 01:21:46.110456 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f546b6b-8484-4ad7-879f-593cf31efaaa" path="/var/lib/kubelet/pods/1f546b6b-8484-4ad7-879f-593cf31efaaa/volumes" Oct 11 01:21:49 crc kubenswrapper[4743]: I1011 01:21:49.725093 4743 scope.go:117] "RemoveContainer" containerID="a79a5eec1c20e5887958d3732820b9542f9c0915ad81786cf52d1f0ae29650c8" Oct 11 01:21:49 crc kubenswrapper[4743]: I1011 01:21:49.757280 4743 scope.go:117] "RemoveContainer" containerID="5d17ff7b9e1b806db0dcac19018f064d4b41fc11cdf8eed39cb3555c25f56cc8" Oct 11 01:21:49 crc kubenswrapper[4743]: I1011 01:21:49.831214 4743 scope.go:117] "RemoveContainer" containerID="9151a6ec813d0d3f00909a9b7ac2b314eef94a9d2ab2b385b47ee8ee09ea3695" Oct 11 01:21:49 crc kubenswrapper[4743]: I1011 01:21:49.915519 4743 scope.go:117] "RemoveContainer" containerID="0aed64549f096d70a12d4f364b00c9964a0086b84b5f5073f58af45f615622b3" Oct 11 01:21:49 crc kubenswrapper[4743]: I1011 01:21:49.970685 4743 scope.go:117] "RemoveContainer" containerID="256ff5de7d671245b9e23bb339c4f711ebe44aff6310320abf53d018d077d3dd" Oct 11 01:21:50 crc kubenswrapper[4743]: I1011 01:21:50.031621 4743 scope.go:117] "RemoveContainer" containerID="af90b0865142410807d8db6070a8f38daa83fd613473c41eca141d2a24d258b2" Oct 11 01:21:50 crc kubenswrapper[4743]: I1011 01:21:50.069478 4743 scope.go:117] "RemoveContainer" containerID="5e011c8554ca34706c9228cf902c194456fe233ac9b626c43abf439c9320109c" Oct 11 01:21:50 crc kubenswrapper[4743]: I1011 01:21:50.088496 4743 scope.go:117] "RemoveContainer" containerID="2c7b3956dc1901cea2381f9d6552be6446d0c77adda981342617b1bec4ce4a68" Oct 11 
01:21:50 crc kubenswrapper[4743]: I1011 01:21:50.114591 4743 scope.go:117] "RemoveContainer" containerID="c4640f891f229b11967e57e15d36e53eec7cf21df3cc39f4f37f338720a00c5e" Oct 11 01:21:50 crc kubenswrapper[4743]: I1011 01:21:50.146192 4743 scope.go:117] "RemoveContainer" containerID="e83332b3e2b3333428ecde30b206b683751b46aea84ab7f7b12a0a0ab9be979d" Oct 11 01:21:50 crc kubenswrapper[4743]: I1011 01:21:50.168969 4743 scope.go:117] "RemoveContainer" containerID="70e9afaaf9657a2a88ff06cd48b80ae8cefd54ae9a3238e5e7ef25ea7d659cef" Oct 11 01:21:50 crc kubenswrapper[4743]: I1011 01:21:50.189945 4743 scope.go:117] "RemoveContainer" containerID="08f957faee5a7ed7a08ec2e27ccb9a79aaad18f10f96bc3b3b821591a4f7fda6" Oct 11 01:21:50 crc kubenswrapper[4743]: I1011 01:21:50.213726 4743 scope.go:117] "RemoveContainer" containerID="1dff9ac51551dc7108a67e25d8c101ce005b31d733514e1d194a9523e985cc5f" Oct 11 01:21:50 crc kubenswrapper[4743]: I1011 01:21:50.235320 4743 scope.go:117] "RemoveContainer" containerID="c11ea7ff21062154fb575deb299c21943cc474dfb337c895f15ff7a8b32750e0" Oct 11 01:21:50 crc kubenswrapper[4743]: I1011 01:21:50.260224 4743 scope.go:117] "RemoveContainer" containerID="463afe5f78b4ac572429650488710e5076ea660d5576f21e3d62c1ec4b3b3857" Oct 11 01:21:50 crc kubenswrapper[4743]: I1011 01:21:50.285495 4743 scope.go:117] "RemoveContainer" containerID="26ac38353778f0f982eb11e0c4b48a6fd455599647027554bbc74c4dc64cd8be" Oct 11 01:21:53 crc kubenswrapper[4743]: I1011 01:21:53.041333 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-70e4-account-create-v8fz8"] Oct 11 01:21:53 crc kubenswrapper[4743]: I1011 01:21:53.054591 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-11c8-account-create-n8n9x"] Oct 11 01:21:53 crc kubenswrapper[4743]: I1011 01:21:53.065457 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-33f9-account-create-zvc5d"] Oct 11 01:21:53 crc kubenswrapper[4743]: I1011 01:21:53.073195 4743 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-70e4-account-create-v8fz8"] Oct 11 01:21:53 crc kubenswrapper[4743]: I1011 01:21:53.082074 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-11c8-account-create-n8n9x"] Oct 11 01:21:53 crc kubenswrapper[4743]: I1011 01:21:53.089961 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-33f9-account-create-zvc5d"] Oct 11 01:21:53 crc kubenswrapper[4743]: I1011 01:21:53.092028 4743 scope.go:117] "RemoveContainer" containerID="4dd4c92bb1fe51db0d7814650823b2db5bc31ac1ef997677802e7324969fe0d1" Oct 11 01:21:54 crc kubenswrapper[4743]: I1011 01:21:54.111698 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="227021ae-b786-43b0-b61b-c3317fd4ee34" path="/var/lib/kubelet/pods/227021ae-b786-43b0-b61b-c3317fd4ee34/volumes" Oct 11 01:21:54 crc kubenswrapper[4743]: I1011 01:21:54.114025 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e5b2b21-68ca-41b3-96f4-f3dabc8cb94d" path="/var/lib/kubelet/pods/3e5b2b21-68ca-41b3-96f4-f3dabc8cb94d/volumes" Oct 11 01:21:54 crc kubenswrapper[4743]: I1011 01:21:54.114881 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6c0d49c-5d16-433e-aba5-de2c1b0746bf" path="/var/lib/kubelet/pods/d6c0d49c-5d16-433e-aba5-de2c1b0746bf/volumes" Oct 11 01:21:54 crc kubenswrapper[4743]: I1011 01:21:54.120268 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" event={"ID":"add92263-e252-446b-95de-092585b4357f","Type":"ContainerStarted","Data":"22fd023cd5183c2ca78abfab3b66c41277f63acb424eb906c5326f5e04010643"} Oct 11 01:21:57 crc kubenswrapper[4743]: I1011 01:21:57.041699 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-1f1d-account-create-8q67r"] Oct 11 01:21:57 crc kubenswrapper[4743]: I1011 01:21:57.060643 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/cinder-1f1d-account-create-8q67r"] Oct 11 01:21:58 crc kubenswrapper[4743]: I1011 01:21:58.109347 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad7c265c-766a-446c-863d-e6137abdd0e9" path="/var/lib/kubelet/pods/ad7c265c-766a-446c-863d-e6137abdd0e9/volumes" Oct 11 01:21:59 crc kubenswrapper[4743]: I1011 01:21:59.408743 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dq8sg"] Oct 11 01:21:59 crc kubenswrapper[4743]: I1011 01:21:59.411392 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dq8sg" Oct 11 01:21:59 crc kubenswrapper[4743]: I1011 01:21:59.417768 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dq8sg"] Oct 11 01:21:59 crc kubenswrapper[4743]: I1011 01:21:59.492914 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca6876ee-226c-41a3-bdfc-98c31ca57d81-utilities\") pod \"certified-operators-dq8sg\" (UID: \"ca6876ee-226c-41a3-bdfc-98c31ca57d81\") " pod="openshift-marketplace/certified-operators-dq8sg" Oct 11 01:21:59 crc kubenswrapper[4743]: I1011 01:21:59.492996 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca6876ee-226c-41a3-bdfc-98c31ca57d81-catalog-content\") pod \"certified-operators-dq8sg\" (UID: \"ca6876ee-226c-41a3-bdfc-98c31ca57d81\") " pod="openshift-marketplace/certified-operators-dq8sg" Oct 11 01:21:59 crc kubenswrapper[4743]: I1011 01:21:59.493053 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgdtd\" (UniqueName: \"kubernetes.io/projected/ca6876ee-226c-41a3-bdfc-98c31ca57d81-kube-api-access-wgdtd\") pod \"certified-operators-dq8sg\" (UID: 
\"ca6876ee-226c-41a3-bdfc-98c31ca57d81\") " pod="openshift-marketplace/certified-operators-dq8sg" Oct 11 01:21:59 crc kubenswrapper[4743]: I1011 01:21:59.594766 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca6876ee-226c-41a3-bdfc-98c31ca57d81-utilities\") pod \"certified-operators-dq8sg\" (UID: \"ca6876ee-226c-41a3-bdfc-98c31ca57d81\") " pod="openshift-marketplace/certified-operators-dq8sg" Oct 11 01:21:59 crc kubenswrapper[4743]: I1011 01:21:59.594891 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca6876ee-226c-41a3-bdfc-98c31ca57d81-catalog-content\") pod \"certified-operators-dq8sg\" (UID: \"ca6876ee-226c-41a3-bdfc-98c31ca57d81\") " pod="openshift-marketplace/certified-operators-dq8sg" Oct 11 01:21:59 crc kubenswrapper[4743]: I1011 01:21:59.595282 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca6876ee-226c-41a3-bdfc-98c31ca57d81-utilities\") pod \"certified-operators-dq8sg\" (UID: \"ca6876ee-226c-41a3-bdfc-98c31ca57d81\") " pod="openshift-marketplace/certified-operators-dq8sg" Oct 11 01:21:59 crc kubenswrapper[4743]: I1011 01:21:59.595479 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca6876ee-226c-41a3-bdfc-98c31ca57d81-catalog-content\") pod \"certified-operators-dq8sg\" (UID: \"ca6876ee-226c-41a3-bdfc-98c31ca57d81\") " pod="openshift-marketplace/certified-operators-dq8sg" Oct 11 01:21:59 crc kubenswrapper[4743]: I1011 01:21:59.595560 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgdtd\" (UniqueName: \"kubernetes.io/projected/ca6876ee-226c-41a3-bdfc-98c31ca57d81-kube-api-access-wgdtd\") pod \"certified-operators-dq8sg\" (UID: \"ca6876ee-226c-41a3-bdfc-98c31ca57d81\") 
" pod="openshift-marketplace/certified-operators-dq8sg" Oct 11 01:21:59 crc kubenswrapper[4743]: I1011 01:21:59.613499 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgdtd\" (UniqueName: \"kubernetes.io/projected/ca6876ee-226c-41a3-bdfc-98c31ca57d81-kube-api-access-wgdtd\") pod \"certified-operators-dq8sg\" (UID: \"ca6876ee-226c-41a3-bdfc-98c31ca57d81\") " pod="openshift-marketplace/certified-operators-dq8sg" Oct 11 01:21:59 crc kubenswrapper[4743]: I1011 01:21:59.757422 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dq8sg" Oct 11 01:22:00 crc kubenswrapper[4743]: I1011 01:22:00.203251 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dq8sg"] Oct 11 01:22:01 crc kubenswrapper[4743]: I1011 01:22:01.040481 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-z4tv4"] Oct 11 01:22:01 crc kubenswrapper[4743]: I1011 01:22:01.052567 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-z4tv4"] Oct 11 01:22:01 crc kubenswrapper[4743]: I1011 01:22:01.220214 4743 generic.go:334] "Generic (PLEG): container finished" podID="ca6876ee-226c-41a3-bdfc-98c31ca57d81" containerID="6060881fc7e7a4b9aaf0ef88837ada438c0c1039bc119f5bf41a2e1dbc384e26" exitCode=0 Oct 11 01:22:01 crc kubenswrapper[4743]: I1011 01:22:01.220276 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dq8sg" event={"ID":"ca6876ee-226c-41a3-bdfc-98c31ca57d81","Type":"ContainerDied","Data":"6060881fc7e7a4b9aaf0ef88837ada438c0c1039bc119f5bf41a2e1dbc384e26"} Oct 11 01:22:01 crc kubenswrapper[4743]: I1011 01:22:01.220316 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dq8sg" 
event={"ID":"ca6876ee-226c-41a3-bdfc-98c31ca57d81","Type":"ContainerStarted","Data":"e07bf984102a36ce5e9d20f79d4fb217076e7e049e64fe01696579caa1f09584"} Oct 11 01:22:01 crc kubenswrapper[4743]: I1011 01:22:01.222293 4743 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 11 01:22:02 crc kubenswrapper[4743]: I1011 01:22:02.112494 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e0bb9e3-337b-443b-baa8-8ea69d351ea1" path="/var/lib/kubelet/pods/7e0bb9e3-337b-443b-baa8-8ea69d351ea1/volumes" Oct 11 01:22:02 crc kubenswrapper[4743]: I1011 01:22:02.238692 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dq8sg" event={"ID":"ca6876ee-226c-41a3-bdfc-98c31ca57d81","Type":"ContainerStarted","Data":"cf740d9b27bba63c2d39e22ccbd14c56fda4849d12fa6ee483889e2ed75bb03f"} Oct 11 01:22:04 crc kubenswrapper[4743]: I1011 01:22:04.265173 4743 generic.go:334] "Generic (PLEG): container finished" podID="ca6876ee-226c-41a3-bdfc-98c31ca57d81" containerID="cf740d9b27bba63c2d39e22ccbd14c56fda4849d12fa6ee483889e2ed75bb03f" exitCode=0 Oct 11 01:22:04 crc kubenswrapper[4743]: I1011 01:22:04.265511 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dq8sg" event={"ID":"ca6876ee-226c-41a3-bdfc-98c31ca57d81","Type":"ContainerDied","Data":"cf740d9b27bba63c2d39e22ccbd14c56fda4849d12fa6ee483889e2ed75bb03f"} Oct 11 01:22:05 crc kubenswrapper[4743]: I1011 01:22:05.280276 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dq8sg" event={"ID":"ca6876ee-226c-41a3-bdfc-98c31ca57d81","Type":"ContainerStarted","Data":"330f4b3fb8a3039496fc770a0511d7a8f3251b6bed3560fa3f8ca6790b839d6f"} Oct 11 01:22:05 crc kubenswrapper[4743]: I1011 01:22:05.301739 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dq8sg" 
podStartSLOduration=2.833098587 podStartE2EDuration="6.30172043s" podCreationTimestamp="2025-10-11 01:21:59 +0000 UTC" firstStartedPulling="2025-10-11 01:22:01.222044693 +0000 UTC m=+1815.875025090" lastFinishedPulling="2025-10-11 01:22:04.690666536 +0000 UTC m=+1819.343646933" observedRunningTime="2025-10-11 01:22:05.297092748 +0000 UTC m=+1819.950073155" watchObservedRunningTime="2025-10-11 01:22:05.30172043 +0000 UTC m=+1819.954700847" Oct 11 01:22:09 crc kubenswrapper[4743]: I1011 01:22:09.758749 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dq8sg" Oct 11 01:22:09 crc kubenswrapper[4743]: I1011 01:22:09.759657 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dq8sg" Oct 11 01:22:09 crc kubenswrapper[4743]: I1011 01:22:09.860637 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dq8sg" Oct 11 01:22:10 crc kubenswrapper[4743]: I1011 01:22:10.417012 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dq8sg" Oct 11 01:22:10 crc kubenswrapper[4743]: I1011 01:22:10.460954 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dq8sg"] Oct 11 01:22:11 crc kubenswrapper[4743]: I1011 01:22:11.360456 4743 generic.go:334] "Generic (PLEG): container finished" podID="18dd49f1-d2cb-4e7b-b427-971bda666f14" containerID="319cc7abdc442387fa721383536d6b4b2212dab24aec428ede031df238824c38" exitCode=0 Oct 11 01:22:11 crc kubenswrapper[4743]: I1011 01:22:11.360577 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9qtvs" event={"ID":"18dd49f1-d2cb-4e7b-b427-971bda666f14","Type":"ContainerDied","Data":"319cc7abdc442387fa721383536d6b4b2212dab24aec428ede031df238824c38"} Oct 11 01:22:12 crc 
kubenswrapper[4743]: I1011 01:22:12.379803 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dq8sg" podUID="ca6876ee-226c-41a3-bdfc-98c31ca57d81" containerName="registry-server" containerID="cri-o://330f4b3fb8a3039496fc770a0511d7a8f3251b6bed3560fa3f8ca6790b839d6f" gracePeriod=2 Oct 11 01:22:13 crc kubenswrapper[4743]: I1011 01:22:13.044375 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dq8sg" Oct 11 01:22:13 crc kubenswrapper[4743]: I1011 01:22:13.053769 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9qtvs" Oct 11 01:22:13 crc kubenswrapper[4743]: I1011 01:22:13.197943 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca6876ee-226c-41a3-bdfc-98c31ca57d81-utilities\") pod \"ca6876ee-226c-41a3-bdfc-98c31ca57d81\" (UID: \"ca6876ee-226c-41a3-bdfc-98c31ca57d81\") " Oct 11 01:22:13 crc kubenswrapper[4743]: I1011 01:22:13.198046 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/18dd49f1-d2cb-4e7b-b427-971bda666f14-ssh-key\") pod \"18dd49f1-d2cb-4e7b-b427-971bda666f14\" (UID: \"18dd49f1-d2cb-4e7b-b427-971bda666f14\") " Oct 11 01:22:13 crc kubenswrapper[4743]: I1011 01:22:13.198066 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca6876ee-226c-41a3-bdfc-98c31ca57d81-catalog-content\") pod \"ca6876ee-226c-41a3-bdfc-98c31ca57d81\" (UID: \"ca6876ee-226c-41a3-bdfc-98c31ca57d81\") " Oct 11 01:22:13 crc kubenswrapper[4743]: I1011 01:22:13.198129 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/18dd49f1-d2cb-4e7b-b427-971bda666f14-inventory\") pod \"18dd49f1-d2cb-4e7b-b427-971bda666f14\" (UID: \"18dd49f1-d2cb-4e7b-b427-971bda666f14\") " Oct 11 01:22:13 crc kubenswrapper[4743]: I1011 01:22:13.198157 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wgdtd\" (UniqueName: \"kubernetes.io/projected/ca6876ee-226c-41a3-bdfc-98c31ca57d81-kube-api-access-wgdtd\") pod \"ca6876ee-226c-41a3-bdfc-98c31ca57d81\" (UID: \"ca6876ee-226c-41a3-bdfc-98c31ca57d81\") " Oct 11 01:22:13 crc kubenswrapper[4743]: I1011 01:22:13.198241 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rflh5\" (UniqueName: \"kubernetes.io/projected/18dd49f1-d2cb-4e7b-b427-971bda666f14-kube-api-access-rflh5\") pod \"18dd49f1-d2cb-4e7b-b427-971bda666f14\" (UID: \"18dd49f1-d2cb-4e7b-b427-971bda666f14\") " Oct 11 01:22:13 crc kubenswrapper[4743]: I1011 01:22:13.198610 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca6876ee-226c-41a3-bdfc-98c31ca57d81-utilities" (OuterVolumeSpecName: "utilities") pod "ca6876ee-226c-41a3-bdfc-98c31ca57d81" (UID: "ca6876ee-226c-41a3-bdfc-98c31ca57d81"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:22:13 crc kubenswrapper[4743]: I1011 01:22:13.200124 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca6876ee-226c-41a3-bdfc-98c31ca57d81-utilities\") on node \"crc\" DevicePath \"\"" Oct 11 01:22:13 crc kubenswrapper[4743]: I1011 01:22:13.210270 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18dd49f1-d2cb-4e7b-b427-971bda666f14-kube-api-access-rflh5" (OuterVolumeSpecName: "kube-api-access-rflh5") pod "18dd49f1-d2cb-4e7b-b427-971bda666f14" (UID: "18dd49f1-d2cb-4e7b-b427-971bda666f14"). InnerVolumeSpecName "kube-api-access-rflh5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:22:13 crc kubenswrapper[4743]: I1011 01:22:13.220745 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca6876ee-226c-41a3-bdfc-98c31ca57d81-kube-api-access-wgdtd" (OuterVolumeSpecName: "kube-api-access-wgdtd") pod "ca6876ee-226c-41a3-bdfc-98c31ca57d81" (UID: "ca6876ee-226c-41a3-bdfc-98c31ca57d81"). InnerVolumeSpecName "kube-api-access-wgdtd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:22:13 crc kubenswrapper[4743]: I1011 01:22:13.235685 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18dd49f1-d2cb-4e7b-b427-971bda666f14-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "18dd49f1-d2cb-4e7b-b427-971bda666f14" (UID: "18dd49f1-d2cb-4e7b-b427-971bda666f14"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:22:13 crc kubenswrapper[4743]: I1011 01:22:13.241004 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18dd49f1-d2cb-4e7b-b427-971bda666f14-inventory" (OuterVolumeSpecName: "inventory") pod "18dd49f1-d2cb-4e7b-b427-971bda666f14" (UID: "18dd49f1-d2cb-4e7b-b427-971bda666f14"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:22:13 crc kubenswrapper[4743]: I1011 01:22:13.260836 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca6876ee-226c-41a3-bdfc-98c31ca57d81-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ca6876ee-226c-41a3-bdfc-98c31ca57d81" (UID: "ca6876ee-226c-41a3-bdfc-98c31ca57d81"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:22:13 crc kubenswrapper[4743]: I1011 01:22:13.302683 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/18dd49f1-d2cb-4e7b-b427-971bda666f14-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 11 01:22:13 crc kubenswrapper[4743]: I1011 01:22:13.303687 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca6876ee-226c-41a3-bdfc-98c31ca57d81-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 11 01:22:13 crc kubenswrapper[4743]: I1011 01:22:13.303773 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/18dd49f1-d2cb-4e7b-b427-971bda666f14-inventory\") on node \"crc\" DevicePath \"\"" Oct 11 01:22:13 crc kubenswrapper[4743]: I1011 01:22:13.303830 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wgdtd\" (UniqueName: \"kubernetes.io/projected/ca6876ee-226c-41a3-bdfc-98c31ca57d81-kube-api-access-wgdtd\") on node \"crc\" DevicePath \"\"" Oct 11 01:22:13 crc kubenswrapper[4743]: I1011 01:22:13.303951 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rflh5\" (UniqueName: \"kubernetes.io/projected/18dd49f1-d2cb-4e7b-b427-971bda666f14-kube-api-access-rflh5\") on node \"crc\" DevicePath \"\"" Oct 11 01:22:13 crc kubenswrapper[4743]: I1011 01:22:13.400593 4743 generic.go:334] "Generic (PLEG): container finished" podID="ca6876ee-226c-41a3-bdfc-98c31ca57d81" containerID="330f4b3fb8a3039496fc770a0511d7a8f3251b6bed3560fa3f8ca6790b839d6f" exitCode=0 Oct 11 01:22:13 crc kubenswrapper[4743]: I1011 01:22:13.400696 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dq8sg" Oct 11 01:22:13 crc kubenswrapper[4743]: I1011 01:22:13.400701 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dq8sg" event={"ID":"ca6876ee-226c-41a3-bdfc-98c31ca57d81","Type":"ContainerDied","Data":"330f4b3fb8a3039496fc770a0511d7a8f3251b6bed3560fa3f8ca6790b839d6f"} Oct 11 01:22:13 crc kubenswrapper[4743]: I1011 01:22:13.400971 4743 scope.go:117] "RemoveContainer" containerID="330f4b3fb8a3039496fc770a0511d7a8f3251b6bed3560fa3f8ca6790b839d6f" Oct 11 01:22:13 crc kubenswrapper[4743]: I1011 01:22:13.402921 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dq8sg" event={"ID":"ca6876ee-226c-41a3-bdfc-98c31ca57d81","Type":"ContainerDied","Data":"e07bf984102a36ce5e9d20f79d4fb217076e7e049e64fe01696579caa1f09584"} Oct 11 01:22:13 crc kubenswrapper[4743]: I1011 01:22:13.415784 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9qtvs" event={"ID":"18dd49f1-d2cb-4e7b-b427-971bda666f14","Type":"ContainerDied","Data":"d2877a6772eb59c85b29003056f035db5d713ac4adeec746228f98972180eddb"} Oct 11 01:22:13 crc kubenswrapper[4743]: I1011 01:22:13.415823 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d2877a6772eb59c85b29003056f035db5d713ac4adeec746228f98972180eddb" Oct 11 01:22:13 crc kubenswrapper[4743]: I1011 01:22:13.415914 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9qtvs" Oct 11 01:22:13 crc kubenswrapper[4743]: I1011 01:22:13.454654 4743 scope.go:117] "RemoveContainer" containerID="cf740d9b27bba63c2d39e22ccbd14c56fda4849d12fa6ee483889e2ed75bb03f" Oct 11 01:22:13 crc kubenswrapper[4743]: I1011 01:22:13.468797 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dq8sg"] Oct 11 01:22:13 crc kubenswrapper[4743]: I1011 01:22:13.482982 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dq8sg"] Oct 11 01:22:13 crc kubenswrapper[4743]: I1011 01:22:13.494751 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-grhxs"] Oct 11 01:22:13 crc kubenswrapper[4743]: E1011 01:22:13.495222 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca6876ee-226c-41a3-bdfc-98c31ca57d81" containerName="extract-utilities" Oct 11 01:22:13 crc kubenswrapper[4743]: I1011 01:22:13.495244 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca6876ee-226c-41a3-bdfc-98c31ca57d81" containerName="extract-utilities" Oct 11 01:22:13 crc kubenswrapper[4743]: E1011 01:22:13.495258 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18dd49f1-d2cb-4e7b-b427-971bda666f14" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 11 01:22:13 crc kubenswrapper[4743]: I1011 01:22:13.495266 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="18dd49f1-d2cb-4e7b-b427-971bda666f14" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 11 01:22:13 crc kubenswrapper[4743]: E1011 01:22:13.495281 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca6876ee-226c-41a3-bdfc-98c31ca57d81" containerName="registry-server" Oct 11 01:22:13 crc kubenswrapper[4743]: I1011 01:22:13.495287 4743 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="ca6876ee-226c-41a3-bdfc-98c31ca57d81" containerName="registry-server" Oct 11 01:22:13 crc kubenswrapper[4743]: E1011 01:22:13.495305 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca6876ee-226c-41a3-bdfc-98c31ca57d81" containerName="extract-content" Oct 11 01:22:13 crc kubenswrapper[4743]: I1011 01:22:13.495310 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca6876ee-226c-41a3-bdfc-98c31ca57d81" containerName="extract-content" Oct 11 01:22:13 crc kubenswrapper[4743]: I1011 01:22:13.495495 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="18dd49f1-d2cb-4e7b-b427-971bda666f14" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 11 01:22:13 crc kubenswrapper[4743]: I1011 01:22:13.495525 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca6876ee-226c-41a3-bdfc-98c31ca57d81" containerName="registry-server" Oct 11 01:22:13 crc kubenswrapper[4743]: I1011 01:22:13.496224 4743 scope.go:117] "RemoveContainer" containerID="6060881fc7e7a4b9aaf0ef88837ada438c0c1039bc119f5bf41a2e1dbc384e26" Oct 11 01:22:13 crc kubenswrapper[4743]: I1011 01:22:13.496393 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-grhxs" Oct 11 01:22:13 crc kubenswrapper[4743]: I1011 01:22:13.500602 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8vmmn" Oct 11 01:22:13 crc kubenswrapper[4743]: I1011 01:22:13.500748 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 11 01:22:13 crc kubenswrapper[4743]: I1011 01:22:13.500833 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 11 01:22:13 crc kubenswrapper[4743]: I1011 01:22:13.502528 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 11 01:22:13 crc kubenswrapper[4743]: I1011 01:22:13.506416 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-grhxs"] Oct 11 01:22:13 crc kubenswrapper[4743]: I1011 01:22:13.558780 4743 scope.go:117] "RemoveContainer" containerID="330f4b3fb8a3039496fc770a0511d7a8f3251b6bed3560fa3f8ca6790b839d6f" Oct 11 01:22:13 crc kubenswrapper[4743]: E1011 01:22:13.559310 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"330f4b3fb8a3039496fc770a0511d7a8f3251b6bed3560fa3f8ca6790b839d6f\": container with ID starting with 330f4b3fb8a3039496fc770a0511d7a8f3251b6bed3560fa3f8ca6790b839d6f not found: ID does not exist" containerID="330f4b3fb8a3039496fc770a0511d7a8f3251b6bed3560fa3f8ca6790b839d6f" Oct 11 01:22:13 crc kubenswrapper[4743]: I1011 01:22:13.559426 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"330f4b3fb8a3039496fc770a0511d7a8f3251b6bed3560fa3f8ca6790b839d6f"} err="failed to get container status \"330f4b3fb8a3039496fc770a0511d7a8f3251b6bed3560fa3f8ca6790b839d6f\": rpc 
error: code = NotFound desc = could not find container \"330f4b3fb8a3039496fc770a0511d7a8f3251b6bed3560fa3f8ca6790b839d6f\": container with ID starting with 330f4b3fb8a3039496fc770a0511d7a8f3251b6bed3560fa3f8ca6790b839d6f not found: ID does not exist" Oct 11 01:22:13 crc kubenswrapper[4743]: I1011 01:22:13.559511 4743 scope.go:117] "RemoveContainer" containerID="cf740d9b27bba63c2d39e22ccbd14c56fda4849d12fa6ee483889e2ed75bb03f" Oct 11 01:22:13 crc kubenswrapper[4743]: E1011 01:22:13.560024 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf740d9b27bba63c2d39e22ccbd14c56fda4849d12fa6ee483889e2ed75bb03f\": container with ID starting with cf740d9b27bba63c2d39e22ccbd14c56fda4849d12fa6ee483889e2ed75bb03f not found: ID does not exist" containerID="cf740d9b27bba63c2d39e22ccbd14c56fda4849d12fa6ee483889e2ed75bb03f" Oct 11 01:22:13 crc kubenswrapper[4743]: I1011 01:22:13.560103 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf740d9b27bba63c2d39e22ccbd14c56fda4849d12fa6ee483889e2ed75bb03f"} err="failed to get container status \"cf740d9b27bba63c2d39e22ccbd14c56fda4849d12fa6ee483889e2ed75bb03f\": rpc error: code = NotFound desc = could not find container \"cf740d9b27bba63c2d39e22ccbd14c56fda4849d12fa6ee483889e2ed75bb03f\": container with ID starting with cf740d9b27bba63c2d39e22ccbd14c56fda4849d12fa6ee483889e2ed75bb03f not found: ID does not exist" Oct 11 01:22:13 crc kubenswrapper[4743]: I1011 01:22:13.560169 4743 scope.go:117] "RemoveContainer" containerID="6060881fc7e7a4b9aaf0ef88837ada438c0c1039bc119f5bf41a2e1dbc384e26" Oct 11 01:22:13 crc kubenswrapper[4743]: E1011 01:22:13.560504 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6060881fc7e7a4b9aaf0ef88837ada438c0c1039bc119f5bf41a2e1dbc384e26\": container with ID starting with 
6060881fc7e7a4b9aaf0ef88837ada438c0c1039bc119f5bf41a2e1dbc384e26 not found: ID does not exist" containerID="6060881fc7e7a4b9aaf0ef88837ada438c0c1039bc119f5bf41a2e1dbc384e26" Oct 11 01:22:13 crc kubenswrapper[4743]: I1011 01:22:13.560542 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6060881fc7e7a4b9aaf0ef88837ada438c0c1039bc119f5bf41a2e1dbc384e26"} err="failed to get container status \"6060881fc7e7a4b9aaf0ef88837ada438c0c1039bc119f5bf41a2e1dbc384e26\": rpc error: code = NotFound desc = could not find container \"6060881fc7e7a4b9aaf0ef88837ada438c0c1039bc119f5bf41a2e1dbc384e26\": container with ID starting with 6060881fc7e7a4b9aaf0ef88837ada438c0c1039bc119f5bf41a2e1dbc384e26 not found: ID does not exist" Oct 11 01:22:13 crc kubenswrapper[4743]: I1011 01:22:13.612429 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f766e457-c9a8-465e-b746-e9ef3bba860f-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-grhxs\" (UID: \"f766e457-c9a8-465e-b746-e9ef3bba860f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-grhxs" Oct 11 01:22:13 crc kubenswrapper[4743]: I1011 01:22:13.612634 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f766e457-c9a8-465e-b746-e9ef3bba860f-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-grhxs\" (UID: \"f766e457-c9a8-465e-b746-e9ef3bba860f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-grhxs" Oct 11 01:22:13 crc kubenswrapper[4743]: I1011 01:22:13.613467 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kx2gr\" (UniqueName: \"kubernetes.io/projected/f766e457-c9a8-465e-b746-e9ef3bba860f-kube-api-access-kx2gr\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-grhxs\" (UID: \"f766e457-c9a8-465e-b746-e9ef3bba860f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-grhxs" Oct 11 01:22:13 crc kubenswrapper[4743]: I1011 01:22:13.715955 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f766e457-c9a8-465e-b746-e9ef3bba860f-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-grhxs\" (UID: \"f766e457-c9a8-465e-b746-e9ef3bba860f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-grhxs" Oct 11 01:22:13 crc kubenswrapper[4743]: I1011 01:22:13.716045 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f766e457-c9a8-465e-b746-e9ef3bba860f-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-grhxs\" (UID: \"f766e457-c9a8-465e-b746-e9ef3bba860f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-grhxs" Oct 11 01:22:13 crc kubenswrapper[4743]: I1011 01:22:13.716132 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kx2gr\" (UniqueName: \"kubernetes.io/projected/f766e457-c9a8-465e-b746-e9ef3bba860f-kube-api-access-kx2gr\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-grhxs\" (UID: \"f766e457-c9a8-465e-b746-e9ef3bba860f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-grhxs" Oct 11 01:22:13 crc kubenswrapper[4743]: I1011 01:22:13.719596 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f766e457-c9a8-465e-b746-e9ef3bba860f-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-grhxs\" (UID: \"f766e457-c9a8-465e-b746-e9ef3bba860f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-grhxs" Oct 11 01:22:13 crc kubenswrapper[4743]: I1011 
01:22:13.719953 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f766e457-c9a8-465e-b746-e9ef3bba860f-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-grhxs\" (UID: \"f766e457-c9a8-465e-b746-e9ef3bba860f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-grhxs" Oct 11 01:22:13 crc kubenswrapper[4743]: I1011 01:22:13.733819 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kx2gr\" (UniqueName: \"kubernetes.io/projected/f766e457-c9a8-465e-b746-e9ef3bba860f-kube-api-access-kx2gr\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-grhxs\" (UID: \"f766e457-c9a8-465e-b746-e9ef3bba860f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-grhxs" Oct 11 01:22:13 crc kubenswrapper[4743]: I1011 01:22:13.903361 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-grhxs" Oct 11 01:22:14 crc kubenswrapper[4743]: I1011 01:22:14.110175 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca6876ee-226c-41a3-bdfc-98c31ca57d81" path="/var/lib/kubelet/pods/ca6876ee-226c-41a3-bdfc-98c31ca57d81/volumes" Oct 11 01:22:14 crc kubenswrapper[4743]: I1011 01:22:14.483397 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-grhxs"] Oct 11 01:22:15 crc kubenswrapper[4743]: I1011 01:22:15.442350 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-grhxs" event={"ID":"f766e457-c9a8-465e-b746-e9ef3bba860f","Type":"ContainerStarted","Data":"b543d07a48df3636a760ba323820411308602c28c463342486510a0946d99a1c"} Oct 11 01:22:15 crc kubenswrapper[4743]: I1011 01:22:15.442850 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-grhxs" event={"ID":"f766e457-c9a8-465e-b746-e9ef3bba860f","Type":"ContainerStarted","Data":"3fce562ce127cebe1b66bd729b4b379e04d3e9ec122be2a38cf8d41981f0f9f2"} Oct 11 01:22:15 crc kubenswrapper[4743]: I1011 01:22:15.472382 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-grhxs" podStartSLOduration=1.963692584 podStartE2EDuration="2.472361067s" podCreationTimestamp="2025-10-11 01:22:13 +0000 UTC" firstStartedPulling="2025-10-11 01:22:14.488421573 +0000 UTC m=+1829.141401970" lastFinishedPulling="2025-10-11 01:22:14.997090046 +0000 UTC m=+1829.650070453" observedRunningTime="2025-10-11 01:22:15.460044889 +0000 UTC m=+1830.113025316" watchObservedRunningTime="2025-10-11 01:22:15.472361067 +0000 UTC m=+1830.125341464" Oct 11 01:22:20 crc kubenswrapper[4743]: I1011 01:22:20.064601 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-xq7tg"] Oct 11 01:22:20 crc kubenswrapper[4743]: I1011 01:22:20.085663 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-xq7tg"] Oct 11 01:22:20 crc kubenswrapper[4743]: I1011 01:22:20.108364 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bca21196-09b5-4f5a-b690-19afdf3318b4" path="/var/lib/kubelet/pods/bca21196-09b5-4f5a-b690-19afdf3318b4/volumes" Oct 11 01:22:21 crc kubenswrapper[4743]: I1011 01:22:21.507827 4743 generic.go:334] "Generic (PLEG): container finished" podID="f766e457-c9a8-465e-b746-e9ef3bba860f" containerID="b543d07a48df3636a760ba323820411308602c28c463342486510a0946d99a1c" exitCode=0 Oct 11 01:22:21 crc kubenswrapper[4743]: I1011 01:22:21.507913 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-grhxs" 
event={"ID":"f766e457-c9a8-465e-b746-e9ef3bba860f","Type":"ContainerDied","Data":"b543d07a48df3636a760ba323820411308602c28c463342486510a0946d99a1c"} Oct 11 01:22:22 crc kubenswrapper[4743]: I1011 01:22:22.029310 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-s49bx"] Oct 11 01:22:22 crc kubenswrapper[4743]: I1011 01:22:22.038408 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-s49bx"] Oct 11 01:22:22 crc kubenswrapper[4743]: I1011 01:22:22.113196 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f85e310-6acf-40d8-b07d-9e1c9b4d997b" path="/var/lib/kubelet/pods/2f85e310-6acf-40d8-b07d-9e1c9b4d997b/volumes" Oct 11 01:22:23 crc kubenswrapper[4743]: I1011 01:22:23.095149 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-grhxs" Oct 11 01:22:23 crc kubenswrapper[4743]: I1011 01:22:23.225479 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f766e457-c9a8-465e-b746-e9ef3bba860f-ssh-key\") pod \"f766e457-c9a8-465e-b746-e9ef3bba860f\" (UID: \"f766e457-c9a8-465e-b746-e9ef3bba860f\") " Oct 11 01:22:23 crc kubenswrapper[4743]: I1011 01:22:23.225768 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f766e457-c9a8-465e-b746-e9ef3bba860f-inventory\") pod \"f766e457-c9a8-465e-b746-e9ef3bba860f\" (UID: \"f766e457-c9a8-465e-b746-e9ef3bba860f\") " Oct 11 01:22:23 crc kubenswrapper[4743]: I1011 01:22:23.225961 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kx2gr\" (UniqueName: \"kubernetes.io/projected/f766e457-c9a8-465e-b746-e9ef3bba860f-kube-api-access-kx2gr\") pod \"f766e457-c9a8-465e-b746-e9ef3bba860f\" (UID: \"f766e457-c9a8-465e-b746-e9ef3bba860f\") " Oct 11 01:22:23 crc 
kubenswrapper[4743]: I1011 01:22:23.236543 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f766e457-c9a8-465e-b746-e9ef3bba860f-kube-api-access-kx2gr" (OuterVolumeSpecName: "kube-api-access-kx2gr") pod "f766e457-c9a8-465e-b746-e9ef3bba860f" (UID: "f766e457-c9a8-465e-b746-e9ef3bba860f"). InnerVolumeSpecName "kube-api-access-kx2gr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:22:23 crc kubenswrapper[4743]: I1011 01:22:23.259096 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f766e457-c9a8-465e-b746-e9ef3bba860f-inventory" (OuterVolumeSpecName: "inventory") pod "f766e457-c9a8-465e-b746-e9ef3bba860f" (UID: "f766e457-c9a8-465e-b746-e9ef3bba860f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:22:23 crc kubenswrapper[4743]: I1011 01:22:23.262228 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f766e457-c9a8-465e-b746-e9ef3bba860f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f766e457-c9a8-465e-b746-e9ef3bba860f" (UID: "f766e457-c9a8-465e-b746-e9ef3bba860f"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:22:23 crc kubenswrapper[4743]: I1011 01:22:23.329115 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kx2gr\" (UniqueName: \"kubernetes.io/projected/f766e457-c9a8-465e-b746-e9ef3bba860f-kube-api-access-kx2gr\") on node \"crc\" DevicePath \"\"" Oct 11 01:22:23 crc kubenswrapper[4743]: I1011 01:22:23.329148 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f766e457-c9a8-465e-b746-e9ef3bba860f-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 11 01:22:23 crc kubenswrapper[4743]: I1011 01:22:23.329163 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f766e457-c9a8-465e-b746-e9ef3bba860f-inventory\") on node \"crc\" DevicePath \"\"" Oct 11 01:22:23 crc kubenswrapper[4743]: I1011 01:22:23.532939 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-grhxs" event={"ID":"f766e457-c9a8-465e-b746-e9ef3bba860f","Type":"ContainerDied","Data":"3fce562ce127cebe1b66bd729b4b379e04d3e9ec122be2a38cf8d41981f0f9f2"} Oct 11 01:22:23 crc kubenswrapper[4743]: I1011 01:22:23.532985 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3fce562ce127cebe1b66bd729b4b379e04d3e9ec122be2a38cf8d41981f0f9f2" Oct 11 01:22:23 crc kubenswrapper[4743]: I1011 01:22:23.533046 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-grhxs" Oct 11 01:22:23 crc kubenswrapper[4743]: I1011 01:22:23.608901 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-gbggg"] Oct 11 01:22:23 crc kubenswrapper[4743]: E1011 01:22:23.610595 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f766e457-c9a8-465e-b746-e9ef3bba860f" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 11 01:22:23 crc kubenswrapper[4743]: I1011 01:22:23.610631 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f766e457-c9a8-465e-b746-e9ef3bba860f" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 11 01:22:23 crc kubenswrapper[4743]: I1011 01:22:23.626717 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="f766e457-c9a8-465e-b746-e9ef3bba860f" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 11 01:22:23 crc kubenswrapper[4743]: I1011 01:22:23.628273 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gbggg" Oct 11 01:22:23 crc kubenswrapper[4743]: I1011 01:22:23.631439 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 11 01:22:23 crc kubenswrapper[4743]: I1011 01:22:23.631723 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 11 01:22:23 crc kubenswrapper[4743]: I1011 01:22:23.631757 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 11 01:22:23 crc kubenswrapper[4743]: I1011 01:22:23.634594 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8vmmn" Oct 11 01:22:23 crc kubenswrapper[4743]: I1011 01:22:23.676556 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-gbggg"] Oct 11 01:22:23 crc kubenswrapper[4743]: I1011 01:22:23.740745 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h84jv\" (UniqueName: \"kubernetes.io/projected/bebacb8f-bd48-4082-ac2c-80875645f5bf-kube-api-access-h84jv\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-gbggg\" (UID: \"bebacb8f-bd48-4082-ac2c-80875645f5bf\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gbggg" Oct 11 01:22:23 crc kubenswrapper[4743]: I1011 01:22:23.740972 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bebacb8f-bd48-4082-ac2c-80875645f5bf-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-gbggg\" (UID: \"bebacb8f-bd48-4082-ac2c-80875645f5bf\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gbggg" Oct 11 01:22:23 crc kubenswrapper[4743]: I1011 01:22:23.741022 4743 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bebacb8f-bd48-4082-ac2c-80875645f5bf-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-gbggg\" (UID: \"bebacb8f-bd48-4082-ac2c-80875645f5bf\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gbggg" Oct 11 01:22:23 crc kubenswrapper[4743]: I1011 01:22:23.843621 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h84jv\" (UniqueName: \"kubernetes.io/projected/bebacb8f-bd48-4082-ac2c-80875645f5bf-kube-api-access-h84jv\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-gbggg\" (UID: \"bebacb8f-bd48-4082-ac2c-80875645f5bf\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gbggg" Oct 11 01:22:23 crc kubenswrapper[4743]: I1011 01:22:23.843703 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bebacb8f-bd48-4082-ac2c-80875645f5bf-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-gbggg\" (UID: \"bebacb8f-bd48-4082-ac2c-80875645f5bf\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gbggg" Oct 11 01:22:23 crc kubenswrapper[4743]: I1011 01:22:23.843729 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bebacb8f-bd48-4082-ac2c-80875645f5bf-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-gbggg\" (UID: \"bebacb8f-bd48-4082-ac2c-80875645f5bf\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gbggg" Oct 11 01:22:23 crc kubenswrapper[4743]: I1011 01:22:23.847144 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bebacb8f-bd48-4082-ac2c-80875645f5bf-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-gbggg\" (UID: 
\"bebacb8f-bd48-4082-ac2c-80875645f5bf\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gbggg" Oct 11 01:22:23 crc kubenswrapper[4743]: I1011 01:22:23.847456 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bebacb8f-bd48-4082-ac2c-80875645f5bf-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-gbggg\" (UID: \"bebacb8f-bd48-4082-ac2c-80875645f5bf\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gbggg" Oct 11 01:22:23 crc kubenswrapper[4743]: I1011 01:22:23.859796 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h84jv\" (UniqueName: \"kubernetes.io/projected/bebacb8f-bd48-4082-ac2c-80875645f5bf-kube-api-access-h84jv\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-gbggg\" (UID: \"bebacb8f-bd48-4082-ac2c-80875645f5bf\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gbggg" Oct 11 01:22:23 crc kubenswrapper[4743]: I1011 01:22:23.962202 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gbggg" Oct 11 01:22:24 crc kubenswrapper[4743]: I1011 01:22:24.498237 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-gbggg"] Oct 11 01:22:24 crc kubenswrapper[4743]: I1011 01:22:24.544099 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gbggg" event={"ID":"bebacb8f-bd48-4082-ac2c-80875645f5bf","Type":"ContainerStarted","Data":"0aecacc9d092241fa3ed1db6e9769e09584bdf6077c82ac1b42ab2ca6b4156cf"} Oct 11 01:22:25 crc kubenswrapper[4743]: I1011 01:22:25.556692 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gbggg" event={"ID":"bebacb8f-bd48-4082-ac2c-80875645f5bf","Type":"ContainerStarted","Data":"aeaf8587b8db64a255c5b029166a9362a4051b83cce8148721465e9e03f6e91e"} Oct 11 01:22:25 crc kubenswrapper[4743]: I1011 01:22:25.577384 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gbggg" podStartSLOduration=2.141679952 podStartE2EDuration="2.577366044s" podCreationTimestamp="2025-10-11 01:22:23 +0000 UTC" firstStartedPulling="2025-10-11 01:22:24.518084003 +0000 UTC m=+1839.171064400" lastFinishedPulling="2025-10-11 01:22:24.953770105 +0000 UTC m=+1839.606750492" observedRunningTime="2025-10-11 01:22:25.57265449 +0000 UTC m=+1840.225634897" watchObservedRunningTime="2025-10-11 01:22:25.577366044 +0000 UTC m=+1840.230346441" Oct 11 01:22:43 crc kubenswrapper[4743]: I1011 01:22:43.043080 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-6pzxd"] Oct 11 01:22:43 crc kubenswrapper[4743]: I1011 01:22:43.051317 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-6pzxd"] Oct 11 01:22:44 crc kubenswrapper[4743]: I1011 01:22:44.103143 4743 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="615827fb-c1f4-46c6-8014-00c71fe2403b" path="/var/lib/kubelet/pods/615827fb-c1f4-46c6-8014-00c71fe2403b/volumes" Oct 11 01:22:45 crc kubenswrapper[4743]: I1011 01:22:45.056387 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-przwv"] Oct 11 01:22:45 crc kubenswrapper[4743]: I1011 01:22:45.072760 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-przwv"] Oct 11 01:22:46 crc kubenswrapper[4743]: I1011 01:22:46.102509 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65f773e4-09e5-4312-8c6b-9176a1a022f0" path="/var/lib/kubelet/pods/65f773e4-09e5-4312-8c6b-9176a1a022f0/volumes" Oct 11 01:22:50 crc kubenswrapper[4743]: I1011 01:22:50.690241 4743 scope.go:117] "RemoveContainer" containerID="36d04f245bfd66df2b42f8088c0fe20486a083c7fdf3628d1cf0ed6e4a7b8686" Oct 11 01:22:50 crc kubenswrapper[4743]: I1011 01:22:50.732161 4743 scope.go:117] "RemoveContainer" containerID="2bfe9875a178be1418db1466a53073392e86b327aca29ae3ce6d9aa6667e8240" Oct 11 01:22:50 crc kubenswrapper[4743]: I1011 01:22:50.834572 4743 scope.go:117] "RemoveContainer" containerID="ea0d864863ed6198fb923dbfb39f2a713c60ca3cab3a50f8f2e3148ceea188ea" Oct 11 01:22:50 crc kubenswrapper[4743]: I1011 01:22:50.873274 4743 scope.go:117] "RemoveContainer" containerID="4e1f81d4023643285b93e504f933c75d55c24ea4c3f44ac4cf984109a8b538ed" Oct 11 01:22:50 crc kubenswrapper[4743]: I1011 01:22:50.901449 4743 scope.go:117] "RemoveContainer" containerID="ddf99e43c15d33e7535c0a343463db45039ea0315650d5d7bfd64c5d18796d4a" Oct 11 01:22:50 crc kubenswrapper[4743]: I1011 01:22:50.959529 4743 scope.go:117] "RemoveContainer" containerID="bcf05e2bc0f024539c7285640e08713d28742c1031c605bd17a58f39ab373328" Oct 11 01:22:51 crc kubenswrapper[4743]: I1011 01:22:51.006381 4743 scope.go:117] "RemoveContainer" containerID="15eb13889fe265ecb53a8acb3699eb24d0795d5cfaded1b59312c356d6249424" 
Oct 11 01:22:51 crc kubenswrapper[4743]: I1011 01:22:51.052205 4743 scope.go:117] "RemoveContainer" containerID="fe4f06c5afe259bd4b23460bfc524174724ed17a5da1b63ebc03431735a3daca"
Oct 11 01:22:51 crc kubenswrapper[4743]: I1011 01:22:51.088332 4743 scope.go:117] "RemoveContainer" containerID="950dcfc39960b21ca647842c438cad4ee651fddcfadb031860c91b04c2afbe16"
Oct 11 01:23:04 crc kubenswrapper[4743]: I1011 01:23:04.064645 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-h4nh5"]
Oct 11 01:23:04 crc kubenswrapper[4743]: I1011 01:23:04.077429 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-h4nh5"]
Oct 11 01:23:04 crc kubenswrapper[4743]: I1011 01:23:04.115797 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41e4f286-9bff-400a-9604-81e12333eb6c" path="/var/lib/kubelet/pods/41e4f286-9bff-400a-9604-81e12333eb6c/volumes"
Oct 11 01:23:05 crc kubenswrapper[4743]: I1011 01:23:05.055332 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-dhxf8"]
Oct 11 01:23:05 crc kubenswrapper[4743]: I1011 01:23:05.065913 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-q9wxz"]
Oct 11 01:23:05 crc kubenswrapper[4743]: I1011 01:23:05.075982 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-q9wxz"]
Oct 11 01:23:05 crc kubenswrapper[4743]: I1011 01:23:05.084028 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-dhxf8"]
Oct 11 01:23:06 crc kubenswrapper[4743]: I1011 01:23:06.117092 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43d98f64-a3a8-4260-94e4-565c740912d9" path="/var/lib/kubelet/pods/43d98f64-a3a8-4260-94e4-565c740912d9/volumes"
Oct 11 01:23:06 crc kubenswrapper[4743]: I1011 01:23:06.119297 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4193e99-7b11-4285-86c4-7fe1689e4aaa" path="/var/lib/kubelet/pods/d4193e99-7b11-4285-86c4-7fe1689e4aaa/volumes"
Oct 11 01:23:08 crc kubenswrapper[4743]: I1011 01:23:08.071174 4743 generic.go:334] "Generic (PLEG): container finished" podID="bebacb8f-bd48-4082-ac2c-80875645f5bf" containerID="aeaf8587b8db64a255c5b029166a9362a4051b83cce8148721465e9e03f6e91e" exitCode=0
Oct 11 01:23:08 crc kubenswrapper[4743]: I1011 01:23:08.071353 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gbggg" event={"ID":"bebacb8f-bd48-4082-ac2c-80875645f5bf","Type":"ContainerDied","Data":"aeaf8587b8db64a255c5b029166a9362a4051b83cce8148721465e9e03f6e91e"}
Oct 11 01:23:09 crc kubenswrapper[4743]: I1011 01:23:09.632437 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gbggg"
Oct 11 01:23:09 crc kubenswrapper[4743]: I1011 01:23:09.800842 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bebacb8f-bd48-4082-ac2c-80875645f5bf-inventory\") pod \"bebacb8f-bd48-4082-ac2c-80875645f5bf\" (UID: \"bebacb8f-bd48-4082-ac2c-80875645f5bf\") "
Oct 11 01:23:09 crc kubenswrapper[4743]: I1011 01:23:09.801434 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h84jv\" (UniqueName: \"kubernetes.io/projected/bebacb8f-bd48-4082-ac2c-80875645f5bf-kube-api-access-h84jv\") pod \"bebacb8f-bd48-4082-ac2c-80875645f5bf\" (UID: \"bebacb8f-bd48-4082-ac2c-80875645f5bf\") "
Oct 11 01:23:09 crc kubenswrapper[4743]: I1011 01:23:09.801499 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bebacb8f-bd48-4082-ac2c-80875645f5bf-ssh-key\") pod \"bebacb8f-bd48-4082-ac2c-80875645f5bf\" (UID: \"bebacb8f-bd48-4082-ac2c-80875645f5bf\") "
Oct 11 01:23:09 crc kubenswrapper[4743]: I1011 01:23:09.812527 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bebacb8f-bd48-4082-ac2c-80875645f5bf-kube-api-access-h84jv" (OuterVolumeSpecName: "kube-api-access-h84jv") pod "bebacb8f-bd48-4082-ac2c-80875645f5bf" (UID: "bebacb8f-bd48-4082-ac2c-80875645f5bf"). InnerVolumeSpecName "kube-api-access-h84jv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 11 01:23:09 crc kubenswrapper[4743]: I1011 01:23:09.840445 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bebacb8f-bd48-4082-ac2c-80875645f5bf-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "bebacb8f-bd48-4082-ac2c-80875645f5bf" (UID: "bebacb8f-bd48-4082-ac2c-80875645f5bf"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 11 01:23:09 crc kubenswrapper[4743]: I1011 01:23:09.845459 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bebacb8f-bd48-4082-ac2c-80875645f5bf-inventory" (OuterVolumeSpecName: "inventory") pod "bebacb8f-bd48-4082-ac2c-80875645f5bf" (UID: "bebacb8f-bd48-4082-ac2c-80875645f5bf"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 11 01:23:09 crc kubenswrapper[4743]: I1011 01:23:09.904443 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bebacb8f-bd48-4082-ac2c-80875645f5bf-inventory\") on node \"crc\" DevicePath \"\""
Oct 11 01:23:09 crc kubenswrapper[4743]: I1011 01:23:09.904482 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h84jv\" (UniqueName: \"kubernetes.io/projected/bebacb8f-bd48-4082-ac2c-80875645f5bf-kube-api-access-h84jv\") on node \"crc\" DevicePath \"\""
Oct 11 01:23:09 crc kubenswrapper[4743]: I1011 01:23:09.904519 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bebacb8f-bd48-4082-ac2c-80875645f5bf-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 11 01:23:10 crc kubenswrapper[4743]: I1011 01:23:10.101194 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gbggg"
Oct 11 01:23:10 crc kubenswrapper[4743]: I1011 01:23:10.109491 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gbggg" event={"ID":"bebacb8f-bd48-4082-ac2c-80875645f5bf","Type":"ContainerDied","Data":"0aecacc9d092241fa3ed1db6e9769e09584bdf6077c82ac1b42ab2ca6b4156cf"}
Oct 11 01:23:10 crc kubenswrapper[4743]: I1011 01:23:10.109911 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0aecacc9d092241fa3ed1db6e9769e09584bdf6077c82ac1b42ab2ca6b4156cf"
Oct 11 01:23:10 crc kubenswrapper[4743]: I1011 01:23:10.232287 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4v975"]
Oct 11 01:23:10 crc kubenswrapper[4743]: E1011 01:23:10.233248 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bebacb8f-bd48-4082-ac2c-80875645f5bf" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Oct 11 01:23:10 crc kubenswrapper[4743]: I1011 01:23:10.233465 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="bebacb8f-bd48-4082-ac2c-80875645f5bf" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Oct 11 01:23:10 crc kubenswrapper[4743]: I1011 01:23:10.234027 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="bebacb8f-bd48-4082-ac2c-80875645f5bf" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Oct 11 01:23:10 crc kubenswrapper[4743]: I1011 01:23:10.235681 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4v975"
Oct 11 01:23:10 crc kubenswrapper[4743]: I1011 01:23:10.239637 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Oct 11 01:23:10 crc kubenswrapper[4743]: I1011 01:23:10.239996 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Oct 11 01:23:10 crc kubenswrapper[4743]: I1011 01:23:10.240421 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 11 01:23:10 crc kubenswrapper[4743]: I1011 01:23:10.240681 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8vmmn"
Oct 11 01:23:10 crc kubenswrapper[4743]: I1011 01:23:10.269015 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4v975"]
Oct 11 01:23:10 crc kubenswrapper[4743]: I1011 01:23:10.417957 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/947f995f-28b3-4fbb-8ade-fa778c7fe05a-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4v975\" (UID: \"947f995f-28b3-4fbb-8ade-fa778c7fe05a\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4v975"
Oct 11 01:23:10 crc kubenswrapper[4743]: I1011 01:23:10.418260 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/947f995f-28b3-4fbb-8ade-fa778c7fe05a-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4v975\" (UID: \"947f995f-28b3-4fbb-8ade-fa778c7fe05a\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4v975"
Oct 11 01:23:10 crc kubenswrapper[4743]: I1011 01:23:10.418371 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdhx8\" (UniqueName: \"kubernetes.io/projected/947f995f-28b3-4fbb-8ade-fa778c7fe05a-kube-api-access-qdhx8\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4v975\" (UID: \"947f995f-28b3-4fbb-8ade-fa778c7fe05a\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4v975"
Oct 11 01:23:10 crc kubenswrapper[4743]: I1011 01:23:10.520196 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/947f995f-28b3-4fbb-8ade-fa778c7fe05a-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4v975\" (UID: \"947f995f-28b3-4fbb-8ade-fa778c7fe05a\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4v975"
Oct 11 01:23:10 crc kubenswrapper[4743]: I1011 01:23:10.520297 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/947f995f-28b3-4fbb-8ade-fa778c7fe05a-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4v975\" (UID: \"947f995f-28b3-4fbb-8ade-fa778c7fe05a\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4v975"
Oct 11 01:23:10 crc kubenswrapper[4743]: I1011 01:23:10.520351 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdhx8\" (UniqueName: \"kubernetes.io/projected/947f995f-28b3-4fbb-8ade-fa778c7fe05a-kube-api-access-qdhx8\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4v975\" (UID: \"947f995f-28b3-4fbb-8ade-fa778c7fe05a\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4v975"
Oct 11 01:23:10 crc kubenswrapper[4743]: I1011 01:23:10.539713 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/947f995f-28b3-4fbb-8ade-fa778c7fe05a-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4v975\" (UID: \"947f995f-28b3-4fbb-8ade-fa778c7fe05a\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4v975"
Oct 11 01:23:10 crc kubenswrapper[4743]: I1011 01:23:10.539757 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/947f995f-28b3-4fbb-8ade-fa778c7fe05a-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4v975\" (UID: \"947f995f-28b3-4fbb-8ade-fa778c7fe05a\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4v975"
Oct 11 01:23:10 crc kubenswrapper[4743]: I1011 01:23:10.542326 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdhx8\" (UniqueName: \"kubernetes.io/projected/947f995f-28b3-4fbb-8ade-fa778c7fe05a-kube-api-access-qdhx8\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4v975\" (UID: \"947f995f-28b3-4fbb-8ade-fa778c7fe05a\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4v975"
Oct 11 01:23:10 crc kubenswrapper[4743]: I1011 01:23:10.558224 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4v975"
Oct 11 01:23:11 crc kubenswrapper[4743]: I1011 01:23:11.136797 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4v975"]
Oct 11 01:23:12 crc kubenswrapper[4743]: I1011 01:23:12.127030 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4v975" event={"ID":"947f995f-28b3-4fbb-8ade-fa778c7fe05a","Type":"ContainerStarted","Data":"5e69a52f2f2fcdef405ba2a5f48463835b5e6ecfab78a93354182fdc6b69905f"}
Oct 11 01:23:12 crc kubenswrapper[4743]: I1011 01:23:12.127699 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4v975" event={"ID":"947f995f-28b3-4fbb-8ade-fa778c7fe05a","Type":"ContainerStarted","Data":"11874c0bf05a7b2870d3a56ccee5f9714d3bd0697eaa22f0a836fd6fcc6ae1fe"}
Oct 11 01:23:12 crc kubenswrapper[4743]: I1011 01:23:12.150269 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4v975" podStartSLOduration=1.7240331800000002 podStartE2EDuration="2.150240065s" podCreationTimestamp="2025-10-11 01:23:10 +0000 UTC" firstStartedPulling="2025-10-11 01:23:11.146284419 +0000 UTC m=+1885.799264816" lastFinishedPulling="2025-10-11 01:23:11.572491284 +0000 UTC m=+1886.225471701" observedRunningTime="2025-10-11 01:23:12.144319564 +0000 UTC m=+1886.797299991" watchObservedRunningTime="2025-10-11 01:23:12.150240065 +0000 UTC m=+1886.803220472"
Oct 11 01:23:15 crc kubenswrapper[4743]: I1011 01:23:15.048396 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-c102-account-create-rcsw5"]
Oct 11 01:23:15 crc kubenswrapper[4743]: I1011 01:23:15.058409 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-bf85-account-create-9q879"]
Oct 11 01:23:15 crc kubenswrapper[4743]: I1011 01:23:15.066541 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-4caf-account-create-6hbb4"]
Oct 11 01:23:15 crc kubenswrapper[4743]: I1011 01:23:15.078893 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-c102-account-create-rcsw5"]
Oct 11 01:23:15 crc kubenswrapper[4743]: I1011 01:23:15.086377 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-bf85-account-create-9q879"]
Oct 11 01:23:15 crc kubenswrapper[4743]: I1011 01:23:15.093408 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-4caf-account-create-6hbb4"]
Oct 11 01:23:16 crc kubenswrapper[4743]: I1011 01:23:16.106425 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e22d773-b000-4dea-aa30-1134c593e7cc" path="/var/lib/kubelet/pods/0e22d773-b000-4dea-aa30-1134c593e7cc/volumes"
Oct 11 01:23:16 crc kubenswrapper[4743]: I1011 01:23:16.108588 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b83d6968-cf34-4111-b66c-f2de3eb8abce" path="/var/lib/kubelet/pods/b83d6968-cf34-4111-b66c-f2de3eb8abce/volumes"
Oct 11 01:23:16 crc kubenswrapper[4743]: I1011 01:23:16.109460 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be9b0c9b-a917-422d-b3aa-c9011eda53c9" path="/var/lib/kubelet/pods/be9b0c9b-a917-422d-b3aa-c9011eda53c9/volumes"
Oct 11 01:23:17 crc kubenswrapper[4743]: I1011 01:23:17.180024 4743 generic.go:334] "Generic (PLEG): container finished" podID="947f995f-28b3-4fbb-8ade-fa778c7fe05a" containerID="5e69a52f2f2fcdef405ba2a5f48463835b5e6ecfab78a93354182fdc6b69905f" exitCode=0
Oct 11 01:23:17 crc kubenswrapper[4743]: I1011 01:23:17.180074 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4v975" event={"ID":"947f995f-28b3-4fbb-8ade-fa778c7fe05a","Type":"ContainerDied","Data":"5e69a52f2f2fcdef405ba2a5f48463835b5e6ecfab78a93354182fdc6b69905f"}
Oct 11 01:23:18 crc kubenswrapper[4743]: I1011 01:23:18.685768 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4v975"
Oct 11 01:23:18 crc kubenswrapper[4743]: I1011 01:23:18.819574 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/947f995f-28b3-4fbb-8ade-fa778c7fe05a-inventory\") pod \"947f995f-28b3-4fbb-8ade-fa778c7fe05a\" (UID: \"947f995f-28b3-4fbb-8ade-fa778c7fe05a\") "
Oct 11 01:23:18 crc kubenswrapper[4743]: I1011 01:23:18.819762 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/947f995f-28b3-4fbb-8ade-fa778c7fe05a-ssh-key\") pod \"947f995f-28b3-4fbb-8ade-fa778c7fe05a\" (UID: \"947f995f-28b3-4fbb-8ade-fa778c7fe05a\") "
Oct 11 01:23:18 crc kubenswrapper[4743]: I1011 01:23:18.819789 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdhx8\" (UniqueName: \"kubernetes.io/projected/947f995f-28b3-4fbb-8ade-fa778c7fe05a-kube-api-access-qdhx8\") pod \"947f995f-28b3-4fbb-8ade-fa778c7fe05a\" (UID: \"947f995f-28b3-4fbb-8ade-fa778c7fe05a\") "
Oct 11 01:23:18 crc kubenswrapper[4743]: I1011 01:23:18.845055 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/947f995f-28b3-4fbb-8ade-fa778c7fe05a-kube-api-access-qdhx8" (OuterVolumeSpecName: "kube-api-access-qdhx8") pod "947f995f-28b3-4fbb-8ade-fa778c7fe05a" (UID: "947f995f-28b3-4fbb-8ade-fa778c7fe05a"). InnerVolumeSpecName "kube-api-access-qdhx8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 11 01:23:18 crc kubenswrapper[4743]: I1011 01:23:18.863594 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/947f995f-28b3-4fbb-8ade-fa778c7fe05a-inventory" (OuterVolumeSpecName: "inventory") pod "947f995f-28b3-4fbb-8ade-fa778c7fe05a" (UID: "947f995f-28b3-4fbb-8ade-fa778c7fe05a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 11 01:23:18 crc kubenswrapper[4743]: I1011 01:23:18.866114 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/947f995f-28b3-4fbb-8ade-fa778c7fe05a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "947f995f-28b3-4fbb-8ade-fa778c7fe05a" (UID: "947f995f-28b3-4fbb-8ade-fa778c7fe05a"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 11 01:23:18 crc kubenswrapper[4743]: I1011 01:23:18.923233 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/947f995f-28b3-4fbb-8ade-fa778c7fe05a-inventory\") on node \"crc\" DevicePath \"\""
Oct 11 01:23:18 crc kubenswrapper[4743]: I1011 01:23:18.923271 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/947f995f-28b3-4fbb-8ade-fa778c7fe05a-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 11 01:23:18 crc kubenswrapper[4743]: I1011 01:23:18.923285 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qdhx8\" (UniqueName: \"kubernetes.io/projected/947f995f-28b3-4fbb-8ade-fa778c7fe05a-kube-api-access-qdhx8\") on node \"crc\" DevicePath \"\""
Oct 11 01:23:19 crc kubenswrapper[4743]: I1011 01:23:19.215810 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4v975" event={"ID":"947f995f-28b3-4fbb-8ade-fa778c7fe05a","Type":"ContainerDied","Data":"11874c0bf05a7b2870d3a56ccee5f9714d3bd0697eaa22f0a836fd6fcc6ae1fe"}
Oct 11 01:23:19 crc kubenswrapper[4743]: I1011 01:23:19.215913 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11874c0bf05a7b2870d3a56ccee5f9714d3bd0697eaa22f0a836fd6fcc6ae1fe"
Oct 11 01:23:19 crc kubenswrapper[4743]: I1011 01:23:19.216011 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4v975"
Oct 11 01:23:19 crc kubenswrapper[4743]: I1011 01:23:19.280332 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zzzsh"]
Oct 11 01:23:19 crc kubenswrapper[4743]: E1011 01:23:19.280751 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="947f995f-28b3-4fbb-8ade-fa778c7fe05a" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam"
Oct 11 01:23:19 crc kubenswrapper[4743]: I1011 01:23:19.280768 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="947f995f-28b3-4fbb-8ade-fa778c7fe05a" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam"
Oct 11 01:23:19 crc kubenswrapper[4743]: I1011 01:23:19.280972 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="947f995f-28b3-4fbb-8ade-fa778c7fe05a" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam"
Oct 11 01:23:19 crc kubenswrapper[4743]: I1011 01:23:19.281650 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zzzsh"
Oct 11 01:23:19 crc kubenswrapper[4743]: I1011 01:23:19.284024 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8vmmn"
Oct 11 01:23:19 crc kubenswrapper[4743]: I1011 01:23:19.288218 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 11 01:23:19 crc kubenswrapper[4743]: I1011 01:23:19.288413 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Oct 11 01:23:19 crc kubenswrapper[4743]: I1011 01:23:19.288435 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Oct 11 01:23:19 crc kubenswrapper[4743]: I1011 01:23:19.302605 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zzzsh"]
Oct 11 01:23:19 crc kubenswrapper[4743]: I1011 01:23:19.432841 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/830e5365-fa12-43db-b2da-5e3295796350-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-zzzsh\" (UID: \"830e5365-fa12-43db-b2da-5e3295796350\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zzzsh"
Oct 11 01:23:19 crc kubenswrapper[4743]: I1011 01:23:19.433011 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/830e5365-fa12-43db-b2da-5e3295796350-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-zzzsh\" (UID: \"830e5365-fa12-43db-b2da-5e3295796350\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zzzsh"
Oct 11 01:23:19 crc kubenswrapper[4743]: I1011 01:23:19.433081 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7v7g9\" (UniqueName: \"kubernetes.io/projected/830e5365-fa12-43db-b2da-5e3295796350-kube-api-access-7v7g9\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-zzzsh\" (UID: \"830e5365-fa12-43db-b2da-5e3295796350\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zzzsh"
Oct 11 01:23:19 crc kubenswrapper[4743]: I1011 01:23:19.535406 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/830e5365-fa12-43db-b2da-5e3295796350-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-zzzsh\" (UID: \"830e5365-fa12-43db-b2da-5e3295796350\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zzzsh"
Oct 11 01:23:19 crc kubenswrapper[4743]: I1011 01:23:19.535944 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/830e5365-fa12-43db-b2da-5e3295796350-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-zzzsh\" (UID: \"830e5365-fa12-43db-b2da-5e3295796350\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zzzsh"
Oct 11 01:23:19 crc kubenswrapper[4743]: I1011 01:23:19.536011 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7v7g9\" (UniqueName: \"kubernetes.io/projected/830e5365-fa12-43db-b2da-5e3295796350-kube-api-access-7v7g9\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-zzzsh\" (UID: \"830e5365-fa12-43db-b2da-5e3295796350\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zzzsh"
Oct 11 01:23:19 crc kubenswrapper[4743]: I1011 01:23:19.543779 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/830e5365-fa12-43db-b2da-5e3295796350-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-zzzsh\" (UID: \"830e5365-fa12-43db-b2da-5e3295796350\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zzzsh"
Oct 11 01:23:19 crc kubenswrapper[4743]: I1011 01:23:19.550620 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/830e5365-fa12-43db-b2da-5e3295796350-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-zzzsh\" (UID: \"830e5365-fa12-43db-b2da-5e3295796350\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zzzsh"
Oct 11 01:23:19 crc kubenswrapper[4743]: I1011 01:23:19.554613 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7v7g9\" (UniqueName: \"kubernetes.io/projected/830e5365-fa12-43db-b2da-5e3295796350-kube-api-access-7v7g9\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-zzzsh\" (UID: \"830e5365-fa12-43db-b2da-5e3295796350\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zzzsh"
Oct 11 01:23:19 crc kubenswrapper[4743]: I1011 01:23:19.612787 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zzzsh"
Oct 11 01:23:20 crc kubenswrapper[4743]: I1011 01:23:20.221065 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zzzsh"]
Oct 11 01:23:21 crc kubenswrapper[4743]: I1011 01:23:21.234698 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zzzsh" event={"ID":"830e5365-fa12-43db-b2da-5e3295796350","Type":"ContainerStarted","Data":"fddddb5adf6d9ef199f5626c1e325b4bbaf06841c1ed1819c9c4e434a634c96e"}
Oct 11 01:23:21 crc kubenswrapper[4743]: I1011 01:23:21.235035 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zzzsh" event={"ID":"830e5365-fa12-43db-b2da-5e3295796350","Type":"ContainerStarted","Data":"067b02fb23bf36c13053e2e6fc98203a87edbf403f6d9564d22951ce622d38a6"}
Oct 11 01:23:21 crc kubenswrapper[4743]: I1011 01:23:21.255432 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zzzsh" podStartSLOduration=1.79778243 podStartE2EDuration="2.255413345s" podCreationTimestamp="2025-10-11 01:23:19 +0000 UTC" firstStartedPulling="2025-10-11 01:23:20.229599293 +0000 UTC m=+1894.882579690" lastFinishedPulling="2025-10-11 01:23:20.687230208 +0000 UTC m=+1895.340210605" observedRunningTime="2025-10-11 01:23:21.246489128 +0000 UTC m=+1895.899469515" watchObservedRunningTime="2025-10-11 01:23:21.255413345 +0000 UTC m=+1895.908393732"
Oct 11 01:23:44 crc kubenswrapper[4743]: I1011 01:23:44.046802 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-89lgh"]
Oct 11 01:23:44 crc kubenswrapper[4743]: I1011 01:23:44.060549 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-89lgh"]
Oct 11 01:23:44 crc kubenswrapper[4743]: I1011 01:23:44.107280 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69e15d6c-dacd-4466-aac4-050cda6242aa" path="/var/lib/kubelet/pods/69e15d6c-dacd-4466-aac4-050cda6242aa/volumes"
Oct 11 01:23:51 crc kubenswrapper[4743]: I1011 01:23:51.309419 4743 scope.go:117] "RemoveContainer" containerID="2ff0e655e08fe7c8152322d7f32d94261beb30382bc21ec202b87af66cc7e7bd"
Oct 11 01:23:51 crc kubenswrapper[4743]: I1011 01:23:51.350395 4743 scope.go:117] "RemoveContainer" containerID="6c415f2f966cf8502f34e945ecfacf3c5b60792adcb24042de362843942dc10c"
Oct 11 01:23:51 crc kubenswrapper[4743]: I1011 01:23:51.441597 4743 scope.go:117] "RemoveContainer" containerID="d8b39937da4ba3865c6e25f8b9d4d67d3cd574c66ea1501c9949d4bd121e12fb"
Oct 11 01:23:51 crc kubenswrapper[4743]: I1011 01:23:51.475201 4743 scope.go:117] "RemoveContainer" containerID="9519dc35711165dabb642d86082e4eac64a6d882fb5cae7a93bfede010f611d4"
Oct 11 01:23:51 crc kubenswrapper[4743]: I1011 01:23:51.521420 4743 scope.go:117] "RemoveContainer" containerID="b828d9a92d81901efc0daf4decce896ac4c9fefa736474149001eb5d8d0464bf"
Oct 11 01:23:51 crc kubenswrapper[4743]: I1011 01:23:51.568702 4743 scope.go:117] "RemoveContainer" containerID="01d3f5b2ec3ede995b2098ba25deb010dab6bd5d670dc106b836f5684602edc9"
Oct 11 01:23:51 crc kubenswrapper[4743]: I1011 01:23:51.611498 4743 scope.go:117] "RemoveContainer" containerID="5d7c355d803aba783fa48c6398af9330605a486c6f2f725224e4a3e8985f9615"
Oct 11 01:24:04 crc kubenswrapper[4743]: I1011 01:24:04.081328 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wjk7q"]
Oct 11 01:24:04 crc kubenswrapper[4743]: I1011 01:24:04.119525 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wjk7q"]
Oct 11 01:24:04 crc kubenswrapper[4743]: I1011 01:24:04.119592 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-jqs8c"]
Oct 11 01:24:04 crc kubenswrapper[4743]: I1011 01:24:04.119619 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-jqs8c"]
Oct 11 01:24:06 crc kubenswrapper[4743]: I1011 01:24:06.115378 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70c3253b-404c-4db1-a0e5-f2f112e94c43" path="/var/lib/kubelet/pods/70c3253b-404c-4db1-a0e5-f2f112e94c43/volumes"
Oct 11 01:24:06 crc kubenswrapper[4743]: I1011 01:24:06.117158 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec1be3a1-8fa8-4aa6-b61f-dcc2de3f0d16" path="/var/lib/kubelet/pods/ec1be3a1-8fa8-4aa6-b61f-dcc2de3f0d16/volumes"
Oct 11 01:24:08 crc kubenswrapper[4743]: I1011 01:24:08.034462 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-create-mtbr2"]
Oct 11 01:24:08 crc kubenswrapper[4743]: I1011 01:24:08.041198 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-mtbr2"]
Oct 11 01:24:08 crc kubenswrapper[4743]: I1011 01:24:08.107210 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e89352a-0aa8-4c41-ba04-4da0c6e59b3d" path="/var/lib/kubelet/pods/6e89352a-0aa8-4c41-ba04-4da0c6e59b3d/volumes"
Oct 11 01:24:14 crc kubenswrapper[4743]: I1011 01:24:14.458717 4743 patch_prober.go:28] interesting pod/machine-config-daemon-cvm72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 11 01:24:14 crc kubenswrapper[4743]: I1011 01:24:14.459316 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 11 01:24:19 crc kubenswrapper[4743]: I1011 01:24:19.048071 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-ad4e-account-create-b2rl8"]
Oct 11 01:24:19 crc kubenswrapper[4743]: I1011 01:24:19.064706 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-ad4e-account-create-b2rl8"]
Oct 11 01:24:20 crc kubenswrapper[4743]: I1011 01:24:20.102762 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d015b239-cdc4-4ddd-8e01-4fdc85cbbbd0" path="/var/lib/kubelet/pods/d015b239-cdc4-4ddd-8e01-4fdc85cbbbd0/volumes"
Oct 11 01:24:21 crc kubenswrapper[4743]: I1011 01:24:21.978375 4743 generic.go:334] "Generic (PLEG): container finished" podID="830e5365-fa12-43db-b2da-5e3295796350" containerID="fddddb5adf6d9ef199f5626c1e325b4bbaf06841c1ed1819c9c4e434a634c96e" exitCode=2
Oct 11 01:24:21 crc kubenswrapper[4743]: I1011 01:24:21.978621 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zzzsh" event={"ID":"830e5365-fa12-43db-b2da-5e3295796350","Type":"ContainerDied","Data":"fddddb5adf6d9ef199f5626c1e325b4bbaf06841c1ed1819c9c4e434a634c96e"}
Oct 11 01:24:23 crc kubenswrapper[4743]: I1011 01:24:23.499733 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zzzsh"
Oct 11 01:24:23 crc kubenswrapper[4743]: I1011 01:24:23.594481 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7v7g9\" (UniqueName: \"kubernetes.io/projected/830e5365-fa12-43db-b2da-5e3295796350-kube-api-access-7v7g9\") pod \"830e5365-fa12-43db-b2da-5e3295796350\" (UID: \"830e5365-fa12-43db-b2da-5e3295796350\") "
Oct 11 01:24:23 crc kubenswrapper[4743]: I1011 01:24:23.594588 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/830e5365-fa12-43db-b2da-5e3295796350-ssh-key\") pod \"830e5365-fa12-43db-b2da-5e3295796350\" (UID: \"830e5365-fa12-43db-b2da-5e3295796350\") "
Oct 11 01:24:23 crc kubenswrapper[4743]: I1011 01:24:23.594996 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/830e5365-fa12-43db-b2da-5e3295796350-inventory\") pod \"830e5365-fa12-43db-b2da-5e3295796350\" (UID: \"830e5365-fa12-43db-b2da-5e3295796350\") "
Oct 11 01:24:23 crc kubenswrapper[4743]: I1011 01:24:23.604125 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/830e5365-fa12-43db-b2da-5e3295796350-kube-api-access-7v7g9" (OuterVolumeSpecName: "kube-api-access-7v7g9") pod "830e5365-fa12-43db-b2da-5e3295796350" (UID: "830e5365-fa12-43db-b2da-5e3295796350"). InnerVolumeSpecName "kube-api-access-7v7g9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 11 01:24:23 crc kubenswrapper[4743]: I1011 01:24:23.629181 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/830e5365-fa12-43db-b2da-5e3295796350-inventory" (OuterVolumeSpecName: "inventory") pod "830e5365-fa12-43db-b2da-5e3295796350" (UID: "830e5365-fa12-43db-b2da-5e3295796350"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 11 01:24:23 crc kubenswrapper[4743]: I1011 01:24:23.651231 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/830e5365-fa12-43db-b2da-5e3295796350-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "830e5365-fa12-43db-b2da-5e3295796350" (UID: "830e5365-fa12-43db-b2da-5e3295796350"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 11 01:24:23 crc kubenswrapper[4743]: I1011 01:24:23.697668 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/830e5365-fa12-43db-b2da-5e3295796350-inventory\") on node \"crc\" DevicePath \"\""
Oct 11 01:24:23 crc kubenswrapper[4743]: I1011 01:24:23.697930 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7v7g9\" (UniqueName: \"kubernetes.io/projected/830e5365-fa12-43db-b2da-5e3295796350-kube-api-access-7v7g9\") on node \"crc\" DevicePath \"\""
Oct 11 01:24:23 crc kubenswrapper[4743]: I1011 01:24:23.698138 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/830e5365-fa12-43db-b2da-5e3295796350-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 11 01:24:24 crc kubenswrapper[4743]: I1011 01:24:24.008156 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zzzsh" event={"ID":"830e5365-fa12-43db-b2da-5e3295796350","Type":"ContainerDied","Data":"067b02fb23bf36c13053e2e6fc98203a87edbf403f6d9564d22951ce622d38a6"}
Oct 11 01:24:24 crc kubenswrapper[4743]: I1011 01:24:24.008585 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="067b02fb23bf36c13053e2e6fc98203a87edbf403f6d9564d22951ce622d38a6"
Oct 11 01:24:24 crc kubenswrapper[4743]: I1011 01:24:24.008232 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zzzsh"
Oct 11 01:24:31 crc kubenswrapper[4743]: I1011 01:24:31.036325 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rxgwq"]
Oct 11 01:24:31 crc kubenswrapper[4743]: E1011 01:24:31.037361 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="830e5365-fa12-43db-b2da-5e3295796350" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Oct 11 01:24:31 crc kubenswrapper[4743]: I1011 01:24:31.037379 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="830e5365-fa12-43db-b2da-5e3295796350" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Oct 11 01:24:31 crc kubenswrapper[4743]: I1011 01:24:31.037637 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="830e5365-fa12-43db-b2da-5e3295796350" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Oct 11 01:24:31 crc kubenswrapper[4743]: I1011 01:24:31.039541 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rxgwq" Oct 11 01:24:31 crc kubenswrapper[4743]: I1011 01:24:31.043306 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 11 01:24:31 crc kubenswrapper[4743]: I1011 01:24:31.044675 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 11 01:24:31 crc kubenswrapper[4743]: I1011 01:24:31.044700 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8vmmn" Oct 11 01:24:31 crc kubenswrapper[4743]: I1011 01:24:31.044747 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 11 01:24:31 crc kubenswrapper[4743]: I1011 01:24:31.049069 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rxgwq"] Oct 11 01:24:31 crc kubenswrapper[4743]: I1011 01:24:31.159761 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/74e06001-e5b2-4a21-b3a0-887f814c87ef-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-rxgwq\" (UID: \"74e06001-e5b2-4a21-b3a0-887f814c87ef\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rxgwq" Oct 11 01:24:31 crc kubenswrapper[4743]: I1011 01:24:31.160144 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxcb6\" (UniqueName: \"kubernetes.io/projected/74e06001-e5b2-4a21-b3a0-887f814c87ef-kube-api-access-mxcb6\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-rxgwq\" (UID: \"74e06001-e5b2-4a21-b3a0-887f814c87ef\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rxgwq" Oct 11 01:24:31 crc kubenswrapper[4743]: I1011 01:24:31.160211 4743 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/74e06001-e5b2-4a21-b3a0-887f814c87ef-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-rxgwq\" (UID: \"74e06001-e5b2-4a21-b3a0-887f814c87ef\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rxgwq" Oct 11 01:24:31 crc kubenswrapper[4743]: I1011 01:24:31.263262 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/74e06001-e5b2-4a21-b3a0-887f814c87ef-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-rxgwq\" (UID: \"74e06001-e5b2-4a21-b3a0-887f814c87ef\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rxgwq" Oct 11 01:24:31 crc kubenswrapper[4743]: I1011 01:24:31.263507 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/74e06001-e5b2-4a21-b3a0-887f814c87ef-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-rxgwq\" (UID: \"74e06001-e5b2-4a21-b3a0-887f814c87ef\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rxgwq" Oct 11 01:24:31 crc kubenswrapper[4743]: I1011 01:24:31.263572 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxcb6\" (UniqueName: \"kubernetes.io/projected/74e06001-e5b2-4a21-b3a0-887f814c87ef-kube-api-access-mxcb6\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-rxgwq\" (UID: \"74e06001-e5b2-4a21-b3a0-887f814c87ef\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rxgwq" Oct 11 01:24:31 crc kubenswrapper[4743]: I1011 01:24:31.280959 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/74e06001-e5b2-4a21-b3a0-887f814c87ef-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-rxgwq\" (UID: 
\"74e06001-e5b2-4a21-b3a0-887f814c87ef\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rxgwq" Oct 11 01:24:31 crc kubenswrapper[4743]: I1011 01:24:31.281180 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/74e06001-e5b2-4a21-b3a0-887f814c87ef-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-rxgwq\" (UID: \"74e06001-e5b2-4a21-b3a0-887f814c87ef\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rxgwq" Oct 11 01:24:31 crc kubenswrapper[4743]: I1011 01:24:31.286669 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxcb6\" (UniqueName: \"kubernetes.io/projected/74e06001-e5b2-4a21-b3a0-887f814c87ef-kube-api-access-mxcb6\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-rxgwq\" (UID: \"74e06001-e5b2-4a21-b3a0-887f814c87ef\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rxgwq" Oct 11 01:24:31 crc kubenswrapper[4743]: I1011 01:24:31.369223 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rxgwq" Oct 11 01:24:32 crc kubenswrapper[4743]: I1011 01:24:32.065058 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rxgwq"] Oct 11 01:24:32 crc kubenswrapper[4743]: I1011 01:24:32.122475 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rxgwq" event={"ID":"74e06001-e5b2-4a21-b3a0-887f814c87ef","Type":"ContainerStarted","Data":"6ac02fecf6a70317d170b1b6d4418c831ff8410652a333ae56bb59ec310add29"} Oct 11 01:24:33 crc kubenswrapper[4743]: I1011 01:24:33.135428 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rxgwq" event={"ID":"74e06001-e5b2-4a21-b3a0-887f814c87ef","Type":"ContainerStarted","Data":"e48806dc0337703b86d04512c6fa718b8c44f1427f444fb92e8e70b4246812cb"} Oct 11 01:24:33 crc kubenswrapper[4743]: I1011 01:24:33.162738 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rxgwq" podStartSLOduration=1.697508849 podStartE2EDuration="2.162715566s" podCreationTimestamp="2025-10-11 01:24:31 +0000 UTC" firstStartedPulling="2025-10-11 01:24:32.062742397 +0000 UTC m=+1966.715722794" lastFinishedPulling="2025-10-11 01:24:32.527949074 +0000 UTC m=+1967.180929511" observedRunningTime="2025-10-11 01:24:33.156636071 +0000 UTC m=+1967.809616478" watchObservedRunningTime="2025-10-11 01:24:33.162715566 +0000 UTC m=+1967.815695973" Oct 11 01:24:44 crc kubenswrapper[4743]: I1011 01:24:44.458724 4743 patch_prober.go:28] interesting pod/machine-config-daemon-cvm72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 01:24:44 crc 
kubenswrapper[4743]: I1011 01:24:44.459316 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 01:24:49 crc kubenswrapper[4743]: I1011 01:24:49.046665 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-jckg7"] Oct 11 01:24:49 crc kubenswrapper[4743]: I1011 01:24:49.056710 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-jckg7"] Oct 11 01:24:50 crc kubenswrapper[4743]: I1011 01:24:50.105677 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96dd3888-06fd-48f8-a0b2-b320851dd83c" path="/var/lib/kubelet/pods/96dd3888-06fd-48f8-a0b2-b320851dd83c/volumes" Oct 11 01:24:51 crc kubenswrapper[4743]: I1011 01:24:51.796470 4743 scope.go:117] "RemoveContainer" containerID="30e71ad943a8da42b9307bce0b943dc03e0fdaee993ac5e9e252bea62c476658" Oct 11 01:24:51 crc kubenswrapper[4743]: I1011 01:24:51.836591 4743 scope.go:117] "RemoveContainer" containerID="882c56d4e18948fc28a76693ece959a3753d9d6f994f2f457793a77e9f7f6f3a" Oct 11 01:24:51 crc kubenswrapper[4743]: I1011 01:24:51.902436 4743 scope.go:117] "RemoveContainer" containerID="861b7786b13a062b3e3787a761ca7c7dbe1d6edb76d7f580dc6f805b919dda5c" Oct 11 01:24:51 crc kubenswrapper[4743]: I1011 01:24:51.947495 4743 scope.go:117] "RemoveContainer" containerID="e47943025051f713b963ec70bb6fb2087b75e8c359dc81e61d01a650b30278d8" Oct 11 01:24:52 crc kubenswrapper[4743]: I1011 01:24:52.015662 4743 scope.go:117] "RemoveContainer" containerID="b2544f0edc3793fa09e9eeafb6b0d684aa074fd3f9ba192b8139c99fb91ed17f" Oct 11 01:25:14 crc kubenswrapper[4743]: I1011 01:25:14.459084 4743 patch_prober.go:28] interesting pod/machine-config-daemon-cvm72 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 01:25:14 crc kubenswrapper[4743]: I1011 01:25:14.459768 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 01:25:14 crc kubenswrapper[4743]: I1011 01:25:14.459837 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" Oct 11 01:25:14 crc kubenswrapper[4743]: I1011 01:25:14.460959 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"22fd023cd5183c2ca78abfab3b66c41277f63acb424eb906c5326f5e04010643"} pod="openshift-machine-config-operator/machine-config-daemon-cvm72" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 11 01:25:14 crc kubenswrapper[4743]: I1011 01:25:14.461055 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" containerID="cri-o://22fd023cd5183c2ca78abfab3b66c41277f63acb424eb906c5326f5e04010643" gracePeriod=600 Oct 11 01:25:14 crc kubenswrapper[4743]: I1011 01:25:14.620515 4743 generic.go:334] "Generic (PLEG): container finished" podID="add92263-e252-446b-95de-092585b4357f" containerID="22fd023cd5183c2ca78abfab3b66c41277f63acb424eb906c5326f5e04010643" exitCode=0 Oct 11 01:25:14 crc kubenswrapper[4743]: I1011 01:25:14.620566 4743 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" event={"ID":"add92263-e252-446b-95de-092585b4357f","Type":"ContainerDied","Data":"22fd023cd5183c2ca78abfab3b66c41277f63acb424eb906c5326f5e04010643"} Oct 11 01:25:14 crc kubenswrapper[4743]: I1011 01:25:14.620605 4743 scope.go:117] "RemoveContainer" containerID="4dd4c92bb1fe51db0d7814650823b2db5bc31ac1ef997677802e7324969fe0d1" Oct 11 01:25:15 crc kubenswrapper[4743]: I1011 01:25:15.636208 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" event={"ID":"add92263-e252-446b-95de-092585b4357f","Type":"ContainerStarted","Data":"cf9dc144cbba351cee736f9d035ea7c635ec0460821f8e9aa803ef97ac3675ae"} Oct 11 01:25:28 crc kubenswrapper[4743]: I1011 01:25:28.800260 4743 generic.go:334] "Generic (PLEG): container finished" podID="74e06001-e5b2-4a21-b3a0-887f814c87ef" containerID="e48806dc0337703b86d04512c6fa718b8c44f1427f444fb92e8e70b4246812cb" exitCode=0 Oct 11 01:25:28 crc kubenswrapper[4743]: I1011 01:25:28.800335 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rxgwq" event={"ID":"74e06001-e5b2-4a21-b3a0-887f814c87ef","Type":"ContainerDied","Data":"e48806dc0337703b86d04512c6fa718b8c44f1427f444fb92e8e70b4246812cb"} Oct 11 01:25:30 crc kubenswrapper[4743]: I1011 01:25:30.387369 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rxgwq" Oct 11 01:25:30 crc kubenswrapper[4743]: I1011 01:25:30.532457 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxcb6\" (UniqueName: \"kubernetes.io/projected/74e06001-e5b2-4a21-b3a0-887f814c87ef-kube-api-access-mxcb6\") pod \"74e06001-e5b2-4a21-b3a0-887f814c87ef\" (UID: \"74e06001-e5b2-4a21-b3a0-887f814c87ef\") " Oct 11 01:25:30 crc kubenswrapper[4743]: I1011 01:25:30.532631 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/74e06001-e5b2-4a21-b3a0-887f814c87ef-ssh-key\") pod \"74e06001-e5b2-4a21-b3a0-887f814c87ef\" (UID: \"74e06001-e5b2-4a21-b3a0-887f814c87ef\") " Oct 11 01:25:30 crc kubenswrapper[4743]: I1011 01:25:30.532689 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/74e06001-e5b2-4a21-b3a0-887f814c87ef-inventory\") pod \"74e06001-e5b2-4a21-b3a0-887f814c87ef\" (UID: \"74e06001-e5b2-4a21-b3a0-887f814c87ef\") " Oct 11 01:25:30 crc kubenswrapper[4743]: I1011 01:25:30.539283 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74e06001-e5b2-4a21-b3a0-887f814c87ef-kube-api-access-mxcb6" (OuterVolumeSpecName: "kube-api-access-mxcb6") pod "74e06001-e5b2-4a21-b3a0-887f814c87ef" (UID: "74e06001-e5b2-4a21-b3a0-887f814c87ef"). InnerVolumeSpecName "kube-api-access-mxcb6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:25:30 crc kubenswrapper[4743]: I1011 01:25:30.563893 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74e06001-e5b2-4a21-b3a0-887f814c87ef-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "74e06001-e5b2-4a21-b3a0-887f814c87ef" (UID: "74e06001-e5b2-4a21-b3a0-887f814c87ef"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:25:30 crc kubenswrapper[4743]: I1011 01:25:30.571885 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74e06001-e5b2-4a21-b3a0-887f814c87ef-inventory" (OuterVolumeSpecName: "inventory") pod "74e06001-e5b2-4a21-b3a0-887f814c87ef" (UID: "74e06001-e5b2-4a21-b3a0-887f814c87ef"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:25:30 crc kubenswrapper[4743]: I1011 01:25:30.635832 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxcb6\" (UniqueName: \"kubernetes.io/projected/74e06001-e5b2-4a21-b3a0-887f814c87ef-kube-api-access-mxcb6\") on node \"crc\" DevicePath \"\"" Oct 11 01:25:30 crc kubenswrapper[4743]: I1011 01:25:30.636110 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/74e06001-e5b2-4a21-b3a0-887f814c87ef-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 11 01:25:30 crc kubenswrapper[4743]: I1011 01:25:30.636173 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/74e06001-e5b2-4a21-b3a0-887f814c87ef-inventory\") on node \"crc\" DevicePath \"\"" Oct 11 01:25:30 crc kubenswrapper[4743]: I1011 01:25:30.830628 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rxgwq" event={"ID":"74e06001-e5b2-4a21-b3a0-887f814c87ef","Type":"ContainerDied","Data":"6ac02fecf6a70317d170b1b6d4418c831ff8410652a333ae56bb59ec310add29"} Oct 11 01:25:30 crc kubenswrapper[4743]: I1011 01:25:30.830957 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ac02fecf6a70317d170b1b6d4418c831ff8410652a333ae56bb59ec310add29" Oct 11 01:25:30 crc kubenswrapper[4743]: I1011 01:25:30.830741 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rxgwq" Oct 11 01:25:30 crc kubenswrapper[4743]: I1011 01:25:30.933119 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-hqwq9"] Oct 11 01:25:30 crc kubenswrapper[4743]: E1011 01:25:30.933550 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74e06001-e5b2-4a21-b3a0-887f814c87ef" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 11 01:25:30 crc kubenswrapper[4743]: I1011 01:25:30.933568 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="74e06001-e5b2-4a21-b3a0-887f814c87ef" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 11 01:25:30 crc kubenswrapper[4743]: I1011 01:25:30.933783 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="74e06001-e5b2-4a21-b3a0-887f814c87ef" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 11 01:25:30 crc kubenswrapper[4743]: I1011 01:25:30.934565 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-hqwq9" Oct 11 01:25:30 crc kubenswrapper[4743]: I1011 01:25:30.937615 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8vmmn" Oct 11 01:25:30 crc kubenswrapper[4743]: I1011 01:25:30.937914 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 11 01:25:30 crc kubenswrapper[4743]: I1011 01:25:30.938090 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 11 01:25:30 crc kubenswrapper[4743]: I1011 01:25:30.939044 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 11 01:25:30 crc kubenswrapper[4743]: I1011 01:25:30.953734 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-hqwq9"] Oct 11 01:25:31 crc kubenswrapper[4743]: I1011 01:25:31.055306 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/37d164e2-e69a-4faf-892c-b79e155a6c90-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-hqwq9\" (UID: \"37d164e2-e69a-4faf-892c-b79e155a6c90\") " pod="openstack/ssh-known-hosts-edpm-deployment-hqwq9" Oct 11 01:25:31 crc kubenswrapper[4743]: I1011 01:25:31.055606 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/37d164e2-e69a-4faf-892c-b79e155a6c90-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-hqwq9\" (UID: \"37d164e2-e69a-4faf-892c-b79e155a6c90\") " pod="openstack/ssh-known-hosts-edpm-deployment-hqwq9" Oct 11 01:25:31 crc kubenswrapper[4743]: I1011 01:25:31.055894 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-hglt2\" (UniqueName: \"kubernetes.io/projected/37d164e2-e69a-4faf-892c-b79e155a6c90-kube-api-access-hglt2\") pod \"ssh-known-hosts-edpm-deployment-hqwq9\" (UID: \"37d164e2-e69a-4faf-892c-b79e155a6c90\") " pod="openstack/ssh-known-hosts-edpm-deployment-hqwq9" Oct 11 01:25:31 crc kubenswrapper[4743]: I1011 01:25:31.158933 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hglt2\" (UniqueName: \"kubernetes.io/projected/37d164e2-e69a-4faf-892c-b79e155a6c90-kube-api-access-hglt2\") pod \"ssh-known-hosts-edpm-deployment-hqwq9\" (UID: \"37d164e2-e69a-4faf-892c-b79e155a6c90\") " pod="openstack/ssh-known-hosts-edpm-deployment-hqwq9" Oct 11 01:25:31 crc kubenswrapper[4743]: I1011 01:25:31.159026 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/37d164e2-e69a-4faf-892c-b79e155a6c90-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-hqwq9\" (UID: \"37d164e2-e69a-4faf-892c-b79e155a6c90\") " pod="openstack/ssh-known-hosts-edpm-deployment-hqwq9" Oct 11 01:25:31 crc kubenswrapper[4743]: I1011 01:25:31.159293 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/37d164e2-e69a-4faf-892c-b79e155a6c90-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-hqwq9\" (UID: \"37d164e2-e69a-4faf-892c-b79e155a6c90\") " pod="openstack/ssh-known-hosts-edpm-deployment-hqwq9" Oct 11 01:25:31 crc kubenswrapper[4743]: I1011 01:25:31.164674 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/37d164e2-e69a-4faf-892c-b79e155a6c90-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-hqwq9\" (UID: \"37d164e2-e69a-4faf-892c-b79e155a6c90\") " pod="openstack/ssh-known-hosts-edpm-deployment-hqwq9" Oct 11 01:25:31 crc kubenswrapper[4743]: I1011 01:25:31.166369 4743 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/37d164e2-e69a-4faf-892c-b79e155a6c90-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-hqwq9\" (UID: \"37d164e2-e69a-4faf-892c-b79e155a6c90\") " pod="openstack/ssh-known-hosts-edpm-deployment-hqwq9" Oct 11 01:25:31 crc kubenswrapper[4743]: I1011 01:25:31.190343 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hglt2\" (UniqueName: \"kubernetes.io/projected/37d164e2-e69a-4faf-892c-b79e155a6c90-kube-api-access-hglt2\") pod \"ssh-known-hosts-edpm-deployment-hqwq9\" (UID: \"37d164e2-e69a-4faf-892c-b79e155a6c90\") " pod="openstack/ssh-known-hosts-edpm-deployment-hqwq9" Oct 11 01:25:31 crc kubenswrapper[4743]: I1011 01:25:31.272448 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-hqwq9" Oct 11 01:25:31 crc kubenswrapper[4743]: I1011 01:25:31.920773 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-hqwq9"] Oct 11 01:25:32 crc kubenswrapper[4743]: I1011 01:25:32.864938 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-hqwq9" event={"ID":"37d164e2-e69a-4faf-892c-b79e155a6c90","Type":"ContainerStarted","Data":"f3255b4ef4ec1544035bc349b0d79923e4e4805732f30735ed42be6491cd27ce"} Oct 11 01:25:32 crc kubenswrapper[4743]: I1011 01:25:32.865734 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-hqwq9" event={"ID":"37d164e2-e69a-4faf-892c-b79e155a6c90","Type":"ContainerStarted","Data":"4cb18a6499006ec0f0a8a67153b24934703b5ed9defb1422b4168a6e11ffc562"} Oct 11 01:25:32 crc kubenswrapper[4743]: I1011 01:25:32.899594 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-hqwq9" 
podStartSLOduration=2.389684375 podStartE2EDuration="2.899572438s" podCreationTimestamp="2025-10-11 01:25:30 +0000 UTC" firstStartedPulling="2025-10-11 01:25:31.930886831 +0000 UTC m=+2026.583867228" lastFinishedPulling="2025-10-11 01:25:32.440774884 +0000 UTC m=+2027.093755291" observedRunningTime="2025-10-11 01:25:32.889272346 +0000 UTC m=+2027.542252753" watchObservedRunningTime="2025-10-11 01:25:32.899572438 +0000 UTC m=+2027.552552845" Oct 11 01:25:41 crc kubenswrapper[4743]: I1011 01:25:41.025708 4743 generic.go:334] "Generic (PLEG): container finished" podID="37d164e2-e69a-4faf-892c-b79e155a6c90" containerID="f3255b4ef4ec1544035bc349b0d79923e4e4805732f30735ed42be6491cd27ce" exitCode=0 Oct 11 01:25:41 crc kubenswrapper[4743]: I1011 01:25:41.025797 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-hqwq9" event={"ID":"37d164e2-e69a-4faf-892c-b79e155a6c90","Type":"ContainerDied","Data":"f3255b4ef4ec1544035bc349b0d79923e4e4805732f30735ed42be6491cd27ce"} Oct 11 01:25:42 crc kubenswrapper[4743]: I1011 01:25:42.581903 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-hqwq9" Oct 11 01:25:42 crc kubenswrapper[4743]: I1011 01:25:42.696566 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hglt2\" (UniqueName: \"kubernetes.io/projected/37d164e2-e69a-4faf-892c-b79e155a6c90-kube-api-access-hglt2\") pod \"37d164e2-e69a-4faf-892c-b79e155a6c90\" (UID: \"37d164e2-e69a-4faf-892c-b79e155a6c90\") " Oct 11 01:25:42 crc kubenswrapper[4743]: I1011 01:25:42.696668 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/37d164e2-e69a-4faf-892c-b79e155a6c90-inventory-0\") pod \"37d164e2-e69a-4faf-892c-b79e155a6c90\" (UID: \"37d164e2-e69a-4faf-892c-b79e155a6c90\") " Oct 11 01:25:42 crc kubenswrapper[4743]: I1011 01:25:42.696807 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/37d164e2-e69a-4faf-892c-b79e155a6c90-ssh-key-openstack-edpm-ipam\") pod \"37d164e2-e69a-4faf-892c-b79e155a6c90\" (UID: \"37d164e2-e69a-4faf-892c-b79e155a6c90\") " Oct 11 01:25:42 crc kubenswrapper[4743]: I1011 01:25:42.704270 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37d164e2-e69a-4faf-892c-b79e155a6c90-kube-api-access-hglt2" (OuterVolumeSpecName: "kube-api-access-hglt2") pod "37d164e2-e69a-4faf-892c-b79e155a6c90" (UID: "37d164e2-e69a-4faf-892c-b79e155a6c90"). InnerVolumeSpecName "kube-api-access-hglt2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:25:42 crc kubenswrapper[4743]: I1011 01:25:42.735427 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37d164e2-e69a-4faf-892c-b79e155a6c90-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "37d164e2-e69a-4faf-892c-b79e155a6c90" (UID: "37d164e2-e69a-4faf-892c-b79e155a6c90"). 
InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 11 01:25:42 crc kubenswrapper[4743]: I1011 01:25:42.754012 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37d164e2-e69a-4faf-892c-b79e155a6c90-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "37d164e2-e69a-4faf-892c-b79e155a6c90" (UID: "37d164e2-e69a-4faf-892c-b79e155a6c90"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 11 01:25:42 crc kubenswrapper[4743]: I1011 01:25:42.798744 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hglt2\" (UniqueName: \"kubernetes.io/projected/37d164e2-e69a-4faf-892c-b79e155a6c90-kube-api-access-hglt2\") on node \"crc\" DevicePath \"\""
Oct 11 01:25:42 crc kubenswrapper[4743]: I1011 01:25:42.798783 4743 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/37d164e2-e69a-4faf-892c-b79e155a6c90-inventory-0\") on node \"crc\" DevicePath \"\""
Oct 11 01:25:42 crc kubenswrapper[4743]: I1011 01:25:42.798794 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/37d164e2-e69a-4faf-892c-b79e155a6c90-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Oct 11 01:25:43 crc kubenswrapper[4743]: I1011 01:25:43.069432 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-hqwq9" event={"ID":"37d164e2-e69a-4faf-892c-b79e155a6c90","Type":"ContainerDied","Data":"4cb18a6499006ec0f0a8a67153b24934703b5ed9defb1422b4168a6e11ffc562"}
Oct 11 01:25:43 crc kubenswrapper[4743]: I1011 01:25:43.069713 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4cb18a6499006ec0f0a8a67153b24934703b5ed9defb1422b4168a6e11ffc562"
Oct 11 01:25:43 crc kubenswrapper[4743]: I1011 01:25:43.069632 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-hqwq9"
Oct 11 01:25:43 crc kubenswrapper[4743]: I1011 01:25:43.143919 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6lpd"]
Oct 11 01:25:43 crc kubenswrapper[4743]: E1011 01:25:43.144415 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37d164e2-e69a-4faf-892c-b79e155a6c90" containerName="ssh-known-hosts-edpm-deployment"
Oct 11 01:25:43 crc kubenswrapper[4743]: I1011 01:25:43.144432 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="37d164e2-e69a-4faf-892c-b79e155a6c90" containerName="ssh-known-hosts-edpm-deployment"
Oct 11 01:25:43 crc kubenswrapper[4743]: I1011 01:25:43.144672 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="37d164e2-e69a-4faf-892c-b79e155a6c90" containerName="ssh-known-hosts-edpm-deployment"
Oct 11 01:25:43 crc kubenswrapper[4743]: I1011 01:25:43.145480 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6lpd"
Oct 11 01:25:43 crc kubenswrapper[4743]: I1011 01:25:43.148084 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Oct 11 01:25:43 crc kubenswrapper[4743]: I1011 01:25:43.148711 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8vmmn"
Oct 11 01:25:43 crc kubenswrapper[4743]: I1011 01:25:43.148934 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 11 01:25:43 crc kubenswrapper[4743]: I1011 01:25:43.151732 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Oct 11 01:25:43 crc kubenswrapper[4743]: I1011 01:25:43.163283 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6lpd"]
Oct 11 01:25:43 crc kubenswrapper[4743]: I1011 01:25:43.209496 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/566b2f7b-e36d-49d7-b985-dc1a39ad9253-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-r6lpd\" (UID: \"566b2f7b-e36d-49d7-b985-dc1a39ad9253\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6lpd"
Oct 11 01:25:43 crc kubenswrapper[4743]: I1011 01:25:43.209645 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q684s\" (UniqueName: \"kubernetes.io/projected/566b2f7b-e36d-49d7-b985-dc1a39ad9253-kube-api-access-q684s\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-r6lpd\" (UID: \"566b2f7b-e36d-49d7-b985-dc1a39ad9253\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6lpd"
Oct 11 01:25:43 crc kubenswrapper[4743]: I1011 01:25:43.209762 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/566b2f7b-e36d-49d7-b985-dc1a39ad9253-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-r6lpd\" (UID: \"566b2f7b-e36d-49d7-b985-dc1a39ad9253\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6lpd"
Oct 11 01:25:43 crc kubenswrapper[4743]: I1011 01:25:43.311368 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/566b2f7b-e36d-49d7-b985-dc1a39ad9253-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-r6lpd\" (UID: \"566b2f7b-e36d-49d7-b985-dc1a39ad9253\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6lpd"
Oct 11 01:25:43 crc kubenswrapper[4743]: I1011 01:25:43.311517 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q684s\" (UniqueName: \"kubernetes.io/projected/566b2f7b-e36d-49d7-b985-dc1a39ad9253-kube-api-access-q684s\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-r6lpd\" (UID: \"566b2f7b-e36d-49d7-b985-dc1a39ad9253\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6lpd"
Oct 11 01:25:43 crc kubenswrapper[4743]: I1011 01:25:43.311639 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/566b2f7b-e36d-49d7-b985-dc1a39ad9253-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-r6lpd\" (UID: \"566b2f7b-e36d-49d7-b985-dc1a39ad9253\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6lpd"
Oct 11 01:25:43 crc kubenswrapper[4743]: I1011 01:25:43.317999 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/566b2f7b-e36d-49d7-b985-dc1a39ad9253-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-r6lpd\" (UID: \"566b2f7b-e36d-49d7-b985-dc1a39ad9253\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6lpd"
Oct 11 01:25:43 crc kubenswrapper[4743]: I1011 01:25:43.318948 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/566b2f7b-e36d-49d7-b985-dc1a39ad9253-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-r6lpd\" (UID: \"566b2f7b-e36d-49d7-b985-dc1a39ad9253\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6lpd"
Oct 11 01:25:43 crc kubenswrapper[4743]: I1011 01:25:43.327255 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q684s\" (UniqueName: \"kubernetes.io/projected/566b2f7b-e36d-49d7-b985-dc1a39ad9253-kube-api-access-q684s\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-r6lpd\" (UID: \"566b2f7b-e36d-49d7-b985-dc1a39ad9253\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6lpd"
Oct 11 01:25:43 crc kubenswrapper[4743]: I1011 01:25:43.467404 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6lpd"
Oct 11 01:25:44 crc kubenswrapper[4743]: I1011 01:25:44.046211 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6lpd"]
Oct 11 01:25:44 crc kubenswrapper[4743]: I1011 01:25:44.082729 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6lpd" event={"ID":"566b2f7b-e36d-49d7-b985-dc1a39ad9253","Type":"ContainerStarted","Data":"a40153ed9ad806ef088de88204361774bd4ac3f6a959a5965fcf671ede874777"}
Oct 11 01:25:45 crc kubenswrapper[4743]: I1011 01:25:45.092991 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6lpd" event={"ID":"566b2f7b-e36d-49d7-b985-dc1a39ad9253","Type":"ContainerStarted","Data":"11579dcba3db904e38d5126991e25cd8c01bfe9e4b7659a4b740c2198691b63d"}
Oct 11 01:25:45 crc kubenswrapper[4743]: I1011 01:25:45.117185 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6lpd" podStartSLOduration=1.554456317 podStartE2EDuration="2.117166025s" podCreationTimestamp="2025-10-11 01:25:43 +0000 UTC" firstStartedPulling="2025-10-11 01:25:44.050874223 +0000 UTC m=+2038.703854630" lastFinishedPulling="2025-10-11 01:25:44.613583911 +0000 UTC m=+2039.266564338" observedRunningTime="2025-10-11 01:25:45.107480158 +0000 UTC m=+2039.760460595" watchObservedRunningTime="2025-10-11 01:25:45.117166025 +0000 UTC m=+2039.770146422"
Oct 11 01:25:55 crc kubenswrapper[4743]: I1011 01:25:55.265167 4743 generic.go:334] "Generic (PLEG): container finished" podID="566b2f7b-e36d-49d7-b985-dc1a39ad9253" containerID="11579dcba3db904e38d5126991e25cd8c01bfe9e4b7659a4b740c2198691b63d" exitCode=0
Oct 11 01:25:55 crc kubenswrapper[4743]: I1011 01:25:55.265264 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6lpd" event={"ID":"566b2f7b-e36d-49d7-b985-dc1a39ad9253","Type":"ContainerDied","Data":"11579dcba3db904e38d5126991e25cd8c01bfe9e4b7659a4b740c2198691b63d"}
Oct 11 01:25:56 crc kubenswrapper[4743]: I1011 01:25:56.856808 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6lpd"
Oct 11 01:25:56 crc kubenswrapper[4743]: I1011 01:25:56.957709 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q684s\" (UniqueName: \"kubernetes.io/projected/566b2f7b-e36d-49d7-b985-dc1a39ad9253-kube-api-access-q684s\") pod \"566b2f7b-e36d-49d7-b985-dc1a39ad9253\" (UID: \"566b2f7b-e36d-49d7-b985-dc1a39ad9253\") "
Oct 11 01:25:56 crc kubenswrapper[4743]: I1011 01:25:56.958311 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/566b2f7b-e36d-49d7-b985-dc1a39ad9253-ssh-key\") pod \"566b2f7b-e36d-49d7-b985-dc1a39ad9253\" (UID: \"566b2f7b-e36d-49d7-b985-dc1a39ad9253\") "
Oct 11 01:25:56 crc kubenswrapper[4743]: I1011 01:25:56.958418 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/566b2f7b-e36d-49d7-b985-dc1a39ad9253-inventory\") pod \"566b2f7b-e36d-49d7-b985-dc1a39ad9253\" (UID: \"566b2f7b-e36d-49d7-b985-dc1a39ad9253\") "
Oct 11 01:25:56 crc kubenswrapper[4743]: I1011 01:25:56.969797 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/566b2f7b-e36d-49d7-b985-dc1a39ad9253-kube-api-access-q684s" (OuterVolumeSpecName: "kube-api-access-q684s") pod "566b2f7b-e36d-49d7-b985-dc1a39ad9253" (UID: "566b2f7b-e36d-49d7-b985-dc1a39ad9253"). InnerVolumeSpecName "kube-api-access-q684s". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 11 01:25:57 crc kubenswrapper[4743]: I1011 01:25:57.006159 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/566b2f7b-e36d-49d7-b985-dc1a39ad9253-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "566b2f7b-e36d-49d7-b985-dc1a39ad9253" (UID: "566b2f7b-e36d-49d7-b985-dc1a39ad9253"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 11 01:25:57 crc kubenswrapper[4743]: I1011 01:25:57.022213 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/566b2f7b-e36d-49d7-b985-dc1a39ad9253-inventory" (OuterVolumeSpecName: "inventory") pod "566b2f7b-e36d-49d7-b985-dc1a39ad9253" (UID: "566b2f7b-e36d-49d7-b985-dc1a39ad9253"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 11 01:25:57 crc kubenswrapper[4743]: I1011 01:25:57.061954 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q684s\" (UniqueName: \"kubernetes.io/projected/566b2f7b-e36d-49d7-b985-dc1a39ad9253-kube-api-access-q684s\") on node \"crc\" DevicePath \"\""
Oct 11 01:25:57 crc kubenswrapper[4743]: I1011 01:25:57.062027 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/566b2f7b-e36d-49d7-b985-dc1a39ad9253-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 11 01:25:57 crc kubenswrapper[4743]: I1011 01:25:57.062053 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/566b2f7b-e36d-49d7-b985-dc1a39ad9253-inventory\") on node \"crc\" DevicePath \"\""
Oct 11 01:25:57 crc kubenswrapper[4743]: I1011 01:25:57.294421 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6lpd" event={"ID":"566b2f7b-e36d-49d7-b985-dc1a39ad9253","Type":"ContainerDied","Data":"a40153ed9ad806ef088de88204361774bd4ac3f6a959a5965fcf671ede874777"}
Oct 11 01:25:57 crc kubenswrapper[4743]: I1011 01:25:57.294484 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a40153ed9ad806ef088de88204361774bd4ac3f6a959a5965fcf671ede874777"
Oct 11 01:25:57 crc kubenswrapper[4743]: I1011 01:25:57.294454 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6lpd"
Oct 11 01:25:57 crc kubenswrapper[4743]: I1011 01:25:57.381726 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-79pjt"]
Oct 11 01:25:57 crc kubenswrapper[4743]: E1011 01:25:57.385785 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="566b2f7b-e36d-49d7-b985-dc1a39ad9253" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Oct 11 01:25:57 crc kubenswrapper[4743]: I1011 01:25:57.385811 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="566b2f7b-e36d-49d7-b985-dc1a39ad9253" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Oct 11 01:25:57 crc kubenswrapper[4743]: I1011 01:25:57.386116 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="566b2f7b-e36d-49d7-b985-dc1a39ad9253" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Oct 11 01:25:57 crc kubenswrapper[4743]: I1011 01:25:57.387030 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-79pjt"
Oct 11 01:25:57 crc kubenswrapper[4743]: I1011 01:25:57.389508 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Oct 11 01:25:57 crc kubenswrapper[4743]: I1011 01:25:57.391818 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8vmmn"
Oct 11 01:25:57 crc kubenswrapper[4743]: I1011 01:25:57.391890 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 11 01:25:57 crc kubenswrapper[4743]: I1011 01:25:57.392607 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Oct 11 01:25:57 crc kubenswrapper[4743]: I1011 01:25:57.394668 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-79pjt"]
Oct 11 01:25:57 crc kubenswrapper[4743]: I1011 01:25:57.471827 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1f790cf8-5f73-4587-b2df-d0e7ef6622b0-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-79pjt\" (UID: \"1f790cf8-5f73-4587-b2df-d0e7ef6622b0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-79pjt"
Oct 11 01:25:57 crc kubenswrapper[4743]: I1011 01:25:57.471916 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4qd7\" (UniqueName: \"kubernetes.io/projected/1f790cf8-5f73-4587-b2df-d0e7ef6622b0-kube-api-access-n4qd7\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-79pjt\" (UID: \"1f790cf8-5f73-4587-b2df-d0e7ef6622b0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-79pjt"
Oct 11 01:25:57 crc kubenswrapper[4743]: I1011 01:25:57.472438 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1f790cf8-5f73-4587-b2df-d0e7ef6622b0-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-79pjt\" (UID: \"1f790cf8-5f73-4587-b2df-d0e7ef6622b0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-79pjt"
Oct 11 01:25:57 crc kubenswrapper[4743]: I1011 01:25:57.574318 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1f790cf8-5f73-4587-b2df-d0e7ef6622b0-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-79pjt\" (UID: \"1f790cf8-5f73-4587-b2df-d0e7ef6622b0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-79pjt"
Oct 11 01:25:57 crc kubenswrapper[4743]: I1011 01:25:57.574396 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4qd7\" (UniqueName: \"kubernetes.io/projected/1f790cf8-5f73-4587-b2df-d0e7ef6622b0-kube-api-access-n4qd7\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-79pjt\" (UID: \"1f790cf8-5f73-4587-b2df-d0e7ef6622b0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-79pjt"
Oct 11 01:25:57 crc kubenswrapper[4743]: I1011 01:25:57.574657 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1f790cf8-5f73-4587-b2df-d0e7ef6622b0-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-79pjt\" (UID: \"1f790cf8-5f73-4587-b2df-d0e7ef6622b0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-79pjt"
Oct 11 01:25:57 crc kubenswrapper[4743]: I1011 01:25:57.582295 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1f790cf8-5f73-4587-b2df-d0e7ef6622b0-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-79pjt\" (UID: \"1f790cf8-5f73-4587-b2df-d0e7ef6622b0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-79pjt"
Oct 11 01:25:57 crc kubenswrapper[4743]: I1011 01:25:57.582566 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1f790cf8-5f73-4587-b2df-d0e7ef6622b0-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-79pjt\" (UID: \"1f790cf8-5f73-4587-b2df-d0e7ef6622b0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-79pjt"
Oct 11 01:25:57 crc kubenswrapper[4743]: I1011 01:25:57.597706 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4qd7\" (UniqueName: \"kubernetes.io/projected/1f790cf8-5f73-4587-b2df-d0e7ef6622b0-kube-api-access-n4qd7\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-79pjt\" (UID: \"1f790cf8-5f73-4587-b2df-d0e7ef6622b0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-79pjt"
Oct 11 01:25:57 crc kubenswrapper[4743]: I1011 01:25:57.718077 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-79pjt"
Oct 11 01:25:58 crc kubenswrapper[4743]: I1011 01:25:58.305840 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-79pjt"]
Oct 11 01:25:59 crc kubenswrapper[4743]: I1011 01:25:59.320016 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-79pjt" event={"ID":"1f790cf8-5f73-4587-b2df-d0e7ef6622b0","Type":"ContainerStarted","Data":"66afe0758ed02251e593de3b2daeac8d2dbb2c88186d9e7047432c40f2dced08"}
Oct 11 01:25:59 crc kubenswrapper[4743]: I1011 01:25:59.320432 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-79pjt" event={"ID":"1f790cf8-5f73-4587-b2df-d0e7ef6622b0","Type":"ContainerStarted","Data":"905a23f1e7d1ac6e48e56383206815622d659b80ee0eb7b1dbc55ea85dafcfc5"}
Oct 11 01:25:59 crc kubenswrapper[4743]: I1011 01:25:59.339055 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-79pjt" podStartSLOduration=1.9263723480000001 podStartE2EDuration="2.339035349s" podCreationTimestamp="2025-10-11 01:25:57 +0000 UTC" firstStartedPulling="2025-10-11 01:25:58.310389765 +0000 UTC m=+2052.963370172" lastFinishedPulling="2025-10-11 01:25:58.723052776 +0000 UTC m=+2053.376033173" observedRunningTime="2025-10-11 01:25:59.338983438 +0000 UTC m=+2053.991963845" watchObservedRunningTime="2025-10-11 01:25:59.339035349 +0000 UTC m=+2053.992015746"
Oct 11 01:26:05 crc kubenswrapper[4743]: I1011 01:26:05.709530 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-d4m7w"]
Oct 11 01:26:05 crc kubenswrapper[4743]: I1011 01:26:05.713162 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d4m7w"
Oct 11 01:26:05 crc kubenswrapper[4743]: I1011 01:26:05.729693 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d4m7w"]
Oct 11 01:26:05 crc kubenswrapper[4743]: I1011 01:26:05.854556 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d89a5e4a-5095-4b5d-8757-68c78e28e25a-catalog-content\") pod \"redhat-operators-d4m7w\" (UID: \"d89a5e4a-5095-4b5d-8757-68c78e28e25a\") " pod="openshift-marketplace/redhat-operators-d4m7w"
Oct 11 01:26:05 crc kubenswrapper[4743]: I1011 01:26:05.854944 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tb882\" (UniqueName: \"kubernetes.io/projected/d89a5e4a-5095-4b5d-8757-68c78e28e25a-kube-api-access-tb882\") pod \"redhat-operators-d4m7w\" (UID: \"d89a5e4a-5095-4b5d-8757-68c78e28e25a\") " pod="openshift-marketplace/redhat-operators-d4m7w"
Oct 11 01:26:05 crc kubenswrapper[4743]: I1011 01:26:05.855023 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d89a5e4a-5095-4b5d-8757-68c78e28e25a-utilities\") pod \"redhat-operators-d4m7w\" (UID: \"d89a5e4a-5095-4b5d-8757-68c78e28e25a\") " pod="openshift-marketplace/redhat-operators-d4m7w"
Oct 11 01:26:05 crc kubenswrapper[4743]: I1011 01:26:05.956869 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d89a5e4a-5095-4b5d-8757-68c78e28e25a-catalog-content\") pod \"redhat-operators-d4m7w\" (UID: \"d89a5e4a-5095-4b5d-8757-68c78e28e25a\") " pod="openshift-marketplace/redhat-operators-d4m7w"
Oct 11 01:26:05 crc kubenswrapper[4743]: I1011 01:26:05.956994 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tb882\" (UniqueName: \"kubernetes.io/projected/d89a5e4a-5095-4b5d-8757-68c78e28e25a-kube-api-access-tb882\") pod \"redhat-operators-d4m7w\" (UID: \"d89a5e4a-5095-4b5d-8757-68c78e28e25a\") " pod="openshift-marketplace/redhat-operators-d4m7w"
Oct 11 01:26:05 crc kubenswrapper[4743]: I1011 01:26:05.957074 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d89a5e4a-5095-4b5d-8757-68c78e28e25a-utilities\") pod \"redhat-operators-d4m7w\" (UID: \"d89a5e4a-5095-4b5d-8757-68c78e28e25a\") " pod="openshift-marketplace/redhat-operators-d4m7w"
Oct 11 01:26:05 crc kubenswrapper[4743]: I1011 01:26:05.957501 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d89a5e4a-5095-4b5d-8757-68c78e28e25a-catalog-content\") pod \"redhat-operators-d4m7w\" (UID: \"d89a5e4a-5095-4b5d-8757-68c78e28e25a\") " pod="openshift-marketplace/redhat-operators-d4m7w"
Oct 11 01:26:05 crc kubenswrapper[4743]: I1011 01:26:05.957576 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d89a5e4a-5095-4b5d-8757-68c78e28e25a-utilities\") pod \"redhat-operators-d4m7w\" (UID: \"d89a5e4a-5095-4b5d-8757-68c78e28e25a\") " pod="openshift-marketplace/redhat-operators-d4m7w"
Oct 11 01:26:05 crc kubenswrapper[4743]: I1011 01:26:05.984438 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tb882\" (UniqueName: \"kubernetes.io/projected/d89a5e4a-5095-4b5d-8757-68c78e28e25a-kube-api-access-tb882\") pod \"redhat-operators-d4m7w\" (UID: \"d89a5e4a-5095-4b5d-8757-68c78e28e25a\") " pod="openshift-marketplace/redhat-operators-d4m7w"
Oct 11 01:26:06 crc kubenswrapper[4743]: I1011 01:26:06.064144 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d4m7w"
Oct 11 01:26:06 crc kubenswrapper[4743]: I1011 01:26:06.558469 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d4m7w"]
Oct 11 01:26:07 crc kubenswrapper[4743]: I1011 01:26:07.413437 4743 generic.go:334] "Generic (PLEG): container finished" podID="d89a5e4a-5095-4b5d-8757-68c78e28e25a" containerID="42eef27c35904d898015916eec315de772a0fc88b1b9fe725a01817a2f57a95c" exitCode=0
Oct 11 01:26:07 crc kubenswrapper[4743]: I1011 01:26:07.413527 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d4m7w" event={"ID":"d89a5e4a-5095-4b5d-8757-68c78e28e25a","Type":"ContainerDied","Data":"42eef27c35904d898015916eec315de772a0fc88b1b9fe725a01817a2f57a95c"}
Oct 11 01:26:07 crc kubenswrapper[4743]: I1011 01:26:07.413684 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d4m7w" event={"ID":"d89a5e4a-5095-4b5d-8757-68c78e28e25a","Type":"ContainerStarted","Data":"c6c347dab52dbd0c3de33e75077e625aa56254d10e482575acfa9085de60508c"}
Oct 11 01:26:09 crc kubenswrapper[4743]: I1011 01:26:09.434024 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d4m7w" event={"ID":"d89a5e4a-5095-4b5d-8757-68c78e28e25a","Type":"ContainerStarted","Data":"83baecd362cde102225d973abb62930b0c59446408b246777bbd9b3e6e3e1cb0"}
Oct 11 01:26:10 crc kubenswrapper[4743]: I1011 01:26:10.453333 4743 generic.go:334] "Generic (PLEG): container finished" podID="1f790cf8-5f73-4587-b2df-d0e7ef6622b0" containerID="66afe0758ed02251e593de3b2daeac8d2dbb2c88186d9e7047432c40f2dced08" exitCode=0
Oct 11 01:26:10 crc kubenswrapper[4743]: I1011 01:26:10.453538 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-79pjt" event={"ID":"1f790cf8-5f73-4587-b2df-d0e7ef6622b0","Type":"ContainerDied","Data":"66afe0758ed02251e593de3b2daeac8d2dbb2c88186d9e7047432c40f2dced08"}
Oct 11 01:26:12 crc kubenswrapper[4743]: I1011 01:26:12.226761 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-79pjt"
Oct 11 01:26:12 crc kubenswrapper[4743]: I1011 01:26:12.294136 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4qd7\" (UniqueName: \"kubernetes.io/projected/1f790cf8-5f73-4587-b2df-d0e7ef6622b0-kube-api-access-n4qd7\") pod \"1f790cf8-5f73-4587-b2df-d0e7ef6622b0\" (UID: \"1f790cf8-5f73-4587-b2df-d0e7ef6622b0\") "
Oct 11 01:26:12 crc kubenswrapper[4743]: I1011 01:26:12.294190 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1f790cf8-5f73-4587-b2df-d0e7ef6622b0-inventory\") pod \"1f790cf8-5f73-4587-b2df-d0e7ef6622b0\" (UID: \"1f790cf8-5f73-4587-b2df-d0e7ef6622b0\") "
Oct 11 01:26:12 crc kubenswrapper[4743]: I1011 01:26:12.294318 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1f790cf8-5f73-4587-b2df-d0e7ef6622b0-ssh-key\") pod \"1f790cf8-5f73-4587-b2df-d0e7ef6622b0\" (UID: \"1f790cf8-5f73-4587-b2df-d0e7ef6622b0\") "
Oct 11 01:26:12 crc kubenswrapper[4743]: I1011 01:26:12.300159 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f790cf8-5f73-4587-b2df-d0e7ef6622b0-kube-api-access-n4qd7" (OuterVolumeSpecName: "kube-api-access-n4qd7") pod "1f790cf8-5f73-4587-b2df-d0e7ef6622b0" (UID: "1f790cf8-5f73-4587-b2df-d0e7ef6622b0"). InnerVolumeSpecName "kube-api-access-n4qd7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 11 01:26:12 crc kubenswrapper[4743]: I1011 01:26:12.324336 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f790cf8-5f73-4587-b2df-d0e7ef6622b0-inventory" (OuterVolumeSpecName: "inventory") pod "1f790cf8-5f73-4587-b2df-d0e7ef6622b0" (UID: "1f790cf8-5f73-4587-b2df-d0e7ef6622b0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 11 01:26:12 crc kubenswrapper[4743]: I1011 01:26:12.326223 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f790cf8-5f73-4587-b2df-d0e7ef6622b0-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1f790cf8-5f73-4587-b2df-d0e7ef6622b0" (UID: "1f790cf8-5f73-4587-b2df-d0e7ef6622b0"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 11 01:26:12 crc kubenswrapper[4743]: I1011 01:26:12.397043 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n4qd7\" (UniqueName: \"kubernetes.io/projected/1f790cf8-5f73-4587-b2df-d0e7ef6622b0-kube-api-access-n4qd7\") on node \"crc\" DevicePath \"\""
Oct 11 01:26:12 crc kubenswrapper[4743]: I1011 01:26:12.397300 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1f790cf8-5f73-4587-b2df-d0e7ef6622b0-inventory\") on node \"crc\" DevicePath \"\""
Oct 11 01:26:12 crc kubenswrapper[4743]: I1011 01:26:12.397309 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1f790cf8-5f73-4587-b2df-d0e7ef6622b0-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 11 01:26:12 crc kubenswrapper[4743]: I1011 01:26:12.480964 4743 generic.go:334] "Generic (PLEG): container finished" podID="d89a5e4a-5095-4b5d-8757-68c78e28e25a" containerID="83baecd362cde102225d973abb62930b0c59446408b246777bbd9b3e6e3e1cb0" exitCode=0
Oct 11 01:26:12 crc kubenswrapper[4743]: I1011 01:26:12.481063 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d4m7w" event={"ID":"d89a5e4a-5095-4b5d-8757-68c78e28e25a","Type":"ContainerDied","Data":"83baecd362cde102225d973abb62930b0c59446408b246777bbd9b3e6e3e1cb0"}
Oct 11 01:26:12 crc kubenswrapper[4743]: I1011 01:26:12.483109 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-79pjt" event={"ID":"1f790cf8-5f73-4587-b2df-d0e7ef6622b0","Type":"ContainerDied","Data":"905a23f1e7d1ac6e48e56383206815622d659b80ee0eb7b1dbc55ea85dafcfc5"}
Oct 11 01:26:12 crc kubenswrapper[4743]: I1011 01:26:12.483153 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="905a23f1e7d1ac6e48e56383206815622d659b80ee0eb7b1dbc55ea85dafcfc5"
Oct 11 01:26:12 crc kubenswrapper[4743]: I1011 01:26:12.483265 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-79pjt"
Oct 11 01:26:12 crc kubenswrapper[4743]: I1011 01:26:12.581008 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4"]
Oct 11 01:26:12 crc kubenswrapper[4743]: E1011 01:26:12.581435 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f790cf8-5f73-4587-b2df-d0e7ef6622b0" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Oct 11 01:26:12 crc kubenswrapper[4743]: I1011 01:26:12.581471 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f790cf8-5f73-4587-b2df-d0e7ef6622b0" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Oct 11 01:26:12 crc kubenswrapper[4743]: I1011 01:26:12.581684 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f790cf8-5f73-4587-b2df-d0e7ef6622b0" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Oct 11 01:26:12 crc kubenswrapper[4743]: I1011 01:26:12.582510 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4"
Oct 11 01:26:12 crc kubenswrapper[4743]: I1011 01:26:12.588444 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0"
Oct 11 01:26:12 crc kubenswrapper[4743]: I1011 01:26:12.588692 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0"
Oct 11 01:26:12 crc kubenswrapper[4743]: I1011 01:26:12.588703 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0"
Oct 11 01:26:12 crc kubenswrapper[4743]: I1011 01:26:12.588821 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Oct 11 01:26:12 crc kubenswrapper[4743]: I1011 01:26:12.588950 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8vmmn"
Oct 11 01:26:12 crc kubenswrapper[4743]: I1011 01:26:12.589005 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 11 01:26:12 crc kubenswrapper[4743]: I1011 01:26:12.589108 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0"
Oct 11 01:26:12 crc kubenswrapper[4743]: I1011 01:26:12.589223 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Oct 11 01:26:12 crc kubenswrapper[4743]: I1011 01:26:12.607695 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4"]
Oct 11 01:26:12 crc kubenswrapper[4743]: I1011 01:26:12.703351 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4\" (UID: \"7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4"
Oct 11 01:26:12 crc kubenswrapper[4743]: I1011 01:26:12.703431 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4\" (UID: \"7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4"
Oct 11 01:26:12 crc kubenswrapper[4743]: I1011 01:26:12.703469 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4\" (UID: \"7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4"
Oct 11 01:26:12 crc kubenswrapper[4743]: I1011 01:26:12.703507 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4\" (UID: \"7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4"
Oct 11 01:26:12 crc kubenswrapper[4743]: I1011 01:26:12.703583 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4\" (UID: \"7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4"
Oct 11 01:26:12 crc kubenswrapper[4743]: I1011 01:26:12.703609 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4\" (UID: \"7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4"
Oct 11 01:26:12 crc kubenswrapper[4743]: I1011 01:26:12.703777 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4\" (UID: \"7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4"
Oct 11 01:26:12 crc kubenswrapper[4743]: I1011 01:26:12.703961 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5-telemetry-power-monitoring-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4\" (UID: \"7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4"
Oct 11 01:26:12 crc kubenswrapper[4743]: I1011
01:26:12.704020 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4\" (UID: \"7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4" Oct 11 01:26:12 crc kubenswrapper[4743]: I1011 01:26:12.704146 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9tmz\" (UniqueName: \"kubernetes.io/projected/7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5-kube-api-access-f9tmz\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4\" (UID: \"7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4" Oct 11 01:26:12 crc kubenswrapper[4743]: I1011 01:26:12.704188 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4\" (UID: \"7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4" Oct 11 01:26:12 crc kubenswrapper[4743]: I1011 01:26:12.704319 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4\" (UID: \"7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4" Oct 11 01:26:12 crc kubenswrapper[4743]: I1011 01:26:12.704374 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4\" (UID: \"7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4" Oct 11 01:26:12 crc kubenswrapper[4743]: I1011 01:26:12.806694 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4\" (UID: \"7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4" Oct 11 01:26:12 crc kubenswrapper[4743]: I1011 01:26:12.806754 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4\" (UID: \"7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4" Oct 11 01:26:12 crc kubenswrapper[4743]: I1011 01:26:12.806804 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4\" (UID: \"7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4" Oct 11 01:26:12 crc kubenswrapper[4743]: I1011 01:26:12.806877 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4\" (UID: \"7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4" Oct 11 01:26:12 crc kubenswrapper[4743]: I1011 01:26:12.806907 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4\" (UID: \"7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4" Oct 11 01:26:12 crc kubenswrapper[4743]: I1011 01:26:12.806949 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4\" (UID: \"7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4" Oct 11 01:26:12 crc kubenswrapper[4743]: I1011 01:26:12.806993 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5-telemetry-power-monitoring-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4\" (UID: \"7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4" Oct 11 01:26:12 crc kubenswrapper[4743]: I1011 01:26:12.807020 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4\" (UID: \"7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4" Oct 11 01:26:12 crc kubenswrapper[4743]: I1011 01:26:12.807085 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9tmz\" (UniqueName: \"kubernetes.io/projected/7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5-kube-api-access-f9tmz\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4\" (UID: \"7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4" Oct 11 01:26:12 crc kubenswrapper[4743]: I1011 01:26:12.807117 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4\" (UID: \"7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4" Oct 11 01:26:12 crc kubenswrapper[4743]: I1011 01:26:12.807162 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4\" (UID: \"7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4" Oct 11 01:26:12 crc kubenswrapper[4743]: I1011 01:26:12.807190 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4\" (UID: \"7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4" Oct 11 01:26:12 crc kubenswrapper[4743]: I1011 01:26:12.807270 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4\" (UID: \"7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4" Oct 11 01:26:12 crc kubenswrapper[4743]: I1011 01:26:12.813788 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5-telemetry-power-monitoring-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4\" (UID: \"7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4" Oct 11 01:26:12 crc kubenswrapper[4743]: I1011 01:26:12.840363 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4\" (UID: \"7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4" Oct 11 01:26:12 crc kubenswrapper[4743]: I1011 01:26:12.842874 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4\" (UID: \"7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4" Oct 11 01:26:12 crc kubenswrapper[4743]: I1011 01:26:12.843140 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4\" (UID: \"7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4" Oct 11 01:26:12 crc kubenswrapper[4743]: I1011 01:26:12.845405 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4\" (UID: \"7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4" Oct 11 01:26:12 crc kubenswrapper[4743]: I1011 01:26:12.845487 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4\" (UID: \"7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4" Oct 11 01:26:12 crc kubenswrapper[4743]: I1011 01:26:12.846149 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4\" (UID: \"7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4" Oct 11 01:26:12 crc kubenswrapper[4743]: I1011 01:26:12.846157 4743 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4\" (UID: \"7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4" Oct 11 01:26:12 crc kubenswrapper[4743]: I1011 01:26:12.849314 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4\" (UID: \"7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4" Oct 11 01:26:12 crc kubenswrapper[4743]: I1011 01:26:12.851138 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4\" (UID: \"7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4" Oct 11 01:26:12 crc kubenswrapper[4743]: I1011 01:26:12.851334 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4\" (UID: \"7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4" Oct 11 01:26:12 crc kubenswrapper[4743]: I1011 01:26:12.852670 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" 
(UniqueName: \"kubernetes.io/secret/7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4\" (UID: \"7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4" Oct 11 01:26:12 crc kubenswrapper[4743]: I1011 01:26:12.867976 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9tmz\" (UniqueName: \"kubernetes.io/projected/7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5-kube-api-access-f9tmz\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4\" (UID: \"7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4" Oct 11 01:26:12 crc kubenswrapper[4743]: I1011 01:26:12.911821 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4" Oct 11 01:26:13 crc kubenswrapper[4743]: I1011 01:26:13.480571 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4"] Oct 11 01:26:13 crc kubenswrapper[4743]: I1011 01:26:13.514982 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d4m7w" event={"ID":"d89a5e4a-5095-4b5d-8757-68c78e28e25a","Type":"ContainerStarted","Data":"193af6e6b0b944e67cddf537bda9a9a5306af6fcdc859b51283cfd164d862a4a"} Oct 11 01:26:13 crc kubenswrapper[4743]: I1011 01:26:13.535465 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-d4m7w" podStartSLOduration=3.00009361 podStartE2EDuration="8.535442496s" podCreationTimestamp="2025-10-11 01:26:05 +0000 UTC" firstStartedPulling="2025-10-11 01:26:07.415486225 +0000 UTC m=+2062.068466622" lastFinishedPulling="2025-10-11 01:26:12.950835121 +0000 UTC m=+2067.603815508" observedRunningTime="2025-10-11 01:26:13.529092285 +0000 UTC m=+2068.182072712" 
watchObservedRunningTime="2025-10-11 01:26:13.535442496 +0000 UTC m=+2068.188422903" Oct 11 01:26:14 crc kubenswrapper[4743]: I1011 01:26:14.529223 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4" event={"ID":"7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5","Type":"ContainerStarted","Data":"a5c45fb4c9e8a53dc8993d298f39bc21ac4e8328ec181a98d856ef78c7809701"} Oct 11 01:26:14 crc kubenswrapper[4743]: I1011 01:26:14.529717 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4" event={"ID":"7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5","Type":"ContainerStarted","Data":"2dd94b5c959b6940db2a4163ce99c0e2fa6e4558325350429930e8c5ab437313"} Oct 11 01:26:14 crc kubenswrapper[4743]: I1011 01:26:14.552445 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4" podStartSLOduration=2.100958116 podStartE2EDuration="2.552427564s" podCreationTimestamp="2025-10-11 01:26:12 +0000 UTC" firstStartedPulling="2025-10-11 01:26:13.496364252 +0000 UTC m=+2068.149344649" lastFinishedPulling="2025-10-11 01:26:13.94783371 +0000 UTC m=+2068.600814097" observedRunningTime="2025-10-11 01:26:14.551256004 +0000 UTC m=+2069.204236411" watchObservedRunningTime="2025-10-11 01:26:14.552427564 +0000 UTC m=+2069.205407961" Oct 11 01:26:16 crc kubenswrapper[4743]: I1011 01:26:16.064488 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-d4m7w" Oct 11 01:26:16 crc kubenswrapper[4743]: I1011 01:26:16.064912 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-d4m7w" Oct 11 01:26:17 crc kubenswrapper[4743]: I1011 01:26:17.119518 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-d4m7w" 
podUID="d89a5e4a-5095-4b5d-8757-68c78e28e25a" containerName="registry-server" probeResult="failure" output=< Oct 11 01:26:17 crc kubenswrapper[4743]: timeout: failed to connect service ":50051" within 1s Oct 11 01:26:17 crc kubenswrapper[4743]: > Oct 11 01:26:19 crc kubenswrapper[4743]: I1011 01:26:19.555366 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zmx2h"] Oct 11 01:26:19 crc kubenswrapper[4743]: I1011 01:26:19.557736 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zmx2h" Oct 11 01:26:19 crc kubenswrapper[4743]: I1011 01:26:19.582441 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zmx2h"] Oct 11 01:26:19 crc kubenswrapper[4743]: I1011 01:26:19.702340 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa41272c-b7b3-4bd1-81db-b787376104b6-catalog-content\") pod \"redhat-marketplace-zmx2h\" (UID: \"aa41272c-b7b3-4bd1-81db-b787376104b6\") " pod="openshift-marketplace/redhat-marketplace-zmx2h" Oct 11 01:26:19 crc kubenswrapper[4743]: I1011 01:26:19.702668 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqzv5\" (UniqueName: \"kubernetes.io/projected/aa41272c-b7b3-4bd1-81db-b787376104b6-kube-api-access-pqzv5\") pod \"redhat-marketplace-zmx2h\" (UID: \"aa41272c-b7b3-4bd1-81db-b787376104b6\") " pod="openshift-marketplace/redhat-marketplace-zmx2h" Oct 11 01:26:19 crc kubenswrapper[4743]: I1011 01:26:19.702707 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa41272c-b7b3-4bd1-81db-b787376104b6-utilities\") pod \"redhat-marketplace-zmx2h\" (UID: \"aa41272c-b7b3-4bd1-81db-b787376104b6\") " 
pod="openshift-marketplace/redhat-marketplace-zmx2h" Oct 11 01:26:19 crc kubenswrapper[4743]: I1011 01:26:19.804331 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa41272c-b7b3-4bd1-81db-b787376104b6-catalog-content\") pod \"redhat-marketplace-zmx2h\" (UID: \"aa41272c-b7b3-4bd1-81db-b787376104b6\") " pod="openshift-marketplace/redhat-marketplace-zmx2h" Oct 11 01:26:19 crc kubenswrapper[4743]: I1011 01:26:19.804462 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqzv5\" (UniqueName: \"kubernetes.io/projected/aa41272c-b7b3-4bd1-81db-b787376104b6-kube-api-access-pqzv5\") pod \"redhat-marketplace-zmx2h\" (UID: \"aa41272c-b7b3-4bd1-81db-b787376104b6\") " pod="openshift-marketplace/redhat-marketplace-zmx2h" Oct 11 01:26:19 crc kubenswrapper[4743]: I1011 01:26:19.804514 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa41272c-b7b3-4bd1-81db-b787376104b6-utilities\") pod \"redhat-marketplace-zmx2h\" (UID: \"aa41272c-b7b3-4bd1-81db-b787376104b6\") " pod="openshift-marketplace/redhat-marketplace-zmx2h" Oct 11 01:26:19 crc kubenswrapper[4743]: I1011 01:26:19.805045 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa41272c-b7b3-4bd1-81db-b787376104b6-utilities\") pod \"redhat-marketplace-zmx2h\" (UID: \"aa41272c-b7b3-4bd1-81db-b787376104b6\") " pod="openshift-marketplace/redhat-marketplace-zmx2h" Oct 11 01:26:19 crc kubenswrapper[4743]: I1011 01:26:19.805284 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa41272c-b7b3-4bd1-81db-b787376104b6-catalog-content\") pod \"redhat-marketplace-zmx2h\" (UID: \"aa41272c-b7b3-4bd1-81db-b787376104b6\") " pod="openshift-marketplace/redhat-marketplace-zmx2h" 
Oct 11 01:26:19 crc kubenswrapper[4743]: I1011 01:26:19.837634 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqzv5\" (UniqueName: \"kubernetes.io/projected/aa41272c-b7b3-4bd1-81db-b787376104b6-kube-api-access-pqzv5\") pod \"redhat-marketplace-zmx2h\" (UID: \"aa41272c-b7b3-4bd1-81db-b787376104b6\") " pod="openshift-marketplace/redhat-marketplace-zmx2h" Oct 11 01:26:19 crc kubenswrapper[4743]: I1011 01:26:19.884199 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zmx2h" Oct 11 01:26:20 crc kubenswrapper[4743]: I1011 01:26:20.333432 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zmx2h"] Oct 11 01:26:20 crc kubenswrapper[4743]: W1011 01:26:20.334156 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa41272c_b7b3_4bd1_81db_b787376104b6.slice/crio-f44bc8e110f66ac53e3340513ddb11b52ab9a5ea0cbb22530ae6c1caf730cd93 WatchSource:0}: Error finding container f44bc8e110f66ac53e3340513ddb11b52ab9a5ea0cbb22530ae6c1caf730cd93: Status 404 returned error can't find the container with id f44bc8e110f66ac53e3340513ddb11b52ab9a5ea0cbb22530ae6c1caf730cd93 Oct 11 01:26:20 crc kubenswrapper[4743]: I1011 01:26:20.618500 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zmx2h" event={"ID":"aa41272c-b7b3-4bd1-81db-b787376104b6","Type":"ContainerStarted","Data":"0c04402ad0e181c253f4aaa3c98aeb2f2ce5291980677452e7b43040b1d85ef6"} Oct 11 01:26:20 crc kubenswrapper[4743]: I1011 01:26:20.618787 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zmx2h" event={"ID":"aa41272c-b7b3-4bd1-81db-b787376104b6","Type":"ContainerStarted","Data":"f44bc8e110f66ac53e3340513ddb11b52ab9a5ea0cbb22530ae6c1caf730cd93"} Oct 11 01:26:21 crc kubenswrapper[4743]: I1011 
01:26:21.652626 4743 generic.go:334] "Generic (PLEG): container finished" podID="aa41272c-b7b3-4bd1-81db-b787376104b6" containerID="0c04402ad0e181c253f4aaa3c98aeb2f2ce5291980677452e7b43040b1d85ef6" exitCode=0 Oct 11 01:26:21 crc kubenswrapper[4743]: I1011 01:26:21.653016 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zmx2h" event={"ID":"aa41272c-b7b3-4bd1-81db-b787376104b6","Type":"ContainerDied","Data":"0c04402ad0e181c253f4aaa3c98aeb2f2ce5291980677452e7b43040b1d85ef6"} Oct 11 01:26:22 crc kubenswrapper[4743]: I1011 01:26:22.664224 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zmx2h" event={"ID":"aa41272c-b7b3-4bd1-81db-b787376104b6","Type":"ContainerStarted","Data":"3a95c2cce256add83ba893f62903396579aafdff6935a07eec3a62bc698e0d14"} Oct 11 01:26:23 crc kubenswrapper[4743]: I1011 01:26:23.680615 4743 generic.go:334] "Generic (PLEG): container finished" podID="aa41272c-b7b3-4bd1-81db-b787376104b6" containerID="3a95c2cce256add83ba893f62903396579aafdff6935a07eec3a62bc698e0d14" exitCode=0 Oct 11 01:26:23 crc kubenswrapper[4743]: I1011 01:26:23.680676 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zmx2h" event={"ID":"aa41272c-b7b3-4bd1-81db-b787376104b6","Type":"ContainerDied","Data":"3a95c2cce256add83ba893f62903396579aafdff6935a07eec3a62bc698e0d14"} Oct 11 01:26:24 crc kubenswrapper[4743]: I1011 01:26:24.695197 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zmx2h" event={"ID":"aa41272c-b7b3-4bd1-81db-b787376104b6","Type":"ContainerStarted","Data":"c178d94dd64ed783c4d6d5a268af206ef1f504cf339bbe50ee543b4b4d62f469"} Oct 11 01:26:24 crc kubenswrapper[4743]: I1011 01:26:24.712238 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zmx2h" podStartSLOduration=3.148672 
podStartE2EDuration="5.712209659s" podCreationTimestamp="2025-10-11 01:26:19 +0000 UTC" firstStartedPulling="2025-10-11 01:26:21.655736347 +0000 UTC m=+2076.308716784" lastFinishedPulling="2025-10-11 01:26:24.219274036 +0000 UTC m=+2078.872254443" observedRunningTime="2025-10-11 01:26:24.710118926 +0000 UTC m=+2079.363099323" watchObservedRunningTime="2025-10-11 01:26:24.712209659 +0000 UTC m=+2079.365190096" Oct 11 01:26:26 crc kubenswrapper[4743]: I1011 01:26:26.123727 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-d4m7w" Oct 11 01:26:26 crc kubenswrapper[4743]: I1011 01:26:26.177267 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-d4m7w" Oct 11 01:26:26 crc kubenswrapper[4743]: I1011 01:26:26.880577 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-d4m7w"] Oct 11 01:26:27 crc kubenswrapper[4743]: I1011 01:26:27.734714 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-d4m7w" podUID="d89a5e4a-5095-4b5d-8757-68c78e28e25a" containerName="registry-server" containerID="cri-o://193af6e6b0b944e67cddf537bda9a9a5306af6fcdc859b51283cfd164d862a4a" gracePeriod=2 Oct 11 01:26:28 crc kubenswrapper[4743]: I1011 01:26:28.746831 4743 generic.go:334] "Generic (PLEG): container finished" podID="d89a5e4a-5095-4b5d-8757-68c78e28e25a" containerID="193af6e6b0b944e67cddf537bda9a9a5306af6fcdc859b51283cfd164d862a4a" exitCode=0 Oct 11 01:26:28 crc kubenswrapper[4743]: I1011 01:26:28.746880 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d4m7w" event={"ID":"d89a5e4a-5095-4b5d-8757-68c78e28e25a","Type":"ContainerDied","Data":"193af6e6b0b944e67cddf537bda9a9a5306af6fcdc859b51283cfd164d862a4a"} Oct 11 01:26:28 crc kubenswrapper[4743]: I1011 01:26:28.747622 4743 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d4m7w" event={"ID":"d89a5e4a-5095-4b5d-8757-68c78e28e25a","Type":"ContainerDied","Data":"c6c347dab52dbd0c3de33e75077e625aa56254d10e482575acfa9085de60508c"} Oct 11 01:26:28 crc kubenswrapper[4743]: I1011 01:26:28.747642 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c6c347dab52dbd0c3de33e75077e625aa56254d10e482575acfa9085de60508c" Oct 11 01:26:28 crc kubenswrapper[4743]: I1011 01:26:28.791852 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d4m7w" Oct 11 01:26:28 crc kubenswrapper[4743]: I1011 01:26:28.922958 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tb882\" (UniqueName: \"kubernetes.io/projected/d89a5e4a-5095-4b5d-8757-68c78e28e25a-kube-api-access-tb882\") pod \"d89a5e4a-5095-4b5d-8757-68c78e28e25a\" (UID: \"d89a5e4a-5095-4b5d-8757-68c78e28e25a\") " Oct 11 01:26:28 crc kubenswrapper[4743]: I1011 01:26:28.923176 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d89a5e4a-5095-4b5d-8757-68c78e28e25a-catalog-content\") pod \"d89a5e4a-5095-4b5d-8757-68c78e28e25a\" (UID: \"d89a5e4a-5095-4b5d-8757-68c78e28e25a\") " Oct 11 01:26:28 crc kubenswrapper[4743]: I1011 01:26:28.923222 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d89a5e4a-5095-4b5d-8757-68c78e28e25a-utilities\") pod \"d89a5e4a-5095-4b5d-8757-68c78e28e25a\" (UID: \"d89a5e4a-5095-4b5d-8757-68c78e28e25a\") " Oct 11 01:26:28 crc kubenswrapper[4743]: I1011 01:26:28.924802 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d89a5e4a-5095-4b5d-8757-68c78e28e25a-utilities" (OuterVolumeSpecName: "utilities") pod "d89a5e4a-5095-4b5d-8757-68c78e28e25a" 
(UID: "d89a5e4a-5095-4b5d-8757-68c78e28e25a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:26:28 crc kubenswrapper[4743]: I1011 01:26:28.930424 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d89a5e4a-5095-4b5d-8757-68c78e28e25a-kube-api-access-tb882" (OuterVolumeSpecName: "kube-api-access-tb882") pod "d89a5e4a-5095-4b5d-8757-68c78e28e25a" (UID: "d89a5e4a-5095-4b5d-8757-68c78e28e25a"). InnerVolumeSpecName "kube-api-access-tb882". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:26:29 crc kubenswrapper[4743]: I1011 01:26:29.027462 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d89a5e4a-5095-4b5d-8757-68c78e28e25a-utilities\") on node \"crc\" DevicePath \"\"" Oct 11 01:26:29 crc kubenswrapper[4743]: I1011 01:26:29.027514 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tb882\" (UniqueName: \"kubernetes.io/projected/d89a5e4a-5095-4b5d-8757-68c78e28e25a-kube-api-access-tb882\") on node \"crc\" DevicePath \"\"" Oct 11 01:26:29 crc kubenswrapper[4743]: I1011 01:26:29.054418 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d89a5e4a-5095-4b5d-8757-68c78e28e25a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d89a5e4a-5095-4b5d-8757-68c78e28e25a" (UID: "d89a5e4a-5095-4b5d-8757-68c78e28e25a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:26:29 crc kubenswrapper[4743]: I1011 01:26:29.129624 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d89a5e4a-5095-4b5d-8757-68c78e28e25a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 11 01:26:29 crc kubenswrapper[4743]: I1011 01:26:29.755069 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-d4m7w" Oct 11 01:26:29 crc kubenswrapper[4743]: I1011 01:26:29.784909 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-d4m7w"] Oct 11 01:26:29 crc kubenswrapper[4743]: I1011 01:26:29.800242 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-d4m7w"] Oct 11 01:26:29 crc kubenswrapper[4743]: I1011 01:26:29.884502 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zmx2h" Oct 11 01:26:29 crc kubenswrapper[4743]: I1011 01:26:29.884553 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zmx2h" Oct 11 01:26:29 crc kubenswrapper[4743]: I1011 01:26:29.927199 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zmx2h" Oct 11 01:26:30 crc kubenswrapper[4743]: I1011 01:26:30.119985 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d89a5e4a-5095-4b5d-8757-68c78e28e25a" path="/var/lib/kubelet/pods/d89a5e4a-5095-4b5d-8757-68c78e28e25a/volumes" Oct 11 01:26:30 crc kubenswrapper[4743]: I1011 01:26:30.893204 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zmx2h" Oct 11 01:26:32 crc kubenswrapper[4743]: I1011 01:26:32.295059 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zmx2h"] Oct 11 01:26:32 crc kubenswrapper[4743]: I1011 01:26:32.794781 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-zmx2h" podUID="aa41272c-b7b3-4bd1-81db-b787376104b6" containerName="registry-server" containerID="cri-o://c178d94dd64ed783c4d6d5a268af206ef1f504cf339bbe50ee543b4b4d62f469" gracePeriod=2 Oct 11 01:26:33 crc 
kubenswrapper[4743]: I1011 01:26:33.350000 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zmx2h" Oct 11 01:26:33 crc kubenswrapper[4743]: I1011 01:26:33.544292 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pqzv5\" (UniqueName: \"kubernetes.io/projected/aa41272c-b7b3-4bd1-81db-b787376104b6-kube-api-access-pqzv5\") pod \"aa41272c-b7b3-4bd1-81db-b787376104b6\" (UID: \"aa41272c-b7b3-4bd1-81db-b787376104b6\") " Oct 11 01:26:33 crc kubenswrapper[4743]: I1011 01:26:33.544359 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa41272c-b7b3-4bd1-81db-b787376104b6-utilities\") pod \"aa41272c-b7b3-4bd1-81db-b787376104b6\" (UID: \"aa41272c-b7b3-4bd1-81db-b787376104b6\") " Oct 11 01:26:33 crc kubenswrapper[4743]: I1011 01:26:33.544445 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa41272c-b7b3-4bd1-81db-b787376104b6-catalog-content\") pod \"aa41272c-b7b3-4bd1-81db-b787376104b6\" (UID: \"aa41272c-b7b3-4bd1-81db-b787376104b6\") " Oct 11 01:26:33 crc kubenswrapper[4743]: I1011 01:26:33.545250 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa41272c-b7b3-4bd1-81db-b787376104b6-utilities" (OuterVolumeSpecName: "utilities") pod "aa41272c-b7b3-4bd1-81db-b787376104b6" (UID: "aa41272c-b7b3-4bd1-81db-b787376104b6"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:26:33 crc kubenswrapper[4743]: I1011 01:26:33.551090 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa41272c-b7b3-4bd1-81db-b787376104b6-kube-api-access-pqzv5" (OuterVolumeSpecName: "kube-api-access-pqzv5") pod "aa41272c-b7b3-4bd1-81db-b787376104b6" (UID: "aa41272c-b7b3-4bd1-81db-b787376104b6"). InnerVolumeSpecName "kube-api-access-pqzv5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:26:33 crc kubenswrapper[4743]: I1011 01:26:33.566234 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa41272c-b7b3-4bd1-81db-b787376104b6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "aa41272c-b7b3-4bd1-81db-b787376104b6" (UID: "aa41272c-b7b3-4bd1-81db-b787376104b6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:26:33 crc kubenswrapper[4743]: I1011 01:26:33.647122 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pqzv5\" (UniqueName: \"kubernetes.io/projected/aa41272c-b7b3-4bd1-81db-b787376104b6-kube-api-access-pqzv5\") on node \"crc\" DevicePath \"\"" Oct 11 01:26:33 crc kubenswrapper[4743]: I1011 01:26:33.647175 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa41272c-b7b3-4bd1-81db-b787376104b6-utilities\") on node \"crc\" DevicePath \"\"" Oct 11 01:26:33 crc kubenswrapper[4743]: I1011 01:26:33.647193 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa41272c-b7b3-4bd1-81db-b787376104b6-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 11 01:26:33 crc kubenswrapper[4743]: I1011 01:26:33.807771 4743 generic.go:334] "Generic (PLEG): container finished" podID="aa41272c-b7b3-4bd1-81db-b787376104b6" 
containerID="c178d94dd64ed783c4d6d5a268af206ef1f504cf339bbe50ee543b4b4d62f469" exitCode=0 Oct 11 01:26:33 crc kubenswrapper[4743]: I1011 01:26:33.807811 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zmx2h" event={"ID":"aa41272c-b7b3-4bd1-81db-b787376104b6","Type":"ContainerDied","Data":"c178d94dd64ed783c4d6d5a268af206ef1f504cf339bbe50ee543b4b4d62f469"} Oct 11 01:26:33 crc kubenswrapper[4743]: I1011 01:26:33.807866 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zmx2h" event={"ID":"aa41272c-b7b3-4bd1-81db-b787376104b6","Type":"ContainerDied","Data":"f44bc8e110f66ac53e3340513ddb11b52ab9a5ea0cbb22530ae6c1caf730cd93"} Oct 11 01:26:33 crc kubenswrapper[4743]: I1011 01:26:33.807867 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zmx2h" Oct 11 01:26:33 crc kubenswrapper[4743]: I1011 01:26:33.807885 4743 scope.go:117] "RemoveContainer" containerID="c178d94dd64ed783c4d6d5a268af206ef1f504cf339bbe50ee543b4b4d62f469" Oct 11 01:26:33 crc kubenswrapper[4743]: I1011 01:26:33.837430 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zmx2h"] Oct 11 01:26:33 crc kubenswrapper[4743]: I1011 01:26:33.840506 4743 scope.go:117] "RemoveContainer" containerID="3a95c2cce256add83ba893f62903396579aafdff6935a07eec3a62bc698e0d14" Oct 11 01:26:33 crc kubenswrapper[4743]: I1011 01:26:33.846788 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-zmx2h"] Oct 11 01:26:33 crc kubenswrapper[4743]: I1011 01:26:33.865153 4743 scope.go:117] "RemoveContainer" containerID="0c04402ad0e181c253f4aaa3c98aeb2f2ce5291980677452e7b43040b1d85ef6" Oct 11 01:26:33 crc kubenswrapper[4743]: I1011 01:26:33.931214 4743 scope.go:117] "RemoveContainer" containerID="c178d94dd64ed783c4d6d5a268af206ef1f504cf339bbe50ee543b4b4d62f469" Oct 11 
01:26:33 crc kubenswrapper[4743]: E1011 01:26:33.931923 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c178d94dd64ed783c4d6d5a268af206ef1f504cf339bbe50ee543b4b4d62f469\": container with ID starting with c178d94dd64ed783c4d6d5a268af206ef1f504cf339bbe50ee543b4b4d62f469 not found: ID does not exist" containerID="c178d94dd64ed783c4d6d5a268af206ef1f504cf339bbe50ee543b4b4d62f469" Oct 11 01:26:33 crc kubenswrapper[4743]: I1011 01:26:33.931995 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c178d94dd64ed783c4d6d5a268af206ef1f504cf339bbe50ee543b4b4d62f469"} err="failed to get container status \"c178d94dd64ed783c4d6d5a268af206ef1f504cf339bbe50ee543b4b4d62f469\": rpc error: code = NotFound desc = could not find container \"c178d94dd64ed783c4d6d5a268af206ef1f504cf339bbe50ee543b4b4d62f469\": container with ID starting with c178d94dd64ed783c4d6d5a268af206ef1f504cf339bbe50ee543b4b4d62f469 not found: ID does not exist" Oct 11 01:26:33 crc kubenswrapper[4743]: I1011 01:26:33.932048 4743 scope.go:117] "RemoveContainer" containerID="3a95c2cce256add83ba893f62903396579aafdff6935a07eec3a62bc698e0d14" Oct 11 01:26:33 crc kubenswrapper[4743]: E1011 01:26:33.932646 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a95c2cce256add83ba893f62903396579aafdff6935a07eec3a62bc698e0d14\": container with ID starting with 3a95c2cce256add83ba893f62903396579aafdff6935a07eec3a62bc698e0d14 not found: ID does not exist" containerID="3a95c2cce256add83ba893f62903396579aafdff6935a07eec3a62bc698e0d14" Oct 11 01:26:33 crc kubenswrapper[4743]: I1011 01:26:33.932718 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a95c2cce256add83ba893f62903396579aafdff6935a07eec3a62bc698e0d14"} err="failed to get container status 
\"3a95c2cce256add83ba893f62903396579aafdff6935a07eec3a62bc698e0d14\": rpc error: code = NotFound desc = could not find container \"3a95c2cce256add83ba893f62903396579aafdff6935a07eec3a62bc698e0d14\": container with ID starting with 3a95c2cce256add83ba893f62903396579aafdff6935a07eec3a62bc698e0d14 not found: ID does not exist" Oct 11 01:26:33 crc kubenswrapper[4743]: I1011 01:26:33.932766 4743 scope.go:117] "RemoveContainer" containerID="0c04402ad0e181c253f4aaa3c98aeb2f2ce5291980677452e7b43040b1d85ef6" Oct 11 01:26:33 crc kubenswrapper[4743]: E1011 01:26:33.933217 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c04402ad0e181c253f4aaa3c98aeb2f2ce5291980677452e7b43040b1d85ef6\": container with ID starting with 0c04402ad0e181c253f4aaa3c98aeb2f2ce5291980677452e7b43040b1d85ef6 not found: ID does not exist" containerID="0c04402ad0e181c253f4aaa3c98aeb2f2ce5291980677452e7b43040b1d85ef6" Oct 11 01:26:33 crc kubenswrapper[4743]: I1011 01:26:33.933282 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c04402ad0e181c253f4aaa3c98aeb2f2ce5291980677452e7b43040b1d85ef6"} err="failed to get container status \"0c04402ad0e181c253f4aaa3c98aeb2f2ce5291980677452e7b43040b1d85ef6\": rpc error: code = NotFound desc = could not find container \"0c04402ad0e181c253f4aaa3c98aeb2f2ce5291980677452e7b43040b1d85ef6\": container with ID starting with 0c04402ad0e181c253f4aaa3c98aeb2f2ce5291980677452e7b43040b1d85ef6 not found: ID does not exist" Oct 11 01:26:34 crc kubenswrapper[4743]: I1011 01:26:34.111815 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa41272c-b7b3-4bd1-81db-b787376104b6" path="/var/lib/kubelet/pods/aa41272c-b7b3-4bd1-81db-b787376104b6/volumes" Oct 11 01:26:53 crc kubenswrapper[4743]: I1011 01:26:53.054983 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-lc4jr"] Oct 11 01:26:53 crc 
kubenswrapper[4743]: I1011 01:26:53.063018 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-lc4jr"] Oct 11 01:26:54 crc kubenswrapper[4743]: I1011 01:26:54.102426 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5a1515f-1a64-44da-b8e1-b41ff6aa1c8d" path="/var/lib/kubelet/pods/b5a1515f-1a64-44da-b8e1-b41ff6aa1c8d/volumes" Oct 11 01:27:01 crc kubenswrapper[4743]: I1011 01:27:01.118263 4743 generic.go:334] "Generic (PLEG): container finished" podID="7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5" containerID="a5c45fb4c9e8a53dc8993d298f39bc21ac4e8328ec181a98d856ef78c7809701" exitCode=0 Oct 11 01:27:01 crc kubenswrapper[4743]: I1011 01:27:01.118326 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4" event={"ID":"7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5","Type":"ContainerDied","Data":"a5c45fb4c9e8a53dc8993d298f39bc21ac4e8328ec181a98d856ef78c7809701"} Oct 11 01:27:02 crc kubenswrapper[4743]: I1011 01:27:02.751285 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4" Oct 11 01:27:02 crc kubenswrapper[4743]: I1011 01:27:02.867006 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5\" (UID: \"7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5\") " Oct 11 01:27:02 crc kubenswrapper[4743]: I1011 01:27:02.867231 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5-repo-setup-combined-ca-bundle\") pod \"7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5\" (UID: \"7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5\") " Oct 11 01:27:02 crc kubenswrapper[4743]: I1011 01:27:02.867287 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5-openstack-edpm-ipam-ovn-default-certs-0\") pod \"7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5\" (UID: \"7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5\") " Oct 11 01:27:02 crc kubenswrapper[4743]: I1011 01:27:02.867350 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5-inventory\") pod \"7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5\" (UID: \"7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5\") " Oct 11 01:27:02 crc kubenswrapper[4743]: I1011 01:27:02.867398 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9tmz\" (UniqueName: \"kubernetes.io/projected/7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5-kube-api-access-f9tmz\") pod \"7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5\" (UID: 
\"7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5\") " Oct 11 01:27:02 crc kubenswrapper[4743]: I1011 01:27:02.867426 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5\" (UID: \"7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5\") " Oct 11 01:27:02 crc kubenswrapper[4743]: I1011 01:27:02.867457 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5-telemetry-combined-ca-bundle\") pod \"7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5\" (UID: \"7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5\") " Oct 11 01:27:02 crc kubenswrapper[4743]: I1011 01:27:02.867488 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5-telemetry-power-monitoring-combined-ca-bundle\") pod \"7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5\" (UID: \"7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5\") " Oct 11 01:27:02 crc kubenswrapper[4743]: I1011 01:27:02.867583 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5-ssh-key\") pod \"7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5\" (UID: \"7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5\") " Oct 11 01:27:02 crc kubenswrapper[4743]: I1011 01:27:02.868315 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5-libvirt-combined-ca-bundle\") pod \"7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5\" (UID: \"7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5\") " Oct 11 01:27:02 crc 
kubenswrapper[4743]: I1011 01:27:02.868341 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5-bootstrap-combined-ca-bundle\") pod \"7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5\" (UID: \"7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5\") " Oct 11 01:27:02 crc kubenswrapper[4743]: I1011 01:27:02.868393 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5-ovn-combined-ca-bundle\") pod \"7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5\" (UID: \"7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5\") " Oct 11 01:27:02 crc kubenswrapper[4743]: I1011 01:27:02.868466 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5\" (UID: \"7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5\") " Oct 11 01:27:02 crc kubenswrapper[4743]: I1011 01:27:02.874629 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5-kube-api-access-f9tmz" (OuterVolumeSpecName: "kube-api-access-f9tmz") pod "7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5" (UID: "7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5"). InnerVolumeSpecName "kube-api-access-f9tmz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:27:02 crc kubenswrapper[4743]: I1011 01:27:02.875186 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5-telemetry-power-monitoring-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-power-monitoring-combined-ca-bundle") pod "7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5" (UID: "7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5"). InnerVolumeSpecName "telemetry-power-monitoring-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:27:02 crc kubenswrapper[4743]: I1011 01:27:02.875311 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5" (UID: "7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:27:02 crc kubenswrapper[4743]: I1011 01:27:02.876835 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5" (UID: "7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:27:02 crc kubenswrapper[4743]: I1011 01:27:02.876885 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0") pod "7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5" (UID: "7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:27:02 crc kubenswrapper[4743]: I1011 01:27:02.877190 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5" (UID: "7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:27:02 crc kubenswrapper[4743]: I1011 01:27:02.880381 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5" (UID: "7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:27:02 crc kubenswrapper[4743]: I1011 01:27:02.880411 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5" (UID: "7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5"). 
InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:27:02 crc kubenswrapper[4743]: I1011 01:27:02.880521 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5" (UID: "7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:27:02 crc kubenswrapper[4743]: I1011 01:27:02.880638 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5" (UID: "7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:27:02 crc kubenswrapper[4743]: I1011 01:27:02.893071 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5" (UID: "7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:27:02 crc kubenswrapper[4743]: I1011 01:27:02.915220 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5-inventory" (OuterVolumeSpecName: "inventory") pod "7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5" (UID: "7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:27:02 crc kubenswrapper[4743]: I1011 01:27:02.916626 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5" (UID: "7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:27:02 crc kubenswrapper[4743]: I1011 01:27:02.970841 4743 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 01:27:02 crc kubenswrapper[4743]: I1011 01:27:02.970885 4743 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 11 01:27:02 crc kubenswrapper[4743]: I1011 01:27:02.970895 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5-inventory\") on node \"crc\" DevicePath \"\"" Oct 11 01:27:02 crc kubenswrapper[4743]: I1011 01:27:02.970905 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9tmz\" (UniqueName: \"kubernetes.io/projected/7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5-kube-api-access-f9tmz\") on node \"crc\" DevicePath \"\"" Oct 11 01:27:02 crc kubenswrapper[4743]: I1011 01:27:02.970914 4743 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 11 01:27:02 crc 
kubenswrapper[4743]: I1011 01:27:02.970924 4743 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 01:27:02 crc kubenswrapper[4743]: I1011 01:27:02.970932 4743 reconciler_common.go:293] "Volume detached for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5-telemetry-power-monitoring-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 01:27:02 crc kubenswrapper[4743]: I1011 01:27:02.970944 4743 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 01:27:02 crc kubenswrapper[4743]: I1011 01:27:02.970951 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 11 01:27:02 crc kubenswrapper[4743]: I1011 01:27:02.970959 4743 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 01:27:02 crc kubenswrapper[4743]: I1011 01:27:02.970968 4743 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 01:27:02 crc kubenswrapper[4743]: I1011 01:27:02.970977 4743 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 11 01:27:02 crc kubenswrapper[4743]: I1011 01:27:02.970987 4743 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 11 01:27:03 crc kubenswrapper[4743]: I1011 01:27:03.146296 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4" event={"ID":"7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5","Type":"ContainerDied","Data":"2dd94b5c959b6940db2a4163ce99c0e2fa6e4558325350429930e8c5ab437313"} Oct 11 01:27:03 crc kubenswrapper[4743]: I1011 01:27:03.146335 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2dd94b5c959b6940db2a4163ce99c0e2fa6e4558325350429930e8c5ab437313" Oct 11 01:27:03 crc kubenswrapper[4743]: I1011 01:27:03.146450 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4" Oct 11 01:27:03 crc kubenswrapper[4743]: I1011 01:27:03.256709 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-pdmns"] Oct 11 01:27:03 crc kubenswrapper[4743]: E1011 01:27:03.257195 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 11 01:27:03 crc kubenswrapper[4743]: I1011 01:27:03.257218 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 11 01:27:03 crc kubenswrapper[4743]: E1011 01:27:03.257240 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d89a5e4a-5095-4b5d-8757-68c78e28e25a" containerName="extract-content" Oct 11 01:27:03 crc kubenswrapper[4743]: I1011 01:27:03.257250 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d89a5e4a-5095-4b5d-8757-68c78e28e25a" containerName="extract-content" Oct 11 01:27:03 crc kubenswrapper[4743]: E1011 01:27:03.257266 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d89a5e4a-5095-4b5d-8757-68c78e28e25a" containerName="registry-server" Oct 11 01:27:03 crc kubenswrapper[4743]: I1011 01:27:03.257273 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d89a5e4a-5095-4b5d-8757-68c78e28e25a" containerName="registry-server" Oct 11 01:27:03 crc kubenswrapper[4743]: E1011 01:27:03.257299 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa41272c-b7b3-4bd1-81db-b787376104b6" containerName="extract-content" Oct 11 01:27:03 crc kubenswrapper[4743]: I1011 01:27:03.257308 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa41272c-b7b3-4bd1-81db-b787376104b6" containerName="extract-content" Oct 11 01:27:03 crc kubenswrapper[4743]: E1011 01:27:03.257337 4743 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa41272c-b7b3-4bd1-81db-b787376104b6" containerName="extract-utilities" Oct 11 01:27:03 crc kubenswrapper[4743]: I1011 01:27:03.257344 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa41272c-b7b3-4bd1-81db-b787376104b6" containerName="extract-utilities" Oct 11 01:27:03 crc kubenswrapper[4743]: E1011 01:27:03.257363 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa41272c-b7b3-4bd1-81db-b787376104b6" containerName="registry-server" Oct 11 01:27:03 crc kubenswrapper[4743]: I1011 01:27:03.257370 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa41272c-b7b3-4bd1-81db-b787376104b6" containerName="registry-server" Oct 11 01:27:03 crc kubenswrapper[4743]: E1011 01:27:03.257382 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d89a5e4a-5095-4b5d-8757-68c78e28e25a" containerName="extract-utilities" Oct 11 01:27:03 crc kubenswrapper[4743]: I1011 01:27:03.257388 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d89a5e4a-5095-4b5d-8757-68c78e28e25a" containerName="extract-utilities" Oct 11 01:27:03 crc kubenswrapper[4743]: I1011 01:27:03.257635 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d89a5e4a-5095-4b5d-8757-68c78e28e25a" containerName="registry-server" Oct 11 01:27:03 crc kubenswrapper[4743]: I1011 01:27:03.257655 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa41272c-b7b3-4bd1-81db-b787376104b6" containerName="registry-server" Oct 11 01:27:03 crc kubenswrapper[4743]: I1011 01:27:03.257669 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 11 01:27:03 crc kubenswrapper[4743]: I1011 01:27:03.258680 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pdmns" Oct 11 01:27:03 crc kubenswrapper[4743]: I1011 01:27:03.262030 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 11 01:27:03 crc kubenswrapper[4743]: I1011 01:27:03.262432 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 11 01:27:03 crc kubenswrapper[4743]: I1011 01:27:03.262730 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 11 01:27:03 crc kubenswrapper[4743]: I1011 01:27:03.262962 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8vmmn" Oct 11 01:27:03 crc kubenswrapper[4743]: I1011 01:27:03.264451 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Oct 11 01:27:03 crc kubenswrapper[4743]: I1011 01:27:03.267471 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-pdmns"] Oct 11 01:27:03 crc kubenswrapper[4743]: I1011 01:27:03.381039 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/61c7dd2d-7f02-47cb-8c1b-eadfa3fdf346-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-pdmns\" (UID: \"61c7dd2d-7f02-47cb-8c1b-eadfa3fdf346\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pdmns" Oct 11 01:27:03 crc kubenswrapper[4743]: I1011 01:27:03.381286 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/61c7dd2d-7f02-47cb-8c1b-eadfa3fdf346-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-pdmns\" (UID: \"61c7dd2d-7f02-47cb-8c1b-eadfa3fdf346\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pdmns" Oct 11 01:27:03 crc kubenswrapper[4743]: I1011 01:27:03.381414 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/61c7dd2d-7f02-47cb-8c1b-eadfa3fdf346-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-pdmns\" (UID: \"61c7dd2d-7f02-47cb-8c1b-eadfa3fdf346\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pdmns" Oct 11 01:27:03 crc kubenswrapper[4743]: I1011 01:27:03.381643 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61c7dd2d-7f02-47cb-8c1b-eadfa3fdf346-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-pdmns\" (UID: \"61c7dd2d-7f02-47cb-8c1b-eadfa3fdf346\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pdmns" Oct 11 01:27:03 crc kubenswrapper[4743]: I1011 01:27:03.381769 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzrn4\" (UniqueName: \"kubernetes.io/projected/61c7dd2d-7f02-47cb-8c1b-eadfa3fdf346-kube-api-access-nzrn4\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-pdmns\" (UID: \"61c7dd2d-7f02-47cb-8c1b-eadfa3fdf346\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pdmns" Oct 11 01:27:03 crc kubenswrapper[4743]: I1011 01:27:03.483084 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/61c7dd2d-7f02-47cb-8c1b-eadfa3fdf346-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-pdmns\" (UID: \"61c7dd2d-7f02-47cb-8c1b-eadfa3fdf346\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pdmns" Oct 11 01:27:03 crc kubenswrapper[4743]: I1011 01:27:03.483216 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: 
\"kubernetes.io/configmap/61c7dd2d-7f02-47cb-8c1b-eadfa3fdf346-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-pdmns\" (UID: \"61c7dd2d-7f02-47cb-8c1b-eadfa3fdf346\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pdmns" Oct 11 01:27:03 crc kubenswrapper[4743]: I1011 01:27:03.483252 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/61c7dd2d-7f02-47cb-8c1b-eadfa3fdf346-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-pdmns\" (UID: \"61c7dd2d-7f02-47cb-8c1b-eadfa3fdf346\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pdmns" Oct 11 01:27:03 crc kubenswrapper[4743]: I1011 01:27:03.483319 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61c7dd2d-7f02-47cb-8c1b-eadfa3fdf346-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-pdmns\" (UID: \"61c7dd2d-7f02-47cb-8c1b-eadfa3fdf346\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pdmns" Oct 11 01:27:03 crc kubenswrapper[4743]: I1011 01:27:03.483388 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzrn4\" (UniqueName: \"kubernetes.io/projected/61c7dd2d-7f02-47cb-8c1b-eadfa3fdf346-kube-api-access-nzrn4\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-pdmns\" (UID: \"61c7dd2d-7f02-47cb-8c1b-eadfa3fdf346\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pdmns" Oct 11 01:27:03 crc kubenswrapper[4743]: I1011 01:27:03.485052 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/61c7dd2d-7f02-47cb-8c1b-eadfa3fdf346-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-pdmns\" (UID: \"61c7dd2d-7f02-47cb-8c1b-eadfa3fdf346\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pdmns" Oct 11 01:27:03 crc 
kubenswrapper[4743]: I1011 01:27:03.487022 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/61c7dd2d-7f02-47cb-8c1b-eadfa3fdf346-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-pdmns\" (UID: \"61c7dd2d-7f02-47cb-8c1b-eadfa3fdf346\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pdmns" Oct 11 01:27:03 crc kubenswrapper[4743]: I1011 01:27:03.487100 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/61c7dd2d-7f02-47cb-8c1b-eadfa3fdf346-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-pdmns\" (UID: \"61c7dd2d-7f02-47cb-8c1b-eadfa3fdf346\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pdmns" Oct 11 01:27:03 crc kubenswrapper[4743]: I1011 01:27:03.497599 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61c7dd2d-7f02-47cb-8c1b-eadfa3fdf346-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-pdmns\" (UID: \"61c7dd2d-7f02-47cb-8c1b-eadfa3fdf346\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pdmns" Oct 11 01:27:03 crc kubenswrapper[4743]: I1011 01:27:03.505700 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzrn4\" (UniqueName: \"kubernetes.io/projected/61c7dd2d-7f02-47cb-8c1b-eadfa3fdf346-kube-api-access-nzrn4\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-pdmns\" (UID: \"61c7dd2d-7f02-47cb-8c1b-eadfa3fdf346\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pdmns" Oct 11 01:27:03 crc kubenswrapper[4743]: I1011 01:27:03.575270 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pdmns" Oct 11 01:27:04 crc kubenswrapper[4743]: W1011 01:27:04.136384 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod61c7dd2d_7f02_47cb_8c1b_eadfa3fdf346.slice/crio-3edfef67b5094e93c2757cd50b1b3cf860a0b2c62c3f0b102e66573bad261d44 WatchSource:0}: Error finding container 3edfef67b5094e93c2757cd50b1b3cf860a0b2c62c3f0b102e66573bad261d44: Status 404 returned error can't find the container with id 3edfef67b5094e93c2757cd50b1b3cf860a0b2c62c3f0b102e66573bad261d44 Oct 11 01:27:04 crc kubenswrapper[4743]: I1011 01:27:04.137989 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-pdmns"] Oct 11 01:27:04 crc kubenswrapper[4743]: I1011 01:27:04.138487 4743 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 11 01:27:04 crc kubenswrapper[4743]: I1011 01:27:04.157850 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pdmns" event={"ID":"61c7dd2d-7f02-47cb-8c1b-eadfa3fdf346","Type":"ContainerStarted","Data":"3edfef67b5094e93c2757cd50b1b3cf860a0b2c62c3f0b102e66573bad261d44"} Oct 11 01:27:05 crc kubenswrapper[4743]: I1011 01:27:05.171553 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pdmns" event={"ID":"61c7dd2d-7f02-47cb-8c1b-eadfa3fdf346","Type":"ContainerStarted","Data":"7110baaaee30b5c52234aea1454deb596388bc12377152a69a39c536e4080d71"} Oct 11 01:27:05 crc kubenswrapper[4743]: I1011 01:27:05.199078 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pdmns" podStartSLOduration=1.7395646839999999 podStartE2EDuration="2.199050406s" podCreationTimestamp="2025-10-11 01:27:03 +0000 UTC" firstStartedPulling="2025-10-11 
01:27:04.138229234 +0000 UTC m=+2118.791209651" lastFinishedPulling="2025-10-11 01:27:04.597714946 +0000 UTC m=+2119.250695373" observedRunningTime="2025-10-11 01:27:05.187208105 +0000 UTC m=+2119.840188532" watchObservedRunningTime="2025-10-11 01:27:05.199050406 +0000 UTC m=+2119.852030813" Oct 11 01:27:14 crc kubenswrapper[4743]: I1011 01:27:14.458074 4743 patch_prober.go:28] interesting pod/machine-config-daemon-cvm72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 01:27:14 crc kubenswrapper[4743]: I1011 01:27:14.459992 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 01:27:33 crc kubenswrapper[4743]: I1011 01:27:33.049094 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-47nj5"] Oct 11 01:27:33 crc kubenswrapper[4743]: I1011 01:27:33.061849 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-47nj5"] Oct 11 01:27:34 crc kubenswrapper[4743]: I1011 01:27:34.104620 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60648a98-998f-41d9-aceb-3ad66a1d2b04" path="/var/lib/kubelet/pods/60648a98-998f-41d9-aceb-3ad66a1d2b04/volumes" Oct 11 01:27:36 crc kubenswrapper[4743]: I1011 01:27:36.343895 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-sf9xd"] Oct 11 01:27:36 crc kubenswrapper[4743]: I1011 01:27:36.361557 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sf9xd"] Oct 11 01:27:36 crc kubenswrapper[4743]: I1011 
01:27:36.361674 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sf9xd" Oct 11 01:27:36 crc kubenswrapper[4743]: I1011 01:27:36.405278 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4236ff23-9605-4c8c-97ed-9e05a4bdff59-catalog-content\") pod \"community-operators-sf9xd\" (UID: \"4236ff23-9605-4c8c-97ed-9e05a4bdff59\") " pod="openshift-marketplace/community-operators-sf9xd" Oct 11 01:27:36 crc kubenswrapper[4743]: I1011 01:27:36.405426 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4236ff23-9605-4c8c-97ed-9e05a4bdff59-utilities\") pod \"community-operators-sf9xd\" (UID: \"4236ff23-9605-4c8c-97ed-9e05a4bdff59\") " pod="openshift-marketplace/community-operators-sf9xd" Oct 11 01:27:36 crc kubenswrapper[4743]: I1011 01:27:36.405476 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnlkt\" (UniqueName: \"kubernetes.io/projected/4236ff23-9605-4c8c-97ed-9e05a4bdff59-kube-api-access-pnlkt\") pod \"community-operators-sf9xd\" (UID: \"4236ff23-9605-4c8c-97ed-9e05a4bdff59\") " pod="openshift-marketplace/community-operators-sf9xd" Oct 11 01:27:36 crc kubenswrapper[4743]: I1011 01:27:36.506959 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4236ff23-9605-4c8c-97ed-9e05a4bdff59-catalog-content\") pod \"community-operators-sf9xd\" (UID: \"4236ff23-9605-4c8c-97ed-9e05a4bdff59\") " pod="openshift-marketplace/community-operators-sf9xd" Oct 11 01:27:36 crc kubenswrapper[4743]: I1011 01:27:36.507113 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/4236ff23-9605-4c8c-97ed-9e05a4bdff59-utilities\") pod \"community-operators-sf9xd\" (UID: \"4236ff23-9605-4c8c-97ed-9e05a4bdff59\") " pod="openshift-marketplace/community-operators-sf9xd" Oct 11 01:27:36 crc kubenswrapper[4743]: I1011 01:27:36.507165 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnlkt\" (UniqueName: \"kubernetes.io/projected/4236ff23-9605-4c8c-97ed-9e05a4bdff59-kube-api-access-pnlkt\") pod \"community-operators-sf9xd\" (UID: \"4236ff23-9605-4c8c-97ed-9e05a4bdff59\") " pod="openshift-marketplace/community-operators-sf9xd" Oct 11 01:27:36 crc kubenswrapper[4743]: I1011 01:27:36.507494 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4236ff23-9605-4c8c-97ed-9e05a4bdff59-catalog-content\") pod \"community-operators-sf9xd\" (UID: \"4236ff23-9605-4c8c-97ed-9e05a4bdff59\") " pod="openshift-marketplace/community-operators-sf9xd" Oct 11 01:27:36 crc kubenswrapper[4743]: I1011 01:27:36.507954 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4236ff23-9605-4c8c-97ed-9e05a4bdff59-utilities\") pod \"community-operators-sf9xd\" (UID: \"4236ff23-9605-4c8c-97ed-9e05a4bdff59\") " pod="openshift-marketplace/community-operators-sf9xd" Oct 11 01:27:36 crc kubenswrapper[4743]: I1011 01:27:36.540621 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnlkt\" (UniqueName: \"kubernetes.io/projected/4236ff23-9605-4c8c-97ed-9e05a4bdff59-kube-api-access-pnlkt\") pod \"community-operators-sf9xd\" (UID: \"4236ff23-9605-4c8c-97ed-9e05a4bdff59\") " pod="openshift-marketplace/community-operators-sf9xd" Oct 11 01:27:36 crc kubenswrapper[4743]: I1011 01:27:36.701189 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sf9xd" Oct 11 01:27:37 crc kubenswrapper[4743]: I1011 01:27:37.675213 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sf9xd"] Oct 11 01:27:38 crc kubenswrapper[4743]: I1011 01:27:38.551004 4743 generic.go:334] "Generic (PLEG): container finished" podID="4236ff23-9605-4c8c-97ed-9e05a4bdff59" containerID="11cc12d59a1d8e23bb7ce87fcde3e46f722d4c16ae5f51a1022a97cd1105de0d" exitCode=0 Oct 11 01:27:38 crc kubenswrapper[4743]: I1011 01:27:38.551586 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sf9xd" event={"ID":"4236ff23-9605-4c8c-97ed-9e05a4bdff59","Type":"ContainerDied","Data":"11cc12d59a1d8e23bb7ce87fcde3e46f722d4c16ae5f51a1022a97cd1105de0d"} Oct 11 01:27:38 crc kubenswrapper[4743]: I1011 01:27:38.551614 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sf9xd" event={"ID":"4236ff23-9605-4c8c-97ed-9e05a4bdff59","Type":"ContainerStarted","Data":"0e55f2e8299aa00974087706a3e84dee21f85bf34c88cdcc1f3278fa43971e53"} Oct 11 01:27:40 crc kubenswrapper[4743]: I1011 01:27:40.575106 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sf9xd" event={"ID":"4236ff23-9605-4c8c-97ed-9e05a4bdff59","Type":"ContainerStarted","Data":"063337eeb595f23d3eef2813090953e52f2480be9170da45a4fc3ab126a74079"} Oct 11 01:27:41 crc kubenswrapper[4743]: I1011 01:27:41.590047 4743 generic.go:334] "Generic (PLEG): container finished" podID="4236ff23-9605-4c8c-97ed-9e05a4bdff59" containerID="063337eeb595f23d3eef2813090953e52f2480be9170da45a4fc3ab126a74079" exitCode=0 Oct 11 01:27:41 crc kubenswrapper[4743]: I1011 01:27:41.590181 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sf9xd" 
event={"ID":"4236ff23-9605-4c8c-97ed-9e05a4bdff59","Type":"ContainerDied","Data":"063337eeb595f23d3eef2813090953e52f2480be9170da45a4fc3ab126a74079"} Oct 11 01:27:42 crc kubenswrapper[4743]: I1011 01:27:42.604958 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sf9xd" event={"ID":"4236ff23-9605-4c8c-97ed-9e05a4bdff59","Type":"ContainerStarted","Data":"16dcb8a90e89fc4fed5c558434d0bda41075f045aa93840b1ea27b140b9ec0c7"} Oct 11 01:27:42 crc kubenswrapper[4743]: I1011 01:27:42.633105 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-sf9xd" podStartSLOduration=3.163822692 podStartE2EDuration="6.6330734s" podCreationTimestamp="2025-10-11 01:27:36 +0000 UTC" firstStartedPulling="2025-10-11 01:27:38.554083787 +0000 UTC m=+2153.207064174" lastFinishedPulling="2025-10-11 01:27:42.023334485 +0000 UTC m=+2156.676314882" observedRunningTime="2025-10-11 01:27:42.629765974 +0000 UTC m=+2157.282746411" watchObservedRunningTime="2025-10-11 01:27:42.6330734 +0000 UTC m=+2157.286053837" Oct 11 01:27:44 crc kubenswrapper[4743]: I1011 01:27:44.458316 4743 patch_prober.go:28] interesting pod/machine-config-daemon-cvm72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 01:27:44 crc kubenswrapper[4743]: I1011 01:27:44.458823 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 01:27:46 crc kubenswrapper[4743]: I1011 01:27:46.701369 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-sf9xd" Oct 11 01:27:46 crc kubenswrapper[4743]: I1011 01:27:46.701714 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-sf9xd" Oct 11 01:27:46 crc kubenswrapper[4743]: I1011 01:27:46.792100 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-sf9xd" Oct 11 01:27:47 crc kubenswrapper[4743]: I1011 01:27:47.750469 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-sf9xd" Oct 11 01:27:47 crc kubenswrapper[4743]: I1011 01:27:47.816498 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sf9xd"] Oct 11 01:27:49 crc kubenswrapper[4743]: I1011 01:27:49.689742 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-sf9xd" podUID="4236ff23-9605-4c8c-97ed-9e05a4bdff59" containerName="registry-server" containerID="cri-o://16dcb8a90e89fc4fed5c558434d0bda41075f045aa93840b1ea27b140b9ec0c7" gracePeriod=2 Oct 11 01:27:50 crc kubenswrapper[4743]: I1011 01:27:50.231113 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sf9xd" Oct 11 01:27:50 crc kubenswrapper[4743]: I1011 01:27:50.351634 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4236ff23-9605-4c8c-97ed-9e05a4bdff59-catalog-content\") pod \"4236ff23-9605-4c8c-97ed-9e05a4bdff59\" (UID: \"4236ff23-9605-4c8c-97ed-9e05a4bdff59\") " Oct 11 01:27:50 crc kubenswrapper[4743]: I1011 01:27:50.352146 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pnlkt\" (UniqueName: \"kubernetes.io/projected/4236ff23-9605-4c8c-97ed-9e05a4bdff59-kube-api-access-pnlkt\") pod \"4236ff23-9605-4c8c-97ed-9e05a4bdff59\" (UID: \"4236ff23-9605-4c8c-97ed-9e05a4bdff59\") " Oct 11 01:27:50 crc kubenswrapper[4743]: I1011 01:27:50.352256 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4236ff23-9605-4c8c-97ed-9e05a4bdff59-utilities\") pod \"4236ff23-9605-4c8c-97ed-9e05a4bdff59\" (UID: \"4236ff23-9605-4c8c-97ed-9e05a4bdff59\") " Oct 11 01:27:50 crc kubenswrapper[4743]: I1011 01:27:50.353064 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4236ff23-9605-4c8c-97ed-9e05a4bdff59-utilities" (OuterVolumeSpecName: "utilities") pod "4236ff23-9605-4c8c-97ed-9e05a4bdff59" (UID: "4236ff23-9605-4c8c-97ed-9e05a4bdff59"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:27:50 crc kubenswrapper[4743]: I1011 01:27:50.357376 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4236ff23-9605-4c8c-97ed-9e05a4bdff59-kube-api-access-pnlkt" (OuterVolumeSpecName: "kube-api-access-pnlkt") pod "4236ff23-9605-4c8c-97ed-9e05a4bdff59" (UID: "4236ff23-9605-4c8c-97ed-9e05a4bdff59"). InnerVolumeSpecName "kube-api-access-pnlkt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:27:50 crc kubenswrapper[4743]: I1011 01:27:50.442062 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4236ff23-9605-4c8c-97ed-9e05a4bdff59-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4236ff23-9605-4c8c-97ed-9e05a4bdff59" (UID: "4236ff23-9605-4c8c-97ed-9e05a4bdff59"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:27:50 crc kubenswrapper[4743]: I1011 01:27:50.455265 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4236ff23-9605-4c8c-97ed-9e05a4bdff59-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 11 01:27:50 crc kubenswrapper[4743]: I1011 01:27:50.455395 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pnlkt\" (UniqueName: \"kubernetes.io/projected/4236ff23-9605-4c8c-97ed-9e05a4bdff59-kube-api-access-pnlkt\") on node \"crc\" DevicePath \"\"" Oct 11 01:27:50 crc kubenswrapper[4743]: I1011 01:27:50.455524 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4236ff23-9605-4c8c-97ed-9e05a4bdff59-utilities\") on node \"crc\" DevicePath \"\"" Oct 11 01:27:50 crc kubenswrapper[4743]: I1011 01:27:50.705526 4743 generic.go:334] "Generic (PLEG): container finished" podID="4236ff23-9605-4c8c-97ed-9e05a4bdff59" containerID="16dcb8a90e89fc4fed5c558434d0bda41075f045aa93840b1ea27b140b9ec0c7" exitCode=0 Oct 11 01:27:50 crc kubenswrapper[4743]: I1011 01:27:50.705595 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sf9xd" event={"ID":"4236ff23-9605-4c8c-97ed-9e05a4bdff59","Type":"ContainerDied","Data":"16dcb8a90e89fc4fed5c558434d0bda41075f045aa93840b1ea27b140b9ec0c7"} Oct 11 01:27:50 crc kubenswrapper[4743]: I1011 01:27:50.705905 4743 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-sf9xd" event={"ID":"4236ff23-9605-4c8c-97ed-9e05a4bdff59","Type":"ContainerDied","Data":"0e55f2e8299aa00974087706a3e84dee21f85bf34c88cdcc1f3278fa43971e53"} Oct 11 01:27:50 crc kubenswrapper[4743]: I1011 01:27:50.705926 4743 scope.go:117] "RemoveContainer" containerID="16dcb8a90e89fc4fed5c558434d0bda41075f045aa93840b1ea27b140b9ec0c7" Oct 11 01:27:50 crc kubenswrapper[4743]: I1011 01:27:50.705626 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sf9xd" Oct 11 01:27:50 crc kubenswrapper[4743]: I1011 01:27:50.746299 4743 scope.go:117] "RemoveContainer" containerID="063337eeb595f23d3eef2813090953e52f2480be9170da45a4fc3ab126a74079" Oct 11 01:27:50 crc kubenswrapper[4743]: I1011 01:27:50.749433 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sf9xd"] Oct 11 01:27:50 crc kubenswrapper[4743]: I1011 01:27:50.764605 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-sf9xd"] Oct 11 01:27:50 crc kubenswrapper[4743]: I1011 01:27:50.771226 4743 scope.go:117] "RemoveContainer" containerID="11cc12d59a1d8e23bb7ce87fcde3e46f722d4c16ae5f51a1022a97cd1105de0d" Oct 11 01:27:50 crc kubenswrapper[4743]: I1011 01:27:50.847657 4743 scope.go:117] "RemoveContainer" containerID="16dcb8a90e89fc4fed5c558434d0bda41075f045aa93840b1ea27b140b9ec0c7" Oct 11 01:27:50 crc kubenswrapper[4743]: E1011 01:27:50.848160 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16dcb8a90e89fc4fed5c558434d0bda41075f045aa93840b1ea27b140b9ec0c7\": container with ID starting with 16dcb8a90e89fc4fed5c558434d0bda41075f045aa93840b1ea27b140b9ec0c7 not found: ID does not exist" containerID="16dcb8a90e89fc4fed5c558434d0bda41075f045aa93840b1ea27b140b9ec0c7" Oct 11 01:27:50 crc kubenswrapper[4743]: I1011 
01:27:50.848213 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16dcb8a90e89fc4fed5c558434d0bda41075f045aa93840b1ea27b140b9ec0c7"} err="failed to get container status \"16dcb8a90e89fc4fed5c558434d0bda41075f045aa93840b1ea27b140b9ec0c7\": rpc error: code = NotFound desc = could not find container \"16dcb8a90e89fc4fed5c558434d0bda41075f045aa93840b1ea27b140b9ec0c7\": container with ID starting with 16dcb8a90e89fc4fed5c558434d0bda41075f045aa93840b1ea27b140b9ec0c7 not found: ID does not exist" Oct 11 01:27:50 crc kubenswrapper[4743]: I1011 01:27:50.848262 4743 scope.go:117] "RemoveContainer" containerID="063337eeb595f23d3eef2813090953e52f2480be9170da45a4fc3ab126a74079" Oct 11 01:27:50 crc kubenswrapper[4743]: E1011 01:27:50.848672 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"063337eeb595f23d3eef2813090953e52f2480be9170da45a4fc3ab126a74079\": container with ID starting with 063337eeb595f23d3eef2813090953e52f2480be9170da45a4fc3ab126a74079 not found: ID does not exist" containerID="063337eeb595f23d3eef2813090953e52f2480be9170da45a4fc3ab126a74079" Oct 11 01:27:50 crc kubenswrapper[4743]: I1011 01:27:50.848705 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"063337eeb595f23d3eef2813090953e52f2480be9170da45a4fc3ab126a74079"} err="failed to get container status \"063337eeb595f23d3eef2813090953e52f2480be9170da45a4fc3ab126a74079\": rpc error: code = NotFound desc = could not find container \"063337eeb595f23d3eef2813090953e52f2480be9170da45a4fc3ab126a74079\": container with ID starting with 063337eeb595f23d3eef2813090953e52f2480be9170da45a4fc3ab126a74079 not found: ID does not exist" Oct 11 01:27:50 crc kubenswrapper[4743]: I1011 01:27:50.848729 4743 scope.go:117] "RemoveContainer" containerID="11cc12d59a1d8e23bb7ce87fcde3e46f722d4c16ae5f51a1022a97cd1105de0d" Oct 11 01:27:50 crc 
kubenswrapper[4743]: E1011 01:27:50.849108 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11cc12d59a1d8e23bb7ce87fcde3e46f722d4c16ae5f51a1022a97cd1105de0d\": container with ID starting with 11cc12d59a1d8e23bb7ce87fcde3e46f722d4c16ae5f51a1022a97cd1105de0d not found: ID does not exist" containerID="11cc12d59a1d8e23bb7ce87fcde3e46f722d4c16ae5f51a1022a97cd1105de0d" Oct 11 01:27:50 crc kubenswrapper[4743]: I1011 01:27:50.849189 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11cc12d59a1d8e23bb7ce87fcde3e46f722d4c16ae5f51a1022a97cd1105de0d"} err="failed to get container status \"11cc12d59a1d8e23bb7ce87fcde3e46f722d4c16ae5f51a1022a97cd1105de0d\": rpc error: code = NotFound desc = could not find container \"11cc12d59a1d8e23bb7ce87fcde3e46f722d4c16ae5f51a1022a97cd1105de0d\": container with ID starting with 11cc12d59a1d8e23bb7ce87fcde3e46f722d4c16ae5f51a1022a97cd1105de0d not found: ID does not exist" Oct 11 01:27:52 crc kubenswrapper[4743]: I1011 01:27:52.109414 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4236ff23-9605-4c8c-97ed-9e05a4bdff59" path="/var/lib/kubelet/pods/4236ff23-9605-4c8c-97ed-9e05a4bdff59/volumes" Oct 11 01:27:52 crc kubenswrapper[4743]: I1011 01:27:52.252292 4743 scope.go:117] "RemoveContainer" containerID="636e1e422bcdceb11e9f438441f678aff8edfed5bd734d0cc4bdc66334a276fe" Oct 11 01:27:52 crc kubenswrapper[4743]: I1011 01:27:52.297763 4743 scope.go:117] "RemoveContainer" containerID="ec47936c8ea8e3016e46f32eef56df713af2a9c92346709a02bfa557442bb542" Oct 11 01:28:14 crc kubenswrapper[4743]: I1011 01:28:14.459067 4743 patch_prober.go:28] interesting pod/machine-config-daemon-cvm72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= 
Oct 11 01:28:14 crc kubenswrapper[4743]: I1011 01:28:14.459653 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 01:28:14 crc kubenswrapper[4743]: I1011 01:28:14.459701 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" Oct 11 01:28:14 crc kubenswrapper[4743]: I1011 01:28:14.460543 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cf9dc144cbba351cee736f9d035ea7c635ec0460821f8e9aa803ef97ac3675ae"} pod="openshift-machine-config-operator/machine-config-daemon-cvm72" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 11 01:28:14 crc kubenswrapper[4743]: I1011 01:28:14.460613 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" containerID="cri-o://cf9dc144cbba351cee736f9d035ea7c635ec0460821f8e9aa803ef97ac3675ae" gracePeriod=600 Oct 11 01:28:14 crc kubenswrapper[4743]: E1011 01:28:14.614599 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:28:14 crc kubenswrapper[4743]: I1011 01:28:14.999024 4743 
generic.go:334] "Generic (PLEG): container finished" podID="add92263-e252-446b-95de-092585b4357f" containerID="cf9dc144cbba351cee736f9d035ea7c635ec0460821f8e9aa803ef97ac3675ae" exitCode=0 Oct 11 01:28:14 crc kubenswrapper[4743]: I1011 01:28:14.999110 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" event={"ID":"add92263-e252-446b-95de-092585b4357f","Type":"ContainerDied","Data":"cf9dc144cbba351cee736f9d035ea7c635ec0460821f8e9aa803ef97ac3675ae"} Oct 11 01:28:14 crc kubenswrapper[4743]: I1011 01:28:14.999358 4743 scope.go:117] "RemoveContainer" containerID="22fd023cd5183c2ca78abfab3b66c41277f63acb424eb906c5326f5e04010643" Oct 11 01:28:15 crc kubenswrapper[4743]: I1011 01:28:15.000197 4743 scope.go:117] "RemoveContainer" containerID="cf9dc144cbba351cee736f9d035ea7c635ec0460821f8e9aa803ef97ac3675ae" Oct 11 01:28:15 crc kubenswrapper[4743]: E1011 01:28:15.000613 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:28:27 crc kubenswrapper[4743]: I1011 01:28:27.092582 4743 scope.go:117] "RemoveContainer" containerID="cf9dc144cbba351cee736f9d035ea7c635ec0460821f8e9aa803ef97ac3675ae" Oct 11 01:28:27 crc kubenswrapper[4743]: E1011 01:28:27.093423 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" 
podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:28:28 crc kubenswrapper[4743]: I1011 01:28:28.158058 4743 generic.go:334] "Generic (PLEG): container finished" podID="61c7dd2d-7f02-47cb-8c1b-eadfa3fdf346" containerID="7110baaaee30b5c52234aea1454deb596388bc12377152a69a39c536e4080d71" exitCode=0 Oct 11 01:28:28 crc kubenswrapper[4743]: I1011 01:28:28.158113 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pdmns" event={"ID":"61c7dd2d-7f02-47cb-8c1b-eadfa3fdf346","Type":"ContainerDied","Data":"7110baaaee30b5c52234aea1454deb596388bc12377152a69a39c536e4080d71"} Oct 11 01:28:29 crc kubenswrapper[4743]: I1011 01:28:29.612713 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pdmns" Oct 11 01:28:29 crc kubenswrapper[4743]: I1011 01:28:29.794730 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/61c7dd2d-7f02-47cb-8c1b-eadfa3fdf346-inventory\") pod \"61c7dd2d-7f02-47cb-8c1b-eadfa3fdf346\" (UID: \"61c7dd2d-7f02-47cb-8c1b-eadfa3fdf346\") " Oct 11 01:28:29 crc kubenswrapper[4743]: I1011 01:28:29.794823 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/61c7dd2d-7f02-47cb-8c1b-eadfa3fdf346-ovncontroller-config-0\") pod \"61c7dd2d-7f02-47cb-8c1b-eadfa3fdf346\" (UID: \"61c7dd2d-7f02-47cb-8c1b-eadfa3fdf346\") " Oct 11 01:28:29 crc kubenswrapper[4743]: I1011 01:28:29.795072 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61c7dd2d-7f02-47cb-8c1b-eadfa3fdf346-ovn-combined-ca-bundle\") pod \"61c7dd2d-7f02-47cb-8c1b-eadfa3fdf346\" (UID: \"61c7dd2d-7f02-47cb-8c1b-eadfa3fdf346\") " Oct 11 01:28:29 crc kubenswrapper[4743]: I1011 01:28:29.795172 4743 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/61c7dd2d-7f02-47cb-8c1b-eadfa3fdf346-ssh-key\") pod \"61c7dd2d-7f02-47cb-8c1b-eadfa3fdf346\" (UID: \"61c7dd2d-7f02-47cb-8c1b-eadfa3fdf346\") " Oct 11 01:28:29 crc kubenswrapper[4743]: I1011 01:28:29.795235 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzrn4\" (UniqueName: \"kubernetes.io/projected/61c7dd2d-7f02-47cb-8c1b-eadfa3fdf346-kube-api-access-nzrn4\") pod \"61c7dd2d-7f02-47cb-8c1b-eadfa3fdf346\" (UID: \"61c7dd2d-7f02-47cb-8c1b-eadfa3fdf346\") " Oct 11 01:28:29 crc kubenswrapper[4743]: I1011 01:28:29.807073 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61c7dd2d-7f02-47cb-8c1b-eadfa3fdf346-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "61c7dd2d-7f02-47cb-8c1b-eadfa3fdf346" (UID: "61c7dd2d-7f02-47cb-8c1b-eadfa3fdf346"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:28:29 crc kubenswrapper[4743]: I1011 01:28:29.807079 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61c7dd2d-7f02-47cb-8c1b-eadfa3fdf346-kube-api-access-nzrn4" (OuterVolumeSpecName: "kube-api-access-nzrn4") pod "61c7dd2d-7f02-47cb-8c1b-eadfa3fdf346" (UID: "61c7dd2d-7f02-47cb-8c1b-eadfa3fdf346"). InnerVolumeSpecName "kube-api-access-nzrn4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:28:29 crc kubenswrapper[4743]: I1011 01:28:29.838479 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61c7dd2d-7f02-47cb-8c1b-eadfa3fdf346-inventory" (OuterVolumeSpecName: "inventory") pod "61c7dd2d-7f02-47cb-8c1b-eadfa3fdf346" (UID: "61c7dd2d-7f02-47cb-8c1b-eadfa3fdf346"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:28:29 crc kubenswrapper[4743]: I1011 01:28:29.854789 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61c7dd2d-7f02-47cb-8c1b-eadfa3fdf346-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "61c7dd2d-7f02-47cb-8c1b-eadfa3fdf346" (UID: "61c7dd2d-7f02-47cb-8c1b-eadfa3fdf346"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:28:29 crc kubenswrapper[4743]: I1011 01:28:29.860062 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61c7dd2d-7f02-47cb-8c1b-eadfa3fdf346-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "61c7dd2d-7f02-47cb-8c1b-eadfa3fdf346" (UID: "61c7dd2d-7f02-47cb-8c1b-eadfa3fdf346"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:28:29 crc kubenswrapper[4743]: I1011 01:28:29.898315 4743 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61c7dd2d-7f02-47cb-8c1b-eadfa3fdf346-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 01:28:29 crc kubenswrapper[4743]: I1011 01:28:29.898355 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/61c7dd2d-7f02-47cb-8c1b-eadfa3fdf346-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 11 01:28:29 crc kubenswrapper[4743]: I1011 01:28:29.898369 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzrn4\" (UniqueName: \"kubernetes.io/projected/61c7dd2d-7f02-47cb-8c1b-eadfa3fdf346-kube-api-access-nzrn4\") on node \"crc\" DevicePath \"\"" Oct 11 01:28:29 crc kubenswrapper[4743]: I1011 01:28:29.898385 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/61c7dd2d-7f02-47cb-8c1b-eadfa3fdf346-inventory\") on node \"crc\" DevicePath 
\"\"" Oct 11 01:28:29 crc kubenswrapper[4743]: I1011 01:28:29.898397 4743 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/61c7dd2d-7f02-47cb-8c1b-eadfa3fdf346-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Oct 11 01:28:30 crc kubenswrapper[4743]: I1011 01:28:30.192322 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pdmns" event={"ID":"61c7dd2d-7f02-47cb-8c1b-eadfa3fdf346","Type":"ContainerDied","Data":"3edfef67b5094e93c2757cd50b1b3cf860a0b2c62c3f0b102e66573bad261d44"} Oct 11 01:28:30 crc kubenswrapper[4743]: I1011 01:28:30.192666 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3edfef67b5094e93c2757cd50b1b3cf860a0b2c62c3f0b102e66573bad261d44" Oct 11 01:28:30 crc kubenswrapper[4743]: I1011 01:28:30.192427 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pdmns" Oct 11 01:28:30 crc kubenswrapper[4743]: I1011 01:28:30.300441 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mm4w8"] Oct 11 01:28:30 crc kubenswrapper[4743]: E1011 01:28:30.301029 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4236ff23-9605-4c8c-97ed-9e05a4bdff59" containerName="extract-utilities" Oct 11 01:28:30 crc kubenswrapper[4743]: I1011 01:28:30.301053 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="4236ff23-9605-4c8c-97ed-9e05a4bdff59" containerName="extract-utilities" Oct 11 01:28:30 crc kubenswrapper[4743]: E1011 01:28:30.301095 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4236ff23-9605-4c8c-97ed-9e05a4bdff59" containerName="extract-content" Oct 11 01:28:30 crc kubenswrapper[4743]: I1011 01:28:30.301106 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="4236ff23-9605-4c8c-97ed-9e05a4bdff59" 
containerName="extract-content" Oct 11 01:28:30 crc kubenswrapper[4743]: E1011 01:28:30.301145 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61c7dd2d-7f02-47cb-8c1b-eadfa3fdf346" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 11 01:28:30 crc kubenswrapper[4743]: I1011 01:28:30.301155 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="61c7dd2d-7f02-47cb-8c1b-eadfa3fdf346" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 11 01:28:30 crc kubenswrapper[4743]: E1011 01:28:30.301183 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4236ff23-9605-4c8c-97ed-9e05a4bdff59" containerName="registry-server" Oct 11 01:28:30 crc kubenswrapper[4743]: I1011 01:28:30.301192 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="4236ff23-9605-4c8c-97ed-9e05a4bdff59" containerName="registry-server" Oct 11 01:28:30 crc kubenswrapper[4743]: I1011 01:28:30.301467 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="61c7dd2d-7f02-47cb-8c1b-eadfa3fdf346" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 11 01:28:30 crc kubenswrapper[4743]: I1011 01:28:30.301494 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="4236ff23-9605-4c8c-97ed-9e05a4bdff59" containerName="registry-server" Oct 11 01:28:30 crc kubenswrapper[4743]: I1011 01:28:30.302472 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mm4w8" Oct 11 01:28:30 crc kubenswrapper[4743]: I1011 01:28:30.306687 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8vmmn" Oct 11 01:28:30 crc kubenswrapper[4743]: I1011 01:28:30.307881 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Oct 11 01:28:30 crc kubenswrapper[4743]: I1011 01:28:30.309188 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 11 01:28:30 crc kubenswrapper[4743]: I1011 01:28:30.311171 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 11 01:28:30 crc kubenswrapper[4743]: I1011 01:28:30.312733 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 11 01:28:30 crc kubenswrapper[4743]: I1011 01:28:30.315043 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mm4w8"] Oct 11 01:28:30 crc kubenswrapper[4743]: I1011 01:28:30.410570 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/53edf58a-7be3-40ee-af4a-d110d1607356-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mm4w8\" (UID: \"53edf58a-7be3-40ee-af4a-d110d1607356\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mm4w8" Oct 11 01:28:30 crc kubenswrapper[4743]: I1011 01:28:30.411331 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/53edf58a-7be3-40ee-af4a-d110d1607356-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mm4w8\" (UID: \"53edf58a-7be3-40ee-af4a-d110d1607356\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mm4w8" Oct 11 01:28:30 crc kubenswrapper[4743]: I1011 01:28:30.411527 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53edf58a-7be3-40ee-af4a-d110d1607356-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mm4w8\" (UID: \"53edf58a-7be3-40ee-af4a-d110d1607356\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mm4w8" Oct 11 01:28:30 crc kubenswrapper[4743]: I1011 01:28:30.411589 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/53edf58a-7be3-40ee-af4a-d110d1607356-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mm4w8\" (UID: \"53edf58a-7be3-40ee-af4a-d110d1607356\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mm4w8" Oct 11 01:28:30 crc kubenswrapper[4743]: I1011 01:28:30.411851 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v46xt\" (UniqueName: \"kubernetes.io/projected/53edf58a-7be3-40ee-af4a-d110d1607356-kube-api-access-v46xt\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mm4w8\" (UID: \"53edf58a-7be3-40ee-af4a-d110d1607356\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mm4w8" Oct 11 01:28:30 crc kubenswrapper[4743]: I1011 01:28:30.514134 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/53edf58a-7be3-40ee-af4a-d110d1607356-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mm4w8\" (UID: \"53edf58a-7be3-40ee-af4a-d110d1607356\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mm4w8" Oct 11 01:28:30 crc kubenswrapper[4743]: I1011 01:28:30.514311 4743 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53edf58a-7be3-40ee-af4a-d110d1607356-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mm4w8\" (UID: \"53edf58a-7be3-40ee-af4a-d110d1607356\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mm4w8" Oct 11 01:28:30 crc kubenswrapper[4743]: I1011 01:28:30.514378 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/53edf58a-7be3-40ee-af4a-d110d1607356-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mm4w8\" (UID: \"53edf58a-7be3-40ee-af4a-d110d1607356\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mm4w8" Oct 11 01:28:30 crc kubenswrapper[4743]: I1011 01:28:30.514561 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v46xt\" (UniqueName: \"kubernetes.io/projected/53edf58a-7be3-40ee-af4a-d110d1607356-kube-api-access-v46xt\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mm4w8\" (UID: \"53edf58a-7be3-40ee-af4a-d110d1607356\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mm4w8" Oct 11 01:28:30 crc kubenswrapper[4743]: I1011 01:28:30.514713 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/53edf58a-7be3-40ee-af4a-d110d1607356-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mm4w8\" (UID: \"53edf58a-7be3-40ee-af4a-d110d1607356\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mm4w8" Oct 11 01:28:30 crc kubenswrapper[4743]: I1011 01:28:30.520326 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/53edf58a-7be3-40ee-af4a-d110d1607356-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mm4w8\" (UID: \"53edf58a-7be3-40ee-af4a-d110d1607356\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mm4w8" Oct 11 01:28:30 crc kubenswrapper[4743]: I1011 01:28:30.520533 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53edf58a-7be3-40ee-af4a-d110d1607356-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mm4w8\" (UID: \"53edf58a-7be3-40ee-af4a-d110d1607356\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mm4w8" Oct 11 01:28:30 crc kubenswrapper[4743]: I1011 01:28:30.523647 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/53edf58a-7be3-40ee-af4a-d110d1607356-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mm4w8\" (UID: \"53edf58a-7be3-40ee-af4a-d110d1607356\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mm4w8" Oct 11 01:28:30 crc kubenswrapper[4743]: I1011 01:28:30.524391 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/53edf58a-7be3-40ee-af4a-d110d1607356-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mm4w8\" (UID: \"53edf58a-7be3-40ee-af4a-d110d1607356\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mm4w8" Oct 11 01:28:30 crc kubenswrapper[4743]: I1011 01:28:30.534580 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v46xt\" (UniqueName: \"kubernetes.io/projected/53edf58a-7be3-40ee-af4a-d110d1607356-kube-api-access-v46xt\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mm4w8\" (UID: \"53edf58a-7be3-40ee-af4a-d110d1607356\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mm4w8" Oct 11 01:28:30 crc kubenswrapper[4743]: I1011 01:28:30.638961 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mm4w8" Oct 11 01:28:31 crc kubenswrapper[4743]: I1011 01:28:31.313042 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mm4w8"] Oct 11 01:28:32 crc kubenswrapper[4743]: I1011 01:28:32.216203 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mm4w8" event={"ID":"53edf58a-7be3-40ee-af4a-d110d1607356","Type":"ContainerStarted","Data":"194784a5b82826cd68de474dbbae81f1a035204609cec6fde1df35cd9fc0f11d"} Oct 11 01:28:32 crc kubenswrapper[4743]: I1011 01:28:32.252900 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mm4w8" podStartSLOduration=1.658350157 podStartE2EDuration="2.252874736s" podCreationTimestamp="2025-10-11 01:28:30 +0000 UTC" firstStartedPulling="2025-10-11 01:28:31.32493242 +0000 UTC m=+2205.977912827" lastFinishedPulling="2025-10-11 01:28:31.919456969 +0000 UTC m=+2206.572437406" observedRunningTime="2025-10-11 01:28:32.235224696 +0000 UTC m=+2206.888205093" watchObservedRunningTime="2025-10-11 01:28:32.252874736 +0000 UTC m=+2206.905855143" Oct 11 01:28:33 crc kubenswrapper[4743]: I1011 01:28:33.232798 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mm4w8" event={"ID":"53edf58a-7be3-40ee-af4a-d110d1607356","Type":"ContainerStarted","Data":"9954638b31e5b269195f86d27aa2cc9adcdfc1813da96e97ac8cf5069b92922f"} Oct 11 01:28:41 crc kubenswrapper[4743]: I1011 01:28:41.092152 4743 scope.go:117] "RemoveContainer" containerID="cf9dc144cbba351cee736f9d035ea7c635ec0460821f8e9aa803ef97ac3675ae" Oct 11 01:28:41 crc kubenswrapper[4743]: E1011 01:28:41.093206 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:28:56 crc kubenswrapper[4743]: I1011 01:28:56.108678 4743 scope.go:117] "RemoveContainer" containerID="cf9dc144cbba351cee736f9d035ea7c635ec0460821f8e9aa803ef97ac3675ae" Oct 11 01:28:56 crc kubenswrapper[4743]: E1011 01:28:56.109622 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:29:07 crc kubenswrapper[4743]: I1011 01:29:07.092543 4743 scope.go:117] "RemoveContainer" containerID="cf9dc144cbba351cee736f9d035ea7c635ec0460821f8e9aa803ef97ac3675ae" Oct 11 01:29:07 crc kubenswrapper[4743]: E1011 01:29:07.093402 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:29:18 crc kubenswrapper[4743]: I1011 01:29:18.092155 4743 scope.go:117] "RemoveContainer" containerID="cf9dc144cbba351cee736f9d035ea7c635ec0460821f8e9aa803ef97ac3675ae" Oct 11 01:29:18 crc kubenswrapper[4743]: E1011 01:29:18.093296 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:29:33 crc kubenswrapper[4743]: I1011 01:29:33.091978 4743 scope.go:117] "RemoveContainer" containerID="cf9dc144cbba351cee736f9d035ea7c635ec0460821f8e9aa803ef97ac3675ae" Oct 11 01:29:33 crc kubenswrapper[4743]: E1011 01:29:33.093427 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:29:44 crc kubenswrapper[4743]: I1011 01:29:44.091945 4743 scope.go:117] "RemoveContainer" containerID="cf9dc144cbba351cee736f9d035ea7c635ec0460821f8e9aa803ef97ac3675ae" Oct 11 01:29:44 crc kubenswrapper[4743]: E1011 01:29:44.092782 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:29:55 crc kubenswrapper[4743]: I1011 01:29:55.092514 4743 scope.go:117] "RemoveContainer" containerID="cf9dc144cbba351cee736f9d035ea7c635ec0460821f8e9aa803ef97ac3675ae" Oct 11 01:29:55 crc kubenswrapper[4743]: E1011 01:29:55.094127 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:30:00 crc kubenswrapper[4743]: I1011 01:30:00.169334 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29335770-kjbb9"] Oct 11 01:30:00 crc kubenswrapper[4743]: I1011 01:30:00.172353 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29335770-kjbb9" Oct 11 01:30:00 crc kubenswrapper[4743]: I1011 01:30:00.176046 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 11 01:30:00 crc kubenswrapper[4743]: I1011 01:30:00.177276 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 11 01:30:00 crc kubenswrapper[4743]: I1011 01:30:00.184435 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29335770-kjbb9"] Oct 11 01:30:00 crc kubenswrapper[4743]: I1011 01:30:00.314331 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7fec83ae-8524-4dcd-9983-e207576458c5-secret-volume\") pod \"collect-profiles-29335770-kjbb9\" (UID: \"7fec83ae-8524-4dcd-9983-e207576458c5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335770-kjbb9" Oct 11 01:30:00 crc kubenswrapper[4743]: I1011 01:30:00.314691 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/7fec83ae-8524-4dcd-9983-e207576458c5-config-volume\") pod \"collect-profiles-29335770-kjbb9\" (UID: \"7fec83ae-8524-4dcd-9983-e207576458c5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335770-kjbb9" Oct 11 01:30:00 crc kubenswrapper[4743]: I1011 01:30:00.314937 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sblk\" (UniqueName: \"kubernetes.io/projected/7fec83ae-8524-4dcd-9983-e207576458c5-kube-api-access-8sblk\") pod \"collect-profiles-29335770-kjbb9\" (UID: \"7fec83ae-8524-4dcd-9983-e207576458c5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335770-kjbb9" Oct 11 01:30:00 crc kubenswrapper[4743]: I1011 01:30:00.416906 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8sblk\" (UniqueName: \"kubernetes.io/projected/7fec83ae-8524-4dcd-9983-e207576458c5-kube-api-access-8sblk\") pod \"collect-profiles-29335770-kjbb9\" (UID: \"7fec83ae-8524-4dcd-9983-e207576458c5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335770-kjbb9" Oct 11 01:30:00 crc kubenswrapper[4743]: I1011 01:30:00.417001 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7fec83ae-8524-4dcd-9983-e207576458c5-secret-volume\") pod \"collect-profiles-29335770-kjbb9\" (UID: \"7fec83ae-8524-4dcd-9983-e207576458c5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335770-kjbb9" Oct 11 01:30:00 crc kubenswrapper[4743]: I1011 01:30:00.417079 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7fec83ae-8524-4dcd-9983-e207576458c5-config-volume\") pod \"collect-profiles-29335770-kjbb9\" (UID: \"7fec83ae-8524-4dcd-9983-e207576458c5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335770-kjbb9" Oct 11 01:30:00 crc 
kubenswrapper[4743]: I1011 01:30:00.418007 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7fec83ae-8524-4dcd-9983-e207576458c5-config-volume\") pod \"collect-profiles-29335770-kjbb9\" (UID: \"7fec83ae-8524-4dcd-9983-e207576458c5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335770-kjbb9" Oct 11 01:30:00 crc kubenswrapper[4743]: I1011 01:30:00.424113 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7fec83ae-8524-4dcd-9983-e207576458c5-secret-volume\") pod \"collect-profiles-29335770-kjbb9\" (UID: \"7fec83ae-8524-4dcd-9983-e207576458c5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335770-kjbb9" Oct 11 01:30:00 crc kubenswrapper[4743]: I1011 01:30:00.436920 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8sblk\" (UniqueName: \"kubernetes.io/projected/7fec83ae-8524-4dcd-9983-e207576458c5-kube-api-access-8sblk\") pod \"collect-profiles-29335770-kjbb9\" (UID: \"7fec83ae-8524-4dcd-9983-e207576458c5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335770-kjbb9" Oct 11 01:30:00 crc kubenswrapper[4743]: I1011 01:30:00.512209 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29335770-kjbb9" Oct 11 01:30:00 crc kubenswrapper[4743]: I1011 01:30:00.981797 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29335770-kjbb9"] Oct 11 01:30:01 crc kubenswrapper[4743]: I1011 01:30:01.333542 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29335770-kjbb9" event={"ID":"7fec83ae-8524-4dcd-9983-e207576458c5","Type":"ContainerStarted","Data":"f538fd74ba3a7de967eea3554782d8114f61dfb9b43b9ae2c7cd1a2ee0332969"} Oct 11 01:30:01 crc kubenswrapper[4743]: I1011 01:30:01.333910 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29335770-kjbb9" event={"ID":"7fec83ae-8524-4dcd-9983-e207576458c5","Type":"ContainerStarted","Data":"68e5756b2a4af3be6ab681bc3de1fa7db6a01af930f61df5f04226c3535d4b25"} Oct 11 01:30:01 crc kubenswrapper[4743]: I1011 01:30:01.360799 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29335770-kjbb9" podStartSLOduration=1.36078135 podStartE2EDuration="1.36078135s" podCreationTimestamp="2025-10-11 01:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 01:30:01.351714623 +0000 UTC m=+2296.004695030" watchObservedRunningTime="2025-10-11 01:30:01.36078135 +0000 UTC m=+2296.013761747" Oct 11 01:30:02 crc kubenswrapper[4743]: I1011 01:30:02.343709 4743 generic.go:334] "Generic (PLEG): container finished" podID="7fec83ae-8524-4dcd-9983-e207576458c5" containerID="f538fd74ba3a7de967eea3554782d8114f61dfb9b43b9ae2c7cd1a2ee0332969" exitCode=0 Oct 11 01:30:02 crc kubenswrapper[4743]: I1011 01:30:02.343822 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29335770-kjbb9" event={"ID":"7fec83ae-8524-4dcd-9983-e207576458c5","Type":"ContainerDied","Data":"f538fd74ba3a7de967eea3554782d8114f61dfb9b43b9ae2c7cd1a2ee0332969"} Oct 11 01:30:03 crc kubenswrapper[4743]: I1011 01:30:03.787535 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29335770-kjbb9" Oct 11 01:30:03 crc kubenswrapper[4743]: I1011 01:30:03.902240 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7fec83ae-8524-4dcd-9983-e207576458c5-secret-volume\") pod \"7fec83ae-8524-4dcd-9983-e207576458c5\" (UID: \"7fec83ae-8524-4dcd-9983-e207576458c5\") " Oct 11 01:30:03 crc kubenswrapper[4743]: I1011 01:30:03.902415 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8sblk\" (UniqueName: \"kubernetes.io/projected/7fec83ae-8524-4dcd-9983-e207576458c5-kube-api-access-8sblk\") pod \"7fec83ae-8524-4dcd-9983-e207576458c5\" (UID: \"7fec83ae-8524-4dcd-9983-e207576458c5\") " Oct 11 01:30:03 crc kubenswrapper[4743]: I1011 01:30:03.902601 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7fec83ae-8524-4dcd-9983-e207576458c5-config-volume\") pod \"7fec83ae-8524-4dcd-9983-e207576458c5\" (UID: \"7fec83ae-8524-4dcd-9983-e207576458c5\") " Oct 11 01:30:03 crc kubenswrapper[4743]: I1011 01:30:03.903612 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fec83ae-8524-4dcd-9983-e207576458c5-config-volume" (OuterVolumeSpecName: "config-volume") pod "7fec83ae-8524-4dcd-9983-e207576458c5" (UID: "7fec83ae-8524-4dcd-9983-e207576458c5"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:30:03 crc kubenswrapper[4743]: I1011 01:30:03.911195 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fec83ae-8524-4dcd-9983-e207576458c5-kube-api-access-8sblk" (OuterVolumeSpecName: "kube-api-access-8sblk") pod "7fec83ae-8524-4dcd-9983-e207576458c5" (UID: "7fec83ae-8524-4dcd-9983-e207576458c5"). InnerVolumeSpecName "kube-api-access-8sblk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:30:03 crc kubenswrapper[4743]: I1011 01:30:03.911768 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fec83ae-8524-4dcd-9983-e207576458c5-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7fec83ae-8524-4dcd-9983-e207576458c5" (UID: "7fec83ae-8524-4dcd-9983-e207576458c5"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:30:04 crc kubenswrapper[4743]: I1011 01:30:04.004745 4743 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7fec83ae-8524-4dcd-9983-e207576458c5-config-volume\") on node \"crc\" DevicePath \"\"" Oct 11 01:30:04 crc kubenswrapper[4743]: I1011 01:30:04.004781 4743 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7fec83ae-8524-4dcd-9983-e207576458c5-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 11 01:30:04 crc kubenswrapper[4743]: I1011 01:30:04.004791 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8sblk\" (UniqueName: \"kubernetes.io/projected/7fec83ae-8524-4dcd-9983-e207576458c5-kube-api-access-8sblk\") on node \"crc\" DevicePath \"\"" Oct 11 01:30:04 crc kubenswrapper[4743]: I1011 01:30:04.387403 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29335770-kjbb9" 
event={"ID":"7fec83ae-8524-4dcd-9983-e207576458c5","Type":"ContainerDied","Data":"68e5756b2a4af3be6ab681bc3de1fa7db6a01af930f61df5f04226c3535d4b25"} Oct 11 01:30:04 crc kubenswrapper[4743]: I1011 01:30:04.387702 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68e5756b2a4af3be6ab681bc3de1fa7db6a01af930f61df5f04226c3535d4b25" Oct 11 01:30:04 crc kubenswrapper[4743]: I1011 01:30:04.387667 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29335770-kjbb9" Oct 11 01:30:04 crc kubenswrapper[4743]: I1011 01:30:04.436310 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29335725-pg67v"] Oct 11 01:30:04 crc kubenswrapper[4743]: I1011 01:30:04.448360 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29335725-pg67v"] Oct 11 01:30:06 crc kubenswrapper[4743]: I1011 01:30:06.107794 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="962b1e46-ba63-4aa2-9882-a04487a05813" path="/var/lib/kubelet/pods/962b1e46-ba63-4aa2-9882-a04487a05813/volumes" Oct 11 01:30:07 crc kubenswrapper[4743]: I1011 01:30:07.093267 4743 scope.go:117] "RemoveContainer" containerID="cf9dc144cbba351cee736f9d035ea7c635ec0460821f8e9aa803ef97ac3675ae" Oct 11 01:30:07 crc kubenswrapper[4743]: E1011 01:30:07.093813 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:30:22 crc kubenswrapper[4743]: I1011 01:30:22.092107 4743 scope.go:117] "RemoveContainer" 
containerID="cf9dc144cbba351cee736f9d035ea7c635ec0460821f8e9aa803ef97ac3675ae" Oct 11 01:30:22 crc kubenswrapper[4743]: E1011 01:30:22.093151 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:30:34 crc kubenswrapper[4743]: I1011 01:30:34.092555 4743 scope.go:117] "RemoveContainer" containerID="cf9dc144cbba351cee736f9d035ea7c635ec0460821f8e9aa803ef97ac3675ae" Oct 11 01:30:34 crc kubenswrapper[4743]: E1011 01:30:34.093611 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:30:45 crc kubenswrapper[4743]: I1011 01:30:45.092710 4743 scope.go:117] "RemoveContainer" containerID="cf9dc144cbba351cee736f9d035ea7c635ec0460821f8e9aa803ef97ac3675ae" Oct 11 01:30:45 crc kubenswrapper[4743]: E1011 01:30:45.094163 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:30:52 crc kubenswrapper[4743]: I1011 01:30:52.481631 4743 scope.go:117] 
"RemoveContainer" containerID="057a072f6b313157d0743795a2f56c68aa7d55e7b92aaabe95560a120c6bc0cc" Oct 11 01:30:58 crc kubenswrapper[4743]: I1011 01:30:58.091726 4743 scope.go:117] "RemoveContainer" containerID="cf9dc144cbba351cee736f9d035ea7c635ec0460821f8e9aa803ef97ac3675ae" Oct 11 01:30:58 crc kubenswrapper[4743]: E1011 01:30:58.092653 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:31:09 crc kubenswrapper[4743]: I1011 01:31:09.091416 4743 scope.go:117] "RemoveContainer" containerID="cf9dc144cbba351cee736f9d035ea7c635ec0460821f8e9aa803ef97ac3675ae" Oct 11 01:31:09 crc kubenswrapper[4743]: E1011 01:31:09.092194 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:31:24 crc kubenswrapper[4743]: I1011 01:31:24.091707 4743 scope.go:117] "RemoveContainer" containerID="cf9dc144cbba351cee736f9d035ea7c635ec0460821f8e9aa803ef97ac3675ae" Oct 11 01:31:24 crc kubenswrapper[4743]: E1011 01:31:24.092390 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:31:38 crc kubenswrapper[4743]: I1011 01:31:38.092378 4743 scope.go:117] "RemoveContainer" containerID="cf9dc144cbba351cee736f9d035ea7c635ec0460821f8e9aa803ef97ac3675ae" Oct 11 01:31:38 crc kubenswrapper[4743]: E1011 01:31:38.093350 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:31:53 crc kubenswrapper[4743]: I1011 01:31:53.092420 4743 scope.go:117] "RemoveContainer" containerID="cf9dc144cbba351cee736f9d035ea7c635ec0460821f8e9aa803ef97ac3675ae" Oct 11 01:31:53 crc kubenswrapper[4743]: E1011 01:31:53.093715 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:32:05 crc kubenswrapper[4743]: I1011 01:32:05.092252 4743 scope.go:117] "RemoveContainer" containerID="cf9dc144cbba351cee736f9d035ea7c635ec0460821f8e9aa803ef97ac3675ae" Oct 11 01:32:05 crc kubenswrapper[4743]: E1011 01:32:05.093304 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:32:18 crc kubenswrapper[4743]: I1011 01:32:18.093357 4743 scope.go:117] "RemoveContainer" containerID="cf9dc144cbba351cee736f9d035ea7c635ec0460821f8e9aa803ef97ac3675ae" Oct 11 01:32:18 crc kubenswrapper[4743]: E1011 01:32:18.094429 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:32:29 crc kubenswrapper[4743]: I1011 01:32:29.092790 4743 scope.go:117] "RemoveContainer" containerID="cf9dc144cbba351cee736f9d035ea7c635ec0460821f8e9aa803ef97ac3675ae" Oct 11 01:32:29 crc kubenswrapper[4743]: E1011 01:32:29.094030 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:32:41 crc kubenswrapper[4743]: I1011 01:32:41.094353 4743 scope.go:117] "RemoveContainer" containerID="cf9dc144cbba351cee736f9d035ea7c635ec0460821f8e9aa803ef97ac3675ae" Oct 11 01:32:41 crc kubenswrapper[4743]: E1011 01:32:41.095847 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:32:44 crc kubenswrapper[4743]: I1011 01:32:44.401467 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-kxmfv"] Oct 11 01:32:44 crc kubenswrapper[4743]: E1011 01:32:44.402298 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fec83ae-8524-4dcd-9983-e207576458c5" containerName="collect-profiles" Oct 11 01:32:44 crc kubenswrapper[4743]: I1011 01:32:44.402314 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fec83ae-8524-4dcd-9983-e207576458c5" containerName="collect-profiles" Oct 11 01:32:44 crc kubenswrapper[4743]: I1011 01:32:44.402619 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fec83ae-8524-4dcd-9983-e207576458c5" containerName="collect-profiles" Oct 11 01:32:44 crc kubenswrapper[4743]: I1011 01:32:44.404530 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kxmfv" Oct 11 01:32:44 crc kubenswrapper[4743]: I1011 01:32:44.415211 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kxmfv"] Oct 11 01:32:44 crc kubenswrapper[4743]: I1011 01:32:44.566839 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b645da6-d00b-4689-a341-bdf4e6e618a4-catalog-content\") pod \"certified-operators-kxmfv\" (UID: \"1b645da6-d00b-4689-a341-bdf4e6e618a4\") " pod="openshift-marketplace/certified-operators-kxmfv" Oct 11 01:32:44 crc kubenswrapper[4743]: I1011 01:32:44.567277 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b645da6-d00b-4689-a341-bdf4e6e618a4-utilities\") pod \"certified-operators-kxmfv\" (UID: \"1b645da6-d00b-4689-a341-bdf4e6e618a4\") " pod="openshift-marketplace/certified-operators-kxmfv" Oct 11 01:32:44 crc kubenswrapper[4743]: I1011 01:32:44.567729 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x484h\" (UniqueName: \"kubernetes.io/projected/1b645da6-d00b-4689-a341-bdf4e6e618a4-kube-api-access-x484h\") pod \"certified-operators-kxmfv\" (UID: \"1b645da6-d00b-4689-a341-bdf4e6e618a4\") " pod="openshift-marketplace/certified-operators-kxmfv" Oct 11 01:32:44 crc kubenswrapper[4743]: I1011 01:32:44.670234 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b645da6-d00b-4689-a341-bdf4e6e618a4-catalog-content\") pod \"certified-operators-kxmfv\" (UID: \"1b645da6-d00b-4689-a341-bdf4e6e618a4\") " pod="openshift-marketplace/certified-operators-kxmfv" Oct 11 01:32:44 crc kubenswrapper[4743]: I1011 01:32:44.670290 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b645da6-d00b-4689-a341-bdf4e6e618a4-utilities\") pod \"certified-operators-kxmfv\" (UID: \"1b645da6-d00b-4689-a341-bdf4e6e618a4\") " pod="openshift-marketplace/certified-operators-kxmfv" Oct 11 01:32:44 crc kubenswrapper[4743]: I1011 01:32:44.670380 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x484h\" (UniqueName: \"kubernetes.io/projected/1b645da6-d00b-4689-a341-bdf4e6e618a4-kube-api-access-x484h\") pod \"certified-operators-kxmfv\" (UID: \"1b645da6-d00b-4689-a341-bdf4e6e618a4\") " pod="openshift-marketplace/certified-operators-kxmfv" Oct 11 01:32:44 crc kubenswrapper[4743]: I1011 01:32:44.670879 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b645da6-d00b-4689-a341-bdf4e6e618a4-utilities\") pod \"certified-operators-kxmfv\" (UID: \"1b645da6-d00b-4689-a341-bdf4e6e618a4\") " pod="openshift-marketplace/certified-operators-kxmfv" Oct 11 01:32:44 crc kubenswrapper[4743]: I1011 01:32:44.670961 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b645da6-d00b-4689-a341-bdf4e6e618a4-catalog-content\") pod \"certified-operators-kxmfv\" (UID: \"1b645da6-d00b-4689-a341-bdf4e6e618a4\") " pod="openshift-marketplace/certified-operators-kxmfv" Oct 11 01:32:44 crc kubenswrapper[4743]: I1011 01:32:44.693532 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x484h\" (UniqueName: \"kubernetes.io/projected/1b645da6-d00b-4689-a341-bdf4e6e618a4-kube-api-access-x484h\") pod \"certified-operators-kxmfv\" (UID: \"1b645da6-d00b-4689-a341-bdf4e6e618a4\") " pod="openshift-marketplace/certified-operators-kxmfv" Oct 11 01:32:44 crc kubenswrapper[4743]: I1011 01:32:44.727768 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kxmfv" Oct 11 01:32:45 crc kubenswrapper[4743]: I1011 01:32:45.243428 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kxmfv"] Oct 11 01:32:45 crc kubenswrapper[4743]: I1011 01:32:45.538659 4743 generic.go:334] "Generic (PLEG): container finished" podID="1b645da6-d00b-4689-a341-bdf4e6e618a4" containerID="6548b63ee822406ff564a14d5e37e0c643ae90246d10d19bd0dd1b1cb8cc7d6f" exitCode=0 Oct 11 01:32:45 crc kubenswrapper[4743]: I1011 01:32:45.539035 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kxmfv" event={"ID":"1b645da6-d00b-4689-a341-bdf4e6e618a4","Type":"ContainerDied","Data":"6548b63ee822406ff564a14d5e37e0c643ae90246d10d19bd0dd1b1cb8cc7d6f"} Oct 11 01:32:45 crc kubenswrapper[4743]: I1011 01:32:45.539073 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kxmfv" event={"ID":"1b645da6-d00b-4689-a341-bdf4e6e618a4","Type":"ContainerStarted","Data":"6c2677499f9a2569305d063800ca36e78e84c74071050fc1edea717b54bbeb7a"} Oct 11 01:32:45 crc kubenswrapper[4743]: I1011 01:32:45.541454 4743 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 11 01:32:47 crc kubenswrapper[4743]: I1011 01:32:47.558892 4743 generic.go:334] "Generic (PLEG): container finished" podID="1b645da6-d00b-4689-a341-bdf4e6e618a4" containerID="d9604d4c01fb91c9bd6e2e326ec73e35d14f786c76c1a1ea88d69a00ec0f8fb0" exitCode=0 Oct 11 01:32:47 crc kubenswrapper[4743]: I1011 01:32:47.559179 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kxmfv" event={"ID":"1b645da6-d00b-4689-a341-bdf4e6e618a4","Type":"ContainerDied","Data":"d9604d4c01fb91c9bd6e2e326ec73e35d14f786c76c1a1ea88d69a00ec0f8fb0"} Oct 11 01:32:48 crc kubenswrapper[4743]: I1011 01:32:48.570474 4743 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-kxmfv" event={"ID":"1b645da6-d00b-4689-a341-bdf4e6e618a4","Type":"ContainerStarted","Data":"235ae487aa57c3bbaa2f01475dde90e6f170b1dc1302a4e0e61e148fc056c513"} Oct 11 01:32:48 crc kubenswrapper[4743]: I1011 01:32:48.611141 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-kxmfv" podStartSLOduration=2.086798769 podStartE2EDuration="4.61112082s" podCreationTimestamp="2025-10-11 01:32:44 +0000 UTC" firstStartedPulling="2025-10-11 01:32:45.541105648 +0000 UTC m=+2460.194086085" lastFinishedPulling="2025-10-11 01:32:48.065427739 +0000 UTC m=+2462.718408136" observedRunningTime="2025-10-11 01:32:48.595876144 +0000 UTC m=+2463.248856551" watchObservedRunningTime="2025-10-11 01:32:48.61112082 +0000 UTC m=+2463.264101227" Oct 11 01:32:52 crc kubenswrapper[4743]: I1011 01:32:52.608096 4743 scope.go:117] "RemoveContainer" containerID="193af6e6b0b944e67cddf537bda9a9a5306af6fcdc859b51283cfd164d862a4a" Oct 11 01:32:52 crc kubenswrapper[4743]: I1011 01:32:52.643979 4743 scope.go:117] "RemoveContainer" containerID="83baecd362cde102225d973abb62930b0c59446408b246777bbd9b3e6e3e1cb0" Oct 11 01:32:52 crc kubenswrapper[4743]: I1011 01:32:52.677193 4743 scope.go:117] "RemoveContainer" containerID="42eef27c35904d898015916eec315de772a0fc88b1b9fe725a01817a2f57a95c" Oct 11 01:32:54 crc kubenswrapper[4743]: I1011 01:32:54.092246 4743 scope.go:117] "RemoveContainer" containerID="cf9dc144cbba351cee736f9d035ea7c635ec0460821f8e9aa803ef97ac3675ae" Oct 11 01:32:54 crc kubenswrapper[4743]: E1011 01:32:54.092946 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:32:54 crc kubenswrapper[4743]: I1011 01:32:54.727948 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-kxmfv" Oct 11 01:32:54 crc kubenswrapper[4743]: I1011 01:32:54.728536 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-kxmfv" Oct 11 01:32:54 crc kubenswrapper[4743]: I1011 01:32:54.779850 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-kxmfv" Oct 11 01:32:55 crc kubenswrapper[4743]: I1011 01:32:55.742850 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-kxmfv" Oct 11 01:32:55 crc kubenswrapper[4743]: I1011 01:32:55.803558 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kxmfv"] Oct 11 01:32:57 crc kubenswrapper[4743]: I1011 01:32:57.708649 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-kxmfv" podUID="1b645da6-d00b-4689-a341-bdf4e6e618a4" containerName="registry-server" containerID="cri-o://235ae487aa57c3bbaa2f01475dde90e6f170b1dc1302a4e0e61e148fc056c513" gracePeriod=2 Oct 11 01:32:58 crc kubenswrapper[4743]: I1011 01:32:58.219581 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kxmfv" Oct 11 01:32:58 crc kubenswrapper[4743]: I1011 01:32:58.337036 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b645da6-d00b-4689-a341-bdf4e6e618a4-utilities\") pod \"1b645da6-d00b-4689-a341-bdf4e6e618a4\" (UID: \"1b645da6-d00b-4689-a341-bdf4e6e618a4\") " Oct 11 01:32:58 crc kubenswrapper[4743]: I1011 01:32:58.337121 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b645da6-d00b-4689-a341-bdf4e6e618a4-catalog-content\") pod \"1b645da6-d00b-4689-a341-bdf4e6e618a4\" (UID: \"1b645da6-d00b-4689-a341-bdf4e6e618a4\") " Oct 11 01:32:58 crc kubenswrapper[4743]: I1011 01:32:58.337286 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x484h\" (UniqueName: \"kubernetes.io/projected/1b645da6-d00b-4689-a341-bdf4e6e618a4-kube-api-access-x484h\") pod \"1b645da6-d00b-4689-a341-bdf4e6e618a4\" (UID: \"1b645da6-d00b-4689-a341-bdf4e6e618a4\") " Oct 11 01:32:58 crc kubenswrapper[4743]: I1011 01:32:58.337920 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b645da6-d00b-4689-a341-bdf4e6e618a4-utilities" (OuterVolumeSpecName: "utilities") pod "1b645da6-d00b-4689-a341-bdf4e6e618a4" (UID: "1b645da6-d00b-4689-a341-bdf4e6e618a4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:32:58 crc kubenswrapper[4743]: I1011 01:32:58.342221 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b645da6-d00b-4689-a341-bdf4e6e618a4-kube-api-access-x484h" (OuterVolumeSpecName: "kube-api-access-x484h") pod "1b645da6-d00b-4689-a341-bdf4e6e618a4" (UID: "1b645da6-d00b-4689-a341-bdf4e6e618a4"). InnerVolumeSpecName "kube-api-access-x484h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:32:58 crc kubenswrapper[4743]: I1011 01:32:58.382109 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b645da6-d00b-4689-a341-bdf4e6e618a4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1b645da6-d00b-4689-a341-bdf4e6e618a4" (UID: "1b645da6-d00b-4689-a341-bdf4e6e618a4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:32:58 crc kubenswrapper[4743]: I1011 01:32:58.440004 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b645da6-d00b-4689-a341-bdf4e6e618a4-utilities\") on node \"crc\" DevicePath \"\"" Oct 11 01:32:58 crc kubenswrapper[4743]: I1011 01:32:58.440051 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b645da6-d00b-4689-a341-bdf4e6e618a4-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 11 01:32:58 crc kubenswrapper[4743]: I1011 01:32:58.440065 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x484h\" (UniqueName: \"kubernetes.io/projected/1b645da6-d00b-4689-a341-bdf4e6e618a4-kube-api-access-x484h\") on node \"crc\" DevicePath \"\"" Oct 11 01:32:58 crc kubenswrapper[4743]: I1011 01:32:58.723921 4743 generic.go:334] "Generic (PLEG): container finished" podID="1b645da6-d00b-4689-a341-bdf4e6e618a4" containerID="235ae487aa57c3bbaa2f01475dde90e6f170b1dc1302a4e0e61e148fc056c513" exitCode=0 Oct 11 01:32:58 crc kubenswrapper[4743]: I1011 01:32:58.723995 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kxmfv" Oct 11 01:32:58 crc kubenswrapper[4743]: I1011 01:32:58.724037 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kxmfv" event={"ID":"1b645da6-d00b-4689-a341-bdf4e6e618a4","Type":"ContainerDied","Data":"235ae487aa57c3bbaa2f01475dde90e6f170b1dc1302a4e0e61e148fc056c513"} Oct 11 01:32:58 crc kubenswrapper[4743]: I1011 01:32:58.724123 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kxmfv" event={"ID":"1b645da6-d00b-4689-a341-bdf4e6e618a4","Type":"ContainerDied","Data":"6c2677499f9a2569305d063800ca36e78e84c74071050fc1edea717b54bbeb7a"} Oct 11 01:32:58 crc kubenswrapper[4743]: I1011 01:32:58.724155 4743 scope.go:117] "RemoveContainer" containerID="235ae487aa57c3bbaa2f01475dde90e6f170b1dc1302a4e0e61e148fc056c513" Oct 11 01:32:58 crc kubenswrapper[4743]: I1011 01:32:58.772983 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kxmfv"] Oct 11 01:32:58 crc kubenswrapper[4743]: I1011 01:32:58.782223 4743 scope.go:117] "RemoveContainer" containerID="d9604d4c01fb91c9bd6e2e326ec73e35d14f786c76c1a1ea88d69a00ec0f8fb0" Oct 11 01:32:58 crc kubenswrapper[4743]: I1011 01:32:58.785689 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-kxmfv"] Oct 11 01:32:58 crc kubenswrapper[4743]: I1011 01:32:58.814486 4743 scope.go:117] "RemoveContainer" containerID="6548b63ee822406ff564a14d5e37e0c643ae90246d10d19bd0dd1b1cb8cc7d6f" Oct 11 01:32:58 crc kubenswrapper[4743]: I1011 01:32:58.889763 4743 scope.go:117] "RemoveContainer" containerID="235ae487aa57c3bbaa2f01475dde90e6f170b1dc1302a4e0e61e148fc056c513" Oct 11 01:32:58 crc kubenswrapper[4743]: E1011 01:32:58.890415 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"235ae487aa57c3bbaa2f01475dde90e6f170b1dc1302a4e0e61e148fc056c513\": container with ID starting with 235ae487aa57c3bbaa2f01475dde90e6f170b1dc1302a4e0e61e148fc056c513 not found: ID does not exist" containerID="235ae487aa57c3bbaa2f01475dde90e6f170b1dc1302a4e0e61e148fc056c513" Oct 11 01:32:58 crc kubenswrapper[4743]: I1011 01:32:58.890470 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"235ae487aa57c3bbaa2f01475dde90e6f170b1dc1302a4e0e61e148fc056c513"} err="failed to get container status \"235ae487aa57c3bbaa2f01475dde90e6f170b1dc1302a4e0e61e148fc056c513\": rpc error: code = NotFound desc = could not find container \"235ae487aa57c3bbaa2f01475dde90e6f170b1dc1302a4e0e61e148fc056c513\": container with ID starting with 235ae487aa57c3bbaa2f01475dde90e6f170b1dc1302a4e0e61e148fc056c513 not found: ID does not exist" Oct 11 01:32:58 crc kubenswrapper[4743]: I1011 01:32:58.890504 4743 scope.go:117] "RemoveContainer" containerID="d9604d4c01fb91c9bd6e2e326ec73e35d14f786c76c1a1ea88d69a00ec0f8fb0" Oct 11 01:32:58 crc kubenswrapper[4743]: E1011 01:32:58.891225 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9604d4c01fb91c9bd6e2e326ec73e35d14f786c76c1a1ea88d69a00ec0f8fb0\": container with ID starting with d9604d4c01fb91c9bd6e2e326ec73e35d14f786c76c1a1ea88d69a00ec0f8fb0 not found: ID does not exist" containerID="d9604d4c01fb91c9bd6e2e326ec73e35d14f786c76c1a1ea88d69a00ec0f8fb0" Oct 11 01:32:58 crc kubenswrapper[4743]: I1011 01:32:58.891256 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9604d4c01fb91c9bd6e2e326ec73e35d14f786c76c1a1ea88d69a00ec0f8fb0"} err="failed to get container status \"d9604d4c01fb91c9bd6e2e326ec73e35d14f786c76c1a1ea88d69a00ec0f8fb0\": rpc error: code = NotFound desc = could not find container \"d9604d4c01fb91c9bd6e2e326ec73e35d14f786c76c1a1ea88d69a00ec0f8fb0\": container with ID 
starting with d9604d4c01fb91c9bd6e2e326ec73e35d14f786c76c1a1ea88d69a00ec0f8fb0 not found: ID does not exist" Oct 11 01:32:58 crc kubenswrapper[4743]: I1011 01:32:58.891280 4743 scope.go:117] "RemoveContainer" containerID="6548b63ee822406ff564a14d5e37e0c643ae90246d10d19bd0dd1b1cb8cc7d6f" Oct 11 01:32:58 crc kubenswrapper[4743]: E1011 01:32:58.891627 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6548b63ee822406ff564a14d5e37e0c643ae90246d10d19bd0dd1b1cb8cc7d6f\": container with ID starting with 6548b63ee822406ff564a14d5e37e0c643ae90246d10d19bd0dd1b1cb8cc7d6f not found: ID does not exist" containerID="6548b63ee822406ff564a14d5e37e0c643ae90246d10d19bd0dd1b1cb8cc7d6f" Oct 11 01:32:58 crc kubenswrapper[4743]: I1011 01:32:58.891670 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6548b63ee822406ff564a14d5e37e0c643ae90246d10d19bd0dd1b1cb8cc7d6f"} err="failed to get container status \"6548b63ee822406ff564a14d5e37e0c643ae90246d10d19bd0dd1b1cb8cc7d6f\": rpc error: code = NotFound desc = could not find container \"6548b63ee822406ff564a14d5e37e0c643ae90246d10d19bd0dd1b1cb8cc7d6f\": container with ID starting with 6548b63ee822406ff564a14d5e37e0c643ae90246d10d19bd0dd1b1cb8cc7d6f not found: ID does not exist" Oct 11 01:33:00 crc kubenswrapper[4743]: I1011 01:33:00.111714 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b645da6-d00b-4689-a341-bdf4e6e618a4" path="/var/lib/kubelet/pods/1b645da6-d00b-4689-a341-bdf4e6e618a4/volumes" Oct 11 01:33:06 crc kubenswrapper[4743]: I1011 01:33:06.121603 4743 scope.go:117] "RemoveContainer" containerID="cf9dc144cbba351cee736f9d035ea7c635ec0460821f8e9aa803ef97ac3675ae" Oct 11 01:33:06 crc kubenswrapper[4743]: E1011 01:33:06.123839 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:33:21 crc kubenswrapper[4743]: I1011 01:33:21.091425 4743 scope.go:117] "RemoveContainer" containerID="cf9dc144cbba351cee736f9d035ea7c635ec0460821f8e9aa803ef97ac3675ae" Oct 11 01:33:22 crc kubenswrapper[4743]: I1011 01:33:22.032271 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" event={"ID":"add92263-e252-446b-95de-092585b4357f","Type":"ContainerStarted","Data":"ed4b65bd7e78e9bd75562b390e48eafdf4eff27b1128523c7bfdb6570c4e9641"} Oct 11 01:33:32 crc kubenswrapper[4743]: I1011 01:33:32.176982 4743 generic.go:334] "Generic (PLEG): container finished" podID="53edf58a-7be3-40ee-af4a-d110d1607356" containerID="9954638b31e5b269195f86d27aa2cc9adcdfc1813da96e97ac8cf5069b92922f" exitCode=0 Oct 11 01:33:32 crc kubenswrapper[4743]: I1011 01:33:32.177096 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mm4w8" event={"ID":"53edf58a-7be3-40ee-af4a-d110d1607356","Type":"ContainerDied","Data":"9954638b31e5b269195f86d27aa2cc9adcdfc1813da96e97ac8cf5069b92922f"} Oct 11 01:33:33 crc kubenswrapper[4743]: I1011 01:33:33.708301 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mm4w8" Oct 11 01:33:33 crc kubenswrapper[4743]: I1011 01:33:33.820078 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v46xt\" (UniqueName: \"kubernetes.io/projected/53edf58a-7be3-40ee-af4a-d110d1607356-kube-api-access-v46xt\") pod \"53edf58a-7be3-40ee-af4a-d110d1607356\" (UID: \"53edf58a-7be3-40ee-af4a-d110d1607356\") " Oct 11 01:33:33 crc kubenswrapper[4743]: I1011 01:33:33.820137 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/53edf58a-7be3-40ee-af4a-d110d1607356-libvirt-secret-0\") pod \"53edf58a-7be3-40ee-af4a-d110d1607356\" (UID: \"53edf58a-7be3-40ee-af4a-d110d1607356\") " Oct 11 01:33:33 crc kubenswrapper[4743]: I1011 01:33:33.820363 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/53edf58a-7be3-40ee-af4a-d110d1607356-inventory\") pod \"53edf58a-7be3-40ee-af4a-d110d1607356\" (UID: \"53edf58a-7be3-40ee-af4a-d110d1607356\") " Oct 11 01:33:33 crc kubenswrapper[4743]: I1011 01:33:33.820481 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53edf58a-7be3-40ee-af4a-d110d1607356-libvirt-combined-ca-bundle\") pod \"53edf58a-7be3-40ee-af4a-d110d1607356\" (UID: \"53edf58a-7be3-40ee-af4a-d110d1607356\") " Oct 11 01:33:33 crc kubenswrapper[4743]: I1011 01:33:33.820585 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/53edf58a-7be3-40ee-af4a-d110d1607356-ssh-key\") pod \"53edf58a-7be3-40ee-af4a-d110d1607356\" (UID: \"53edf58a-7be3-40ee-af4a-d110d1607356\") " Oct 11 01:33:33 crc kubenswrapper[4743]: I1011 01:33:33.827145 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/secret/53edf58a-7be3-40ee-af4a-d110d1607356-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "53edf58a-7be3-40ee-af4a-d110d1607356" (UID: "53edf58a-7be3-40ee-af4a-d110d1607356"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:33:33 crc kubenswrapper[4743]: I1011 01:33:33.827794 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53edf58a-7be3-40ee-af4a-d110d1607356-kube-api-access-v46xt" (OuterVolumeSpecName: "kube-api-access-v46xt") pod "53edf58a-7be3-40ee-af4a-d110d1607356" (UID: "53edf58a-7be3-40ee-af4a-d110d1607356"). InnerVolumeSpecName "kube-api-access-v46xt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:33:33 crc kubenswrapper[4743]: I1011 01:33:33.852280 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53edf58a-7be3-40ee-af4a-d110d1607356-inventory" (OuterVolumeSpecName: "inventory") pod "53edf58a-7be3-40ee-af4a-d110d1607356" (UID: "53edf58a-7be3-40ee-af4a-d110d1607356"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:33:33 crc kubenswrapper[4743]: I1011 01:33:33.854842 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53edf58a-7be3-40ee-af4a-d110d1607356-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "53edf58a-7be3-40ee-af4a-d110d1607356" (UID: "53edf58a-7be3-40ee-af4a-d110d1607356"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:33:33 crc kubenswrapper[4743]: I1011 01:33:33.858779 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53edf58a-7be3-40ee-af4a-d110d1607356-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "53edf58a-7be3-40ee-af4a-d110d1607356" (UID: "53edf58a-7be3-40ee-af4a-d110d1607356"). 
InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:33:33 crc kubenswrapper[4743]: I1011 01:33:33.922621 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v46xt\" (UniqueName: \"kubernetes.io/projected/53edf58a-7be3-40ee-af4a-d110d1607356-kube-api-access-v46xt\") on node \"crc\" DevicePath \"\"" Oct 11 01:33:33 crc kubenswrapper[4743]: I1011 01:33:33.922655 4743 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/53edf58a-7be3-40ee-af4a-d110d1607356-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Oct 11 01:33:33 crc kubenswrapper[4743]: I1011 01:33:33.922665 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/53edf58a-7be3-40ee-af4a-d110d1607356-inventory\") on node \"crc\" DevicePath \"\"" Oct 11 01:33:33 crc kubenswrapper[4743]: I1011 01:33:33.922674 4743 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53edf58a-7be3-40ee-af4a-d110d1607356-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 01:33:33 crc kubenswrapper[4743]: I1011 01:33:33.922683 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/53edf58a-7be3-40ee-af4a-d110d1607356-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 11 01:33:34 crc kubenswrapper[4743]: I1011 01:33:34.203536 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mm4w8" event={"ID":"53edf58a-7be3-40ee-af4a-d110d1607356","Type":"ContainerDied","Data":"194784a5b82826cd68de474dbbae81f1a035204609cec6fde1df35cd9fc0f11d"} Oct 11 01:33:34 crc kubenswrapper[4743]: I1011 01:33:34.203759 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mm4w8" Oct 11 01:33:34 crc kubenswrapper[4743]: I1011 01:33:34.203777 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="194784a5b82826cd68de474dbbae81f1a035204609cec6fde1df35cd9fc0f11d" Oct 11 01:33:34 crc kubenswrapper[4743]: I1011 01:33:34.305013 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2kmfr"] Oct 11 01:33:34 crc kubenswrapper[4743]: E1011 01:33:34.305421 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b645da6-d00b-4689-a341-bdf4e6e618a4" containerName="registry-server" Oct 11 01:33:34 crc kubenswrapper[4743]: I1011 01:33:34.305438 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b645da6-d00b-4689-a341-bdf4e6e618a4" containerName="registry-server" Oct 11 01:33:34 crc kubenswrapper[4743]: E1011 01:33:34.305472 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b645da6-d00b-4689-a341-bdf4e6e618a4" containerName="extract-utilities" Oct 11 01:33:34 crc kubenswrapper[4743]: I1011 01:33:34.305479 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b645da6-d00b-4689-a341-bdf4e6e618a4" containerName="extract-utilities" Oct 11 01:33:34 crc kubenswrapper[4743]: E1011 01:33:34.305491 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53edf58a-7be3-40ee-af4a-d110d1607356" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 11 01:33:34 crc kubenswrapper[4743]: I1011 01:33:34.305498 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="53edf58a-7be3-40ee-af4a-d110d1607356" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 11 01:33:34 crc kubenswrapper[4743]: E1011 01:33:34.305509 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b645da6-d00b-4689-a341-bdf4e6e618a4" containerName="extract-content" Oct 11 01:33:34 crc kubenswrapper[4743]: I1011 
01:33:34.305516 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b645da6-d00b-4689-a341-bdf4e6e618a4" containerName="extract-content" Oct 11 01:33:34 crc kubenswrapper[4743]: I1011 01:33:34.305716 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="53edf58a-7be3-40ee-af4a-d110d1607356" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 11 01:33:34 crc kubenswrapper[4743]: I1011 01:33:34.305729 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b645da6-d00b-4689-a341-bdf4e6e618a4" containerName="registry-server" Oct 11 01:33:34 crc kubenswrapper[4743]: I1011 01:33:34.306410 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2kmfr" Oct 11 01:33:34 crc kubenswrapper[4743]: I1011 01:33:34.310027 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 11 01:33:34 crc kubenswrapper[4743]: I1011 01:33:34.310402 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Oct 11 01:33:34 crc kubenswrapper[4743]: I1011 01:33:34.310655 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 11 01:33:34 crc kubenswrapper[4743]: I1011 01:33:34.310838 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8vmmn" Oct 11 01:33:34 crc kubenswrapper[4743]: I1011 01:33:34.311043 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 11 01:33:34 crc kubenswrapper[4743]: I1011 01:33:34.333571 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2kmfr"] Oct 11 01:33:34 crc kubenswrapper[4743]: I1011 01:33:34.434388 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4255878f-68bf-41cc-8f1a-6e38ac2e2401-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2kmfr\" (UID: \"4255878f-68bf-41cc-8f1a-6e38ac2e2401\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2kmfr" Oct 11 01:33:34 crc kubenswrapper[4743]: I1011 01:33:34.434700 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/4255878f-68bf-41cc-8f1a-6e38ac2e2401-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2kmfr\" (UID: \"4255878f-68bf-41cc-8f1a-6e38ac2e2401\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2kmfr" Oct 11 01:33:34 crc kubenswrapper[4743]: I1011 01:33:34.434751 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcgvd\" (UniqueName: \"kubernetes.io/projected/4255878f-68bf-41cc-8f1a-6e38ac2e2401-kube-api-access-fcgvd\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2kmfr\" (UID: \"4255878f-68bf-41cc-8f1a-6e38ac2e2401\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2kmfr" Oct 11 01:33:34 crc kubenswrapper[4743]: I1011 01:33:34.434890 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4255878f-68bf-41cc-8f1a-6e38ac2e2401-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2kmfr\" (UID: \"4255878f-68bf-41cc-8f1a-6e38ac2e2401\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2kmfr" Oct 11 01:33:34 crc kubenswrapper[4743]: I1011 01:33:34.434934 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: 
\"kubernetes.io/secret/4255878f-68bf-41cc-8f1a-6e38ac2e2401-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2kmfr\" (UID: \"4255878f-68bf-41cc-8f1a-6e38ac2e2401\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2kmfr" Oct 11 01:33:34 crc kubenswrapper[4743]: I1011 01:33:34.435082 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4255878f-68bf-41cc-8f1a-6e38ac2e2401-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2kmfr\" (UID: \"4255878f-68bf-41cc-8f1a-6e38ac2e2401\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2kmfr" Oct 11 01:33:34 crc kubenswrapper[4743]: I1011 01:33:34.435128 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/4255878f-68bf-41cc-8f1a-6e38ac2e2401-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2kmfr\" (UID: \"4255878f-68bf-41cc-8f1a-6e38ac2e2401\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2kmfr" Oct 11 01:33:34 crc kubenswrapper[4743]: I1011 01:33:34.537210 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/4255878f-68bf-41cc-8f1a-6e38ac2e2401-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2kmfr\" (UID: \"4255878f-68bf-41cc-8f1a-6e38ac2e2401\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2kmfr" Oct 11 01:33:34 crc kubenswrapper[4743]: I1011 01:33:34.537257 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcgvd\" (UniqueName: \"kubernetes.io/projected/4255878f-68bf-41cc-8f1a-6e38ac2e2401-kube-api-access-fcgvd\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2kmfr\" 
(UID: \"4255878f-68bf-41cc-8f1a-6e38ac2e2401\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2kmfr" Oct 11 01:33:34 crc kubenswrapper[4743]: I1011 01:33:34.537286 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/4255878f-68bf-41cc-8f1a-6e38ac2e2401-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2kmfr\" (UID: \"4255878f-68bf-41cc-8f1a-6e38ac2e2401\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2kmfr" Oct 11 01:33:34 crc kubenswrapper[4743]: I1011 01:33:34.537304 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4255878f-68bf-41cc-8f1a-6e38ac2e2401-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2kmfr\" (UID: \"4255878f-68bf-41cc-8f1a-6e38ac2e2401\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2kmfr" Oct 11 01:33:34 crc kubenswrapper[4743]: I1011 01:33:34.537345 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4255878f-68bf-41cc-8f1a-6e38ac2e2401-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2kmfr\" (UID: \"4255878f-68bf-41cc-8f1a-6e38ac2e2401\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2kmfr" Oct 11 01:33:34 crc kubenswrapper[4743]: I1011 01:33:34.537371 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/4255878f-68bf-41cc-8f1a-6e38ac2e2401-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2kmfr\" (UID: \"4255878f-68bf-41cc-8f1a-6e38ac2e2401\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2kmfr" Oct 11 01:33:34 crc kubenswrapper[4743]: I1011 01:33:34.537437 4743 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4255878f-68bf-41cc-8f1a-6e38ac2e2401-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2kmfr\" (UID: \"4255878f-68bf-41cc-8f1a-6e38ac2e2401\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2kmfr" Oct 11 01:33:34 crc kubenswrapper[4743]: I1011 01:33:34.541933 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/4255878f-68bf-41cc-8f1a-6e38ac2e2401-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2kmfr\" (UID: \"4255878f-68bf-41cc-8f1a-6e38ac2e2401\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2kmfr" Oct 11 01:33:34 crc kubenswrapper[4743]: I1011 01:33:34.544866 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4255878f-68bf-41cc-8f1a-6e38ac2e2401-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2kmfr\" (UID: \"4255878f-68bf-41cc-8f1a-6e38ac2e2401\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2kmfr" Oct 11 01:33:34 crc kubenswrapper[4743]: I1011 01:33:34.546675 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4255878f-68bf-41cc-8f1a-6e38ac2e2401-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2kmfr\" (UID: \"4255878f-68bf-41cc-8f1a-6e38ac2e2401\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2kmfr" Oct 11 01:33:34 crc kubenswrapper[4743]: I1011 01:33:34.546835 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/4255878f-68bf-41cc-8f1a-6e38ac2e2401-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2kmfr\" (UID: 
\"4255878f-68bf-41cc-8f1a-6e38ac2e2401\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2kmfr" Oct 11 01:33:34 crc kubenswrapper[4743]: I1011 01:33:34.548594 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4255878f-68bf-41cc-8f1a-6e38ac2e2401-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2kmfr\" (UID: \"4255878f-68bf-41cc-8f1a-6e38ac2e2401\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2kmfr" Oct 11 01:33:34 crc kubenswrapper[4743]: I1011 01:33:34.555722 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/4255878f-68bf-41cc-8f1a-6e38ac2e2401-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2kmfr\" (UID: \"4255878f-68bf-41cc-8f1a-6e38ac2e2401\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2kmfr" Oct 11 01:33:34 crc kubenswrapper[4743]: I1011 01:33:34.559031 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcgvd\" (UniqueName: \"kubernetes.io/projected/4255878f-68bf-41cc-8f1a-6e38ac2e2401-kube-api-access-fcgvd\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2kmfr\" (UID: \"4255878f-68bf-41cc-8f1a-6e38ac2e2401\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2kmfr" Oct 11 01:33:34 crc kubenswrapper[4743]: I1011 01:33:34.636955 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2kmfr" Oct 11 01:33:35 crc kubenswrapper[4743]: I1011 01:33:35.162770 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2kmfr"] Oct 11 01:33:35 crc kubenswrapper[4743]: I1011 01:33:35.213230 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2kmfr" event={"ID":"4255878f-68bf-41cc-8f1a-6e38ac2e2401","Type":"ContainerStarted","Data":"5be1738d831c0a12b79628f7d214268e1aee2f828dcd9cd4f597f958277a146b"} Oct 11 01:33:36 crc kubenswrapper[4743]: I1011 01:33:36.225442 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2kmfr" event={"ID":"4255878f-68bf-41cc-8f1a-6e38ac2e2401","Type":"ContainerStarted","Data":"fa1867cd662351046501c234689e2de6640e4bd19820fe084e7482f0ca1a8784"} Oct 11 01:33:36 crc kubenswrapper[4743]: I1011 01:33:36.243065 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2kmfr" podStartSLOduration=1.461352959 podStartE2EDuration="2.243042457s" podCreationTimestamp="2025-10-11 01:33:34 +0000 UTC" firstStartedPulling="2025-10-11 01:33:35.168035291 +0000 UTC m=+2509.821015688" lastFinishedPulling="2025-10-11 01:33:35.949724779 +0000 UTC m=+2510.602705186" observedRunningTime="2025-10-11 01:33:36.238596465 +0000 UTC m=+2510.891576862" watchObservedRunningTime="2025-10-11 01:33:36.243042457 +0000 UTC m=+2510.896022854" Oct 11 01:35:44 crc kubenswrapper[4743]: I1011 01:35:44.458255 4743 patch_prober.go:28] interesting pod/machine-config-daemon-cvm72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 01:35:44 crc kubenswrapper[4743]: 
I1011 01:35:44.458799 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 01:36:10 crc kubenswrapper[4743]: I1011 01:36:10.117926 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rnj5k"] Oct 11 01:36:10 crc kubenswrapper[4743]: I1011 01:36:10.121990 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rnj5k" Oct 11 01:36:10 crc kubenswrapper[4743]: I1011 01:36:10.137286 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rnj5k"] Oct 11 01:36:10 crc kubenswrapper[4743]: I1011 01:36:10.285080 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6893fc8e-9116-40a8-81dc-eb00208ee425-catalog-content\") pod \"redhat-operators-rnj5k\" (UID: \"6893fc8e-9116-40a8-81dc-eb00208ee425\") " pod="openshift-marketplace/redhat-operators-rnj5k" Oct 11 01:36:10 crc kubenswrapper[4743]: I1011 01:36:10.285294 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6893fc8e-9116-40a8-81dc-eb00208ee425-utilities\") pod \"redhat-operators-rnj5k\" (UID: \"6893fc8e-9116-40a8-81dc-eb00208ee425\") " pod="openshift-marketplace/redhat-operators-rnj5k" Oct 11 01:36:10 crc kubenswrapper[4743]: I1011 01:36:10.285516 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trhgh\" (UniqueName: \"kubernetes.io/projected/6893fc8e-9116-40a8-81dc-eb00208ee425-kube-api-access-trhgh\") pod 
\"redhat-operators-rnj5k\" (UID: \"6893fc8e-9116-40a8-81dc-eb00208ee425\") " pod="openshift-marketplace/redhat-operators-rnj5k" Oct 11 01:36:10 crc kubenswrapper[4743]: I1011 01:36:10.388321 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6893fc8e-9116-40a8-81dc-eb00208ee425-utilities\") pod \"redhat-operators-rnj5k\" (UID: \"6893fc8e-9116-40a8-81dc-eb00208ee425\") " pod="openshift-marketplace/redhat-operators-rnj5k" Oct 11 01:36:10 crc kubenswrapper[4743]: I1011 01:36:10.388384 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6893fc8e-9116-40a8-81dc-eb00208ee425-utilities\") pod \"redhat-operators-rnj5k\" (UID: \"6893fc8e-9116-40a8-81dc-eb00208ee425\") " pod="openshift-marketplace/redhat-operators-rnj5k" Oct 11 01:36:10 crc kubenswrapper[4743]: I1011 01:36:10.388493 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trhgh\" (UniqueName: \"kubernetes.io/projected/6893fc8e-9116-40a8-81dc-eb00208ee425-kube-api-access-trhgh\") pod \"redhat-operators-rnj5k\" (UID: \"6893fc8e-9116-40a8-81dc-eb00208ee425\") " pod="openshift-marketplace/redhat-operators-rnj5k" Oct 11 01:36:10 crc kubenswrapper[4743]: I1011 01:36:10.389124 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6893fc8e-9116-40a8-81dc-eb00208ee425-catalog-content\") pod \"redhat-operators-rnj5k\" (UID: \"6893fc8e-9116-40a8-81dc-eb00208ee425\") " pod="openshift-marketplace/redhat-operators-rnj5k" Oct 11 01:36:10 crc kubenswrapper[4743]: I1011 01:36:10.389426 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6893fc8e-9116-40a8-81dc-eb00208ee425-catalog-content\") pod \"redhat-operators-rnj5k\" (UID: 
\"6893fc8e-9116-40a8-81dc-eb00208ee425\") " pod="openshift-marketplace/redhat-operators-rnj5k" Oct 11 01:36:10 crc kubenswrapper[4743]: I1011 01:36:10.411032 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trhgh\" (UniqueName: \"kubernetes.io/projected/6893fc8e-9116-40a8-81dc-eb00208ee425-kube-api-access-trhgh\") pod \"redhat-operators-rnj5k\" (UID: \"6893fc8e-9116-40a8-81dc-eb00208ee425\") " pod="openshift-marketplace/redhat-operators-rnj5k" Oct 11 01:36:10 crc kubenswrapper[4743]: I1011 01:36:10.457671 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rnj5k" Oct 11 01:36:10 crc kubenswrapper[4743]: I1011 01:36:10.933834 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rnj5k"] Oct 11 01:36:11 crc kubenswrapper[4743]: I1011 01:36:11.000264 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rnj5k" event={"ID":"6893fc8e-9116-40a8-81dc-eb00208ee425","Type":"ContainerStarted","Data":"3bd1a1adeb104c121d6250a5cfa27d3c33000d5ecb4364dc48a4deb481388a8a"} Oct 11 01:36:12 crc kubenswrapper[4743]: I1011 01:36:12.015562 4743 generic.go:334] "Generic (PLEG): container finished" podID="6893fc8e-9116-40a8-81dc-eb00208ee425" containerID="d50f921df520a3b982eb3daa363b5ea05bf568b1b5f5fa5f7f3b43d90094b63d" exitCode=0 Oct 11 01:36:12 crc kubenswrapper[4743]: I1011 01:36:12.015826 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rnj5k" event={"ID":"6893fc8e-9116-40a8-81dc-eb00208ee425","Type":"ContainerDied","Data":"d50f921df520a3b982eb3daa363b5ea05bf568b1b5f5fa5f7f3b43d90094b63d"} Oct 11 01:36:14 crc kubenswrapper[4743]: I1011 01:36:14.048782 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rnj5k" 
event={"ID":"6893fc8e-9116-40a8-81dc-eb00208ee425","Type":"ContainerStarted","Data":"e0d28b40f0e83900b54f9274c1afd19a59b3f7e6a60c725a2f3dbbbe47a5eff3"} Oct 11 01:36:14 crc kubenswrapper[4743]: I1011 01:36:14.458061 4743 patch_prober.go:28] interesting pod/machine-config-daemon-cvm72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 01:36:14 crc kubenswrapper[4743]: I1011 01:36:14.458470 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 01:36:16 crc kubenswrapper[4743]: I1011 01:36:16.097352 4743 generic.go:334] "Generic (PLEG): container finished" podID="6893fc8e-9116-40a8-81dc-eb00208ee425" containerID="e0d28b40f0e83900b54f9274c1afd19a59b3f7e6a60c725a2f3dbbbe47a5eff3" exitCode=0 Oct 11 01:36:16 crc kubenswrapper[4743]: I1011 01:36:16.109784 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rnj5k" event={"ID":"6893fc8e-9116-40a8-81dc-eb00208ee425","Type":"ContainerDied","Data":"e0d28b40f0e83900b54f9274c1afd19a59b3f7e6a60c725a2f3dbbbe47a5eff3"} Oct 11 01:36:17 crc kubenswrapper[4743]: I1011 01:36:17.107970 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rnj5k" event={"ID":"6893fc8e-9116-40a8-81dc-eb00208ee425","Type":"ContainerStarted","Data":"2844370a565d8910dd3618631f6f007339317d84b6d25bb1a43d0ae67977d120"} Oct 11 01:36:17 crc kubenswrapper[4743]: I1011 01:36:17.130291 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rnj5k" 
podStartSLOduration=2.621245991 podStartE2EDuration="7.130267916s" podCreationTimestamp="2025-10-11 01:36:10 +0000 UTC" firstStartedPulling="2025-10-11 01:36:12.019214585 +0000 UTC m=+2666.672195022" lastFinishedPulling="2025-10-11 01:36:16.52823654 +0000 UTC m=+2671.181216947" observedRunningTime="2025-10-11 01:36:17.123297923 +0000 UTC m=+2671.776278320" watchObservedRunningTime="2025-10-11 01:36:17.130267916 +0000 UTC m=+2671.783248333" Oct 11 01:36:20 crc kubenswrapper[4743]: I1011 01:36:20.457966 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rnj5k" Oct 11 01:36:20 crc kubenswrapper[4743]: I1011 01:36:20.459652 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rnj5k" Oct 11 01:36:21 crc kubenswrapper[4743]: I1011 01:36:21.548962 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rnj5k" podUID="6893fc8e-9116-40a8-81dc-eb00208ee425" containerName="registry-server" probeResult="failure" output=< Oct 11 01:36:21 crc kubenswrapper[4743]: timeout: failed to connect service ":50051" within 1s Oct 11 01:36:21 crc kubenswrapper[4743]: > Oct 11 01:36:30 crc kubenswrapper[4743]: I1011 01:36:30.542212 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rnj5k" Oct 11 01:36:30 crc kubenswrapper[4743]: I1011 01:36:30.634132 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rnj5k" Oct 11 01:36:30 crc kubenswrapper[4743]: I1011 01:36:30.791987 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rnj5k"] Oct 11 01:36:32 crc kubenswrapper[4743]: I1011 01:36:32.274030 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-rnj5k" 
podUID="6893fc8e-9116-40a8-81dc-eb00208ee425" containerName="registry-server" containerID="cri-o://2844370a565d8910dd3618631f6f007339317d84b6d25bb1a43d0ae67977d120" gracePeriod=2 Oct 11 01:36:32 crc kubenswrapper[4743]: I1011 01:36:32.845027 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rnj5k" Oct 11 01:36:32 crc kubenswrapper[4743]: I1011 01:36:32.941981 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6893fc8e-9116-40a8-81dc-eb00208ee425-catalog-content\") pod \"6893fc8e-9116-40a8-81dc-eb00208ee425\" (UID: \"6893fc8e-9116-40a8-81dc-eb00208ee425\") " Oct 11 01:36:32 crc kubenswrapper[4743]: I1011 01:36:32.942432 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6893fc8e-9116-40a8-81dc-eb00208ee425-utilities\") pod \"6893fc8e-9116-40a8-81dc-eb00208ee425\" (UID: \"6893fc8e-9116-40a8-81dc-eb00208ee425\") " Oct 11 01:36:32 crc kubenswrapper[4743]: I1011 01:36:32.942483 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trhgh\" (UniqueName: \"kubernetes.io/projected/6893fc8e-9116-40a8-81dc-eb00208ee425-kube-api-access-trhgh\") pod \"6893fc8e-9116-40a8-81dc-eb00208ee425\" (UID: \"6893fc8e-9116-40a8-81dc-eb00208ee425\") " Oct 11 01:36:32 crc kubenswrapper[4743]: I1011 01:36:32.943095 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6893fc8e-9116-40a8-81dc-eb00208ee425-utilities" (OuterVolumeSpecName: "utilities") pod "6893fc8e-9116-40a8-81dc-eb00208ee425" (UID: "6893fc8e-9116-40a8-81dc-eb00208ee425"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:36:32 crc kubenswrapper[4743]: I1011 01:36:32.948848 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6893fc8e-9116-40a8-81dc-eb00208ee425-kube-api-access-trhgh" (OuterVolumeSpecName: "kube-api-access-trhgh") pod "6893fc8e-9116-40a8-81dc-eb00208ee425" (UID: "6893fc8e-9116-40a8-81dc-eb00208ee425"). InnerVolumeSpecName "kube-api-access-trhgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:36:33 crc kubenswrapper[4743]: I1011 01:36:33.030962 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6893fc8e-9116-40a8-81dc-eb00208ee425-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6893fc8e-9116-40a8-81dc-eb00208ee425" (UID: "6893fc8e-9116-40a8-81dc-eb00208ee425"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:36:33 crc kubenswrapper[4743]: I1011 01:36:33.044572 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-trhgh\" (UniqueName: \"kubernetes.io/projected/6893fc8e-9116-40a8-81dc-eb00208ee425-kube-api-access-trhgh\") on node \"crc\" DevicePath \"\"" Oct 11 01:36:33 crc kubenswrapper[4743]: I1011 01:36:33.044614 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6893fc8e-9116-40a8-81dc-eb00208ee425-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 11 01:36:33 crc kubenswrapper[4743]: I1011 01:36:33.044625 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6893fc8e-9116-40a8-81dc-eb00208ee425-utilities\") on node \"crc\" DevicePath \"\"" Oct 11 01:36:33 crc kubenswrapper[4743]: I1011 01:36:33.285838 4743 generic.go:334] "Generic (PLEG): container finished" podID="6893fc8e-9116-40a8-81dc-eb00208ee425" 
containerID="2844370a565d8910dd3618631f6f007339317d84b6d25bb1a43d0ae67977d120" exitCode=0 Oct 11 01:36:33 crc kubenswrapper[4743]: I1011 01:36:33.285914 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rnj5k" event={"ID":"6893fc8e-9116-40a8-81dc-eb00208ee425","Type":"ContainerDied","Data":"2844370a565d8910dd3618631f6f007339317d84b6d25bb1a43d0ae67977d120"} Oct 11 01:36:33 crc kubenswrapper[4743]: I1011 01:36:33.285923 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rnj5k" Oct 11 01:36:33 crc kubenswrapper[4743]: I1011 01:36:33.285947 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rnj5k" event={"ID":"6893fc8e-9116-40a8-81dc-eb00208ee425","Type":"ContainerDied","Data":"3bd1a1adeb104c121d6250a5cfa27d3c33000d5ecb4364dc48a4deb481388a8a"} Oct 11 01:36:33 crc kubenswrapper[4743]: I1011 01:36:33.285974 4743 scope.go:117] "RemoveContainer" containerID="2844370a565d8910dd3618631f6f007339317d84b6d25bb1a43d0ae67977d120" Oct 11 01:36:33 crc kubenswrapper[4743]: I1011 01:36:33.323110 4743 scope.go:117] "RemoveContainer" containerID="e0d28b40f0e83900b54f9274c1afd19a59b3f7e6a60c725a2f3dbbbe47a5eff3" Oct 11 01:36:33 crc kubenswrapper[4743]: I1011 01:36:33.323140 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rnj5k"] Oct 11 01:36:33 crc kubenswrapper[4743]: I1011 01:36:33.334953 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-rnj5k"] Oct 11 01:36:33 crc kubenswrapper[4743]: I1011 01:36:33.349180 4743 scope.go:117] "RemoveContainer" containerID="d50f921df520a3b982eb3daa363b5ea05bf568b1b5f5fa5f7f3b43d90094b63d" Oct 11 01:36:33 crc kubenswrapper[4743]: I1011 01:36:33.415715 4743 scope.go:117] "RemoveContainer" containerID="2844370a565d8910dd3618631f6f007339317d84b6d25bb1a43d0ae67977d120" Oct 11 01:36:33 crc 
kubenswrapper[4743]: E1011 01:36:33.416583 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2844370a565d8910dd3618631f6f007339317d84b6d25bb1a43d0ae67977d120\": container with ID starting with 2844370a565d8910dd3618631f6f007339317d84b6d25bb1a43d0ae67977d120 not found: ID does not exist" containerID="2844370a565d8910dd3618631f6f007339317d84b6d25bb1a43d0ae67977d120" Oct 11 01:36:33 crc kubenswrapper[4743]: I1011 01:36:33.416620 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2844370a565d8910dd3618631f6f007339317d84b6d25bb1a43d0ae67977d120"} err="failed to get container status \"2844370a565d8910dd3618631f6f007339317d84b6d25bb1a43d0ae67977d120\": rpc error: code = NotFound desc = could not find container \"2844370a565d8910dd3618631f6f007339317d84b6d25bb1a43d0ae67977d120\": container with ID starting with 2844370a565d8910dd3618631f6f007339317d84b6d25bb1a43d0ae67977d120 not found: ID does not exist" Oct 11 01:36:33 crc kubenswrapper[4743]: I1011 01:36:33.416657 4743 scope.go:117] "RemoveContainer" containerID="e0d28b40f0e83900b54f9274c1afd19a59b3f7e6a60c725a2f3dbbbe47a5eff3" Oct 11 01:36:33 crc kubenswrapper[4743]: E1011 01:36:33.417086 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0d28b40f0e83900b54f9274c1afd19a59b3f7e6a60c725a2f3dbbbe47a5eff3\": container with ID starting with e0d28b40f0e83900b54f9274c1afd19a59b3f7e6a60c725a2f3dbbbe47a5eff3 not found: ID does not exist" containerID="e0d28b40f0e83900b54f9274c1afd19a59b3f7e6a60c725a2f3dbbbe47a5eff3" Oct 11 01:36:33 crc kubenswrapper[4743]: I1011 01:36:33.417122 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0d28b40f0e83900b54f9274c1afd19a59b3f7e6a60c725a2f3dbbbe47a5eff3"} err="failed to get container status 
\"e0d28b40f0e83900b54f9274c1afd19a59b3f7e6a60c725a2f3dbbbe47a5eff3\": rpc error: code = NotFound desc = could not find container \"e0d28b40f0e83900b54f9274c1afd19a59b3f7e6a60c725a2f3dbbbe47a5eff3\": container with ID starting with e0d28b40f0e83900b54f9274c1afd19a59b3f7e6a60c725a2f3dbbbe47a5eff3 not found: ID does not exist" Oct 11 01:36:33 crc kubenswrapper[4743]: I1011 01:36:33.417149 4743 scope.go:117] "RemoveContainer" containerID="d50f921df520a3b982eb3daa363b5ea05bf568b1b5f5fa5f7f3b43d90094b63d" Oct 11 01:36:33 crc kubenswrapper[4743]: E1011 01:36:33.417391 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d50f921df520a3b982eb3daa363b5ea05bf568b1b5f5fa5f7f3b43d90094b63d\": container with ID starting with d50f921df520a3b982eb3daa363b5ea05bf568b1b5f5fa5f7f3b43d90094b63d not found: ID does not exist" containerID="d50f921df520a3b982eb3daa363b5ea05bf568b1b5f5fa5f7f3b43d90094b63d" Oct 11 01:36:33 crc kubenswrapper[4743]: I1011 01:36:33.417426 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d50f921df520a3b982eb3daa363b5ea05bf568b1b5f5fa5f7f3b43d90094b63d"} err="failed to get container status \"d50f921df520a3b982eb3daa363b5ea05bf568b1b5f5fa5f7f3b43d90094b63d\": rpc error: code = NotFound desc = could not find container \"d50f921df520a3b982eb3daa363b5ea05bf568b1b5f5fa5f7f3b43d90094b63d\": container with ID starting with d50f921df520a3b982eb3daa363b5ea05bf568b1b5f5fa5f7f3b43d90094b63d not found: ID does not exist" Oct 11 01:36:34 crc kubenswrapper[4743]: I1011 01:36:34.106102 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6893fc8e-9116-40a8-81dc-eb00208ee425" path="/var/lib/kubelet/pods/6893fc8e-9116-40a8-81dc-eb00208ee425/volumes" Oct 11 01:36:36 crc kubenswrapper[4743]: I1011 01:36:36.321731 4743 generic.go:334] "Generic (PLEG): container finished" podID="4255878f-68bf-41cc-8f1a-6e38ac2e2401" 
containerID="fa1867cd662351046501c234689e2de6640e4bd19820fe084e7482f0ca1a8784" exitCode=0 Oct 11 01:36:36 crc kubenswrapper[4743]: I1011 01:36:36.322117 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2kmfr" event={"ID":"4255878f-68bf-41cc-8f1a-6e38ac2e2401","Type":"ContainerDied","Data":"fa1867cd662351046501c234689e2de6640e4bd19820fe084e7482f0ca1a8784"} Oct 11 01:36:37 crc kubenswrapper[4743]: I1011 01:36:37.892797 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2kmfr" Oct 11 01:36:38 crc kubenswrapper[4743]: I1011 01:36:38.044963 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/4255878f-68bf-41cc-8f1a-6e38ac2e2401-ceilometer-compute-config-data-2\") pod \"4255878f-68bf-41cc-8f1a-6e38ac2e2401\" (UID: \"4255878f-68bf-41cc-8f1a-6e38ac2e2401\") " Oct 11 01:36:38 crc kubenswrapper[4743]: I1011 01:36:38.045335 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4255878f-68bf-41cc-8f1a-6e38ac2e2401-telemetry-combined-ca-bundle\") pod \"4255878f-68bf-41cc-8f1a-6e38ac2e2401\" (UID: \"4255878f-68bf-41cc-8f1a-6e38ac2e2401\") " Oct 11 01:36:38 crc kubenswrapper[4743]: I1011 01:36:38.045485 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcgvd\" (UniqueName: \"kubernetes.io/projected/4255878f-68bf-41cc-8f1a-6e38ac2e2401-kube-api-access-fcgvd\") pod \"4255878f-68bf-41cc-8f1a-6e38ac2e2401\" (UID: \"4255878f-68bf-41cc-8f1a-6e38ac2e2401\") " Oct 11 01:36:38 crc kubenswrapper[4743]: I1011 01:36:38.045604 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: 
\"kubernetes.io/secret/4255878f-68bf-41cc-8f1a-6e38ac2e2401-ceilometer-compute-config-data-1\") pod \"4255878f-68bf-41cc-8f1a-6e38ac2e2401\" (UID: \"4255878f-68bf-41cc-8f1a-6e38ac2e2401\") " Oct 11 01:36:38 crc kubenswrapper[4743]: I1011 01:36:38.045775 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4255878f-68bf-41cc-8f1a-6e38ac2e2401-inventory\") pod \"4255878f-68bf-41cc-8f1a-6e38ac2e2401\" (UID: \"4255878f-68bf-41cc-8f1a-6e38ac2e2401\") " Oct 11 01:36:38 crc kubenswrapper[4743]: I1011 01:36:38.045931 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4255878f-68bf-41cc-8f1a-6e38ac2e2401-ssh-key\") pod \"4255878f-68bf-41cc-8f1a-6e38ac2e2401\" (UID: \"4255878f-68bf-41cc-8f1a-6e38ac2e2401\") " Oct 11 01:36:38 crc kubenswrapper[4743]: I1011 01:36:38.045966 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/4255878f-68bf-41cc-8f1a-6e38ac2e2401-ceilometer-compute-config-data-0\") pod \"4255878f-68bf-41cc-8f1a-6e38ac2e2401\" (UID: \"4255878f-68bf-41cc-8f1a-6e38ac2e2401\") " Oct 11 01:36:38 crc kubenswrapper[4743]: I1011 01:36:38.050895 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4255878f-68bf-41cc-8f1a-6e38ac2e2401-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "4255878f-68bf-41cc-8f1a-6e38ac2e2401" (UID: "4255878f-68bf-41cc-8f1a-6e38ac2e2401"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:36:38 crc kubenswrapper[4743]: I1011 01:36:38.055040 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4255878f-68bf-41cc-8f1a-6e38ac2e2401-kube-api-access-fcgvd" (OuterVolumeSpecName: "kube-api-access-fcgvd") pod "4255878f-68bf-41cc-8f1a-6e38ac2e2401" (UID: "4255878f-68bf-41cc-8f1a-6e38ac2e2401"). InnerVolumeSpecName "kube-api-access-fcgvd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:36:38 crc kubenswrapper[4743]: I1011 01:36:38.077513 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4255878f-68bf-41cc-8f1a-6e38ac2e2401-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4255878f-68bf-41cc-8f1a-6e38ac2e2401" (UID: "4255878f-68bf-41cc-8f1a-6e38ac2e2401"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:36:38 crc kubenswrapper[4743]: I1011 01:36:38.078381 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4255878f-68bf-41cc-8f1a-6e38ac2e2401-inventory" (OuterVolumeSpecName: "inventory") pod "4255878f-68bf-41cc-8f1a-6e38ac2e2401" (UID: "4255878f-68bf-41cc-8f1a-6e38ac2e2401"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:36:38 crc kubenswrapper[4743]: I1011 01:36:38.079789 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4255878f-68bf-41cc-8f1a-6e38ac2e2401-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "4255878f-68bf-41cc-8f1a-6e38ac2e2401" (UID: "4255878f-68bf-41cc-8f1a-6e38ac2e2401"). InnerVolumeSpecName "ceilometer-compute-config-data-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:36:38 crc kubenswrapper[4743]: I1011 01:36:38.081085 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4255878f-68bf-41cc-8f1a-6e38ac2e2401-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "4255878f-68bf-41cc-8f1a-6e38ac2e2401" (UID: "4255878f-68bf-41cc-8f1a-6e38ac2e2401"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:36:38 crc kubenswrapper[4743]: I1011 01:36:38.086441 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4255878f-68bf-41cc-8f1a-6e38ac2e2401-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "4255878f-68bf-41cc-8f1a-6e38ac2e2401" (UID: "4255878f-68bf-41cc-8f1a-6e38ac2e2401"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:36:38 crc kubenswrapper[4743]: I1011 01:36:38.148766 4743 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/4255878f-68bf-41cc-8f1a-6e38ac2e2401-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Oct 11 01:36:38 crc kubenswrapper[4743]: I1011 01:36:38.148820 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4255878f-68bf-41cc-8f1a-6e38ac2e2401-inventory\") on node \"crc\" DevicePath \"\"" Oct 11 01:36:38 crc kubenswrapper[4743]: I1011 01:36:38.148831 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4255878f-68bf-41cc-8f1a-6e38ac2e2401-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 11 01:36:38 crc kubenswrapper[4743]: I1011 01:36:38.148876 4743 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" 
(UniqueName: \"kubernetes.io/secret/4255878f-68bf-41cc-8f1a-6e38ac2e2401-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Oct 11 01:36:38 crc kubenswrapper[4743]: I1011 01:36:38.148887 4743 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/4255878f-68bf-41cc-8f1a-6e38ac2e2401-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Oct 11 01:36:38 crc kubenswrapper[4743]: I1011 01:36:38.148899 4743 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4255878f-68bf-41cc-8f1a-6e38ac2e2401-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 01:36:38 crc kubenswrapper[4743]: I1011 01:36:38.148912 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcgvd\" (UniqueName: \"kubernetes.io/projected/4255878f-68bf-41cc-8f1a-6e38ac2e2401-kube-api-access-fcgvd\") on node \"crc\" DevicePath \"\"" Oct 11 01:36:38 crc kubenswrapper[4743]: I1011 01:36:38.370051 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2kmfr" event={"ID":"4255878f-68bf-41cc-8f1a-6e38ac2e2401","Type":"ContainerDied","Data":"5be1738d831c0a12b79628f7d214268e1aee2f828dcd9cd4f597f958277a146b"} Oct 11 01:36:38 crc kubenswrapper[4743]: I1011 01:36:38.370109 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5be1738d831c0a12b79628f7d214268e1aee2f828dcd9cd4f597f958277a146b" Oct 11 01:36:38 crc kubenswrapper[4743]: I1011 01:36:38.370636 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2kmfr" Oct 11 01:36:38 crc kubenswrapper[4743]: I1011 01:36:38.431910 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-bpxh7"] Oct 11 01:36:38 crc kubenswrapper[4743]: E1011 01:36:38.432407 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6893fc8e-9116-40a8-81dc-eb00208ee425" containerName="extract-content" Oct 11 01:36:38 crc kubenswrapper[4743]: I1011 01:36:38.432428 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="6893fc8e-9116-40a8-81dc-eb00208ee425" containerName="extract-content" Oct 11 01:36:38 crc kubenswrapper[4743]: E1011 01:36:38.432448 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4255878f-68bf-41cc-8f1a-6e38ac2e2401" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 11 01:36:38 crc kubenswrapper[4743]: I1011 01:36:38.432455 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="4255878f-68bf-41cc-8f1a-6e38ac2e2401" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 11 01:36:38 crc kubenswrapper[4743]: E1011 01:36:38.432476 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6893fc8e-9116-40a8-81dc-eb00208ee425" containerName="extract-utilities" Oct 11 01:36:38 crc kubenswrapper[4743]: I1011 01:36:38.432482 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="6893fc8e-9116-40a8-81dc-eb00208ee425" containerName="extract-utilities" Oct 11 01:36:38 crc kubenswrapper[4743]: E1011 01:36:38.432495 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6893fc8e-9116-40a8-81dc-eb00208ee425" containerName="registry-server" Oct 11 01:36:38 crc kubenswrapper[4743]: I1011 01:36:38.432500 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="6893fc8e-9116-40a8-81dc-eb00208ee425" containerName="registry-server" Oct 11 01:36:38 crc kubenswrapper[4743]: I1011 
01:36:38.432715 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="4255878f-68bf-41cc-8f1a-6e38ac2e2401" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 11 01:36:38 crc kubenswrapper[4743]: I1011 01:36:38.432822 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="6893fc8e-9116-40a8-81dc-eb00208ee425" containerName="registry-server" Oct 11 01:36:38 crc kubenswrapper[4743]: I1011 01:36:38.433662 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-bpxh7" Oct 11 01:36:38 crc kubenswrapper[4743]: I1011 01:36:38.435996 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 11 01:36:38 crc kubenswrapper[4743]: I1011 01:36:38.436353 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 11 01:36:38 crc kubenswrapper[4743]: I1011 01:36:38.436556 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8vmmn" Oct 11 01:36:38 crc kubenswrapper[4743]: I1011 01:36:38.436739 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 11 01:36:38 crc kubenswrapper[4743]: I1011 01:36:38.437056 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-ipmi-config-data" Oct 11 01:36:38 crc kubenswrapper[4743]: I1011 01:36:38.444877 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-bpxh7"] Oct 11 01:36:38 crc kubenswrapper[4743]: I1011 01:36:38.463075 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: 
\"kubernetes.io/secret/da7b50d8-258c-468b-8a7e-0ec7ae206309-ceilometer-ipmi-config-data-0\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-bpxh7\" (UID: \"da7b50d8-258c-468b-8a7e-0ec7ae206309\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-bpxh7" Oct 11 01:36:38 crc kubenswrapper[4743]: I1011 01:36:38.463320 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/da7b50d8-258c-468b-8a7e-0ec7ae206309-ceilometer-ipmi-config-data-1\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-bpxh7\" (UID: \"da7b50d8-258c-468b-8a7e-0ec7ae206309\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-bpxh7" Oct 11 01:36:38 crc kubenswrapper[4743]: I1011 01:36:38.463404 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da7b50d8-258c-468b-8a7e-0ec7ae206309-telemetry-power-monitoring-combined-ca-bundle\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-bpxh7\" (UID: \"da7b50d8-258c-468b-8a7e-0ec7ae206309\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-bpxh7" Oct 11 01:36:38 crc kubenswrapper[4743]: I1011 01:36:38.463531 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/da7b50d8-258c-468b-8a7e-0ec7ae206309-ceilometer-ipmi-config-data-2\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-bpxh7\" (UID: \"da7b50d8-258c-468b-8a7e-0ec7ae206309\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-bpxh7" Oct 11 01:36:38 crc kubenswrapper[4743]: I1011 01:36:38.463614 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"inventory\" (UniqueName: \"kubernetes.io/secret/da7b50d8-258c-468b-8a7e-0ec7ae206309-inventory\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-bpxh7\" (UID: \"da7b50d8-258c-468b-8a7e-0ec7ae206309\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-bpxh7" Oct 11 01:36:38 crc kubenswrapper[4743]: I1011 01:36:38.463803 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/da7b50d8-258c-468b-8a7e-0ec7ae206309-ssh-key\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-bpxh7\" (UID: \"da7b50d8-258c-468b-8a7e-0ec7ae206309\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-bpxh7" Oct 11 01:36:38 crc kubenswrapper[4743]: I1011 01:36:38.566134 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/da7b50d8-258c-468b-8a7e-0ec7ae206309-inventory\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-bpxh7\" (UID: \"da7b50d8-258c-468b-8a7e-0ec7ae206309\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-bpxh7" Oct 11 01:36:38 crc kubenswrapper[4743]: I1011 01:36:38.566446 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnxcw\" (UniqueName: \"kubernetes.io/projected/da7b50d8-258c-468b-8a7e-0ec7ae206309-kube-api-access-mnxcw\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-bpxh7\" (UID: \"da7b50d8-258c-468b-8a7e-0ec7ae206309\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-bpxh7" Oct 11 01:36:38 crc kubenswrapper[4743]: I1011 01:36:38.566560 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/da7b50d8-258c-468b-8a7e-0ec7ae206309-ssh-key\") pod 
\"telemetry-power-monitoring-edpm-deployment-openstack-edpm-bpxh7\" (UID: \"da7b50d8-258c-468b-8a7e-0ec7ae206309\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-bpxh7" Oct 11 01:36:38 crc kubenswrapper[4743]: I1011 01:36:38.566667 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/da7b50d8-258c-468b-8a7e-0ec7ae206309-ceilometer-ipmi-config-data-0\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-bpxh7\" (UID: \"da7b50d8-258c-468b-8a7e-0ec7ae206309\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-bpxh7" Oct 11 01:36:38 crc kubenswrapper[4743]: I1011 01:36:38.566703 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/da7b50d8-258c-468b-8a7e-0ec7ae206309-ceilometer-ipmi-config-data-1\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-bpxh7\" (UID: \"da7b50d8-258c-468b-8a7e-0ec7ae206309\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-bpxh7" Oct 11 01:36:38 crc kubenswrapper[4743]: I1011 01:36:38.566728 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da7b50d8-258c-468b-8a7e-0ec7ae206309-telemetry-power-monitoring-combined-ca-bundle\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-bpxh7\" (UID: \"da7b50d8-258c-468b-8a7e-0ec7ae206309\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-bpxh7" Oct 11 01:36:38 crc kubenswrapper[4743]: I1011 01:36:38.566769 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/da7b50d8-258c-468b-8a7e-0ec7ae206309-ceilometer-ipmi-config-data-2\") pod 
\"telemetry-power-monitoring-edpm-deployment-openstack-edpm-bpxh7\" (UID: \"da7b50d8-258c-468b-8a7e-0ec7ae206309\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-bpxh7" Oct 11 01:36:38 crc kubenswrapper[4743]: I1011 01:36:38.570383 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/da7b50d8-258c-468b-8a7e-0ec7ae206309-ceilometer-ipmi-config-data-1\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-bpxh7\" (UID: \"da7b50d8-258c-468b-8a7e-0ec7ae206309\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-bpxh7" Oct 11 01:36:38 crc kubenswrapper[4743]: I1011 01:36:38.570599 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/da7b50d8-258c-468b-8a7e-0ec7ae206309-ssh-key\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-bpxh7\" (UID: \"da7b50d8-258c-468b-8a7e-0ec7ae206309\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-bpxh7" Oct 11 01:36:38 crc kubenswrapper[4743]: I1011 01:36:38.570905 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/da7b50d8-258c-468b-8a7e-0ec7ae206309-ceilometer-ipmi-config-data-0\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-bpxh7\" (UID: \"da7b50d8-258c-468b-8a7e-0ec7ae206309\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-bpxh7" Oct 11 01:36:38 crc kubenswrapper[4743]: I1011 01:36:38.571619 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/da7b50d8-258c-468b-8a7e-0ec7ae206309-ceilometer-ipmi-config-data-2\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-bpxh7\" (UID: \"da7b50d8-258c-468b-8a7e-0ec7ae206309\") " 
pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-bpxh7" Oct 11 01:36:38 crc kubenswrapper[4743]: I1011 01:36:38.571851 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/da7b50d8-258c-468b-8a7e-0ec7ae206309-inventory\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-bpxh7\" (UID: \"da7b50d8-258c-468b-8a7e-0ec7ae206309\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-bpxh7" Oct 11 01:36:38 crc kubenswrapper[4743]: I1011 01:36:38.573045 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da7b50d8-258c-468b-8a7e-0ec7ae206309-telemetry-power-monitoring-combined-ca-bundle\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-bpxh7\" (UID: \"da7b50d8-258c-468b-8a7e-0ec7ae206309\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-bpxh7" Oct 11 01:36:38 crc kubenswrapper[4743]: I1011 01:36:38.667717 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnxcw\" (UniqueName: \"kubernetes.io/projected/da7b50d8-258c-468b-8a7e-0ec7ae206309-kube-api-access-mnxcw\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-bpxh7\" (UID: \"da7b50d8-258c-468b-8a7e-0ec7ae206309\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-bpxh7" Oct 11 01:36:38 crc kubenswrapper[4743]: I1011 01:36:38.685983 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnxcw\" (UniqueName: \"kubernetes.io/projected/da7b50d8-258c-468b-8a7e-0ec7ae206309-kube-api-access-mnxcw\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-bpxh7\" (UID: \"da7b50d8-258c-468b-8a7e-0ec7ae206309\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-bpxh7" Oct 11 01:36:38 crc 
kubenswrapper[4743]: I1011 01:36:38.762828 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-bpxh7" Oct 11 01:36:39 crc kubenswrapper[4743]: I1011 01:36:39.325287 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-bpxh7"] Oct 11 01:36:39 crc kubenswrapper[4743]: I1011 01:36:39.380916 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-bpxh7" event={"ID":"da7b50d8-258c-468b-8a7e-0ec7ae206309","Type":"ContainerStarted","Data":"d5a0ef89fb149c59d32ec7f9470212ae5371805e688801a34af2dcb360c3547c"} Oct 11 01:36:41 crc kubenswrapper[4743]: I1011 01:36:41.407594 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-bpxh7" event={"ID":"da7b50d8-258c-468b-8a7e-0ec7ae206309","Type":"ContainerStarted","Data":"50eea15b2740c117a462c411d8db4768720a017834258dfe8ecaf0b4e24b0453"} Oct 11 01:36:41 crc kubenswrapper[4743]: I1011 01:36:41.432182 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-bpxh7" podStartSLOduration=2.535495535 podStartE2EDuration="3.432163531s" podCreationTimestamp="2025-10-11 01:36:38 +0000 UTC" firstStartedPulling="2025-10-11 01:36:39.329109246 +0000 UTC m=+2693.982089643" lastFinishedPulling="2025-10-11 01:36:40.225777242 +0000 UTC m=+2694.878757639" observedRunningTime="2025-10-11 01:36:41.430156441 +0000 UTC m=+2696.083136848" watchObservedRunningTime="2025-10-11 01:36:41.432163531 +0000 UTC m=+2696.085143938" Oct 11 01:36:44 crc kubenswrapper[4743]: I1011 01:36:44.458677 4743 patch_prober.go:28] interesting pod/machine-config-daemon-cvm72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 01:36:44 crc kubenswrapper[4743]: I1011 01:36:44.459003 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 01:36:44 crc kubenswrapper[4743]: I1011 01:36:44.459056 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" Oct 11 01:36:44 crc kubenswrapper[4743]: I1011 01:36:44.459920 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ed4b65bd7e78e9bd75562b390e48eafdf4eff27b1128523c7bfdb6570c4e9641"} pod="openshift-machine-config-operator/machine-config-daemon-cvm72" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 11 01:36:44 crc kubenswrapper[4743]: I1011 01:36:44.460240 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" containerID="cri-o://ed4b65bd7e78e9bd75562b390e48eafdf4eff27b1128523c7bfdb6570c4e9641" gracePeriod=600 Oct 11 01:36:45 crc kubenswrapper[4743]: I1011 01:36:45.461830 4743 generic.go:334] "Generic (PLEG): container finished" podID="add92263-e252-446b-95de-092585b4357f" containerID="ed4b65bd7e78e9bd75562b390e48eafdf4eff27b1128523c7bfdb6570c4e9641" exitCode=0 Oct 11 01:36:45 crc kubenswrapper[4743]: I1011 01:36:45.461983 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" 
event={"ID":"add92263-e252-446b-95de-092585b4357f","Type":"ContainerDied","Data":"ed4b65bd7e78e9bd75562b390e48eafdf4eff27b1128523c7bfdb6570c4e9641"} Oct 11 01:36:45 crc kubenswrapper[4743]: I1011 01:36:45.462408 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" event={"ID":"add92263-e252-446b-95de-092585b4357f","Type":"ContainerStarted","Data":"978234a1553219d6d97cc060a945d4762c94dc6227d3827ebf03581b02816723"} Oct 11 01:36:45 crc kubenswrapper[4743]: I1011 01:36:45.462448 4743 scope.go:117] "RemoveContainer" containerID="cf9dc144cbba351cee736f9d035ea7c635ec0460821f8e9aa803ef97ac3675ae" Oct 11 01:38:19 crc kubenswrapper[4743]: I1011 01:38:19.912330 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nfztq"] Oct 11 01:38:19 crc kubenswrapper[4743]: I1011 01:38:19.915484 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nfztq" Oct 11 01:38:19 crc kubenswrapper[4743]: I1011 01:38:19.936597 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nfztq"] Oct 11 01:38:20 crc kubenswrapper[4743]: I1011 01:38:20.099512 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tnjq\" (UniqueName: \"kubernetes.io/projected/edfbefd6-a8c7-4e3b-80db-b8c0c1dba35f-kube-api-access-7tnjq\") pod \"community-operators-nfztq\" (UID: \"edfbefd6-a8c7-4e3b-80db-b8c0c1dba35f\") " pod="openshift-marketplace/community-operators-nfztq" Oct 11 01:38:20 crc kubenswrapper[4743]: I1011 01:38:20.100341 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edfbefd6-a8c7-4e3b-80db-b8c0c1dba35f-catalog-content\") pod \"community-operators-nfztq\" (UID: \"edfbefd6-a8c7-4e3b-80db-b8c0c1dba35f\") " 
pod="openshift-marketplace/community-operators-nfztq" Oct 11 01:38:20 crc kubenswrapper[4743]: I1011 01:38:20.101240 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edfbefd6-a8c7-4e3b-80db-b8c0c1dba35f-utilities\") pod \"community-operators-nfztq\" (UID: \"edfbefd6-a8c7-4e3b-80db-b8c0c1dba35f\") " pod="openshift-marketplace/community-operators-nfztq" Oct 11 01:38:20 crc kubenswrapper[4743]: I1011 01:38:20.203833 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tnjq\" (UniqueName: \"kubernetes.io/projected/edfbefd6-a8c7-4e3b-80db-b8c0c1dba35f-kube-api-access-7tnjq\") pod \"community-operators-nfztq\" (UID: \"edfbefd6-a8c7-4e3b-80db-b8c0c1dba35f\") " pod="openshift-marketplace/community-operators-nfztq" Oct 11 01:38:20 crc kubenswrapper[4743]: I1011 01:38:20.203899 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edfbefd6-a8c7-4e3b-80db-b8c0c1dba35f-catalog-content\") pod \"community-operators-nfztq\" (UID: \"edfbefd6-a8c7-4e3b-80db-b8c0c1dba35f\") " pod="openshift-marketplace/community-operators-nfztq" Oct 11 01:38:20 crc kubenswrapper[4743]: I1011 01:38:20.203965 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edfbefd6-a8c7-4e3b-80db-b8c0c1dba35f-utilities\") pod \"community-operators-nfztq\" (UID: \"edfbefd6-a8c7-4e3b-80db-b8c0c1dba35f\") " pod="openshift-marketplace/community-operators-nfztq" Oct 11 01:38:20 crc kubenswrapper[4743]: I1011 01:38:20.204450 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edfbefd6-a8c7-4e3b-80db-b8c0c1dba35f-utilities\") pod \"community-operators-nfztq\" (UID: \"edfbefd6-a8c7-4e3b-80db-b8c0c1dba35f\") " 
pod="openshift-marketplace/community-operators-nfztq" Oct 11 01:38:20 crc kubenswrapper[4743]: I1011 01:38:20.204608 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edfbefd6-a8c7-4e3b-80db-b8c0c1dba35f-catalog-content\") pod \"community-operators-nfztq\" (UID: \"edfbefd6-a8c7-4e3b-80db-b8c0c1dba35f\") " pod="openshift-marketplace/community-operators-nfztq" Oct 11 01:38:20 crc kubenswrapper[4743]: I1011 01:38:20.224431 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tnjq\" (UniqueName: \"kubernetes.io/projected/edfbefd6-a8c7-4e3b-80db-b8c0c1dba35f-kube-api-access-7tnjq\") pod \"community-operators-nfztq\" (UID: \"edfbefd6-a8c7-4e3b-80db-b8c0c1dba35f\") " pod="openshift-marketplace/community-operators-nfztq" Oct 11 01:38:20 crc kubenswrapper[4743]: I1011 01:38:20.239810 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nfztq" Oct 11 01:38:20 crc kubenswrapper[4743]: W1011 01:38:20.769916 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podedfbefd6_a8c7_4e3b_80db_b8c0c1dba35f.slice/crio-857163249d5dfa8360f37eb2362f7d3ae8081846b2fdf13a58c95f3c053e7d71 WatchSource:0}: Error finding container 857163249d5dfa8360f37eb2362f7d3ae8081846b2fdf13a58c95f3c053e7d71: Status 404 returned error can't find the container with id 857163249d5dfa8360f37eb2362f7d3ae8081846b2fdf13a58c95f3c053e7d71 Oct 11 01:38:20 crc kubenswrapper[4743]: I1011 01:38:20.772388 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nfztq"] Oct 11 01:38:21 crc kubenswrapper[4743]: I1011 01:38:21.606629 4743 generic.go:334] "Generic (PLEG): container finished" podID="edfbefd6-a8c7-4e3b-80db-b8c0c1dba35f" containerID="344dabdf65632a8fa0e88f8aeac59c3554d94fd545197f2be46904061a59b2f2" 
exitCode=0 Oct 11 01:38:21 crc kubenswrapper[4743]: I1011 01:38:21.606678 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nfztq" event={"ID":"edfbefd6-a8c7-4e3b-80db-b8c0c1dba35f","Type":"ContainerDied","Data":"344dabdf65632a8fa0e88f8aeac59c3554d94fd545197f2be46904061a59b2f2"} Oct 11 01:38:21 crc kubenswrapper[4743]: I1011 01:38:21.606707 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nfztq" event={"ID":"edfbefd6-a8c7-4e3b-80db-b8c0c1dba35f","Type":"ContainerStarted","Data":"857163249d5dfa8360f37eb2362f7d3ae8081846b2fdf13a58c95f3c053e7d71"} Oct 11 01:38:21 crc kubenswrapper[4743]: I1011 01:38:21.610796 4743 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 11 01:38:22 crc kubenswrapper[4743]: I1011 01:38:22.619316 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nfztq" event={"ID":"edfbefd6-a8c7-4e3b-80db-b8c0c1dba35f","Type":"ContainerStarted","Data":"9eb66e455a518eeca6a79024dfc9cc191e3d6e0201632ad3357e2bc5f9c4ad2d"} Oct 11 01:38:23 crc kubenswrapper[4743]: I1011 01:38:23.636900 4743 generic.go:334] "Generic (PLEG): container finished" podID="edfbefd6-a8c7-4e3b-80db-b8c0c1dba35f" containerID="9eb66e455a518eeca6a79024dfc9cc191e3d6e0201632ad3357e2bc5f9c4ad2d" exitCode=0 Oct 11 01:38:23 crc kubenswrapper[4743]: I1011 01:38:23.636991 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nfztq" event={"ID":"edfbefd6-a8c7-4e3b-80db-b8c0c1dba35f","Type":"ContainerDied","Data":"9eb66e455a518eeca6a79024dfc9cc191e3d6e0201632ad3357e2bc5f9c4ad2d"} Oct 11 01:38:24 crc kubenswrapper[4743]: I1011 01:38:24.648638 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nfztq" 
event={"ID":"edfbefd6-a8c7-4e3b-80db-b8c0c1dba35f","Type":"ContainerStarted","Data":"d3c36b882a00e7412fb2648d8e21f0a61ea8ace85a92e99a5c3aff12199c41b6"} Oct 11 01:38:24 crc kubenswrapper[4743]: I1011 01:38:24.666331 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nfztq" podStartSLOduration=3.202899161 podStartE2EDuration="5.666312697s" podCreationTimestamp="2025-10-11 01:38:19 +0000 UTC" firstStartedPulling="2025-10-11 01:38:21.610236947 +0000 UTC m=+2796.263217374" lastFinishedPulling="2025-10-11 01:38:24.073650503 +0000 UTC m=+2798.726630910" observedRunningTime="2025-10-11 01:38:24.663189019 +0000 UTC m=+2799.316169446" watchObservedRunningTime="2025-10-11 01:38:24.666312697 +0000 UTC m=+2799.319293104" Oct 11 01:38:30 crc kubenswrapper[4743]: I1011 01:38:30.241154 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nfztq" Oct 11 01:38:30 crc kubenswrapper[4743]: I1011 01:38:30.242314 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nfztq" Oct 11 01:38:30 crc kubenswrapper[4743]: I1011 01:38:30.294500 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nfztq" Oct 11 01:38:30 crc kubenswrapper[4743]: I1011 01:38:30.815103 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nfztq" Oct 11 01:38:30 crc kubenswrapper[4743]: I1011 01:38:30.870225 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nfztq"] Oct 11 01:38:32 crc kubenswrapper[4743]: I1011 01:38:32.751905 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nfztq" podUID="edfbefd6-a8c7-4e3b-80db-b8c0c1dba35f" containerName="registry-server" 
containerID="cri-o://d3c36b882a00e7412fb2648d8e21f0a61ea8ace85a92e99a5c3aff12199c41b6" gracePeriod=2 Oct 11 01:38:33 crc kubenswrapper[4743]: I1011 01:38:33.764354 4743 generic.go:334] "Generic (PLEG): container finished" podID="edfbefd6-a8c7-4e3b-80db-b8c0c1dba35f" containerID="d3c36b882a00e7412fb2648d8e21f0a61ea8ace85a92e99a5c3aff12199c41b6" exitCode=0 Oct 11 01:38:33 crc kubenswrapper[4743]: I1011 01:38:33.764428 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nfztq" event={"ID":"edfbefd6-a8c7-4e3b-80db-b8c0c1dba35f","Type":"ContainerDied","Data":"d3c36b882a00e7412fb2648d8e21f0a61ea8ace85a92e99a5c3aff12199c41b6"} Oct 11 01:38:33 crc kubenswrapper[4743]: I1011 01:38:33.765170 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nfztq" event={"ID":"edfbefd6-a8c7-4e3b-80db-b8c0c1dba35f","Type":"ContainerDied","Data":"857163249d5dfa8360f37eb2362f7d3ae8081846b2fdf13a58c95f3c053e7d71"} Oct 11 01:38:33 crc kubenswrapper[4743]: I1011 01:38:33.765218 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="857163249d5dfa8360f37eb2362f7d3ae8081846b2fdf13a58c95f3c053e7d71" Oct 11 01:38:33 crc kubenswrapper[4743]: I1011 01:38:33.823271 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nfztq" Oct 11 01:38:33 crc kubenswrapper[4743]: I1011 01:38:33.928564 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7tnjq\" (UniqueName: \"kubernetes.io/projected/edfbefd6-a8c7-4e3b-80db-b8c0c1dba35f-kube-api-access-7tnjq\") pod \"edfbefd6-a8c7-4e3b-80db-b8c0c1dba35f\" (UID: \"edfbefd6-a8c7-4e3b-80db-b8c0c1dba35f\") " Oct 11 01:38:33 crc kubenswrapper[4743]: I1011 01:38:33.928726 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edfbefd6-a8c7-4e3b-80db-b8c0c1dba35f-utilities\") pod \"edfbefd6-a8c7-4e3b-80db-b8c0c1dba35f\" (UID: \"edfbefd6-a8c7-4e3b-80db-b8c0c1dba35f\") " Oct 11 01:38:33 crc kubenswrapper[4743]: I1011 01:38:33.928816 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edfbefd6-a8c7-4e3b-80db-b8c0c1dba35f-catalog-content\") pod \"edfbefd6-a8c7-4e3b-80db-b8c0c1dba35f\" (UID: \"edfbefd6-a8c7-4e3b-80db-b8c0c1dba35f\") " Oct 11 01:38:33 crc kubenswrapper[4743]: I1011 01:38:33.930070 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/edfbefd6-a8c7-4e3b-80db-b8c0c1dba35f-utilities" (OuterVolumeSpecName: "utilities") pod "edfbefd6-a8c7-4e3b-80db-b8c0c1dba35f" (UID: "edfbefd6-a8c7-4e3b-80db-b8c0c1dba35f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:38:33 crc kubenswrapper[4743]: I1011 01:38:33.936936 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edfbefd6-a8c7-4e3b-80db-b8c0c1dba35f-kube-api-access-7tnjq" (OuterVolumeSpecName: "kube-api-access-7tnjq") pod "edfbefd6-a8c7-4e3b-80db-b8c0c1dba35f" (UID: "edfbefd6-a8c7-4e3b-80db-b8c0c1dba35f"). InnerVolumeSpecName "kube-api-access-7tnjq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:38:33 crc kubenswrapper[4743]: I1011 01:38:33.993618 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/edfbefd6-a8c7-4e3b-80db-b8c0c1dba35f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "edfbefd6-a8c7-4e3b-80db-b8c0c1dba35f" (UID: "edfbefd6-a8c7-4e3b-80db-b8c0c1dba35f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:38:34 crc kubenswrapper[4743]: I1011 01:38:34.032289 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7tnjq\" (UniqueName: \"kubernetes.io/projected/edfbefd6-a8c7-4e3b-80db-b8c0c1dba35f-kube-api-access-7tnjq\") on node \"crc\" DevicePath \"\"" Oct 11 01:38:34 crc kubenswrapper[4743]: I1011 01:38:34.032327 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edfbefd6-a8c7-4e3b-80db-b8c0c1dba35f-utilities\") on node \"crc\" DevicePath \"\"" Oct 11 01:38:34 crc kubenswrapper[4743]: I1011 01:38:34.032337 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edfbefd6-a8c7-4e3b-80db-b8c0c1dba35f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 11 01:38:34 crc kubenswrapper[4743]: I1011 01:38:34.776063 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nfztq" Oct 11 01:38:34 crc kubenswrapper[4743]: I1011 01:38:34.816634 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nfztq"] Oct 11 01:38:34 crc kubenswrapper[4743]: I1011 01:38:34.826638 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nfztq"] Oct 11 01:38:36 crc kubenswrapper[4743]: I1011 01:38:36.118986 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="edfbefd6-a8c7-4e3b-80db-b8c0c1dba35f" path="/var/lib/kubelet/pods/edfbefd6-a8c7-4e3b-80db-b8c0c1dba35f/volumes" Oct 11 01:38:44 crc kubenswrapper[4743]: I1011 01:38:44.458536 4743 patch_prober.go:28] interesting pod/machine-config-daemon-cvm72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 01:38:44 crc kubenswrapper[4743]: I1011 01:38:44.460083 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 01:39:13 crc kubenswrapper[4743]: I1011 01:39:13.352448 4743 generic.go:334] "Generic (PLEG): container finished" podID="da7b50d8-258c-468b-8a7e-0ec7ae206309" containerID="50eea15b2740c117a462c411d8db4768720a017834258dfe8ecaf0b4e24b0453" exitCode=0 Oct 11 01:39:13 crc kubenswrapper[4743]: I1011 01:39:13.352533 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-bpxh7" 
event={"ID":"da7b50d8-258c-468b-8a7e-0ec7ae206309","Type":"ContainerDied","Data":"50eea15b2740c117a462c411d8db4768720a017834258dfe8ecaf0b4e24b0453"} Oct 11 01:39:14 crc kubenswrapper[4743]: I1011 01:39:14.458620 4743 patch_prober.go:28] interesting pod/machine-config-daemon-cvm72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 01:39:14 crc kubenswrapper[4743]: I1011 01:39:14.458961 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 01:39:14 crc kubenswrapper[4743]: I1011 01:39:14.886241 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-bpxh7" Oct 11 01:39:14 crc kubenswrapper[4743]: I1011 01:39:14.971319 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/da7b50d8-258c-468b-8a7e-0ec7ae206309-ceilometer-ipmi-config-data-2\") pod \"da7b50d8-258c-468b-8a7e-0ec7ae206309\" (UID: \"da7b50d8-258c-468b-8a7e-0ec7ae206309\") " Oct 11 01:39:14 crc kubenswrapper[4743]: I1011 01:39:14.971403 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/da7b50d8-258c-468b-8a7e-0ec7ae206309-ceilometer-ipmi-config-data-1\") pod \"da7b50d8-258c-468b-8a7e-0ec7ae206309\" (UID: \"da7b50d8-258c-468b-8a7e-0ec7ae206309\") " Oct 11 01:39:14 crc kubenswrapper[4743]: I1011 01:39:14.971465 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da7b50d8-258c-468b-8a7e-0ec7ae206309-telemetry-power-monitoring-combined-ca-bundle\") pod \"da7b50d8-258c-468b-8a7e-0ec7ae206309\" (UID: \"da7b50d8-258c-468b-8a7e-0ec7ae206309\") " Oct 11 01:39:14 crc kubenswrapper[4743]: I1011 01:39:14.971548 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/da7b50d8-258c-468b-8a7e-0ec7ae206309-ssh-key\") pod \"da7b50d8-258c-468b-8a7e-0ec7ae206309\" (UID: \"da7b50d8-258c-468b-8a7e-0ec7ae206309\") " Oct 11 01:39:14 crc kubenswrapper[4743]: I1011 01:39:14.971585 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/da7b50d8-258c-468b-8a7e-0ec7ae206309-ceilometer-ipmi-config-data-0\") pod \"da7b50d8-258c-468b-8a7e-0ec7ae206309\" (UID: \"da7b50d8-258c-468b-8a7e-0ec7ae206309\") " 
Oct 11 01:39:14 crc kubenswrapper[4743]: I1011 01:39:14.971639 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/da7b50d8-258c-468b-8a7e-0ec7ae206309-inventory\") pod \"da7b50d8-258c-468b-8a7e-0ec7ae206309\" (UID: \"da7b50d8-258c-468b-8a7e-0ec7ae206309\") " Oct 11 01:39:14 crc kubenswrapper[4743]: I1011 01:39:14.971786 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnxcw\" (UniqueName: \"kubernetes.io/projected/da7b50d8-258c-468b-8a7e-0ec7ae206309-kube-api-access-mnxcw\") pod \"da7b50d8-258c-468b-8a7e-0ec7ae206309\" (UID: \"da7b50d8-258c-468b-8a7e-0ec7ae206309\") " Oct 11 01:39:14 crc kubenswrapper[4743]: I1011 01:39:14.980178 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da7b50d8-258c-468b-8a7e-0ec7ae206309-telemetry-power-monitoring-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-power-monitoring-combined-ca-bundle") pod "da7b50d8-258c-468b-8a7e-0ec7ae206309" (UID: "da7b50d8-258c-468b-8a7e-0ec7ae206309"). InnerVolumeSpecName "telemetry-power-monitoring-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:39:14 crc kubenswrapper[4743]: I1011 01:39:14.980260 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da7b50d8-258c-468b-8a7e-0ec7ae206309-kube-api-access-mnxcw" (OuterVolumeSpecName: "kube-api-access-mnxcw") pod "da7b50d8-258c-468b-8a7e-0ec7ae206309" (UID: "da7b50d8-258c-468b-8a7e-0ec7ae206309"). InnerVolumeSpecName "kube-api-access-mnxcw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:39:15 crc kubenswrapper[4743]: I1011 01:39:15.016312 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da7b50d8-258c-468b-8a7e-0ec7ae206309-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "da7b50d8-258c-468b-8a7e-0ec7ae206309" (UID: "da7b50d8-258c-468b-8a7e-0ec7ae206309"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:39:15 crc kubenswrapper[4743]: I1011 01:39:15.018964 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da7b50d8-258c-468b-8a7e-0ec7ae206309-ceilometer-ipmi-config-data-1" (OuterVolumeSpecName: "ceilometer-ipmi-config-data-1") pod "da7b50d8-258c-468b-8a7e-0ec7ae206309" (UID: "da7b50d8-258c-468b-8a7e-0ec7ae206309"). InnerVolumeSpecName "ceilometer-ipmi-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:39:15 crc kubenswrapper[4743]: I1011 01:39:15.033735 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da7b50d8-258c-468b-8a7e-0ec7ae206309-ceilometer-ipmi-config-data-0" (OuterVolumeSpecName: "ceilometer-ipmi-config-data-0") pod "da7b50d8-258c-468b-8a7e-0ec7ae206309" (UID: "da7b50d8-258c-468b-8a7e-0ec7ae206309"). InnerVolumeSpecName "ceilometer-ipmi-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:39:15 crc kubenswrapper[4743]: I1011 01:39:15.035122 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da7b50d8-258c-468b-8a7e-0ec7ae206309-inventory" (OuterVolumeSpecName: "inventory") pod "da7b50d8-258c-468b-8a7e-0ec7ae206309" (UID: "da7b50d8-258c-468b-8a7e-0ec7ae206309"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:39:15 crc kubenswrapper[4743]: I1011 01:39:15.038425 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da7b50d8-258c-468b-8a7e-0ec7ae206309-ceilometer-ipmi-config-data-2" (OuterVolumeSpecName: "ceilometer-ipmi-config-data-2") pod "da7b50d8-258c-468b-8a7e-0ec7ae206309" (UID: "da7b50d8-258c-468b-8a7e-0ec7ae206309"). InnerVolumeSpecName "ceilometer-ipmi-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:39:15 crc kubenswrapper[4743]: I1011 01:39:15.076730 4743 reconciler_common.go:293] "Volume detached for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/da7b50d8-258c-468b-8a7e-0ec7ae206309-ceilometer-ipmi-config-data-2\") on node \"crc\" DevicePath \"\"" Oct 11 01:39:15 crc kubenswrapper[4743]: I1011 01:39:15.076940 4743 reconciler_common.go:293] "Volume detached for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/da7b50d8-258c-468b-8a7e-0ec7ae206309-ceilometer-ipmi-config-data-1\") on node \"crc\" DevicePath \"\"" Oct 11 01:39:15 crc kubenswrapper[4743]: I1011 01:39:15.077071 4743 reconciler_common.go:293] "Volume detached for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da7b50d8-258c-468b-8a7e-0ec7ae206309-telemetry-power-monitoring-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 01:39:15 crc kubenswrapper[4743]: I1011 01:39:15.077405 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/da7b50d8-258c-468b-8a7e-0ec7ae206309-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 11 01:39:15 crc kubenswrapper[4743]: I1011 01:39:15.077589 4743 reconciler_common.go:293] "Volume detached for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/da7b50d8-258c-468b-8a7e-0ec7ae206309-ceilometer-ipmi-config-data-0\") on node \"crc\" DevicePath \"\"" 
Oct 11 01:39:15 crc kubenswrapper[4743]: I1011 01:39:15.077721 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/da7b50d8-258c-468b-8a7e-0ec7ae206309-inventory\") on node \"crc\" DevicePath \"\"" Oct 11 01:39:15 crc kubenswrapper[4743]: I1011 01:39:15.077849 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnxcw\" (UniqueName: \"kubernetes.io/projected/da7b50d8-258c-468b-8a7e-0ec7ae206309-kube-api-access-mnxcw\") on node \"crc\" DevicePath \"\"" Oct 11 01:39:15 crc kubenswrapper[4743]: I1011 01:39:15.386628 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-bpxh7" event={"ID":"da7b50d8-258c-468b-8a7e-0ec7ae206309","Type":"ContainerDied","Data":"d5a0ef89fb149c59d32ec7f9470212ae5371805e688801a34af2dcb360c3547c"} Oct 11 01:39:15 crc kubenswrapper[4743]: I1011 01:39:15.386693 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d5a0ef89fb149c59d32ec7f9470212ae5371805e688801a34af2dcb360c3547c" Oct 11 01:39:15 crc kubenswrapper[4743]: I1011 01:39:15.386794 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-bpxh7" Oct 11 01:39:15 crc kubenswrapper[4743]: I1011 01:39:15.526842 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-h7k2m"] Oct 11 01:39:15 crc kubenswrapper[4743]: E1011 01:39:15.527287 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edfbefd6-a8c7-4e3b-80db-b8c0c1dba35f" containerName="extract-utilities" Oct 11 01:39:15 crc kubenswrapper[4743]: I1011 01:39:15.527303 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="edfbefd6-a8c7-4e3b-80db-b8c0c1dba35f" containerName="extract-utilities" Oct 11 01:39:15 crc kubenswrapper[4743]: E1011 01:39:15.527314 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edfbefd6-a8c7-4e3b-80db-b8c0c1dba35f" containerName="extract-content" Oct 11 01:39:15 crc kubenswrapper[4743]: I1011 01:39:15.527321 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="edfbefd6-a8c7-4e3b-80db-b8c0c1dba35f" containerName="extract-content" Oct 11 01:39:15 crc kubenswrapper[4743]: E1011 01:39:15.527341 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da7b50d8-258c-468b-8a7e-0ec7ae206309" containerName="telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam" Oct 11 01:39:15 crc kubenswrapper[4743]: I1011 01:39:15.527349 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="da7b50d8-258c-468b-8a7e-0ec7ae206309" containerName="telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam" Oct 11 01:39:15 crc kubenswrapper[4743]: E1011 01:39:15.527383 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edfbefd6-a8c7-4e3b-80db-b8c0c1dba35f" containerName="registry-server" Oct 11 01:39:15 crc kubenswrapper[4743]: I1011 01:39:15.527390 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="edfbefd6-a8c7-4e3b-80db-b8c0c1dba35f" containerName="registry-server" Oct 11 01:39:15 crc 
kubenswrapper[4743]: I1011 01:39:15.527612 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="da7b50d8-258c-468b-8a7e-0ec7ae206309" containerName="telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam" Oct 11 01:39:15 crc kubenswrapper[4743]: I1011 01:39:15.527631 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="edfbefd6-a8c7-4e3b-80db-b8c0c1dba35f" containerName="registry-server" Oct 11 01:39:15 crc kubenswrapper[4743]: I1011 01:39:15.528448 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-h7k2m" Oct 11 01:39:15 crc kubenswrapper[4743]: I1011 01:39:15.539144 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 11 01:39:15 crc kubenswrapper[4743]: I1011 01:39:15.540158 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"logging-compute-config-data" Oct 11 01:39:15 crc kubenswrapper[4743]: I1011 01:39:15.541502 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 11 01:39:15 crc kubenswrapper[4743]: I1011 01:39:15.541601 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8vmmn" Oct 11 01:39:15 crc kubenswrapper[4743]: I1011 01:39:15.541676 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 11 01:39:15 crc kubenswrapper[4743]: I1011 01:39:15.545520 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-h7k2m"] Oct 11 01:39:15 crc kubenswrapper[4743]: I1011 01:39:15.692376 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k277b\" (UniqueName: 
\"kubernetes.io/projected/f555b790-f6a4-4c68-8d8d-97fd07743aa8-kube-api-access-k277b\") pod \"logging-edpm-deployment-openstack-edpm-ipam-h7k2m\" (UID: \"f555b790-f6a4-4c68-8d8d-97fd07743aa8\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-h7k2m" Oct 11 01:39:15 crc kubenswrapper[4743]: I1011 01:39:15.692739 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f555b790-f6a4-4c68-8d8d-97fd07743aa8-inventory\") pod \"logging-edpm-deployment-openstack-edpm-ipam-h7k2m\" (UID: \"f555b790-f6a4-4c68-8d8d-97fd07743aa8\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-h7k2m" Oct 11 01:39:15 crc kubenswrapper[4743]: I1011 01:39:15.692880 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f555b790-f6a4-4c68-8d8d-97fd07743aa8-ssh-key\") pod \"logging-edpm-deployment-openstack-edpm-ipam-h7k2m\" (UID: \"f555b790-f6a4-4c68-8d8d-97fd07743aa8\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-h7k2m" Oct 11 01:39:15 crc kubenswrapper[4743]: I1011 01:39:15.692911 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/f555b790-f6a4-4c68-8d8d-97fd07743aa8-logging-compute-config-data-1\") pod \"logging-edpm-deployment-openstack-edpm-ipam-h7k2m\" (UID: \"f555b790-f6a4-4c68-8d8d-97fd07743aa8\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-h7k2m" Oct 11 01:39:15 crc kubenswrapper[4743]: I1011 01:39:15.692941 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/f555b790-f6a4-4c68-8d8d-97fd07743aa8-logging-compute-config-data-0\") pod \"logging-edpm-deployment-openstack-edpm-ipam-h7k2m\" (UID: 
\"f555b790-f6a4-4c68-8d8d-97fd07743aa8\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-h7k2m" Oct 11 01:39:15 crc kubenswrapper[4743]: I1011 01:39:15.794922 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k277b\" (UniqueName: \"kubernetes.io/projected/f555b790-f6a4-4c68-8d8d-97fd07743aa8-kube-api-access-k277b\") pod \"logging-edpm-deployment-openstack-edpm-ipam-h7k2m\" (UID: \"f555b790-f6a4-4c68-8d8d-97fd07743aa8\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-h7k2m" Oct 11 01:39:15 crc kubenswrapper[4743]: I1011 01:39:15.795173 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f555b790-f6a4-4c68-8d8d-97fd07743aa8-inventory\") pod \"logging-edpm-deployment-openstack-edpm-ipam-h7k2m\" (UID: \"f555b790-f6a4-4c68-8d8d-97fd07743aa8\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-h7k2m" Oct 11 01:39:15 crc kubenswrapper[4743]: I1011 01:39:15.797444 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f555b790-f6a4-4c68-8d8d-97fd07743aa8-ssh-key\") pod \"logging-edpm-deployment-openstack-edpm-ipam-h7k2m\" (UID: \"f555b790-f6a4-4c68-8d8d-97fd07743aa8\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-h7k2m" Oct 11 01:39:15 crc kubenswrapper[4743]: I1011 01:39:15.797640 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/f555b790-f6a4-4c68-8d8d-97fd07743aa8-logging-compute-config-data-0\") pod \"logging-edpm-deployment-openstack-edpm-ipam-h7k2m\" (UID: \"f555b790-f6a4-4c68-8d8d-97fd07743aa8\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-h7k2m" Oct 11 01:39:15 crc kubenswrapper[4743]: I1011 01:39:15.801002 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/f555b790-f6a4-4c68-8d8d-97fd07743aa8-logging-compute-config-data-1\") pod \"logging-edpm-deployment-openstack-edpm-ipam-h7k2m\" (UID: \"f555b790-f6a4-4c68-8d8d-97fd07743aa8\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-h7k2m" Oct 11 01:39:15 crc kubenswrapper[4743]: I1011 01:39:15.801089 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f555b790-f6a4-4c68-8d8d-97fd07743aa8-inventory\") pod \"logging-edpm-deployment-openstack-edpm-ipam-h7k2m\" (UID: \"f555b790-f6a4-4c68-8d8d-97fd07743aa8\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-h7k2m" Oct 11 01:39:15 crc kubenswrapper[4743]: I1011 01:39:15.802088 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/f555b790-f6a4-4c68-8d8d-97fd07743aa8-logging-compute-config-data-0\") pod \"logging-edpm-deployment-openstack-edpm-ipam-h7k2m\" (UID: \"f555b790-f6a4-4c68-8d8d-97fd07743aa8\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-h7k2m" Oct 11 01:39:15 crc kubenswrapper[4743]: I1011 01:39:15.804546 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/f555b790-f6a4-4c68-8d8d-97fd07743aa8-logging-compute-config-data-1\") pod \"logging-edpm-deployment-openstack-edpm-ipam-h7k2m\" (UID: \"f555b790-f6a4-4c68-8d8d-97fd07743aa8\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-h7k2m" Oct 11 01:39:15 crc kubenswrapper[4743]: I1011 01:39:15.807615 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f555b790-f6a4-4c68-8d8d-97fd07743aa8-ssh-key\") pod \"logging-edpm-deployment-openstack-edpm-ipam-h7k2m\" (UID: \"f555b790-f6a4-4c68-8d8d-97fd07743aa8\") " 
pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-h7k2m" Oct 11 01:39:15 crc kubenswrapper[4743]: I1011 01:39:15.814034 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k277b\" (UniqueName: \"kubernetes.io/projected/f555b790-f6a4-4c68-8d8d-97fd07743aa8-kube-api-access-k277b\") pod \"logging-edpm-deployment-openstack-edpm-ipam-h7k2m\" (UID: \"f555b790-f6a4-4c68-8d8d-97fd07743aa8\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-h7k2m" Oct 11 01:39:15 crc kubenswrapper[4743]: I1011 01:39:15.894639 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-h7k2m" Oct 11 01:39:16 crc kubenswrapper[4743]: I1011 01:39:16.558968 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-h7k2m"] Oct 11 01:39:17 crc kubenswrapper[4743]: I1011 01:39:17.411093 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-h7k2m" event={"ID":"f555b790-f6a4-4c68-8d8d-97fd07743aa8","Type":"ContainerStarted","Data":"122eba6d93dbca833018bf9d8c603bbac03a07e43c41098372e569602536e126"} Oct 11 01:39:17 crc kubenswrapper[4743]: I1011 01:39:17.411568 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-h7k2m" event={"ID":"f555b790-f6a4-4c68-8d8d-97fd07743aa8","Type":"ContainerStarted","Data":"d83b5b5cc94a9e9d18cf2304ed54385f9812fc5fc4cac1e76655cb6578381750"} Oct 11 01:39:17 crc kubenswrapper[4743]: I1011 01:39:17.436567 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-h7k2m" podStartSLOduration=1.920764796 podStartE2EDuration="2.436546583s" podCreationTimestamp="2025-10-11 01:39:15 +0000 UTC" firstStartedPulling="2025-10-11 01:39:16.572466265 +0000 UTC m=+2851.225446702" lastFinishedPulling="2025-10-11 
01:39:17.088248052 +0000 UTC m=+2851.741228489" observedRunningTime="2025-10-11 01:39:17.429843987 +0000 UTC m=+2852.082824384" watchObservedRunningTime="2025-10-11 01:39:17.436546583 +0000 UTC m=+2852.089526980" Oct 11 01:39:39 crc kubenswrapper[4743]: I1011 01:39:39.628461 4743 generic.go:334] "Generic (PLEG): container finished" podID="f555b790-f6a4-4c68-8d8d-97fd07743aa8" containerID="122eba6d93dbca833018bf9d8c603bbac03a07e43c41098372e569602536e126" exitCode=0 Oct 11 01:39:39 crc kubenswrapper[4743]: I1011 01:39:39.628529 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-h7k2m" event={"ID":"f555b790-f6a4-4c68-8d8d-97fd07743aa8","Type":"ContainerDied","Data":"122eba6d93dbca833018bf9d8c603bbac03a07e43c41098372e569602536e126"} Oct 11 01:39:41 crc kubenswrapper[4743]: I1011 01:39:41.144712 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-h7k2m" Oct 11 01:39:41 crc kubenswrapper[4743]: I1011 01:39:41.286644 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/f555b790-f6a4-4c68-8d8d-97fd07743aa8-logging-compute-config-data-0\") pod \"f555b790-f6a4-4c68-8d8d-97fd07743aa8\" (UID: \"f555b790-f6a4-4c68-8d8d-97fd07743aa8\") " Oct 11 01:39:41 crc kubenswrapper[4743]: I1011 01:39:41.286878 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f555b790-f6a4-4c68-8d8d-97fd07743aa8-inventory\") pod \"f555b790-f6a4-4c68-8d8d-97fd07743aa8\" (UID: \"f555b790-f6a4-4c68-8d8d-97fd07743aa8\") " Oct 11 01:39:41 crc kubenswrapper[4743]: I1011 01:39:41.286924 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f555b790-f6a4-4c68-8d8d-97fd07743aa8-ssh-key\") pod 
\"f555b790-f6a4-4c68-8d8d-97fd07743aa8\" (UID: \"f555b790-f6a4-4c68-8d8d-97fd07743aa8\") " Oct 11 01:39:41 crc kubenswrapper[4743]: I1011 01:39:41.286964 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/f555b790-f6a4-4c68-8d8d-97fd07743aa8-logging-compute-config-data-1\") pod \"f555b790-f6a4-4c68-8d8d-97fd07743aa8\" (UID: \"f555b790-f6a4-4c68-8d8d-97fd07743aa8\") " Oct 11 01:39:41 crc kubenswrapper[4743]: I1011 01:39:41.287044 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k277b\" (UniqueName: \"kubernetes.io/projected/f555b790-f6a4-4c68-8d8d-97fd07743aa8-kube-api-access-k277b\") pod \"f555b790-f6a4-4c68-8d8d-97fd07743aa8\" (UID: \"f555b790-f6a4-4c68-8d8d-97fd07743aa8\") " Oct 11 01:39:41 crc kubenswrapper[4743]: I1011 01:39:41.292326 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f555b790-f6a4-4c68-8d8d-97fd07743aa8-kube-api-access-k277b" (OuterVolumeSpecName: "kube-api-access-k277b") pod "f555b790-f6a4-4c68-8d8d-97fd07743aa8" (UID: "f555b790-f6a4-4c68-8d8d-97fd07743aa8"). InnerVolumeSpecName "kube-api-access-k277b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:39:41 crc kubenswrapper[4743]: I1011 01:39:41.316486 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f555b790-f6a4-4c68-8d8d-97fd07743aa8-logging-compute-config-data-1" (OuterVolumeSpecName: "logging-compute-config-data-1") pod "f555b790-f6a4-4c68-8d8d-97fd07743aa8" (UID: "f555b790-f6a4-4c68-8d8d-97fd07743aa8"). InnerVolumeSpecName "logging-compute-config-data-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:39:41 crc kubenswrapper[4743]: I1011 01:39:41.328596 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f555b790-f6a4-4c68-8d8d-97fd07743aa8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f555b790-f6a4-4c68-8d8d-97fd07743aa8" (UID: "f555b790-f6a4-4c68-8d8d-97fd07743aa8"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:39:41 crc kubenswrapper[4743]: I1011 01:39:41.354181 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f555b790-f6a4-4c68-8d8d-97fd07743aa8-logging-compute-config-data-0" (OuterVolumeSpecName: "logging-compute-config-data-0") pod "f555b790-f6a4-4c68-8d8d-97fd07743aa8" (UID: "f555b790-f6a4-4c68-8d8d-97fd07743aa8"). InnerVolumeSpecName "logging-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:39:41 crc kubenswrapper[4743]: I1011 01:39:41.356335 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f555b790-f6a4-4c68-8d8d-97fd07743aa8-inventory" (OuterVolumeSpecName: "inventory") pod "f555b790-f6a4-4c68-8d8d-97fd07743aa8" (UID: "f555b790-f6a4-4c68-8d8d-97fd07743aa8"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:39:41 crc kubenswrapper[4743]: I1011 01:39:41.389716 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k277b\" (UniqueName: \"kubernetes.io/projected/f555b790-f6a4-4c68-8d8d-97fd07743aa8-kube-api-access-k277b\") on node \"crc\" DevicePath \"\"" Oct 11 01:39:41 crc kubenswrapper[4743]: I1011 01:39:41.389786 4743 reconciler_common.go:293] "Volume detached for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/f555b790-f6a4-4c68-8d8d-97fd07743aa8-logging-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Oct 11 01:39:41 crc kubenswrapper[4743]: I1011 01:39:41.389803 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f555b790-f6a4-4c68-8d8d-97fd07743aa8-inventory\") on node \"crc\" DevicePath \"\"" Oct 11 01:39:41 crc kubenswrapper[4743]: I1011 01:39:41.389816 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f555b790-f6a4-4c68-8d8d-97fd07743aa8-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 11 01:39:41 crc kubenswrapper[4743]: I1011 01:39:41.389833 4743 reconciler_common.go:293] "Volume detached for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/f555b790-f6a4-4c68-8d8d-97fd07743aa8-logging-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Oct 11 01:39:41 crc kubenswrapper[4743]: I1011 01:39:41.664522 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-h7k2m" event={"ID":"f555b790-f6a4-4c68-8d8d-97fd07743aa8","Type":"ContainerDied","Data":"d83b5b5cc94a9e9d18cf2304ed54385f9812fc5fc4cac1e76655cb6578381750"} Oct 11 01:39:41 crc kubenswrapper[4743]: I1011 01:39:41.664835 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d83b5b5cc94a9e9d18cf2304ed54385f9812fc5fc4cac1e76655cb6578381750" Oct 11 01:39:41 
crc kubenswrapper[4743]: I1011 01:39:41.664611 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-h7k2m" Oct 11 01:39:44 crc kubenswrapper[4743]: I1011 01:39:44.458280 4743 patch_prober.go:28] interesting pod/machine-config-daemon-cvm72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 01:39:44 crc kubenswrapper[4743]: I1011 01:39:44.458627 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 01:39:44 crc kubenswrapper[4743]: I1011 01:39:44.458675 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" Oct 11 01:39:44 crc kubenswrapper[4743]: I1011 01:39:44.459410 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"978234a1553219d6d97cc060a945d4762c94dc6227d3827ebf03581b02816723"} pod="openshift-machine-config-operator/machine-config-daemon-cvm72" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 11 01:39:44 crc kubenswrapper[4743]: I1011 01:39:44.459464 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" containerID="cri-o://978234a1553219d6d97cc060a945d4762c94dc6227d3827ebf03581b02816723" gracePeriod=600 Oct 11 01:39:44 crc 
kubenswrapper[4743]: E1011 01:39:44.581004 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:39:44 crc kubenswrapper[4743]: I1011 01:39:44.704427 4743 generic.go:334] "Generic (PLEG): container finished" podID="add92263-e252-446b-95de-092585b4357f" containerID="978234a1553219d6d97cc060a945d4762c94dc6227d3827ebf03581b02816723" exitCode=0 Oct 11 01:39:44 crc kubenswrapper[4743]: I1011 01:39:44.704475 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" event={"ID":"add92263-e252-446b-95de-092585b4357f","Type":"ContainerDied","Data":"978234a1553219d6d97cc060a945d4762c94dc6227d3827ebf03581b02816723"} Oct 11 01:39:44 crc kubenswrapper[4743]: I1011 01:39:44.704518 4743 scope.go:117] "RemoveContainer" containerID="ed4b65bd7e78e9bd75562b390e48eafdf4eff27b1128523c7bfdb6570c4e9641" Oct 11 01:39:44 crc kubenswrapper[4743]: I1011 01:39:44.707081 4743 scope.go:117] "RemoveContainer" containerID="978234a1553219d6d97cc060a945d4762c94dc6227d3827ebf03581b02816723" Oct 11 01:39:44 crc kubenswrapper[4743]: E1011 01:39:44.707885 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:39:58 crc kubenswrapper[4743]: I1011 01:39:58.091711 4743 scope.go:117] 
"RemoveContainer" containerID="978234a1553219d6d97cc060a945d4762c94dc6227d3827ebf03581b02816723" Oct 11 01:39:58 crc kubenswrapper[4743]: E1011 01:39:58.092745 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:40:13 crc kubenswrapper[4743]: I1011 01:40:13.093292 4743 scope.go:117] "RemoveContainer" containerID="978234a1553219d6d97cc060a945d4762c94dc6227d3827ebf03581b02816723" Oct 11 01:40:13 crc kubenswrapper[4743]: E1011 01:40:13.094643 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:40:25 crc kubenswrapper[4743]: I1011 01:40:25.093001 4743 scope.go:117] "RemoveContainer" containerID="978234a1553219d6d97cc060a945d4762c94dc6227d3827ebf03581b02816723" Oct 11 01:40:25 crc kubenswrapper[4743]: E1011 01:40:25.094701 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:40:40 crc kubenswrapper[4743]: I1011 01:40:40.091762 
4743 scope.go:117] "RemoveContainer" containerID="978234a1553219d6d97cc060a945d4762c94dc6227d3827ebf03581b02816723" Oct 11 01:40:40 crc kubenswrapper[4743]: E1011 01:40:40.092567 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:40:54 crc kubenswrapper[4743]: I1011 01:40:54.092188 4743 scope.go:117] "RemoveContainer" containerID="978234a1553219d6d97cc060a945d4762c94dc6227d3827ebf03581b02816723" Oct 11 01:40:54 crc kubenswrapper[4743]: E1011 01:40:54.094295 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:41:09 crc kubenswrapper[4743]: I1011 01:41:09.091972 4743 scope.go:117] "RemoveContainer" containerID="978234a1553219d6d97cc060a945d4762c94dc6227d3827ebf03581b02816723" Oct 11 01:41:09 crc kubenswrapper[4743]: E1011 01:41:09.092906 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:41:23 crc kubenswrapper[4743]: I1011 
01:41:23.092726 4743 scope.go:117] "RemoveContainer" containerID="978234a1553219d6d97cc060a945d4762c94dc6227d3827ebf03581b02816723" Oct 11 01:41:23 crc kubenswrapper[4743]: E1011 01:41:23.093621 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:41:37 crc kubenswrapper[4743]: I1011 01:41:37.091680 4743 scope.go:117] "RemoveContainer" containerID="978234a1553219d6d97cc060a945d4762c94dc6227d3827ebf03581b02816723" Oct 11 01:41:37 crc kubenswrapper[4743]: E1011 01:41:37.092618 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:41:49 crc kubenswrapper[4743]: I1011 01:41:49.092607 4743 scope.go:117] "RemoveContainer" containerID="978234a1553219d6d97cc060a945d4762c94dc6227d3827ebf03581b02816723" Oct 11 01:41:49 crc kubenswrapper[4743]: E1011 01:41:49.093445 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:42:03 crc 
kubenswrapper[4743]: I1011 01:42:03.091807 4743 scope.go:117] "RemoveContainer" containerID="978234a1553219d6d97cc060a945d4762c94dc6227d3827ebf03581b02816723" Oct 11 01:42:03 crc kubenswrapper[4743]: E1011 01:42:03.092740 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:42:16 crc kubenswrapper[4743]: I1011 01:42:16.101124 4743 scope.go:117] "RemoveContainer" containerID="978234a1553219d6d97cc060a945d4762c94dc6227d3827ebf03581b02816723" Oct 11 01:42:16 crc kubenswrapper[4743]: E1011 01:42:16.102328 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:42:28 crc kubenswrapper[4743]: I1011 01:42:28.092838 4743 scope.go:117] "RemoveContainer" containerID="978234a1553219d6d97cc060a945d4762c94dc6227d3827ebf03581b02816723" Oct 11 01:42:28 crc kubenswrapper[4743]: E1011 01:42:28.093706 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 
11 01:42:40 crc kubenswrapper[4743]: I1011 01:42:40.091683 4743 scope.go:117] "RemoveContainer" containerID="978234a1553219d6d97cc060a945d4762c94dc6227d3827ebf03581b02816723" Oct 11 01:42:40 crc kubenswrapper[4743]: E1011 01:42:40.092861 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:42:54 crc kubenswrapper[4743]: I1011 01:42:54.091903 4743 scope.go:117] "RemoveContainer" containerID="978234a1553219d6d97cc060a945d4762c94dc6227d3827ebf03581b02816723" Oct 11 01:42:54 crc kubenswrapper[4743]: E1011 01:42:54.092959 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:43:01 crc kubenswrapper[4743]: I1011 01:43:01.103233 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bvw5s"] Oct 11 01:43:01 crc kubenswrapper[4743]: E1011 01:43:01.105073 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f555b790-f6a4-4c68-8d8d-97fd07743aa8" containerName="logging-edpm-deployment-openstack-edpm-ipam" Oct 11 01:43:01 crc kubenswrapper[4743]: I1011 01:43:01.105109 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f555b790-f6a4-4c68-8d8d-97fd07743aa8" containerName="logging-edpm-deployment-openstack-edpm-ipam" Oct 11 01:43:01 crc 
kubenswrapper[4743]: I1011 01:43:01.105611 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="f555b790-f6a4-4c68-8d8d-97fd07743aa8" containerName="logging-edpm-deployment-openstack-edpm-ipam" Oct 11 01:43:01 crc kubenswrapper[4743]: I1011 01:43:01.109293 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bvw5s" Oct 11 01:43:01 crc kubenswrapper[4743]: I1011 01:43:01.127823 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bvw5s"] Oct 11 01:43:01 crc kubenswrapper[4743]: I1011 01:43:01.275383 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a15b097-b3b3-4465-8063-6a4b5a7bdaa0-catalog-content\") pod \"certified-operators-bvw5s\" (UID: \"6a15b097-b3b3-4465-8063-6a4b5a7bdaa0\") " pod="openshift-marketplace/certified-operators-bvw5s" Oct 11 01:43:01 crc kubenswrapper[4743]: I1011 01:43:01.275515 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a15b097-b3b3-4465-8063-6a4b5a7bdaa0-utilities\") pod \"certified-operators-bvw5s\" (UID: \"6a15b097-b3b3-4465-8063-6a4b5a7bdaa0\") " pod="openshift-marketplace/certified-operators-bvw5s" Oct 11 01:43:01 crc kubenswrapper[4743]: I1011 01:43:01.275552 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbnlm\" (UniqueName: \"kubernetes.io/projected/6a15b097-b3b3-4465-8063-6a4b5a7bdaa0-kube-api-access-zbnlm\") pod \"certified-operators-bvw5s\" (UID: \"6a15b097-b3b3-4465-8063-6a4b5a7bdaa0\") " pod="openshift-marketplace/certified-operators-bvw5s" Oct 11 01:43:01 crc kubenswrapper[4743]: I1011 01:43:01.377713 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/6a15b097-b3b3-4465-8063-6a4b5a7bdaa0-utilities\") pod \"certified-operators-bvw5s\" (UID: \"6a15b097-b3b3-4465-8063-6a4b5a7bdaa0\") " pod="openshift-marketplace/certified-operators-bvw5s" Oct 11 01:43:01 crc kubenswrapper[4743]: I1011 01:43:01.377825 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbnlm\" (UniqueName: \"kubernetes.io/projected/6a15b097-b3b3-4465-8063-6a4b5a7bdaa0-kube-api-access-zbnlm\") pod \"certified-operators-bvw5s\" (UID: \"6a15b097-b3b3-4465-8063-6a4b5a7bdaa0\") " pod="openshift-marketplace/certified-operators-bvw5s" Oct 11 01:43:01 crc kubenswrapper[4743]: I1011 01:43:01.378023 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a15b097-b3b3-4465-8063-6a4b5a7bdaa0-catalog-content\") pod \"certified-operators-bvw5s\" (UID: \"6a15b097-b3b3-4465-8063-6a4b5a7bdaa0\") " pod="openshift-marketplace/certified-operators-bvw5s" Oct 11 01:43:01 crc kubenswrapper[4743]: I1011 01:43:01.378906 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a15b097-b3b3-4465-8063-6a4b5a7bdaa0-catalog-content\") pod \"certified-operators-bvw5s\" (UID: \"6a15b097-b3b3-4465-8063-6a4b5a7bdaa0\") " pod="openshift-marketplace/certified-operators-bvw5s" Oct 11 01:43:01 crc kubenswrapper[4743]: I1011 01:43:01.379363 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a15b097-b3b3-4465-8063-6a4b5a7bdaa0-utilities\") pod \"certified-operators-bvw5s\" (UID: \"6a15b097-b3b3-4465-8063-6a4b5a7bdaa0\") " pod="openshift-marketplace/certified-operators-bvw5s" Oct 11 01:43:01 crc kubenswrapper[4743]: I1011 01:43:01.415054 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbnlm\" (UniqueName: 
\"kubernetes.io/projected/6a15b097-b3b3-4465-8063-6a4b5a7bdaa0-kube-api-access-zbnlm\") pod \"certified-operators-bvw5s\" (UID: \"6a15b097-b3b3-4465-8063-6a4b5a7bdaa0\") " pod="openshift-marketplace/certified-operators-bvw5s" Oct 11 01:43:01 crc kubenswrapper[4743]: I1011 01:43:01.448753 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bvw5s" Oct 11 01:43:01 crc kubenswrapper[4743]: I1011 01:43:01.963069 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bvw5s"] Oct 11 01:43:02 crc kubenswrapper[4743]: I1011 01:43:02.297640 4743 generic.go:334] "Generic (PLEG): container finished" podID="6a15b097-b3b3-4465-8063-6a4b5a7bdaa0" containerID="3f556f27a37cf12ff379bb2704d1b3a3c6081d8c8642a6e55ccdb437dd99e9b8" exitCode=0 Oct 11 01:43:02 crc kubenswrapper[4743]: I1011 01:43:02.297699 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bvw5s" event={"ID":"6a15b097-b3b3-4465-8063-6a4b5a7bdaa0","Type":"ContainerDied","Data":"3f556f27a37cf12ff379bb2704d1b3a3c6081d8c8642a6e55ccdb437dd99e9b8"} Oct 11 01:43:02 crc kubenswrapper[4743]: I1011 01:43:02.297985 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bvw5s" event={"ID":"6a15b097-b3b3-4465-8063-6a4b5a7bdaa0","Type":"ContainerStarted","Data":"dd3afec5c599bdb7d19bf1ec663700ed80336f78422c19f1fa35e0abacefda25"} Oct 11 01:43:03 crc kubenswrapper[4743]: I1011 01:43:03.310551 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bvw5s" event={"ID":"6a15b097-b3b3-4465-8063-6a4b5a7bdaa0","Type":"ContainerStarted","Data":"2c891db7ec5aec6335b81e64a85ef5b488f652158311b21cf5d8d086f7a48b78"} Oct 11 01:43:04 crc kubenswrapper[4743]: I1011 01:43:04.325341 4743 generic.go:334] "Generic (PLEG): container finished" podID="6a15b097-b3b3-4465-8063-6a4b5a7bdaa0" 
containerID="2c891db7ec5aec6335b81e64a85ef5b488f652158311b21cf5d8d086f7a48b78" exitCode=0 Oct 11 01:43:04 crc kubenswrapper[4743]: I1011 01:43:04.325443 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bvw5s" event={"ID":"6a15b097-b3b3-4465-8063-6a4b5a7bdaa0","Type":"ContainerDied","Data":"2c891db7ec5aec6335b81e64a85ef5b488f652158311b21cf5d8d086f7a48b78"} Oct 11 01:43:05 crc kubenswrapper[4743]: I1011 01:43:05.088822 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9qxm5"] Oct 11 01:43:05 crc kubenswrapper[4743]: I1011 01:43:05.091766 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9qxm5" Oct 11 01:43:05 crc kubenswrapper[4743]: I1011 01:43:05.099101 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9qxm5"] Oct 11 01:43:05 crc kubenswrapper[4743]: I1011 01:43:05.271110 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/379d9fac-db55-487d-a67d-663479bdf122-catalog-content\") pod \"redhat-marketplace-9qxm5\" (UID: \"379d9fac-db55-487d-a67d-663479bdf122\") " pod="openshift-marketplace/redhat-marketplace-9qxm5" Oct 11 01:43:05 crc kubenswrapper[4743]: I1011 01:43:05.271280 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/379d9fac-db55-487d-a67d-663479bdf122-utilities\") pod \"redhat-marketplace-9qxm5\" (UID: \"379d9fac-db55-487d-a67d-663479bdf122\") " pod="openshift-marketplace/redhat-marketplace-9qxm5" Oct 11 01:43:05 crc kubenswrapper[4743]: I1011 01:43:05.271326 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bh5nv\" (UniqueName: 
\"kubernetes.io/projected/379d9fac-db55-487d-a67d-663479bdf122-kube-api-access-bh5nv\") pod \"redhat-marketplace-9qxm5\" (UID: \"379d9fac-db55-487d-a67d-663479bdf122\") " pod="openshift-marketplace/redhat-marketplace-9qxm5" Oct 11 01:43:05 crc kubenswrapper[4743]: I1011 01:43:05.336499 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bvw5s" event={"ID":"6a15b097-b3b3-4465-8063-6a4b5a7bdaa0","Type":"ContainerStarted","Data":"8e4c5c9a01797c3fe944ae48aedf5b065c0521bdbcf329829d47fe956c824499"} Oct 11 01:43:05 crc kubenswrapper[4743]: I1011 01:43:05.362504 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bvw5s" podStartSLOduration=1.8373837929999999 podStartE2EDuration="4.362485934s" podCreationTimestamp="2025-10-11 01:43:01 +0000 UTC" firstStartedPulling="2025-10-11 01:43:02.299652269 +0000 UTC m=+3076.952632666" lastFinishedPulling="2025-10-11 01:43:04.82475442 +0000 UTC m=+3079.477734807" observedRunningTime="2025-10-11 01:43:05.357979071 +0000 UTC m=+3080.010959468" watchObservedRunningTime="2025-10-11 01:43:05.362485934 +0000 UTC m=+3080.015466341" Oct 11 01:43:05 crc kubenswrapper[4743]: I1011 01:43:05.373966 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/379d9fac-db55-487d-a67d-663479bdf122-utilities\") pod \"redhat-marketplace-9qxm5\" (UID: \"379d9fac-db55-487d-a67d-663479bdf122\") " pod="openshift-marketplace/redhat-marketplace-9qxm5" Oct 11 01:43:05 crc kubenswrapper[4743]: I1011 01:43:05.374035 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bh5nv\" (UniqueName: \"kubernetes.io/projected/379d9fac-db55-487d-a67d-663479bdf122-kube-api-access-bh5nv\") pod \"redhat-marketplace-9qxm5\" (UID: \"379d9fac-db55-487d-a67d-663479bdf122\") " pod="openshift-marketplace/redhat-marketplace-9qxm5" Oct 11 01:43:05 crc 
kubenswrapper[4743]: I1011 01:43:05.374231 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/379d9fac-db55-487d-a67d-663479bdf122-catalog-content\") pod \"redhat-marketplace-9qxm5\" (UID: \"379d9fac-db55-487d-a67d-663479bdf122\") " pod="openshift-marketplace/redhat-marketplace-9qxm5" Oct 11 01:43:05 crc kubenswrapper[4743]: I1011 01:43:05.374499 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/379d9fac-db55-487d-a67d-663479bdf122-utilities\") pod \"redhat-marketplace-9qxm5\" (UID: \"379d9fac-db55-487d-a67d-663479bdf122\") " pod="openshift-marketplace/redhat-marketplace-9qxm5" Oct 11 01:43:05 crc kubenswrapper[4743]: I1011 01:43:05.374685 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/379d9fac-db55-487d-a67d-663479bdf122-catalog-content\") pod \"redhat-marketplace-9qxm5\" (UID: \"379d9fac-db55-487d-a67d-663479bdf122\") " pod="openshift-marketplace/redhat-marketplace-9qxm5" Oct 11 01:43:05 crc kubenswrapper[4743]: I1011 01:43:05.400788 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bh5nv\" (UniqueName: \"kubernetes.io/projected/379d9fac-db55-487d-a67d-663479bdf122-kube-api-access-bh5nv\") pod \"redhat-marketplace-9qxm5\" (UID: \"379d9fac-db55-487d-a67d-663479bdf122\") " pod="openshift-marketplace/redhat-marketplace-9qxm5" Oct 11 01:43:05 crc kubenswrapper[4743]: I1011 01:43:05.408833 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9qxm5" Oct 11 01:43:05 crc kubenswrapper[4743]: I1011 01:43:05.935526 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9qxm5"] Oct 11 01:43:06 crc kubenswrapper[4743]: I1011 01:43:06.351446 4743 generic.go:334] "Generic (PLEG): container finished" podID="379d9fac-db55-487d-a67d-663479bdf122" containerID="7a88be5969988ba5bf08bd31da7502176d57de8c11036f966acc026358c1be7f" exitCode=0 Oct 11 01:43:06 crc kubenswrapper[4743]: I1011 01:43:06.351504 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9qxm5" event={"ID":"379d9fac-db55-487d-a67d-663479bdf122","Type":"ContainerDied","Data":"7a88be5969988ba5bf08bd31da7502176d57de8c11036f966acc026358c1be7f"} Oct 11 01:43:06 crc kubenswrapper[4743]: I1011 01:43:06.352672 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9qxm5" event={"ID":"379d9fac-db55-487d-a67d-663479bdf122","Type":"ContainerStarted","Data":"cd811f53055dd494501ca0717079a19af395642ae927781f19cb1c9044db6367"} Oct 11 01:43:07 crc kubenswrapper[4743]: I1011 01:43:07.364179 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9qxm5" event={"ID":"379d9fac-db55-487d-a67d-663479bdf122","Type":"ContainerStarted","Data":"fc29434db0b9fcfa8d8ef14ec5663cea36f1c9f78e49731b9089c9f3ec71178a"} Oct 11 01:43:08 crc kubenswrapper[4743]: I1011 01:43:08.093086 4743 scope.go:117] "RemoveContainer" containerID="978234a1553219d6d97cc060a945d4762c94dc6227d3827ebf03581b02816723" Oct 11 01:43:08 crc kubenswrapper[4743]: E1011 01:43:08.093913 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:43:08 crc kubenswrapper[4743]: I1011 01:43:08.380184 4743 generic.go:334] "Generic (PLEG): container finished" podID="379d9fac-db55-487d-a67d-663479bdf122" containerID="fc29434db0b9fcfa8d8ef14ec5663cea36f1c9f78e49731b9089c9f3ec71178a" exitCode=0 Oct 11 01:43:08 crc kubenswrapper[4743]: I1011 01:43:08.380249 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9qxm5" event={"ID":"379d9fac-db55-487d-a67d-663479bdf122","Type":"ContainerDied","Data":"fc29434db0b9fcfa8d8ef14ec5663cea36f1c9f78e49731b9089c9f3ec71178a"} Oct 11 01:43:09 crc kubenswrapper[4743]: I1011 01:43:09.392141 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9qxm5" event={"ID":"379d9fac-db55-487d-a67d-663479bdf122","Type":"ContainerStarted","Data":"ffbc7902ec923123b3e72e315eba4caeaa2cafec33fc6b4ada2169756d04bfdb"} Oct 11 01:43:09 crc kubenswrapper[4743]: I1011 01:43:09.426930 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9qxm5" podStartSLOduration=1.807696696 podStartE2EDuration="4.42690158s" podCreationTimestamp="2025-10-11 01:43:05 +0000 UTC" firstStartedPulling="2025-10-11 01:43:06.353427889 +0000 UTC m=+3081.006408296" lastFinishedPulling="2025-10-11 01:43:08.972632743 +0000 UTC m=+3083.625613180" observedRunningTime="2025-10-11 01:43:09.4177109 +0000 UTC m=+3084.070691307" watchObservedRunningTime="2025-10-11 01:43:09.42690158 +0000 UTC m=+3084.079882017" Oct 11 01:43:11 crc kubenswrapper[4743]: I1011 01:43:11.449750 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bvw5s" Oct 11 01:43:11 crc kubenswrapper[4743]: I1011 01:43:11.450986 
4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bvw5s" Oct 11 01:43:11 crc kubenswrapper[4743]: I1011 01:43:11.521685 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bvw5s" Oct 11 01:43:12 crc kubenswrapper[4743]: I1011 01:43:12.520302 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bvw5s" Oct 11 01:43:15 crc kubenswrapper[4743]: I1011 01:43:15.409502 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9qxm5" Oct 11 01:43:15 crc kubenswrapper[4743]: I1011 01:43:15.411417 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9qxm5" Oct 11 01:43:15 crc kubenswrapper[4743]: I1011 01:43:15.507568 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9qxm5" Oct 11 01:43:15 crc kubenswrapper[4743]: I1011 01:43:15.585124 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9qxm5" Oct 11 01:43:15 crc kubenswrapper[4743]: I1011 01:43:15.891910 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bvw5s"] Oct 11 01:43:15 crc kubenswrapper[4743]: I1011 01:43:15.892833 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bvw5s" podUID="6a15b097-b3b3-4465-8063-6a4b5a7bdaa0" containerName="registry-server" containerID="cri-o://8e4c5c9a01797c3fe944ae48aedf5b065c0521bdbcf329829d47fe956c824499" gracePeriod=2 Oct 11 01:43:16 crc kubenswrapper[4743]: I1011 01:43:16.492095 4743 generic.go:334] "Generic (PLEG): container finished" podID="6a15b097-b3b3-4465-8063-6a4b5a7bdaa0" 
containerID="8e4c5c9a01797c3fe944ae48aedf5b065c0521bdbcf329829d47fe956c824499" exitCode=0 Oct 11 01:43:16 crc kubenswrapper[4743]: I1011 01:43:16.492179 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bvw5s" event={"ID":"6a15b097-b3b3-4465-8063-6a4b5a7bdaa0","Type":"ContainerDied","Data":"8e4c5c9a01797c3fe944ae48aedf5b065c0521bdbcf329829d47fe956c824499"} Oct 11 01:43:16 crc kubenswrapper[4743]: I1011 01:43:16.492439 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bvw5s" event={"ID":"6a15b097-b3b3-4465-8063-6a4b5a7bdaa0","Type":"ContainerDied","Data":"dd3afec5c599bdb7d19bf1ec663700ed80336f78422c19f1fa35e0abacefda25"} Oct 11 01:43:16 crc kubenswrapper[4743]: I1011 01:43:16.492479 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd3afec5c599bdb7d19bf1ec663700ed80336f78422c19f1fa35e0abacefda25" Oct 11 01:43:16 crc kubenswrapper[4743]: I1011 01:43:16.502213 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bvw5s" Oct 11 01:43:16 crc kubenswrapper[4743]: I1011 01:43:16.672093 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a15b097-b3b3-4465-8063-6a4b5a7bdaa0-catalog-content\") pod \"6a15b097-b3b3-4465-8063-6a4b5a7bdaa0\" (UID: \"6a15b097-b3b3-4465-8063-6a4b5a7bdaa0\") " Oct 11 01:43:16 crc kubenswrapper[4743]: I1011 01:43:16.672191 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a15b097-b3b3-4465-8063-6a4b5a7bdaa0-utilities\") pod \"6a15b097-b3b3-4465-8063-6a4b5a7bdaa0\" (UID: \"6a15b097-b3b3-4465-8063-6a4b5a7bdaa0\") " Oct 11 01:43:16 crc kubenswrapper[4743]: I1011 01:43:16.672377 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbnlm\" (UniqueName: \"kubernetes.io/projected/6a15b097-b3b3-4465-8063-6a4b5a7bdaa0-kube-api-access-zbnlm\") pod \"6a15b097-b3b3-4465-8063-6a4b5a7bdaa0\" (UID: \"6a15b097-b3b3-4465-8063-6a4b5a7bdaa0\") " Oct 11 01:43:16 crc kubenswrapper[4743]: I1011 01:43:16.673347 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a15b097-b3b3-4465-8063-6a4b5a7bdaa0-utilities" (OuterVolumeSpecName: "utilities") pod "6a15b097-b3b3-4465-8063-6a4b5a7bdaa0" (UID: "6a15b097-b3b3-4465-8063-6a4b5a7bdaa0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:43:16 crc kubenswrapper[4743]: I1011 01:43:16.681151 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a15b097-b3b3-4465-8063-6a4b5a7bdaa0-kube-api-access-zbnlm" (OuterVolumeSpecName: "kube-api-access-zbnlm") pod "6a15b097-b3b3-4465-8063-6a4b5a7bdaa0" (UID: "6a15b097-b3b3-4465-8063-6a4b5a7bdaa0"). InnerVolumeSpecName "kube-api-access-zbnlm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:43:16 crc kubenswrapper[4743]: I1011 01:43:16.722888 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a15b097-b3b3-4465-8063-6a4b5a7bdaa0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6a15b097-b3b3-4465-8063-6a4b5a7bdaa0" (UID: "6a15b097-b3b3-4465-8063-6a4b5a7bdaa0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:43:16 crc kubenswrapper[4743]: I1011 01:43:16.775498 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a15b097-b3b3-4465-8063-6a4b5a7bdaa0-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 11 01:43:16 crc kubenswrapper[4743]: I1011 01:43:16.775582 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a15b097-b3b3-4465-8063-6a4b5a7bdaa0-utilities\") on node \"crc\" DevicePath \"\"" Oct 11 01:43:16 crc kubenswrapper[4743]: I1011 01:43:16.775600 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zbnlm\" (UniqueName: \"kubernetes.io/projected/6a15b097-b3b3-4465-8063-6a4b5a7bdaa0-kube-api-access-zbnlm\") on node \"crc\" DevicePath \"\"" Oct 11 01:43:17 crc kubenswrapper[4743]: I1011 01:43:17.507273 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bvw5s" Oct 11 01:43:17 crc kubenswrapper[4743]: I1011 01:43:17.564893 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bvw5s"] Oct 11 01:43:17 crc kubenswrapper[4743]: I1011 01:43:17.579234 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bvw5s"] Oct 11 01:43:18 crc kubenswrapper[4743]: I1011 01:43:18.103526 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a15b097-b3b3-4465-8063-6a4b5a7bdaa0" path="/var/lib/kubelet/pods/6a15b097-b3b3-4465-8063-6a4b5a7bdaa0/volumes" Oct 11 01:43:19 crc kubenswrapper[4743]: I1011 01:43:19.289741 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9qxm5"] Oct 11 01:43:19 crc kubenswrapper[4743]: I1011 01:43:19.290485 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9qxm5" podUID="379d9fac-db55-487d-a67d-663479bdf122" containerName="registry-server" containerID="cri-o://ffbc7902ec923123b3e72e315eba4caeaa2cafec33fc6b4ada2169756d04bfdb" gracePeriod=2 Oct 11 01:43:19 crc kubenswrapper[4743]: I1011 01:43:19.537634 4743 generic.go:334] "Generic (PLEG): container finished" podID="379d9fac-db55-487d-a67d-663479bdf122" containerID="ffbc7902ec923123b3e72e315eba4caeaa2cafec33fc6b4ada2169756d04bfdb" exitCode=0 Oct 11 01:43:19 crc kubenswrapper[4743]: I1011 01:43:19.537704 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9qxm5" event={"ID":"379d9fac-db55-487d-a67d-663479bdf122","Type":"ContainerDied","Data":"ffbc7902ec923123b3e72e315eba4caeaa2cafec33fc6b4ada2169756d04bfdb"} Oct 11 01:43:19 crc kubenswrapper[4743]: I1011 01:43:19.785380 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9qxm5" Oct 11 01:43:19 crc kubenswrapper[4743]: I1011 01:43:19.841552 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/379d9fac-db55-487d-a67d-663479bdf122-catalog-content\") pod \"379d9fac-db55-487d-a67d-663479bdf122\" (UID: \"379d9fac-db55-487d-a67d-663479bdf122\") " Oct 11 01:43:19 crc kubenswrapper[4743]: I1011 01:43:19.853872 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/379d9fac-db55-487d-a67d-663479bdf122-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "379d9fac-db55-487d-a67d-663479bdf122" (UID: "379d9fac-db55-487d-a67d-663479bdf122"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:43:19 crc kubenswrapper[4743]: I1011 01:43:19.944167 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/379d9fac-db55-487d-a67d-663479bdf122-utilities\") pod \"379d9fac-db55-487d-a67d-663479bdf122\" (UID: \"379d9fac-db55-487d-a67d-663479bdf122\") " Oct 11 01:43:19 crc kubenswrapper[4743]: I1011 01:43:19.944347 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bh5nv\" (UniqueName: \"kubernetes.io/projected/379d9fac-db55-487d-a67d-663479bdf122-kube-api-access-bh5nv\") pod \"379d9fac-db55-487d-a67d-663479bdf122\" (UID: \"379d9fac-db55-487d-a67d-663479bdf122\") " Oct 11 01:43:19 crc kubenswrapper[4743]: I1011 01:43:19.944944 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/379d9fac-db55-487d-a67d-663479bdf122-utilities" (OuterVolumeSpecName: "utilities") pod "379d9fac-db55-487d-a67d-663479bdf122" (UID: "379d9fac-db55-487d-a67d-663479bdf122"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:43:19 crc kubenswrapper[4743]: I1011 01:43:19.945104 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/379d9fac-db55-487d-a67d-663479bdf122-utilities\") on node \"crc\" DevicePath \"\"" Oct 11 01:43:19 crc kubenswrapper[4743]: I1011 01:43:19.945121 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/379d9fac-db55-487d-a67d-663479bdf122-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 11 01:43:19 crc kubenswrapper[4743]: I1011 01:43:19.950850 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/379d9fac-db55-487d-a67d-663479bdf122-kube-api-access-bh5nv" (OuterVolumeSpecName: "kube-api-access-bh5nv") pod "379d9fac-db55-487d-a67d-663479bdf122" (UID: "379d9fac-db55-487d-a67d-663479bdf122"). InnerVolumeSpecName "kube-api-access-bh5nv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:43:20 crc kubenswrapper[4743]: I1011 01:43:20.045962 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bh5nv\" (UniqueName: \"kubernetes.io/projected/379d9fac-db55-487d-a67d-663479bdf122-kube-api-access-bh5nv\") on node \"crc\" DevicePath \"\"" Oct 11 01:43:20 crc kubenswrapper[4743]: I1011 01:43:20.554389 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9qxm5" event={"ID":"379d9fac-db55-487d-a67d-663479bdf122","Type":"ContainerDied","Data":"cd811f53055dd494501ca0717079a19af395642ae927781f19cb1c9044db6367"} Oct 11 01:43:20 crc kubenswrapper[4743]: I1011 01:43:20.554465 4743 scope.go:117] "RemoveContainer" containerID="ffbc7902ec923123b3e72e315eba4caeaa2cafec33fc6b4ada2169756d04bfdb" Oct 11 01:43:20 crc kubenswrapper[4743]: I1011 01:43:20.554556 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9qxm5" Oct 11 01:43:20 crc kubenswrapper[4743]: I1011 01:43:20.583132 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9qxm5"] Oct 11 01:43:20 crc kubenswrapper[4743]: I1011 01:43:20.592517 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9qxm5"] Oct 11 01:43:20 crc kubenswrapper[4743]: I1011 01:43:20.603564 4743 scope.go:117] "RemoveContainer" containerID="fc29434db0b9fcfa8d8ef14ec5663cea36f1c9f78e49731b9089c9f3ec71178a" Oct 11 01:43:20 crc kubenswrapper[4743]: I1011 01:43:20.626711 4743 scope.go:117] "RemoveContainer" containerID="7a88be5969988ba5bf08bd31da7502176d57de8c11036f966acc026358c1be7f" Oct 11 01:43:22 crc kubenswrapper[4743]: I1011 01:43:22.092586 4743 scope.go:117] "RemoveContainer" containerID="978234a1553219d6d97cc060a945d4762c94dc6227d3827ebf03581b02816723" Oct 11 01:43:22 crc kubenswrapper[4743]: E1011 01:43:22.093101 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:43:22 crc kubenswrapper[4743]: I1011 01:43:22.109720 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="379d9fac-db55-487d-a67d-663479bdf122" path="/var/lib/kubelet/pods/379d9fac-db55-487d-a67d-663479bdf122/volumes" Oct 11 01:43:34 crc kubenswrapper[4743]: I1011 01:43:34.093252 4743 scope.go:117] "RemoveContainer" containerID="978234a1553219d6d97cc060a945d4762c94dc6227d3827ebf03581b02816723" Oct 11 01:43:34 crc kubenswrapper[4743]: E1011 01:43:34.096393 4743 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:43:46 crc kubenswrapper[4743]: I1011 01:43:46.108959 4743 scope.go:117] "RemoveContainer" containerID="978234a1553219d6d97cc060a945d4762c94dc6227d3827ebf03581b02816723" Oct 11 01:43:46 crc kubenswrapper[4743]: E1011 01:43:46.112175 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:43:57 crc kubenswrapper[4743]: I1011 01:43:57.091739 4743 scope.go:117] "RemoveContainer" containerID="978234a1553219d6d97cc060a945d4762c94dc6227d3827ebf03581b02816723" Oct 11 01:43:57 crc kubenswrapper[4743]: E1011 01:43:57.092498 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:44:11 crc kubenswrapper[4743]: I1011 01:44:11.092687 4743 scope.go:117] "RemoveContainer" containerID="978234a1553219d6d97cc060a945d4762c94dc6227d3827ebf03581b02816723" Oct 11 01:44:11 crc kubenswrapper[4743]: E1011 01:44:11.094054 4743 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:44:23 crc kubenswrapper[4743]: I1011 01:44:23.092679 4743 scope.go:117] "RemoveContainer" containerID="978234a1553219d6d97cc060a945d4762c94dc6227d3827ebf03581b02816723" Oct 11 01:44:23 crc kubenswrapper[4743]: E1011 01:44:23.093460 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:44:35 crc kubenswrapper[4743]: I1011 01:44:35.091624 4743 scope.go:117] "RemoveContainer" containerID="978234a1553219d6d97cc060a945d4762c94dc6227d3827ebf03581b02816723" Oct 11 01:44:35 crc kubenswrapper[4743]: E1011 01:44:35.092412 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:44:49 crc kubenswrapper[4743]: I1011 01:44:49.091755 4743 scope.go:117] "RemoveContainer" containerID="978234a1553219d6d97cc060a945d4762c94dc6227d3827ebf03581b02816723" Oct 11 01:44:49 crc kubenswrapper[4743]: I1011 01:44:49.653974 4743 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" event={"ID":"add92263-e252-446b-95de-092585b4357f","Type":"ContainerStarted","Data":"40fabd8824e0053fbc030d07f0d398ca43bf5381e369efae8cdd092e89d91e84"} Oct 11 01:44:53 crc kubenswrapper[4743]: I1011 01:44:53.145293 4743 scope.go:117] "RemoveContainer" containerID="9eb66e455a518eeca6a79024dfc9cc191e3d6e0201632ad3357e2bc5f9c4ad2d" Oct 11 01:44:53 crc kubenswrapper[4743]: I1011 01:44:53.180987 4743 scope.go:117] "RemoveContainer" containerID="344dabdf65632a8fa0e88f8aeac59c3554d94fd545197f2be46904061a59b2f2" Oct 11 01:44:53 crc kubenswrapper[4743]: I1011 01:44:53.245111 4743 scope.go:117] "RemoveContainer" containerID="d3c36b882a00e7412fb2648d8e21f0a61ea8ace85a92e99a5c3aff12199c41b6" Oct 11 01:45:00 crc kubenswrapper[4743]: I1011 01:45:00.192631 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29335785-z7bgx"] Oct 11 01:45:00 crc kubenswrapper[4743]: E1011 01:45:00.193445 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="379d9fac-db55-487d-a67d-663479bdf122" containerName="registry-server" Oct 11 01:45:00 crc kubenswrapper[4743]: I1011 01:45:00.193457 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="379d9fac-db55-487d-a67d-663479bdf122" containerName="registry-server" Oct 11 01:45:00 crc kubenswrapper[4743]: E1011 01:45:00.193471 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="379d9fac-db55-487d-a67d-663479bdf122" containerName="extract-utilities" Oct 11 01:45:00 crc kubenswrapper[4743]: I1011 01:45:00.193477 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="379d9fac-db55-487d-a67d-663479bdf122" containerName="extract-utilities" Oct 11 01:45:00 crc kubenswrapper[4743]: E1011 01:45:00.193495 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a15b097-b3b3-4465-8063-6a4b5a7bdaa0" 
containerName="registry-server" Oct 11 01:45:00 crc kubenswrapper[4743]: I1011 01:45:00.193501 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a15b097-b3b3-4465-8063-6a4b5a7bdaa0" containerName="registry-server" Oct 11 01:45:00 crc kubenswrapper[4743]: E1011 01:45:00.193519 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a15b097-b3b3-4465-8063-6a4b5a7bdaa0" containerName="extract-content" Oct 11 01:45:00 crc kubenswrapper[4743]: I1011 01:45:00.193524 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a15b097-b3b3-4465-8063-6a4b5a7bdaa0" containerName="extract-content" Oct 11 01:45:00 crc kubenswrapper[4743]: E1011 01:45:00.193537 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a15b097-b3b3-4465-8063-6a4b5a7bdaa0" containerName="extract-utilities" Oct 11 01:45:00 crc kubenswrapper[4743]: I1011 01:45:00.193544 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a15b097-b3b3-4465-8063-6a4b5a7bdaa0" containerName="extract-utilities" Oct 11 01:45:00 crc kubenswrapper[4743]: E1011 01:45:00.193554 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="379d9fac-db55-487d-a67d-663479bdf122" containerName="extract-content" Oct 11 01:45:00 crc kubenswrapper[4743]: I1011 01:45:00.193559 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="379d9fac-db55-487d-a67d-663479bdf122" containerName="extract-content" Oct 11 01:45:00 crc kubenswrapper[4743]: I1011 01:45:00.193740 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="379d9fac-db55-487d-a67d-663479bdf122" containerName="registry-server" Oct 11 01:45:00 crc kubenswrapper[4743]: I1011 01:45:00.193765 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a15b097-b3b3-4465-8063-6a4b5a7bdaa0" containerName="registry-server" Oct 11 01:45:00 crc kubenswrapper[4743]: I1011 01:45:00.194483 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29335785-z7bgx" Oct 11 01:45:00 crc kubenswrapper[4743]: I1011 01:45:00.197034 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 11 01:45:00 crc kubenswrapper[4743]: I1011 01:45:00.197373 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 11 01:45:00 crc kubenswrapper[4743]: I1011 01:45:00.210067 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29335785-z7bgx"] Oct 11 01:45:00 crc kubenswrapper[4743]: I1011 01:45:00.344448 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1fde7d0a-32a2-4eae-b215-ae5f6d819980-config-volume\") pod \"collect-profiles-29335785-z7bgx\" (UID: \"1fde7d0a-32a2-4eae-b215-ae5f6d819980\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335785-z7bgx" Oct 11 01:45:00 crc kubenswrapper[4743]: I1011 01:45:00.344498 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1fde7d0a-32a2-4eae-b215-ae5f6d819980-secret-volume\") pod \"collect-profiles-29335785-z7bgx\" (UID: \"1fde7d0a-32a2-4eae-b215-ae5f6d819980\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335785-z7bgx" Oct 11 01:45:00 crc kubenswrapper[4743]: I1011 01:45:00.344532 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnvfh\" (UniqueName: \"kubernetes.io/projected/1fde7d0a-32a2-4eae-b215-ae5f6d819980-kube-api-access-cnvfh\") pod \"collect-profiles-29335785-z7bgx\" (UID: \"1fde7d0a-32a2-4eae-b215-ae5f6d819980\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29335785-z7bgx" Oct 11 01:45:00 crc kubenswrapper[4743]: I1011 01:45:00.447238 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1fde7d0a-32a2-4eae-b215-ae5f6d819980-config-volume\") pod \"collect-profiles-29335785-z7bgx\" (UID: \"1fde7d0a-32a2-4eae-b215-ae5f6d819980\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335785-z7bgx" Oct 11 01:45:00 crc kubenswrapper[4743]: I1011 01:45:00.447315 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1fde7d0a-32a2-4eae-b215-ae5f6d819980-secret-volume\") pod \"collect-profiles-29335785-z7bgx\" (UID: \"1fde7d0a-32a2-4eae-b215-ae5f6d819980\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335785-z7bgx" Oct 11 01:45:00 crc kubenswrapper[4743]: I1011 01:45:00.447399 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnvfh\" (UniqueName: \"kubernetes.io/projected/1fde7d0a-32a2-4eae-b215-ae5f6d819980-kube-api-access-cnvfh\") pod \"collect-profiles-29335785-z7bgx\" (UID: \"1fde7d0a-32a2-4eae-b215-ae5f6d819980\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335785-z7bgx" Oct 11 01:45:00 crc kubenswrapper[4743]: I1011 01:45:00.448807 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1fde7d0a-32a2-4eae-b215-ae5f6d819980-config-volume\") pod \"collect-profiles-29335785-z7bgx\" (UID: \"1fde7d0a-32a2-4eae-b215-ae5f6d819980\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335785-z7bgx" Oct 11 01:45:00 crc kubenswrapper[4743]: I1011 01:45:00.453467 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/1fde7d0a-32a2-4eae-b215-ae5f6d819980-secret-volume\") pod \"collect-profiles-29335785-z7bgx\" (UID: \"1fde7d0a-32a2-4eae-b215-ae5f6d819980\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335785-z7bgx" Oct 11 01:45:00 crc kubenswrapper[4743]: I1011 01:45:00.466340 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnvfh\" (UniqueName: \"kubernetes.io/projected/1fde7d0a-32a2-4eae-b215-ae5f6d819980-kube-api-access-cnvfh\") pod \"collect-profiles-29335785-z7bgx\" (UID: \"1fde7d0a-32a2-4eae-b215-ae5f6d819980\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335785-z7bgx" Oct 11 01:45:00 crc kubenswrapper[4743]: I1011 01:45:00.517616 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29335785-z7bgx" Oct 11 01:45:01 crc kubenswrapper[4743]: I1011 01:45:01.072310 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29335785-z7bgx"] Oct 11 01:45:01 crc kubenswrapper[4743]: W1011 01:45:01.086012 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1fde7d0a_32a2_4eae_b215_ae5f6d819980.slice/crio-415edd92a588f9d839db6f123eddd3df9448c743332e9845d588a151e1b90020 WatchSource:0}: Error finding container 415edd92a588f9d839db6f123eddd3df9448c743332e9845d588a151e1b90020: Status 404 returned error can't find the container with id 415edd92a588f9d839db6f123eddd3df9448c743332e9845d588a151e1b90020 Oct 11 01:45:01 crc kubenswrapper[4743]: I1011 01:45:01.801318 4743 generic.go:334] "Generic (PLEG): container finished" podID="1fde7d0a-32a2-4eae-b215-ae5f6d819980" containerID="fed630d91b09ea66cf637b8d41962e061b07d0b0cdb8dcdeadc322e82ad10a97" exitCode=0 Oct 11 01:45:01 crc kubenswrapper[4743]: I1011 01:45:01.801623 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29335785-z7bgx" event={"ID":"1fde7d0a-32a2-4eae-b215-ae5f6d819980","Type":"ContainerDied","Data":"fed630d91b09ea66cf637b8d41962e061b07d0b0cdb8dcdeadc322e82ad10a97"} Oct 11 01:45:01 crc kubenswrapper[4743]: I1011 01:45:01.801663 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29335785-z7bgx" event={"ID":"1fde7d0a-32a2-4eae-b215-ae5f6d819980","Type":"ContainerStarted","Data":"415edd92a588f9d839db6f123eddd3df9448c743332e9845d588a151e1b90020"} Oct 11 01:45:03 crc kubenswrapper[4743]: I1011 01:45:03.249103 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29335785-z7bgx" Oct 11 01:45:03 crc kubenswrapper[4743]: I1011 01:45:03.410715 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1fde7d0a-32a2-4eae-b215-ae5f6d819980-secret-volume\") pod \"1fde7d0a-32a2-4eae-b215-ae5f6d819980\" (UID: \"1fde7d0a-32a2-4eae-b215-ae5f6d819980\") " Oct 11 01:45:03 crc kubenswrapper[4743]: I1011 01:45:03.410846 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cnvfh\" (UniqueName: \"kubernetes.io/projected/1fde7d0a-32a2-4eae-b215-ae5f6d819980-kube-api-access-cnvfh\") pod \"1fde7d0a-32a2-4eae-b215-ae5f6d819980\" (UID: \"1fde7d0a-32a2-4eae-b215-ae5f6d819980\") " Oct 11 01:45:03 crc kubenswrapper[4743]: I1011 01:45:03.411054 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1fde7d0a-32a2-4eae-b215-ae5f6d819980-config-volume\") pod \"1fde7d0a-32a2-4eae-b215-ae5f6d819980\" (UID: \"1fde7d0a-32a2-4eae-b215-ae5f6d819980\") " Oct 11 01:45:03 crc kubenswrapper[4743]: I1011 01:45:03.411642 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/1fde7d0a-32a2-4eae-b215-ae5f6d819980-config-volume" (OuterVolumeSpecName: "config-volume") pod "1fde7d0a-32a2-4eae-b215-ae5f6d819980" (UID: "1fde7d0a-32a2-4eae-b215-ae5f6d819980"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 01:45:03 crc kubenswrapper[4743]: I1011 01:45:03.412052 4743 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1fde7d0a-32a2-4eae-b215-ae5f6d819980-config-volume\") on node \"crc\" DevicePath \"\"" Oct 11 01:45:03 crc kubenswrapper[4743]: I1011 01:45:03.416817 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fde7d0a-32a2-4eae-b215-ae5f6d819980-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1fde7d0a-32a2-4eae-b215-ae5f6d819980" (UID: "1fde7d0a-32a2-4eae-b215-ae5f6d819980"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:45:03 crc kubenswrapper[4743]: I1011 01:45:03.423036 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fde7d0a-32a2-4eae-b215-ae5f6d819980-kube-api-access-cnvfh" (OuterVolumeSpecName: "kube-api-access-cnvfh") pod "1fde7d0a-32a2-4eae-b215-ae5f6d819980" (UID: "1fde7d0a-32a2-4eae-b215-ae5f6d819980"). InnerVolumeSpecName "kube-api-access-cnvfh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:45:03 crc kubenswrapper[4743]: I1011 01:45:03.514235 4743 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1fde7d0a-32a2-4eae-b215-ae5f6d819980-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 11 01:45:03 crc kubenswrapper[4743]: I1011 01:45:03.514290 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cnvfh\" (UniqueName: \"kubernetes.io/projected/1fde7d0a-32a2-4eae-b215-ae5f6d819980-kube-api-access-cnvfh\") on node \"crc\" DevicePath \"\"" Oct 11 01:45:03 crc kubenswrapper[4743]: I1011 01:45:03.706243 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-grhxs"] Oct 11 01:45:03 crc kubenswrapper[4743]: I1011 01:45:03.731915 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-pdmns"] Oct 11 01:45:03 crc kubenswrapper[4743]: I1011 01:45:03.742921 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zzzsh"] Oct 11 01:45:03 crc kubenswrapper[4743]: I1011 01:45:03.752551 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rxgwq"] Oct 11 01:45:03 crc kubenswrapper[4743]: I1011 01:45:03.762420 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-pdmns"] Oct 11 01:45:03 crc kubenswrapper[4743]: I1011 01:45:03.770342 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-grhxs"] Oct 11 01:45:03 crc kubenswrapper[4743]: I1011 01:45:03.810884 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rxgwq"] Oct 11 01:45:03 crc kubenswrapper[4743]: I1011 01:45:03.819322 4743 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zzzsh"] Oct 11 01:45:03 crc kubenswrapper[4743]: I1011 01:45:03.826160 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29335785-z7bgx" event={"ID":"1fde7d0a-32a2-4eae-b215-ae5f6d819980","Type":"ContainerDied","Data":"415edd92a588f9d839db6f123eddd3df9448c743332e9845d588a151e1b90020"} Oct 11 01:45:03 crc kubenswrapper[4743]: I1011 01:45:03.826336 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="415edd92a588f9d839db6f123eddd3df9448c743332e9845d588a151e1b90020" Oct 11 01:45:03 crc kubenswrapper[4743]: I1011 01:45:03.826253 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29335785-z7bgx" Oct 11 01:45:03 crc kubenswrapper[4743]: I1011 01:45:03.827890 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2kmfr"] Oct 11 01:45:03 crc kubenswrapper[4743]: I1011 01:45:03.835285 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2kmfr"] Oct 11 01:45:03 crc kubenswrapper[4743]: I1011 01:45:03.842808 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6lpd"] Oct 11 01:45:03 crc kubenswrapper[4743]: I1011 01:45:03.851441 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9qtvs"] Oct 11 01:45:03 crc kubenswrapper[4743]: I1011 01:45:03.858989 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k8lss"] Oct 11 01:45:03 crc kubenswrapper[4743]: I1011 01:45:03.866766 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-gbggg"] Oct 11 01:45:03 crc kubenswrapper[4743]: I1011 01:45:03.873692 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9qtvs"] Oct 11 01:45:03 crc kubenswrapper[4743]: I1011 01:45:03.884366 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-79pjt"] Oct 11 01:45:03 crc kubenswrapper[4743]: I1011 01:45:03.891978 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mm4w8"] Oct 11 01:45:03 crc kubenswrapper[4743]: I1011 01:45:03.900377 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4v975"] Oct 11 01:45:03 crc kubenswrapper[4743]: I1011 01:45:03.909231 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6lpd"] Oct 11 01:45:03 crc kubenswrapper[4743]: I1011 01:45:03.917452 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-bpxh7"] Oct 11 01:45:03 crc kubenswrapper[4743]: I1011 01:45:03.925093 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-hqwq9"] Oct 11 01:45:03 crc kubenswrapper[4743]: I1011 01:45:03.933316 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-h7k2m"] Oct 11 01:45:03 crc kubenswrapper[4743]: I1011 01:45:03.941723 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2x9z6"] Oct 11 01:45:03 crc kubenswrapper[4743]: I1011 01:45:03.950254 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4"] Oct 11 01:45:03 crc kubenswrapper[4743]: I1011 
01:45:03.965058 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k8lss"] Oct 11 01:45:03 crc kubenswrapper[4743]: I1011 01:45:03.973121 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-79pjt"] Oct 11 01:45:03 crc kubenswrapper[4743]: I1011 01:45:03.982101 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-gbggg"] Oct 11 01:45:03 crc kubenswrapper[4743]: I1011 01:45:03.989948 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2x9z6"] Oct 11 01:45:03 crc kubenswrapper[4743]: I1011 01:45:03.996950 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-h7k2m"] Oct 11 01:45:04 crc kubenswrapper[4743]: I1011 01:45:04.003706 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mm4w8"] Oct 11 01:45:04 crc kubenswrapper[4743]: I1011 01:45:04.011143 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nd7b4"] Oct 11 01:45:04 crc kubenswrapper[4743]: I1011 01:45:04.019430 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-hqwq9"] Oct 11 01:45:04 crc kubenswrapper[4743]: I1011 01:45:04.027314 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-bpxh7"] Oct 11 01:45:04 crc kubenswrapper[4743]: I1011 01:45:04.036943 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-4v975"] Oct 11 01:45:04 crc kubenswrapper[4743]: I1011 01:45:04.110593 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="15ec1ee2-7130-47d3-8156-4352228590a6" path="/var/lib/kubelet/pods/15ec1ee2-7130-47d3-8156-4352228590a6/volumes" Oct 11 01:45:04 crc kubenswrapper[4743]: I1011 01:45:04.111684 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18dd49f1-d2cb-4e7b-b427-971bda666f14" path="/var/lib/kubelet/pods/18dd49f1-d2cb-4e7b-b427-971bda666f14/volumes" Oct 11 01:45:04 crc kubenswrapper[4743]: I1011 01:45:04.112789 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f790cf8-5f73-4587-b2df-d0e7ef6622b0" path="/var/lib/kubelet/pods/1f790cf8-5f73-4587-b2df-d0e7ef6622b0/volumes" Oct 11 01:45:04 crc kubenswrapper[4743]: I1011 01:45:04.113957 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fc7b43c-1b31-4510-bb3e-a3e3017bd93e" path="/var/lib/kubelet/pods/1fc7b43c-1b31-4510-bb3e-a3e3017bd93e/volumes" Oct 11 01:45:04 crc kubenswrapper[4743]: I1011 01:45:04.115968 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37d164e2-e69a-4faf-892c-b79e155a6c90" path="/var/lib/kubelet/pods/37d164e2-e69a-4faf-892c-b79e155a6c90/volumes" Oct 11 01:45:04 crc kubenswrapper[4743]: I1011 01:45:04.117053 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4255878f-68bf-41cc-8f1a-6e38ac2e2401" path="/var/lib/kubelet/pods/4255878f-68bf-41cc-8f1a-6e38ac2e2401/volumes" Oct 11 01:45:04 crc kubenswrapper[4743]: I1011 01:45:04.118142 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53edf58a-7be3-40ee-af4a-d110d1607356" path="/var/lib/kubelet/pods/53edf58a-7be3-40ee-af4a-d110d1607356/volumes" Oct 11 01:45:04 crc kubenswrapper[4743]: I1011 01:45:04.120190 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="566b2f7b-e36d-49d7-b985-dc1a39ad9253" path="/var/lib/kubelet/pods/566b2f7b-e36d-49d7-b985-dc1a39ad9253/volumes" Oct 11 01:45:04 crc kubenswrapper[4743]: I1011 01:45:04.121554 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="61c7dd2d-7f02-47cb-8c1b-eadfa3fdf346" path="/var/lib/kubelet/pods/61c7dd2d-7f02-47cb-8c1b-eadfa3fdf346/volumes" Oct 11 01:45:04 crc kubenswrapper[4743]: I1011 01:45:04.123023 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74e06001-e5b2-4a21-b3a0-887f814c87ef" path="/var/lib/kubelet/pods/74e06001-e5b2-4a21-b3a0-887f814c87ef/volumes" Oct 11 01:45:04 crc kubenswrapper[4743]: I1011 01:45:04.124078 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5" path="/var/lib/kubelet/pods/7ed08ed8-af5f-4ba5-94a9-5194c9cf5cd5/volumes" Oct 11 01:45:04 crc kubenswrapper[4743]: I1011 01:45:04.125922 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="830e5365-fa12-43db-b2da-5e3295796350" path="/var/lib/kubelet/pods/830e5365-fa12-43db-b2da-5e3295796350/volumes" Oct 11 01:45:04 crc kubenswrapper[4743]: I1011 01:45:04.126782 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="947f995f-28b3-4fbb-8ade-fa778c7fe05a" path="/var/lib/kubelet/pods/947f995f-28b3-4fbb-8ade-fa778c7fe05a/volumes" Oct 11 01:45:04 crc kubenswrapper[4743]: I1011 01:45:04.127745 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bebacb8f-bd48-4082-ac2c-80875645f5bf" path="/var/lib/kubelet/pods/bebacb8f-bd48-4082-ac2c-80875645f5bf/volumes" Oct 11 01:45:04 crc kubenswrapper[4743]: I1011 01:45:04.128880 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da7b50d8-258c-468b-8a7e-0ec7ae206309" path="/var/lib/kubelet/pods/da7b50d8-258c-468b-8a7e-0ec7ae206309/volumes" Oct 11 01:45:04 crc kubenswrapper[4743]: I1011 01:45:04.130392 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f555b790-f6a4-4c68-8d8d-97fd07743aa8" path="/var/lib/kubelet/pods/f555b790-f6a4-4c68-8d8d-97fd07743aa8/volumes" Oct 11 01:45:04 crc kubenswrapper[4743]: I1011 01:45:04.131448 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="f766e457-c9a8-465e-b746-e9ef3bba860f" path="/var/lib/kubelet/pods/f766e457-c9a8-465e-b746-e9ef3bba860f/volumes" Oct 11 01:45:04 crc kubenswrapper[4743]: I1011 01:45:04.313777 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29335740-fndp4"] Oct 11 01:45:04 crc kubenswrapper[4743]: I1011 01:45:04.324052 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29335740-fndp4"] Oct 11 01:45:06 crc kubenswrapper[4743]: I1011 01:45:06.107083 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c46c590-571e-42eb-9fa4-a6ccabdc12a8" path="/var/lib/kubelet/pods/2c46c590-571e-42eb-9fa4-a6ccabdc12a8/volumes" Oct 11 01:45:08 crc kubenswrapper[4743]: I1011 01:45:08.511902 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2fx2x"] Oct 11 01:45:08 crc kubenswrapper[4743]: E1011 01:45:08.512785 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fde7d0a-32a2-4eae-b215-ae5f6d819980" containerName="collect-profiles" Oct 11 01:45:08 crc kubenswrapper[4743]: I1011 01:45:08.512801 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fde7d0a-32a2-4eae-b215-ae5f6d819980" containerName="collect-profiles" Oct 11 01:45:08 crc kubenswrapper[4743]: I1011 01:45:08.513130 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fde7d0a-32a2-4eae-b215-ae5f6d819980" containerName="collect-profiles" Oct 11 01:45:08 crc kubenswrapper[4743]: I1011 01:45:08.513918 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2fx2x" Oct 11 01:45:08 crc kubenswrapper[4743]: I1011 01:45:08.516396 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 11 01:45:08 crc kubenswrapper[4743]: I1011 01:45:08.516644 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8vmmn" Oct 11 01:45:08 crc kubenswrapper[4743]: I1011 01:45:08.516803 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 11 01:45:08 crc kubenswrapper[4743]: I1011 01:45:08.517011 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 11 01:45:08 crc kubenswrapper[4743]: I1011 01:45:08.517124 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 11 01:45:08 crc kubenswrapper[4743]: I1011 01:45:08.553995 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2fx2x"] Oct 11 01:45:08 crc kubenswrapper[4743]: I1011 01:45:08.619354 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b9069bf9-41de-4faf-ad86-3913be33cb1a-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-2fx2x\" (UID: \"b9069bf9-41de-4faf-ad86-3913be33cb1a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2fx2x" Oct 11 01:45:08 crc kubenswrapper[4743]: I1011 01:45:08.619410 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b9069bf9-41de-4faf-ad86-3913be33cb1a-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-2fx2x\" (UID: \"b9069bf9-41de-4faf-ad86-3913be33cb1a\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2fx2x" Oct 11 01:45:08 crc kubenswrapper[4743]: I1011 01:45:08.619428 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kxlr\" (UniqueName: \"kubernetes.io/projected/b9069bf9-41de-4faf-ad86-3913be33cb1a-kube-api-access-6kxlr\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-2fx2x\" (UID: \"b9069bf9-41de-4faf-ad86-3913be33cb1a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2fx2x" Oct 11 01:45:08 crc kubenswrapper[4743]: I1011 01:45:08.619548 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9069bf9-41de-4faf-ad86-3913be33cb1a-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-2fx2x\" (UID: \"b9069bf9-41de-4faf-ad86-3913be33cb1a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2fx2x" Oct 11 01:45:08 crc kubenswrapper[4743]: I1011 01:45:08.619578 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b9069bf9-41de-4faf-ad86-3913be33cb1a-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-2fx2x\" (UID: \"b9069bf9-41de-4faf-ad86-3913be33cb1a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2fx2x" Oct 11 01:45:08 crc kubenswrapper[4743]: I1011 01:45:08.721197 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9069bf9-41de-4faf-ad86-3913be33cb1a-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-2fx2x\" (UID: \"b9069bf9-41de-4faf-ad86-3913be33cb1a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2fx2x" Oct 11 01:45:08 crc kubenswrapper[4743]: I1011 01:45:08.721473 4743 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b9069bf9-41de-4faf-ad86-3913be33cb1a-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-2fx2x\" (UID: \"b9069bf9-41de-4faf-ad86-3913be33cb1a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2fx2x" Oct 11 01:45:08 crc kubenswrapper[4743]: I1011 01:45:08.721583 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b9069bf9-41de-4faf-ad86-3913be33cb1a-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-2fx2x\" (UID: \"b9069bf9-41de-4faf-ad86-3913be33cb1a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2fx2x" Oct 11 01:45:08 crc kubenswrapper[4743]: I1011 01:45:08.721666 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b9069bf9-41de-4faf-ad86-3913be33cb1a-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-2fx2x\" (UID: \"b9069bf9-41de-4faf-ad86-3913be33cb1a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2fx2x" Oct 11 01:45:08 crc kubenswrapper[4743]: I1011 01:45:08.721730 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kxlr\" (UniqueName: \"kubernetes.io/projected/b9069bf9-41de-4faf-ad86-3913be33cb1a-kube-api-access-6kxlr\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-2fx2x\" (UID: \"b9069bf9-41de-4faf-ad86-3913be33cb1a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2fx2x" Oct 11 01:45:08 crc kubenswrapper[4743]: I1011 01:45:08.728963 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9069bf9-41de-4faf-ad86-3913be33cb1a-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-2fx2x\" (UID: 
\"b9069bf9-41de-4faf-ad86-3913be33cb1a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2fx2x" Oct 11 01:45:08 crc kubenswrapper[4743]: I1011 01:45:08.729470 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b9069bf9-41de-4faf-ad86-3913be33cb1a-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-2fx2x\" (UID: \"b9069bf9-41de-4faf-ad86-3913be33cb1a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2fx2x" Oct 11 01:45:08 crc kubenswrapper[4743]: I1011 01:45:08.729513 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b9069bf9-41de-4faf-ad86-3913be33cb1a-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-2fx2x\" (UID: \"b9069bf9-41de-4faf-ad86-3913be33cb1a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2fx2x" Oct 11 01:45:08 crc kubenswrapper[4743]: I1011 01:45:08.731357 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b9069bf9-41de-4faf-ad86-3913be33cb1a-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-2fx2x\" (UID: \"b9069bf9-41de-4faf-ad86-3913be33cb1a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2fx2x" Oct 11 01:45:08 crc kubenswrapper[4743]: I1011 01:45:08.737504 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kxlr\" (UniqueName: \"kubernetes.io/projected/b9069bf9-41de-4faf-ad86-3913be33cb1a-kube-api-access-6kxlr\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-2fx2x\" (UID: \"b9069bf9-41de-4faf-ad86-3913be33cb1a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2fx2x" Oct 11 01:45:08 crc kubenswrapper[4743]: I1011 01:45:08.834603 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2fx2x" Oct 11 01:45:09 crc kubenswrapper[4743]: I1011 01:45:09.485782 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2fx2x"] Oct 11 01:45:09 crc kubenswrapper[4743]: I1011 01:45:09.496121 4743 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 11 01:45:09 crc kubenswrapper[4743]: I1011 01:45:09.953229 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2fx2x" event={"ID":"b9069bf9-41de-4faf-ad86-3913be33cb1a","Type":"ContainerStarted","Data":"4b5bc8d2bdfd1725fd7b16ebeed70b2563b3368ee4cba36a4a06ae91aa22506f"} Oct 11 01:45:10 crc kubenswrapper[4743]: I1011 01:45:10.963923 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2fx2x" event={"ID":"b9069bf9-41de-4faf-ad86-3913be33cb1a","Type":"ContainerStarted","Data":"0b25d2555ac6009f101bd57d9f19f8ce8fb08d59d138f09c1493834b5fee728b"} Oct 11 01:45:10 crc kubenswrapper[4743]: I1011 01:45:10.990803 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2fx2x" podStartSLOduration=2.495055202 podStartE2EDuration="2.990782123s" podCreationTimestamp="2025-10-11 01:45:08 +0000 UTC" firstStartedPulling="2025-10-11 01:45:09.495842706 +0000 UTC m=+3204.148823103" lastFinishedPulling="2025-10-11 01:45:09.991569587 +0000 UTC m=+3204.644550024" observedRunningTime="2025-10-11 01:45:10.98621201 +0000 UTC m=+3205.639192407" watchObservedRunningTime="2025-10-11 01:45:10.990782123 +0000 UTC m=+3205.643762520" Oct 11 01:45:24 crc kubenswrapper[4743]: I1011 01:45:24.138543 4743 generic.go:334] "Generic (PLEG): container finished" podID="b9069bf9-41de-4faf-ad86-3913be33cb1a" 
containerID="0b25d2555ac6009f101bd57d9f19f8ce8fb08d59d138f09c1493834b5fee728b" exitCode=0 Oct 11 01:45:24 crc kubenswrapper[4743]: I1011 01:45:24.138649 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2fx2x" event={"ID":"b9069bf9-41de-4faf-ad86-3913be33cb1a","Type":"ContainerDied","Data":"0b25d2555ac6009f101bd57d9f19f8ce8fb08d59d138f09c1493834b5fee728b"} Oct 11 01:45:25 crc kubenswrapper[4743]: I1011 01:45:25.698336 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2fx2x" Oct 11 01:45:25 crc kubenswrapper[4743]: I1011 01:45:25.730333 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6kxlr\" (UniqueName: \"kubernetes.io/projected/b9069bf9-41de-4faf-ad86-3913be33cb1a-kube-api-access-6kxlr\") pod \"b9069bf9-41de-4faf-ad86-3913be33cb1a\" (UID: \"b9069bf9-41de-4faf-ad86-3913be33cb1a\") " Oct 11 01:45:25 crc kubenswrapper[4743]: I1011 01:45:25.730756 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9069bf9-41de-4faf-ad86-3913be33cb1a-repo-setup-combined-ca-bundle\") pod \"b9069bf9-41de-4faf-ad86-3913be33cb1a\" (UID: \"b9069bf9-41de-4faf-ad86-3913be33cb1a\") " Oct 11 01:45:25 crc kubenswrapper[4743]: I1011 01:45:25.730822 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b9069bf9-41de-4faf-ad86-3913be33cb1a-ceph\") pod \"b9069bf9-41de-4faf-ad86-3913be33cb1a\" (UID: \"b9069bf9-41de-4faf-ad86-3913be33cb1a\") " Oct 11 01:45:25 crc kubenswrapper[4743]: I1011 01:45:25.730930 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b9069bf9-41de-4faf-ad86-3913be33cb1a-ssh-key\") pod 
\"b9069bf9-41de-4faf-ad86-3913be33cb1a\" (UID: \"b9069bf9-41de-4faf-ad86-3913be33cb1a\") " Oct 11 01:45:25 crc kubenswrapper[4743]: I1011 01:45:25.731083 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b9069bf9-41de-4faf-ad86-3913be33cb1a-inventory\") pod \"b9069bf9-41de-4faf-ad86-3913be33cb1a\" (UID: \"b9069bf9-41de-4faf-ad86-3913be33cb1a\") " Oct 11 01:45:25 crc kubenswrapper[4743]: I1011 01:45:25.735760 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9069bf9-41de-4faf-ad86-3913be33cb1a-ceph" (OuterVolumeSpecName: "ceph") pod "b9069bf9-41de-4faf-ad86-3913be33cb1a" (UID: "b9069bf9-41de-4faf-ad86-3913be33cb1a"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:45:25 crc kubenswrapper[4743]: I1011 01:45:25.736581 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9069bf9-41de-4faf-ad86-3913be33cb1a-kube-api-access-6kxlr" (OuterVolumeSpecName: "kube-api-access-6kxlr") pod "b9069bf9-41de-4faf-ad86-3913be33cb1a" (UID: "b9069bf9-41de-4faf-ad86-3913be33cb1a"). InnerVolumeSpecName "kube-api-access-6kxlr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:45:25 crc kubenswrapper[4743]: I1011 01:45:25.743547 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9069bf9-41de-4faf-ad86-3913be33cb1a-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "b9069bf9-41de-4faf-ad86-3913be33cb1a" (UID: "b9069bf9-41de-4faf-ad86-3913be33cb1a"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:45:25 crc kubenswrapper[4743]: I1011 01:45:25.771602 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9069bf9-41de-4faf-ad86-3913be33cb1a-inventory" (OuterVolumeSpecName: "inventory") pod "b9069bf9-41de-4faf-ad86-3913be33cb1a" (UID: "b9069bf9-41de-4faf-ad86-3913be33cb1a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:45:25 crc kubenswrapper[4743]: I1011 01:45:25.782035 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9069bf9-41de-4faf-ad86-3913be33cb1a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b9069bf9-41de-4faf-ad86-3913be33cb1a" (UID: "b9069bf9-41de-4faf-ad86-3913be33cb1a"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:45:25 crc kubenswrapper[4743]: I1011 01:45:25.833222 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b9069bf9-41de-4faf-ad86-3913be33cb1a-inventory\") on node \"crc\" DevicePath \"\"" Oct 11 01:45:25 crc kubenswrapper[4743]: I1011 01:45:25.833256 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6kxlr\" (UniqueName: \"kubernetes.io/projected/b9069bf9-41de-4faf-ad86-3913be33cb1a-kube-api-access-6kxlr\") on node \"crc\" DevicePath \"\"" Oct 11 01:45:25 crc kubenswrapper[4743]: I1011 01:45:25.833269 4743 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9069bf9-41de-4faf-ad86-3913be33cb1a-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 01:45:25 crc kubenswrapper[4743]: I1011 01:45:25.833277 4743 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b9069bf9-41de-4faf-ad86-3913be33cb1a-ceph\") on node \"crc\" DevicePath \"\"" Oct 11 01:45:25 crc 
kubenswrapper[4743]: I1011 01:45:25.833286 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b9069bf9-41de-4faf-ad86-3913be33cb1a-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 11 01:45:26 crc kubenswrapper[4743]: I1011 01:45:26.159511 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2fx2x" event={"ID":"b9069bf9-41de-4faf-ad86-3913be33cb1a","Type":"ContainerDied","Data":"4b5bc8d2bdfd1725fd7b16ebeed70b2563b3368ee4cba36a4a06ae91aa22506f"} Oct 11 01:45:26 crc kubenswrapper[4743]: I1011 01:45:26.159576 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b5bc8d2bdfd1725fd7b16ebeed70b2563b3368ee4cba36a4a06ae91aa22506f" Oct 11 01:45:26 crc kubenswrapper[4743]: I1011 01:45:26.159581 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2fx2x" Oct 11 01:45:26 crc kubenswrapper[4743]: I1011 01:45:26.255811 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-j85dc"] Oct 11 01:45:26 crc kubenswrapper[4743]: E1011 01:45:26.256589 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9069bf9-41de-4faf-ad86-3913be33cb1a" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 11 01:45:26 crc kubenswrapper[4743]: I1011 01:45:26.256688 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9069bf9-41de-4faf-ad86-3913be33cb1a" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 11 01:45:26 crc kubenswrapper[4743]: I1011 01:45:26.257066 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9069bf9-41de-4faf-ad86-3913be33cb1a" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 11 01:45:26 crc kubenswrapper[4743]: I1011 01:45:26.258279 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-j85dc" Oct 11 01:45:26 crc kubenswrapper[4743]: I1011 01:45:26.260900 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 11 01:45:26 crc kubenswrapper[4743]: I1011 01:45:26.260931 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 11 01:45:26 crc kubenswrapper[4743]: I1011 01:45:26.261331 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8vmmn" Oct 11 01:45:26 crc kubenswrapper[4743]: I1011 01:45:26.262668 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 11 01:45:26 crc kubenswrapper[4743]: I1011 01:45:26.263093 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 11 01:45:26 crc kubenswrapper[4743]: I1011 01:45:26.263359 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-j85dc"] Oct 11 01:45:26 crc kubenswrapper[4743]: I1011 01:45:26.446786 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44bsx\" (UniqueName: \"kubernetes.io/projected/5ce1ff59-69f9-466b-926d-4785eb4df84f-kube-api-access-44bsx\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-j85dc\" (UID: \"5ce1ff59-69f9-466b-926d-4785eb4df84f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-j85dc" Oct 11 01:45:26 crc kubenswrapper[4743]: I1011 01:45:26.446947 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5ce1ff59-69f9-466b-926d-4785eb4df84f-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-j85dc\" (UID: \"5ce1ff59-69f9-466b-926d-4785eb4df84f\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-j85dc" Oct 11 01:45:26 crc kubenswrapper[4743]: I1011 01:45:26.447021 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ce1ff59-69f9-466b-926d-4785eb4df84f-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-j85dc\" (UID: \"5ce1ff59-69f9-466b-926d-4785eb4df84f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-j85dc" Oct 11 01:45:26 crc kubenswrapper[4743]: I1011 01:45:26.447110 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5ce1ff59-69f9-466b-926d-4785eb4df84f-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-j85dc\" (UID: \"5ce1ff59-69f9-466b-926d-4785eb4df84f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-j85dc" Oct 11 01:45:26 crc kubenswrapper[4743]: I1011 01:45:26.447177 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5ce1ff59-69f9-466b-926d-4785eb4df84f-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-j85dc\" (UID: \"5ce1ff59-69f9-466b-926d-4785eb4df84f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-j85dc" Oct 11 01:45:26 crc kubenswrapper[4743]: I1011 01:45:26.548550 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5ce1ff59-69f9-466b-926d-4785eb4df84f-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-j85dc\" (UID: \"5ce1ff59-69f9-466b-926d-4785eb4df84f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-j85dc" Oct 11 01:45:26 crc kubenswrapper[4743]: I1011 01:45:26.548631 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ce1ff59-69f9-466b-926d-4785eb4df84f-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-j85dc\" (UID: \"5ce1ff59-69f9-466b-926d-4785eb4df84f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-j85dc" Oct 11 01:45:26 crc kubenswrapper[4743]: I1011 01:45:26.548710 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5ce1ff59-69f9-466b-926d-4785eb4df84f-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-j85dc\" (UID: \"5ce1ff59-69f9-466b-926d-4785eb4df84f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-j85dc" Oct 11 01:45:26 crc kubenswrapper[4743]: I1011 01:45:26.548740 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5ce1ff59-69f9-466b-926d-4785eb4df84f-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-j85dc\" (UID: \"5ce1ff59-69f9-466b-926d-4785eb4df84f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-j85dc" Oct 11 01:45:26 crc kubenswrapper[4743]: I1011 01:45:26.548832 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44bsx\" (UniqueName: \"kubernetes.io/projected/5ce1ff59-69f9-466b-926d-4785eb4df84f-kube-api-access-44bsx\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-j85dc\" (UID: \"5ce1ff59-69f9-466b-926d-4785eb4df84f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-j85dc" Oct 11 01:45:26 crc kubenswrapper[4743]: I1011 01:45:26.554422 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5ce1ff59-69f9-466b-926d-4785eb4df84f-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-j85dc\" (UID: \"5ce1ff59-69f9-466b-926d-4785eb4df84f\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-j85dc" Oct 11 01:45:26 crc kubenswrapper[4743]: I1011 01:45:26.555283 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ce1ff59-69f9-466b-926d-4785eb4df84f-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-j85dc\" (UID: \"5ce1ff59-69f9-466b-926d-4785eb4df84f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-j85dc" Oct 11 01:45:26 crc kubenswrapper[4743]: I1011 01:45:26.555893 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5ce1ff59-69f9-466b-926d-4785eb4df84f-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-j85dc\" (UID: \"5ce1ff59-69f9-466b-926d-4785eb4df84f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-j85dc" Oct 11 01:45:26 crc kubenswrapper[4743]: I1011 01:45:26.559797 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5ce1ff59-69f9-466b-926d-4785eb4df84f-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-j85dc\" (UID: \"5ce1ff59-69f9-466b-926d-4785eb4df84f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-j85dc" Oct 11 01:45:26 crc kubenswrapper[4743]: I1011 01:45:26.584043 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44bsx\" (UniqueName: \"kubernetes.io/projected/5ce1ff59-69f9-466b-926d-4785eb4df84f-kube-api-access-44bsx\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-j85dc\" (UID: \"5ce1ff59-69f9-466b-926d-4785eb4df84f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-j85dc" Oct 11 01:45:26 crc kubenswrapper[4743]: I1011 01:45:26.883985 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-j85dc" Oct 11 01:45:27 crc kubenswrapper[4743]: I1011 01:45:27.479083 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-j85dc"] Oct 11 01:45:28 crc kubenswrapper[4743]: I1011 01:45:28.183348 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-j85dc" event={"ID":"5ce1ff59-69f9-466b-926d-4785eb4df84f","Type":"ContainerStarted","Data":"35d8a0a5b9a85e3ad1fea157a31fdbc0fa1bad79b8b3c3f043c6f07ed266c328"} Oct 11 01:45:28 crc kubenswrapper[4743]: I1011 01:45:28.183699 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-j85dc" event={"ID":"5ce1ff59-69f9-466b-926d-4785eb4df84f","Type":"ContainerStarted","Data":"b5b29069e51926aa719e9ba456d7b6b3d0c8bb1c10b0c03c1c3b24ceaea0d8fa"} Oct 11 01:45:28 crc kubenswrapper[4743]: I1011 01:45:28.204081 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-j85dc" podStartSLOduration=1.786191914 podStartE2EDuration="2.204063358s" podCreationTimestamp="2025-10-11 01:45:26 +0000 UTC" firstStartedPulling="2025-10-11 01:45:27.487448353 +0000 UTC m=+3222.140428750" lastFinishedPulling="2025-10-11 01:45:27.905319757 +0000 UTC m=+3222.558300194" observedRunningTime="2025-10-11 01:45:28.195966417 +0000 UTC m=+3222.848946824" watchObservedRunningTime="2025-10-11 01:45:28.204063358 +0000 UTC m=+3222.857043765" Oct 11 01:45:53 crc kubenswrapper[4743]: I1011 01:45:53.331709 4743 scope.go:117] "RemoveContainer" containerID="aeaf8587b8db64a255c5b029166a9362a4051b83cce8148721465e9e03f6e91e" Oct 11 01:45:53 crc kubenswrapper[4743]: I1011 01:45:53.425227 4743 scope.go:117] "RemoveContainer" containerID="9954638b31e5b269195f86d27aa2cc9adcdfc1813da96e97ac8cf5069b92922f" Oct 11 01:45:53 crc 
kubenswrapper[4743]: I1011 01:45:53.521037 4743 scope.go:117] "RemoveContainer" containerID="fddddb5adf6d9ef199f5626c1e325b4bbaf06841c1ed1819c9c4e434a634c96e" Oct 11 01:45:53 crc kubenswrapper[4743]: I1011 01:45:53.572327 4743 scope.go:117] "RemoveContainer" containerID="1f55586e20dac03f58a2e80d1f7ceea8f344878d3d78dc137749715f51568cce" Oct 11 01:45:53 crc kubenswrapper[4743]: I1011 01:45:53.667124 4743 scope.go:117] "RemoveContainer" containerID="0a4accac8eb229d5f59e86320d47a84e76855054259ad92d5660aecd9d182161" Oct 11 01:45:53 crc kubenswrapper[4743]: I1011 01:45:53.714006 4743 scope.go:117] "RemoveContainer" containerID="122eba6d93dbca833018bf9d8c603bbac03a07e43c41098372e569602536e126" Oct 11 01:45:53 crc kubenswrapper[4743]: I1011 01:45:53.765441 4743 scope.go:117] "RemoveContainer" containerID="50eea15b2740c117a462c411d8db4768720a017834258dfe8ecaf0b4e24b0453" Oct 11 01:45:53 crc kubenswrapper[4743]: I1011 01:45:53.822673 4743 scope.go:117] "RemoveContainer" containerID="66afe0758ed02251e593de3b2daeac8d2dbb2c88186d9e7047432c40f2dced08" Oct 11 01:45:53 crc kubenswrapper[4743]: I1011 01:45:53.859427 4743 scope.go:117] "RemoveContainer" containerID="5e69a52f2f2fcdef405ba2a5f48463835b5e6ecfab78a93354182fdc6b69905f" Oct 11 01:45:53 crc kubenswrapper[4743]: I1011 01:45:53.894633 4743 scope.go:117] "RemoveContainer" containerID="f3255b4ef4ec1544035bc349b0d79923e4e4805732f30735ed42be6491cd27ce" Oct 11 01:45:53 crc kubenswrapper[4743]: I1011 01:45:53.935670 4743 scope.go:117] "RemoveContainer" containerID="16c011b3dbb50faac3d0e841f0c5671da27839de7eace05fa026551671c0cca3" Oct 11 01:45:53 crc kubenswrapper[4743]: I1011 01:45:53.963807 4743 scope.go:117] "RemoveContainer" containerID="11579dcba3db904e38d5126991e25cd8c01bfe9e4b7659a4b740c2198691b63d" Oct 11 01:45:54 crc kubenswrapper[4743]: I1011 01:45:54.003637 4743 scope.go:117] "RemoveContainer" containerID="a5c45fb4c9e8a53dc8993d298f39bc21ac4e8328ec181a98d856ef78c7809701" Oct 11 01:45:54 crc kubenswrapper[4743]: I1011 
01:45:54.054189 4743 scope.go:117] "RemoveContainer" containerID="7110baaaee30b5c52234aea1454deb596388bc12377152a69a39c536e4080d71" Oct 11 01:45:54 crc kubenswrapper[4743]: I1011 01:45:54.108483 4743 scope.go:117] "RemoveContainer" containerID="e48806dc0337703b86d04512c6fa718b8c44f1427f444fb92e8e70b4246812cb" Oct 11 01:45:54 crc kubenswrapper[4743]: I1011 01:45:54.163217 4743 scope.go:117] "RemoveContainer" containerID="319cc7abdc442387fa721383536d6b4b2212dab24aec428ede031df238824c38" Oct 11 01:45:54 crc kubenswrapper[4743]: I1011 01:45:54.228202 4743 scope.go:117] "RemoveContainer" containerID="fa1867cd662351046501c234689e2de6640e4bd19820fe084e7482f0ca1a8784" Oct 11 01:45:54 crc kubenswrapper[4743]: I1011 01:45:54.305249 4743 scope.go:117] "RemoveContainer" containerID="b543d07a48df3636a760ba323820411308602c28c463342486510a0946d99a1c" Oct 11 01:46:45 crc kubenswrapper[4743]: I1011 01:46:45.437958 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gggb6"] Oct 11 01:46:45 crc kubenswrapper[4743]: I1011 01:46:45.443193 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gggb6" Oct 11 01:46:45 crc kubenswrapper[4743]: I1011 01:46:45.466504 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gggb6"] Oct 11 01:46:45 crc kubenswrapper[4743]: I1011 01:46:45.638522 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30185ee3-9bba-4abf-aeec-e7f3f3ec7475-catalog-content\") pod \"redhat-operators-gggb6\" (UID: \"30185ee3-9bba-4abf-aeec-e7f3f3ec7475\") " pod="openshift-marketplace/redhat-operators-gggb6" Oct 11 01:46:45 crc kubenswrapper[4743]: I1011 01:46:45.638643 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nncv7\" (UniqueName: \"kubernetes.io/projected/30185ee3-9bba-4abf-aeec-e7f3f3ec7475-kube-api-access-nncv7\") pod \"redhat-operators-gggb6\" (UID: \"30185ee3-9bba-4abf-aeec-e7f3f3ec7475\") " pod="openshift-marketplace/redhat-operators-gggb6" Oct 11 01:46:45 crc kubenswrapper[4743]: I1011 01:46:45.638795 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30185ee3-9bba-4abf-aeec-e7f3f3ec7475-utilities\") pod \"redhat-operators-gggb6\" (UID: \"30185ee3-9bba-4abf-aeec-e7f3f3ec7475\") " pod="openshift-marketplace/redhat-operators-gggb6" Oct 11 01:46:45 crc kubenswrapper[4743]: I1011 01:46:45.740821 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30185ee3-9bba-4abf-aeec-e7f3f3ec7475-utilities\") pod \"redhat-operators-gggb6\" (UID: \"30185ee3-9bba-4abf-aeec-e7f3f3ec7475\") " pod="openshift-marketplace/redhat-operators-gggb6" Oct 11 01:46:45 crc kubenswrapper[4743]: I1011 01:46:45.741057 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30185ee3-9bba-4abf-aeec-e7f3f3ec7475-catalog-content\") pod \"redhat-operators-gggb6\" (UID: \"30185ee3-9bba-4abf-aeec-e7f3f3ec7475\") " pod="openshift-marketplace/redhat-operators-gggb6" Oct 11 01:46:45 crc kubenswrapper[4743]: I1011 01:46:45.741110 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nncv7\" (UniqueName: \"kubernetes.io/projected/30185ee3-9bba-4abf-aeec-e7f3f3ec7475-kube-api-access-nncv7\") pod \"redhat-operators-gggb6\" (UID: \"30185ee3-9bba-4abf-aeec-e7f3f3ec7475\") " pod="openshift-marketplace/redhat-operators-gggb6" Oct 11 01:46:45 crc kubenswrapper[4743]: I1011 01:46:45.741990 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30185ee3-9bba-4abf-aeec-e7f3f3ec7475-catalog-content\") pod \"redhat-operators-gggb6\" (UID: \"30185ee3-9bba-4abf-aeec-e7f3f3ec7475\") " pod="openshift-marketplace/redhat-operators-gggb6" Oct 11 01:46:45 crc kubenswrapper[4743]: I1011 01:46:45.742003 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30185ee3-9bba-4abf-aeec-e7f3f3ec7475-utilities\") pod \"redhat-operators-gggb6\" (UID: \"30185ee3-9bba-4abf-aeec-e7f3f3ec7475\") " pod="openshift-marketplace/redhat-operators-gggb6" Oct 11 01:46:45 crc kubenswrapper[4743]: I1011 01:46:45.764882 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nncv7\" (UniqueName: \"kubernetes.io/projected/30185ee3-9bba-4abf-aeec-e7f3f3ec7475-kube-api-access-nncv7\") pod \"redhat-operators-gggb6\" (UID: \"30185ee3-9bba-4abf-aeec-e7f3f3ec7475\") " pod="openshift-marketplace/redhat-operators-gggb6" Oct 11 01:46:45 crc kubenswrapper[4743]: I1011 01:46:45.771621 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gggb6" Oct 11 01:46:46 crc kubenswrapper[4743]: I1011 01:46:46.282278 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gggb6"] Oct 11 01:46:47 crc kubenswrapper[4743]: I1011 01:46:47.205342 4743 generic.go:334] "Generic (PLEG): container finished" podID="30185ee3-9bba-4abf-aeec-e7f3f3ec7475" containerID="bb822839e4b7b4907b3f3cf6010a757a2465c6000284afe89f5eafb7da6c9765" exitCode=0 Oct 11 01:46:47 crc kubenswrapper[4743]: I1011 01:46:47.205731 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gggb6" event={"ID":"30185ee3-9bba-4abf-aeec-e7f3f3ec7475","Type":"ContainerDied","Data":"bb822839e4b7b4907b3f3cf6010a757a2465c6000284afe89f5eafb7da6c9765"} Oct 11 01:46:47 crc kubenswrapper[4743]: I1011 01:46:47.205780 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gggb6" event={"ID":"30185ee3-9bba-4abf-aeec-e7f3f3ec7475","Type":"ContainerStarted","Data":"89ad4b717dcca4fa3d793f1e259a7c34840fa95b597bd159237df6801ff8d25e"} Oct 11 01:46:49 crc kubenswrapper[4743]: I1011 01:46:49.225848 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gggb6" event={"ID":"30185ee3-9bba-4abf-aeec-e7f3f3ec7475","Type":"ContainerStarted","Data":"e475fd43cbca75d55e0426433a43d72e5400ab2f9db5ccf75439c3cf294a8987"} Oct 11 01:46:50 crc kubenswrapper[4743]: I1011 01:46:50.246762 4743 generic.go:334] "Generic (PLEG): container finished" podID="30185ee3-9bba-4abf-aeec-e7f3f3ec7475" containerID="e475fd43cbca75d55e0426433a43d72e5400ab2f9db5ccf75439c3cf294a8987" exitCode=0 Oct 11 01:46:50 crc kubenswrapper[4743]: I1011 01:46:50.247008 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gggb6" 
event={"ID":"30185ee3-9bba-4abf-aeec-e7f3f3ec7475","Type":"ContainerDied","Data":"e475fd43cbca75d55e0426433a43d72e5400ab2f9db5ccf75439c3cf294a8987"} Oct 11 01:46:51 crc kubenswrapper[4743]: I1011 01:46:51.256067 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gggb6" event={"ID":"30185ee3-9bba-4abf-aeec-e7f3f3ec7475","Type":"ContainerStarted","Data":"eda16c735183e4df25310c008fa1d7fb494adc2f4cc838d568c41f037d8292e7"} Oct 11 01:46:51 crc kubenswrapper[4743]: I1011 01:46:51.280512 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gggb6" podStartSLOduration=2.6319320360000003 podStartE2EDuration="6.280493706s" podCreationTimestamp="2025-10-11 01:46:45 +0000 UTC" firstStartedPulling="2025-10-11 01:46:47.215067855 +0000 UTC m=+3301.868048262" lastFinishedPulling="2025-10-11 01:46:50.863629535 +0000 UTC m=+3305.516609932" observedRunningTime="2025-10-11 01:46:51.273891812 +0000 UTC m=+3305.926872209" watchObservedRunningTime="2025-10-11 01:46:51.280493706 +0000 UTC m=+3305.933474103" Oct 11 01:46:55 crc kubenswrapper[4743]: I1011 01:46:55.772611 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gggb6" Oct 11 01:46:55 crc kubenswrapper[4743]: I1011 01:46:55.775079 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gggb6" Oct 11 01:46:56 crc kubenswrapper[4743]: I1011 01:46:56.856489 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gggb6" podUID="30185ee3-9bba-4abf-aeec-e7f3f3ec7475" containerName="registry-server" probeResult="failure" output=< Oct 11 01:46:56 crc kubenswrapper[4743]: timeout: failed to connect service ":50051" within 1s Oct 11 01:46:56 crc kubenswrapper[4743]: > Oct 11 01:47:05 crc kubenswrapper[4743]: I1011 01:47:05.849766 4743 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gggb6" Oct 11 01:47:05 crc kubenswrapper[4743]: I1011 01:47:05.931675 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gggb6" Oct 11 01:47:06 crc kubenswrapper[4743]: I1011 01:47:06.116633 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gggb6"] Oct 11 01:47:07 crc kubenswrapper[4743]: I1011 01:47:07.459066 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gggb6" podUID="30185ee3-9bba-4abf-aeec-e7f3f3ec7475" containerName="registry-server" containerID="cri-o://eda16c735183e4df25310c008fa1d7fb494adc2f4cc838d568c41f037d8292e7" gracePeriod=2 Oct 11 01:47:08 crc kubenswrapper[4743]: I1011 01:47:08.014630 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gggb6" Oct 11 01:47:08 crc kubenswrapper[4743]: I1011 01:47:08.111395 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30185ee3-9bba-4abf-aeec-e7f3f3ec7475-utilities\") pod \"30185ee3-9bba-4abf-aeec-e7f3f3ec7475\" (UID: \"30185ee3-9bba-4abf-aeec-e7f3f3ec7475\") " Oct 11 01:47:08 crc kubenswrapper[4743]: I1011 01:47:08.111494 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30185ee3-9bba-4abf-aeec-e7f3f3ec7475-catalog-content\") pod \"30185ee3-9bba-4abf-aeec-e7f3f3ec7475\" (UID: \"30185ee3-9bba-4abf-aeec-e7f3f3ec7475\") " Oct 11 01:47:08 crc kubenswrapper[4743]: I1011 01:47:08.111558 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nncv7\" (UniqueName: \"kubernetes.io/projected/30185ee3-9bba-4abf-aeec-e7f3f3ec7475-kube-api-access-nncv7\") pod 
\"30185ee3-9bba-4abf-aeec-e7f3f3ec7475\" (UID: \"30185ee3-9bba-4abf-aeec-e7f3f3ec7475\") " Oct 11 01:47:08 crc kubenswrapper[4743]: I1011 01:47:08.113069 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30185ee3-9bba-4abf-aeec-e7f3f3ec7475-utilities" (OuterVolumeSpecName: "utilities") pod "30185ee3-9bba-4abf-aeec-e7f3f3ec7475" (UID: "30185ee3-9bba-4abf-aeec-e7f3f3ec7475"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:47:08 crc kubenswrapper[4743]: I1011 01:47:08.118020 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30185ee3-9bba-4abf-aeec-e7f3f3ec7475-kube-api-access-nncv7" (OuterVolumeSpecName: "kube-api-access-nncv7") pod "30185ee3-9bba-4abf-aeec-e7f3f3ec7475" (UID: "30185ee3-9bba-4abf-aeec-e7f3f3ec7475"). InnerVolumeSpecName "kube-api-access-nncv7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:47:08 crc kubenswrapper[4743]: I1011 01:47:08.201840 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30185ee3-9bba-4abf-aeec-e7f3f3ec7475-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "30185ee3-9bba-4abf-aeec-e7f3f3ec7475" (UID: "30185ee3-9bba-4abf-aeec-e7f3f3ec7475"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:47:08 crc kubenswrapper[4743]: I1011 01:47:08.213784 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30185ee3-9bba-4abf-aeec-e7f3f3ec7475-utilities\") on node \"crc\" DevicePath \"\"" Oct 11 01:47:08 crc kubenswrapper[4743]: I1011 01:47:08.214074 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30185ee3-9bba-4abf-aeec-e7f3f3ec7475-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 11 01:47:08 crc kubenswrapper[4743]: I1011 01:47:08.214085 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nncv7\" (UniqueName: \"kubernetes.io/projected/30185ee3-9bba-4abf-aeec-e7f3f3ec7475-kube-api-access-nncv7\") on node \"crc\" DevicePath \"\"" Oct 11 01:47:08 crc kubenswrapper[4743]: I1011 01:47:08.470852 4743 generic.go:334] "Generic (PLEG): container finished" podID="30185ee3-9bba-4abf-aeec-e7f3f3ec7475" containerID="eda16c735183e4df25310c008fa1d7fb494adc2f4cc838d568c41f037d8292e7" exitCode=0 Oct 11 01:47:08 crc kubenswrapper[4743]: I1011 01:47:08.470911 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gggb6" event={"ID":"30185ee3-9bba-4abf-aeec-e7f3f3ec7475","Type":"ContainerDied","Data":"eda16c735183e4df25310c008fa1d7fb494adc2f4cc838d568c41f037d8292e7"} Oct 11 01:47:08 crc kubenswrapper[4743]: I1011 01:47:08.470934 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gggb6" event={"ID":"30185ee3-9bba-4abf-aeec-e7f3f3ec7475","Type":"ContainerDied","Data":"89ad4b717dcca4fa3d793f1e259a7c34840fa95b597bd159237df6801ff8d25e"} Oct 11 01:47:08 crc kubenswrapper[4743]: I1011 01:47:08.470950 4743 scope.go:117] "RemoveContainer" containerID="eda16c735183e4df25310c008fa1d7fb494adc2f4cc838d568c41f037d8292e7" Oct 11 01:47:08 crc kubenswrapper[4743]: I1011 01:47:08.471058 
4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gggb6" Oct 11 01:47:08 crc kubenswrapper[4743]: I1011 01:47:08.508506 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gggb6"] Oct 11 01:47:08 crc kubenswrapper[4743]: I1011 01:47:08.516970 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gggb6"] Oct 11 01:47:08 crc kubenswrapper[4743]: I1011 01:47:08.517178 4743 scope.go:117] "RemoveContainer" containerID="e475fd43cbca75d55e0426433a43d72e5400ab2f9db5ccf75439c3cf294a8987" Oct 11 01:47:08 crc kubenswrapper[4743]: I1011 01:47:08.543849 4743 scope.go:117] "RemoveContainer" containerID="bb822839e4b7b4907b3f3cf6010a757a2465c6000284afe89f5eafb7da6c9765" Oct 11 01:47:08 crc kubenswrapper[4743]: I1011 01:47:08.588994 4743 scope.go:117] "RemoveContainer" containerID="eda16c735183e4df25310c008fa1d7fb494adc2f4cc838d568c41f037d8292e7" Oct 11 01:47:08 crc kubenswrapper[4743]: E1011 01:47:08.589473 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eda16c735183e4df25310c008fa1d7fb494adc2f4cc838d568c41f037d8292e7\": container with ID starting with eda16c735183e4df25310c008fa1d7fb494adc2f4cc838d568c41f037d8292e7 not found: ID does not exist" containerID="eda16c735183e4df25310c008fa1d7fb494adc2f4cc838d568c41f037d8292e7" Oct 11 01:47:08 crc kubenswrapper[4743]: I1011 01:47:08.589502 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eda16c735183e4df25310c008fa1d7fb494adc2f4cc838d568c41f037d8292e7"} err="failed to get container status \"eda16c735183e4df25310c008fa1d7fb494adc2f4cc838d568c41f037d8292e7\": rpc error: code = NotFound desc = could not find container \"eda16c735183e4df25310c008fa1d7fb494adc2f4cc838d568c41f037d8292e7\": container with ID starting with 
eda16c735183e4df25310c008fa1d7fb494adc2f4cc838d568c41f037d8292e7 not found: ID does not exist" Oct 11 01:47:08 crc kubenswrapper[4743]: I1011 01:47:08.589522 4743 scope.go:117] "RemoveContainer" containerID="e475fd43cbca75d55e0426433a43d72e5400ab2f9db5ccf75439c3cf294a8987" Oct 11 01:47:08 crc kubenswrapper[4743]: E1011 01:47:08.589782 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e475fd43cbca75d55e0426433a43d72e5400ab2f9db5ccf75439c3cf294a8987\": container with ID starting with e475fd43cbca75d55e0426433a43d72e5400ab2f9db5ccf75439c3cf294a8987 not found: ID does not exist" containerID="e475fd43cbca75d55e0426433a43d72e5400ab2f9db5ccf75439c3cf294a8987" Oct 11 01:47:08 crc kubenswrapper[4743]: I1011 01:47:08.589798 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e475fd43cbca75d55e0426433a43d72e5400ab2f9db5ccf75439c3cf294a8987"} err="failed to get container status \"e475fd43cbca75d55e0426433a43d72e5400ab2f9db5ccf75439c3cf294a8987\": rpc error: code = NotFound desc = could not find container \"e475fd43cbca75d55e0426433a43d72e5400ab2f9db5ccf75439c3cf294a8987\": container with ID starting with e475fd43cbca75d55e0426433a43d72e5400ab2f9db5ccf75439c3cf294a8987 not found: ID does not exist" Oct 11 01:47:08 crc kubenswrapper[4743]: I1011 01:47:08.589812 4743 scope.go:117] "RemoveContainer" containerID="bb822839e4b7b4907b3f3cf6010a757a2465c6000284afe89f5eafb7da6c9765" Oct 11 01:47:08 crc kubenswrapper[4743]: E1011 01:47:08.590103 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb822839e4b7b4907b3f3cf6010a757a2465c6000284afe89f5eafb7da6c9765\": container with ID starting with bb822839e4b7b4907b3f3cf6010a757a2465c6000284afe89f5eafb7da6c9765 not found: ID does not exist" containerID="bb822839e4b7b4907b3f3cf6010a757a2465c6000284afe89f5eafb7da6c9765" Oct 11 01:47:08 crc 
kubenswrapper[4743]: I1011 01:47:08.590127 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb822839e4b7b4907b3f3cf6010a757a2465c6000284afe89f5eafb7da6c9765"} err="failed to get container status \"bb822839e4b7b4907b3f3cf6010a757a2465c6000284afe89f5eafb7da6c9765\": rpc error: code = NotFound desc = could not find container \"bb822839e4b7b4907b3f3cf6010a757a2465c6000284afe89f5eafb7da6c9765\": container with ID starting with bb822839e4b7b4907b3f3cf6010a757a2465c6000284afe89f5eafb7da6c9765 not found: ID does not exist" Oct 11 01:47:10 crc kubenswrapper[4743]: I1011 01:47:10.105197 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30185ee3-9bba-4abf-aeec-e7f3f3ec7475" path="/var/lib/kubelet/pods/30185ee3-9bba-4abf-aeec-e7f3f3ec7475/volumes" Oct 11 01:47:14 crc kubenswrapper[4743]: I1011 01:47:14.458370 4743 patch_prober.go:28] interesting pod/machine-config-daemon-cvm72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 01:47:14 crc kubenswrapper[4743]: I1011 01:47:14.459009 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 01:47:24 crc kubenswrapper[4743]: I1011 01:47:24.633597 4743 generic.go:334] "Generic (PLEG): container finished" podID="5ce1ff59-69f9-466b-926d-4785eb4df84f" containerID="35d8a0a5b9a85e3ad1fea157a31fdbc0fa1bad79b8b3c3f043c6f07ed266c328" exitCode=0 Oct 11 01:47:24 crc kubenswrapper[4743]: I1011 01:47:24.633710 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-j85dc" event={"ID":"5ce1ff59-69f9-466b-926d-4785eb4df84f","Type":"ContainerDied","Data":"35d8a0a5b9a85e3ad1fea157a31fdbc0fa1bad79b8b3c3f043c6f07ed266c328"} Oct 11 01:47:26 crc kubenswrapper[4743]: I1011 01:47:26.138936 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-j85dc" Oct 11 01:47:26 crc kubenswrapper[4743]: I1011 01:47:26.250119 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44bsx\" (UniqueName: \"kubernetes.io/projected/5ce1ff59-69f9-466b-926d-4785eb4df84f-kube-api-access-44bsx\") pod \"5ce1ff59-69f9-466b-926d-4785eb4df84f\" (UID: \"5ce1ff59-69f9-466b-926d-4785eb4df84f\") " Oct 11 01:47:26 crc kubenswrapper[4743]: I1011 01:47:26.250155 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5ce1ff59-69f9-466b-926d-4785eb4df84f-inventory\") pod \"5ce1ff59-69f9-466b-926d-4785eb4df84f\" (UID: \"5ce1ff59-69f9-466b-926d-4785eb4df84f\") " Oct 11 01:47:26 crc kubenswrapper[4743]: I1011 01:47:26.250242 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5ce1ff59-69f9-466b-926d-4785eb4df84f-ssh-key\") pod \"5ce1ff59-69f9-466b-926d-4785eb4df84f\" (UID: \"5ce1ff59-69f9-466b-926d-4785eb4df84f\") " Oct 11 01:47:26 crc kubenswrapper[4743]: I1011 01:47:26.250328 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ce1ff59-69f9-466b-926d-4785eb4df84f-bootstrap-combined-ca-bundle\") pod \"5ce1ff59-69f9-466b-926d-4785eb4df84f\" (UID: \"5ce1ff59-69f9-466b-926d-4785eb4df84f\") " Oct 11 01:47:26 crc kubenswrapper[4743]: I1011 01:47:26.250424 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5ce1ff59-69f9-466b-926d-4785eb4df84f-ceph\") pod \"5ce1ff59-69f9-466b-926d-4785eb4df84f\" (UID: \"5ce1ff59-69f9-466b-926d-4785eb4df84f\") " Oct 11 01:47:26 crc kubenswrapper[4743]: I1011 01:47:26.256145 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ce1ff59-69f9-466b-926d-4785eb4df84f-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "5ce1ff59-69f9-466b-926d-4785eb4df84f" (UID: "5ce1ff59-69f9-466b-926d-4785eb4df84f"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:47:26 crc kubenswrapper[4743]: I1011 01:47:26.256239 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ce1ff59-69f9-466b-926d-4785eb4df84f-ceph" (OuterVolumeSpecName: "ceph") pod "5ce1ff59-69f9-466b-926d-4785eb4df84f" (UID: "5ce1ff59-69f9-466b-926d-4785eb4df84f"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:47:26 crc kubenswrapper[4743]: I1011 01:47:26.260258 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ce1ff59-69f9-466b-926d-4785eb4df84f-kube-api-access-44bsx" (OuterVolumeSpecName: "kube-api-access-44bsx") pod "5ce1ff59-69f9-466b-926d-4785eb4df84f" (UID: "5ce1ff59-69f9-466b-926d-4785eb4df84f"). InnerVolumeSpecName "kube-api-access-44bsx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:47:26 crc kubenswrapper[4743]: I1011 01:47:26.279579 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ce1ff59-69f9-466b-926d-4785eb4df84f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5ce1ff59-69f9-466b-926d-4785eb4df84f" (UID: "5ce1ff59-69f9-466b-926d-4785eb4df84f"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:47:26 crc kubenswrapper[4743]: I1011 01:47:26.281443 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ce1ff59-69f9-466b-926d-4785eb4df84f-inventory" (OuterVolumeSpecName: "inventory") pod "5ce1ff59-69f9-466b-926d-4785eb4df84f" (UID: "5ce1ff59-69f9-466b-926d-4785eb4df84f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:47:26 crc kubenswrapper[4743]: I1011 01:47:26.352603 4743 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ce1ff59-69f9-466b-926d-4785eb4df84f-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 01:47:26 crc kubenswrapper[4743]: I1011 01:47:26.352641 4743 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5ce1ff59-69f9-466b-926d-4785eb4df84f-ceph\") on node \"crc\" DevicePath \"\"" Oct 11 01:47:26 crc kubenswrapper[4743]: I1011 01:47:26.352655 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44bsx\" (UniqueName: \"kubernetes.io/projected/5ce1ff59-69f9-466b-926d-4785eb4df84f-kube-api-access-44bsx\") on node \"crc\" DevicePath \"\"" Oct 11 01:47:26 crc kubenswrapper[4743]: I1011 01:47:26.352668 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5ce1ff59-69f9-466b-926d-4785eb4df84f-inventory\") on node \"crc\" DevicePath \"\"" Oct 11 01:47:26 crc kubenswrapper[4743]: I1011 01:47:26.352681 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5ce1ff59-69f9-466b-926d-4785eb4df84f-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 11 01:47:26 crc kubenswrapper[4743]: I1011 01:47:26.657043 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-j85dc" 
event={"ID":"5ce1ff59-69f9-466b-926d-4785eb4df84f","Type":"ContainerDied","Data":"b5b29069e51926aa719e9ba456d7b6b3d0c8bb1c10b0c03c1c3b24ceaea0d8fa"} Oct 11 01:47:26 crc kubenswrapper[4743]: I1011 01:47:26.657098 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b5b29069e51926aa719e9ba456d7b6b3d0c8bb1c10b0c03c1c3b24ceaea0d8fa" Oct 11 01:47:26 crc kubenswrapper[4743]: I1011 01:47:26.657175 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-j85dc" Oct 11 01:47:26 crc kubenswrapper[4743]: I1011 01:47:26.797806 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hc4vr"] Oct 11 01:47:26 crc kubenswrapper[4743]: E1011 01:47:26.798337 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ce1ff59-69f9-466b-926d-4785eb4df84f" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 11 01:47:26 crc kubenswrapper[4743]: I1011 01:47:26.798362 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ce1ff59-69f9-466b-926d-4785eb4df84f" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 11 01:47:26 crc kubenswrapper[4743]: E1011 01:47:26.798384 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30185ee3-9bba-4abf-aeec-e7f3f3ec7475" containerName="registry-server" Oct 11 01:47:26 crc kubenswrapper[4743]: I1011 01:47:26.798393 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="30185ee3-9bba-4abf-aeec-e7f3f3ec7475" containerName="registry-server" Oct 11 01:47:26 crc kubenswrapper[4743]: E1011 01:47:26.798414 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30185ee3-9bba-4abf-aeec-e7f3f3ec7475" containerName="extract-content" Oct 11 01:47:26 crc kubenswrapper[4743]: I1011 01:47:26.798424 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="30185ee3-9bba-4abf-aeec-e7f3f3ec7475" 
containerName="extract-content" Oct 11 01:47:26 crc kubenswrapper[4743]: E1011 01:47:26.798469 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30185ee3-9bba-4abf-aeec-e7f3f3ec7475" containerName="extract-utilities" Oct 11 01:47:26 crc kubenswrapper[4743]: I1011 01:47:26.798478 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="30185ee3-9bba-4abf-aeec-e7f3f3ec7475" containerName="extract-utilities" Oct 11 01:47:26 crc kubenswrapper[4743]: I1011 01:47:26.798711 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="30185ee3-9bba-4abf-aeec-e7f3f3ec7475" containerName="registry-server" Oct 11 01:47:26 crc kubenswrapper[4743]: I1011 01:47:26.798740 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ce1ff59-69f9-466b-926d-4785eb4df84f" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 11 01:47:26 crc kubenswrapper[4743]: I1011 01:47:26.799688 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hc4vr" Oct 11 01:47:26 crc kubenswrapper[4743]: I1011 01:47:26.802957 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 11 01:47:26 crc kubenswrapper[4743]: I1011 01:47:26.803187 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 11 01:47:26 crc kubenswrapper[4743]: I1011 01:47:26.803437 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8vmmn" Oct 11 01:47:26 crc kubenswrapper[4743]: I1011 01:47:26.803606 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 11 01:47:26 crc kubenswrapper[4743]: I1011 01:47:26.804311 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 11 01:47:26 crc 
kubenswrapper[4743]: I1011 01:47:26.816628 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hc4vr"] Oct 11 01:47:26 crc kubenswrapper[4743]: I1011 01:47:26.965122 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5205ba97-c5be-49b8-a4a6-2570d1b602d2-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hc4vr\" (UID: \"5205ba97-c5be-49b8-a4a6-2570d1b602d2\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hc4vr" Oct 11 01:47:26 crc kubenswrapper[4743]: I1011 01:47:26.965265 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5205ba97-c5be-49b8-a4a6-2570d1b602d2-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hc4vr\" (UID: \"5205ba97-c5be-49b8-a4a6-2570d1b602d2\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hc4vr" Oct 11 01:47:26 crc kubenswrapper[4743]: I1011 01:47:26.965613 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bj4mz\" (UniqueName: \"kubernetes.io/projected/5205ba97-c5be-49b8-a4a6-2570d1b602d2-kube-api-access-bj4mz\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hc4vr\" (UID: \"5205ba97-c5be-49b8-a4a6-2570d1b602d2\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hc4vr" Oct 11 01:47:26 crc kubenswrapper[4743]: I1011 01:47:26.965809 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5205ba97-c5be-49b8-a4a6-2570d1b602d2-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hc4vr\" (UID: \"5205ba97-c5be-49b8-a4a6-2570d1b602d2\") " 
pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hc4vr" Oct 11 01:47:27 crc kubenswrapper[4743]: I1011 01:47:27.068111 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5205ba97-c5be-49b8-a4a6-2570d1b602d2-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hc4vr\" (UID: \"5205ba97-c5be-49b8-a4a6-2570d1b602d2\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hc4vr" Oct 11 01:47:27 crc kubenswrapper[4743]: I1011 01:47:27.068663 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bj4mz\" (UniqueName: \"kubernetes.io/projected/5205ba97-c5be-49b8-a4a6-2570d1b602d2-kube-api-access-bj4mz\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hc4vr\" (UID: \"5205ba97-c5be-49b8-a4a6-2570d1b602d2\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hc4vr" Oct 11 01:47:27 crc kubenswrapper[4743]: I1011 01:47:27.068946 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5205ba97-c5be-49b8-a4a6-2570d1b602d2-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hc4vr\" (UID: \"5205ba97-c5be-49b8-a4a6-2570d1b602d2\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hc4vr" Oct 11 01:47:27 crc kubenswrapper[4743]: I1011 01:47:27.069154 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5205ba97-c5be-49b8-a4a6-2570d1b602d2-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hc4vr\" (UID: \"5205ba97-c5be-49b8-a4a6-2570d1b602d2\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hc4vr" Oct 11 01:47:27 crc kubenswrapper[4743]: I1011 01:47:27.074084 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/5205ba97-c5be-49b8-a4a6-2570d1b602d2-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hc4vr\" (UID: \"5205ba97-c5be-49b8-a4a6-2570d1b602d2\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hc4vr" Oct 11 01:47:27 crc kubenswrapper[4743]: I1011 01:47:27.074408 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5205ba97-c5be-49b8-a4a6-2570d1b602d2-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hc4vr\" (UID: \"5205ba97-c5be-49b8-a4a6-2570d1b602d2\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hc4vr" Oct 11 01:47:27 crc kubenswrapper[4743]: I1011 01:47:27.079503 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5205ba97-c5be-49b8-a4a6-2570d1b602d2-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hc4vr\" (UID: \"5205ba97-c5be-49b8-a4a6-2570d1b602d2\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hc4vr" Oct 11 01:47:27 crc kubenswrapper[4743]: I1011 01:47:27.086316 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bj4mz\" (UniqueName: \"kubernetes.io/projected/5205ba97-c5be-49b8-a4a6-2570d1b602d2-kube-api-access-bj4mz\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hc4vr\" (UID: \"5205ba97-c5be-49b8-a4a6-2570d1b602d2\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hc4vr" Oct 11 01:47:27 crc kubenswrapper[4743]: I1011 01:47:27.115570 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hc4vr" Oct 11 01:47:27 crc kubenswrapper[4743]: I1011 01:47:27.765279 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hc4vr"] Oct 11 01:47:27 crc kubenswrapper[4743]: W1011 01:47:27.771218 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5205ba97_c5be_49b8_a4a6_2570d1b602d2.slice/crio-280c2c89df1807292244b2a6abce6b7774c3e9ae61332658cb7ebd3fa7a7dc16 WatchSource:0}: Error finding container 280c2c89df1807292244b2a6abce6b7774c3e9ae61332658cb7ebd3fa7a7dc16: Status 404 returned error can't find the container with id 280c2c89df1807292244b2a6abce6b7774c3e9ae61332658cb7ebd3fa7a7dc16 Oct 11 01:47:28 crc kubenswrapper[4743]: I1011 01:47:28.681300 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hc4vr" event={"ID":"5205ba97-c5be-49b8-a4a6-2570d1b602d2","Type":"ContainerStarted","Data":"6ebec36cf72ce313b2e98f65e0dd89d5e2d8679f027dc3a5cdd2a8313c925529"} Oct 11 01:47:28 crc kubenswrapper[4743]: I1011 01:47:28.681623 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hc4vr" event={"ID":"5205ba97-c5be-49b8-a4a6-2570d1b602d2","Type":"ContainerStarted","Data":"280c2c89df1807292244b2a6abce6b7774c3e9ae61332658cb7ebd3fa7a7dc16"} Oct 11 01:47:28 crc kubenswrapper[4743]: I1011 01:47:28.698788 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hc4vr" podStartSLOduration=2.079097285 podStartE2EDuration="2.69876577s" podCreationTimestamp="2025-10-11 01:47:26 +0000 UTC" firstStartedPulling="2025-10-11 01:47:27.776337793 +0000 UTC m=+3342.429318190" lastFinishedPulling="2025-10-11 01:47:28.396006278 +0000 
UTC m=+3343.048986675" observedRunningTime="2025-10-11 01:47:28.697707593 +0000 UTC m=+3343.350688000" watchObservedRunningTime="2025-10-11 01:47:28.69876577 +0000 UTC m=+3343.351746177" Oct 11 01:47:44 crc kubenswrapper[4743]: I1011 01:47:44.458252 4743 patch_prober.go:28] interesting pod/machine-config-daemon-cvm72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 01:47:44 crc kubenswrapper[4743]: I1011 01:47:44.459251 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 01:48:03 crc kubenswrapper[4743]: I1011 01:48:03.174893 4743 generic.go:334] "Generic (PLEG): container finished" podID="5205ba97-c5be-49b8-a4a6-2570d1b602d2" containerID="6ebec36cf72ce313b2e98f65e0dd89d5e2d8679f027dc3a5cdd2a8313c925529" exitCode=0 Oct 11 01:48:03 crc kubenswrapper[4743]: I1011 01:48:03.174967 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hc4vr" event={"ID":"5205ba97-c5be-49b8-a4a6-2570d1b602d2","Type":"ContainerDied","Data":"6ebec36cf72ce313b2e98f65e0dd89d5e2d8679f027dc3a5cdd2a8313c925529"} Oct 11 01:48:04 crc kubenswrapper[4743]: I1011 01:48:04.614935 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hc4vr" Oct 11 01:48:04 crc kubenswrapper[4743]: I1011 01:48:04.782378 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5205ba97-c5be-49b8-a4a6-2570d1b602d2-inventory\") pod \"5205ba97-c5be-49b8-a4a6-2570d1b602d2\" (UID: \"5205ba97-c5be-49b8-a4a6-2570d1b602d2\") " Oct 11 01:48:04 crc kubenswrapper[4743]: I1011 01:48:04.782597 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5205ba97-c5be-49b8-a4a6-2570d1b602d2-ceph\") pod \"5205ba97-c5be-49b8-a4a6-2570d1b602d2\" (UID: \"5205ba97-c5be-49b8-a4a6-2570d1b602d2\") " Oct 11 01:48:04 crc kubenswrapper[4743]: I1011 01:48:04.782644 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5205ba97-c5be-49b8-a4a6-2570d1b602d2-ssh-key\") pod \"5205ba97-c5be-49b8-a4a6-2570d1b602d2\" (UID: \"5205ba97-c5be-49b8-a4a6-2570d1b602d2\") " Oct 11 01:48:04 crc kubenswrapper[4743]: I1011 01:48:04.782665 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bj4mz\" (UniqueName: \"kubernetes.io/projected/5205ba97-c5be-49b8-a4a6-2570d1b602d2-kube-api-access-bj4mz\") pod \"5205ba97-c5be-49b8-a4a6-2570d1b602d2\" (UID: \"5205ba97-c5be-49b8-a4a6-2570d1b602d2\") " Oct 11 01:48:04 crc kubenswrapper[4743]: I1011 01:48:04.789119 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5205ba97-c5be-49b8-a4a6-2570d1b602d2-kube-api-access-bj4mz" (OuterVolumeSpecName: "kube-api-access-bj4mz") pod "5205ba97-c5be-49b8-a4a6-2570d1b602d2" (UID: "5205ba97-c5be-49b8-a4a6-2570d1b602d2"). InnerVolumeSpecName "kube-api-access-bj4mz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:48:04 crc kubenswrapper[4743]: I1011 01:48:04.795000 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5205ba97-c5be-49b8-a4a6-2570d1b602d2-ceph" (OuterVolumeSpecName: "ceph") pod "5205ba97-c5be-49b8-a4a6-2570d1b602d2" (UID: "5205ba97-c5be-49b8-a4a6-2570d1b602d2"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:48:04 crc kubenswrapper[4743]: I1011 01:48:04.811537 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5205ba97-c5be-49b8-a4a6-2570d1b602d2-inventory" (OuterVolumeSpecName: "inventory") pod "5205ba97-c5be-49b8-a4a6-2570d1b602d2" (UID: "5205ba97-c5be-49b8-a4a6-2570d1b602d2"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:48:04 crc kubenswrapper[4743]: I1011 01:48:04.829260 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5205ba97-c5be-49b8-a4a6-2570d1b602d2-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5205ba97-c5be-49b8-a4a6-2570d1b602d2" (UID: "5205ba97-c5be-49b8-a4a6-2570d1b602d2"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:48:04 crc kubenswrapper[4743]: I1011 01:48:04.886078 4743 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5205ba97-c5be-49b8-a4a6-2570d1b602d2-ceph\") on node \"crc\" DevicePath \"\"" Oct 11 01:48:04 crc kubenswrapper[4743]: I1011 01:48:04.886343 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5205ba97-c5be-49b8-a4a6-2570d1b602d2-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 11 01:48:04 crc kubenswrapper[4743]: I1011 01:48:04.886356 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bj4mz\" (UniqueName: \"kubernetes.io/projected/5205ba97-c5be-49b8-a4a6-2570d1b602d2-kube-api-access-bj4mz\") on node \"crc\" DevicePath \"\"" Oct 11 01:48:04 crc kubenswrapper[4743]: I1011 01:48:04.886366 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5205ba97-c5be-49b8-a4a6-2570d1b602d2-inventory\") on node \"crc\" DevicePath \"\"" Oct 11 01:48:05 crc kubenswrapper[4743]: I1011 01:48:05.199174 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hc4vr" event={"ID":"5205ba97-c5be-49b8-a4a6-2570d1b602d2","Type":"ContainerDied","Data":"280c2c89df1807292244b2a6abce6b7774c3e9ae61332658cb7ebd3fa7a7dc16"} Oct 11 01:48:05 crc kubenswrapper[4743]: I1011 01:48:05.199242 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="280c2c89df1807292244b2a6abce6b7774c3e9ae61332658cb7ebd3fa7a7dc16" Oct 11 01:48:05 crc kubenswrapper[4743]: I1011 01:48:05.199326 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hc4vr" Oct 11 01:48:05 crc kubenswrapper[4743]: I1011 01:48:05.303840 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x6f6h"] Oct 11 01:48:05 crc kubenswrapper[4743]: E1011 01:48:05.304350 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5205ba97-c5be-49b8-a4a6-2570d1b602d2" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 11 01:48:05 crc kubenswrapper[4743]: I1011 01:48:05.304368 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="5205ba97-c5be-49b8-a4a6-2570d1b602d2" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 11 01:48:05 crc kubenswrapper[4743]: I1011 01:48:05.304584 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="5205ba97-c5be-49b8-a4a6-2570d1b602d2" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 11 01:48:05 crc kubenswrapper[4743]: I1011 01:48:05.305354 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x6f6h" Oct 11 01:48:05 crc kubenswrapper[4743]: I1011 01:48:05.310767 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 11 01:48:05 crc kubenswrapper[4743]: I1011 01:48:05.310882 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 11 01:48:05 crc kubenswrapper[4743]: I1011 01:48:05.310884 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 11 01:48:05 crc kubenswrapper[4743]: I1011 01:48:05.311086 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 11 01:48:05 crc kubenswrapper[4743]: I1011 01:48:05.311162 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8vmmn" Oct 11 01:48:05 crc kubenswrapper[4743]: I1011 01:48:05.315533 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x6f6h"] Oct 11 01:48:05 crc kubenswrapper[4743]: I1011 01:48:05.394761 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cb69ff06-2c84-40a1-805b-349c4fbfe3ba-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-x6f6h\" (UID: \"cb69ff06-2c84-40a1-805b-349c4fbfe3ba\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x6f6h" Oct 11 01:48:05 crc kubenswrapper[4743]: I1011 01:48:05.395322 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cb69ff06-2c84-40a1-805b-349c4fbfe3ba-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-x6f6h\" (UID: \"cb69ff06-2c84-40a1-805b-349c4fbfe3ba\") " 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x6f6h" Oct 11 01:48:05 crc kubenswrapper[4743]: I1011 01:48:05.395390 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cb69ff06-2c84-40a1-805b-349c4fbfe3ba-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-x6f6h\" (UID: \"cb69ff06-2c84-40a1-805b-349c4fbfe3ba\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x6f6h" Oct 11 01:48:05 crc kubenswrapper[4743]: I1011 01:48:05.395835 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npwxq\" (UniqueName: \"kubernetes.io/projected/cb69ff06-2c84-40a1-805b-349c4fbfe3ba-kube-api-access-npwxq\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-x6f6h\" (UID: \"cb69ff06-2c84-40a1-805b-349c4fbfe3ba\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x6f6h" Oct 11 01:48:05 crc kubenswrapper[4743]: I1011 01:48:05.497565 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cb69ff06-2c84-40a1-805b-349c4fbfe3ba-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-x6f6h\" (UID: \"cb69ff06-2c84-40a1-805b-349c4fbfe3ba\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x6f6h" Oct 11 01:48:05 crc kubenswrapper[4743]: I1011 01:48:05.497618 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cb69ff06-2c84-40a1-805b-349c4fbfe3ba-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-x6f6h\" (UID: \"cb69ff06-2c84-40a1-805b-349c4fbfe3ba\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x6f6h" Oct 11 01:48:05 crc kubenswrapper[4743]: I1011 01:48:05.497796 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-npwxq\" (UniqueName: \"kubernetes.io/projected/cb69ff06-2c84-40a1-805b-349c4fbfe3ba-kube-api-access-npwxq\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-x6f6h\" (UID: \"cb69ff06-2c84-40a1-805b-349c4fbfe3ba\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x6f6h" Oct 11 01:48:05 crc kubenswrapper[4743]: I1011 01:48:05.497842 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cb69ff06-2c84-40a1-805b-349c4fbfe3ba-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-x6f6h\" (UID: \"cb69ff06-2c84-40a1-805b-349c4fbfe3ba\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x6f6h" Oct 11 01:48:05 crc kubenswrapper[4743]: I1011 01:48:05.501945 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cb69ff06-2c84-40a1-805b-349c4fbfe3ba-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-x6f6h\" (UID: \"cb69ff06-2c84-40a1-805b-349c4fbfe3ba\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x6f6h" Oct 11 01:48:05 crc kubenswrapper[4743]: I1011 01:48:05.502742 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cb69ff06-2c84-40a1-805b-349c4fbfe3ba-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-x6f6h\" (UID: \"cb69ff06-2c84-40a1-805b-349c4fbfe3ba\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x6f6h" Oct 11 01:48:05 crc kubenswrapper[4743]: I1011 01:48:05.504055 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cb69ff06-2c84-40a1-805b-349c4fbfe3ba-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-x6f6h\" (UID: \"cb69ff06-2c84-40a1-805b-349c4fbfe3ba\") " 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x6f6h" Oct 11 01:48:05 crc kubenswrapper[4743]: I1011 01:48:05.517764 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npwxq\" (UniqueName: \"kubernetes.io/projected/cb69ff06-2c84-40a1-805b-349c4fbfe3ba-kube-api-access-npwxq\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-x6f6h\" (UID: \"cb69ff06-2c84-40a1-805b-349c4fbfe3ba\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x6f6h" Oct 11 01:48:05 crc kubenswrapper[4743]: I1011 01:48:05.624130 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x6f6h" Oct 11 01:48:06 crc kubenswrapper[4743]: I1011 01:48:06.195603 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x6f6h"] Oct 11 01:48:06 crc kubenswrapper[4743]: W1011 01:48:06.200138 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcb69ff06_2c84_40a1_805b_349c4fbfe3ba.slice/crio-39b6b618a33c586e68e82f9100dea6fbdb688a220a1305154d353104c07063b7 WatchSource:0}: Error finding container 39b6b618a33c586e68e82f9100dea6fbdb688a220a1305154d353104c07063b7: Status 404 returned error can't find the container with id 39b6b618a33c586e68e82f9100dea6fbdb688a220a1305154d353104c07063b7 Oct 11 01:48:06 crc kubenswrapper[4743]: I1011 01:48:06.216063 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x6f6h" event={"ID":"cb69ff06-2c84-40a1-805b-349c4fbfe3ba","Type":"ContainerStarted","Data":"39b6b618a33c586e68e82f9100dea6fbdb688a220a1305154d353104c07063b7"} Oct 11 01:48:07 crc kubenswrapper[4743]: I1011 01:48:07.233241 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x6f6h" event={"ID":"cb69ff06-2c84-40a1-805b-349c4fbfe3ba","Type":"ContainerStarted","Data":"8d45e399241e90c6fd43bc0a7813fc933577d5dc50dbb3b63622fb9fb921ed97"} Oct 11 01:48:07 crc kubenswrapper[4743]: I1011 01:48:07.259197 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x6f6h" podStartSLOduration=1.869011128 podStartE2EDuration="2.259177404s" podCreationTimestamp="2025-10-11 01:48:05 +0000 UTC" firstStartedPulling="2025-10-11 01:48:06.202104849 +0000 UTC m=+3380.855085246" lastFinishedPulling="2025-10-11 01:48:06.592271095 +0000 UTC m=+3381.245251522" observedRunningTime="2025-10-11 01:48:07.257193725 +0000 UTC m=+3381.910174192" watchObservedRunningTime="2025-10-11 01:48:07.259177404 +0000 UTC m=+3381.912157811" Oct 11 01:48:14 crc kubenswrapper[4743]: I1011 01:48:14.337510 4743 generic.go:334] "Generic (PLEG): container finished" podID="cb69ff06-2c84-40a1-805b-349c4fbfe3ba" containerID="8d45e399241e90c6fd43bc0a7813fc933577d5dc50dbb3b63622fb9fb921ed97" exitCode=0 Oct 11 01:48:14 crc kubenswrapper[4743]: I1011 01:48:14.337655 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x6f6h" event={"ID":"cb69ff06-2c84-40a1-805b-349c4fbfe3ba","Type":"ContainerDied","Data":"8d45e399241e90c6fd43bc0a7813fc933577d5dc50dbb3b63622fb9fb921ed97"} Oct 11 01:48:14 crc kubenswrapper[4743]: I1011 01:48:14.458631 4743 patch_prober.go:28] interesting pod/machine-config-daemon-cvm72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 01:48:14 crc kubenswrapper[4743]: I1011 01:48:14.458703 4743 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 01:48:14 crc kubenswrapper[4743]: I1011 01:48:14.458754 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" Oct 11 01:48:14 crc kubenswrapper[4743]: I1011 01:48:14.459916 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"40fabd8824e0053fbc030d07f0d398ca43bf5381e369efae8cdd092e89d91e84"} pod="openshift-machine-config-operator/machine-config-daemon-cvm72" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 11 01:48:14 crc kubenswrapper[4743]: I1011 01:48:14.459995 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" containerID="cri-o://40fabd8824e0053fbc030d07f0d398ca43bf5381e369efae8cdd092e89d91e84" gracePeriod=600 Oct 11 01:48:15 crc kubenswrapper[4743]: I1011 01:48:15.353282 4743 generic.go:334] "Generic (PLEG): container finished" podID="add92263-e252-446b-95de-092585b4357f" containerID="40fabd8824e0053fbc030d07f0d398ca43bf5381e369efae8cdd092e89d91e84" exitCode=0 Oct 11 01:48:15 crc kubenswrapper[4743]: I1011 01:48:15.354220 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" event={"ID":"add92263-e252-446b-95de-092585b4357f","Type":"ContainerDied","Data":"40fabd8824e0053fbc030d07f0d398ca43bf5381e369efae8cdd092e89d91e84"} Oct 11 01:48:15 crc kubenswrapper[4743]: I1011 01:48:15.354262 4743 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" event={"ID":"add92263-e252-446b-95de-092585b4357f","Type":"ContainerStarted","Data":"3f38cba11851c95fa12b6d21a7746bd10b6e69b2d33debe34b378719ccfc7393"} Oct 11 01:48:15 crc kubenswrapper[4743]: I1011 01:48:15.354289 4743 scope.go:117] "RemoveContainer" containerID="978234a1553219d6d97cc060a945d4762c94dc6227d3827ebf03581b02816723" Oct 11 01:48:15 crc kubenswrapper[4743]: I1011 01:48:15.882326 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x6f6h" Oct 11 01:48:15 crc kubenswrapper[4743]: I1011 01:48:15.932388 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cb69ff06-2c84-40a1-805b-349c4fbfe3ba-inventory\") pod \"cb69ff06-2c84-40a1-805b-349c4fbfe3ba\" (UID: \"cb69ff06-2c84-40a1-805b-349c4fbfe3ba\") " Oct 11 01:48:15 crc kubenswrapper[4743]: I1011 01:48:15.932737 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cb69ff06-2c84-40a1-805b-349c4fbfe3ba-ssh-key\") pod \"cb69ff06-2c84-40a1-805b-349c4fbfe3ba\" (UID: \"cb69ff06-2c84-40a1-805b-349c4fbfe3ba\") " Oct 11 01:48:15 crc kubenswrapper[4743]: I1011 01:48:15.933054 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-npwxq\" (UniqueName: \"kubernetes.io/projected/cb69ff06-2c84-40a1-805b-349c4fbfe3ba-kube-api-access-npwxq\") pod \"cb69ff06-2c84-40a1-805b-349c4fbfe3ba\" (UID: \"cb69ff06-2c84-40a1-805b-349c4fbfe3ba\") " Oct 11 01:48:15 crc kubenswrapper[4743]: I1011 01:48:15.933933 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cb69ff06-2c84-40a1-805b-349c4fbfe3ba-ceph\") pod \"cb69ff06-2c84-40a1-805b-349c4fbfe3ba\" (UID: 
\"cb69ff06-2c84-40a1-805b-349c4fbfe3ba\") " Oct 11 01:48:15 crc kubenswrapper[4743]: I1011 01:48:15.938802 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb69ff06-2c84-40a1-805b-349c4fbfe3ba-kube-api-access-npwxq" (OuterVolumeSpecName: "kube-api-access-npwxq") pod "cb69ff06-2c84-40a1-805b-349c4fbfe3ba" (UID: "cb69ff06-2c84-40a1-805b-349c4fbfe3ba"). InnerVolumeSpecName "kube-api-access-npwxq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:48:15 crc kubenswrapper[4743]: I1011 01:48:15.940620 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb69ff06-2c84-40a1-805b-349c4fbfe3ba-ceph" (OuterVolumeSpecName: "ceph") pod "cb69ff06-2c84-40a1-805b-349c4fbfe3ba" (UID: "cb69ff06-2c84-40a1-805b-349c4fbfe3ba"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:48:15 crc kubenswrapper[4743]: I1011 01:48:15.972428 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb69ff06-2c84-40a1-805b-349c4fbfe3ba-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "cb69ff06-2c84-40a1-805b-349c4fbfe3ba" (UID: "cb69ff06-2c84-40a1-805b-349c4fbfe3ba"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:48:15 crc kubenswrapper[4743]: I1011 01:48:15.979029 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb69ff06-2c84-40a1-805b-349c4fbfe3ba-inventory" (OuterVolumeSpecName: "inventory") pod "cb69ff06-2c84-40a1-805b-349c4fbfe3ba" (UID: "cb69ff06-2c84-40a1-805b-349c4fbfe3ba"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:48:16 crc kubenswrapper[4743]: I1011 01:48:16.037498 4743 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cb69ff06-2c84-40a1-805b-349c4fbfe3ba-ceph\") on node \"crc\" DevicePath \"\"" Oct 11 01:48:16 crc kubenswrapper[4743]: I1011 01:48:16.037534 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cb69ff06-2c84-40a1-805b-349c4fbfe3ba-inventory\") on node \"crc\" DevicePath \"\"" Oct 11 01:48:16 crc kubenswrapper[4743]: I1011 01:48:16.037544 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cb69ff06-2c84-40a1-805b-349c4fbfe3ba-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 11 01:48:16 crc kubenswrapper[4743]: I1011 01:48:16.037554 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-npwxq\" (UniqueName: \"kubernetes.io/projected/cb69ff06-2c84-40a1-805b-349c4fbfe3ba-kube-api-access-npwxq\") on node \"crc\" DevicePath \"\"" Oct 11 01:48:16 crc kubenswrapper[4743]: I1011 01:48:16.367733 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x6f6h" Oct 11 01:48:16 crc kubenswrapper[4743]: I1011 01:48:16.367768 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x6f6h" event={"ID":"cb69ff06-2c84-40a1-805b-349c4fbfe3ba","Type":"ContainerDied","Data":"39b6b618a33c586e68e82f9100dea6fbdb688a220a1305154d353104c07063b7"} Oct 11 01:48:16 crc kubenswrapper[4743]: I1011 01:48:16.368268 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="39b6b618a33c586e68e82f9100dea6fbdb688a220a1305154d353104c07063b7" Oct 11 01:48:16 crc kubenswrapper[4743]: I1011 01:48:16.455519 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-lj96c"] Oct 11 01:48:16 crc kubenswrapper[4743]: E1011 01:48:16.456183 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb69ff06-2c84-40a1-805b-349c4fbfe3ba" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 11 01:48:16 crc kubenswrapper[4743]: I1011 01:48:16.456211 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb69ff06-2c84-40a1-805b-349c4fbfe3ba" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 11 01:48:16 crc kubenswrapper[4743]: I1011 01:48:16.456545 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb69ff06-2c84-40a1-805b-349c4fbfe3ba" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 11 01:48:16 crc kubenswrapper[4743]: I1011 01:48:16.457618 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lj96c" Oct 11 01:48:16 crc kubenswrapper[4743]: I1011 01:48:16.466441 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-lj96c"] Oct 11 01:48:16 crc kubenswrapper[4743]: I1011 01:48:16.492773 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 11 01:48:16 crc kubenswrapper[4743]: I1011 01:48:16.492879 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 11 01:48:16 crc kubenswrapper[4743]: I1011 01:48:16.493033 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 11 01:48:16 crc kubenswrapper[4743]: I1011 01:48:16.493107 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 11 01:48:16 crc kubenswrapper[4743]: I1011 01:48:16.500379 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8vmmn" Oct 11 01:48:16 crc kubenswrapper[4743]: I1011 01:48:16.548971 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/09ec7d44-c723-4a16-a24e-d473280d1321-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-lj96c\" (UID: \"09ec7d44-c723-4a16-a24e-d473280d1321\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lj96c" Oct 11 01:48:16 crc kubenswrapper[4743]: I1011 01:48:16.549039 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrvt6\" (UniqueName: \"kubernetes.io/projected/09ec7d44-c723-4a16-a24e-d473280d1321-kube-api-access-lrvt6\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-lj96c\" (UID: \"09ec7d44-c723-4a16-a24e-d473280d1321\") 
" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lj96c" Oct 11 01:48:16 crc kubenswrapper[4743]: I1011 01:48:16.549128 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/09ec7d44-c723-4a16-a24e-d473280d1321-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-lj96c\" (UID: \"09ec7d44-c723-4a16-a24e-d473280d1321\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lj96c" Oct 11 01:48:16 crc kubenswrapper[4743]: I1011 01:48:16.549823 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/09ec7d44-c723-4a16-a24e-d473280d1321-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-lj96c\" (UID: \"09ec7d44-c723-4a16-a24e-d473280d1321\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lj96c" Oct 11 01:48:16 crc kubenswrapper[4743]: I1011 01:48:16.652135 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/09ec7d44-c723-4a16-a24e-d473280d1321-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-lj96c\" (UID: \"09ec7d44-c723-4a16-a24e-d473280d1321\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lj96c" Oct 11 01:48:16 crc kubenswrapper[4743]: I1011 01:48:16.652256 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/09ec7d44-c723-4a16-a24e-d473280d1321-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-lj96c\" (UID: \"09ec7d44-c723-4a16-a24e-d473280d1321\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lj96c" Oct 11 01:48:16 crc kubenswrapper[4743]: I1011 01:48:16.652294 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/09ec7d44-c723-4a16-a24e-d473280d1321-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-lj96c\" (UID: \"09ec7d44-c723-4a16-a24e-d473280d1321\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lj96c" Oct 11 01:48:16 crc kubenswrapper[4743]: I1011 01:48:16.652328 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrvt6\" (UniqueName: \"kubernetes.io/projected/09ec7d44-c723-4a16-a24e-d473280d1321-kube-api-access-lrvt6\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-lj96c\" (UID: \"09ec7d44-c723-4a16-a24e-d473280d1321\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lj96c" Oct 11 01:48:16 crc kubenswrapper[4743]: I1011 01:48:16.659900 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/09ec7d44-c723-4a16-a24e-d473280d1321-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-lj96c\" (UID: \"09ec7d44-c723-4a16-a24e-d473280d1321\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lj96c" Oct 11 01:48:16 crc kubenswrapper[4743]: I1011 01:48:16.663375 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/09ec7d44-c723-4a16-a24e-d473280d1321-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-lj96c\" (UID: \"09ec7d44-c723-4a16-a24e-d473280d1321\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lj96c" Oct 11 01:48:16 crc kubenswrapper[4743]: I1011 01:48:16.664180 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/09ec7d44-c723-4a16-a24e-d473280d1321-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-lj96c\" (UID: \"09ec7d44-c723-4a16-a24e-d473280d1321\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lj96c" Oct 11 01:48:16 crc kubenswrapper[4743]: I1011 01:48:16.687598 
4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrvt6\" (UniqueName: \"kubernetes.io/projected/09ec7d44-c723-4a16-a24e-d473280d1321-kube-api-access-lrvt6\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-lj96c\" (UID: \"09ec7d44-c723-4a16-a24e-d473280d1321\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lj96c" Oct 11 01:48:16 crc kubenswrapper[4743]: I1011 01:48:16.815751 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lj96c" Oct 11 01:48:17 crc kubenswrapper[4743]: I1011 01:48:17.339207 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-lj96c"] Oct 11 01:48:17 crc kubenswrapper[4743]: I1011 01:48:17.382342 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lj96c" event={"ID":"09ec7d44-c723-4a16-a24e-d473280d1321","Type":"ContainerStarted","Data":"a78d6cc9a3f4bcf4877dab2527c756221c02b6fdf89406a45507bf86e70b9c91"} Oct 11 01:48:18 crc kubenswrapper[4743]: I1011 01:48:18.396660 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lj96c" event={"ID":"09ec7d44-c723-4a16-a24e-d473280d1321","Type":"ContainerStarted","Data":"7562f3594e3c91028ec2e7fd6ad13cc3c6ad308f6109ff787b2751e356378b11"} Oct 11 01:48:18 crc kubenswrapper[4743]: I1011 01:48:18.418726 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lj96c" podStartSLOduration=1.9823878339999998 podStartE2EDuration="2.418700837s" podCreationTimestamp="2025-10-11 01:48:16 +0000 UTC" firstStartedPulling="2025-10-11 01:48:17.357995491 +0000 UTC m=+3392.010975888" lastFinishedPulling="2025-10-11 01:48:17.794308454 +0000 UTC m=+3392.447288891" observedRunningTime="2025-10-11 01:48:18.412693498 
+0000 UTC m=+3393.065673955" watchObservedRunningTime="2025-10-11 01:48:18.418700837 +0000 UTC m=+3393.071681264" Oct 11 01:49:09 crc kubenswrapper[4743]: I1011 01:49:09.040693 4743 generic.go:334] "Generic (PLEG): container finished" podID="09ec7d44-c723-4a16-a24e-d473280d1321" containerID="7562f3594e3c91028ec2e7fd6ad13cc3c6ad308f6109ff787b2751e356378b11" exitCode=0 Oct 11 01:49:09 crc kubenswrapper[4743]: I1011 01:49:09.040760 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lj96c" event={"ID":"09ec7d44-c723-4a16-a24e-d473280d1321","Type":"ContainerDied","Data":"7562f3594e3c91028ec2e7fd6ad13cc3c6ad308f6109ff787b2751e356378b11"} Oct 11 01:49:10 crc kubenswrapper[4743]: I1011 01:49:10.639777 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lj96c" Oct 11 01:49:10 crc kubenswrapper[4743]: I1011 01:49:10.751781 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/09ec7d44-c723-4a16-a24e-d473280d1321-inventory\") pod \"09ec7d44-c723-4a16-a24e-d473280d1321\" (UID: \"09ec7d44-c723-4a16-a24e-d473280d1321\") " Oct 11 01:49:10 crc kubenswrapper[4743]: I1011 01:49:10.751910 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/09ec7d44-c723-4a16-a24e-d473280d1321-ssh-key\") pod \"09ec7d44-c723-4a16-a24e-d473280d1321\" (UID: \"09ec7d44-c723-4a16-a24e-d473280d1321\") " Oct 11 01:49:10 crc kubenswrapper[4743]: I1011 01:49:10.752161 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/09ec7d44-c723-4a16-a24e-d473280d1321-ceph\") pod \"09ec7d44-c723-4a16-a24e-d473280d1321\" (UID: \"09ec7d44-c723-4a16-a24e-d473280d1321\") " Oct 11 01:49:10 crc kubenswrapper[4743]: I1011 01:49:10.752286 4743 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrvt6\" (UniqueName: \"kubernetes.io/projected/09ec7d44-c723-4a16-a24e-d473280d1321-kube-api-access-lrvt6\") pod \"09ec7d44-c723-4a16-a24e-d473280d1321\" (UID: \"09ec7d44-c723-4a16-a24e-d473280d1321\") " Oct 11 01:49:10 crc kubenswrapper[4743]: I1011 01:49:10.758428 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ec7d44-c723-4a16-a24e-d473280d1321-ceph" (OuterVolumeSpecName: "ceph") pod "09ec7d44-c723-4a16-a24e-d473280d1321" (UID: "09ec7d44-c723-4a16-a24e-d473280d1321"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:49:10 crc kubenswrapper[4743]: I1011 01:49:10.760888 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ec7d44-c723-4a16-a24e-d473280d1321-kube-api-access-lrvt6" (OuterVolumeSpecName: "kube-api-access-lrvt6") pod "09ec7d44-c723-4a16-a24e-d473280d1321" (UID: "09ec7d44-c723-4a16-a24e-d473280d1321"). InnerVolumeSpecName "kube-api-access-lrvt6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:49:10 crc kubenswrapper[4743]: I1011 01:49:10.787101 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ec7d44-c723-4a16-a24e-d473280d1321-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "09ec7d44-c723-4a16-a24e-d473280d1321" (UID: "09ec7d44-c723-4a16-a24e-d473280d1321"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:49:10 crc kubenswrapper[4743]: I1011 01:49:10.791445 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ec7d44-c723-4a16-a24e-d473280d1321-inventory" (OuterVolumeSpecName: "inventory") pod "09ec7d44-c723-4a16-a24e-d473280d1321" (UID: "09ec7d44-c723-4a16-a24e-d473280d1321"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:49:10 crc kubenswrapper[4743]: I1011 01:49:10.856526 4743 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/09ec7d44-c723-4a16-a24e-d473280d1321-ceph\") on node \"crc\" DevicePath \"\"" Oct 11 01:49:10 crc kubenswrapper[4743]: I1011 01:49:10.856581 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lrvt6\" (UniqueName: \"kubernetes.io/projected/09ec7d44-c723-4a16-a24e-d473280d1321-kube-api-access-lrvt6\") on node \"crc\" DevicePath \"\"" Oct 11 01:49:10 crc kubenswrapper[4743]: I1011 01:49:10.856602 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/09ec7d44-c723-4a16-a24e-d473280d1321-inventory\") on node \"crc\" DevicePath \"\"" Oct 11 01:49:10 crc kubenswrapper[4743]: I1011 01:49:10.856618 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/09ec7d44-c723-4a16-a24e-d473280d1321-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 11 01:49:11 crc kubenswrapper[4743]: I1011 01:49:11.074740 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lj96c" event={"ID":"09ec7d44-c723-4a16-a24e-d473280d1321","Type":"ContainerDied","Data":"a78d6cc9a3f4bcf4877dab2527c756221c02b6fdf89406a45507bf86e70b9c91"} Oct 11 01:49:11 crc kubenswrapper[4743]: I1011 01:49:11.075112 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a78d6cc9a3f4bcf4877dab2527c756221c02b6fdf89406a45507bf86e70b9c91" Oct 11 01:49:11 crc kubenswrapper[4743]: I1011 01:49:11.074786 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lj96c" Oct 11 01:49:11 crc kubenswrapper[4743]: I1011 01:49:11.179959 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8hnsg"] Oct 11 01:49:11 crc kubenswrapper[4743]: E1011 01:49:11.180403 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09ec7d44-c723-4a16-a24e-d473280d1321" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 11 01:49:11 crc kubenswrapper[4743]: I1011 01:49:11.180418 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="09ec7d44-c723-4a16-a24e-d473280d1321" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 11 01:49:11 crc kubenswrapper[4743]: I1011 01:49:11.180621 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="09ec7d44-c723-4a16-a24e-d473280d1321" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 11 01:49:11 crc kubenswrapper[4743]: I1011 01:49:11.181329 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8hnsg" Oct 11 01:49:11 crc kubenswrapper[4743]: I1011 01:49:11.183841 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 11 01:49:11 crc kubenswrapper[4743]: I1011 01:49:11.184027 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 11 01:49:11 crc kubenswrapper[4743]: I1011 01:49:11.184339 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 11 01:49:11 crc kubenswrapper[4743]: I1011 01:49:11.184584 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 11 01:49:11 crc kubenswrapper[4743]: I1011 01:49:11.184590 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8vmmn" Oct 11 01:49:11 crc kubenswrapper[4743]: I1011 01:49:11.198813 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8hnsg"] Oct 11 01:49:11 crc kubenswrapper[4743]: I1011 01:49:11.265162 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3523070b-145c-4c82-9623-b4a9f2a32c11-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8hnsg\" (UID: \"3523070b-145c-4c82-9623-b4a9f2a32c11\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8hnsg" Oct 11 01:49:11 crc kubenswrapper[4743]: I1011 01:49:11.265414 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3523070b-145c-4c82-9623-b4a9f2a32c11-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8hnsg\" (UID: \"3523070b-145c-4c82-9623-b4a9f2a32c11\") " 
pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8hnsg" Oct 11 01:49:11 crc kubenswrapper[4743]: I1011 01:49:11.265784 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvwrs\" (UniqueName: \"kubernetes.io/projected/3523070b-145c-4c82-9623-b4a9f2a32c11-kube-api-access-mvwrs\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8hnsg\" (UID: \"3523070b-145c-4c82-9623-b4a9f2a32c11\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8hnsg" Oct 11 01:49:11 crc kubenswrapper[4743]: I1011 01:49:11.265891 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3523070b-145c-4c82-9623-b4a9f2a32c11-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8hnsg\" (UID: \"3523070b-145c-4c82-9623-b4a9f2a32c11\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8hnsg" Oct 11 01:49:11 crc kubenswrapper[4743]: I1011 01:49:11.368012 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3523070b-145c-4c82-9623-b4a9f2a32c11-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8hnsg\" (UID: \"3523070b-145c-4c82-9623-b4a9f2a32c11\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8hnsg" Oct 11 01:49:11 crc kubenswrapper[4743]: I1011 01:49:11.368125 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3523070b-145c-4c82-9623-b4a9f2a32c11-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8hnsg\" (UID: \"3523070b-145c-4c82-9623-b4a9f2a32c11\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8hnsg" Oct 11 01:49:11 crc kubenswrapper[4743]: I1011 01:49:11.368196 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" 
(UniqueName: \"kubernetes.io/secret/3523070b-145c-4c82-9623-b4a9f2a32c11-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8hnsg\" (UID: \"3523070b-145c-4c82-9623-b4a9f2a32c11\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8hnsg" Oct 11 01:49:11 crc kubenswrapper[4743]: I1011 01:49:11.368319 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvwrs\" (UniqueName: \"kubernetes.io/projected/3523070b-145c-4c82-9623-b4a9f2a32c11-kube-api-access-mvwrs\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8hnsg\" (UID: \"3523070b-145c-4c82-9623-b4a9f2a32c11\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8hnsg" Oct 11 01:49:11 crc kubenswrapper[4743]: I1011 01:49:11.372671 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3523070b-145c-4c82-9623-b4a9f2a32c11-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8hnsg\" (UID: \"3523070b-145c-4c82-9623-b4a9f2a32c11\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8hnsg" Oct 11 01:49:11 crc kubenswrapper[4743]: I1011 01:49:11.374655 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3523070b-145c-4c82-9623-b4a9f2a32c11-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8hnsg\" (UID: \"3523070b-145c-4c82-9623-b4a9f2a32c11\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8hnsg" Oct 11 01:49:11 crc kubenswrapper[4743]: I1011 01:49:11.375713 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3523070b-145c-4c82-9623-b4a9f2a32c11-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8hnsg\" (UID: \"3523070b-145c-4c82-9623-b4a9f2a32c11\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8hnsg" Oct 11 01:49:11 crc 
kubenswrapper[4743]: I1011 01:49:11.400447 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvwrs\" (UniqueName: \"kubernetes.io/projected/3523070b-145c-4c82-9623-b4a9f2a32c11-kube-api-access-mvwrs\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8hnsg\" (UID: \"3523070b-145c-4c82-9623-b4a9f2a32c11\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8hnsg" Oct 11 01:49:11 crc kubenswrapper[4743]: I1011 01:49:11.508407 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8hnsg" Oct 11 01:49:12 crc kubenswrapper[4743]: I1011 01:49:12.129706 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8hnsg"] Oct 11 01:49:13 crc kubenswrapper[4743]: I1011 01:49:13.096897 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8hnsg" event={"ID":"3523070b-145c-4c82-9623-b4a9f2a32c11","Type":"ContainerStarted","Data":"4bb8a5fac62092a8ea0d7baaf591d4e29bd147a35b3a8378400da3cc7eb61fba"} Oct 11 01:49:13 crc kubenswrapper[4743]: I1011 01:49:13.099121 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8hnsg" event={"ID":"3523070b-145c-4c82-9623-b4a9f2a32c11","Type":"ContainerStarted","Data":"8d7ab1a357c3d4ac4d87180d27a3ea36e17fa2064ee19cddb8a2867b8fce17d9"} Oct 11 01:49:13 crc kubenswrapper[4743]: I1011 01:49:13.128484 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8hnsg" podStartSLOduration=1.5882366449999998 podStartE2EDuration="2.128463814s" podCreationTimestamp="2025-10-11 01:49:11 +0000 UTC" firstStartedPulling="2025-10-11 01:49:12.141359429 +0000 UTC m=+3446.794339836" lastFinishedPulling="2025-10-11 01:49:12.681586568 +0000 UTC 
m=+3447.334567005" observedRunningTime="2025-10-11 01:49:13.118409621 +0000 UTC m=+3447.771390028" watchObservedRunningTime="2025-10-11 01:49:13.128463814 +0000 UTC m=+3447.781444211" Oct 11 01:49:18 crc kubenswrapper[4743]: I1011 01:49:18.156242 4743 generic.go:334] "Generic (PLEG): container finished" podID="3523070b-145c-4c82-9623-b4a9f2a32c11" containerID="4bb8a5fac62092a8ea0d7baaf591d4e29bd147a35b3a8378400da3cc7eb61fba" exitCode=0 Oct 11 01:49:18 crc kubenswrapper[4743]: I1011 01:49:18.156315 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8hnsg" event={"ID":"3523070b-145c-4c82-9623-b4a9f2a32c11","Type":"ContainerDied","Data":"4bb8a5fac62092a8ea0d7baaf591d4e29bd147a35b3a8378400da3cc7eb61fba"} Oct 11 01:49:19 crc kubenswrapper[4743]: I1011 01:49:19.686735 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8hnsg" Oct 11 01:49:19 crc kubenswrapper[4743]: I1011 01:49:19.871103 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3523070b-145c-4c82-9623-b4a9f2a32c11-ssh-key\") pod \"3523070b-145c-4c82-9623-b4a9f2a32c11\" (UID: \"3523070b-145c-4c82-9623-b4a9f2a32c11\") " Oct 11 01:49:19 crc kubenswrapper[4743]: I1011 01:49:19.871204 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvwrs\" (UniqueName: \"kubernetes.io/projected/3523070b-145c-4c82-9623-b4a9f2a32c11-kube-api-access-mvwrs\") pod \"3523070b-145c-4c82-9623-b4a9f2a32c11\" (UID: \"3523070b-145c-4c82-9623-b4a9f2a32c11\") " Oct 11 01:49:19 crc kubenswrapper[4743]: I1011 01:49:19.871428 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3523070b-145c-4c82-9623-b4a9f2a32c11-inventory\") pod \"3523070b-145c-4c82-9623-b4a9f2a32c11\" (UID: 
\"3523070b-145c-4c82-9623-b4a9f2a32c11\") " Oct 11 01:49:19 crc kubenswrapper[4743]: I1011 01:49:19.871470 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3523070b-145c-4c82-9623-b4a9f2a32c11-ceph\") pod \"3523070b-145c-4c82-9623-b4a9f2a32c11\" (UID: \"3523070b-145c-4c82-9623-b4a9f2a32c11\") " Oct 11 01:49:19 crc kubenswrapper[4743]: I1011 01:49:19.877497 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3523070b-145c-4c82-9623-b4a9f2a32c11-ceph" (OuterVolumeSpecName: "ceph") pod "3523070b-145c-4c82-9623-b4a9f2a32c11" (UID: "3523070b-145c-4c82-9623-b4a9f2a32c11"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:49:19 crc kubenswrapper[4743]: I1011 01:49:19.882513 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3523070b-145c-4c82-9623-b4a9f2a32c11-kube-api-access-mvwrs" (OuterVolumeSpecName: "kube-api-access-mvwrs") pod "3523070b-145c-4c82-9623-b4a9f2a32c11" (UID: "3523070b-145c-4c82-9623-b4a9f2a32c11"). InnerVolumeSpecName "kube-api-access-mvwrs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:49:19 crc kubenswrapper[4743]: I1011 01:49:19.906769 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3523070b-145c-4c82-9623-b4a9f2a32c11-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "3523070b-145c-4c82-9623-b4a9f2a32c11" (UID: "3523070b-145c-4c82-9623-b4a9f2a32c11"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:49:19 crc kubenswrapper[4743]: I1011 01:49:19.914768 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3523070b-145c-4c82-9623-b4a9f2a32c11-inventory" (OuterVolumeSpecName: "inventory") pod "3523070b-145c-4c82-9623-b4a9f2a32c11" (UID: "3523070b-145c-4c82-9623-b4a9f2a32c11"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:49:19 crc kubenswrapper[4743]: I1011 01:49:19.973583 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3523070b-145c-4c82-9623-b4a9f2a32c11-inventory\") on node \"crc\" DevicePath \"\"" Oct 11 01:49:19 crc kubenswrapper[4743]: I1011 01:49:19.974009 4743 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3523070b-145c-4c82-9623-b4a9f2a32c11-ceph\") on node \"crc\" DevicePath \"\"" Oct 11 01:49:19 crc kubenswrapper[4743]: I1011 01:49:19.974021 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3523070b-145c-4c82-9623-b4a9f2a32c11-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 11 01:49:19 crc kubenswrapper[4743]: I1011 01:49:19.974036 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvwrs\" (UniqueName: \"kubernetes.io/projected/3523070b-145c-4c82-9623-b4a9f2a32c11-kube-api-access-mvwrs\") on node \"crc\" DevicePath \"\"" Oct 11 01:49:20 crc kubenswrapper[4743]: I1011 01:49:20.204648 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8hnsg" event={"ID":"3523070b-145c-4c82-9623-b4a9f2a32c11","Type":"ContainerDied","Data":"8d7ab1a357c3d4ac4d87180d27a3ea36e17fa2064ee19cddb8a2867b8fce17d9"} Oct 11 01:49:20 crc kubenswrapper[4743]: I1011 01:49:20.204724 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d7ab1a357c3d4ac4d87180d27a3ea36e17fa2064ee19cddb8a2867b8fce17d9" Oct 11 01:49:20 crc kubenswrapper[4743]: I1011 01:49:20.204835 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8hnsg" Oct 11 01:49:20 crc kubenswrapper[4743]: I1011 01:49:20.289762 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dwkfr"] Oct 11 01:49:20 crc kubenswrapper[4743]: E1011 01:49:20.290281 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3523070b-145c-4c82-9623-b4a9f2a32c11" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Oct 11 01:49:20 crc kubenswrapper[4743]: I1011 01:49:20.290301 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="3523070b-145c-4c82-9623-b4a9f2a32c11" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Oct 11 01:49:20 crc kubenswrapper[4743]: I1011 01:49:20.290530 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="3523070b-145c-4c82-9623-b4a9f2a32c11" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Oct 11 01:49:20 crc kubenswrapper[4743]: I1011 01:49:20.291460 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dwkfr" Oct 11 01:49:20 crc kubenswrapper[4743]: I1011 01:49:20.302444 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dwkfr"] Oct 11 01:49:20 crc kubenswrapper[4743]: I1011 01:49:20.340606 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 11 01:49:20 crc kubenswrapper[4743]: I1011 01:49:20.341314 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8vmmn" Oct 11 01:49:20 crc kubenswrapper[4743]: I1011 01:49:20.341910 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 11 01:49:20 crc kubenswrapper[4743]: I1011 01:49:20.342644 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 11 01:49:20 crc kubenswrapper[4743]: I1011 01:49:20.344382 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 11 01:49:20 crc kubenswrapper[4743]: I1011 01:49:20.383350 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/43197ff3-1a5a-4c2f-a836-aa22d055d415-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dwkfr\" (UID: \"43197ff3-1a5a-4c2f-a836-aa22d055d415\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dwkfr" Oct 11 01:49:20 crc kubenswrapper[4743]: I1011 01:49:20.383436 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/43197ff3-1a5a-4c2f-a836-aa22d055d415-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dwkfr\" (UID: \"43197ff3-1a5a-4c2f-a836-aa22d055d415\") " 
pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dwkfr" Oct 11 01:49:20 crc kubenswrapper[4743]: I1011 01:49:20.383490 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6qqb\" (UniqueName: \"kubernetes.io/projected/43197ff3-1a5a-4c2f-a836-aa22d055d415-kube-api-access-k6qqb\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dwkfr\" (UID: \"43197ff3-1a5a-4c2f-a836-aa22d055d415\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dwkfr" Oct 11 01:49:20 crc kubenswrapper[4743]: I1011 01:49:20.383525 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/43197ff3-1a5a-4c2f-a836-aa22d055d415-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dwkfr\" (UID: \"43197ff3-1a5a-4c2f-a836-aa22d055d415\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dwkfr" Oct 11 01:49:20 crc kubenswrapper[4743]: I1011 01:49:20.486038 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/43197ff3-1a5a-4c2f-a836-aa22d055d415-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dwkfr\" (UID: \"43197ff3-1a5a-4c2f-a836-aa22d055d415\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dwkfr" Oct 11 01:49:20 crc kubenswrapper[4743]: I1011 01:49:20.486132 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/43197ff3-1a5a-4c2f-a836-aa22d055d415-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dwkfr\" (UID: \"43197ff3-1a5a-4c2f-a836-aa22d055d415\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dwkfr" Oct 11 01:49:20 crc kubenswrapper[4743]: I1011 01:49:20.486181 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-k6qqb\" (UniqueName: \"kubernetes.io/projected/43197ff3-1a5a-4c2f-a836-aa22d055d415-kube-api-access-k6qqb\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dwkfr\" (UID: \"43197ff3-1a5a-4c2f-a836-aa22d055d415\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dwkfr" Oct 11 01:49:20 crc kubenswrapper[4743]: I1011 01:49:20.486203 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/43197ff3-1a5a-4c2f-a836-aa22d055d415-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dwkfr\" (UID: \"43197ff3-1a5a-4c2f-a836-aa22d055d415\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dwkfr" Oct 11 01:49:20 crc kubenswrapper[4743]: I1011 01:49:20.489884 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/43197ff3-1a5a-4c2f-a836-aa22d055d415-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dwkfr\" (UID: \"43197ff3-1a5a-4c2f-a836-aa22d055d415\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dwkfr" Oct 11 01:49:20 crc kubenswrapper[4743]: I1011 01:49:20.490241 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/43197ff3-1a5a-4c2f-a836-aa22d055d415-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dwkfr\" (UID: \"43197ff3-1a5a-4c2f-a836-aa22d055d415\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dwkfr" Oct 11 01:49:20 crc kubenswrapper[4743]: I1011 01:49:20.494266 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/43197ff3-1a5a-4c2f-a836-aa22d055d415-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dwkfr\" (UID: \"43197ff3-1a5a-4c2f-a836-aa22d055d415\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dwkfr" Oct 11 
01:49:20 crc kubenswrapper[4743]: I1011 01:49:20.502951 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6qqb\" (UniqueName: \"kubernetes.io/projected/43197ff3-1a5a-4c2f-a836-aa22d055d415-kube-api-access-k6qqb\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dwkfr\" (UID: \"43197ff3-1a5a-4c2f-a836-aa22d055d415\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dwkfr" Oct 11 01:49:20 crc kubenswrapper[4743]: I1011 01:49:20.669259 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dwkfr" Oct 11 01:49:21 crc kubenswrapper[4743]: I1011 01:49:21.277314 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dwkfr"] Oct 11 01:49:21 crc kubenswrapper[4743]: W1011 01:49:21.282611 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43197ff3_1a5a_4c2f_a836_aa22d055d415.slice/crio-d14d0333c0773fd06e41e79d7e9212a2bdb27e920a3bbe7afb8412d7e6099835 WatchSource:0}: Error finding container d14d0333c0773fd06e41e79d7e9212a2bdb27e920a3bbe7afb8412d7e6099835: Status 404 returned error can't find the container with id d14d0333c0773fd06e41e79d7e9212a2bdb27e920a3bbe7afb8412d7e6099835 Oct 11 01:49:22 crc kubenswrapper[4743]: I1011 01:49:22.232117 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dwkfr" event={"ID":"43197ff3-1a5a-4c2f-a836-aa22d055d415","Type":"ContainerStarted","Data":"fb345715246971e0920b42167a20e078d3f9a0245bf3ac5c462c974c38ea1486"} Oct 11 01:49:22 crc kubenswrapper[4743]: I1011 01:49:22.232501 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dwkfr" 
event={"ID":"43197ff3-1a5a-4c2f-a836-aa22d055d415","Type":"ContainerStarted","Data":"d14d0333c0773fd06e41e79d7e9212a2bdb27e920a3bbe7afb8412d7e6099835"} Oct 11 01:49:22 crc kubenswrapper[4743]: I1011 01:49:22.258218 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dwkfr" podStartSLOduration=1.69443905 podStartE2EDuration="2.25819582s" podCreationTimestamp="2025-10-11 01:49:20 +0000 UTC" firstStartedPulling="2025-10-11 01:49:21.285768798 +0000 UTC m=+3455.938749195" lastFinishedPulling="2025-10-11 01:49:21.849525538 +0000 UTC m=+3456.502505965" observedRunningTime="2025-10-11 01:49:22.25079288 +0000 UTC m=+3456.903773277" watchObservedRunningTime="2025-10-11 01:49:22.25819582 +0000 UTC m=+3456.911176237" Oct 11 01:49:35 crc kubenswrapper[4743]: I1011 01:49:35.158029 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-29vns"] Oct 11 01:49:35 crc kubenswrapper[4743]: I1011 01:49:35.162452 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-29vns" Oct 11 01:49:35 crc kubenswrapper[4743]: I1011 01:49:35.182436 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-29vns"] Oct 11 01:49:35 crc kubenswrapper[4743]: I1011 01:49:35.325683 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26ccv\" (UniqueName: \"kubernetes.io/projected/83785583-fae5-4725-b478-c6da383526dc-kube-api-access-26ccv\") pod \"community-operators-29vns\" (UID: \"83785583-fae5-4725-b478-c6da383526dc\") " pod="openshift-marketplace/community-operators-29vns" Oct 11 01:49:35 crc kubenswrapper[4743]: I1011 01:49:35.325806 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83785583-fae5-4725-b478-c6da383526dc-utilities\") pod \"community-operators-29vns\" (UID: \"83785583-fae5-4725-b478-c6da383526dc\") " pod="openshift-marketplace/community-operators-29vns" Oct 11 01:49:35 crc kubenswrapper[4743]: I1011 01:49:35.326129 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83785583-fae5-4725-b478-c6da383526dc-catalog-content\") pod \"community-operators-29vns\" (UID: \"83785583-fae5-4725-b478-c6da383526dc\") " pod="openshift-marketplace/community-operators-29vns" Oct 11 01:49:35 crc kubenswrapper[4743]: I1011 01:49:35.428254 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26ccv\" (UniqueName: \"kubernetes.io/projected/83785583-fae5-4725-b478-c6da383526dc-kube-api-access-26ccv\") pod \"community-operators-29vns\" (UID: \"83785583-fae5-4725-b478-c6da383526dc\") " pod="openshift-marketplace/community-operators-29vns" Oct 11 01:49:35 crc kubenswrapper[4743]: I1011 01:49:35.428422 4743 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83785583-fae5-4725-b478-c6da383526dc-utilities\") pod \"community-operators-29vns\" (UID: \"83785583-fae5-4725-b478-c6da383526dc\") " pod="openshift-marketplace/community-operators-29vns" Oct 11 01:49:35 crc kubenswrapper[4743]: I1011 01:49:35.428528 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83785583-fae5-4725-b478-c6da383526dc-catalog-content\") pod \"community-operators-29vns\" (UID: \"83785583-fae5-4725-b478-c6da383526dc\") " pod="openshift-marketplace/community-operators-29vns" Oct 11 01:49:35 crc kubenswrapper[4743]: I1011 01:49:35.429104 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83785583-fae5-4725-b478-c6da383526dc-utilities\") pod \"community-operators-29vns\" (UID: \"83785583-fae5-4725-b478-c6da383526dc\") " pod="openshift-marketplace/community-operators-29vns" Oct 11 01:49:35 crc kubenswrapper[4743]: I1011 01:49:35.429126 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83785583-fae5-4725-b478-c6da383526dc-catalog-content\") pod \"community-operators-29vns\" (UID: \"83785583-fae5-4725-b478-c6da383526dc\") " pod="openshift-marketplace/community-operators-29vns" Oct 11 01:49:35 crc kubenswrapper[4743]: I1011 01:49:35.452329 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26ccv\" (UniqueName: \"kubernetes.io/projected/83785583-fae5-4725-b478-c6da383526dc-kube-api-access-26ccv\") pod \"community-operators-29vns\" (UID: \"83785583-fae5-4725-b478-c6da383526dc\") " pod="openshift-marketplace/community-operators-29vns" Oct 11 01:49:35 crc kubenswrapper[4743]: I1011 01:49:35.486974 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-29vns" Oct 11 01:49:36 crc kubenswrapper[4743]: I1011 01:49:36.003847 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-29vns"] Oct 11 01:49:36 crc kubenswrapper[4743]: W1011 01:49:36.008555 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83785583_fae5_4725_b478_c6da383526dc.slice/crio-2ad5c874ff6212bce62ab20059dafed68f0ce508e89fcf1ad0e9c463d6ffdcd5 WatchSource:0}: Error finding container 2ad5c874ff6212bce62ab20059dafed68f0ce508e89fcf1ad0e9c463d6ffdcd5: Status 404 returned error can't find the container with id 2ad5c874ff6212bce62ab20059dafed68f0ce508e89fcf1ad0e9c463d6ffdcd5 Oct 11 01:49:36 crc kubenswrapper[4743]: I1011 01:49:36.373365 4743 generic.go:334] "Generic (PLEG): container finished" podID="83785583-fae5-4725-b478-c6da383526dc" containerID="9aed9e207c7c95381d60a83d200102985d311ee585bea94f58a165d0967f8b51" exitCode=0 Oct 11 01:49:36 crc kubenswrapper[4743]: I1011 01:49:36.373436 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-29vns" event={"ID":"83785583-fae5-4725-b478-c6da383526dc","Type":"ContainerDied","Data":"9aed9e207c7c95381d60a83d200102985d311ee585bea94f58a165d0967f8b51"} Oct 11 01:49:36 crc kubenswrapper[4743]: I1011 01:49:36.373700 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-29vns" event={"ID":"83785583-fae5-4725-b478-c6da383526dc","Type":"ContainerStarted","Data":"2ad5c874ff6212bce62ab20059dafed68f0ce508e89fcf1ad0e9c463d6ffdcd5"} Oct 11 01:49:37 crc kubenswrapper[4743]: I1011 01:49:37.387065 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-29vns" 
event={"ID":"83785583-fae5-4725-b478-c6da383526dc","Type":"ContainerStarted","Data":"4c396568edaf9b1b8c06a5a0a5a9a203852919c4248f37a3f9914bca3ca5b464"} Oct 11 01:49:38 crc kubenswrapper[4743]: I1011 01:49:38.397666 4743 generic.go:334] "Generic (PLEG): container finished" podID="83785583-fae5-4725-b478-c6da383526dc" containerID="4c396568edaf9b1b8c06a5a0a5a9a203852919c4248f37a3f9914bca3ca5b464" exitCode=0 Oct 11 01:49:38 crc kubenswrapper[4743]: I1011 01:49:38.397991 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-29vns" event={"ID":"83785583-fae5-4725-b478-c6da383526dc","Type":"ContainerDied","Data":"4c396568edaf9b1b8c06a5a0a5a9a203852919c4248f37a3f9914bca3ca5b464"} Oct 11 01:49:39 crc kubenswrapper[4743]: I1011 01:49:39.418943 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-29vns" event={"ID":"83785583-fae5-4725-b478-c6da383526dc","Type":"ContainerStarted","Data":"446a31b0a4047874932c1768b2600c7aa323a54f4de7c71fc05ec86a50aceb0e"} Oct 11 01:49:39 crc kubenswrapper[4743]: I1011 01:49:39.449540 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-29vns" podStartSLOduration=1.919548592 podStartE2EDuration="4.449514781s" podCreationTimestamp="2025-10-11 01:49:35 +0000 UTC" firstStartedPulling="2025-10-11 01:49:36.375838409 +0000 UTC m=+3471.028818816" lastFinishedPulling="2025-10-11 01:49:38.905804568 +0000 UTC m=+3473.558785005" observedRunningTime="2025-10-11 01:49:39.439708514 +0000 UTC m=+3474.092688941" watchObservedRunningTime="2025-10-11 01:49:39.449514781 +0000 UTC m=+3474.102495218" Oct 11 01:49:45 crc kubenswrapper[4743]: I1011 01:49:45.488154 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-29vns" Oct 11 01:49:45 crc kubenswrapper[4743]: I1011 01:49:45.488713 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/community-operators-29vns" Oct 11 01:49:45 crc kubenswrapper[4743]: I1011 01:49:45.551492 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-29vns" Oct 11 01:49:46 crc kubenswrapper[4743]: I1011 01:49:46.547641 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-29vns" Oct 11 01:49:46 crc kubenswrapper[4743]: I1011 01:49:46.611423 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-29vns"] Oct 11 01:49:48 crc kubenswrapper[4743]: I1011 01:49:48.504998 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-29vns" podUID="83785583-fae5-4725-b478-c6da383526dc" containerName="registry-server" containerID="cri-o://446a31b0a4047874932c1768b2600c7aa323a54f4de7c71fc05ec86a50aceb0e" gracePeriod=2 Oct 11 01:49:48 crc kubenswrapper[4743]: I1011 01:49:48.985661 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-29vns" Oct 11 01:49:49 crc kubenswrapper[4743]: I1011 01:49:49.136246 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83785583-fae5-4725-b478-c6da383526dc-utilities\") pod \"83785583-fae5-4725-b478-c6da383526dc\" (UID: \"83785583-fae5-4725-b478-c6da383526dc\") " Oct 11 01:49:49 crc kubenswrapper[4743]: I1011 01:49:49.136497 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83785583-fae5-4725-b478-c6da383526dc-catalog-content\") pod \"83785583-fae5-4725-b478-c6da383526dc\" (UID: \"83785583-fae5-4725-b478-c6da383526dc\") " Oct 11 01:49:49 crc kubenswrapper[4743]: I1011 01:49:49.136749 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26ccv\" (UniqueName: \"kubernetes.io/projected/83785583-fae5-4725-b478-c6da383526dc-kube-api-access-26ccv\") pod \"83785583-fae5-4725-b478-c6da383526dc\" (UID: \"83785583-fae5-4725-b478-c6da383526dc\") " Oct 11 01:49:49 crc kubenswrapper[4743]: I1011 01:49:49.141900 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83785583-fae5-4725-b478-c6da383526dc-utilities" (OuterVolumeSpecName: "utilities") pod "83785583-fae5-4725-b478-c6da383526dc" (UID: "83785583-fae5-4725-b478-c6da383526dc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:49:49 crc kubenswrapper[4743]: I1011 01:49:49.144206 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83785583-fae5-4725-b478-c6da383526dc-kube-api-access-26ccv" (OuterVolumeSpecName: "kube-api-access-26ccv") pod "83785583-fae5-4725-b478-c6da383526dc" (UID: "83785583-fae5-4725-b478-c6da383526dc"). InnerVolumeSpecName "kube-api-access-26ccv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:49:49 crc kubenswrapper[4743]: I1011 01:49:49.191353 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83785583-fae5-4725-b478-c6da383526dc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "83785583-fae5-4725-b478-c6da383526dc" (UID: "83785583-fae5-4725-b478-c6da383526dc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:49:49 crc kubenswrapper[4743]: I1011 01:49:49.239929 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83785583-fae5-4725-b478-c6da383526dc-utilities\") on node \"crc\" DevicePath \"\"" Oct 11 01:49:49 crc kubenswrapper[4743]: I1011 01:49:49.239958 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83785583-fae5-4725-b478-c6da383526dc-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 11 01:49:49 crc kubenswrapper[4743]: I1011 01:49:49.239968 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26ccv\" (UniqueName: \"kubernetes.io/projected/83785583-fae5-4725-b478-c6da383526dc-kube-api-access-26ccv\") on node \"crc\" DevicePath \"\"" Oct 11 01:49:49 crc kubenswrapper[4743]: I1011 01:49:49.516697 4743 generic.go:334] "Generic (PLEG): container finished" podID="83785583-fae5-4725-b478-c6da383526dc" containerID="446a31b0a4047874932c1768b2600c7aa323a54f4de7c71fc05ec86a50aceb0e" exitCode=0 Oct 11 01:49:49 crc kubenswrapper[4743]: I1011 01:49:49.517086 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-29vns" event={"ID":"83785583-fae5-4725-b478-c6da383526dc","Type":"ContainerDied","Data":"446a31b0a4047874932c1768b2600c7aa323a54f4de7c71fc05ec86a50aceb0e"} Oct 11 01:49:49 crc kubenswrapper[4743]: I1011 01:49:49.517143 4743 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-29vns" event={"ID":"83785583-fae5-4725-b478-c6da383526dc","Type":"ContainerDied","Data":"2ad5c874ff6212bce62ab20059dafed68f0ce508e89fcf1ad0e9c463d6ffdcd5"} Oct 11 01:49:49 crc kubenswrapper[4743]: I1011 01:49:49.517189 4743 scope.go:117] "RemoveContainer" containerID="446a31b0a4047874932c1768b2600c7aa323a54f4de7c71fc05ec86a50aceb0e" Oct 11 01:49:49 crc kubenswrapper[4743]: I1011 01:49:49.517554 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-29vns" Oct 11 01:49:49 crc kubenswrapper[4743]: I1011 01:49:49.548753 4743 scope.go:117] "RemoveContainer" containerID="4c396568edaf9b1b8c06a5a0a5a9a203852919c4248f37a3f9914bca3ca5b464" Oct 11 01:49:49 crc kubenswrapper[4743]: I1011 01:49:49.576127 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-29vns"] Oct 11 01:49:49 crc kubenswrapper[4743]: I1011 01:49:49.586579 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-29vns"] Oct 11 01:49:49 crc kubenswrapper[4743]: I1011 01:49:49.611996 4743 scope.go:117] "RemoveContainer" containerID="9aed9e207c7c95381d60a83d200102985d311ee585bea94f58a165d0967f8b51" Oct 11 01:49:49 crc kubenswrapper[4743]: I1011 01:49:49.649450 4743 scope.go:117] "RemoveContainer" containerID="446a31b0a4047874932c1768b2600c7aa323a54f4de7c71fc05ec86a50aceb0e" Oct 11 01:49:49 crc kubenswrapper[4743]: E1011 01:49:49.649944 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"446a31b0a4047874932c1768b2600c7aa323a54f4de7c71fc05ec86a50aceb0e\": container with ID starting with 446a31b0a4047874932c1768b2600c7aa323a54f4de7c71fc05ec86a50aceb0e not found: ID does not exist" containerID="446a31b0a4047874932c1768b2600c7aa323a54f4de7c71fc05ec86a50aceb0e" Oct 11 01:49:49 crc kubenswrapper[4743]: I1011 
01:49:49.649979 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"446a31b0a4047874932c1768b2600c7aa323a54f4de7c71fc05ec86a50aceb0e"} err="failed to get container status \"446a31b0a4047874932c1768b2600c7aa323a54f4de7c71fc05ec86a50aceb0e\": rpc error: code = NotFound desc = could not find container \"446a31b0a4047874932c1768b2600c7aa323a54f4de7c71fc05ec86a50aceb0e\": container with ID starting with 446a31b0a4047874932c1768b2600c7aa323a54f4de7c71fc05ec86a50aceb0e not found: ID does not exist" Oct 11 01:49:49 crc kubenswrapper[4743]: I1011 01:49:49.650004 4743 scope.go:117] "RemoveContainer" containerID="4c396568edaf9b1b8c06a5a0a5a9a203852919c4248f37a3f9914bca3ca5b464" Oct 11 01:49:49 crc kubenswrapper[4743]: E1011 01:49:49.650286 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c396568edaf9b1b8c06a5a0a5a9a203852919c4248f37a3f9914bca3ca5b464\": container with ID starting with 4c396568edaf9b1b8c06a5a0a5a9a203852919c4248f37a3f9914bca3ca5b464 not found: ID does not exist" containerID="4c396568edaf9b1b8c06a5a0a5a9a203852919c4248f37a3f9914bca3ca5b464" Oct 11 01:49:49 crc kubenswrapper[4743]: I1011 01:49:49.650312 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c396568edaf9b1b8c06a5a0a5a9a203852919c4248f37a3f9914bca3ca5b464"} err="failed to get container status \"4c396568edaf9b1b8c06a5a0a5a9a203852919c4248f37a3f9914bca3ca5b464\": rpc error: code = NotFound desc = could not find container \"4c396568edaf9b1b8c06a5a0a5a9a203852919c4248f37a3f9914bca3ca5b464\": container with ID starting with 4c396568edaf9b1b8c06a5a0a5a9a203852919c4248f37a3f9914bca3ca5b464 not found: ID does not exist" Oct 11 01:49:49 crc kubenswrapper[4743]: I1011 01:49:49.650328 4743 scope.go:117] "RemoveContainer" containerID="9aed9e207c7c95381d60a83d200102985d311ee585bea94f58a165d0967f8b51" Oct 11 01:49:49 crc 
kubenswrapper[4743]: E1011 01:49:49.650551 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9aed9e207c7c95381d60a83d200102985d311ee585bea94f58a165d0967f8b51\": container with ID starting with 9aed9e207c7c95381d60a83d200102985d311ee585bea94f58a165d0967f8b51 not found: ID does not exist" containerID="9aed9e207c7c95381d60a83d200102985d311ee585bea94f58a165d0967f8b51" Oct 11 01:49:49 crc kubenswrapper[4743]: I1011 01:49:49.650575 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9aed9e207c7c95381d60a83d200102985d311ee585bea94f58a165d0967f8b51"} err="failed to get container status \"9aed9e207c7c95381d60a83d200102985d311ee585bea94f58a165d0967f8b51\": rpc error: code = NotFound desc = could not find container \"9aed9e207c7c95381d60a83d200102985d311ee585bea94f58a165d0967f8b51\": container with ID starting with 9aed9e207c7c95381d60a83d200102985d311ee585bea94f58a165d0967f8b51 not found: ID does not exist" Oct 11 01:49:50 crc kubenswrapper[4743]: I1011 01:49:50.110903 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83785583-fae5-4725-b478-c6da383526dc" path="/var/lib/kubelet/pods/83785583-fae5-4725-b478-c6da383526dc/volumes" Oct 11 01:49:54 crc kubenswrapper[4743]: I1011 01:49:54.859849 4743 scope.go:117] "RemoveContainer" containerID="8e4c5c9a01797c3fe944ae48aedf5b065c0521bdbcf329829d47fe956c824499" Oct 11 01:49:54 crc kubenswrapper[4743]: I1011 01:49:54.898293 4743 scope.go:117] "RemoveContainer" containerID="3f556f27a37cf12ff379bb2704d1b3a3c6081d8c8642a6e55ccdb437dd99e9b8" Oct 11 01:49:54 crc kubenswrapper[4743]: I1011 01:49:54.930775 4743 scope.go:117] "RemoveContainer" containerID="2c891db7ec5aec6335b81e64a85ef5b488f652158311b21cf5d8d086f7a48b78" Oct 11 01:50:14 crc kubenswrapper[4743]: I1011 01:50:14.458006 4743 patch_prober.go:28] interesting pod/machine-config-daemon-cvm72 container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 01:50:14 crc kubenswrapper[4743]: I1011 01:50:14.458583 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 01:50:28 crc kubenswrapper[4743]: I1011 01:50:28.951282 4743 generic.go:334] "Generic (PLEG): container finished" podID="43197ff3-1a5a-4c2f-a836-aa22d055d415" containerID="fb345715246971e0920b42167a20e078d3f9a0245bf3ac5c462c974c38ea1486" exitCode=0 Oct 11 01:50:28 crc kubenswrapper[4743]: I1011 01:50:28.951473 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dwkfr" event={"ID":"43197ff3-1a5a-4c2f-a836-aa22d055d415","Type":"ContainerDied","Data":"fb345715246971e0920b42167a20e078d3f9a0245bf3ac5c462c974c38ea1486"} Oct 11 01:50:30 crc kubenswrapper[4743]: I1011 01:50:30.431178 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dwkfr" Oct 11 01:50:30 crc kubenswrapper[4743]: I1011 01:50:30.487554 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6qqb\" (UniqueName: \"kubernetes.io/projected/43197ff3-1a5a-4c2f-a836-aa22d055d415-kube-api-access-k6qqb\") pod \"43197ff3-1a5a-4c2f-a836-aa22d055d415\" (UID: \"43197ff3-1a5a-4c2f-a836-aa22d055d415\") " Oct 11 01:50:30 crc kubenswrapper[4743]: I1011 01:50:30.487648 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/43197ff3-1a5a-4c2f-a836-aa22d055d415-ceph\") pod \"43197ff3-1a5a-4c2f-a836-aa22d055d415\" (UID: \"43197ff3-1a5a-4c2f-a836-aa22d055d415\") " Oct 11 01:50:30 crc kubenswrapper[4743]: I1011 01:50:30.487698 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/43197ff3-1a5a-4c2f-a836-aa22d055d415-inventory\") pod \"43197ff3-1a5a-4c2f-a836-aa22d055d415\" (UID: \"43197ff3-1a5a-4c2f-a836-aa22d055d415\") " Oct 11 01:50:30 crc kubenswrapper[4743]: I1011 01:50:30.487809 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/43197ff3-1a5a-4c2f-a836-aa22d055d415-ssh-key\") pod \"43197ff3-1a5a-4c2f-a836-aa22d055d415\" (UID: \"43197ff3-1a5a-4c2f-a836-aa22d055d415\") " Oct 11 01:50:30 crc kubenswrapper[4743]: I1011 01:50:30.495111 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43197ff3-1a5a-4c2f-a836-aa22d055d415-ceph" (OuterVolumeSpecName: "ceph") pod "43197ff3-1a5a-4c2f-a836-aa22d055d415" (UID: "43197ff3-1a5a-4c2f-a836-aa22d055d415"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:50:30 crc kubenswrapper[4743]: I1011 01:50:30.501160 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43197ff3-1a5a-4c2f-a836-aa22d055d415-kube-api-access-k6qqb" (OuterVolumeSpecName: "kube-api-access-k6qqb") pod "43197ff3-1a5a-4c2f-a836-aa22d055d415" (UID: "43197ff3-1a5a-4c2f-a836-aa22d055d415"). InnerVolumeSpecName "kube-api-access-k6qqb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:50:30 crc kubenswrapper[4743]: I1011 01:50:30.541384 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43197ff3-1a5a-4c2f-a836-aa22d055d415-inventory" (OuterVolumeSpecName: "inventory") pod "43197ff3-1a5a-4c2f-a836-aa22d055d415" (UID: "43197ff3-1a5a-4c2f-a836-aa22d055d415"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:50:30 crc kubenswrapper[4743]: I1011 01:50:30.549846 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43197ff3-1a5a-4c2f-a836-aa22d055d415-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "43197ff3-1a5a-4c2f-a836-aa22d055d415" (UID: "43197ff3-1a5a-4c2f-a836-aa22d055d415"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:50:30 crc kubenswrapper[4743]: I1011 01:50:30.595432 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6qqb\" (UniqueName: \"kubernetes.io/projected/43197ff3-1a5a-4c2f-a836-aa22d055d415-kube-api-access-k6qqb\") on node \"crc\" DevicePath \"\"" Oct 11 01:50:30 crc kubenswrapper[4743]: I1011 01:50:30.595673 4743 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/43197ff3-1a5a-4c2f-a836-aa22d055d415-ceph\") on node \"crc\" DevicePath \"\"" Oct 11 01:50:30 crc kubenswrapper[4743]: I1011 01:50:30.595767 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/43197ff3-1a5a-4c2f-a836-aa22d055d415-inventory\") on node \"crc\" DevicePath \"\"" Oct 11 01:50:30 crc kubenswrapper[4743]: I1011 01:50:30.595890 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/43197ff3-1a5a-4c2f-a836-aa22d055d415-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 11 01:50:30 crc kubenswrapper[4743]: I1011 01:50:30.977573 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dwkfr" event={"ID":"43197ff3-1a5a-4c2f-a836-aa22d055d415","Type":"ContainerDied","Data":"d14d0333c0773fd06e41e79d7e9212a2bdb27e920a3bbe7afb8412d7e6099835"} Oct 11 01:50:30 crc kubenswrapper[4743]: I1011 01:50:30.977810 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d14d0333c0773fd06e41e79d7e9212a2bdb27e920a3bbe7afb8412d7e6099835" Oct 11 01:50:30 crc kubenswrapper[4743]: I1011 01:50:30.977686 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dwkfr" Oct 11 01:50:31 crc kubenswrapper[4743]: I1011 01:50:31.086935 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-7kvlt"] Oct 11 01:50:31 crc kubenswrapper[4743]: E1011 01:50:31.087778 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83785583-fae5-4725-b478-c6da383526dc" containerName="extract-utilities" Oct 11 01:50:31 crc kubenswrapper[4743]: I1011 01:50:31.087953 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="83785583-fae5-4725-b478-c6da383526dc" containerName="extract-utilities" Oct 11 01:50:31 crc kubenswrapper[4743]: E1011 01:50:31.088080 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83785583-fae5-4725-b478-c6da383526dc" containerName="registry-server" Oct 11 01:50:31 crc kubenswrapper[4743]: I1011 01:50:31.088185 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="83785583-fae5-4725-b478-c6da383526dc" containerName="registry-server" Oct 11 01:50:31 crc kubenswrapper[4743]: E1011 01:50:31.088304 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43197ff3-1a5a-4c2f-a836-aa22d055d415" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 11 01:50:31 crc kubenswrapper[4743]: I1011 01:50:31.088399 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="43197ff3-1a5a-4c2f-a836-aa22d055d415" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 11 01:50:31 crc kubenswrapper[4743]: E1011 01:50:31.088476 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83785583-fae5-4725-b478-c6da383526dc" containerName="extract-content" Oct 11 01:50:31 crc kubenswrapper[4743]: I1011 01:50:31.088551 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="83785583-fae5-4725-b478-c6da383526dc" containerName="extract-content" Oct 11 01:50:31 crc kubenswrapper[4743]: I1011 01:50:31.088933 4743 
memory_manager.go:354] "RemoveStaleState removing state" podUID="43197ff3-1a5a-4c2f-a836-aa22d055d415" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 11 01:50:31 crc kubenswrapper[4743]: I1011 01:50:31.089043 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="83785583-fae5-4725-b478-c6da383526dc" containerName="registry-server" Oct 11 01:50:31 crc kubenswrapper[4743]: I1011 01:50:31.090161 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-7kvlt" Oct 11 01:50:31 crc kubenswrapper[4743]: I1011 01:50:31.092829 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 11 01:50:31 crc kubenswrapper[4743]: I1011 01:50:31.093761 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-7kvlt"] Oct 11 01:50:31 crc kubenswrapper[4743]: I1011 01:50:31.093974 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 11 01:50:31 crc kubenswrapper[4743]: I1011 01:50:31.094116 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 11 01:50:31 crc kubenswrapper[4743]: I1011 01:50:31.098090 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 11 01:50:31 crc kubenswrapper[4743]: I1011 01:50:31.099275 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8vmmn" Oct 11 01:50:31 crc kubenswrapper[4743]: I1011 01:50:31.105316 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gf55h\" (UniqueName: \"kubernetes.io/projected/45eba0ce-a54b-4530-a391-35572fb868aa-kube-api-access-gf55h\") pod \"ssh-known-hosts-edpm-deployment-7kvlt\" (UID: \"45eba0ce-a54b-4530-a391-35572fb868aa\") 
" pod="openstack/ssh-known-hosts-edpm-deployment-7kvlt" Oct 11 01:50:31 crc kubenswrapper[4743]: I1011 01:50:31.105424 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/45eba0ce-a54b-4530-a391-35572fb868aa-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-7kvlt\" (UID: \"45eba0ce-a54b-4530-a391-35572fb868aa\") " pod="openstack/ssh-known-hosts-edpm-deployment-7kvlt" Oct 11 01:50:31 crc kubenswrapper[4743]: I1011 01:50:31.105486 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/45eba0ce-a54b-4530-a391-35572fb868aa-ceph\") pod \"ssh-known-hosts-edpm-deployment-7kvlt\" (UID: \"45eba0ce-a54b-4530-a391-35572fb868aa\") " pod="openstack/ssh-known-hosts-edpm-deployment-7kvlt" Oct 11 01:50:31 crc kubenswrapper[4743]: I1011 01:50:31.105529 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/45eba0ce-a54b-4530-a391-35572fb868aa-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-7kvlt\" (UID: \"45eba0ce-a54b-4530-a391-35572fb868aa\") " pod="openstack/ssh-known-hosts-edpm-deployment-7kvlt" Oct 11 01:50:31 crc kubenswrapper[4743]: I1011 01:50:31.207989 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gf55h\" (UniqueName: \"kubernetes.io/projected/45eba0ce-a54b-4530-a391-35572fb868aa-kube-api-access-gf55h\") pod \"ssh-known-hosts-edpm-deployment-7kvlt\" (UID: \"45eba0ce-a54b-4530-a391-35572fb868aa\") " pod="openstack/ssh-known-hosts-edpm-deployment-7kvlt" Oct 11 01:50:31 crc kubenswrapper[4743]: I1011 01:50:31.208048 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/45eba0ce-a54b-4530-a391-35572fb868aa-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-7kvlt\" (UID: \"45eba0ce-a54b-4530-a391-35572fb868aa\") " pod="openstack/ssh-known-hosts-edpm-deployment-7kvlt" Oct 11 01:50:31 crc kubenswrapper[4743]: I1011 01:50:31.208082 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/45eba0ce-a54b-4530-a391-35572fb868aa-ceph\") pod \"ssh-known-hosts-edpm-deployment-7kvlt\" (UID: \"45eba0ce-a54b-4530-a391-35572fb868aa\") " pod="openstack/ssh-known-hosts-edpm-deployment-7kvlt" Oct 11 01:50:31 crc kubenswrapper[4743]: I1011 01:50:31.208114 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/45eba0ce-a54b-4530-a391-35572fb868aa-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-7kvlt\" (UID: \"45eba0ce-a54b-4530-a391-35572fb868aa\") " pod="openstack/ssh-known-hosts-edpm-deployment-7kvlt" Oct 11 01:50:31 crc kubenswrapper[4743]: I1011 01:50:31.221665 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/45eba0ce-a54b-4530-a391-35572fb868aa-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-7kvlt\" (UID: \"45eba0ce-a54b-4530-a391-35572fb868aa\") " pod="openstack/ssh-known-hosts-edpm-deployment-7kvlt" Oct 11 01:50:31 crc kubenswrapper[4743]: I1011 01:50:31.221781 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/45eba0ce-a54b-4530-a391-35572fb868aa-ceph\") pod \"ssh-known-hosts-edpm-deployment-7kvlt\" (UID: \"45eba0ce-a54b-4530-a391-35572fb868aa\") " pod="openstack/ssh-known-hosts-edpm-deployment-7kvlt" Oct 11 01:50:31 crc kubenswrapper[4743]: I1011 01:50:31.222434 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/45eba0ce-a54b-4530-a391-35572fb868aa-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-7kvlt\" (UID: \"45eba0ce-a54b-4530-a391-35572fb868aa\") " pod="openstack/ssh-known-hosts-edpm-deployment-7kvlt" Oct 11 01:50:31 crc kubenswrapper[4743]: I1011 01:50:31.236528 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gf55h\" (UniqueName: \"kubernetes.io/projected/45eba0ce-a54b-4530-a391-35572fb868aa-kube-api-access-gf55h\") pod \"ssh-known-hosts-edpm-deployment-7kvlt\" (UID: \"45eba0ce-a54b-4530-a391-35572fb868aa\") " pod="openstack/ssh-known-hosts-edpm-deployment-7kvlt" Oct 11 01:50:31 crc kubenswrapper[4743]: I1011 01:50:31.414378 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-7kvlt" Oct 11 01:50:32 crc kubenswrapper[4743]: I1011 01:50:32.054050 4743 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 11 01:50:32 crc kubenswrapper[4743]: I1011 01:50:32.056837 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-7kvlt"] Oct 11 01:50:32 crc kubenswrapper[4743]: I1011 01:50:32.994336 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-7kvlt" event={"ID":"45eba0ce-a54b-4530-a391-35572fb868aa","Type":"ContainerStarted","Data":"b5c16c5b85651bbfbca270423a4c8f8d80bbb77db58dfe521798fa36ca12c4db"} Oct 11 01:50:32 crc kubenswrapper[4743]: I1011 01:50:32.994806 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-7kvlt" event={"ID":"45eba0ce-a54b-4530-a391-35572fb868aa","Type":"ContainerStarted","Data":"18ee9dc1df278a8f8e5558e1d01f49f783dd4de03c117416204127e98d43699e"} Oct 11 01:50:33 crc kubenswrapper[4743]: I1011 01:50:33.027950 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ssh-known-hosts-edpm-deployment-7kvlt" podStartSLOduration=1.504970227 podStartE2EDuration="2.027933588s" podCreationTimestamp="2025-10-11 01:50:31 +0000 UTC" firstStartedPulling="2025-10-11 01:50:32.053741574 +0000 UTC m=+3526.706721971" lastFinishedPulling="2025-10-11 01:50:32.576704895 +0000 UTC m=+3527.229685332" observedRunningTime="2025-10-11 01:50:33.022155658 +0000 UTC m=+3527.675136185" watchObservedRunningTime="2025-10-11 01:50:33.027933588 +0000 UTC m=+3527.680913985" Oct 11 01:50:44 crc kubenswrapper[4743]: I1011 01:50:44.458844 4743 patch_prober.go:28] interesting pod/machine-config-daemon-cvm72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 01:50:44 crc kubenswrapper[4743]: I1011 01:50:44.459524 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 01:50:47 crc kubenswrapper[4743]: I1011 01:50:47.177295 4743 generic.go:334] "Generic (PLEG): container finished" podID="45eba0ce-a54b-4530-a391-35572fb868aa" containerID="b5c16c5b85651bbfbca270423a4c8f8d80bbb77db58dfe521798fa36ca12c4db" exitCode=0 Oct 11 01:50:47 crc kubenswrapper[4743]: I1011 01:50:47.177778 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-7kvlt" event={"ID":"45eba0ce-a54b-4530-a391-35572fb868aa","Type":"ContainerDied","Data":"b5c16c5b85651bbfbca270423a4c8f8d80bbb77db58dfe521798fa36ca12c4db"} Oct 11 01:50:48 crc kubenswrapper[4743]: I1011 01:50:48.657099 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-7kvlt" Oct 11 01:50:48 crc kubenswrapper[4743]: I1011 01:50:48.819958 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/45eba0ce-a54b-4530-a391-35572fb868aa-ceph\") pod \"45eba0ce-a54b-4530-a391-35572fb868aa\" (UID: \"45eba0ce-a54b-4530-a391-35572fb868aa\") " Oct 11 01:50:48 crc kubenswrapper[4743]: I1011 01:50:48.820373 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/45eba0ce-a54b-4530-a391-35572fb868aa-ssh-key-openstack-edpm-ipam\") pod \"45eba0ce-a54b-4530-a391-35572fb868aa\" (UID: \"45eba0ce-a54b-4530-a391-35572fb868aa\") " Oct 11 01:50:48 crc kubenswrapper[4743]: I1011 01:50:48.820626 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf55h\" (UniqueName: \"kubernetes.io/projected/45eba0ce-a54b-4530-a391-35572fb868aa-kube-api-access-gf55h\") pod \"45eba0ce-a54b-4530-a391-35572fb868aa\" (UID: \"45eba0ce-a54b-4530-a391-35572fb868aa\") " Oct 11 01:50:48 crc kubenswrapper[4743]: I1011 01:50:48.820887 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/45eba0ce-a54b-4530-a391-35572fb868aa-inventory-0\") pod \"45eba0ce-a54b-4530-a391-35572fb868aa\" (UID: \"45eba0ce-a54b-4530-a391-35572fb868aa\") " Oct 11 01:50:48 crc kubenswrapper[4743]: I1011 01:50:48.825435 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45eba0ce-a54b-4530-a391-35572fb868aa-ceph" (OuterVolumeSpecName: "ceph") pod "45eba0ce-a54b-4530-a391-35572fb868aa" (UID: "45eba0ce-a54b-4530-a391-35572fb868aa"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:50:48 crc kubenswrapper[4743]: I1011 01:50:48.827486 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45eba0ce-a54b-4530-a391-35572fb868aa-kube-api-access-gf55h" (OuterVolumeSpecName: "kube-api-access-gf55h") pod "45eba0ce-a54b-4530-a391-35572fb868aa" (UID: "45eba0ce-a54b-4530-a391-35572fb868aa"). InnerVolumeSpecName "kube-api-access-gf55h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:50:48 crc kubenswrapper[4743]: I1011 01:50:48.852246 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45eba0ce-a54b-4530-a391-35572fb868aa-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "45eba0ce-a54b-4530-a391-35572fb868aa" (UID: "45eba0ce-a54b-4530-a391-35572fb868aa"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:50:48 crc kubenswrapper[4743]: I1011 01:50:48.852672 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45eba0ce-a54b-4530-a391-35572fb868aa-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "45eba0ce-a54b-4530-a391-35572fb868aa" (UID: "45eba0ce-a54b-4530-a391-35572fb868aa"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:50:48 crc kubenswrapper[4743]: I1011 01:50:48.922991 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/45eba0ce-a54b-4530-a391-35572fb868aa-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 11 01:50:48 crc kubenswrapper[4743]: I1011 01:50:48.923029 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf55h\" (UniqueName: \"kubernetes.io/projected/45eba0ce-a54b-4530-a391-35572fb868aa-kube-api-access-gf55h\") on node \"crc\" DevicePath \"\"" Oct 11 01:50:48 crc kubenswrapper[4743]: I1011 01:50:48.923039 4743 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/45eba0ce-a54b-4530-a391-35572fb868aa-inventory-0\") on node \"crc\" DevicePath \"\"" Oct 11 01:50:48 crc kubenswrapper[4743]: I1011 01:50:48.923049 4743 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/45eba0ce-a54b-4530-a391-35572fb868aa-ceph\") on node \"crc\" DevicePath \"\"" Oct 11 01:50:49 crc kubenswrapper[4743]: I1011 01:50:49.215845 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-7kvlt" event={"ID":"45eba0ce-a54b-4530-a391-35572fb868aa","Type":"ContainerDied","Data":"18ee9dc1df278a8f8e5558e1d01f49f783dd4de03c117416204127e98d43699e"} Oct 11 01:50:49 crc kubenswrapper[4743]: I1011 01:50:49.216378 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="18ee9dc1df278a8f8e5558e1d01f49f783dd4de03c117416204127e98d43699e" Oct 11 01:50:49 crc kubenswrapper[4743]: I1011 01:50:49.215942 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-7kvlt" Oct 11 01:50:49 crc kubenswrapper[4743]: I1011 01:50:49.294910 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-lrpfr"] Oct 11 01:50:49 crc kubenswrapper[4743]: E1011 01:50:49.295552 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45eba0ce-a54b-4530-a391-35572fb868aa" containerName="ssh-known-hosts-edpm-deployment" Oct 11 01:50:49 crc kubenswrapper[4743]: I1011 01:50:49.295577 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="45eba0ce-a54b-4530-a391-35572fb868aa" containerName="ssh-known-hosts-edpm-deployment" Oct 11 01:50:49 crc kubenswrapper[4743]: I1011 01:50:49.296083 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="45eba0ce-a54b-4530-a391-35572fb868aa" containerName="ssh-known-hosts-edpm-deployment" Oct 11 01:50:49 crc kubenswrapper[4743]: I1011 01:50:49.297247 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-lrpfr" Oct 11 01:50:49 crc kubenswrapper[4743]: I1011 01:50:49.312027 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8vmmn" Oct 11 01:50:49 crc kubenswrapper[4743]: I1011 01:50:49.312104 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 11 01:50:49 crc kubenswrapper[4743]: I1011 01:50:49.312259 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 11 01:50:49 crc kubenswrapper[4743]: I1011 01:50:49.312656 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 11 01:50:49 crc kubenswrapper[4743]: I1011 01:50:49.312816 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 11 01:50:49 crc kubenswrapper[4743]: I1011 01:50:49.312988 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-lrpfr"] Oct 11 01:50:49 crc kubenswrapper[4743]: I1011 01:50:49.434669 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/08cb63b9-9798-4e9f-9df8-7a1676dbe1f8-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-lrpfr\" (UID: \"08cb63b9-9798-4e9f-9df8-7a1676dbe1f8\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-lrpfr" Oct 11 01:50:49 crc kubenswrapper[4743]: I1011 01:50:49.434739 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/08cb63b9-9798-4e9f-9df8-7a1676dbe1f8-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-lrpfr\" (UID: \"08cb63b9-9798-4e9f-9df8-7a1676dbe1f8\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-lrpfr" Oct 11 01:50:49 crc kubenswrapper[4743]: I1011 01:50:49.434782 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/08cb63b9-9798-4e9f-9df8-7a1676dbe1f8-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-lrpfr\" (UID: \"08cb63b9-9798-4e9f-9df8-7a1676dbe1f8\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-lrpfr" Oct 11 01:50:49 crc kubenswrapper[4743]: I1011 01:50:49.434834 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rx56z\" (UniqueName: \"kubernetes.io/projected/08cb63b9-9798-4e9f-9df8-7a1676dbe1f8-kube-api-access-rx56z\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-lrpfr\" (UID: \"08cb63b9-9798-4e9f-9df8-7a1676dbe1f8\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-lrpfr" Oct 11 01:50:49 crc kubenswrapper[4743]: I1011 01:50:49.536453 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rx56z\" (UniqueName: \"kubernetes.io/projected/08cb63b9-9798-4e9f-9df8-7a1676dbe1f8-kube-api-access-rx56z\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-lrpfr\" (UID: \"08cb63b9-9798-4e9f-9df8-7a1676dbe1f8\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-lrpfr" Oct 11 01:50:49 crc kubenswrapper[4743]: I1011 01:50:49.536639 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/08cb63b9-9798-4e9f-9df8-7a1676dbe1f8-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-lrpfr\" (UID: \"08cb63b9-9798-4e9f-9df8-7a1676dbe1f8\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-lrpfr" Oct 11 01:50:49 crc kubenswrapper[4743]: I1011 01:50:49.536682 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/08cb63b9-9798-4e9f-9df8-7a1676dbe1f8-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-lrpfr\" (UID: \"08cb63b9-9798-4e9f-9df8-7a1676dbe1f8\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-lrpfr" Oct 11 01:50:49 crc kubenswrapper[4743]: I1011 01:50:49.536730 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/08cb63b9-9798-4e9f-9df8-7a1676dbe1f8-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-lrpfr\" (UID: \"08cb63b9-9798-4e9f-9df8-7a1676dbe1f8\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-lrpfr" Oct 11 01:50:49 crc kubenswrapper[4743]: I1011 01:50:49.541041 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/08cb63b9-9798-4e9f-9df8-7a1676dbe1f8-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-lrpfr\" (UID: \"08cb63b9-9798-4e9f-9df8-7a1676dbe1f8\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-lrpfr" Oct 11 01:50:49 crc kubenswrapper[4743]: I1011 01:50:49.542067 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/08cb63b9-9798-4e9f-9df8-7a1676dbe1f8-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-lrpfr\" (UID: \"08cb63b9-9798-4e9f-9df8-7a1676dbe1f8\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-lrpfr" Oct 11 01:50:49 crc kubenswrapper[4743]: I1011 01:50:49.546925 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/08cb63b9-9798-4e9f-9df8-7a1676dbe1f8-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-lrpfr\" (UID: \"08cb63b9-9798-4e9f-9df8-7a1676dbe1f8\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-lrpfr" Oct 11 01:50:49 crc kubenswrapper[4743]: I1011 01:50:49.557001 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-rx56z\" (UniqueName: \"kubernetes.io/projected/08cb63b9-9798-4e9f-9df8-7a1676dbe1f8-kube-api-access-rx56z\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-lrpfr\" (UID: \"08cb63b9-9798-4e9f-9df8-7a1676dbe1f8\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-lrpfr" Oct 11 01:50:49 crc kubenswrapper[4743]: I1011 01:50:49.635510 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-lrpfr" Oct 11 01:50:50 crc kubenswrapper[4743]: I1011 01:50:50.181738 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-lrpfr"] Oct 11 01:50:50 crc kubenswrapper[4743]: W1011 01:50:50.184515 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08cb63b9_9798_4e9f_9df8_7a1676dbe1f8.slice/crio-14667c3d7e9345aa84624d7d9c6255b62234f53e443224563182d720c5d141da WatchSource:0}: Error finding container 14667c3d7e9345aa84624d7d9c6255b62234f53e443224563182d720c5d141da: Status 404 returned error can't find the container with id 14667c3d7e9345aa84624d7d9c6255b62234f53e443224563182d720c5d141da Oct 11 01:50:50 crc kubenswrapper[4743]: I1011 01:50:50.225148 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-lrpfr" event={"ID":"08cb63b9-9798-4e9f-9df8-7a1676dbe1f8","Type":"ContainerStarted","Data":"14667c3d7e9345aa84624d7d9c6255b62234f53e443224563182d720c5d141da"} Oct 11 01:50:51 crc kubenswrapper[4743]: I1011 01:50:51.237365 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-lrpfr" event={"ID":"08cb63b9-9798-4e9f-9df8-7a1676dbe1f8","Type":"ContainerStarted","Data":"db60a4a60286cf3d58b66fe856af1b8c6afc589d4f9c86a83ac484257cc16b05"} Oct 11 01:50:51 crc kubenswrapper[4743]: I1011 01:50:51.259318 4743 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-lrpfr" podStartSLOduration=1.860961775 podStartE2EDuration="2.259301147s" podCreationTimestamp="2025-10-11 01:50:49 +0000 UTC" firstStartedPulling="2025-10-11 01:50:50.187047856 +0000 UTC m=+3544.840028253" lastFinishedPulling="2025-10-11 01:50:50.585387228 +0000 UTC m=+3545.238367625" observedRunningTime="2025-10-11 01:50:51.255268519 +0000 UTC m=+3545.908248916" watchObservedRunningTime="2025-10-11 01:50:51.259301147 +0000 UTC m=+3545.912281544" Oct 11 01:51:02 crc kubenswrapper[4743]: I1011 01:51:02.372421 4743 generic.go:334] "Generic (PLEG): container finished" podID="08cb63b9-9798-4e9f-9df8-7a1676dbe1f8" containerID="db60a4a60286cf3d58b66fe856af1b8c6afc589d4f9c86a83ac484257cc16b05" exitCode=0 Oct 11 01:51:02 crc kubenswrapper[4743]: I1011 01:51:02.372534 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-lrpfr" event={"ID":"08cb63b9-9798-4e9f-9df8-7a1676dbe1f8","Type":"ContainerDied","Data":"db60a4a60286cf3d58b66fe856af1b8c6afc589d4f9c86a83ac484257cc16b05"} Oct 11 01:51:03 crc kubenswrapper[4743]: I1011 01:51:03.997266 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-lrpfr" Oct 11 01:51:04 crc kubenswrapper[4743]: I1011 01:51:04.111157 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/08cb63b9-9798-4e9f-9df8-7a1676dbe1f8-ssh-key\") pod \"08cb63b9-9798-4e9f-9df8-7a1676dbe1f8\" (UID: \"08cb63b9-9798-4e9f-9df8-7a1676dbe1f8\") " Oct 11 01:51:04 crc kubenswrapper[4743]: I1011 01:51:04.111242 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/08cb63b9-9798-4e9f-9df8-7a1676dbe1f8-inventory\") pod \"08cb63b9-9798-4e9f-9df8-7a1676dbe1f8\" (UID: \"08cb63b9-9798-4e9f-9df8-7a1676dbe1f8\") " Oct 11 01:51:04 crc kubenswrapper[4743]: I1011 01:51:04.111319 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/08cb63b9-9798-4e9f-9df8-7a1676dbe1f8-ceph\") pod \"08cb63b9-9798-4e9f-9df8-7a1676dbe1f8\" (UID: \"08cb63b9-9798-4e9f-9df8-7a1676dbe1f8\") " Oct 11 01:51:04 crc kubenswrapper[4743]: I1011 01:51:04.111354 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rx56z\" (UniqueName: \"kubernetes.io/projected/08cb63b9-9798-4e9f-9df8-7a1676dbe1f8-kube-api-access-rx56z\") pod \"08cb63b9-9798-4e9f-9df8-7a1676dbe1f8\" (UID: \"08cb63b9-9798-4e9f-9df8-7a1676dbe1f8\") " Oct 11 01:51:04 crc kubenswrapper[4743]: I1011 01:51:04.118473 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08cb63b9-9798-4e9f-9df8-7a1676dbe1f8-kube-api-access-rx56z" (OuterVolumeSpecName: "kube-api-access-rx56z") pod "08cb63b9-9798-4e9f-9df8-7a1676dbe1f8" (UID: "08cb63b9-9798-4e9f-9df8-7a1676dbe1f8"). InnerVolumeSpecName "kube-api-access-rx56z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:51:04 crc kubenswrapper[4743]: I1011 01:51:04.118695 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08cb63b9-9798-4e9f-9df8-7a1676dbe1f8-ceph" (OuterVolumeSpecName: "ceph") pod "08cb63b9-9798-4e9f-9df8-7a1676dbe1f8" (UID: "08cb63b9-9798-4e9f-9df8-7a1676dbe1f8"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:51:04 crc kubenswrapper[4743]: I1011 01:51:04.141441 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08cb63b9-9798-4e9f-9df8-7a1676dbe1f8-inventory" (OuterVolumeSpecName: "inventory") pod "08cb63b9-9798-4e9f-9df8-7a1676dbe1f8" (UID: "08cb63b9-9798-4e9f-9df8-7a1676dbe1f8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:51:04 crc kubenswrapper[4743]: I1011 01:51:04.156431 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08cb63b9-9798-4e9f-9df8-7a1676dbe1f8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "08cb63b9-9798-4e9f-9df8-7a1676dbe1f8" (UID: "08cb63b9-9798-4e9f-9df8-7a1676dbe1f8"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:51:04 crc kubenswrapper[4743]: I1011 01:51:04.215569 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/08cb63b9-9798-4e9f-9df8-7a1676dbe1f8-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 11 01:51:04 crc kubenswrapper[4743]: I1011 01:51:04.215640 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/08cb63b9-9798-4e9f-9df8-7a1676dbe1f8-inventory\") on node \"crc\" DevicePath \"\"" Oct 11 01:51:04 crc kubenswrapper[4743]: I1011 01:51:04.215662 4743 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/08cb63b9-9798-4e9f-9df8-7a1676dbe1f8-ceph\") on node \"crc\" DevicePath \"\"" Oct 11 01:51:04 crc kubenswrapper[4743]: I1011 01:51:04.215683 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rx56z\" (UniqueName: \"kubernetes.io/projected/08cb63b9-9798-4e9f-9df8-7a1676dbe1f8-kube-api-access-rx56z\") on node \"crc\" DevicePath \"\"" Oct 11 01:51:04 crc kubenswrapper[4743]: I1011 01:51:04.400635 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-lrpfr" event={"ID":"08cb63b9-9798-4e9f-9df8-7a1676dbe1f8","Type":"ContainerDied","Data":"14667c3d7e9345aa84624d7d9c6255b62234f53e443224563182d720c5d141da"} Oct 11 01:51:04 crc kubenswrapper[4743]: I1011 01:51:04.401007 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14667c3d7e9345aa84624d7d9c6255b62234f53e443224563182d720c5d141da" Oct 11 01:51:04 crc kubenswrapper[4743]: I1011 01:51:04.400923 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-lrpfr" Oct 11 01:51:04 crc kubenswrapper[4743]: I1011 01:51:04.485001 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-m57rp"] Oct 11 01:51:04 crc kubenswrapper[4743]: E1011 01:51:04.485557 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08cb63b9-9798-4e9f-9df8-7a1676dbe1f8" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 11 01:51:04 crc kubenswrapper[4743]: I1011 01:51:04.485577 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="08cb63b9-9798-4e9f-9df8-7a1676dbe1f8" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 11 01:51:04 crc kubenswrapper[4743]: I1011 01:51:04.485811 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="08cb63b9-9798-4e9f-9df8-7a1676dbe1f8" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 11 01:51:04 crc kubenswrapper[4743]: I1011 01:51:04.486610 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-m57rp" Oct 11 01:51:04 crc kubenswrapper[4743]: I1011 01:51:04.489322 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 11 01:51:04 crc kubenswrapper[4743]: I1011 01:51:04.489513 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 11 01:51:04 crc kubenswrapper[4743]: I1011 01:51:04.489720 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8vmmn" Oct 11 01:51:04 crc kubenswrapper[4743]: I1011 01:51:04.489885 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 11 01:51:04 crc kubenswrapper[4743]: I1011 01:51:04.490797 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 11 01:51:04 crc kubenswrapper[4743]: I1011 01:51:04.504627 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-m57rp"] Oct 11 01:51:04 crc kubenswrapper[4743]: I1011 01:51:04.624337 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cs29h\" (UniqueName: \"kubernetes.io/projected/dc1bb7f9-85e4-4c20-b91b-5dc32f87c1c4-kube-api-access-cs29h\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-m57rp\" (UID: \"dc1bb7f9-85e4-4c20-b91b-5dc32f87c1c4\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-m57rp" Oct 11 01:51:04 crc kubenswrapper[4743]: I1011 01:51:04.624426 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dc1bb7f9-85e4-4c20-b91b-5dc32f87c1c4-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-m57rp\" (UID: \"dc1bb7f9-85e4-4c20-b91b-5dc32f87c1c4\") " 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-m57rp" Oct 11 01:51:04 crc kubenswrapper[4743]: I1011 01:51:04.624490 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dc1bb7f9-85e4-4c20-b91b-5dc32f87c1c4-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-m57rp\" (UID: \"dc1bb7f9-85e4-4c20-b91b-5dc32f87c1c4\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-m57rp" Oct 11 01:51:04 crc kubenswrapper[4743]: I1011 01:51:04.624638 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dc1bb7f9-85e4-4c20-b91b-5dc32f87c1c4-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-m57rp\" (UID: \"dc1bb7f9-85e4-4c20-b91b-5dc32f87c1c4\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-m57rp" Oct 11 01:51:04 crc kubenswrapper[4743]: I1011 01:51:04.727034 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dc1bb7f9-85e4-4c20-b91b-5dc32f87c1c4-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-m57rp\" (UID: \"dc1bb7f9-85e4-4c20-b91b-5dc32f87c1c4\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-m57rp" Oct 11 01:51:04 crc kubenswrapper[4743]: I1011 01:51:04.727183 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cs29h\" (UniqueName: \"kubernetes.io/projected/dc1bb7f9-85e4-4c20-b91b-5dc32f87c1c4-kube-api-access-cs29h\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-m57rp\" (UID: \"dc1bb7f9-85e4-4c20-b91b-5dc32f87c1c4\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-m57rp" Oct 11 01:51:04 crc kubenswrapper[4743]: I1011 01:51:04.727237 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/dc1bb7f9-85e4-4c20-b91b-5dc32f87c1c4-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-m57rp\" (UID: \"dc1bb7f9-85e4-4c20-b91b-5dc32f87c1c4\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-m57rp" Oct 11 01:51:04 crc kubenswrapper[4743]: I1011 01:51:04.727282 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dc1bb7f9-85e4-4c20-b91b-5dc32f87c1c4-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-m57rp\" (UID: \"dc1bb7f9-85e4-4c20-b91b-5dc32f87c1c4\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-m57rp" Oct 11 01:51:04 crc kubenswrapper[4743]: I1011 01:51:04.732262 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dc1bb7f9-85e4-4c20-b91b-5dc32f87c1c4-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-m57rp\" (UID: \"dc1bb7f9-85e4-4c20-b91b-5dc32f87c1c4\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-m57rp" Oct 11 01:51:04 crc kubenswrapper[4743]: I1011 01:51:04.732754 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dc1bb7f9-85e4-4c20-b91b-5dc32f87c1c4-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-m57rp\" (UID: \"dc1bb7f9-85e4-4c20-b91b-5dc32f87c1c4\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-m57rp" Oct 11 01:51:04 crc kubenswrapper[4743]: I1011 01:51:04.738068 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dc1bb7f9-85e4-4c20-b91b-5dc32f87c1c4-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-m57rp\" (UID: \"dc1bb7f9-85e4-4c20-b91b-5dc32f87c1c4\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-m57rp" Oct 11 01:51:04 crc kubenswrapper[4743]: I1011 01:51:04.743129 4743 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-cs29h\" (UniqueName: \"kubernetes.io/projected/dc1bb7f9-85e4-4c20-b91b-5dc32f87c1c4-kube-api-access-cs29h\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-m57rp\" (UID: \"dc1bb7f9-85e4-4c20-b91b-5dc32f87c1c4\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-m57rp" Oct 11 01:51:04 crc kubenswrapper[4743]: I1011 01:51:04.807952 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-m57rp" Oct 11 01:51:05 crc kubenswrapper[4743]: I1011 01:51:05.326741 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-m57rp"] Oct 11 01:51:05 crc kubenswrapper[4743]: I1011 01:51:05.413203 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-m57rp" event={"ID":"dc1bb7f9-85e4-4c20-b91b-5dc32f87c1c4","Type":"ContainerStarted","Data":"ad29a325b800b5c24ca452c236ae4cf7245357335b39aac42c5402c283726d0b"} Oct 11 01:51:06 crc kubenswrapper[4743]: I1011 01:51:06.428261 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-m57rp" event={"ID":"dc1bb7f9-85e4-4c20-b91b-5dc32f87c1c4","Type":"ContainerStarted","Data":"7c268662d0a1462b8242fb60846c6144634492c0d2bd753b26ae13b17d6f1375"} Oct 11 01:51:14 crc kubenswrapper[4743]: I1011 01:51:14.458447 4743 patch_prober.go:28] interesting pod/machine-config-daemon-cvm72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 01:51:14 crc kubenswrapper[4743]: I1011 01:51:14.459136 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" 
podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 01:51:14 crc kubenswrapper[4743]: I1011 01:51:14.459194 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" Oct 11 01:51:14 crc kubenswrapper[4743]: I1011 01:51:14.460120 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3f38cba11851c95fa12b6d21a7746bd10b6e69b2d33debe34b378719ccfc7393"} pod="openshift-machine-config-operator/machine-config-daemon-cvm72" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 11 01:51:14 crc kubenswrapper[4743]: I1011 01:51:14.460185 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" containerID="cri-o://3f38cba11851c95fa12b6d21a7746bd10b6e69b2d33debe34b378719ccfc7393" gracePeriod=600 Oct 11 01:51:14 crc kubenswrapper[4743]: E1011 01:51:14.586344 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:51:15 crc kubenswrapper[4743]: I1011 01:51:15.543781 4743 generic.go:334] "Generic (PLEG): container finished" podID="add92263-e252-446b-95de-092585b4357f" containerID="3f38cba11851c95fa12b6d21a7746bd10b6e69b2d33debe34b378719ccfc7393" exitCode=0 Oct 11 
01:51:15 crc kubenswrapper[4743]: I1011 01:51:15.543876 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" event={"ID":"add92263-e252-446b-95de-092585b4357f","Type":"ContainerDied","Data":"3f38cba11851c95fa12b6d21a7746bd10b6e69b2d33debe34b378719ccfc7393"} Oct 11 01:51:15 crc kubenswrapper[4743]: I1011 01:51:15.544214 4743 scope.go:117] "RemoveContainer" containerID="40fabd8824e0053fbc030d07f0d398ca43bf5381e369efae8cdd092e89d91e84" Oct 11 01:51:15 crc kubenswrapper[4743]: I1011 01:51:15.545036 4743 scope.go:117] "RemoveContainer" containerID="3f38cba11851c95fa12b6d21a7746bd10b6e69b2d33debe34b378719ccfc7393" Oct 11 01:51:15 crc kubenswrapper[4743]: E1011 01:51:15.545430 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:51:15 crc kubenswrapper[4743]: I1011 01:51:15.566282 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-m57rp" podStartSLOduration=11.146100051 podStartE2EDuration="11.566261731s" podCreationTimestamp="2025-10-11 01:51:04 +0000 UTC" firstStartedPulling="2025-10-11 01:51:05.336307409 +0000 UTC m=+3559.989287816" lastFinishedPulling="2025-10-11 01:51:05.756469059 +0000 UTC m=+3560.409449496" observedRunningTime="2025-10-11 01:51:06.456935221 +0000 UTC m=+3561.109915628" watchObservedRunningTime="2025-10-11 01:51:15.566261731 +0000 UTC m=+3570.219242138" Oct 11 01:51:19 crc kubenswrapper[4743]: I1011 01:51:19.599293 4743 generic.go:334] "Generic (PLEG): container finished" podID="dc1bb7f9-85e4-4c20-b91b-5dc32f87c1c4" 
containerID="7c268662d0a1462b8242fb60846c6144634492c0d2bd753b26ae13b17d6f1375" exitCode=0 Oct 11 01:51:19 crc kubenswrapper[4743]: I1011 01:51:19.599480 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-m57rp" event={"ID":"dc1bb7f9-85e4-4c20-b91b-5dc32f87c1c4","Type":"ContainerDied","Data":"7c268662d0a1462b8242fb60846c6144634492c0d2bd753b26ae13b17d6f1375"} Oct 11 01:51:21 crc kubenswrapper[4743]: I1011 01:51:21.100755 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-m57rp" Oct 11 01:51:21 crc kubenswrapper[4743]: I1011 01:51:21.232594 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dc1bb7f9-85e4-4c20-b91b-5dc32f87c1c4-inventory\") pod \"dc1bb7f9-85e4-4c20-b91b-5dc32f87c1c4\" (UID: \"dc1bb7f9-85e4-4c20-b91b-5dc32f87c1c4\") " Oct 11 01:51:21 crc kubenswrapper[4743]: I1011 01:51:21.232826 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cs29h\" (UniqueName: \"kubernetes.io/projected/dc1bb7f9-85e4-4c20-b91b-5dc32f87c1c4-kube-api-access-cs29h\") pod \"dc1bb7f9-85e4-4c20-b91b-5dc32f87c1c4\" (UID: \"dc1bb7f9-85e4-4c20-b91b-5dc32f87c1c4\") " Oct 11 01:51:21 crc kubenswrapper[4743]: I1011 01:51:21.232882 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dc1bb7f9-85e4-4c20-b91b-5dc32f87c1c4-ssh-key\") pod \"dc1bb7f9-85e4-4c20-b91b-5dc32f87c1c4\" (UID: \"dc1bb7f9-85e4-4c20-b91b-5dc32f87c1c4\") " Oct 11 01:51:21 crc kubenswrapper[4743]: I1011 01:51:21.232940 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dc1bb7f9-85e4-4c20-b91b-5dc32f87c1c4-ceph\") pod \"dc1bb7f9-85e4-4c20-b91b-5dc32f87c1c4\" (UID: 
\"dc1bb7f9-85e4-4c20-b91b-5dc32f87c1c4\") " Oct 11 01:51:21 crc kubenswrapper[4743]: I1011 01:51:21.248127 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc1bb7f9-85e4-4c20-b91b-5dc32f87c1c4-ceph" (OuterVolumeSpecName: "ceph") pod "dc1bb7f9-85e4-4c20-b91b-5dc32f87c1c4" (UID: "dc1bb7f9-85e4-4c20-b91b-5dc32f87c1c4"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:51:21 crc kubenswrapper[4743]: I1011 01:51:21.248450 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc1bb7f9-85e4-4c20-b91b-5dc32f87c1c4-kube-api-access-cs29h" (OuterVolumeSpecName: "kube-api-access-cs29h") pod "dc1bb7f9-85e4-4c20-b91b-5dc32f87c1c4" (UID: "dc1bb7f9-85e4-4c20-b91b-5dc32f87c1c4"). InnerVolumeSpecName "kube-api-access-cs29h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:51:21 crc kubenswrapper[4743]: I1011 01:51:21.265724 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc1bb7f9-85e4-4c20-b91b-5dc32f87c1c4-inventory" (OuterVolumeSpecName: "inventory") pod "dc1bb7f9-85e4-4c20-b91b-5dc32f87c1c4" (UID: "dc1bb7f9-85e4-4c20-b91b-5dc32f87c1c4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:51:21 crc kubenswrapper[4743]: I1011 01:51:21.266038 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc1bb7f9-85e4-4c20-b91b-5dc32f87c1c4-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "dc1bb7f9-85e4-4c20-b91b-5dc32f87c1c4" (UID: "dc1bb7f9-85e4-4c20-b91b-5dc32f87c1c4"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:51:21 crc kubenswrapper[4743]: I1011 01:51:21.337048 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dc1bb7f9-85e4-4c20-b91b-5dc32f87c1c4-inventory\") on node \"crc\" DevicePath \"\"" Oct 11 01:51:21 crc kubenswrapper[4743]: I1011 01:51:21.337306 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cs29h\" (UniqueName: \"kubernetes.io/projected/dc1bb7f9-85e4-4c20-b91b-5dc32f87c1c4-kube-api-access-cs29h\") on node \"crc\" DevicePath \"\"" Oct 11 01:51:21 crc kubenswrapper[4743]: I1011 01:51:21.337423 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dc1bb7f9-85e4-4c20-b91b-5dc32f87c1c4-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 11 01:51:21 crc kubenswrapper[4743]: I1011 01:51:21.337507 4743 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dc1bb7f9-85e4-4c20-b91b-5dc32f87c1c4-ceph\") on node \"crc\" DevicePath \"\"" Oct 11 01:51:21 crc kubenswrapper[4743]: I1011 01:51:21.622379 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-m57rp" event={"ID":"dc1bb7f9-85e4-4c20-b91b-5dc32f87c1c4","Type":"ContainerDied","Data":"ad29a325b800b5c24ca452c236ae4cf7245357335b39aac42c5402c283726d0b"} Oct 11 01:51:21 crc kubenswrapper[4743]: I1011 01:51:21.622432 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad29a325b800b5c24ca452c236ae4cf7245357335b39aac42c5402c283726d0b" Oct 11 01:51:21 crc kubenswrapper[4743]: I1011 01:51:21.622955 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-m57rp" Oct 11 01:51:21 crc kubenswrapper[4743]: I1011 01:51:21.731081 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9"] Oct 11 01:51:21 crc kubenswrapper[4743]: E1011 01:51:21.731683 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc1bb7f9-85e4-4c20-b91b-5dc32f87c1c4" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 11 01:51:21 crc kubenswrapper[4743]: I1011 01:51:21.731708 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc1bb7f9-85e4-4c20-b91b-5dc32f87c1c4" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 11 01:51:21 crc kubenswrapper[4743]: I1011 01:51:21.731973 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc1bb7f9-85e4-4c20-b91b-5dc32f87c1c4" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 11 01:51:21 crc kubenswrapper[4743]: I1011 01:51:21.732975 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9" Oct 11 01:51:21 crc kubenswrapper[4743]: I1011 01:51:21.742513 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 11 01:51:21 crc kubenswrapper[4743]: I1011 01:51:21.743164 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Oct 11 01:51:21 crc kubenswrapper[4743]: I1011 01:51:21.743468 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Oct 11 01:51:21 crc kubenswrapper[4743]: I1011 01:51:21.743741 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Oct 11 01:51:21 crc kubenswrapper[4743]: I1011 01:51:21.743921 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0" Oct 11 01:51:21 crc kubenswrapper[4743]: I1011 01:51:21.743407 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 11 01:51:21 crc kubenswrapper[4743]: I1011 01:51:21.744175 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Oct 11 01:51:21 crc kubenswrapper[4743]: I1011 01:51:21.744793 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 11 01:51:21 crc kubenswrapper[4743]: I1011 01:51:21.745106 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8vmmn" Oct 11 01:51:21 crc kubenswrapper[4743]: I1011 01:51:21.745376 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 11 01:51:21 crc kubenswrapper[4743]: I1011 
01:51:21.753414 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9"] Oct 11 01:51:21 crc kubenswrapper[4743]: I1011 01:51:21.850147 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/57fc6fbe-24cd-4185-a91e-dd39258e8d05-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9\" (UID: \"57fc6fbe-24cd-4185-a91e-dd39258e8d05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9" Oct 11 01:51:21 crc kubenswrapper[4743]: I1011 01:51:21.850227 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/57fc6fbe-24cd-4185-a91e-dd39258e8d05-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9\" (UID: \"57fc6fbe-24cd-4185-a91e-dd39258e8d05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9" Oct 11 01:51:21 crc kubenswrapper[4743]: I1011 01:51:21.850274 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/57fc6fbe-24cd-4185-a91e-dd39258e8d05-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9\" (UID: \"57fc6fbe-24cd-4185-a91e-dd39258e8d05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9" Oct 11 01:51:21 crc kubenswrapper[4743]: I1011 01:51:21.851112 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/57fc6fbe-24cd-4185-a91e-dd39258e8d05-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9\" (UID: 
\"57fc6fbe-24cd-4185-a91e-dd39258e8d05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9" Oct 11 01:51:21 crc kubenswrapper[4743]: I1011 01:51:21.851160 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/57fc6fbe-24cd-4185-a91e-dd39258e8d05-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9\" (UID: \"57fc6fbe-24cd-4185-a91e-dd39258e8d05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9" Oct 11 01:51:21 crc kubenswrapper[4743]: I1011 01:51:21.851183 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57fc6fbe-24cd-4185-a91e-dd39258e8d05-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9\" (UID: \"57fc6fbe-24cd-4185-a91e-dd39258e8d05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9" Oct 11 01:51:21 crc kubenswrapper[4743]: I1011 01:51:21.851201 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57fc6fbe-24cd-4185-a91e-dd39258e8d05-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9\" (UID: \"57fc6fbe-24cd-4185-a91e-dd39258e8d05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9" Oct 11 01:51:21 crc kubenswrapper[4743]: I1011 01:51:21.851221 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbh6s\" (UniqueName: \"kubernetes.io/projected/57fc6fbe-24cd-4185-a91e-dd39258e8d05-kube-api-access-wbh6s\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9\" (UID: 
\"57fc6fbe-24cd-4185-a91e-dd39258e8d05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9" Oct 11 01:51:21 crc kubenswrapper[4743]: I1011 01:51:21.851255 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57fc6fbe-24cd-4185-a91e-dd39258e8d05-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9\" (UID: \"57fc6fbe-24cd-4185-a91e-dd39258e8d05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9" Oct 11 01:51:21 crc kubenswrapper[4743]: I1011 01:51:21.851301 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57fc6fbe-24cd-4185-a91e-dd39258e8d05-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9\" (UID: \"57fc6fbe-24cd-4185-a91e-dd39258e8d05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9" Oct 11 01:51:21 crc kubenswrapper[4743]: I1011 01:51:21.851337 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57fc6fbe-24cd-4185-a91e-dd39258e8d05-telemetry-power-monitoring-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9\" (UID: \"57fc6fbe-24cd-4185-a91e-dd39258e8d05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9" Oct 11 01:51:21 crc kubenswrapper[4743]: I1011 01:51:21.851356 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57fc6fbe-24cd-4185-a91e-dd39258e8d05-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9\" (UID: 
\"57fc6fbe-24cd-4185-a91e-dd39258e8d05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9" Oct 11 01:51:21 crc kubenswrapper[4743]: I1011 01:51:21.851380 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/57fc6fbe-24cd-4185-a91e-dd39258e8d05-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9\" (UID: \"57fc6fbe-24cd-4185-a91e-dd39258e8d05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9" Oct 11 01:51:21 crc kubenswrapper[4743]: I1011 01:51:21.851422 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/57fc6fbe-24cd-4185-a91e-dd39258e8d05-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9\" (UID: \"57fc6fbe-24cd-4185-a91e-dd39258e8d05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9" Oct 11 01:51:21 crc kubenswrapper[4743]: I1011 01:51:21.851478 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57fc6fbe-24cd-4185-a91e-dd39258e8d05-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9\" (UID: \"57fc6fbe-24cd-4185-a91e-dd39258e8d05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9" Oct 11 01:51:21 crc kubenswrapper[4743]: I1011 01:51:21.851510 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/57fc6fbe-24cd-4185-a91e-dd39258e8d05-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9\" (UID: \"57fc6fbe-24cd-4185-a91e-dd39258e8d05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9" Oct 11 01:51:21 crc kubenswrapper[4743]: I1011 01:51:21.851541 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57fc6fbe-24cd-4185-a91e-dd39258e8d05-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9\" (UID: \"57fc6fbe-24cd-4185-a91e-dd39258e8d05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9" Oct 11 01:51:21 crc kubenswrapper[4743]: I1011 01:51:21.954237 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57fc6fbe-24cd-4185-a91e-dd39258e8d05-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9\" (UID: \"57fc6fbe-24cd-4185-a91e-dd39258e8d05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9" Oct 11 01:51:21 crc kubenswrapper[4743]: I1011 01:51:21.954308 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57fc6fbe-24cd-4185-a91e-dd39258e8d05-telemetry-power-monitoring-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9\" (UID: \"57fc6fbe-24cd-4185-a91e-dd39258e8d05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9" Oct 11 01:51:21 crc kubenswrapper[4743]: I1011 01:51:21.954339 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57fc6fbe-24cd-4185-a91e-dd39258e8d05-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9\" (UID: 
\"57fc6fbe-24cd-4185-a91e-dd39258e8d05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9" Oct 11 01:51:21 crc kubenswrapper[4743]: I1011 01:51:21.954360 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/57fc6fbe-24cd-4185-a91e-dd39258e8d05-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9\" (UID: \"57fc6fbe-24cd-4185-a91e-dd39258e8d05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9" Oct 11 01:51:21 crc kubenswrapper[4743]: I1011 01:51:21.954428 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/57fc6fbe-24cd-4185-a91e-dd39258e8d05-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9\" (UID: \"57fc6fbe-24cd-4185-a91e-dd39258e8d05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9" Oct 11 01:51:21 crc kubenswrapper[4743]: I1011 01:51:21.954525 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57fc6fbe-24cd-4185-a91e-dd39258e8d05-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9\" (UID: \"57fc6fbe-24cd-4185-a91e-dd39258e8d05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9" Oct 11 01:51:21 crc kubenswrapper[4743]: I1011 01:51:21.954576 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/57fc6fbe-24cd-4185-a91e-dd39258e8d05-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9\" (UID: \"57fc6fbe-24cd-4185-a91e-dd39258e8d05\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9" Oct 11 01:51:21 crc kubenswrapper[4743]: I1011 01:51:21.954611 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57fc6fbe-24cd-4185-a91e-dd39258e8d05-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9\" (UID: \"57fc6fbe-24cd-4185-a91e-dd39258e8d05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9" Oct 11 01:51:21 crc kubenswrapper[4743]: I1011 01:51:21.954666 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/57fc6fbe-24cd-4185-a91e-dd39258e8d05-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9\" (UID: \"57fc6fbe-24cd-4185-a91e-dd39258e8d05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9" Oct 11 01:51:21 crc kubenswrapper[4743]: I1011 01:51:21.954697 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/57fc6fbe-24cd-4185-a91e-dd39258e8d05-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9\" (UID: \"57fc6fbe-24cd-4185-a91e-dd39258e8d05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9" Oct 11 01:51:21 crc kubenswrapper[4743]: I1011 01:51:21.954729 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/57fc6fbe-24cd-4185-a91e-dd39258e8d05-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9\" (UID: \"57fc6fbe-24cd-4185-a91e-dd39258e8d05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9" Oct 11 01:51:21 crc 
kubenswrapper[4743]: I1011 01:51:21.954764 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/57fc6fbe-24cd-4185-a91e-dd39258e8d05-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9\" (UID: \"57fc6fbe-24cd-4185-a91e-dd39258e8d05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9" Oct 11 01:51:21 crc kubenswrapper[4743]: I1011 01:51:21.954807 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/57fc6fbe-24cd-4185-a91e-dd39258e8d05-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9\" (UID: \"57fc6fbe-24cd-4185-a91e-dd39258e8d05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9" Oct 11 01:51:21 crc kubenswrapper[4743]: I1011 01:51:21.954831 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57fc6fbe-24cd-4185-a91e-dd39258e8d05-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9\" (UID: \"57fc6fbe-24cd-4185-a91e-dd39258e8d05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9" Oct 11 01:51:21 crc kubenswrapper[4743]: I1011 01:51:21.954858 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57fc6fbe-24cd-4185-a91e-dd39258e8d05-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9\" (UID: \"57fc6fbe-24cd-4185-a91e-dd39258e8d05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9" Oct 11 01:51:21 crc kubenswrapper[4743]: I1011 01:51:21.954977 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-wbh6s\" (UniqueName: \"kubernetes.io/projected/57fc6fbe-24cd-4185-a91e-dd39258e8d05-kube-api-access-wbh6s\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9\" (UID: \"57fc6fbe-24cd-4185-a91e-dd39258e8d05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9" Oct 11 01:51:21 crc kubenswrapper[4743]: I1011 01:51:21.955015 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57fc6fbe-24cd-4185-a91e-dd39258e8d05-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9\" (UID: \"57fc6fbe-24cd-4185-a91e-dd39258e8d05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9" Oct 11 01:51:21 crc kubenswrapper[4743]: I1011 01:51:21.961217 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/57fc6fbe-24cd-4185-a91e-dd39258e8d05-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9\" (UID: \"57fc6fbe-24cd-4185-a91e-dd39258e8d05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9" Oct 11 01:51:21 crc kubenswrapper[4743]: I1011 01:51:21.961249 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/57fc6fbe-24cd-4185-a91e-dd39258e8d05-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9\" (UID: \"57fc6fbe-24cd-4185-a91e-dd39258e8d05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9" Oct 11 01:51:21 crc kubenswrapper[4743]: I1011 01:51:21.961734 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57fc6fbe-24cd-4185-a91e-dd39258e8d05-libvirt-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9\" (UID: \"57fc6fbe-24cd-4185-a91e-dd39258e8d05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9" Oct 11 01:51:21 crc kubenswrapper[4743]: I1011 01:51:21.962237 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57fc6fbe-24cd-4185-a91e-dd39258e8d05-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9\" (UID: \"57fc6fbe-24cd-4185-a91e-dd39258e8d05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9" Oct 11 01:51:21 crc kubenswrapper[4743]: I1011 01:51:21.962406 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57fc6fbe-24cd-4185-a91e-dd39258e8d05-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9\" (UID: \"57fc6fbe-24cd-4185-a91e-dd39258e8d05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9" Oct 11 01:51:21 crc kubenswrapper[4743]: I1011 01:51:21.964080 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/57fc6fbe-24cd-4185-a91e-dd39258e8d05-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9\" (UID: \"57fc6fbe-24cd-4185-a91e-dd39258e8d05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9" Oct 11 01:51:21 crc kubenswrapper[4743]: I1011 01:51:21.964216 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57fc6fbe-24cd-4185-a91e-dd39258e8d05-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9\" (UID: \"57fc6fbe-24cd-4185-a91e-dd39258e8d05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9" Oct 11 
01:51:21 crc kubenswrapper[4743]: I1011 01:51:21.965225 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/57fc6fbe-24cd-4185-a91e-dd39258e8d05-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9\" (UID: \"57fc6fbe-24cd-4185-a91e-dd39258e8d05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9" Oct 11 01:51:21 crc kubenswrapper[4743]: I1011 01:51:21.965466 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57fc6fbe-24cd-4185-a91e-dd39258e8d05-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9\" (UID: \"57fc6fbe-24cd-4185-a91e-dd39258e8d05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9" Oct 11 01:51:21 crc kubenswrapper[4743]: I1011 01:51:21.966249 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57fc6fbe-24cd-4185-a91e-dd39258e8d05-telemetry-power-monitoring-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9\" (UID: \"57fc6fbe-24cd-4185-a91e-dd39258e8d05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9" Oct 11 01:51:21 crc kubenswrapper[4743]: I1011 01:51:21.966611 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/57fc6fbe-24cd-4185-a91e-dd39258e8d05-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9\" (UID: \"57fc6fbe-24cd-4185-a91e-dd39258e8d05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9" Oct 11 
01:51:21 crc kubenswrapper[4743]: I1011 01:51:21.971763 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/57fc6fbe-24cd-4185-a91e-dd39258e8d05-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9\" (UID: \"57fc6fbe-24cd-4185-a91e-dd39258e8d05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9" Oct 11 01:51:21 crc kubenswrapper[4743]: I1011 01:51:21.971832 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57fc6fbe-24cd-4185-a91e-dd39258e8d05-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9\" (UID: \"57fc6fbe-24cd-4185-a91e-dd39258e8d05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9" Oct 11 01:51:21 crc kubenswrapper[4743]: I1011 01:51:21.971926 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57fc6fbe-24cd-4185-a91e-dd39258e8d05-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9\" (UID: \"57fc6fbe-24cd-4185-a91e-dd39258e8d05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9" Oct 11 01:51:21 crc kubenswrapper[4743]: I1011 01:51:21.972091 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/57fc6fbe-24cd-4185-a91e-dd39258e8d05-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9\" (UID: \"57fc6fbe-24cd-4185-a91e-dd39258e8d05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9" Oct 11 01:51:21 crc kubenswrapper[4743]: I1011 01:51:21.976268 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" 
(UniqueName: \"kubernetes.io/projected/57fc6fbe-24cd-4185-a91e-dd39258e8d05-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9\" (UID: \"57fc6fbe-24cd-4185-a91e-dd39258e8d05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9" Oct 11 01:51:21 crc kubenswrapper[4743]: I1011 01:51:21.976358 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbh6s\" (UniqueName: \"kubernetes.io/projected/57fc6fbe-24cd-4185-a91e-dd39258e8d05-kube-api-access-wbh6s\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9\" (UID: \"57fc6fbe-24cd-4185-a91e-dd39258e8d05\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9" Oct 11 01:51:22 crc kubenswrapper[4743]: I1011 01:51:22.062349 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9" Oct 11 01:51:22 crc kubenswrapper[4743]: I1011 01:51:22.628514 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9"] Oct 11 01:51:22 crc kubenswrapper[4743]: W1011 01:51:22.639830 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod57fc6fbe_24cd_4185_a91e_dd39258e8d05.slice/crio-66ac8a9c3c676555db72d5ea4f3dfe1722b8bdcdde2e1ecdb6e614d4f0720d0f WatchSource:0}: Error finding container 66ac8a9c3c676555db72d5ea4f3dfe1722b8bdcdde2e1ecdb6e614d4f0720d0f: Status 404 returned error can't find the container with id 66ac8a9c3c676555db72d5ea4f3dfe1722b8bdcdde2e1ecdb6e614d4f0720d0f Oct 11 01:51:23 crc kubenswrapper[4743]: I1011 01:51:23.652969 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9" 
event={"ID":"57fc6fbe-24cd-4185-a91e-dd39258e8d05","Type":"ContainerStarted","Data":"05cb64a786707fd92504c0f0f2854b9b7951b682415d294303490efd19258938"} Oct 11 01:51:23 crc kubenswrapper[4743]: I1011 01:51:23.653760 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9" event={"ID":"57fc6fbe-24cd-4185-a91e-dd39258e8d05","Type":"ContainerStarted","Data":"66ac8a9c3c676555db72d5ea4f3dfe1722b8bdcdde2e1ecdb6e614d4f0720d0f"} Oct 11 01:51:23 crc kubenswrapper[4743]: I1011 01:51:23.694087 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9" podStartSLOduration=2.277976749 podStartE2EDuration="2.69405449s" podCreationTimestamp="2025-10-11 01:51:21 +0000 UTC" firstStartedPulling="2025-10-11 01:51:22.645000382 +0000 UTC m=+3577.297980779" lastFinishedPulling="2025-10-11 01:51:23.061078113 +0000 UTC m=+3577.714058520" observedRunningTime="2025-10-11 01:51:23.683684869 +0000 UTC m=+3578.336665306" watchObservedRunningTime="2025-10-11 01:51:23.69405449 +0000 UTC m=+3578.347034967" Oct 11 01:51:28 crc kubenswrapper[4743]: I1011 01:51:28.091557 4743 scope.go:117] "RemoveContainer" containerID="3f38cba11851c95fa12b6d21a7746bd10b6e69b2d33debe34b378719ccfc7393" Oct 11 01:51:28 crc kubenswrapper[4743]: E1011 01:51:28.092625 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:51:42 crc kubenswrapper[4743]: I1011 01:51:42.093175 4743 scope.go:117] "RemoveContainer" containerID="3f38cba11851c95fa12b6d21a7746bd10b6e69b2d33debe34b378719ccfc7393" Oct 
11 01:51:42 crc kubenswrapper[4743]: E1011 01:51:42.094550 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:51:57 crc kubenswrapper[4743]: I1011 01:51:57.092222 4743 scope.go:117] "RemoveContainer" containerID="3f38cba11851c95fa12b6d21a7746bd10b6e69b2d33debe34b378719ccfc7393" Oct 11 01:51:57 crc kubenswrapper[4743]: E1011 01:51:57.093051 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:52:11 crc kubenswrapper[4743]: I1011 01:52:11.092389 4743 scope.go:117] "RemoveContainer" containerID="3f38cba11851c95fa12b6d21a7746bd10b6e69b2d33debe34b378719ccfc7393" Oct 11 01:52:11 crc kubenswrapper[4743]: E1011 01:52:11.094661 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:52:22 crc kubenswrapper[4743]: I1011 01:52:22.093015 4743 scope.go:117] "RemoveContainer" 
containerID="3f38cba11851c95fa12b6d21a7746bd10b6e69b2d33debe34b378719ccfc7393" Oct 11 01:52:22 crc kubenswrapper[4743]: E1011 01:52:22.094297 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:52:34 crc kubenswrapper[4743]: I1011 01:52:34.565361 4743 generic.go:334] "Generic (PLEG): container finished" podID="57fc6fbe-24cd-4185-a91e-dd39258e8d05" containerID="05cb64a786707fd92504c0f0f2854b9b7951b682415d294303490efd19258938" exitCode=0 Oct 11 01:52:34 crc kubenswrapper[4743]: I1011 01:52:34.565587 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9" event={"ID":"57fc6fbe-24cd-4185-a91e-dd39258e8d05","Type":"ContainerDied","Data":"05cb64a786707fd92504c0f0f2854b9b7951b682415d294303490efd19258938"} Oct 11 01:52:36 crc kubenswrapper[4743]: I1011 01:52:36.099281 4743 scope.go:117] "RemoveContainer" containerID="3f38cba11851c95fa12b6d21a7746bd10b6e69b2d33debe34b378719ccfc7393" Oct 11 01:52:36 crc kubenswrapper[4743]: E1011 01:52:36.099968 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:52:36 crc kubenswrapper[4743]: I1011 01:52:36.143377 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9" Oct 11 01:52:36 crc kubenswrapper[4743]: I1011 01:52:36.190316 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/57fc6fbe-24cd-4185-a91e-dd39258e8d05-ssh-key\") pod \"57fc6fbe-24cd-4185-a91e-dd39258e8d05\" (UID: \"57fc6fbe-24cd-4185-a91e-dd39258e8d05\") " Oct 11 01:52:36 crc kubenswrapper[4743]: I1011 01:52:36.190364 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbh6s\" (UniqueName: \"kubernetes.io/projected/57fc6fbe-24cd-4185-a91e-dd39258e8d05-kube-api-access-wbh6s\") pod \"57fc6fbe-24cd-4185-a91e-dd39258e8d05\" (UID: \"57fc6fbe-24cd-4185-a91e-dd39258e8d05\") " Oct 11 01:52:36 crc kubenswrapper[4743]: I1011 01:52:36.190414 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/57fc6fbe-24cd-4185-a91e-dd39258e8d05-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"57fc6fbe-24cd-4185-a91e-dd39258e8d05\" (UID: \"57fc6fbe-24cd-4185-a91e-dd39258e8d05\") " Oct 11 01:52:36 crc kubenswrapper[4743]: I1011 01:52:36.190488 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57fc6fbe-24cd-4185-a91e-dd39258e8d05-neutron-metadata-combined-ca-bundle\") pod \"57fc6fbe-24cd-4185-a91e-dd39258e8d05\" (UID: \"57fc6fbe-24cd-4185-a91e-dd39258e8d05\") " Oct 11 01:52:36 crc kubenswrapper[4743]: I1011 01:52:36.190519 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57fc6fbe-24cd-4185-a91e-dd39258e8d05-libvirt-combined-ca-bundle\") pod \"57fc6fbe-24cd-4185-a91e-dd39258e8d05\" (UID: \"57fc6fbe-24cd-4185-a91e-dd39258e8d05\") " Oct 11 
01:52:36 crc kubenswrapper[4743]: I1011 01:52:36.190614 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57fc6fbe-24cd-4185-a91e-dd39258e8d05-telemetry-power-monitoring-combined-ca-bundle\") pod \"57fc6fbe-24cd-4185-a91e-dd39258e8d05\" (UID: \"57fc6fbe-24cd-4185-a91e-dd39258e8d05\") " Oct 11 01:52:36 crc kubenswrapper[4743]: I1011 01:52:36.190667 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/57fc6fbe-24cd-4185-a91e-dd39258e8d05-openstack-edpm-ipam-ovn-default-certs-0\") pod \"57fc6fbe-24cd-4185-a91e-dd39258e8d05\" (UID: \"57fc6fbe-24cd-4185-a91e-dd39258e8d05\") " Oct 11 01:52:36 crc kubenswrapper[4743]: I1011 01:52:36.190686 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/57fc6fbe-24cd-4185-a91e-dd39258e8d05-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"57fc6fbe-24cd-4185-a91e-dd39258e8d05\" (UID: \"57fc6fbe-24cd-4185-a91e-dd39258e8d05\") " Oct 11 01:52:36 crc kubenswrapper[4743]: I1011 01:52:36.190710 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/57fc6fbe-24cd-4185-a91e-dd39258e8d05-ceph\") pod \"57fc6fbe-24cd-4185-a91e-dd39258e8d05\" (UID: \"57fc6fbe-24cd-4185-a91e-dd39258e8d05\") " Oct 11 01:52:36 crc kubenswrapper[4743]: I1011 01:52:36.190743 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57fc6fbe-24cd-4185-a91e-dd39258e8d05-nova-combined-ca-bundle\") pod \"57fc6fbe-24cd-4185-a91e-dd39258e8d05\" (UID: \"57fc6fbe-24cd-4185-a91e-dd39258e8d05\") " Oct 11 01:52:36 crc kubenswrapper[4743]: I1011 01:52:36.190775 4743 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/57fc6fbe-24cd-4185-a91e-dd39258e8d05-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"57fc6fbe-24cd-4185-a91e-dd39258e8d05\" (UID: \"57fc6fbe-24cd-4185-a91e-dd39258e8d05\") " Oct 11 01:52:36 crc kubenswrapper[4743]: I1011 01:52:36.190813 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/57fc6fbe-24cd-4185-a91e-dd39258e8d05-inventory\") pod \"57fc6fbe-24cd-4185-a91e-dd39258e8d05\" (UID: \"57fc6fbe-24cd-4185-a91e-dd39258e8d05\") " Oct 11 01:52:36 crc kubenswrapper[4743]: I1011 01:52:36.190846 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57fc6fbe-24cd-4185-a91e-dd39258e8d05-ovn-combined-ca-bundle\") pod \"57fc6fbe-24cd-4185-a91e-dd39258e8d05\" (UID: \"57fc6fbe-24cd-4185-a91e-dd39258e8d05\") " Oct 11 01:52:36 crc kubenswrapper[4743]: I1011 01:52:36.190890 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57fc6fbe-24cd-4185-a91e-dd39258e8d05-repo-setup-combined-ca-bundle\") pod \"57fc6fbe-24cd-4185-a91e-dd39258e8d05\" (UID: \"57fc6fbe-24cd-4185-a91e-dd39258e8d05\") " Oct 11 01:52:36 crc kubenswrapper[4743]: I1011 01:52:36.190927 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/57fc6fbe-24cd-4185-a91e-dd39258e8d05-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"57fc6fbe-24cd-4185-a91e-dd39258e8d05\" (UID: \"57fc6fbe-24cd-4185-a91e-dd39258e8d05\") " Oct 11 01:52:36 crc kubenswrapper[4743]: I1011 01:52:36.190998 4743 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57fc6fbe-24cd-4185-a91e-dd39258e8d05-bootstrap-combined-ca-bundle\") pod \"57fc6fbe-24cd-4185-a91e-dd39258e8d05\" (UID: \"57fc6fbe-24cd-4185-a91e-dd39258e8d05\") " Oct 11 01:52:36 crc kubenswrapper[4743]: I1011 01:52:36.191047 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57fc6fbe-24cd-4185-a91e-dd39258e8d05-telemetry-combined-ca-bundle\") pod \"57fc6fbe-24cd-4185-a91e-dd39258e8d05\" (UID: \"57fc6fbe-24cd-4185-a91e-dd39258e8d05\") " Oct 11 01:52:36 crc kubenswrapper[4743]: I1011 01:52:36.200198 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57fc6fbe-24cd-4185-a91e-dd39258e8d05-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "57fc6fbe-24cd-4185-a91e-dd39258e8d05" (UID: "57fc6fbe-24cd-4185-a91e-dd39258e8d05"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:52:36 crc kubenswrapper[4743]: I1011 01:52:36.200251 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57fc6fbe-24cd-4185-a91e-dd39258e8d05-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "57fc6fbe-24cd-4185-a91e-dd39258e8d05" (UID: "57fc6fbe-24cd-4185-a91e-dd39258e8d05"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:52:36 crc kubenswrapper[4743]: I1011 01:52:36.200814 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57fc6fbe-24cd-4185-a91e-dd39258e8d05-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "57fc6fbe-24cd-4185-a91e-dd39258e8d05" (UID: "57fc6fbe-24cd-4185-a91e-dd39258e8d05"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:52:36 crc kubenswrapper[4743]: I1011 01:52:36.201633 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57fc6fbe-24cd-4185-a91e-dd39258e8d05-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "57fc6fbe-24cd-4185-a91e-dd39258e8d05" (UID: "57fc6fbe-24cd-4185-a91e-dd39258e8d05"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:52:36 crc kubenswrapper[4743]: I1011 01:52:36.201672 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57fc6fbe-24cd-4185-a91e-dd39258e8d05-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "57fc6fbe-24cd-4185-a91e-dd39258e8d05" (UID: "57fc6fbe-24cd-4185-a91e-dd39258e8d05"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:52:36 crc kubenswrapper[4743]: I1011 01:52:36.201833 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57fc6fbe-24cd-4185-a91e-dd39258e8d05-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "57fc6fbe-24cd-4185-a91e-dd39258e8d05" (UID: "57fc6fbe-24cd-4185-a91e-dd39258e8d05"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:52:36 crc kubenswrapper[4743]: I1011 01:52:36.202173 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57fc6fbe-24cd-4185-a91e-dd39258e8d05-kube-api-access-wbh6s" (OuterVolumeSpecName: "kube-api-access-wbh6s") pod "57fc6fbe-24cd-4185-a91e-dd39258e8d05" (UID: "57fc6fbe-24cd-4185-a91e-dd39258e8d05"). InnerVolumeSpecName "kube-api-access-wbh6s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:52:36 crc kubenswrapper[4743]: I1011 01:52:36.202613 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57fc6fbe-24cd-4185-a91e-dd39258e8d05-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "57fc6fbe-24cd-4185-a91e-dd39258e8d05" (UID: "57fc6fbe-24cd-4185-a91e-dd39258e8d05"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:52:36 crc kubenswrapper[4743]: I1011 01:52:36.202818 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57fc6fbe-24cd-4185-a91e-dd39258e8d05-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "57fc6fbe-24cd-4185-a91e-dd39258e8d05" (UID: "57fc6fbe-24cd-4185-a91e-dd39258e8d05"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:52:36 crc kubenswrapper[4743]: I1011 01:52:36.203480 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57fc6fbe-24cd-4185-a91e-dd39258e8d05-ceph" (OuterVolumeSpecName: "ceph") pod "57fc6fbe-24cd-4185-a91e-dd39258e8d05" (UID: "57fc6fbe-24cd-4185-a91e-dd39258e8d05"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:52:36 crc kubenswrapper[4743]: I1011 01:52:36.204504 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57fc6fbe-24cd-4185-a91e-dd39258e8d05-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0") pod "57fc6fbe-24cd-4185-a91e-dd39258e8d05" (UID: "57fc6fbe-24cd-4185-a91e-dd39258e8d05"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:52:36 crc kubenswrapper[4743]: I1011 01:52:36.204680 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57fc6fbe-24cd-4185-a91e-dd39258e8d05-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "57fc6fbe-24cd-4185-a91e-dd39258e8d05" (UID: "57fc6fbe-24cd-4185-a91e-dd39258e8d05"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:52:36 crc kubenswrapper[4743]: I1011 01:52:36.204921 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57fc6fbe-24cd-4185-a91e-dd39258e8d05-telemetry-power-monitoring-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-power-monitoring-combined-ca-bundle") pod "57fc6fbe-24cd-4185-a91e-dd39258e8d05" (UID: "57fc6fbe-24cd-4185-a91e-dd39258e8d05"). InnerVolumeSpecName "telemetry-power-monitoring-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:52:36 crc kubenswrapper[4743]: I1011 01:52:36.206021 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57fc6fbe-24cd-4185-a91e-dd39258e8d05-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "57fc6fbe-24cd-4185-a91e-dd39258e8d05" (UID: "57fc6fbe-24cd-4185-a91e-dd39258e8d05"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:52:36 crc kubenswrapper[4743]: I1011 01:52:36.212945 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57fc6fbe-24cd-4185-a91e-dd39258e8d05-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "57fc6fbe-24cd-4185-a91e-dd39258e8d05" (UID: "57fc6fbe-24cd-4185-a91e-dd39258e8d05"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:52:36 crc kubenswrapper[4743]: I1011 01:52:36.230102 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57fc6fbe-24cd-4185-a91e-dd39258e8d05-inventory" (OuterVolumeSpecName: "inventory") pod "57fc6fbe-24cd-4185-a91e-dd39258e8d05" (UID: "57fc6fbe-24cd-4185-a91e-dd39258e8d05"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:52:36 crc kubenswrapper[4743]: I1011 01:52:36.231339 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57fc6fbe-24cd-4185-a91e-dd39258e8d05-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "57fc6fbe-24cd-4185-a91e-dd39258e8d05" (UID: "57fc6fbe-24cd-4185-a91e-dd39258e8d05"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:52:36 crc kubenswrapper[4743]: I1011 01:52:36.293430 4743 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57fc6fbe-24cd-4185-a91e-dd39258e8d05-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 01:52:36 crc kubenswrapper[4743]: I1011 01:52:36.293460 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/57fc6fbe-24cd-4185-a91e-dd39258e8d05-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 11 01:52:36 crc kubenswrapper[4743]: I1011 01:52:36.293473 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wbh6s\" (UniqueName: \"kubernetes.io/projected/57fc6fbe-24cd-4185-a91e-dd39258e8d05-kube-api-access-wbh6s\") on node \"crc\" DevicePath \"\"" Oct 11 01:52:36 crc kubenswrapper[4743]: I1011 01:52:36.293483 4743 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/57fc6fbe-24cd-4185-a91e-dd39258e8d05-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 11 01:52:36 crc kubenswrapper[4743]: I1011 01:52:36.293494 4743 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57fc6fbe-24cd-4185-a91e-dd39258e8d05-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 01:52:36 crc kubenswrapper[4743]: I1011 01:52:36.293506 4743 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57fc6fbe-24cd-4185-a91e-dd39258e8d05-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 01:52:36 crc kubenswrapper[4743]: I1011 01:52:36.293515 4743 reconciler_common.go:293] "Volume detached for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/57fc6fbe-24cd-4185-a91e-dd39258e8d05-telemetry-power-monitoring-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 01:52:36 crc kubenswrapper[4743]: I1011 01:52:36.293524 4743 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/57fc6fbe-24cd-4185-a91e-dd39258e8d05-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 11 01:52:36 crc kubenswrapper[4743]: I1011 01:52:36.293536 4743 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/57fc6fbe-24cd-4185-a91e-dd39258e8d05-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 11 01:52:36 crc kubenswrapper[4743]: I1011 01:52:36.293547 4743 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/57fc6fbe-24cd-4185-a91e-dd39258e8d05-ceph\") on node \"crc\" DevicePath \"\"" Oct 11 01:52:36 crc kubenswrapper[4743]: I1011 01:52:36.293555 4743 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57fc6fbe-24cd-4185-a91e-dd39258e8d05-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 01:52:36 crc kubenswrapper[4743]: I1011 01:52:36.293564 4743 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/57fc6fbe-24cd-4185-a91e-dd39258e8d05-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 11 01:52:36 crc kubenswrapper[4743]: I1011 01:52:36.293573 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/57fc6fbe-24cd-4185-a91e-dd39258e8d05-inventory\") on node \"crc\" DevicePath \"\"" Oct 11 01:52:36 crc kubenswrapper[4743]: I1011 01:52:36.293583 
4743 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57fc6fbe-24cd-4185-a91e-dd39258e8d05-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 01:52:36 crc kubenswrapper[4743]: I1011 01:52:36.293593 4743 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57fc6fbe-24cd-4185-a91e-dd39258e8d05-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 01:52:36 crc kubenswrapper[4743]: I1011 01:52:36.293601 4743 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/57fc6fbe-24cd-4185-a91e-dd39258e8d05-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 11 01:52:36 crc kubenswrapper[4743]: I1011 01:52:36.293610 4743 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57fc6fbe-24cd-4185-a91e-dd39258e8d05-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 01:52:36 crc kubenswrapper[4743]: I1011 01:52:36.590973 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9" event={"ID":"57fc6fbe-24cd-4185-a91e-dd39258e8d05","Type":"ContainerDied","Data":"66ac8a9c3c676555db72d5ea4f3dfe1722b8bdcdde2e1ecdb6e614d4f0720d0f"} Oct 11 01:52:36 crc kubenswrapper[4743]: I1011 01:52:36.591020 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66ac8a9c3c676555db72d5ea4f3dfe1722b8bdcdde2e1ecdb6e614d4f0720d0f" Oct 11 01:52:36 crc kubenswrapper[4743]: I1011 01:52:36.591093 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9" Oct 11 01:52:36 crc kubenswrapper[4743]: I1011 01:52:36.714215 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-vl6cd"] Oct 11 01:52:36 crc kubenswrapper[4743]: E1011 01:52:36.714590 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57fc6fbe-24cd-4185-a91e-dd39258e8d05" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 11 01:52:36 crc kubenswrapper[4743]: I1011 01:52:36.714609 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="57fc6fbe-24cd-4185-a91e-dd39258e8d05" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 11 01:52:36 crc kubenswrapper[4743]: I1011 01:52:36.714824 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="57fc6fbe-24cd-4185-a91e-dd39258e8d05" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 11 01:52:36 crc kubenswrapper[4743]: I1011 01:52:36.715578 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-vl6cd" Oct 11 01:52:36 crc kubenswrapper[4743]: I1011 01:52:36.717681 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 11 01:52:36 crc kubenswrapper[4743]: I1011 01:52:36.717688 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 11 01:52:36 crc kubenswrapper[4743]: I1011 01:52:36.719458 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8vmmn" Oct 11 01:52:36 crc kubenswrapper[4743]: I1011 01:52:36.720150 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 11 01:52:36 crc kubenswrapper[4743]: I1011 01:52:36.725599 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 11 01:52:36 crc kubenswrapper[4743]: I1011 01:52:36.735979 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-vl6cd"] Oct 11 01:52:36 crc kubenswrapper[4743]: I1011 01:52:36.804066 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2a7d527b-9f7c-40ec-8939-fbd2350a9ec3-ssh-key\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-vl6cd\" (UID: \"2a7d527b-9f7c-40ec-8939-fbd2350a9ec3\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-vl6cd" Oct 11 01:52:36 crc kubenswrapper[4743]: I1011 01:52:36.804415 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsldx\" (UniqueName: \"kubernetes.io/projected/2a7d527b-9f7c-40ec-8939-fbd2350a9ec3-kube-api-access-vsldx\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-vl6cd\" (UID: 
\"2a7d527b-9f7c-40ec-8939-fbd2350a9ec3\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-vl6cd" Oct 11 01:52:36 crc kubenswrapper[4743]: I1011 01:52:36.804659 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2a7d527b-9f7c-40ec-8939-fbd2350a9ec3-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-vl6cd\" (UID: \"2a7d527b-9f7c-40ec-8939-fbd2350a9ec3\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-vl6cd" Oct 11 01:52:36 crc kubenswrapper[4743]: I1011 01:52:36.804807 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2a7d527b-9f7c-40ec-8939-fbd2350a9ec3-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-vl6cd\" (UID: \"2a7d527b-9f7c-40ec-8939-fbd2350a9ec3\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-vl6cd" Oct 11 01:52:36 crc kubenswrapper[4743]: I1011 01:52:36.907325 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2a7d527b-9f7c-40ec-8939-fbd2350a9ec3-ssh-key\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-vl6cd\" (UID: \"2a7d527b-9f7c-40ec-8939-fbd2350a9ec3\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-vl6cd" Oct 11 01:52:36 crc kubenswrapper[4743]: I1011 01:52:36.907479 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsldx\" (UniqueName: \"kubernetes.io/projected/2a7d527b-9f7c-40ec-8939-fbd2350a9ec3-kube-api-access-vsldx\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-vl6cd\" (UID: \"2a7d527b-9f7c-40ec-8939-fbd2350a9ec3\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-vl6cd" Oct 11 01:52:36 crc kubenswrapper[4743]: I1011 01:52:36.907521 4743 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2a7d527b-9f7c-40ec-8939-fbd2350a9ec3-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-vl6cd\" (UID: \"2a7d527b-9f7c-40ec-8939-fbd2350a9ec3\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-vl6cd" Oct 11 01:52:36 crc kubenswrapper[4743]: I1011 01:52:36.907597 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2a7d527b-9f7c-40ec-8939-fbd2350a9ec3-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-vl6cd\" (UID: \"2a7d527b-9f7c-40ec-8939-fbd2350a9ec3\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-vl6cd" Oct 11 01:52:36 crc kubenswrapper[4743]: I1011 01:52:36.913032 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2a7d527b-9f7c-40ec-8939-fbd2350a9ec3-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-vl6cd\" (UID: \"2a7d527b-9f7c-40ec-8939-fbd2350a9ec3\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-vl6cd" Oct 11 01:52:36 crc kubenswrapper[4743]: I1011 01:52:36.914063 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2a7d527b-9f7c-40ec-8939-fbd2350a9ec3-ssh-key\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-vl6cd\" (UID: \"2a7d527b-9f7c-40ec-8939-fbd2350a9ec3\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-vl6cd" Oct 11 01:52:36 crc kubenswrapper[4743]: I1011 01:52:36.914428 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2a7d527b-9f7c-40ec-8939-fbd2350a9ec3-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-vl6cd\" (UID: \"2a7d527b-9f7c-40ec-8939-fbd2350a9ec3\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-vl6cd" Oct 11 01:52:36 crc kubenswrapper[4743]: 
I1011 01:52:36.926644 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsldx\" (UniqueName: \"kubernetes.io/projected/2a7d527b-9f7c-40ec-8939-fbd2350a9ec3-kube-api-access-vsldx\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-vl6cd\" (UID: \"2a7d527b-9f7c-40ec-8939-fbd2350a9ec3\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-vl6cd" Oct 11 01:52:37 crc kubenswrapper[4743]: I1011 01:52:37.032002 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-vl6cd" Oct 11 01:52:37 crc kubenswrapper[4743]: I1011 01:52:37.602491 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-vl6cd"] Oct 11 01:52:38 crc kubenswrapper[4743]: I1011 01:52:38.624776 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-vl6cd" event={"ID":"2a7d527b-9f7c-40ec-8939-fbd2350a9ec3","Type":"ContainerStarted","Data":"3e47b2176116571496137e5dafe5d90fbd01f20a8dde1b390935840fb89b29b0"} Oct 11 01:52:38 crc kubenswrapper[4743]: I1011 01:52:38.625386 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-vl6cd" event={"ID":"2a7d527b-9f7c-40ec-8939-fbd2350a9ec3","Type":"ContainerStarted","Data":"a1e47bd6cc871700c5d3309580b18db4d9ee9464c071df68fd21a370a9982efc"} Oct 11 01:52:38 crc kubenswrapper[4743]: I1011 01:52:38.655482 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-vl6cd" podStartSLOduration=2.090118413 podStartE2EDuration="2.65546334s" podCreationTimestamp="2025-10-11 01:52:36 +0000 UTC" firstStartedPulling="2025-10-11 01:52:37.614898138 +0000 UTC m=+3652.267878545" lastFinishedPulling="2025-10-11 01:52:38.180243055 +0000 UTC m=+3652.833223472" observedRunningTime="2025-10-11 
01:52:38.643347846 +0000 UTC m=+3653.296328243" watchObservedRunningTime="2025-10-11 01:52:38.65546334 +0000 UTC m=+3653.308443737" Oct 11 01:52:46 crc kubenswrapper[4743]: I1011 01:52:46.713096 4743 generic.go:334] "Generic (PLEG): container finished" podID="2a7d527b-9f7c-40ec-8939-fbd2350a9ec3" containerID="3e47b2176116571496137e5dafe5d90fbd01f20a8dde1b390935840fb89b29b0" exitCode=0 Oct 11 01:52:46 crc kubenswrapper[4743]: I1011 01:52:46.713168 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-vl6cd" event={"ID":"2a7d527b-9f7c-40ec-8939-fbd2350a9ec3","Type":"ContainerDied","Data":"3e47b2176116571496137e5dafe5d90fbd01f20a8dde1b390935840fb89b29b0"} Oct 11 01:52:47 crc kubenswrapper[4743]: I1011 01:52:47.091760 4743 scope.go:117] "RemoveContainer" containerID="3f38cba11851c95fa12b6d21a7746bd10b6e69b2d33debe34b378719ccfc7393" Oct 11 01:52:47 crc kubenswrapper[4743]: E1011 01:52:47.092078 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:52:48 crc kubenswrapper[4743]: I1011 01:52:48.210350 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-vl6cd" Oct 11 01:52:48 crc kubenswrapper[4743]: I1011 01:52:48.279942 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2a7d527b-9f7c-40ec-8939-fbd2350a9ec3-inventory\") pod \"2a7d527b-9f7c-40ec-8939-fbd2350a9ec3\" (UID: \"2a7d527b-9f7c-40ec-8939-fbd2350a9ec3\") " Oct 11 01:52:48 crc kubenswrapper[4743]: I1011 01:52:48.280065 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vsldx\" (UniqueName: \"kubernetes.io/projected/2a7d527b-9f7c-40ec-8939-fbd2350a9ec3-kube-api-access-vsldx\") pod \"2a7d527b-9f7c-40ec-8939-fbd2350a9ec3\" (UID: \"2a7d527b-9f7c-40ec-8939-fbd2350a9ec3\") " Oct 11 01:52:48 crc kubenswrapper[4743]: I1011 01:52:48.280139 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2a7d527b-9f7c-40ec-8939-fbd2350a9ec3-ceph\") pod \"2a7d527b-9f7c-40ec-8939-fbd2350a9ec3\" (UID: \"2a7d527b-9f7c-40ec-8939-fbd2350a9ec3\") " Oct 11 01:52:48 crc kubenswrapper[4743]: I1011 01:52:48.280296 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2a7d527b-9f7c-40ec-8939-fbd2350a9ec3-ssh-key\") pod \"2a7d527b-9f7c-40ec-8939-fbd2350a9ec3\" (UID: \"2a7d527b-9f7c-40ec-8939-fbd2350a9ec3\") " Oct 11 01:52:48 crc kubenswrapper[4743]: I1011 01:52:48.286401 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a7d527b-9f7c-40ec-8939-fbd2350a9ec3-ceph" (OuterVolumeSpecName: "ceph") pod "2a7d527b-9f7c-40ec-8939-fbd2350a9ec3" (UID: "2a7d527b-9f7c-40ec-8939-fbd2350a9ec3"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:52:48 crc kubenswrapper[4743]: I1011 01:52:48.287253 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a7d527b-9f7c-40ec-8939-fbd2350a9ec3-kube-api-access-vsldx" (OuterVolumeSpecName: "kube-api-access-vsldx") pod "2a7d527b-9f7c-40ec-8939-fbd2350a9ec3" (UID: "2a7d527b-9f7c-40ec-8939-fbd2350a9ec3"). InnerVolumeSpecName "kube-api-access-vsldx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:52:48 crc kubenswrapper[4743]: I1011 01:52:48.325253 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a7d527b-9f7c-40ec-8939-fbd2350a9ec3-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2a7d527b-9f7c-40ec-8939-fbd2350a9ec3" (UID: "2a7d527b-9f7c-40ec-8939-fbd2350a9ec3"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:52:48 crc kubenswrapper[4743]: I1011 01:52:48.330503 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a7d527b-9f7c-40ec-8939-fbd2350a9ec3-inventory" (OuterVolumeSpecName: "inventory") pod "2a7d527b-9f7c-40ec-8939-fbd2350a9ec3" (UID: "2a7d527b-9f7c-40ec-8939-fbd2350a9ec3"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:52:48 crc kubenswrapper[4743]: I1011 01:52:48.382843 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2a7d527b-9f7c-40ec-8939-fbd2350a9ec3-inventory\") on node \"crc\" DevicePath \"\"" Oct 11 01:52:48 crc kubenswrapper[4743]: I1011 01:52:48.382885 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vsldx\" (UniqueName: \"kubernetes.io/projected/2a7d527b-9f7c-40ec-8939-fbd2350a9ec3-kube-api-access-vsldx\") on node \"crc\" DevicePath \"\"" Oct 11 01:52:48 crc kubenswrapper[4743]: I1011 01:52:48.382896 4743 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2a7d527b-9f7c-40ec-8939-fbd2350a9ec3-ceph\") on node \"crc\" DevicePath \"\"" Oct 11 01:52:48 crc kubenswrapper[4743]: I1011 01:52:48.382904 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2a7d527b-9f7c-40ec-8939-fbd2350a9ec3-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 11 01:52:48 crc kubenswrapper[4743]: I1011 01:52:48.739936 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-vl6cd" event={"ID":"2a7d527b-9f7c-40ec-8939-fbd2350a9ec3","Type":"ContainerDied","Data":"a1e47bd6cc871700c5d3309580b18db4d9ee9464c071df68fd21a370a9982efc"} Oct 11 01:52:48 crc kubenswrapper[4743]: I1011 01:52:48.739990 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a1e47bd6cc871700c5d3309580b18db4d9ee9464c071df68fd21a370a9982efc" Oct 11 01:52:48 crc kubenswrapper[4743]: I1011 01:52:48.740073 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-vl6cd" Oct 11 01:52:48 crc kubenswrapper[4743]: I1011 01:52:48.868150 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-w2vs8"] Oct 11 01:52:48 crc kubenswrapper[4743]: E1011 01:52:48.868813 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a7d527b-9f7c-40ec-8939-fbd2350a9ec3" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Oct 11 01:52:48 crc kubenswrapper[4743]: I1011 01:52:48.868912 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a7d527b-9f7c-40ec-8939-fbd2350a9ec3" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Oct 11 01:52:48 crc kubenswrapper[4743]: I1011 01:52:48.869184 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a7d527b-9f7c-40ec-8939-fbd2350a9ec3" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Oct 11 01:52:48 crc kubenswrapper[4743]: I1011 01:52:48.869984 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-w2vs8" Oct 11 01:52:48 crc kubenswrapper[4743]: I1011 01:52:48.872407 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8vmmn" Oct 11 01:52:48 crc kubenswrapper[4743]: I1011 01:52:48.872994 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 11 01:52:48 crc kubenswrapper[4743]: I1011 01:52:48.873242 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 11 01:52:48 crc kubenswrapper[4743]: I1011 01:52:48.873515 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Oct 11 01:52:48 crc kubenswrapper[4743]: I1011 01:52:48.873792 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 11 01:52:48 crc kubenswrapper[4743]: I1011 01:52:48.882603 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-w2vs8"] Oct 11 01:52:48 crc kubenswrapper[4743]: I1011 01:52:48.885430 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 11 01:52:48 crc kubenswrapper[4743]: I1011 01:52:48.994378 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00ade740-f798-4354-9e89-35aa325d8b92-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-w2vs8\" (UID: \"00ade740-f798-4354-9e89-35aa325d8b92\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-w2vs8" Oct 11 01:52:48 crc kubenswrapper[4743]: I1011 01:52:48.994437 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: 
\"kubernetes.io/configmap/00ade740-f798-4354-9e89-35aa325d8b92-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-w2vs8\" (UID: \"00ade740-f798-4354-9e89-35aa325d8b92\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-w2vs8" Oct 11 01:52:48 crc kubenswrapper[4743]: I1011 01:52:48.994531 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/00ade740-f798-4354-9e89-35aa325d8b92-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-w2vs8\" (UID: \"00ade740-f798-4354-9e89-35aa325d8b92\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-w2vs8" Oct 11 01:52:48 crc kubenswrapper[4743]: I1011 01:52:48.994585 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/00ade740-f798-4354-9e89-35aa325d8b92-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-w2vs8\" (UID: \"00ade740-f798-4354-9e89-35aa325d8b92\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-w2vs8" Oct 11 01:52:48 crc kubenswrapper[4743]: I1011 01:52:48.994617 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2knwn\" (UniqueName: \"kubernetes.io/projected/00ade740-f798-4354-9e89-35aa325d8b92-kube-api-access-2knwn\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-w2vs8\" (UID: \"00ade740-f798-4354-9e89-35aa325d8b92\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-w2vs8" Oct 11 01:52:48 crc kubenswrapper[4743]: I1011 01:52:48.994643 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/00ade740-f798-4354-9e89-35aa325d8b92-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-w2vs8\" (UID: \"00ade740-f798-4354-9e89-35aa325d8b92\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-w2vs8" Oct 11 01:52:49 
crc kubenswrapper[4743]: I1011 01:52:49.096045 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00ade740-f798-4354-9e89-35aa325d8b92-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-w2vs8\" (UID: \"00ade740-f798-4354-9e89-35aa325d8b92\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-w2vs8" Oct 11 01:52:49 crc kubenswrapper[4743]: I1011 01:52:49.096095 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/00ade740-f798-4354-9e89-35aa325d8b92-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-w2vs8\" (UID: \"00ade740-f798-4354-9e89-35aa325d8b92\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-w2vs8" Oct 11 01:52:49 crc kubenswrapper[4743]: I1011 01:52:49.096168 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/00ade740-f798-4354-9e89-35aa325d8b92-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-w2vs8\" (UID: \"00ade740-f798-4354-9e89-35aa325d8b92\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-w2vs8" Oct 11 01:52:49 crc kubenswrapper[4743]: I1011 01:52:49.096216 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/00ade740-f798-4354-9e89-35aa325d8b92-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-w2vs8\" (UID: \"00ade740-f798-4354-9e89-35aa325d8b92\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-w2vs8" Oct 11 01:52:49 crc kubenswrapper[4743]: I1011 01:52:49.096238 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2knwn\" (UniqueName: \"kubernetes.io/projected/00ade740-f798-4354-9e89-35aa325d8b92-kube-api-access-2knwn\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-w2vs8\" 
(UID: \"00ade740-f798-4354-9e89-35aa325d8b92\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-w2vs8" Oct 11 01:52:49 crc kubenswrapper[4743]: I1011 01:52:49.096257 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/00ade740-f798-4354-9e89-35aa325d8b92-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-w2vs8\" (UID: \"00ade740-f798-4354-9e89-35aa325d8b92\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-w2vs8" Oct 11 01:52:49 crc kubenswrapper[4743]: I1011 01:52:49.097485 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/00ade740-f798-4354-9e89-35aa325d8b92-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-w2vs8\" (UID: \"00ade740-f798-4354-9e89-35aa325d8b92\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-w2vs8" Oct 11 01:52:49 crc kubenswrapper[4743]: I1011 01:52:49.099367 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/00ade740-f798-4354-9e89-35aa325d8b92-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-w2vs8\" (UID: \"00ade740-f798-4354-9e89-35aa325d8b92\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-w2vs8" Oct 11 01:52:49 crc kubenswrapper[4743]: I1011 01:52:49.099392 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00ade740-f798-4354-9e89-35aa325d8b92-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-w2vs8\" (UID: \"00ade740-f798-4354-9e89-35aa325d8b92\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-w2vs8" Oct 11 01:52:49 crc kubenswrapper[4743]: I1011 01:52:49.100597 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/00ade740-f798-4354-9e89-35aa325d8b92-inventory\") 
pod \"ovn-edpm-deployment-openstack-edpm-ipam-w2vs8\" (UID: \"00ade740-f798-4354-9e89-35aa325d8b92\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-w2vs8" Oct 11 01:52:49 crc kubenswrapper[4743]: I1011 01:52:49.103000 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/00ade740-f798-4354-9e89-35aa325d8b92-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-w2vs8\" (UID: \"00ade740-f798-4354-9e89-35aa325d8b92\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-w2vs8" Oct 11 01:52:49 crc kubenswrapper[4743]: I1011 01:52:49.116467 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2knwn\" (UniqueName: \"kubernetes.io/projected/00ade740-f798-4354-9e89-35aa325d8b92-kube-api-access-2knwn\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-w2vs8\" (UID: \"00ade740-f798-4354-9e89-35aa325d8b92\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-w2vs8" Oct 11 01:52:49 crc kubenswrapper[4743]: I1011 01:52:49.185626 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-w2vs8" Oct 11 01:52:49 crc kubenswrapper[4743]: I1011 01:52:49.756928 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-w2vs8"] Oct 11 01:52:49 crc kubenswrapper[4743]: W1011 01:52:49.760891 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod00ade740_f798_4354_9e89_35aa325d8b92.slice/crio-ce7088c23fbb06b788d1016b352fa190d2c03b86a8cffe2054947ff62742df54 WatchSource:0}: Error finding container ce7088c23fbb06b788d1016b352fa190d2c03b86a8cffe2054947ff62742df54: Status 404 returned error can't find the container with id ce7088c23fbb06b788d1016b352fa190d2c03b86a8cffe2054947ff62742df54 Oct 11 01:52:50 crc kubenswrapper[4743]: I1011 01:52:50.766563 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-w2vs8" event={"ID":"00ade740-f798-4354-9e89-35aa325d8b92","Type":"ContainerStarted","Data":"a85ebf4a04ea39ec79ae73f1e05d959e0c259f2681f7e11eac621504749ffe1f"} Oct 11 01:52:50 crc kubenswrapper[4743]: I1011 01:52:50.766874 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-w2vs8" event={"ID":"00ade740-f798-4354-9e89-35aa325d8b92","Type":"ContainerStarted","Data":"ce7088c23fbb06b788d1016b352fa190d2c03b86a8cffe2054947ff62742df54"} Oct 11 01:52:50 crc kubenswrapper[4743]: I1011 01:52:50.806335 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-w2vs8" podStartSLOduration=2.364751655 podStartE2EDuration="2.806310364s" podCreationTimestamp="2025-10-11 01:52:48 +0000 UTC" firstStartedPulling="2025-10-11 01:52:49.763992739 +0000 UTC m=+3664.416973136" lastFinishedPulling="2025-10-11 01:52:50.205551458 +0000 UTC m=+3664.858531845" observedRunningTime="2025-10-11 
01:52:50.786275928 +0000 UTC m=+3665.439256335" watchObservedRunningTime="2025-10-11 01:52:50.806310364 +0000 UTC m=+3665.459290801" Oct 11 01:53:01 crc kubenswrapper[4743]: I1011 01:53:01.091721 4743 scope.go:117] "RemoveContainer" containerID="3f38cba11851c95fa12b6d21a7746bd10b6e69b2d33debe34b378719ccfc7393" Oct 11 01:53:01 crc kubenswrapper[4743]: E1011 01:53:01.092668 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:53:10 crc kubenswrapper[4743]: I1011 01:53:10.298746 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zftxg"] Oct 11 01:53:10 crc kubenswrapper[4743]: I1011 01:53:10.302513 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zftxg" Oct 11 01:53:10 crc kubenswrapper[4743]: I1011 01:53:10.316626 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zftxg"] Oct 11 01:53:10 crc kubenswrapper[4743]: I1011 01:53:10.426000 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6d37765-79ab-452b-a32a-26e497569aa6-utilities\") pod \"redhat-marketplace-zftxg\" (UID: \"c6d37765-79ab-452b-a32a-26e497569aa6\") " pod="openshift-marketplace/redhat-marketplace-zftxg" Oct 11 01:53:10 crc kubenswrapper[4743]: I1011 01:53:10.426517 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6d37765-79ab-452b-a32a-26e497569aa6-catalog-content\") pod \"redhat-marketplace-zftxg\" (UID: \"c6d37765-79ab-452b-a32a-26e497569aa6\") " pod="openshift-marketplace/redhat-marketplace-zftxg" Oct 11 01:53:10 crc kubenswrapper[4743]: I1011 01:53:10.426611 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvz65\" (UniqueName: \"kubernetes.io/projected/c6d37765-79ab-452b-a32a-26e497569aa6-kube-api-access-pvz65\") pod \"redhat-marketplace-zftxg\" (UID: \"c6d37765-79ab-452b-a32a-26e497569aa6\") " pod="openshift-marketplace/redhat-marketplace-zftxg" Oct 11 01:53:10 crc kubenswrapper[4743]: I1011 01:53:10.528431 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6d37765-79ab-452b-a32a-26e497569aa6-utilities\") pod \"redhat-marketplace-zftxg\" (UID: \"c6d37765-79ab-452b-a32a-26e497569aa6\") " pod="openshift-marketplace/redhat-marketplace-zftxg" Oct 11 01:53:10 crc kubenswrapper[4743]: I1011 01:53:10.528492 4743 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6d37765-79ab-452b-a32a-26e497569aa6-catalog-content\") pod \"redhat-marketplace-zftxg\" (UID: \"c6d37765-79ab-452b-a32a-26e497569aa6\") " pod="openshift-marketplace/redhat-marketplace-zftxg" Oct 11 01:53:10 crc kubenswrapper[4743]: I1011 01:53:10.528567 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvz65\" (UniqueName: \"kubernetes.io/projected/c6d37765-79ab-452b-a32a-26e497569aa6-kube-api-access-pvz65\") pod \"redhat-marketplace-zftxg\" (UID: \"c6d37765-79ab-452b-a32a-26e497569aa6\") " pod="openshift-marketplace/redhat-marketplace-zftxg" Oct 11 01:53:10 crc kubenswrapper[4743]: I1011 01:53:10.529429 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6d37765-79ab-452b-a32a-26e497569aa6-utilities\") pod \"redhat-marketplace-zftxg\" (UID: \"c6d37765-79ab-452b-a32a-26e497569aa6\") " pod="openshift-marketplace/redhat-marketplace-zftxg" Oct 11 01:53:10 crc kubenswrapper[4743]: I1011 01:53:10.529645 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6d37765-79ab-452b-a32a-26e497569aa6-catalog-content\") pod \"redhat-marketplace-zftxg\" (UID: \"c6d37765-79ab-452b-a32a-26e497569aa6\") " pod="openshift-marketplace/redhat-marketplace-zftxg" Oct 11 01:53:10 crc kubenswrapper[4743]: I1011 01:53:10.552478 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvz65\" (UniqueName: \"kubernetes.io/projected/c6d37765-79ab-452b-a32a-26e497569aa6-kube-api-access-pvz65\") pod \"redhat-marketplace-zftxg\" (UID: \"c6d37765-79ab-452b-a32a-26e497569aa6\") " pod="openshift-marketplace/redhat-marketplace-zftxg" Oct 11 01:53:10 crc kubenswrapper[4743]: I1011 01:53:10.637696 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zftxg" Oct 11 01:53:11 crc kubenswrapper[4743]: I1011 01:53:11.132049 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zftxg"] Oct 11 01:53:11 crc kubenswrapper[4743]: W1011 01:53:11.147091 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6d37765_79ab_452b_a32a_26e497569aa6.slice/crio-06df9b90c5589f7c3fa8aa7b3469ef32acd0bebd50a7c6052887d0e09d350d5d WatchSource:0}: Error finding container 06df9b90c5589f7c3fa8aa7b3469ef32acd0bebd50a7c6052887d0e09d350d5d: Status 404 returned error can't find the container with id 06df9b90c5589f7c3fa8aa7b3469ef32acd0bebd50a7c6052887d0e09d350d5d Oct 11 01:53:12 crc kubenswrapper[4743]: I1011 01:53:12.014538 4743 generic.go:334] "Generic (PLEG): container finished" podID="c6d37765-79ab-452b-a32a-26e497569aa6" containerID="db08ce3095dc3a90b36b74fe0086275e91f70736be7d7043f6281a9984824dc2" exitCode=0 Oct 11 01:53:12 crc kubenswrapper[4743]: I1011 01:53:12.014548 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zftxg" event={"ID":"c6d37765-79ab-452b-a32a-26e497569aa6","Type":"ContainerDied","Data":"db08ce3095dc3a90b36b74fe0086275e91f70736be7d7043f6281a9984824dc2"} Oct 11 01:53:12 crc kubenswrapper[4743]: I1011 01:53:12.015121 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zftxg" event={"ID":"c6d37765-79ab-452b-a32a-26e497569aa6","Type":"ContainerStarted","Data":"06df9b90c5589f7c3fa8aa7b3469ef32acd0bebd50a7c6052887d0e09d350d5d"} Oct 11 01:53:13 crc kubenswrapper[4743]: I1011 01:53:13.029844 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zftxg" 
event={"ID":"c6d37765-79ab-452b-a32a-26e497569aa6","Type":"ContainerStarted","Data":"1f8ea28b2cd758f9dafd845309c1399e7a591eb2adf089d953c7cd5a472bcfd5"}
Oct 11 01:53:14 crc kubenswrapper[4743]: I1011 01:53:14.052678 4743 generic.go:334] "Generic (PLEG): container finished" podID="c6d37765-79ab-452b-a32a-26e497569aa6" containerID="1f8ea28b2cd758f9dafd845309c1399e7a591eb2adf089d953c7cd5a472bcfd5" exitCode=0
Oct 11 01:53:14 crc kubenswrapper[4743]: I1011 01:53:14.052809 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zftxg" event={"ID":"c6d37765-79ab-452b-a32a-26e497569aa6","Type":"ContainerDied","Data":"1f8ea28b2cd758f9dafd845309c1399e7a591eb2adf089d953c7cd5a472bcfd5"}
Oct 11 01:53:15 crc kubenswrapper[4743]: I1011 01:53:15.071367 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zftxg" event={"ID":"c6d37765-79ab-452b-a32a-26e497569aa6","Type":"ContainerStarted","Data":"b12116d577cdcdbb2499f638a25df7b2a09382027a3fcae2db5d8bf26e355a23"}
Oct 11 01:53:15 crc kubenswrapper[4743]: I1011 01:53:15.094635 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zftxg" podStartSLOduration=2.625704956 podStartE2EDuration="5.094612215s" podCreationTimestamp="2025-10-11 01:53:10 +0000 UTC" firstStartedPulling="2025-10-11 01:53:12.01692683 +0000 UTC m=+3686.669907227" lastFinishedPulling="2025-10-11 01:53:14.485834069 +0000 UTC m=+3689.138814486" observedRunningTime="2025-10-11 01:53:15.089370255 +0000 UTC m=+3689.742350672" watchObservedRunningTime="2025-10-11 01:53:15.094612215 +0000 UTC m=+3689.747592612"
Oct 11 01:53:16 crc kubenswrapper[4743]: I1011 01:53:16.101073 4743 scope.go:117] "RemoveContainer" containerID="3f38cba11851c95fa12b6d21a7746bd10b6e69b2d33debe34b378719ccfc7393"
Oct 11 01:53:16 crc kubenswrapper[4743]: E1011 01:53:16.101701 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f"
Oct 11 01:53:20 crc kubenswrapper[4743]: I1011 01:53:20.637803 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zftxg"
Oct 11 01:53:20 crc kubenswrapper[4743]: I1011 01:53:20.638409 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zftxg"
Oct 11 01:53:20 crc kubenswrapper[4743]: I1011 01:53:20.692201 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zftxg"
Oct 11 01:53:21 crc kubenswrapper[4743]: I1011 01:53:21.225018 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zftxg"
Oct 11 01:53:21 crc kubenswrapper[4743]: I1011 01:53:21.283931 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zftxg"]
Oct 11 01:53:23 crc kubenswrapper[4743]: I1011 01:53:23.160900 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-zftxg" podUID="c6d37765-79ab-452b-a32a-26e497569aa6" containerName="registry-server" containerID="cri-o://b12116d577cdcdbb2499f638a25df7b2a09382027a3fcae2db5d8bf26e355a23" gracePeriod=2
Oct 11 01:53:23 crc kubenswrapper[4743]: I1011 01:53:23.710594 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zftxg"
Oct 11 01:53:23 crc kubenswrapper[4743]: I1011 01:53:23.828331 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6d37765-79ab-452b-a32a-26e497569aa6-catalog-content\") pod \"c6d37765-79ab-452b-a32a-26e497569aa6\" (UID: \"c6d37765-79ab-452b-a32a-26e497569aa6\") "
Oct 11 01:53:23 crc kubenswrapper[4743]: I1011 01:53:23.828508 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pvz65\" (UniqueName: \"kubernetes.io/projected/c6d37765-79ab-452b-a32a-26e497569aa6-kube-api-access-pvz65\") pod \"c6d37765-79ab-452b-a32a-26e497569aa6\" (UID: \"c6d37765-79ab-452b-a32a-26e497569aa6\") "
Oct 11 01:53:23 crc kubenswrapper[4743]: I1011 01:53:23.828568 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6d37765-79ab-452b-a32a-26e497569aa6-utilities\") pod \"c6d37765-79ab-452b-a32a-26e497569aa6\" (UID: \"c6d37765-79ab-452b-a32a-26e497569aa6\") "
Oct 11 01:53:23 crc kubenswrapper[4743]: I1011 01:53:23.829413 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6d37765-79ab-452b-a32a-26e497569aa6-utilities" (OuterVolumeSpecName: "utilities") pod "c6d37765-79ab-452b-a32a-26e497569aa6" (UID: "c6d37765-79ab-452b-a32a-26e497569aa6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 11 01:53:23 crc kubenswrapper[4743]: I1011 01:53:23.834200 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6d37765-79ab-452b-a32a-26e497569aa6-kube-api-access-pvz65" (OuterVolumeSpecName: "kube-api-access-pvz65") pod "c6d37765-79ab-452b-a32a-26e497569aa6" (UID: "c6d37765-79ab-452b-a32a-26e497569aa6"). InnerVolumeSpecName "kube-api-access-pvz65". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 11 01:53:23 crc kubenswrapper[4743]: I1011 01:53:23.846112 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6d37765-79ab-452b-a32a-26e497569aa6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c6d37765-79ab-452b-a32a-26e497569aa6" (UID: "c6d37765-79ab-452b-a32a-26e497569aa6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 11 01:53:23 crc kubenswrapper[4743]: I1011 01:53:23.931407 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6d37765-79ab-452b-a32a-26e497569aa6-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 11 01:53:23 crc kubenswrapper[4743]: I1011 01:53:23.931444 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pvz65\" (UniqueName: \"kubernetes.io/projected/c6d37765-79ab-452b-a32a-26e497569aa6-kube-api-access-pvz65\") on node \"crc\" DevicePath \"\""
Oct 11 01:53:23 crc kubenswrapper[4743]: I1011 01:53:23.931461 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6d37765-79ab-452b-a32a-26e497569aa6-utilities\") on node \"crc\" DevicePath \"\""
Oct 11 01:53:24 crc kubenswrapper[4743]: I1011 01:53:24.173666 4743 generic.go:334] "Generic (PLEG): container finished" podID="c6d37765-79ab-452b-a32a-26e497569aa6" containerID="b12116d577cdcdbb2499f638a25df7b2a09382027a3fcae2db5d8bf26e355a23" exitCode=0
Oct 11 01:53:24 crc kubenswrapper[4743]: I1011 01:53:24.173731 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zftxg" event={"ID":"c6d37765-79ab-452b-a32a-26e497569aa6","Type":"ContainerDied","Data":"b12116d577cdcdbb2499f638a25df7b2a09382027a3fcae2db5d8bf26e355a23"}
Oct 11 01:53:24 crc kubenswrapper[4743]: I1011 01:53:24.173774 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zftxg" event={"ID":"c6d37765-79ab-452b-a32a-26e497569aa6","Type":"ContainerDied","Data":"06df9b90c5589f7c3fa8aa7b3469ef32acd0bebd50a7c6052887d0e09d350d5d"}
Oct 11 01:53:24 crc kubenswrapper[4743]: I1011 01:53:24.173782 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zftxg"
Oct 11 01:53:24 crc kubenswrapper[4743]: I1011 01:53:24.173815 4743 scope.go:117] "RemoveContainer" containerID="b12116d577cdcdbb2499f638a25df7b2a09382027a3fcae2db5d8bf26e355a23"
Oct 11 01:53:24 crc kubenswrapper[4743]: I1011 01:53:24.208321 4743 scope.go:117] "RemoveContainer" containerID="1f8ea28b2cd758f9dafd845309c1399e7a591eb2adf089d953c7cd5a472bcfd5"
Oct 11 01:53:24 crc kubenswrapper[4743]: I1011 01:53:24.217237 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zftxg"]
Oct 11 01:53:24 crc kubenswrapper[4743]: I1011 01:53:24.232889 4743 scope.go:117] "RemoveContainer" containerID="db08ce3095dc3a90b36b74fe0086275e91f70736be7d7043f6281a9984824dc2"
Oct 11 01:53:24 crc kubenswrapper[4743]: I1011 01:53:24.241967 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-zftxg"]
Oct 11 01:53:24 crc kubenswrapper[4743]: I1011 01:53:24.314933 4743 scope.go:117] "RemoveContainer" containerID="b12116d577cdcdbb2499f638a25df7b2a09382027a3fcae2db5d8bf26e355a23"
Oct 11 01:53:24 crc kubenswrapper[4743]: E1011 01:53:24.315357 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b12116d577cdcdbb2499f638a25df7b2a09382027a3fcae2db5d8bf26e355a23\": container with ID starting with b12116d577cdcdbb2499f638a25df7b2a09382027a3fcae2db5d8bf26e355a23 not found: ID does not exist" containerID="b12116d577cdcdbb2499f638a25df7b2a09382027a3fcae2db5d8bf26e355a23"
Oct 11 01:53:24 crc kubenswrapper[4743]: I1011 01:53:24.315404 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b12116d577cdcdbb2499f638a25df7b2a09382027a3fcae2db5d8bf26e355a23"} err="failed to get container status \"b12116d577cdcdbb2499f638a25df7b2a09382027a3fcae2db5d8bf26e355a23\": rpc error: code = NotFound desc = could not find container \"b12116d577cdcdbb2499f638a25df7b2a09382027a3fcae2db5d8bf26e355a23\": container with ID starting with b12116d577cdcdbb2499f638a25df7b2a09382027a3fcae2db5d8bf26e355a23 not found: ID does not exist"
Oct 11 01:53:24 crc kubenswrapper[4743]: I1011 01:53:24.315438 4743 scope.go:117] "RemoveContainer" containerID="1f8ea28b2cd758f9dafd845309c1399e7a591eb2adf089d953c7cd5a472bcfd5"
Oct 11 01:53:24 crc kubenswrapper[4743]: E1011 01:53:24.315768 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f8ea28b2cd758f9dafd845309c1399e7a591eb2adf089d953c7cd5a472bcfd5\": container with ID starting with 1f8ea28b2cd758f9dafd845309c1399e7a591eb2adf089d953c7cd5a472bcfd5 not found: ID does not exist" containerID="1f8ea28b2cd758f9dafd845309c1399e7a591eb2adf089d953c7cd5a472bcfd5"
Oct 11 01:53:24 crc kubenswrapper[4743]: I1011 01:53:24.315808 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f8ea28b2cd758f9dafd845309c1399e7a591eb2adf089d953c7cd5a472bcfd5"} err="failed to get container status \"1f8ea28b2cd758f9dafd845309c1399e7a591eb2adf089d953c7cd5a472bcfd5\": rpc error: code = NotFound desc = could not find container \"1f8ea28b2cd758f9dafd845309c1399e7a591eb2adf089d953c7cd5a472bcfd5\": container with ID starting with 1f8ea28b2cd758f9dafd845309c1399e7a591eb2adf089d953c7cd5a472bcfd5 not found: ID does not exist"
Oct 11 01:53:24 crc kubenswrapper[4743]: I1011 01:53:24.315833 4743 scope.go:117] "RemoveContainer" containerID="db08ce3095dc3a90b36b74fe0086275e91f70736be7d7043f6281a9984824dc2"
Oct 11 01:53:24 crc kubenswrapper[4743]: E1011 01:53:24.316155 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db08ce3095dc3a90b36b74fe0086275e91f70736be7d7043f6281a9984824dc2\": container with ID starting with db08ce3095dc3a90b36b74fe0086275e91f70736be7d7043f6281a9984824dc2 not found: ID does not exist" containerID="db08ce3095dc3a90b36b74fe0086275e91f70736be7d7043f6281a9984824dc2"
Oct 11 01:53:24 crc kubenswrapper[4743]: I1011 01:53:24.316183 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db08ce3095dc3a90b36b74fe0086275e91f70736be7d7043f6281a9984824dc2"} err="failed to get container status \"db08ce3095dc3a90b36b74fe0086275e91f70736be7d7043f6281a9984824dc2\": rpc error: code = NotFound desc = could not find container \"db08ce3095dc3a90b36b74fe0086275e91f70736be7d7043f6281a9984824dc2\": container with ID starting with db08ce3095dc3a90b36b74fe0086275e91f70736be7d7043f6281a9984824dc2 not found: ID does not exist"
Oct 11 01:53:26 crc kubenswrapper[4743]: I1011 01:53:26.119467 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6d37765-79ab-452b-a32a-26e497569aa6" path="/var/lib/kubelet/pods/c6d37765-79ab-452b-a32a-26e497569aa6/volumes"
Oct 11 01:53:28 crc kubenswrapper[4743]: I1011 01:53:28.091384 4743 scope.go:117] "RemoveContainer" containerID="3f38cba11851c95fa12b6d21a7746bd10b6e69b2d33debe34b378719ccfc7393"
Oct 11 01:53:28 crc kubenswrapper[4743]: E1011 01:53:28.092000 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f"
Oct 11 01:53:41 crc kubenswrapper[4743]: I1011 01:53:41.092285 4743 scope.go:117] "RemoveContainer" containerID="3f38cba11851c95fa12b6d21a7746bd10b6e69b2d33debe34b378719ccfc7393"
Oct 11 01:53:41 crc kubenswrapper[4743]: E1011 01:53:41.093572 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f"
Oct 11 01:53:53 crc kubenswrapper[4743]: I1011 01:53:53.093167 4743 scope.go:117] "RemoveContainer" containerID="3f38cba11851c95fa12b6d21a7746bd10b6e69b2d33debe34b378719ccfc7393"
Oct 11 01:53:53 crc kubenswrapper[4743]: E1011 01:53:53.094162 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f"
Oct 11 01:54:04 crc kubenswrapper[4743]: I1011 01:54:04.091886 4743 scope.go:117] "RemoveContainer" containerID="3f38cba11851c95fa12b6d21a7746bd10b6e69b2d33debe34b378719ccfc7393"
Oct 11 01:54:04 crc kubenswrapper[4743]: E1011 01:54:04.092727 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f"
Oct 11 01:54:16 crc kubenswrapper[4743]: I1011 01:54:16.098382 4743 scope.go:117] "RemoveContainer" containerID="3f38cba11851c95fa12b6d21a7746bd10b6e69b2d33debe34b378719ccfc7393"
Oct 11 01:54:16 crc kubenswrapper[4743]: E1011 01:54:16.099222 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f"
Oct 11 01:54:20 crc kubenswrapper[4743]: I1011 01:54:20.855982 4743 generic.go:334] "Generic (PLEG): container finished" podID="00ade740-f798-4354-9e89-35aa325d8b92" containerID="a85ebf4a04ea39ec79ae73f1e05d959e0c259f2681f7e11eac621504749ffe1f" exitCode=0
Oct 11 01:54:20 crc kubenswrapper[4743]: I1011 01:54:20.856109 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-w2vs8" event={"ID":"00ade740-f798-4354-9e89-35aa325d8b92","Type":"ContainerDied","Data":"a85ebf4a04ea39ec79ae73f1e05d959e0c259f2681f7e11eac621504749ffe1f"}
Oct 11 01:54:22 crc kubenswrapper[4743]: I1011 01:54:22.423918 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-w2vs8"
Oct 11 01:54:22 crc kubenswrapper[4743]: I1011 01:54:22.504783 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/00ade740-f798-4354-9e89-35aa325d8b92-ovncontroller-config-0\") pod \"00ade740-f798-4354-9e89-35aa325d8b92\" (UID: \"00ade740-f798-4354-9e89-35aa325d8b92\") "
Oct 11 01:54:22 crc kubenswrapper[4743]: I1011 01:54:22.505033 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/00ade740-f798-4354-9e89-35aa325d8b92-inventory\") pod \"00ade740-f798-4354-9e89-35aa325d8b92\" (UID: \"00ade740-f798-4354-9e89-35aa325d8b92\") "
Oct 11 01:54:22 crc kubenswrapper[4743]: I1011 01:54:22.505089 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2knwn\" (UniqueName: \"kubernetes.io/projected/00ade740-f798-4354-9e89-35aa325d8b92-kube-api-access-2knwn\") pod \"00ade740-f798-4354-9e89-35aa325d8b92\" (UID: \"00ade740-f798-4354-9e89-35aa325d8b92\") "
Oct 11 01:54:22 crc kubenswrapper[4743]: I1011 01:54:22.505132 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00ade740-f798-4354-9e89-35aa325d8b92-ovn-combined-ca-bundle\") pod \"00ade740-f798-4354-9e89-35aa325d8b92\" (UID: \"00ade740-f798-4354-9e89-35aa325d8b92\") "
Oct 11 01:54:22 crc kubenswrapper[4743]: I1011 01:54:22.505173 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/00ade740-f798-4354-9e89-35aa325d8b92-ceph\") pod \"00ade740-f798-4354-9e89-35aa325d8b92\" (UID: \"00ade740-f798-4354-9e89-35aa325d8b92\") "
Oct 11 01:54:22 crc kubenswrapper[4743]: I1011 01:54:22.505251 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/00ade740-f798-4354-9e89-35aa325d8b92-ssh-key\") pod \"00ade740-f798-4354-9e89-35aa325d8b92\" (UID: \"00ade740-f798-4354-9e89-35aa325d8b92\") "
Oct 11 01:54:22 crc kubenswrapper[4743]: I1011 01:54:22.514002 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00ade740-f798-4354-9e89-35aa325d8b92-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "00ade740-f798-4354-9e89-35aa325d8b92" (UID: "00ade740-f798-4354-9e89-35aa325d8b92"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 11 01:54:22 crc kubenswrapper[4743]: I1011 01:54:22.515133 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00ade740-f798-4354-9e89-35aa325d8b92-kube-api-access-2knwn" (OuterVolumeSpecName: "kube-api-access-2knwn") pod "00ade740-f798-4354-9e89-35aa325d8b92" (UID: "00ade740-f798-4354-9e89-35aa325d8b92"). InnerVolumeSpecName "kube-api-access-2knwn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 11 01:54:22 crc kubenswrapper[4743]: I1011 01:54:22.521973 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00ade740-f798-4354-9e89-35aa325d8b92-ceph" (OuterVolumeSpecName: "ceph") pod "00ade740-f798-4354-9e89-35aa325d8b92" (UID: "00ade740-f798-4354-9e89-35aa325d8b92"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 11 01:54:22 crc kubenswrapper[4743]: I1011 01:54:22.564237 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00ade740-f798-4354-9e89-35aa325d8b92-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "00ade740-f798-4354-9e89-35aa325d8b92" (UID: "00ade740-f798-4354-9e89-35aa325d8b92"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 11 01:54:22 crc kubenswrapper[4743]: I1011 01:54:22.570058 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00ade740-f798-4354-9e89-35aa325d8b92-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "00ade740-f798-4354-9e89-35aa325d8b92" (UID: "00ade740-f798-4354-9e89-35aa325d8b92"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 11 01:54:22 crc kubenswrapper[4743]: I1011 01:54:22.609322 4743 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00ade740-f798-4354-9e89-35aa325d8b92-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 11 01:54:22 crc kubenswrapper[4743]: I1011 01:54:22.609357 4743 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/00ade740-f798-4354-9e89-35aa325d8b92-ceph\") on node \"crc\" DevicePath \"\""
Oct 11 01:54:22 crc kubenswrapper[4743]: I1011 01:54:22.609369 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/00ade740-f798-4354-9e89-35aa325d8b92-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 11 01:54:22 crc kubenswrapper[4743]: I1011 01:54:22.609377 4743 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/00ade740-f798-4354-9e89-35aa325d8b92-ovncontroller-config-0\") on node \"crc\" DevicePath \"\""
Oct 11 01:54:22 crc kubenswrapper[4743]: I1011 01:54:22.609387 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2knwn\" (UniqueName: \"kubernetes.io/projected/00ade740-f798-4354-9e89-35aa325d8b92-kube-api-access-2knwn\") on node \"crc\" DevicePath \"\""
Oct 11 01:54:22 crc kubenswrapper[4743]: I1011 01:54:22.622327 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00ade740-f798-4354-9e89-35aa325d8b92-inventory" (OuterVolumeSpecName: "inventory") pod "00ade740-f798-4354-9e89-35aa325d8b92" (UID: "00ade740-f798-4354-9e89-35aa325d8b92"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 11 01:54:22 crc kubenswrapper[4743]: I1011 01:54:22.711469 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/00ade740-f798-4354-9e89-35aa325d8b92-inventory\") on node \"crc\" DevicePath \"\""
Oct 11 01:54:22 crc kubenswrapper[4743]: I1011 01:54:22.877699 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-w2vs8" event={"ID":"00ade740-f798-4354-9e89-35aa325d8b92","Type":"ContainerDied","Data":"ce7088c23fbb06b788d1016b352fa190d2c03b86a8cffe2054947ff62742df54"}
Oct 11 01:54:22 crc kubenswrapper[4743]: I1011 01:54:22.877745 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce7088c23fbb06b788d1016b352fa190d2c03b86a8cffe2054947ff62742df54"
Oct 11 01:54:22 crc kubenswrapper[4743]: I1011 01:54:22.877807 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-w2vs8"
Oct 11 01:54:22 crc kubenswrapper[4743]: I1011 01:54:22.964286 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx96l"]
Oct 11 01:54:22 crc kubenswrapper[4743]: E1011 01:54:22.964848 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6d37765-79ab-452b-a32a-26e497569aa6" containerName="registry-server"
Oct 11 01:54:22 crc kubenswrapper[4743]: I1011 01:54:22.964890 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6d37765-79ab-452b-a32a-26e497569aa6" containerName="registry-server"
Oct 11 01:54:22 crc kubenswrapper[4743]: E1011 01:54:22.964927 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6d37765-79ab-452b-a32a-26e497569aa6" containerName="extract-utilities"
Oct 11 01:54:22 crc kubenswrapper[4743]: I1011 01:54:22.964936 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6d37765-79ab-452b-a32a-26e497569aa6" containerName="extract-utilities"
Oct 11 01:54:22 crc kubenswrapper[4743]: E1011 01:54:22.964946 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6d37765-79ab-452b-a32a-26e497569aa6" containerName="extract-content"
Oct 11 01:54:22 crc kubenswrapper[4743]: I1011 01:54:22.964954 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6d37765-79ab-452b-a32a-26e497569aa6" containerName="extract-content"
Oct 11 01:54:22 crc kubenswrapper[4743]: E1011 01:54:22.964971 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00ade740-f798-4354-9e89-35aa325d8b92" containerName="ovn-edpm-deployment-openstack-edpm-ipam"
Oct 11 01:54:22 crc kubenswrapper[4743]: I1011 01:54:22.964978 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="00ade740-f798-4354-9e89-35aa325d8b92" containerName="ovn-edpm-deployment-openstack-edpm-ipam"
Oct 11 01:54:22 crc kubenswrapper[4743]: I1011 01:54:22.965292 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="00ade740-f798-4354-9e89-35aa325d8b92" containerName="ovn-edpm-deployment-openstack-edpm-ipam"
Oct 11 01:54:22 crc kubenswrapper[4743]: I1011 01:54:22.965313 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6d37765-79ab-452b-a32a-26e497569aa6" containerName="registry-server"
Oct 11 01:54:22 crc kubenswrapper[4743]: I1011 01:54:22.966270 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx96l"
Oct 11 01:54:22 crc kubenswrapper[4743]: I1011 01:54:22.969215 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Oct 11 01:54:22 crc kubenswrapper[4743]: I1011 01:54:22.969627 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 11 01:54:22 crc kubenswrapper[4743]: I1011 01:54:22.969683 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files"
Oct 11 01:54:22 crc kubenswrapper[4743]: I1011 01:54:22.970094 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config"
Oct 11 01:54:22 crc kubenswrapper[4743]: I1011 01:54:22.970895 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Oct 11 01:54:22 crc kubenswrapper[4743]: I1011 01:54:22.970909 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config"
Oct 11 01:54:22 crc kubenswrapper[4743]: I1011 01:54:22.971330 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8vmmn"
Oct 11 01:54:22 crc kubenswrapper[4743]: I1011 01:54:22.975801 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx96l"]
Oct 11 01:54:23 crc kubenswrapper[4743]: I1011 01:54:23.117537 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d792f039-d865-44e3-9474-be444dee2d03-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx96l\" (UID: \"d792f039-d865-44e3-9474-be444dee2d03\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx96l"
Oct 11 01:54:23 crc kubenswrapper[4743]: I1011 01:54:23.117743 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84cdk\" (UniqueName: \"kubernetes.io/projected/d792f039-d865-44e3-9474-be444dee2d03-kube-api-access-84cdk\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx96l\" (UID: \"d792f039-d865-44e3-9474-be444dee2d03\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx96l"
Oct 11 01:54:23 crc kubenswrapper[4743]: I1011 01:54:23.117890 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d792f039-d865-44e3-9474-be444dee2d03-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx96l\" (UID: \"d792f039-d865-44e3-9474-be444dee2d03\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx96l"
Oct 11 01:54:23 crc kubenswrapper[4743]: I1011 01:54:23.118313 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d792f039-d865-44e3-9474-be444dee2d03-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx96l\" (UID: \"d792f039-d865-44e3-9474-be444dee2d03\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx96l"
Oct 11 01:54:23 crc kubenswrapper[4743]: I1011 01:54:23.118467 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d792f039-d865-44e3-9474-be444dee2d03-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx96l\" (UID: \"d792f039-d865-44e3-9474-be444dee2d03\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx96l"
Oct 11 01:54:23 crc kubenswrapper[4743]: I1011 01:54:23.118507 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d792f039-d865-44e3-9474-be444dee2d03-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx96l\" (UID: \"d792f039-d865-44e3-9474-be444dee2d03\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx96l"
Oct 11 01:54:23 crc kubenswrapper[4743]: I1011 01:54:23.118544 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d792f039-d865-44e3-9474-be444dee2d03-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx96l\" (UID: \"d792f039-d865-44e3-9474-be444dee2d03\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx96l"
Oct 11 01:54:23 crc kubenswrapper[4743]: I1011 01:54:23.221546 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d792f039-d865-44e3-9474-be444dee2d03-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx96l\" (UID: \"d792f039-d865-44e3-9474-be444dee2d03\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx96l"
Oct 11 01:54:23 crc kubenswrapper[4743]: I1011 01:54:23.221598 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d792f039-d865-44e3-9474-be444dee2d03-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx96l\" (UID: \"d792f039-d865-44e3-9474-be444dee2d03\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx96l"
Oct 11 01:54:23 crc kubenswrapper[4743]: I1011 01:54:23.221652 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d792f039-d865-44e3-9474-be444dee2d03-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx96l\" (UID: \"d792f039-d865-44e3-9474-be444dee2d03\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx96l"
Oct 11 01:54:23 crc kubenswrapper[4743]: I1011 01:54:23.221707 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d792f039-d865-44e3-9474-be444dee2d03-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx96l\" (UID: \"d792f039-d865-44e3-9474-be444dee2d03\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx96l"
Oct 11 01:54:23 crc kubenswrapper[4743]: I1011 01:54:23.221748 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84cdk\" (UniqueName: \"kubernetes.io/projected/d792f039-d865-44e3-9474-be444dee2d03-kube-api-access-84cdk\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx96l\" (UID: \"d792f039-d865-44e3-9474-be444dee2d03\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx96l"
Oct 11 01:54:23 crc kubenswrapper[4743]: I1011 01:54:23.221787 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d792f039-d865-44e3-9474-be444dee2d03-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx96l\" (UID: \"d792f039-d865-44e3-9474-be444dee2d03\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx96l"
Oct 11 01:54:23 crc kubenswrapper[4743]: I1011 01:54:23.221943 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d792f039-d865-44e3-9474-be444dee2d03-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx96l\" (UID: \"d792f039-d865-44e3-9474-be444dee2d03\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx96l"
Oct 11 01:54:23 crc kubenswrapper[4743]: I1011 01:54:23.226838 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d792f039-d865-44e3-9474-be444dee2d03-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx96l\" (UID: \"d792f039-d865-44e3-9474-be444dee2d03\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx96l"
Oct 11 01:54:23 crc kubenswrapper[4743]: I1011 01:54:23.226846 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d792f039-d865-44e3-9474-be444dee2d03-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx96l\" (UID: \"d792f039-d865-44e3-9474-be444dee2d03\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx96l"
Oct 11 01:54:23 crc kubenswrapper[4743]: I1011 01:54:23.227016 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d792f039-d865-44e3-9474-be444dee2d03-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx96l\" (UID: \"d792f039-d865-44e3-9474-be444dee2d03\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx96l"
Oct 11 01:54:23 crc kubenswrapper[4743]: I1011 01:54:23.227147 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d792f039-d865-44e3-9474-be444dee2d03-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx96l\" (UID: \"d792f039-d865-44e3-9474-be444dee2d03\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx96l"
Oct 11 01:54:23 crc kubenswrapper[4743]: I1011 01:54:23.227341 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d792f039-d865-44e3-9474-be444dee2d03-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx96l\" (UID: \"d792f039-d865-44e3-9474-be444dee2d03\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx96l"
Oct 11 01:54:23 crc kubenswrapper[4743]: I1011 01:54:23.227949 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d792f039-d865-44e3-9474-be444dee2d03-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx96l\" (UID: \"d792f039-d865-44e3-9474-be444dee2d03\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx96l"
Oct 11 01:54:23 crc kubenswrapper[4743]: I1011 01:54:23.242875 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84cdk\" (UniqueName: \"kubernetes.io/projected/d792f039-d865-44e3-9474-be444dee2d03-kube-api-access-84cdk\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx96l\" (UID: \"d792f039-d865-44e3-9474-be444dee2d03\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx96l"
Oct 11 01:54:23 crc kubenswrapper[4743]: I1011 01:54:23.284676 4743 util.go:30] "No sandbox for pod can
be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx96l" Oct 11 01:54:23 crc kubenswrapper[4743]: I1011 01:54:23.791014 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx96l"] Oct 11 01:54:23 crc kubenswrapper[4743]: I1011 01:54:23.887937 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx96l" event={"ID":"d792f039-d865-44e3-9474-be444dee2d03","Type":"ContainerStarted","Data":"23b501a4cc12a91cb83b31253eeb8e29db9703ddb8e34353c58d25e624d1f9f1"} Oct 11 01:54:24 crc kubenswrapper[4743]: I1011 01:54:24.900104 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx96l" event={"ID":"d792f039-d865-44e3-9474-be444dee2d03","Type":"ContainerStarted","Data":"9b60bbe44e2792de7c377f91077741eaaa55676b28121f334379fd2104fe3aa7"} Oct 11 01:54:24 crc kubenswrapper[4743]: I1011 01:54:24.924590 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx96l" podStartSLOduration=2.4428421670000002 podStartE2EDuration="2.924572475s" podCreationTimestamp="2025-10-11 01:54:22 +0000 UTC" firstStartedPulling="2025-10-11 01:54:23.784164376 +0000 UTC m=+3758.437144803" lastFinishedPulling="2025-10-11 01:54:24.265894674 +0000 UTC m=+3758.918875111" observedRunningTime="2025-10-11 01:54:24.919539641 +0000 UTC m=+3759.572520048" watchObservedRunningTime="2025-10-11 01:54:24.924572475 +0000 UTC m=+3759.577552872" Oct 11 01:54:28 crc kubenswrapper[4743]: I1011 01:54:28.094519 4743 scope.go:117] "RemoveContainer" containerID="3f38cba11851c95fa12b6d21a7746bd10b6e69b2d33debe34b378719ccfc7393" Oct 11 01:54:28 crc kubenswrapper[4743]: E1011 01:54:28.095122 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:54:41 crc kubenswrapper[4743]: I1011 01:54:41.093338 4743 scope.go:117] "RemoveContainer" containerID="3f38cba11851c95fa12b6d21a7746bd10b6e69b2d33debe34b378719ccfc7393" Oct 11 01:54:41 crc kubenswrapper[4743]: E1011 01:54:41.094398 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:54:52 crc kubenswrapper[4743]: I1011 01:54:52.092211 4743 scope.go:117] "RemoveContainer" containerID="3f38cba11851c95fa12b6d21a7746bd10b6e69b2d33debe34b378719ccfc7393" Oct 11 01:54:52 crc kubenswrapper[4743]: E1011 01:54:52.094490 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:55:07 crc kubenswrapper[4743]: I1011 01:55:07.091700 4743 scope.go:117] "RemoveContainer" containerID="3f38cba11851c95fa12b6d21a7746bd10b6e69b2d33debe34b378719ccfc7393" Oct 11 01:55:07 crc kubenswrapper[4743]: E1011 01:55:07.092517 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:55:19 crc kubenswrapper[4743]: I1011 01:55:19.092025 4743 scope.go:117] "RemoveContainer" containerID="3f38cba11851c95fa12b6d21a7746bd10b6e69b2d33debe34b378719ccfc7393" Oct 11 01:55:19 crc kubenswrapper[4743]: E1011 01:55:19.094484 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:55:32 crc kubenswrapper[4743]: I1011 01:55:32.092375 4743 scope.go:117] "RemoveContainer" containerID="3f38cba11851c95fa12b6d21a7746bd10b6e69b2d33debe34b378719ccfc7393" Oct 11 01:55:32 crc kubenswrapper[4743]: E1011 01:55:32.093180 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:55:43 crc kubenswrapper[4743]: I1011 01:55:43.092349 4743 scope.go:117] "RemoveContainer" containerID="3f38cba11851c95fa12b6d21a7746bd10b6e69b2d33debe34b378719ccfc7393" Oct 11 01:55:43 crc kubenswrapper[4743]: E1011 01:55:43.093276 4743 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:55:53 crc kubenswrapper[4743]: I1011 01:55:53.943766 4743 generic.go:334] "Generic (PLEG): container finished" podID="d792f039-d865-44e3-9474-be444dee2d03" containerID="9b60bbe44e2792de7c377f91077741eaaa55676b28121f334379fd2104fe3aa7" exitCode=0 Oct 11 01:55:53 crc kubenswrapper[4743]: I1011 01:55:53.943924 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx96l" event={"ID":"d792f039-d865-44e3-9474-be444dee2d03","Type":"ContainerDied","Data":"9b60bbe44e2792de7c377f91077741eaaa55676b28121f334379fd2104fe3aa7"} Oct 11 01:55:55 crc kubenswrapper[4743]: I1011 01:55:55.372621 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx96l" Oct 11 01:55:55 crc kubenswrapper[4743]: I1011 01:55:55.547952 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d792f039-d865-44e3-9474-be444dee2d03-neutron-ovn-metadata-agent-neutron-config-0\") pod \"d792f039-d865-44e3-9474-be444dee2d03\" (UID: \"d792f039-d865-44e3-9474-be444dee2d03\") " Oct 11 01:55:55 crc kubenswrapper[4743]: I1011 01:55:55.548039 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d792f039-d865-44e3-9474-be444dee2d03-inventory\") pod \"d792f039-d865-44e3-9474-be444dee2d03\" (UID: \"d792f039-d865-44e3-9474-be444dee2d03\") " Oct 11 01:55:55 crc kubenswrapper[4743]: I1011 01:55:55.548271 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d792f039-d865-44e3-9474-be444dee2d03-nova-metadata-neutron-config-0\") pod \"d792f039-d865-44e3-9474-be444dee2d03\" (UID: \"d792f039-d865-44e3-9474-be444dee2d03\") " Oct 11 01:55:55 crc kubenswrapper[4743]: I1011 01:55:55.548956 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d792f039-d865-44e3-9474-be444dee2d03-ceph\") pod \"d792f039-d865-44e3-9474-be444dee2d03\" (UID: \"d792f039-d865-44e3-9474-be444dee2d03\") " Oct 11 01:55:55 crc kubenswrapper[4743]: I1011 01:55:55.549013 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d792f039-d865-44e3-9474-be444dee2d03-ssh-key\") pod \"d792f039-d865-44e3-9474-be444dee2d03\" (UID: \"d792f039-d865-44e3-9474-be444dee2d03\") " Oct 11 01:55:55 crc kubenswrapper[4743]: I1011 01:55:55.549048 4743 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-84cdk\" (UniqueName: \"kubernetes.io/projected/d792f039-d865-44e3-9474-be444dee2d03-kube-api-access-84cdk\") pod \"d792f039-d865-44e3-9474-be444dee2d03\" (UID: \"d792f039-d865-44e3-9474-be444dee2d03\") " Oct 11 01:55:55 crc kubenswrapper[4743]: I1011 01:55:55.549144 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d792f039-d865-44e3-9474-be444dee2d03-neutron-metadata-combined-ca-bundle\") pod \"d792f039-d865-44e3-9474-be444dee2d03\" (UID: \"d792f039-d865-44e3-9474-be444dee2d03\") " Oct 11 01:55:55 crc kubenswrapper[4743]: I1011 01:55:55.561039 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d792f039-d865-44e3-9474-be444dee2d03-ceph" (OuterVolumeSpecName: "ceph") pod "d792f039-d865-44e3-9474-be444dee2d03" (UID: "d792f039-d865-44e3-9474-be444dee2d03"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:55:55 crc kubenswrapper[4743]: I1011 01:55:55.561104 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d792f039-d865-44e3-9474-be444dee2d03-kube-api-access-84cdk" (OuterVolumeSpecName: "kube-api-access-84cdk") pod "d792f039-d865-44e3-9474-be444dee2d03" (UID: "d792f039-d865-44e3-9474-be444dee2d03"). InnerVolumeSpecName "kube-api-access-84cdk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:55:55 crc kubenswrapper[4743]: I1011 01:55:55.562535 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d792f039-d865-44e3-9474-be444dee2d03-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "d792f039-d865-44e3-9474-be444dee2d03" (UID: "d792f039-d865-44e3-9474-be444dee2d03"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:55:55 crc kubenswrapper[4743]: I1011 01:55:55.578241 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d792f039-d865-44e3-9474-be444dee2d03-inventory" (OuterVolumeSpecName: "inventory") pod "d792f039-d865-44e3-9474-be444dee2d03" (UID: "d792f039-d865-44e3-9474-be444dee2d03"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:55:55 crc kubenswrapper[4743]: I1011 01:55:55.583505 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d792f039-d865-44e3-9474-be444dee2d03-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d792f039-d865-44e3-9474-be444dee2d03" (UID: "d792f039-d865-44e3-9474-be444dee2d03"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:55:55 crc kubenswrapper[4743]: I1011 01:55:55.586897 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d792f039-d865-44e3-9474-be444dee2d03-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "d792f039-d865-44e3-9474-be444dee2d03" (UID: "d792f039-d865-44e3-9474-be444dee2d03"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:55:55 crc kubenswrapper[4743]: I1011 01:55:55.601083 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d792f039-d865-44e3-9474-be444dee2d03-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "d792f039-d865-44e3-9474-be444dee2d03" (UID: "d792f039-d865-44e3-9474-be444dee2d03"). InnerVolumeSpecName "nova-metadata-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:55:55 crc kubenswrapper[4743]: I1011 01:55:55.651561 4743 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d792f039-d865-44e3-9474-be444dee2d03-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 01:55:55 crc kubenswrapper[4743]: I1011 01:55:55.651596 4743 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d792f039-d865-44e3-9474-be444dee2d03-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 11 01:55:55 crc kubenswrapper[4743]: I1011 01:55:55.651606 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d792f039-d865-44e3-9474-be444dee2d03-inventory\") on node \"crc\" DevicePath \"\"" Oct 11 01:55:55 crc kubenswrapper[4743]: I1011 01:55:55.651615 4743 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d792f039-d865-44e3-9474-be444dee2d03-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 11 01:55:55 crc kubenswrapper[4743]: I1011 01:55:55.651629 4743 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d792f039-d865-44e3-9474-be444dee2d03-ceph\") on node \"crc\" DevicePath \"\"" Oct 11 01:55:55 crc kubenswrapper[4743]: I1011 01:55:55.651639 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d792f039-d865-44e3-9474-be444dee2d03-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 11 01:55:55 crc kubenswrapper[4743]: I1011 01:55:55.651648 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84cdk\" (UniqueName: \"kubernetes.io/projected/d792f039-d865-44e3-9474-be444dee2d03-kube-api-access-84cdk\") on node \"crc\" 
DevicePath \"\"" Oct 11 01:55:55 crc kubenswrapper[4743]: I1011 01:55:55.969575 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx96l" event={"ID":"d792f039-d865-44e3-9474-be444dee2d03","Type":"ContainerDied","Data":"23b501a4cc12a91cb83b31253eeb8e29db9703ddb8e34353c58d25e624d1f9f1"} Oct 11 01:55:55 crc kubenswrapper[4743]: I1011 01:55:55.969638 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23b501a4cc12a91cb83b31253eeb8e29db9703ddb8e34353c58d25e624d1f9f1" Oct 11 01:55:55 crc kubenswrapper[4743]: I1011 01:55:55.969720 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx96l" Oct 11 01:55:56 crc kubenswrapper[4743]: I1011 01:55:56.086428 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nc8vt"] Oct 11 01:55:56 crc kubenswrapper[4743]: E1011 01:55:56.087453 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d792f039-d865-44e3-9474-be444dee2d03" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 11 01:55:56 crc kubenswrapper[4743]: I1011 01:55:56.087473 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d792f039-d865-44e3-9474-be444dee2d03" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 11 01:55:56 crc kubenswrapper[4743]: I1011 01:55:56.088035 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d792f039-d865-44e3-9474-be444dee2d03" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 11 01:55:56 crc kubenswrapper[4743]: I1011 01:55:56.089312 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nc8vt" Oct 11 01:55:56 crc kubenswrapper[4743]: I1011 01:55:56.094380 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8vmmn" Oct 11 01:55:56 crc kubenswrapper[4743]: I1011 01:55:56.094618 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Oct 11 01:55:56 crc kubenswrapper[4743]: I1011 01:55:56.096634 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 11 01:55:56 crc kubenswrapper[4743]: I1011 01:55:56.097142 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 11 01:55:56 crc kubenswrapper[4743]: I1011 01:55:56.099774 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 11 01:55:56 crc kubenswrapper[4743]: I1011 01:55:56.106273 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 11 01:55:56 crc kubenswrapper[4743]: I1011 01:55:56.125382 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nc8vt"] Oct 11 01:55:56 crc kubenswrapper[4743]: I1011 01:55:56.223844 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6ea65353-b389-4222-8ff8-298d53283609-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nc8vt\" (UID: \"6ea65353-b389-4222-8ff8-298d53283609\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nc8vt" Oct 11 01:55:56 crc kubenswrapper[4743]: I1011 01:55:56.223926 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/6ea65353-b389-4222-8ff8-298d53283609-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nc8vt\" (UID: \"6ea65353-b389-4222-8ff8-298d53283609\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nc8vt" Oct 11 01:55:56 crc kubenswrapper[4743]: I1011 01:55:56.224101 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ea65353-b389-4222-8ff8-298d53283609-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nc8vt\" (UID: \"6ea65353-b389-4222-8ff8-298d53283609\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nc8vt" Oct 11 01:55:56 crc kubenswrapper[4743]: I1011 01:55:56.224171 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/6ea65353-b389-4222-8ff8-298d53283609-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nc8vt\" (UID: \"6ea65353-b389-4222-8ff8-298d53283609\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nc8vt" Oct 11 01:55:56 crc kubenswrapper[4743]: I1011 01:55:56.224195 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z27rt\" (UniqueName: \"kubernetes.io/projected/6ea65353-b389-4222-8ff8-298d53283609-kube-api-access-z27rt\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nc8vt\" (UID: \"6ea65353-b389-4222-8ff8-298d53283609\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nc8vt" Oct 11 01:55:56 crc kubenswrapper[4743]: I1011 01:55:56.224234 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6ea65353-b389-4222-8ff8-298d53283609-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nc8vt\" (UID: \"6ea65353-b389-4222-8ff8-298d53283609\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nc8vt" Oct 11 01:55:56 crc kubenswrapper[4743]: I1011 01:55:56.326389 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ea65353-b389-4222-8ff8-298d53283609-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nc8vt\" (UID: \"6ea65353-b389-4222-8ff8-298d53283609\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nc8vt" Oct 11 01:55:56 crc kubenswrapper[4743]: I1011 01:55:56.326474 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/6ea65353-b389-4222-8ff8-298d53283609-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nc8vt\" (UID: \"6ea65353-b389-4222-8ff8-298d53283609\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nc8vt" Oct 11 01:55:56 crc kubenswrapper[4743]: I1011 01:55:56.326493 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z27rt\" (UniqueName: \"kubernetes.io/projected/6ea65353-b389-4222-8ff8-298d53283609-kube-api-access-z27rt\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nc8vt\" (UID: \"6ea65353-b389-4222-8ff8-298d53283609\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nc8vt" Oct 11 01:55:56 crc kubenswrapper[4743]: I1011 01:55:56.326529 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6ea65353-b389-4222-8ff8-298d53283609-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nc8vt\" (UID: \"6ea65353-b389-4222-8ff8-298d53283609\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nc8vt" Oct 11 01:55:56 crc kubenswrapper[4743]: I1011 01:55:56.326553 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/6ea65353-b389-4222-8ff8-298d53283609-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nc8vt\" (UID: \"6ea65353-b389-4222-8ff8-298d53283609\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nc8vt" Oct 11 01:55:56 crc kubenswrapper[4743]: I1011 01:55:56.326579 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6ea65353-b389-4222-8ff8-298d53283609-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nc8vt\" (UID: \"6ea65353-b389-4222-8ff8-298d53283609\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nc8vt" Oct 11 01:55:56 crc kubenswrapper[4743]: I1011 01:55:56.330501 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6ea65353-b389-4222-8ff8-298d53283609-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nc8vt\" (UID: \"6ea65353-b389-4222-8ff8-298d53283609\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nc8vt" Oct 11 01:55:56 crc kubenswrapper[4743]: I1011 01:55:56.333679 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ea65353-b389-4222-8ff8-298d53283609-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nc8vt\" (UID: \"6ea65353-b389-4222-8ff8-298d53283609\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nc8vt" Oct 11 01:55:56 crc kubenswrapper[4743]: I1011 01:55:56.334164 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/6ea65353-b389-4222-8ff8-298d53283609-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nc8vt\" (UID: \"6ea65353-b389-4222-8ff8-298d53283609\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nc8vt" Oct 11 01:55:56 crc kubenswrapper[4743]: I1011 
01:55:56.336456 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6ea65353-b389-4222-8ff8-298d53283609-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nc8vt\" (UID: \"6ea65353-b389-4222-8ff8-298d53283609\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nc8vt" Oct 11 01:55:56 crc kubenswrapper[4743]: I1011 01:55:56.337415 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6ea65353-b389-4222-8ff8-298d53283609-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nc8vt\" (UID: \"6ea65353-b389-4222-8ff8-298d53283609\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nc8vt" Oct 11 01:55:56 crc kubenswrapper[4743]: I1011 01:55:56.344435 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z27rt\" (UniqueName: \"kubernetes.io/projected/6ea65353-b389-4222-8ff8-298d53283609-kube-api-access-z27rt\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nc8vt\" (UID: \"6ea65353-b389-4222-8ff8-298d53283609\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nc8vt" Oct 11 01:55:56 crc kubenswrapper[4743]: I1011 01:55:56.425447 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nc8vt" Oct 11 01:55:57 crc kubenswrapper[4743]: I1011 01:55:57.091495 4743 scope.go:117] "RemoveContainer" containerID="3f38cba11851c95fa12b6d21a7746bd10b6e69b2d33debe34b378719ccfc7393" Oct 11 01:55:57 crc kubenswrapper[4743]: E1011 01:55:57.092358 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:55:57 crc kubenswrapper[4743]: I1011 01:55:57.252509 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nc8vt"] Oct 11 01:55:57 crc kubenswrapper[4743]: I1011 01:55:57.257717 4743 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 11 01:55:57 crc kubenswrapper[4743]: I1011 01:55:57.992872 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nc8vt" event={"ID":"6ea65353-b389-4222-8ff8-298d53283609","Type":"ContainerStarted","Data":"d0c9a46487243197a7e9e71e436516c2186e3335699cf1564b24ff7657de53e5"} Oct 11 01:55:59 crc kubenswrapper[4743]: I1011 01:55:59.002977 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nc8vt" event={"ID":"6ea65353-b389-4222-8ff8-298d53283609","Type":"ContainerStarted","Data":"6b1f78d75a105b7a27f4d8a7c24cc4ef9c0d6112402ef75044c17eb1ed5c9a5a"} Oct 11 01:55:59 crc kubenswrapper[4743]: I1011 01:55:59.025554 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nc8vt" 
podStartSLOduration=2.539708323 podStartE2EDuration="3.025530441s" podCreationTimestamp="2025-10-11 01:55:56 +0000 UTC" firstStartedPulling="2025-10-11 01:55:57.257478719 +0000 UTC m=+3851.910459116" lastFinishedPulling="2025-10-11 01:55:57.743300837 +0000 UTC m=+3852.396281234" observedRunningTime="2025-10-11 01:55:59.021369768 +0000 UTC m=+3853.674350175" watchObservedRunningTime="2025-10-11 01:55:59.025530441 +0000 UTC m=+3853.678510858" Oct 11 01:56:11 crc kubenswrapper[4743]: I1011 01:56:11.092456 4743 scope.go:117] "RemoveContainer" containerID="3f38cba11851c95fa12b6d21a7746bd10b6e69b2d33debe34b378719ccfc7393" Oct 11 01:56:11 crc kubenswrapper[4743]: E1011 01:56:11.093214 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 01:56:22 crc kubenswrapper[4743]: I1011 01:56:22.091706 4743 scope.go:117] "RemoveContainer" containerID="3f38cba11851c95fa12b6d21a7746bd10b6e69b2d33debe34b378719ccfc7393" Oct 11 01:56:23 crc kubenswrapper[4743]: I1011 01:56:23.272091 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" event={"ID":"add92263-e252-446b-95de-092585b4357f","Type":"ContainerStarted","Data":"0c6c02f491c08a5f1c576130dfd249554a30c097963b0370bce3acf791a47aa1"} Oct 11 01:58:08 crc kubenswrapper[4743]: I1011 01:58:08.962049 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cdqpv"] Oct 11 01:58:08 crc kubenswrapper[4743]: I1011 01:58:08.965694 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cdqpv" Oct 11 01:58:09 crc kubenswrapper[4743]: I1011 01:58:09.008431 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cdqpv"] Oct 11 01:58:09 crc kubenswrapper[4743]: I1011 01:58:09.079244 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4bnm\" (UniqueName: \"kubernetes.io/projected/765e18b7-07c6-4c5a-9ade-125519acd56d-kube-api-access-p4bnm\") pod \"redhat-operators-cdqpv\" (UID: \"765e18b7-07c6-4c5a-9ade-125519acd56d\") " pod="openshift-marketplace/redhat-operators-cdqpv" Oct 11 01:58:09 crc kubenswrapper[4743]: I1011 01:58:09.079324 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/765e18b7-07c6-4c5a-9ade-125519acd56d-utilities\") pod \"redhat-operators-cdqpv\" (UID: \"765e18b7-07c6-4c5a-9ade-125519acd56d\") " pod="openshift-marketplace/redhat-operators-cdqpv" Oct 11 01:58:09 crc kubenswrapper[4743]: I1011 01:58:09.079409 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/765e18b7-07c6-4c5a-9ade-125519acd56d-catalog-content\") pod \"redhat-operators-cdqpv\" (UID: \"765e18b7-07c6-4c5a-9ade-125519acd56d\") " pod="openshift-marketplace/redhat-operators-cdqpv" Oct 11 01:58:09 crc kubenswrapper[4743]: I1011 01:58:09.181760 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/765e18b7-07c6-4c5a-9ade-125519acd56d-catalog-content\") pod \"redhat-operators-cdqpv\" (UID: \"765e18b7-07c6-4c5a-9ade-125519acd56d\") " pod="openshift-marketplace/redhat-operators-cdqpv" Oct 11 01:58:09 crc kubenswrapper[4743]: I1011 01:58:09.183114 4743 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-p4bnm\" (UniqueName: \"kubernetes.io/projected/765e18b7-07c6-4c5a-9ade-125519acd56d-kube-api-access-p4bnm\") pod \"redhat-operators-cdqpv\" (UID: \"765e18b7-07c6-4c5a-9ade-125519acd56d\") " pod="openshift-marketplace/redhat-operators-cdqpv" Oct 11 01:58:09 crc kubenswrapper[4743]: I1011 01:58:09.183324 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/765e18b7-07c6-4c5a-9ade-125519acd56d-utilities\") pod \"redhat-operators-cdqpv\" (UID: \"765e18b7-07c6-4c5a-9ade-125519acd56d\") " pod="openshift-marketplace/redhat-operators-cdqpv" Oct 11 01:58:09 crc kubenswrapper[4743]: I1011 01:58:09.183723 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/765e18b7-07c6-4c5a-9ade-125519acd56d-catalog-content\") pod \"redhat-operators-cdqpv\" (UID: \"765e18b7-07c6-4c5a-9ade-125519acd56d\") " pod="openshift-marketplace/redhat-operators-cdqpv" Oct 11 01:58:09 crc kubenswrapper[4743]: I1011 01:58:09.183760 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/765e18b7-07c6-4c5a-9ade-125519acd56d-utilities\") pod \"redhat-operators-cdqpv\" (UID: \"765e18b7-07c6-4c5a-9ade-125519acd56d\") " pod="openshift-marketplace/redhat-operators-cdqpv" Oct 11 01:58:09 crc kubenswrapper[4743]: I1011 01:58:09.201108 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4bnm\" (UniqueName: \"kubernetes.io/projected/765e18b7-07c6-4c5a-9ade-125519acd56d-kube-api-access-p4bnm\") pod \"redhat-operators-cdqpv\" (UID: \"765e18b7-07c6-4c5a-9ade-125519acd56d\") " pod="openshift-marketplace/redhat-operators-cdqpv" Oct 11 01:58:09 crc kubenswrapper[4743]: I1011 01:58:09.293429 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cdqpv" Oct 11 01:58:09 crc kubenswrapper[4743]: I1011 01:58:09.869242 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cdqpv"] Oct 11 01:58:09 crc kubenswrapper[4743]: W1011 01:58:09.881486 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod765e18b7_07c6_4c5a_9ade_125519acd56d.slice/crio-690d53b28289b1ec884e75cfd63b136e373b8c2108f91c6cf020c3080b0e7c1e WatchSource:0}: Error finding container 690d53b28289b1ec884e75cfd63b136e373b8c2108f91c6cf020c3080b0e7c1e: Status 404 returned error can't find the container with id 690d53b28289b1ec884e75cfd63b136e373b8c2108f91c6cf020c3080b0e7c1e Oct 11 01:58:10 crc kubenswrapper[4743]: I1011 01:58:10.583922 4743 generic.go:334] "Generic (PLEG): container finished" podID="765e18b7-07c6-4c5a-9ade-125519acd56d" containerID="28e3d0944841653e74907c8e526eec311a2d3be97b435178ed06559a1eed0af4" exitCode=0 Oct 11 01:58:10 crc kubenswrapper[4743]: I1011 01:58:10.583982 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cdqpv" event={"ID":"765e18b7-07c6-4c5a-9ade-125519acd56d","Type":"ContainerDied","Data":"28e3d0944841653e74907c8e526eec311a2d3be97b435178ed06559a1eed0af4"} Oct 11 01:58:10 crc kubenswrapper[4743]: I1011 01:58:10.584245 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cdqpv" event={"ID":"765e18b7-07c6-4c5a-9ade-125519acd56d","Type":"ContainerStarted","Data":"690d53b28289b1ec884e75cfd63b136e373b8c2108f91c6cf020c3080b0e7c1e"} Oct 11 01:58:12 crc kubenswrapper[4743]: I1011 01:58:12.609092 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cdqpv" 
event={"ID":"765e18b7-07c6-4c5a-9ade-125519acd56d","Type":"ContainerStarted","Data":"4f87eb815b086fa6a9da34e53bdc5de23638c2fea6b50577b61135c5a65a28fb"} Oct 11 01:58:15 crc kubenswrapper[4743]: I1011 01:58:15.665418 4743 generic.go:334] "Generic (PLEG): container finished" podID="765e18b7-07c6-4c5a-9ade-125519acd56d" containerID="4f87eb815b086fa6a9da34e53bdc5de23638c2fea6b50577b61135c5a65a28fb" exitCode=0 Oct 11 01:58:15 crc kubenswrapper[4743]: I1011 01:58:15.665515 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cdqpv" event={"ID":"765e18b7-07c6-4c5a-9ade-125519acd56d","Type":"ContainerDied","Data":"4f87eb815b086fa6a9da34e53bdc5de23638c2fea6b50577b61135c5a65a28fb"} Oct 11 01:58:16 crc kubenswrapper[4743]: I1011 01:58:16.680748 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cdqpv" event={"ID":"765e18b7-07c6-4c5a-9ade-125519acd56d","Type":"ContainerStarted","Data":"857c01630f84b2e6694ebdad4ce0a148f2a27bb07db1bf1a9d67371e951d1fc8"} Oct 11 01:58:16 crc kubenswrapper[4743]: I1011 01:58:16.707945 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cdqpv" podStartSLOduration=3.249458775 podStartE2EDuration="8.707928369s" podCreationTimestamp="2025-10-11 01:58:08 +0000 UTC" firstStartedPulling="2025-10-11 01:58:10.586526777 +0000 UTC m=+3985.239507174" lastFinishedPulling="2025-10-11 01:58:16.044996331 +0000 UTC m=+3990.697976768" observedRunningTime="2025-10-11 01:58:16.698044034 +0000 UTC m=+3991.351024431" watchObservedRunningTime="2025-10-11 01:58:16.707928369 +0000 UTC m=+3991.360908766" Oct 11 01:58:19 crc kubenswrapper[4743]: I1011 01:58:19.294486 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cdqpv" Oct 11 01:58:19 crc kubenswrapper[4743]: I1011 01:58:19.295457 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-operators-cdqpv" Oct 11 01:58:20 crc kubenswrapper[4743]: I1011 01:58:20.374895 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-cdqpv" podUID="765e18b7-07c6-4c5a-9ade-125519acd56d" containerName="registry-server" probeResult="failure" output=< Oct 11 01:58:20 crc kubenswrapper[4743]: timeout: failed to connect service ":50051" within 1s Oct 11 01:58:20 crc kubenswrapper[4743]: > Oct 11 01:58:29 crc kubenswrapper[4743]: I1011 01:58:29.363171 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cdqpv" Oct 11 01:58:29 crc kubenswrapper[4743]: I1011 01:58:29.441257 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cdqpv" Oct 11 01:58:29 crc kubenswrapper[4743]: I1011 01:58:29.608382 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cdqpv"] Oct 11 01:58:30 crc kubenswrapper[4743]: I1011 01:58:30.855013 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cdqpv" podUID="765e18b7-07c6-4c5a-9ade-125519acd56d" containerName="registry-server" containerID="cri-o://857c01630f84b2e6694ebdad4ce0a148f2a27bb07db1bf1a9d67371e951d1fc8" gracePeriod=2 Oct 11 01:58:31 crc kubenswrapper[4743]: I1011 01:58:31.869163 4743 generic.go:334] "Generic (PLEG): container finished" podID="765e18b7-07c6-4c5a-9ade-125519acd56d" containerID="857c01630f84b2e6694ebdad4ce0a148f2a27bb07db1bf1a9d67371e951d1fc8" exitCode=0 Oct 11 01:58:31 crc kubenswrapper[4743]: I1011 01:58:31.869297 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cdqpv" event={"ID":"765e18b7-07c6-4c5a-9ade-125519acd56d","Type":"ContainerDied","Data":"857c01630f84b2e6694ebdad4ce0a148f2a27bb07db1bf1a9d67371e951d1fc8"} Oct 11 01:58:31 crc 
kubenswrapper[4743]: I1011 01:58:31.869494 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cdqpv" event={"ID":"765e18b7-07c6-4c5a-9ade-125519acd56d","Type":"ContainerDied","Data":"690d53b28289b1ec884e75cfd63b136e373b8c2108f91c6cf020c3080b0e7c1e"} Oct 11 01:58:31 crc kubenswrapper[4743]: I1011 01:58:31.869512 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="690d53b28289b1ec884e75cfd63b136e373b8c2108f91c6cf020c3080b0e7c1e" Oct 11 01:58:32 crc kubenswrapper[4743]: I1011 01:58:32.577744 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cdqpv" Oct 11 01:58:32 crc kubenswrapper[4743]: I1011 01:58:32.688436 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/765e18b7-07c6-4c5a-9ade-125519acd56d-catalog-content\") pod \"765e18b7-07c6-4c5a-9ade-125519acd56d\" (UID: \"765e18b7-07c6-4c5a-9ade-125519acd56d\") " Oct 11 01:58:32 crc kubenswrapper[4743]: I1011 01:58:32.688553 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/765e18b7-07c6-4c5a-9ade-125519acd56d-utilities\") pod \"765e18b7-07c6-4c5a-9ade-125519acd56d\" (UID: \"765e18b7-07c6-4c5a-9ade-125519acd56d\") " Oct 11 01:58:32 crc kubenswrapper[4743]: I1011 01:58:32.688650 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4bnm\" (UniqueName: \"kubernetes.io/projected/765e18b7-07c6-4c5a-9ade-125519acd56d-kube-api-access-p4bnm\") pod \"765e18b7-07c6-4c5a-9ade-125519acd56d\" (UID: \"765e18b7-07c6-4c5a-9ade-125519acd56d\") " Oct 11 01:58:32 crc kubenswrapper[4743]: I1011 01:58:32.689599 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/765e18b7-07c6-4c5a-9ade-125519acd56d-utilities" 
(OuterVolumeSpecName: "utilities") pod "765e18b7-07c6-4c5a-9ade-125519acd56d" (UID: "765e18b7-07c6-4c5a-9ade-125519acd56d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:58:32 crc kubenswrapper[4743]: I1011 01:58:32.698618 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/765e18b7-07c6-4c5a-9ade-125519acd56d-kube-api-access-p4bnm" (OuterVolumeSpecName: "kube-api-access-p4bnm") pod "765e18b7-07c6-4c5a-9ade-125519acd56d" (UID: "765e18b7-07c6-4c5a-9ade-125519acd56d"). InnerVolumeSpecName "kube-api-access-p4bnm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:58:32 crc kubenswrapper[4743]: I1011 01:58:32.790977 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/765e18b7-07c6-4c5a-9ade-125519acd56d-utilities\") on node \"crc\" DevicePath \"\"" Oct 11 01:58:32 crc kubenswrapper[4743]: I1011 01:58:32.791013 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4bnm\" (UniqueName: \"kubernetes.io/projected/765e18b7-07c6-4c5a-9ade-125519acd56d-kube-api-access-p4bnm\") on node \"crc\" DevicePath \"\"" Oct 11 01:58:32 crc kubenswrapper[4743]: I1011 01:58:32.791212 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/765e18b7-07c6-4c5a-9ade-125519acd56d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "765e18b7-07c6-4c5a-9ade-125519acd56d" (UID: "765e18b7-07c6-4c5a-9ade-125519acd56d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:58:32 crc kubenswrapper[4743]: I1011 01:58:32.883329 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cdqpv" Oct 11 01:58:32 crc kubenswrapper[4743]: I1011 01:58:32.892808 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/765e18b7-07c6-4c5a-9ade-125519acd56d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 11 01:58:32 crc kubenswrapper[4743]: I1011 01:58:32.926059 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cdqpv"] Oct 11 01:58:32 crc kubenswrapper[4743]: I1011 01:58:32.935245 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cdqpv"] Oct 11 01:58:34 crc kubenswrapper[4743]: I1011 01:58:34.108723 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="765e18b7-07c6-4c5a-9ade-125519acd56d" path="/var/lib/kubelet/pods/765e18b7-07c6-4c5a-9ade-125519acd56d/volumes" Oct 11 01:58:44 crc kubenswrapper[4743]: I1011 01:58:44.458773 4743 patch_prober.go:28] interesting pod/machine-config-daemon-cvm72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 01:58:44 crc kubenswrapper[4743]: I1011 01:58:44.459356 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 01:59:14 crc kubenswrapper[4743]: I1011 01:59:14.458214 4743 patch_prober.go:28] interesting pod/machine-config-daemon-cvm72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" start-of-body= Oct 11 01:59:14 crc kubenswrapper[4743]: I1011 01:59:14.458813 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 01:59:14 crc kubenswrapper[4743]: I1011 01:59:14.574492 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bsmdx"] Oct 11 01:59:14 crc kubenswrapper[4743]: E1011 01:59:14.574993 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="765e18b7-07c6-4c5a-9ade-125519acd56d" containerName="extract-utilities" Oct 11 01:59:14 crc kubenswrapper[4743]: I1011 01:59:14.575009 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="765e18b7-07c6-4c5a-9ade-125519acd56d" containerName="extract-utilities" Oct 11 01:59:14 crc kubenswrapper[4743]: E1011 01:59:14.575027 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="765e18b7-07c6-4c5a-9ade-125519acd56d" containerName="registry-server" Oct 11 01:59:14 crc kubenswrapper[4743]: I1011 01:59:14.575033 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="765e18b7-07c6-4c5a-9ade-125519acd56d" containerName="registry-server" Oct 11 01:59:14 crc kubenswrapper[4743]: E1011 01:59:14.575052 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="765e18b7-07c6-4c5a-9ade-125519acd56d" containerName="extract-content" Oct 11 01:59:14 crc kubenswrapper[4743]: I1011 01:59:14.575058 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="765e18b7-07c6-4c5a-9ade-125519acd56d" containerName="extract-content" Oct 11 01:59:14 crc kubenswrapper[4743]: I1011 01:59:14.575314 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="765e18b7-07c6-4c5a-9ade-125519acd56d" 
containerName="registry-server" Oct 11 01:59:14 crc kubenswrapper[4743]: I1011 01:59:14.576880 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bsmdx" Oct 11 01:59:14 crc kubenswrapper[4743]: I1011 01:59:14.590471 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bsmdx"] Oct 11 01:59:14 crc kubenswrapper[4743]: I1011 01:59:14.696565 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87c6n\" (UniqueName: \"kubernetes.io/projected/0c374566-8d2a-4ae7-98a5-dc2917839847-kube-api-access-87c6n\") pod \"certified-operators-bsmdx\" (UID: \"0c374566-8d2a-4ae7-98a5-dc2917839847\") " pod="openshift-marketplace/certified-operators-bsmdx" Oct 11 01:59:14 crc kubenswrapper[4743]: I1011 01:59:14.696757 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c374566-8d2a-4ae7-98a5-dc2917839847-utilities\") pod \"certified-operators-bsmdx\" (UID: \"0c374566-8d2a-4ae7-98a5-dc2917839847\") " pod="openshift-marketplace/certified-operators-bsmdx" Oct 11 01:59:14 crc kubenswrapper[4743]: I1011 01:59:14.696803 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c374566-8d2a-4ae7-98a5-dc2917839847-catalog-content\") pod \"certified-operators-bsmdx\" (UID: \"0c374566-8d2a-4ae7-98a5-dc2917839847\") " pod="openshift-marketplace/certified-operators-bsmdx" Oct 11 01:59:14 crc kubenswrapper[4743]: I1011 01:59:14.798619 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c374566-8d2a-4ae7-98a5-dc2917839847-utilities\") pod \"certified-operators-bsmdx\" (UID: \"0c374566-8d2a-4ae7-98a5-dc2917839847\") " 
pod="openshift-marketplace/certified-operators-bsmdx" Oct 11 01:59:14 crc kubenswrapper[4743]: I1011 01:59:14.798938 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c374566-8d2a-4ae7-98a5-dc2917839847-catalog-content\") pod \"certified-operators-bsmdx\" (UID: \"0c374566-8d2a-4ae7-98a5-dc2917839847\") " pod="openshift-marketplace/certified-operators-bsmdx" Oct 11 01:59:14 crc kubenswrapper[4743]: I1011 01:59:14.799032 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87c6n\" (UniqueName: \"kubernetes.io/projected/0c374566-8d2a-4ae7-98a5-dc2917839847-kube-api-access-87c6n\") pod \"certified-operators-bsmdx\" (UID: \"0c374566-8d2a-4ae7-98a5-dc2917839847\") " pod="openshift-marketplace/certified-operators-bsmdx" Oct 11 01:59:14 crc kubenswrapper[4743]: I1011 01:59:14.799406 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c374566-8d2a-4ae7-98a5-dc2917839847-utilities\") pod \"certified-operators-bsmdx\" (UID: \"0c374566-8d2a-4ae7-98a5-dc2917839847\") " pod="openshift-marketplace/certified-operators-bsmdx" Oct 11 01:59:14 crc kubenswrapper[4743]: I1011 01:59:14.799434 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c374566-8d2a-4ae7-98a5-dc2917839847-catalog-content\") pod \"certified-operators-bsmdx\" (UID: \"0c374566-8d2a-4ae7-98a5-dc2917839847\") " pod="openshift-marketplace/certified-operators-bsmdx" Oct 11 01:59:14 crc kubenswrapper[4743]: I1011 01:59:14.817469 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87c6n\" (UniqueName: \"kubernetes.io/projected/0c374566-8d2a-4ae7-98a5-dc2917839847-kube-api-access-87c6n\") pod \"certified-operators-bsmdx\" (UID: \"0c374566-8d2a-4ae7-98a5-dc2917839847\") " 
pod="openshift-marketplace/certified-operators-bsmdx" Oct 11 01:59:14 crc kubenswrapper[4743]: I1011 01:59:14.899050 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bsmdx" Oct 11 01:59:15 crc kubenswrapper[4743]: I1011 01:59:15.486710 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bsmdx"] Oct 11 01:59:16 crc kubenswrapper[4743]: I1011 01:59:16.445952 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bsmdx" event={"ID":"0c374566-8d2a-4ae7-98a5-dc2917839847","Type":"ContainerStarted","Data":"6adbcb392d2c074d6f939ece7b6d278958891e6b625e5bb8a96a27d634ea20c9"} Oct 11 01:59:17 crc kubenswrapper[4743]: I1011 01:59:17.458307 4743 generic.go:334] "Generic (PLEG): container finished" podID="0c374566-8d2a-4ae7-98a5-dc2917839847" containerID="0748a0d17d9291e70bb788e119d5a99bb03e8dc488f7c509529a4c9ddf797b8f" exitCode=0 Oct 11 01:59:17 crc kubenswrapper[4743]: I1011 01:59:17.458405 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bsmdx" event={"ID":"0c374566-8d2a-4ae7-98a5-dc2917839847","Type":"ContainerDied","Data":"0748a0d17d9291e70bb788e119d5a99bb03e8dc488f7c509529a4c9ddf797b8f"} Oct 11 01:59:19 crc kubenswrapper[4743]: I1011 01:59:19.484175 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bsmdx" event={"ID":"0c374566-8d2a-4ae7-98a5-dc2917839847","Type":"ContainerStarted","Data":"71e72ad004a6950a2cd79e91206b28b806b93365483e2a9c1fb1c41ec0992de1"} Oct 11 01:59:20 crc kubenswrapper[4743]: I1011 01:59:20.499477 4743 generic.go:334] "Generic (PLEG): container finished" podID="0c374566-8d2a-4ae7-98a5-dc2917839847" containerID="71e72ad004a6950a2cd79e91206b28b806b93365483e2a9c1fb1c41ec0992de1" exitCode=0 Oct 11 01:59:20 crc kubenswrapper[4743]: I1011 01:59:20.499541 4743 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/certified-operators-bsmdx" event={"ID":"0c374566-8d2a-4ae7-98a5-dc2917839847","Type":"ContainerDied","Data":"71e72ad004a6950a2cd79e91206b28b806b93365483e2a9c1fb1c41ec0992de1"} Oct 11 01:59:21 crc kubenswrapper[4743]: I1011 01:59:21.512965 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bsmdx" event={"ID":"0c374566-8d2a-4ae7-98a5-dc2917839847","Type":"ContainerStarted","Data":"8e2495527090d65dfc6156b79c5f031592af494e4ce6c7655e5de295cb6fb0ad"} Oct 11 01:59:21 crc kubenswrapper[4743]: I1011 01:59:21.549588 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bsmdx" podStartSLOduration=4.037971657 podStartE2EDuration="7.549558816s" podCreationTimestamp="2025-10-11 01:59:14 +0000 UTC" firstStartedPulling="2025-10-11 01:59:17.460018046 +0000 UTC m=+4052.112998453" lastFinishedPulling="2025-10-11 01:59:20.971605185 +0000 UTC m=+4055.624585612" observedRunningTime="2025-10-11 01:59:21.538734318 +0000 UTC m=+4056.191714725" watchObservedRunningTime="2025-10-11 01:59:21.549558816 +0000 UTC m=+4056.202539243" Oct 11 01:59:24 crc kubenswrapper[4743]: I1011 01:59:24.899537 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bsmdx" Oct 11 01:59:24 crc kubenswrapper[4743]: I1011 01:59:24.900008 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bsmdx" Oct 11 01:59:24 crc kubenswrapper[4743]: I1011 01:59:24.959313 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bsmdx" Oct 11 01:59:34 crc kubenswrapper[4743]: I1011 01:59:34.966029 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bsmdx" Oct 11 01:59:35 crc kubenswrapper[4743]: I1011 
01:59:35.034280 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bsmdx"] Oct 11 01:59:35 crc kubenswrapper[4743]: I1011 01:59:35.708992 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bsmdx" podUID="0c374566-8d2a-4ae7-98a5-dc2917839847" containerName="registry-server" containerID="cri-o://8e2495527090d65dfc6156b79c5f031592af494e4ce6c7655e5de295cb6fb0ad" gracePeriod=2 Oct 11 01:59:36 crc kubenswrapper[4743]: I1011 01:59:36.250106 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bsmdx" Oct 11 01:59:36 crc kubenswrapper[4743]: I1011 01:59:36.451309 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c374566-8d2a-4ae7-98a5-dc2917839847-catalog-content\") pod \"0c374566-8d2a-4ae7-98a5-dc2917839847\" (UID: \"0c374566-8d2a-4ae7-98a5-dc2917839847\") " Oct 11 01:59:36 crc kubenswrapper[4743]: I1011 01:59:36.451407 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87c6n\" (UniqueName: \"kubernetes.io/projected/0c374566-8d2a-4ae7-98a5-dc2917839847-kube-api-access-87c6n\") pod \"0c374566-8d2a-4ae7-98a5-dc2917839847\" (UID: \"0c374566-8d2a-4ae7-98a5-dc2917839847\") " Oct 11 01:59:36 crc kubenswrapper[4743]: I1011 01:59:36.451490 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c374566-8d2a-4ae7-98a5-dc2917839847-utilities\") pod \"0c374566-8d2a-4ae7-98a5-dc2917839847\" (UID: \"0c374566-8d2a-4ae7-98a5-dc2917839847\") " Oct 11 01:59:36 crc kubenswrapper[4743]: I1011 01:59:36.452627 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c374566-8d2a-4ae7-98a5-dc2917839847-utilities" (OuterVolumeSpecName: 
"utilities") pod "0c374566-8d2a-4ae7-98a5-dc2917839847" (UID: "0c374566-8d2a-4ae7-98a5-dc2917839847"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:59:36 crc kubenswrapper[4743]: I1011 01:59:36.459887 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c374566-8d2a-4ae7-98a5-dc2917839847-kube-api-access-87c6n" (OuterVolumeSpecName: "kube-api-access-87c6n") pod "0c374566-8d2a-4ae7-98a5-dc2917839847" (UID: "0c374566-8d2a-4ae7-98a5-dc2917839847"). InnerVolumeSpecName "kube-api-access-87c6n". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:59:36 crc kubenswrapper[4743]: I1011 01:59:36.502941 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c374566-8d2a-4ae7-98a5-dc2917839847-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0c374566-8d2a-4ae7-98a5-dc2917839847" (UID: "0c374566-8d2a-4ae7-98a5-dc2917839847"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 01:59:36 crc kubenswrapper[4743]: I1011 01:59:36.554333 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c374566-8d2a-4ae7-98a5-dc2917839847-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 11 01:59:36 crc kubenswrapper[4743]: I1011 01:59:36.554372 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87c6n\" (UniqueName: \"kubernetes.io/projected/0c374566-8d2a-4ae7-98a5-dc2917839847-kube-api-access-87c6n\") on node \"crc\" DevicePath \"\"" Oct 11 01:59:36 crc kubenswrapper[4743]: I1011 01:59:36.554386 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c374566-8d2a-4ae7-98a5-dc2917839847-utilities\") on node \"crc\" DevicePath \"\"" Oct 11 01:59:36 crc kubenswrapper[4743]: I1011 01:59:36.722225 4743 generic.go:334] "Generic (PLEG): container finished" podID="0c374566-8d2a-4ae7-98a5-dc2917839847" containerID="8e2495527090d65dfc6156b79c5f031592af494e4ce6c7655e5de295cb6fb0ad" exitCode=0 Oct 11 01:59:36 crc kubenswrapper[4743]: I1011 01:59:36.722291 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bsmdx" event={"ID":"0c374566-8d2a-4ae7-98a5-dc2917839847","Type":"ContainerDied","Data":"8e2495527090d65dfc6156b79c5f031592af494e4ce6c7655e5de295cb6fb0ad"} Oct 11 01:59:36 crc kubenswrapper[4743]: I1011 01:59:36.722353 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bsmdx" event={"ID":"0c374566-8d2a-4ae7-98a5-dc2917839847","Type":"ContainerDied","Data":"6adbcb392d2c074d6f939ece7b6d278958891e6b625e5bb8a96a27d634ea20c9"} Oct 11 01:59:36 crc kubenswrapper[4743]: I1011 01:59:36.722352 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bsmdx" Oct 11 01:59:36 crc kubenswrapper[4743]: I1011 01:59:36.722377 4743 scope.go:117] "RemoveContainer" containerID="8e2495527090d65dfc6156b79c5f031592af494e4ce6c7655e5de295cb6fb0ad" Oct 11 01:59:36 crc kubenswrapper[4743]: I1011 01:59:36.760184 4743 scope.go:117] "RemoveContainer" containerID="71e72ad004a6950a2cd79e91206b28b806b93365483e2a9c1fb1c41ec0992de1" Oct 11 01:59:36 crc kubenswrapper[4743]: I1011 01:59:36.797958 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bsmdx"] Oct 11 01:59:36 crc kubenswrapper[4743]: I1011 01:59:36.806448 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bsmdx"] Oct 11 01:59:36 crc kubenswrapper[4743]: I1011 01:59:36.807254 4743 scope.go:117] "RemoveContainer" containerID="0748a0d17d9291e70bb788e119d5a99bb03e8dc488f7c509529a4c9ddf797b8f" Oct 11 01:59:36 crc kubenswrapper[4743]: I1011 01:59:36.848786 4743 scope.go:117] "RemoveContainer" containerID="8e2495527090d65dfc6156b79c5f031592af494e4ce6c7655e5de295cb6fb0ad" Oct 11 01:59:36 crc kubenswrapper[4743]: E1011 01:59:36.849844 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e2495527090d65dfc6156b79c5f031592af494e4ce6c7655e5de295cb6fb0ad\": container with ID starting with 8e2495527090d65dfc6156b79c5f031592af494e4ce6c7655e5de295cb6fb0ad not found: ID does not exist" containerID="8e2495527090d65dfc6156b79c5f031592af494e4ce6c7655e5de295cb6fb0ad" Oct 11 01:59:36 crc kubenswrapper[4743]: I1011 01:59:36.849917 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e2495527090d65dfc6156b79c5f031592af494e4ce6c7655e5de295cb6fb0ad"} err="failed to get container status \"8e2495527090d65dfc6156b79c5f031592af494e4ce6c7655e5de295cb6fb0ad\": rpc error: code = NotFound desc = could not find 
container \"8e2495527090d65dfc6156b79c5f031592af494e4ce6c7655e5de295cb6fb0ad\": container with ID starting with 8e2495527090d65dfc6156b79c5f031592af494e4ce6c7655e5de295cb6fb0ad not found: ID does not exist" Oct 11 01:59:36 crc kubenswrapper[4743]: I1011 01:59:36.849945 4743 scope.go:117] "RemoveContainer" containerID="71e72ad004a6950a2cd79e91206b28b806b93365483e2a9c1fb1c41ec0992de1" Oct 11 01:59:36 crc kubenswrapper[4743]: E1011 01:59:36.850346 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71e72ad004a6950a2cd79e91206b28b806b93365483e2a9c1fb1c41ec0992de1\": container with ID starting with 71e72ad004a6950a2cd79e91206b28b806b93365483e2a9c1fb1c41ec0992de1 not found: ID does not exist" containerID="71e72ad004a6950a2cd79e91206b28b806b93365483e2a9c1fb1c41ec0992de1" Oct 11 01:59:36 crc kubenswrapper[4743]: I1011 01:59:36.850412 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71e72ad004a6950a2cd79e91206b28b806b93365483e2a9c1fb1c41ec0992de1"} err="failed to get container status \"71e72ad004a6950a2cd79e91206b28b806b93365483e2a9c1fb1c41ec0992de1\": rpc error: code = NotFound desc = could not find container \"71e72ad004a6950a2cd79e91206b28b806b93365483e2a9c1fb1c41ec0992de1\": container with ID starting with 71e72ad004a6950a2cd79e91206b28b806b93365483e2a9c1fb1c41ec0992de1 not found: ID does not exist" Oct 11 01:59:36 crc kubenswrapper[4743]: I1011 01:59:36.850461 4743 scope.go:117] "RemoveContainer" containerID="0748a0d17d9291e70bb788e119d5a99bb03e8dc488f7c509529a4c9ddf797b8f" Oct 11 01:59:36 crc kubenswrapper[4743]: E1011 01:59:36.850740 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0748a0d17d9291e70bb788e119d5a99bb03e8dc488f7c509529a4c9ddf797b8f\": container with ID starting with 0748a0d17d9291e70bb788e119d5a99bb03e8dc488f7c509529a4c9ddf797b8f not found: ID does 
not exist" containerID="0748a0d17d9291e70bb788e119d5a99bb03e8dc488f7c509529a4c9ddf797b8f" Oct 11 01:59:36 crc kubenswrapper[4743]: I1011 01:59:36.850767 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0748a0d17d9291e70bb788e119d5a99bb03e8dc488f7c509529a4c9ddf797b8f"} err="failed to get container status \"0748a0d17d9291e70bb788e119d5a99bb03e8dc488f7c509529a4c9ddf797b8f\": rpc error: code = NotFound desc = could not find container \"0748a0d17d9291e70bb788e119d5a99bb03e8dc488f7c509529a4c9ddf797b8f\": container with ID starting with 0748a0d17d9291e70bb788e119d5a99bb03e8dc488f7c509529a4c9ddf797b8f not found: ID does not exist" Oct 11 01:59:38 crc kubenswrapper[4743]: I1011 01:59:38.103741 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c374566-8d2a-4ae7-98a5-dc2917839847" path="/var/lib/kubelet/pods/0c374566-8d2a-4ae7-98a5-dc2917839847/volumes" Oct 11 01:59:44 crc kubenswrapper[4743]: I1011 01:59:44.458141 4743 patch_prober.go:28] interesting pod/machine-config-daemon-cvm72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 01:59:44 crc kubenswrapper[4743]: I1011 01:59:44.461242 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 01:59:44 crc kubenswrapper[4743]: I1011 01:59:44.461444 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" Oct 11 01:59:44 crc kubenswrapper[4743]: I1011 01:59:44.462526 4743 kuberuntime_manager.go:1027] "Message for 
Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0c6c02f491c08a5f1c576130dfd249554a30c097963b0370bce3acf791a47aa1"} pod="openshift-machine-config-operator/machine-config-daemon-cvm72" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 11 01:59:44 crc kubenswrapper[4743]: I1011 01:59:44.462735 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" containerID="cri-o://0c6c02f491c08a5f1c576130dfd249554a30c097963b0370bce3acf791a47aa1" gracePeriod=600 Oct 11 01:59:44 crc kubenswrapper[4743]: I1011 01:59:44.815357 4743 generic.go:334] "Generic (PLEG): container finished" podID="add92263-e252-446b-95de-092585b4357f" containerID="0c6c02f491c08a5f1c576130dfd249554a30c097963b0370bce3acf791a47aa1" exitCode=0 Oct 11 01:59:44 crc kubenswrapper[4743]: I1011 01:59:44.815437 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" event={"ID":"add92263-e252-446b-95de-092585b4357f","Type":"ContainerDied","Data":"0c6c02f491c08a5f1c576130dfd249554a30c097963b0370bce3acf791a47aa1"} Oct 11 01:59:44 crc kubenswrapper[4743]: I1011 01:59:44.815704 4743 scope.go:117] "RemoveContainer" containerID="3f38cba11851c95fa12b6d21a7746bd10b6e69b2d33debe34b378719ccfc7393" Oct 11 01:59:45 crc kubenswrapper[4743]: I1011 01:59:45.826042 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" event={"ID":"add92263-e252-446b-95de-092585b4357f","Type":"ContainerStarted","Data":"97301fbf68f97bd61a41cd45f206f9ebf6f1a6c4391a359908693157e0acfc57"} Oct 11 01:59:46 crc kubenswrapper[4743]: I1011 01:59:46.839721 4743 generic.go:334] "Generic (PLEG): container finished" podID="6ea65353-b389-4222-8ff8-298d53283609" 
containerID="6b1f78d75a105b7a27f4d8a7c24cc4ef9c0d6112402ef75044c17eb1ed5c9a5a" exitCode=0 Oct 11 01:59:46 crc kubenswrapper[4743]: I1011 01:59:46.839796 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nc8vt" event={"ID":"6ea65353-b389-4222-8ff8-298d53283609","Type":"ContainerDied","Data":"6b1f78d75a105b7a27f4d8a7c24cc4ef9c0d6112402ef75044c17eb1ed5c9a5a"} Oct 11 01:59:48 crc kubenswrapper[4743]: I1011 01:59:48.395569 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nc8vt" Oct 11 01:59:48 crc kubenswrapper[4743]: I1011 01:59:48.474125 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6ea65353-b389-4222-8ff8-298d53283609-inventory\") pod \"6ea65353-b389-4222-8ff8-298d53283609\" (UID: \"6ea65353-b389-4222-8ff8-298d53283609\") " Oct 11 01:59:48 crc kubenswrapper[4743]: I1011 01:59:48.474397 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6ea65353-b389-4222-8ff8-298d53283609-ceph\") pod \"6ea65353-b389-4222-8ff8-298d53283609\" (UID: \"6ea65353-b389-4222-8ff8-298d53283609\") " Oct 11 01:59:48 crc kubenswrapper[4743]: I1011 01:59:48.474427 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ea65353-b389-4222-8ff8-298d53283609-libvirt-combined-ca-bundle\") pod \"6ea65353-b389-4222-8ff8-298d53283609\" (UID: \"6ea65353-b389-4222-8ff8-298d53283609\") " Oct 11 01:59:48 crc kubenswrapper[4743]: I1011 01:59:48.474575 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6ea65353-b389-4222-8ff8-298d53283609-ssh-key\") pod \"6ea65353-b389-4222-8ff8-298d53283609\" (UID: 
\"6ea65353-b389-4222-8ff8-298d53283609\") " Oct 11 01:59:48 crc kubenswrapper[4743]: I1011 01:59:48.474645 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z27rt\" (UniqueName: \"kubernetes.io/projected/6ea65353-b389-4222-8ff8-298d53283609-kube-api-access-z27rt\") pod \"6ea65353-b389-4222-8ff8-298d53283609\" (UID: \"6ea65353-b389-4222-8ff8-298d53283609\") " Oct 11 01:59:48 crc kubenswrapper[4743]: I1011 01:59:48.474778 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/6ea65353-b389-4222-8ff8-298d53283609-libvirt-secret-0\") pod \"6ea65353-b389-4222-8ff8-298d53283609\" (UID: \"6ea65353-b389-4222-8ff8-298d53283609\") " Oct 11 01:59:48 crc kubenswrapper[4743]: I1011 01:59:48.480686 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea65353-b389-4222-8ff8-298d53283609-ceph" (OuterVolumeSpecName: "ceph") pod "6ea65353-b389-4222-8ff8-298d53283609" (UID: "6ea65353-b389-4222-8ff8-298d53283609"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:59:48 crc kubenswrapper[4743]: I1011 01:59:48.481034 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea65353-b389-4222-8ff8-298d53283609-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "6ea65353-b389-4222-8ff8-298d53283609" (UID: "6ea65353-b389-4222-8ff8-298d53283609"). InnerVolumeSpecName "libvirt-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:59:48 crc kubenswrapper[4743]: I1011 01:59:48.482288 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea65353-b389-4222-8ff8-298d53283609-kube-api-access-z27rt" (OuterVolumeSpecName: "kube-api-access-z27rt") pod "6ea65353-b389-4222-8ff8-298d53283609" (UID: "6ea65353-b389-4222-8ff8-298d53283609"). InnerVolumeSpecName "kube-api-access-z27rt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 01:59:48 crc kubenswrapper[4743]: I1011 01:59:48.504219 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea65353-b389-4222-8ff8-298d53283609-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "6ea65353-b389-4222-8ff8-298d53283609" (UID: "6ea65353-b389-4222-8ff8-298d53283609"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:59:48 crc kubenswrapper[4743]: I1011 01:59:48.511275 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea65353-b389-4222-8ff8-298d53283609-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6ea65353-b389-4222-8ff8-298d53283609" (UID: "6ea65353-b389-4222-8ff8-298d53283609"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:59:48 crc kubenswrapper[4743]: I1011 01:59:48.529749 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea65353-b389-4222-8ff8-298d53283609-inventory" (OuterVolumeSpecName: "inventory") pod "6ea65353-b389-4222-8ff8-298d53283609" (UID: "6ea65353-b389-4222-8ff8-298d53283609"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 01:59:48 crc kubenswrapper[4743]: I1011 01:59:48.578656 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6ea65353-b389-4222-8ff8-298d53283609-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 11 01:59:48 crc kubenswrapper[4743]: I1011 01:59:48.578706 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z27rt\" (UniqueName: \"kubernetes.io/projected/6ea65353-b389-4222-8ff8-298d53283609-kube-api-access-z27rt\") on node \"crc\" DevicePath \"\"" Oct 11 01:59:48 crc kubenswrapper[4743]: I1011 01:59:48.578730 4743 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/6ea65353-b389-4222-8ff8-298d53283609-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Oct 11 01:59:48 crc kubenswrapper[4743]: I1011 01:59:48.578749 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6ea65353-b389-4222-8ff8-298d53283609-inventory\") on node \"crc\" DevicePath \"\"" Oct 11 01:59:48 crc kubenswrapper[4743]: I1011 01:59:48.578769 4743 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6ea65353-b389-4222-8ff8-298d53283609-ceph\") on node \"crc\" DevicePath \"\"" Oct 11 01:59:48 crc kubenswrapper[4743]: I1011 01:59:48.578787 4743 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ea65353-b389-4222-8ff8-298d53283609-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 01:59:48 crc kubenswrapper[4743]: I1011 01:59:48.866738 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nc8vt" event={"ID":"6ea65353-b389-4222-8ff8-298d53283609","Type":"ContainerDied","Data":"d0c9a46487243197a7e9e71e436516c2186e3335699cf1564b24ff7657de53e5"} Oct 11 01:59:48 crc 
kubenswrapper[4743]: I1011 01:59:48.866780 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d0c9a46487243197a7e9e71e436516c2186e3335699cf1564b24ff7657de53e5" Oct 11 01:59:48 crc kubenswrapper[4743]: I1011 01:59:48.866832 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nc8vt" Oct 11 01:59:48 crc kubenswrapper[4743]: I1011 01:59:48.975109 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kt9ff"] Oct 11 01:59:48 crc kubenswrapper[4743]: E1011 01:59:48.975669 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ea65353-b389-4222-8ff8-298d53283609" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 11 01:59:48 crc kubenswrapper[4743]: I1011 01:59:48.975700 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ea65353-b389-4222-8ff8-298d53283609" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 11 01:59:48 crc kubenswrapper[4743]: E1011 01:59:48.975743 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c374566-8d2a-4ae7-98a5-dc2917839847" containerName="extract-utilities" Oct 11 01:59:48 crc kubenswrapper[4743]: I1011 01:59:48.975756 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c374566-8d2a-4ae7-98a5-dc2917839847" containerName="extract-utilities" Oct 11 01:59:48 crc kubenswrapper[4743]: E1011 01:59:48.975788 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c374566-8d2a-4ae7-98a5-dc2917839847" containerName="registry-server" Oct 11 01:59:48 crc kubenswrapper[4743]: I1011 01:59:48.975802 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c374566-8d2a-4ae7-98a5-dc2917839847" containerName="registry-server" Oct 11 01:59:48 crc kubenswrapper[4743]: E1011 01:59:48.975824 4743 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="0c374566-8d2a-4ae7-98a5-dc2917839847" containerName="extract-content" Oct 11 01:59:48 crc kubenswrapper[4743]: I1011 01:59:48.975835 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c374566-8d2a-4ae7-98a5-dc2917839847" containerName="extract-content" Oct 11 01:59:48 crc kubenswrapper[4743]: I1011 01:59:48.976161 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c374566-8d2a-4ae7-98a5-dc2917839847" containerName="registry-server" Oct 11 01:59:48 crc kubenswrapper[4743]: I1011 01:59:48.976197 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ea65353-b389-4222-8ff8-298d53283609" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 11 01:59:48 crc kubenswrapper[4743]: I1011 01:59:48.977063 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kt9ff" Oct 11 01:59:48 crc kubenswrapper[4743]: I1011 01:59:48.981964 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Oct 11 01:59:48 crc kubenswrapper[4743]: I1011 01:59:48.981971 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Oct 11 01:59:48 crc kubenswrapper[4743]: I1011 01:59:48.981980 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 11 01:59:48 crc kubenswrapper[4743]: I1011 01:59:48.982842 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 11 01:59:48 crc kubenswrapper[4743]: I1011 01:59:48.982968 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8vmmn" Oct 11 01:59:48 crc kubenswrapper[4743]: I1011 01:59:48.982894 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ceph-nova" Oct 11 01:59:48 crc kubenswrapper[4743]: I1011 01:59:48.983331 4743 
reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 11 01:59:48 crc kubenswrapper[4743]: I1011 01:59:48.983335 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Oct 11 01:59:48 crc kubenswrapper[4743]: I1011 01:59:48.983792 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 11 01:59:48 crc kubenswrapper[4743]: I1011 01:59:48.997769 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kt9ff"] Oct 11 01:59:49 crc kubenswrapper[4743]: I1011 01:59:49.087352 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/75f90fbf-75a7-4b2a-af1a-7693cebeaea3-ssh-key\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kt9ff\" (UID: \"75f90fbf-75a7-4b2a-af1a-7693cebeaea3\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kt9ff" Oct 11 01:59:49 crc kubenswrapper[4743]: I1011 01:59:49.087435 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6kgt\" (UniqueName: \"kubernetes.io/projected/75f90fbf-75a7-4b2a-af1a-7693cebeaea3-kube-api-access-h6kgt\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kt9ff\" (UID: \"75f90fbf-75a7-4b2a-af1a-7693cebeaea3\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kt9ff" Oct 11 01:59:49 crc kubenswrapper[4743]: I1011 01:59:49.087478 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/75f90fbf-75a7-4b2a-af1a-7693cebeaea3-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kt9ff\" (UID: 
\"75f90fbf-75a7-4b2a-af1a-7693cebeaea3\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kt9ff" Oct 11 01:59:49 crc kubenswrapper[4743]: I1011 01:59:49.087519 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/75f90fbf-75a7-4b2a-af1a-7693cebeaea3-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kt9ff\" (UID: \"75f90fbf-75a7-4b2a-af1a-7693cebeaea3\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kt9ff" Oct 11 01:59:49 crc kubenswrapper[4743]: I1011 01:59:49.087596 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/75f90fbf-75a7-4b2a-af1a-7693cebeaea3-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kt9ff\" (UID: \"75f90fbf-75a7-4b2a-af1a-7693cebeaea3\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kt9ff" Oct 11 01:59:49 crc kubenswrapper[4743]: I1011 01:59:49.087622 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/75f90fbf-75a7-4b2a-af1a-7693cebeaea3-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kt9ff\" (UID: \"75f90fbf-75a7-4b2a-af1a-7693cebeaea3\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kt9ff" Oct 11 01:59:49 crc kubenswrapper[4743]: I1011 01:59:49.087666 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75f90fbf-75a7-4b2a-af1a-7693cebeaea3-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kt9ff\" (UID: \"75f90fbf-75a7-4b2a-af1a-7693cebeaea3\") " 
pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kt9ff" Oct 11 01:59:49 crc kubenswrapper[4743]: I1011 01:59:49.087763 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/75f90fbf-75a7-4b2a-af1a-7693cebeaea3-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kt9ff\" (UID: \"75f90fbf-75a7-4b2a-af1a-7693cebeaea3\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kt9ff" Oct 11 01:59:49 crc kubenswrapper[4743]: I1011 01:59:49.087798 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/75f90fbf-75a7-4b2a-af1a-7693cebeaea3-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kt9ff\" (UID: \"75f90fbf-75a7-4b2a-af1a-7693cebeaea3\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kt9ff" Oct 11 01:59:49 crc kubenswrapper[4743]: I1011 01:59:49.087913 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/75f90fbf-75a7-4b2a-af1a-7693cebeaea3-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kt9ff\" (UID: \"75f90fbf-75a7-4b2a-af1a-7693cebeaea3\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kt9ff" Oct 11 01:59:49 crc kubenswrapper[4743]: I1011 01:59:49.087971 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/75f90fbf-75a7-4b2a-af1a-7693cebeaea3-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kt9ff\" (UID: \"75f90fbf-75a7-4b2a-af1a-7693cebeaea3\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kt9ff" Oct 11 01:59:49 
crc kubenswrapper[4743]: I1011 01:59:49.189262 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/75f90fbf-75a7-4b2a-af1a-7693cebeaea3-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kt9ff\" (UID: \"75f90fbf-75a7-4b2a-af1a-7693cebeaea3\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kt9ff" Oct 11 01:59:49 crc kubenswrapper[4743]: I1011 01:59:49.189351 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/75f90fbf-75a7-4b2a-af1a-7693cebeaea3-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kt9ff\" (UID: \"75f90fbf-75a7-4b2a-af1a-7693cebeaea3\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kt9ff" Oct 11 01:59:49 crc kubenswrapper[4743]: I1011 01:59:49.189456 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/75f90fbf-75a7-4b2a-af1a-7693cebeaea3-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kt9ff\" (UID: \"75f90fbf-75a7-4b2a-af1a-7693cebeaea3\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kt9ff" Oct 11 01:59:49 crc kubenswrapper[4743]: I1011 01:59:49.189506 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/75f90fbf-75a7-4b2a-af1a-7693cebeaea3-ssh-key\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kt9ff\" (UID: \"75f90fbf-75a7-4b2a-af1a-7693cebeaea3\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kt9ff" Oct 11 01:59:49 crc kubenswrapper[4743]: I1011 01:59:49.189545 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6kgt\" (UniqueName: 
\"kubernetes.io/projected/75f90fbf-75a7-4b2a-af1a-7693cebeaea3-kube-api-access-h6kgt\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kt9ff\" (UID: \"75f90fbf-75a7-4b2a-af1a-7693cebeaea3\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kt9ff" Oct 11 01:59:49 crc kubenswrapper[4743]: I1011 01:59:49.189586 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/75f90fbf-75a7-4b2a-af1a-7693cebeaea3-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kt9ff\" (UID: \"75f90fbf-75a7-4b2a-af1a-7693cebeaea3\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kt9ff" Oct 11 01:59:49 crc kubenswrapper[4743]: I1011 01:59:49.189633 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/75f90fbf-75a7-4b2a-af1a-7693cebeaea3-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kt9ff\" (UID: \"75f90fbf-75a7-4b2a-af1a-7693cebeaea3\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kt9ff" Oct 11 01:59:49 crc kubenswrapper[4743]: I1011 01:59:49.189776 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/75f90fbf-75a7-4b2a-af1a-7693cebeaea3-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kt9ff\" (UID: \"75f90fbf-75a7-4b2a-af1a-7693cebeaea3\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kt9ff" Oct 11 01:59:49 crc kubenswrapper[4743]: I1011 01:59:49.189803 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/75f90fbf-75a7-4b2a-af1a-7693cebeaea3-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kt9ff\" (UID: 
\"75f90fbf-75a7-4b2a-af1a-7693cebeaea3\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kt9ff" Oct 11 01:59:49 crc kubenswrapper[4743]: I1011 01:59:49.189847 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75f90fbf-75a7-4b2a-af1a-7693cebeaea3-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kt9ff\" (UID: \"75f90fbf-75a7-4b2a-af1a-7693cebeaea3\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kt9ff" Oct 11 01:59:49 crc kubenswrapper[4743]: I1011 01:59:49.190023 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/75f90fbf-75a7-4b2a-af1a-7693cebeaea3-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kt9ff\" (UID: \"75f90fbf-75a7-4b2a-af1a-7693cebeaea3\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kt9ff" Oct 11 01:59:49 crc kubenswrapper[4743]: I1011 01:59:49.191937 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/75f90fbf-75a7-4b2a-af1a-7693cebeaea3-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kt9ff\" (UID: \"75f90fbf-75a7-4b2a-af1a-7693cebeaea3\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kt9ff" Oct 11 01:59:49 crc kubenswrapper[4743]: I1011 01:59:49.192108 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/75f90fbf-75a7-4b2a-af1a-7693cebeaea3-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kt9ff\" (UID: \"75f90fbf-75a7-4b2a-af1a-7693cebeaea3\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kt9ff" Oct 11 01:59:49 crc 
kubenswrapper[4743]: I1011 01:59:49.193872 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/75f90fbf-75a7-4b2a-af1a-7693cebeaea3-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kt9ff\" (UID: \"75f90fbf-75a7-4b2a-af1a-7693cebeaea3\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kt9ff" Oct 11 01:59:49 crc kubenswrapper[4743]: I1011 01:59:49.195079 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/75f90fbf-75a7-4b2a-af1a-7693cebeaea3-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kt9ff\" (UID: \"75f90fbf-75a7-4b2a-af1a-7693cebeaea3\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kt9ff" Oct 11 01:59:49 crc kubenswrapper[4743]: I1011 01:59:49.195570 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75f90fbf-75a7-4b2a-af1a-7693cebeaea3-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kt9ff\" (UID: \"75f90fbf-75a7-4b2a-af1a-7693cebeaea3\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kt9ff" Oct 11 01:59:49 crc kubenswrapper[4743]: I1011 01:59:49.196315 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/75f90fbf-75a7-4b2a-af1a-7693cebeaea3-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kt9ff\" (UID: \"75f90fbf-75a7-4b2a-af1a-7693cebeaea3\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kt9ff" Oct 11 01:59:49 crc kubenswrapper[4743]: I1011 01:59:49.196770 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/75f90fbf-75a7-4b2a-af1a-7693cebeaea3-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kt9ff\" (UID: \"75f90fbf-75a7-4b2a-af1a-7693cebeaea3\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kt9ff" Oct 11 01:59:49 crc kubenswrapper[4743]: I1011 01:59:49.197049 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/75f90fbf-75a7-4b2a-af1a-7693cebeaea3-ssh-key\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kt9ff\" (UID: \"75f90fbf-75a7-4b2a-af1a-7693cebeaea3\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kt9ff" Oct 11 01:59:49 crc kubenswrapper[4743]: I1011 01:59:49.198544 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/75f90fbf-75a7-4b2a-af1a-7693cebeaea3-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kt9ff\" (UID: \"75f90fbf-75a7-4b2a-af1a-7693cebeaea3\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kt9ff" Oct 11 01:59:49 crc kubenswrapper[4743]: I1011 01:59:49.202880 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/75f90fbf-75a7-4b2a-af1a-7693cebeaea3-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kt9ff\" (UID: \"75f90fbf-75a7-4b2a-af1a-7693cebeaea3\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kt9ff" Oct 11 01:59:49 crc kubenswrapper[4743]: I1011 01:59:49.222908 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6kgt\" (UniqueName: \"kubernetes.io/projected/75f90fbf-75a7-4b2a-af1a-7693cebeaea3-kube-api-access-h6kgt\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kt9ff\" (UID: \"75f90fbf-75a7-4b2a-af1a-7693cebeaea3\") " 
pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kt9ff" Oct 11 01:59:49 crc kubenswrapper[4743]: I1011 01:59:49.300442 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kt9ff" Oct 11 01:59:50 crc kubenswrapper[4743]: I1011 01:59:50.001961 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kt9ff"] Oct 11 01:59:50 crc kubenswrapper[4743]: I1011 01:59:50.892355 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kt9ff" event={"ID":"75f90fbf-75a7-4b2a-af1a-7693cebeaea3","Type":"ContainerStarted","Data":"2758ff1d07190f9ffa13ed85a5afbee76a80882b0b1daad0b7aa646b1e6964d2"} Oct 11 01:59:51 crc kubenswrapper[4743]: I1011 01:59:51.911710 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kt9ff" event={"ID":"75f90fbf-75a7-4b2a-af1a-7693cebeaea3","Type":"ContainerStarted","Data":"e868714e6758abdf54dea905e5f34aa7cb66137b677d135912f09d644860a77f"} Oct 11 01:59:51 crc kubenswrapper[4743]: I1011 01:59:51.940656 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kt9ff" podStartSLOduration=3.403776744 podStartE2EDuration="3.940630797s" podCreationTimestamp="2025-10-11 01:59:48 +0000 UTC" firstStartedPulling="2025-10-11 01:59:50.013357659 +0000 UTC m=+4084.666338096" lastFinishedPulling="2025-10-11 01:59:50.550211752 +0000 UTC m=+4085.203192149" observedRunningTime="2025-10-11 01:59:51.934058094 +0000 UTC m=+4086.587038521" watchObservedRunningTime="2025-10-11 01:59:51.940630797 +0000 UTC m=+4086.593611204" Oct 11 02:00:00 crc kubenswrapper[4743]: I1011 02:00:00.151380 4743 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29335800-4bpgr"] Oct 11 02:00:00 crc kubenswrapper[4743]: I1011 02:00:00.153363 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29335800-4bpgr" Oct 11 02:00:00 crc kubenswrapper[4743]: I1011 02:00:00.157109 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 11 02:00:00 crc kubenswrapper[4743]: I1011 02:00:00.157299 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 11 02:00:00 crc kubenswrapper[4743]: I1011 02:00:00.192683 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29335800-4bpgr"] Oct 11 02:00:00 crc kubenswrapper[4743]: I1011 02:00:00.259627 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0e2de2bb-f463-4ca1-9666-34d2e889665c-secret-volume\") pod \"collect-profiles-29335800-4bpgr\" (UID: \"0e2de2bb-f463-4ca1-9666-34d2e889665c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335800-4bpgr" Oct 11 02:00:00 crc kubenswrapper[4743]: I1011 02:00:00.259795 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0e2de2bb-f463-4ca1-9666-34d2e889665c-config-volume\") pod \"collect-profiles-29335800-4bpgr\" (UID: \"0e2de2bb-f463-4ca1-9666-34d2e889665c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335800-4bpgr" Oct 11 02:00:00 crc kubenswrapper[4743]: I1011 02:00:00.259879 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rphxw\" (UniqueName: 
\"kubernetes.io/projected/0e2de2bb-f463-4ca1-9666-34d2e889665c-kube-api-access-rphxw\") pod \"collect-profiles-29335800-4bpgr\" (UID: \"0e2de2bb-f463-4ca1-9666-34d2e889665c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335800-4bpgr" Oct 11 02:00:00 crc kubenswrapper[4743]: I1011 02:00:00.361520 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0e2de2bb-f463-4ca1-9666-34d2e889665c-config-volume\") pod \"collect-profiles-29335800-4bpgr\" (UID: \"0e2de2bb-f463-4ca1-9666-34d2e889665c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335800-4bpgr" Oct 11 02:00:00 crc kubenswrapper[4743]: I1011 02:00:00.361593 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rphxw\" (UniqueName: \"kubernetes.io/projected/0e2de2bb-f463-4ca1-9666-34d2e889665c-kube-api-access-rphxw\") pod \"collect-profiles-29335800-4bpgr\" (UID: \"0e2de2bb-f463-4ca1-9666-34d2e889665c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335800-4bpgr" Oct 11 02:00:00 crc kubenswrapper[4743]: I1011 02:00:00.361869 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0e2de2bb-f463-4ca1-9666-34d2e889665c-secret-volume\") pod \"collect-profiles-29335800-4bpgr\" (UID: \"0e2de2bb-f463-4ca1-9666-34d2e889665c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335800-4bpgr" Oct 11 02:00:00 crc kubenswrapper[4743]: I1011 02:00:00.362539 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0e2de2bb-f463-4ca1-9666-34d2e889665c-config-volume\") pod \"collect-profiles-29335800-4bpgr\" (UID: \"0e2de2bb-f463-4ca1-9666-34d2e889665c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335800-4bpgr" Oct 11 02:00:00 crc kubenswrapper[4743]: I1011 
02:00:00.371712 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0e2de2bb-f463-4ca1-9666-34d2e889665c-secret-volume\") pod \"collect-profiles-29335800-4bpgr\" (UID: \"0e2de2bb-f463-4ca1-9666-34d2e889665c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335800-4bpgr" Oct 11 02:00:00 crc kubenswrapper[4743]: I1011 02:00:00.384770 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rphxw\" (UniqueName: \"kubernetes.io/projected/0e2de2bb-f463-4ca1-9666-34d2e889665c-kube-api-access-rphxw\") pod \"collect-profiles-29335800-4bpgr\" (UID: \"0e2de2bb-f463-4ca1-9666-34d2e889665c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335800-4bpgr" Oct 11 02:00:00 crc kubenswrapper[4743]: I1011 02:00:00.492133 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29335800-4bpgr" Oct 11 02:00:01 crc kubenswrapper[4743]: I1011 02:00:01.002256 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29335800-4bpgr"] Oct 11 02:00:02 crc kubenswrapper[4743]: I1011 02:00:02.032049 4743 generic.go:334] "Generic (PLEG): container finished" podID="0e2de2bb-f463-4ca1-9666-34d2e889665c" containerID="0170fcfc69bfd13c70a7f3fc633a3f1458adda2257e9f3b70331ada44d0e01fe" exitCode=0 Oct 11 02:00:02 crc kubenswrapper[4743]: I1011 02:00:02.032132 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29335800-4bpgr" event={"ID":"0e2de2bb-f463-4ca1-9666-34d2e889665c","Type":"ContainerDied","Data":"0170fcfc69bfd13c70a7f3fc633a3f1458adda2257e9f3b70331ada44d0e01fe"} Oct 11 02:00:02 crc kubenswrapper[4743]: I1011 02:00:02.032538 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29335800-4bpgr" 
event={"ID":"0e2de2bb-f463-4ca1-9666-34d2e889665c","Type":"ContainerStarted","Data":"0610ff22b7c51e21a701c27af4331eab74e254e9619e709e2e0636db1dacb64e"} Oct 11 02:00:03 crc kubenswrapper[4743]: I1011 02:00:03.415498 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29335800-4bpgr" Oct 11 02:00:03 crc kubenswrapper[4743]: I1011 02:00:03.533899 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rphxw\" (UniqueName: \"kubernetes.io/projected/0e2de2bb-f463-4ca1-9666-34d2e889665c-kube-api-access-rphxw\") pod \"0e2de2bb-f463-4ca1-9666-34d2e889665c\" (UID: \"0e2de2bb-f463-4ca1-9666-34d2e889665c\") " Oct 11 02:00:03 crc kubenswrapper[4743]: I1011 02:00:03.534165 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0e2de2bb-f463-4ca1-9666-34d2e889665c-secret-volume\") pod \"0e2de2bb-f463-4ca1-9666-34d2e889665c\" (UID: \"0e2de2bb-f463-4ca1-9666-34d2e889665c\") " Oct 11 02:00:03 crc kubenswrapper[4743]: I1011 02:00:03.534251 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0e2de2bb-f463-4ca1-9666-34d2e889665c-config-volume\") pod \"0e2de2bb-f463-4ca1-9666-34d2e889665c\" (UID: \"0e2de2bb-f463-4ca1-9666-34d2e889665c\") " Oct 11 02:00:03 crc kubenswrapper[4743]: I1011 02:00:03.534984 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e2de2bb-f463-4ca1-9666-34d2e889665c-config-volume" (OuterVolumeSpecName: "config-volume") pod "0e2de2bb-f463-4ca1-9666-34d2e889665c" (UID: "0e2de2bb-f463-4ca1-9666-34d2e889665c"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 02:00:03 crc kubenswrapper[4743]: I1011 02:00:03.543732 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e2de2bb-f463-4ca1-9666-34d2e889665c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "0e2de2bb-f463-4ca1-9666-34d2e889665c" (UID: "0e2de2bb-f463-4ca1-9666-34d2e889665c"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 02:00:03 crc kubenswrapper[4743]: I1011 02:00:03.544328 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e2de2bb-f463-4ca1-9666-34d2e889665c-kube-api-access-rphxw" (OuterVolumeSpecName: "kube-api-access-rphxw") pod "0e2de2bb-f463-4ca1-9666-34d2e889665c" (UID: "0e2de2bb-f463-4ca1-9666-34d2e889665c"). InnerVolumeSpecName "kube-api-access-rphxw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 02:00:03 crc kubenswrapper[4743]: I1011 02:00:03.637281 4743 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0e2de2bb-f463-4ca1-9666-34d2e889665c-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 11 02:00:03 crc kubenswrapper[4743]: I1011 02:00:03.637336 4743 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0e2de2bb-f463-4ca1-9666-34d2e889665c-config-volume\") on node \"crc\" DevicePath \"\"" Oct 11 02:00:03 crc kubenswrapper[4743]: I1011 02:00:03.637356 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rphxw\" (UniqueName: \"kubernetes.io/projected/0e2de2bb-f463-4ca1-9666-34d2e889665c-kube-api-access-rphxw\") on node \"crc\" DevicePath \"\"" Oct 11 02:00:04 crc kubenswrapper[4743]: I1011 02:00:04.074169 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29335800-4bpgr" 
event={"ID":"0e2de2bb-f463-4ca1-9666-34d2e889665c","Type":"ContainerDied","Data":"0610ff22b7c51e21a701c27af4331eab74e254e9619e709e2e0636db1dacb64e"} Oct 11 02:00:04 crc kubenswrapper[4743]: I1011 02:00:04.074220 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0610ff22b7c51e21a701c27af4331eab74e254e9619e709e2e0636db1dacb64e" Oct 11 02:00:04 crc kubenswrapper[4743]: I1011 02:00:04.074261 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29335800-4bpgr" Oct 11 02:00:04 crc kubenswrapper[4743]: E1011 02:00:04.318085 4743 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e2de2bb_f463_4ca1_9666_34d2e889665c.slice\": RecentStats: unable to find data in memory cache]" Oct 11 02:00:04 crc kubenswrapper[4743]: I1011 02:00:04.485832 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29335755-r9b8t"] Oct 11 02:00:04 crc kubenswrapper[4743]: I1011 02:00:04.495059 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29335755-r9b8t"] Oct 11 02:00:06 crc kubenswrapper[4743]: I1011 02:00:06.103505 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13d28b94-fc19-4f99-98f4-5e0891a1a7a7" path="/var/lib/kubelet/pods/13d28b94-fc19-4f99-98f4-5e0891a1a7a7/volumes" Oct 11 02:00:55 crc kubenswrapper[4743]: I1011 02:00:55.306270 4743 scope.go:117] "RemoveContainer" containerID="a658a83b250b03c861c4f9d2ba2ecf815fade25c78b3eb625204af21d38a1ca6" Oct 11 02:01:00 crc kubenswrapper[4743]: I1011 02:01:00.158491 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29335801-cqb6t"] Oct 11 02:01:00 crc kubenswrapper[4743]: E1011 02:01:00.159613 4743 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="0e2de2bb-f463-4ca1-9666-34d2e889665c" containerName="collect-profiles" Oct 11 02:01:00 crc kubenswrapper[4743]: I1011 02:01:00.159630 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e2de2bb-f463-4ca1-9666-34d2e889665c" containerName="collect-profiles" Oct 11 02:01:00 crc kubenswrapper[4743]: I1011 02:01:00.160045 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e2de2bb-f463-4ca1-9666-34d2e889665c" containerName="collect-profiles" Oct 11 02:01:00 crc kubenswrapper[4743]: I1011 02:01:00.160927 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29335801-cqb6t" Oct 11 02:01:00 crc kubenswrapper[4743]: I1011 02:01:00.168242 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29335801-cqb6t"] Oct 11 02:01:00 crc kubenswrapper[4743]: I1011 02:01:00.303993 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90420843-1d2e-48e7-bec5-63cc4cd8557e-combined-ca-bundle\") pod \"keystone-cron-29335801-cqb6t\" (UID: \"90420843-1d2e-48e7-bec5-63cc4cd8557e\") " pod="openstack/keystone-cron-29335801-cqb6t" Oct 11 02:01:00 crc kubenswrapper[4743]: I1011 02:01:00.304381 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6ss6\" (UniqueName: \"kubernetes.io/projected/90420843-1d2e-48e7-bec5-63cc4cd8557e-kube-api-access-c6ss6\") pod \"keystone-cron-29335801-cqb6t\" (UID: \"90420843-1d2e-48e7-bec5-63cc4cd8557e\") " pod="openstack/keystone-cron-29335801-cqb6t" Oct 11 02:01:00 crc kubenswrapper[4743]: I1011 02:01:00.304433 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/90420843-1d2e-48e7-bec5-63cc4cd8557e-fernet-keys\") pod 
\"keystone-cron-29335801-cqb6t\" (UID: \"90420843-1d2e-48e7-bec5-63cc4cd8557e\") " pod="openstack/keystone-cron-29335801-cqb6t" Oct 11 02:01:00 crc kubenswrapper[4743]: I1011 02:01:00.304564 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90420843-1d2e-48e7-bec5-63cc4cd8557e-config-data\") pod \"keystone-cron-29335801-cqb6t\" (UID: \"90420843-1d2e-48e7-bec5-63cc4cd8557e\") " pod="openstack/keystone-cron-29335801-cqb6t" Oct 11 02:01:00 crc kubenswrapper[4743]: I1011 02:01:00.406377 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90420843-1d2e-48e7-bec5-63cc4cd8557e-combined-ca-bundle\") pod \"keystone-cron-29335801-cqb6t\" (UID: \"90420843-1d2e-48e7-bec5-63cc4cd8557e\") " pod="openstack/keystone-cron-29335801-cqb6t" Oct 11 02:01:00 crc kubenswrapper[4743]: I1011 02:01:00.406458 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6ss6\" (UniqueName: \"kubernetes.io/projected/90420843-1d2e-48e7-bec5-63cc4cd8557e-kube-api-access-c6ss6\") pod \"keystone-cron-29335801-cqb6t\" (UID: \"90420843-1d2e-48e7-bec5-63cc4cd8557e\") " pod="openstack/keystone-cron-29335801-cqb6t" Oct 11 02:01:00 crc kubenswrapper[4743]: I1011 02:01:00.406526 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/90420843-1d2e-48e7-bec5-63cc4cd8557e-fernet-keys\") pod \"keystone-cron-29335801-cqb6t\" (UID: \"90420843-1d2e-48e7-bec5-63cc4cd8557e\") " pod="openstack/keystone-cron-29335801-cqb6t" Oct 11 02:01:00 crc kubenswrapper[4743]: I1011 02:01:00.406632 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90420843-1d2e-48e7-bec5-63cc4cd8557e-config-data\") pod \"keystone-cron-29335801-cqb6t\" (UID: 
\"90420843-1d2e-48e7-bec5-63cc4cd8557e\") " pod="openstack/keystone-cron-29335801-cqb6t" Oct 11 02:01:00 crc kubenswrapper[4743]: I1011 02:01:00.414260 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90420843-1d2e-48e7-bec5-63cc4cd8557e-combined-ca-bundle\") pod \"keystone-cron-29335801-cqb6t\" (UID: \"90420843-1d2e-48e7-bec5-63cc4cd8557e\") " pod="openstack/keystone-cron-29335801-cqb6t" Oct 11 02:01:00 crc kubenswrapper[4743]: I1011 02:01:00.414278 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90420843-1d2e-48e7-bec5-63cc4cd8557e-config-data\") pod \"keystone-cron-29335801-cqb6t\" (UID: \"90420843-1d2e-48e7-bec5-63cc4cd8557e\") " pod="openstack/keystone-cron-29335801-cqb6t" Oct 11 02:01:00 crc kubenswrapper[4743]: I1011 02:01:00.419209 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/90420843-1d2e-48e7-bec5-63cc4cd8557e-fernet-keys\") pod \"keystone-cron-29335801-cqb6t\" (UID: \"90420843-1d2e-48e7-bec5-63cc4cd8557e\") " pod="openstack/keystone-cron-29335801-cqb6t" Oct 11 02:01:00 crc kubenswrapper[4743]: I1011 02:01:00.423893 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6ss6\" (UniqueName: \"kubernetes.io/projected/90420843-1d2e-48e7-bec5-63cc4cd8557e-kube-api-access-c6ss6\") pod \"keystone-cron-29335801-cqb6t\" (UID: \"90420843-1d2e-48e7-bec5-63cc4cd8557e\") " pod="openstack/keystone-cron-29335801-cqb6t" Oct 11 02:01:00 crc kubenswrapper[4743]: I1011 02:01:00.488244 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29335801-cqb6t" Oct 11 02:01:01 crc kubenswrapper[4743]: I1011 02:01:01.021015 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29335801-cqb6t"] Oct 11 02:01:01 crc kubenswrapper[4743]: W1011 02:01:01.265407 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90420843_1d2e_48e7_bec5_63cc4cd8557e.slice/crio-e8017b0c9588bc3669ef2eb48daecc92dab8042f020ad792a6675357756a4cd6 WatchSource:0}: Error finding container e8017b0c9588bc3669ef2eb48daecc92dab8042f020ad792a6675357756a4cd6: Status 404 returned error can't find the container with id e8017b0c9588bc3669ef2eb48daecc92dab8042f020ad792a6675357756a4cd6 Oct 11 02:01:01 crc kubenswrapper[4743]: I1011 02:01:01.818047 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29335801-cqb6t" event={"ID":"90420843-1d2e-48e7-bec5-63cc4cd8557e","Type":"ContainerStarted","Data":"4d3b41d5443083ccd9058f0af4d34e0b4921b55404fce17380a2a657b847bdc0"} Oct 11 02:01:01 crc kubenswrapper[4743]: I1011 02:01:01.818392 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29335801-cqb6t" event={"ID":"90420843-1d2e-48e7-bec5-63cc4cd8557e","Type":"ContainerStarted","Data":"e8017b0c9588bc3669ef2eb48daecc92dab8042f020ad792a6675357756a4cd6"} Oct 11 02:01:01 crc kubenswrapper[4743]: I1011 02:01:01.851321 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29335801-cqb6t" podStartSLOduration=1.851294298 podStartE2EDuration="1.851294298s" podCreationTimestamp="2025-10-11 02:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 02:01:01.835017635 +0000 UTC m=+4156.487998062" watchObservedRunningTime="2025-10-11 02:01:01.851294298 +0000 UTC m=+4156.504274725" Oct 11 02:01:04 crc 
kubenswrapper[4743]: I1011 02:01:04.850616 4743 generic.go:334] "Generic (PLEG): container finished" podID="90420843-1d2e-48e7-bec5-63cc4cd8557e" containerID="4d3b41d5443083ccd9058f0af4d34e0b4921b55404fce17380a2a657b847bdc0" exitCode=0 Oct 11 02:01:04 crc kubenswrapper[4743]: I1011 02:01:04.850652 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29335801-cqb6t" event={"ID":"90420843-1d2e-48e7-bec5-63cc4cd8557e","Type":"ContainerDied","Data":"4d3b41d5443083ccd9058f0af4d34e0b4921b55404fce17380a2a657b847bdc0"} Oct 11 02:01:06 crc kubenswrapper[4743]: I1011 02:01:06.349597 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29335801-cqb6t" Oct 11 02:01:06 crc kubenswrapper[4743]: I1011 02:01:06.539136 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90420843-1d2e-48e7-bec5-63cc4cd8557e-config-data\") pod \"90420843-1d2e-48e7-bec5-63cc4cd8557e\" (UID: \"90420843-1d2e-48e7-bec5-63cc4cd8557e\") " Oct 11 02:01:06 crc kubenswrapper[4743]: I1011 02:01:06.539266 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/90420843-1d2e-48e7-bec5-63cc4cd8557e-fernet-keys\") pod \"90420843-1d2e-48e7-bec5-63cc4cd8557e\" (UID: \"90420843-1d2e-48e7-bec5-63cc4cd8557e\") " Oct 11 02:01:06 crc kubenswrapper[4743]: I1011 02:01:06.539466 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90420843-1d2e-48e7-bec5-63cc4cd8557e-combined-ca-bundle\") pod \"90420843-1d2e-48e7-bec5-63cc4cd8557e\" (UID: \"90420843-1d2e-48e7-bec5-63cc4cd8557e\") " Oct 11 02:01:06 crc kubenswrapper[4743]: I1011 02:01:06.539504 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6ss6\" (UniqueName: 
\"kubernetes.io/projected/90420843-1d2e-48e7-bec5-63cc4cd8557e-kube-api-access-c6ss6\") pod \"90420843-1d2e-48e7-bec5-63cc4cd8557e\" (UID: \"90420843-1d2e-48e7-bec5-63cc4cd8557e\") " Oct 11 02:01:06 crc kubenswrapper[4743]: I1011 02:01:06.544786 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90420843-1d2e-48e7-bec5-63cc4cd8557e-kube-api-access-c6ss6" (OuterVolumeSpecName: "kube-api-access-c6ss6") pod "90420843-1d2e-48e7-bec5-63cc4cd8557e" (UID: "90420843-1d2e-48e7-bec5-63cc4cd8557e"). InnerVolumeSpecName "kube-api-access-c6ss6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 02:01:06 crc kubenswrapper[4743]: I1011 02:01:06.562984 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90420843-1d2e-48e7-bec5-63cc4cd8557e-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "90420843-1d2e-48e7-bec5-63cc4cd8557e" (UID: "90420843-1d2e-48e7-bec5-63cc4cd8557e"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 02:01:06 crc kubenswrapper[4743]: I1011 02:01:06.570283 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90420843-1d2e-48e7-bec5-63cc4cd8557e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "90420843-1d2e-48e7-bec5-63cc4cd8557e" (UID: "90420843-1d2e-48e7-bec5-63cc4cd8557e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 02:01:06 crc kubenswrapper[4743]: I1011 02:01:06.606449 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90420843-1d2e-48e7-bec5-63cc4cd8557e-config-data" (OuterVolumeSpecName: "config-data") pod "90420843-1d2e-48e7-bec5-63cc4cd8557e" (UID: "90420843-1d2e-48e7-bec5-63cc4cd8557e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 02:01:06 crc kubenswrapper[4743]: I1011 02:01:06.642083 4743 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/90420843-1d2e-48e7-bec5-63cc4cd8557e-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 11 02:01:06 crc kubenswrapper[4743]: I1011 02:01:06.642114 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90420843-1d2e-48e7-bec5-63cc4cd8557e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 02:01:06 crc kubenswrapper[4743]: I1011 02:01:06.642126 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6ss6\" (UniqueName: \"kubernetes.io/projected/90420843-1d2e-48e7-bec5-63cc4cd8557e-kube-api-access-c6ss6\") on node \"crc\" DevicePath \"\"" Oct 11 02:01:06 crc kubenswrapper[4743]: I1011 02:01:06.642135 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90420843-1d2e-48e7-bec5-63cc4cd8557e-config-data\") on node \"crc\" DevicePath \"\"" Oct 11 02:01:06 crc kubenswrapper[4743]: I1011 02:01:06.870690 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29335801-cqb6t" event={"ID":"90420843-1d2e-48e7-bec5-63cc4cd8557e","Type":"ContainerDied","Data":"e8017b0c9588bc3669ef2eb48daecc92dab8042f020ad792a6675357756a4cd6"} Oct 11 02:01:06 crc kubenswrapper[4743]: I1011 02:01:06.871018 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e8017b0c9588bc3669ef2eb48daecc92dab8042f020ad792a6675357756a4cd6" Oct 11 02:01:06 crc kubenswrapper[4743]: I1011 02:01:06.870743 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29335801-cqb6t" Oct 11 02:01:22 crc kubenswrapper[4743]: I1011 02:01:22.273586 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9p2gm"] Oct 11 02:01:22 crc kubenswrapper[4743]: E1011 02:01:22.274552 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90420843-1d2e-48e7-bec5-63cc4cd8557e" containerName="keystone-cron" Oct 11 02:01:22 crc kubenswrapper[4743]: I1011 02:01:22.274563 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="90420843-1d2e-48e7-bec5-63cc4cd8557e" containerName="keystone-cron" Oct 11 02:01:22 crc kubenswrapper[4743]: I1011 02:01:22.274787 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="90420843-1d2e-48e7-bec5-63cc4cd8557e" containerName="keystone-cron" Oct 11 02:01:22 crc kubenswrapper[4743]: I1011 02:01:22.276601 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9p2gm" Oct 11 02:01:22 crc kubenswrapper[4743]: I1011 02:01:22.292695 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9p2gm"] Oct 11 02:01:22 crc kubenswrapper[4743]: I1011 02:01:22.362041 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7af78f17-adac-4fbd-9074-640b6c6fb0dd-catalog-content\") pod \"community-operators-9p2gm\" (UID: \"7af78f17-adac-4fbd-9074-640b6c6fb0dd\") " pod="openshift-marketplace/community-operators-9p2gm" Oct 11 02:01:22 crc kubenswrapper[4743]: I1011 02:01:22.362260 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8v28n\" (UniqueName: \"kubernetes.io/projected/7af78f17-adac-4fbd-9074-640b6c6fb0dd-kube-api-access-8v28n\") pod \"community-operators-9p2gm\" (UID: \"7af78f17-adac-4fbd-9074-640b6c6fb0dd\") " 
pod="openshift-marketplace/community-operators-9p2gm" Oct 11 02:01:22 crc kubenswrapper[4743]: I1011 02:01:22.362432 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7af78f17-adac-4fbd-9074-640b6c6fb0dd-utilities\") pod \"community-operators-9p2gm\" (UID: \"7af78f17-adac-4fbd-9074-640b6c6fb0dd\") " pod="openshift-marketplace/community-operators-9p2gm" Oct 11 02:01:22 crc kubenswrapper[4743]: I1011 02:01:22.465878 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7af78f17-adac-4fbd-9074-640b6c6fb0dd-catalog-content\") pod \"community-operators-9p2gm\" (UID: \"7af78f17-adac-4fbd-9074-640b6c6fb0dd\") " pod="openshift-marketplace/community-operators-9p2gm" Oct 11 02:01:22 crc kubenswrapper[4743]: I1011 02:01:22.466344 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8v28n\" (UniqueName: \"kubernetes.io/projected/7af78f17-adac-4fbd-9074-640b6c6fb0dd-kube-api-access-8v28n\") pod \"community-operators-9p2gm\" (UID: \"7af78f17-adac-4fbd-9074-640b6c6fb0dd\") " pod="openshift-marketplace/community-operators-9p2gm" Oct 11 02:01:22 crc kubenswrapper[4743]: I1011 02:01:22.466556 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7af78f17-adac-4fbd-9074-640b6c6fb0dd-catalog-content\") pod \"community-operators-9p2gm\" (UID: \"7af78f17-adac-4fbd-9074-640b6c6fb0dd\") " pod="openshift-marketplace/community-operators-9p2gm" Oct 11 02:01:22 crc kubenswrapper[4743]: I1011 02:01:22.466749 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7af78f17-adac-4fbd-9074-640b6c6fb0dd-utilities\") pod \"community-operators-9p2gm\" (UID: \"7af78f17-adac-4fbd-9074-640b6c6fb0dd\") " 
pod="openshift-marketplace/community-operators-9p2gm" Oct 11 02:01:22 crc kubenswrapper[4743]: I1011 02:01:22.467103 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7af78f17-adac-4fbd-9074-640b6c6fb0dd-utilities\") pod \"community-operators-9p2gm\" (UID: \"7af78f17-adac-4fbd-9074-640b6c6fb0dd\") " pod="openshift-marketplace/community-operators-9p2gm" Oct 11 02:01:22 crc kubenswrapper[4743]: I1011 02:01:22.492829 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8v28n\" (UniqueName: \"kubernetes.io/projected/7af78f17-adac-4fbd-9074-640b6c6fb0dd-kube-api-access-8v28n\") pod \"community-operators-9p2gm\" (UID: \"7af78f17-adac-4fbd-9074-640b6c6fb0dd\") " pod="openshift-marketplace/community-operators-9p2gm" Oct 11 02:01:22 crc kubenswrapper[4743]: I1011 02:01:22.656110 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9p2gm" Oct 11 02:01:23 crc kubenswrapper[4743]: I1011 02:01:23.186660 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9p2gm"] Oct 11 02:01:24 crc kubenswrapper[4743]: I1011 02:01:24.077029 4743 generic.go:334] "Generic (PLEG): container finished" podID="7af78f17-adac-4fbd-9074-640b6c6fb0dd" containerID="64a21a6712e085a221f1bf56516cda4c2615017ed7a93a7491e1298df72e798b" exitCode=0 Oct 11 02:01:24 crc kubenswrapper[4743]: I1011 02:01:24.077210 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9p2gm" event={"ID":"7af78f17-adac-4fbd-9074-640b6c6fb0dd","Type":"ContainerDied","Data":"64a21a6712e085a221f1bf56516cda4c2615017ed7a93a7491e1298df72e798b"} Oct 11 02:01:24 crc kubenswrapper[4743]: I1011 02:01:24.077414 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9p2gm" 
event={"ID":"7af78f17-adac-4fbd-9074-640b6c6fb0dd","Type":"ContainerStarted","Data":"d48a1e51fb8429c60cfc5580b5e71671c9e5f451d4aff1690ba93a191a50fe96"} Oct 11 02:01:24 crc kubenswrapper[4743]: I1011 02:01:24.080148 4743 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 11 02:01:26 crc kubenswrapper[4743]: I1011 02:01:26.105144 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9p2gm" event={"ID":"7af78f17-adac-4fbd-9074-640b6c6fb0dd","Type":"ContainerStarted","Data":"24ff87dea2bd606413d1a12da629588acf5d78543e0a783cae6df1dc1280698f"} Oct 11 02:01:27 crc kubenswrapper[4743]: I1011 02:01:27.119124 4743 generic.go:334] "Generic (PLEG): container finished" podID="7af78f17-adac-4fbd-9074-640b6c6fb0dd" containerID="24ff87dea2bd606413d1a12da629588acf5d78543e0a783cae6df1dc1280698f" exitCode=0 Oct 11 02:01:27 crc kubenswrapper[4743]: I1011 02:01:27.119190 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9p2gm" event={"ID":"7af78f17-adac-4fbd-9074-640b6c6fb0dd","Type":"ContainerDied","Data":"24ff87dea2bd606413d1a12da629588acf5d78543e0a783cae6df1dc1280698f"} Oct 11 02:01:28 crc kubenswrapper[4743]: I1011 02:01:28.131395 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9p2gm" event={"ID":"7af78f17-adac-4fbd-9074-640b6c6fb0dd","Type":"ContainerStarted","Data":"5a2f73cfa41449909eb1994b3957efca37b5eeb3fe1f24eb9ce2bb8307cbe40c"} Oct 11 02:01:28 crc kubenswrapper[4743]: I1011 02:01:28.155880 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9p2gm" podStartSLOduration=2.547043208 podStartE2EDuration="6.155864966s" podCreationTimestamp="2025-10-11 02:01:22 +0000 UTC" firstStartedPulling="2025-10-11 02:01:24.07983799 +0000 UTC m=+4178.732818397" lastFinishedPulling="2025-10-11 02:01:27.688659738 +0000 UTC 
m=+4182.341640155" observedRunningTime="2025-10-11 02:01:28.15280304 +0000 UTC m=+4182.805783437" watchObservedRunningTime="2025-10-11 02:01:28.155864966 +0000 UTC m=+4182.808845363" Oct 11 02:01:32 crc kubenswrapper[4743]: I1011 02:01:32.656465 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9p2gm" Oct 11 02:01:32 crc kubenswrapper[4743]: I1011 02:01:32.657249 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9p2gm" Oct 11 02:01:33 crc kubenswrapper[4743]: I1011 02:01:33.332754 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9p2gm" Oct 11 02:01:33 crc kubenswrapper[4743]: I1011 02:01:33.388249 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9p2gm" Oct 11 02:01:33 crc kubenswrapper[4743]: I1011 02:01:33.574353 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9p2gm"] Oct 11 02:01:35 crc kubenswrapper[4743]: I1011 02:01:35.226434 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9p2gm" podUID="7af78f17-adac-4fbd-9074-640b6c6fb0dd" containerName="registry-server" containerID="cri-o://5a2f73cfa41449909eb1994b3957efca37b5eeb3fe1f24eb9ce2bb8307cbe40c" gracePeriod=2 Oct 11 02:01:35 crc kubenswrapper[4743]: I1011 02:01:35.745549 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9p2gm" Oct 11 02:01:35 crc kubenswrapper[4743]: I1011 02:01:35.882595 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8v28n\" (UniqueName: \"kubernetes.io/projected/7af78f17-adac-4fbd-9074-640b6c6fb0dd-kube-api-access-8v28n\") pod \"7af78f17-adac-4fbd-9074-640b6c6fb0dd\" (UID: \"7af78f17-adac-4fbd-9074-640b6c6fb0dd\") " Oct 11 02:01:35 crc kubenswrapper[4743]: I1011 02:01:35.882773 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7af78f17-adac-4fbd-9074-640b6c6fb0dd-utilities\") pod \"7af78f17-adac-4fbd-9074-640b6c6fb0dd\" (UID: \"7af78f17-adac-4fbd-9074-640b6c6fb0dd\") " Oct 11 02:01:35 crc kubenswrapper[4743]: I1011 02:01:35.882803 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7af78f17-adac-4fbd-9074-640b6c6fb0dd-catalog-content\") pod \"7af78f17-adac-4fbd-9074-640b6c6fb0dd\" (UID: \"7af78f17-adac-4fbd-9074-640b6c6fb0dd\") " Oct 11 02:01:35 crc kubenswrapper[4743]: I1011 02:01:35.883626 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7af78f17-adac-4fbd-9074-640b6c6fb0dd-utilities" (OuterVolumeSpecName: "utilities") pod "7af78f17-adac-4fbd-9074-640b6c6fb0dd" (UID: "7af78f17-adac-4fbd-9074-640b6c6fb0dd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 02:01:35 crc kubenswrapper[4743]: I1011 02:01:35.890999 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7af78f17-adac-4fbd-9074-640b6c6fb0dd-kube-api-access-8v28n" (OuterVolumeSpecName: "kube-api-access-8v28n") pod "7af78f17-adac-4fbd-9074-640b6c6fb0dd" (UID: "7af78f17-adac-4fbd-9074-640b6c6fb0dd"). InnerVolumeSpecName "kube-api-access-8v28n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 02:01:35 crc kubenswrapper[4743]: I1011 02:01:35.941208 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7af78f17-adac-4fbd-9074-640b6c6fb0dd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7af78f17-adac-4fbd-9074-640b6c6fb0dd" (UID: "7af78f17-adac-4fbd-9074-640b6c6fb0dd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 02:01:35 crc kubenswrapper[4743]: I1011 02:01:35.986041 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8v28n\" (UniqueName: \"kubernetes.io/projected/7af78f17-adac-4fbd-9074-640b6c6fb0dd-kube-api-access-8v28n\") on node \"crc\" DevicePath \"\"" Oct 11 02:01:35 crc kubenswrapper[4743]: I1011 02:01:35.986109 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7af78f17-adac-4fbd-9074-640b6c6fb0dd-utilities\") on node \"crc\" DevicePath \"\"" Oct 11 02:01:35 crc kubenswrapper[4743]: I1011 02:01:35.986129 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7af78f17-adac-4fbd-9074-640b6c6fb0dd-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 11 02:01:36 crc kubenswrapper[4743]: I1011 02:01:36.238949 4743 generic.go:334] "Generic (PLEG): container finished" podID="7af78f17-adac-4fbd-9074-640b6c6fb0dd" containerID="5a2f73cfa41449909eb1994b3957efca37b5eeb3fe1f24eb9ce2bb8307cbe40c" exitCode=0 Oct 11 02:01:36 crc kubenswrapper[4743]: I1011 02:01:36.239023 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9p2gm" Oct 11 02:01:36 crc kubenswrapper[4743]: I1011 02:01:36.239032 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9p2gm" event={"ID":"7af78f17-adac-4fbd-9074-640b6c6fb0dd","Type":"ContainerDied","Data":"5a2f73cfa41449909eb1994b3957efca37b5eeb3fe1f24eb9ce2bb8307cbe40c"} Oct 11 02:01:36 crc kubenswrapper[4743]: I1011 02:01:36.239113 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9p2gm" event={"ID":"7af78f17-adac-4fbd-9074-640b6c6fb0dd","Type":"ContainerDied","Data":"d48a1e51fb8429c60cfc5580b5e71671c9e5f451d4aff1690ba93a191a50fe96"} Oct 11 02:01:36 crc kubenswrapper[4743]: I1011 02:01:36.239152 4743 scope.go:117] "RemoveContainer" containerID="5a2f73cfa41449909eb1994b3957efca37b5eeb3fe1f24eb9ce2bb8307cbe40c" Oct 11 02:01:36 crc kubenswrapper[4743]: I1011 02:01:36.273653 4743 scope.go:117] "RemoveContainer" containerID="24ff87dea2bd606413d1a12da629588acf5d78543e0a783cae6df1dc1280698f" Oct 11 02:01:36 crc kubenswrapper[4743]: I1011 02:01:36.296272 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9p2gm"] Oct 11 02:01:36 crc kubenswrapper[4743]: I1011 02:01:36.302357 4743 scope.go:117] "RemoveContainer" containerID="64a21a6712e085a221f1bf56516cda4c2615017ed7a93a7491e1298df72e798b" Oct 11 02:01:36 crc kubenswrapper[4743]: I1011 02:01:36.310382 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9p2gm"] Oct 11 02:01:36 crc kubenswrapper[4743]: I1011 02:01:36.356074 4743 scope.go:117] "RemoveContainer" containerID="5a2f73cfa41449909eb1994b3957efca37b5eeb3fe1f24eb9ce2bb8307cbe40c" Oct 11 02:01:36 crc kubenswrapper[4743]: E1011 02:01:36.356671 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"5a2f73cfa41449909eb1994b3957efca37b5eeb3fe1f24eb9ce2bb8307cbe40c\": container with ID starting with 5a2f73cfa41449909eb1994b3957efca37b5eeb3fe1f24eb9ce2bb8307cbe40c not found: ID does not exist" containerID="5a2f73cfa41449909eb1994b3957efca37b5eeb3fe1f24eb9ce2bb8307cbe40c" Oct 11 02:01:36 crc kubenswrapper[4743]: I1011 02:01:36.356776 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a2f73cfa41449909eb1994b3957efca37b5eeb3fe1f24eb9ce2bb8307cbe40c"} err="failed to get container status \"5a2f73cfa41449909eb1994b3957efca37b5eeb3fe1f24eb9ce2bb8307cbe40c\": rpc error: code = NotFound desc = could not find container \"5a2f73cfa41449909eb1994b3957efca37b5eeb3fe1f24eb9ce2bb8307cbe40c\": container with ID starting with 5a2f73cfa41449909eb1994b3957efca37b5eeb3fe1f24eb9ce2bb8307cbe40c not found: ID does not exist" Oct 11 02:01:36 crc kubenswrapper[4743]: I1011 02:01:36.356980 4743 scope.go:117] "RemoveContainer" containerID="24ff87dea2bd606413d1a12da629588acf5d78543e0a783cae6df1dc1280698f" Oct 11 02:01:36 crc kubenswrapper[4743]: E1011 02:01:36.357341 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24ff87dea2bd606413d1a12da629588acf5d78543e0a783cae6df1dc1280698f\": container with ID starting with 24ff87dea2bd606413d1a12da629588acf5d78543e0a783cae6df1dc1280698f not found: ID does not exist" containerID="24ff87dea2bd606413d1a12da629588acf5d78543e0a783cae6df1dc1280698f" Oct 11 02:01:36 crc kubenswrapper[4743]: I1011 02:01:36.357430 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24ff87dea2bd606413d1a12da629588acf5d78543e0a783cae6df1dc1280698f"} err="failed to get container status \"24ff87dea2bd606413d1a12da629588acf5d78543e0a783cae6df1dc1280698f\": rpc error: code = NotFound desc = could not find container \"24ff87dea2bd606413d1a12da629588acf5d78543e0a783cae6df1dc1280698f\": container with ID 
starting with 24ff87dea2bd606413d1a12da629588acf5d78543e0a783cae6df1dc1280698f not found: ID does not exist" Oct 11 02:01:36 crc kubenswrapper[4743]: I1011 02:01:36.357503 4743 scope.go:117] "RemoveContainer" containerID="64a21a6712e085a221f1bf56516cda4c2615017ed7a93a7491e1298df72e798b" Oct 11 02:01:36 crc kubenswrapper[4743]: E1011 02:01:36.357852 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64a21a6712e085a221f1bf56516cda4c2615017ed7a93a7491e1298df72e798b\": container with ID starting with 64a21a6712e085a221f1bf56516cda4c2615017ed7a93a7491e1298df72e798b not found: ID does not exist" containerID="64a21a6712e085a221f1bf56516cda4c2615017ed7a93a7491e1298df72e798b" Oct 11 02:01:36 crc kubenswrapper[4743]: I1011 02:01:36.358049 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64a21a6712e085a221f1bf56516cda4c2615017ed7a93a7491e1298df72e798b"} err="failed to get container status \"64a21a6712e085a221f1bf56516cda4c2615017ed7a93a7491e1298df72e798b\": rpc error: code = NotFound desc = could not find container \"64a21a6712e085a221f1bf56516cda4c2615017ed7a93a7491e1298df72e798b\": container with ID starting with 64a21a6712e085a221f1bf56516cda4c2615017ed7a93a7491e1298df72e798b not found: ID does not exist" Oct 11 02:01:38 crc kubenswrapper[4743]: I1011 02:01:38.105895 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7af78f17-adac-4fbd-9074-640b6c6fb0dd" path="/var/lib/kubelet/pods/7af78f17-adac-4fbd-9074-640b6c6fb0dd/volumes" Oct 11 02:02:14 crc kubenswrapper[4743]: I1011 02:02:14.457923 4743 patch_prober.go:28] interesting pod/machine-config-daemon-cvm72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 02:02:14 crc kubenswrapper[4743]: I1011 
02:02:14.458568 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 02:02:44 crc kubenswrapper[4743]: I1011 02:02:44.458550 4743 patch_prober.go:28] interesting pod/machine-config-daemon-cvm72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 02:02:44 crc kubenswrapper[4743]: I1011 02:02:44.461497 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 02:03:14 crc kubenswrapper[4743]: I1011 02:03:14.458061 4743 patch_prober.go:28] interesting pod/machine-config-daemon-cvm72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 02:03:14 crc kubenswrapper[4743]: I1011 02:03:14.458691 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 02:03:14 crc kubenswrapper[4743]: I1011 02:03:14.458749 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-cvm72" Oct 11 02:03:14 crc kubenswrapper[4743]: I1011 02:03:14.459793 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"97301fbf68f97bd61a41cd45f206f9ebf6f1a6c4391a359908693157e0acfc57"} pod="openshift-machine-config-operator/machine-config-daemon-cvm72" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 11 02:03:14 crc kubenswrapper[4743]: I1011 02:03:14.459937 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" containerID="cri-o://97301fbf68f97bd61a41cd45f206f9ebf6f1a6c4391a359908693157e0acfc57" gracePeriod=600 Oct 11 02:03:14 crc kubenswrapper[4743]: E1011 02:03:14.596149 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:03:15 crc kubenswrapper[4743]: I1011 02:03:15.417242 4743 generic.go:334] "Generic (PLEG): container finished" podID="add92263-e252-446b-95de-092585b4357f" containerID="97301fbf68f97bd61a41cd45f206f9ebf6f1a6c4391a359908693157e0acfc57" exitCode=0 Oct 11 02:03:15 crc kubenswrapper[4743]: I1011 02:03:15.417328 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" event={"ID":"add92263-e252-446b-95de-092585b4357f","Type":"ContainerDied","Data":"97301fbf68f97bd61a41cd45f206f9ebf6f1a6c4391a359908693157e0acfc57"} Oct 11 02:03:15 crc 
kubenswrapper[4743]: I1011 02:03:15.417670 4743 scope.go:117] "RemoveContainer" containerID="0c6c02f491c08a5f1c576130dfd249554a30c097963b0370bce3acf791a47aa1" Oct 11 02:03:15 crc kubenswrapper[4743]: I1011 02:03:15.422978 4743 scope.go:117] "RemoveContainer" containerID="97301fbf68f97bd61a41cd45f206f9ebf6f1a6c4391a359908693157e0acfc57" Oct 11 02:03:15 crc kubenswrapper[4743]: E1011 02:03:15.423639 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:03:29 crc kubenswrapper[4743]: I1011 02:03:29.452561 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-95blz"] Oct 11 02:03:29 crc kubenswrapper[4743]: E1011 02:03:29.455093 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7af78f17-adac-4fbd-9074-640b6c6fb0dd" containerName="registry-server" Oct 11 02:03:29 crc kubenswrapper[4743]: I1011 02:03:29.455211 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="7af78f17-adac-4fbd-9074-640b6c6fb0dd" containerName="registry-server" Oct 11 02:03:29 crc kubenswrapper[4743]: E1011 02:03:29.455338 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7af78f17-adac-4fbd-9074-640b6c6fb0dd" containerName="extract-utilities" Oct 11 02:03:29 crc kubenswrapper[4743]: I1011 02:03:29.455436 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="7af78f17-adac-4fbd-9074-640b6c6fb0dd" containerName="extract-utilities" Oct 11 02:03:29 crc kubenswrapper[4743]: E1011 02:03:29.455534 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7af78f17-adac-4fbd-9074-640b6c6fb0dd" 
containerName="extract-content" Oct 11 02:03:29 crc kubenswrapper[4743]: I1011 02:03:29.455611 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="7af78f17-adac-4fbd-9074-640b6c6fb0dd" containerName="extract-content" Oct 11 02:03:29 crc kubenswrapper[4743]: I1011 02:03:29.456026 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="7af78f17-adac-4fbd-9074-640b6c6fb0dd" containerName="registry-server" Oct 11 02:03:29 crc kubenswrapper[4743]: I1011 02:03:29.457794 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-95blz" Oct 11 02:03:29 crc kubenswrapper[4743]: I1011 02:03:29.471071 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-95blz"] Oct 11 02:03:29 crc kubenswrapper[4743]: I1011 02:03:29.492565 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97d92\" (UniqueName: \"kubernetes.io/projected/abfc61b4-eac9-412b-aacb-5faf5773509d-kube-api-access-97d92\") pod \"redhat-marketplace-95blz\" (UID: \"abfc61b4-eac9-412b-aacb-5faf5773509d\") " pod="openshift-marketplace/redhat-marketplace-95blz" Oct 11 02:03:29 crc kubenswrapper[4743]: I1011 02:03:29.492731 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abfc61b4-eac9-412b-aacb-5faf5773509d-catalog-content\") pod \"redhat-marketplace-95blz\" (UID: \"abfc61b4-eac9-412b-aacb-5faf5773509d\") " pod="openshift-marketplace/redhat-marketplace-95blz" Oct 11 02:03:29 crc kubenswrapper[4743]: I1011 02:03:29.492791 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abfc61b4-eac9-412b-aacb-5faf5773509d-utilities\") pod \"redhat-marketplace-95blz\" (UID: \"abfc61b4-eac9-412b-aacb-5faf5773509d\") " 
pod="openshift-marketplace/redhat-marketplace-95blz" Oct 11 02:03:29 crc kubenswrapper[4743]: I1011 02:03:29.594239 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97d92\" (UniqueName: \"kubernetes.io/projected/abfc61b4-eac9-412b-aacb-5faf5773509d-kube-api-access-97d92\") pod \"redhat-marketplace-95blz\" (UID: \"abfc61b4-eac9-412b-aacb-5faf5773509d\") " pod="openshift-marketplace/redhat-marketplace-95blz" Oct 11 02:03:29 crc kubenswrapper[4743]: I1011 02:03:29.594398 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abfc61b4-eac9-412b-aacb-5faf5773509d-catalog-content\") pod \"redhat-marketplace-95blz\" (UID: \"abfc61b4-eac9-412b-aacb-5faf5773509d\") " pod="openshift-marketplace/redhat-marketplace-95blz" Oct 11 02:03:29 crc kubenswrapper[4743]: I1011 02:03:29.594446 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abfc61b4-eac9-412b-aacb-5faf5773509d-utilities\") pod \"redhat-marketplace-95blz\" (UID: \"abfc61b4-eac9-412b-aacb-5faf5773509d\") " pod="openshift-marketplace/redhat-marketplace-95blz" Oct 11 02:03:29 crc kubenswrapper[4743]: I1011 02:03:29.594932 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abfc61b4-eac9-412b-aacb-5faf5773509d-utilities\") pod \"redhat-marketplace-95blz\" (UID: \"abfc61b4-eac9-412b-aacb-5faf5773509d\") " pod="openshift-marketplace/redhat-marketplace-95blz" Oct 11 02:03:29 crc kubenswrapper[4743]: I1011 02:03:29.595415 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abfc61b4-eac9-412b-aacb-5faf5773509d-catalog-content\") pod \"redhat-marketplace-95blz\" (UID: \"abfc61b4-eac9-412b-aacb-5faf5773509d\") " pod="openshift-marketplace/redhat-marketplace-95blz" 
Oct 11 02:03:29 crc kubenswrapper[4743]: I1011 02:03:29.618748 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97d92\" (UniqueName: \"kubernetes.io/projected/abfc61b4-eac9-412b-aacb-5faf5773509d-kube-api-access-97d92\") pod \"redhat-marketplace-95blz\" (UID: \"abfc61b4-eac9-412b-aacb-5faf5773509d\") " pod="openshift-marketplace/redhat-marketplace-95blz" Oct 11 02:03:29 crc kubenswrapper[4743]: I1011 02:03:29.789726 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-95blz" Oct 11 02:03:30 crc kubenswrapper[4743]: I1011 02:03:30.091757 4743 scope.go:117] "RemoveContainer" containerID="97301fbf68f97bd61a41cd45f206f9ebf6f1a6c4391a359908693157e0acfc57" Oct 11 02:03:30 crc kubenswrapper[4743]: E1011 02:03:30.092338 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:03:30 crc kubenswrapper[4743]: I1011 02:03:30.272334 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-95blz"] Oct 11 02:03:30 crc kubenswrapper[4743]: I1011 02:03:30.590102 4743 generic.go:334] "Generic (PLEG): container finished" podID="abfc61b4-eac9-412b-aacb-5faf5773509d" containerID="c9c19ca6d37c3a2accfac44740e7f8c638125d8c8626e4de68cd4d631bc6657f" exitCode=0 Oct 11 02:03:30 crc kubenswrapper[4743]: I1011 02:03:30.590184 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-95blz" 
event={"ID":"abfc61b4-eac9-412b-aacb-5faf5773509d","Type":"ContainerDied","Data":"c9c19ca6d37c3a2accfac44740e7f8c638125d8c8626e4de68cd4d631bc6657f"} Oct 11 02:03:30 crc kubenswrapper[4743]: I1011 02:03:30.590626 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-95blz" event={"ID":"abfc61b4-eac9-412b-aacb-5faf5773509d","Type":"ContainerStarted","Data":"f81adfa027ba8511fe853d08e6d8e896e7a5430ccd7c969ec1e48b6c32a73036"} Oct 11 02:03:32 crc kubenswrapper[4743]: I1011 02:03:32.613971 4743 generic.go:334] "Generic (PLEG): container finished" podID="abfc61b4-eac9-412b-aacb-5faf5773509d" containerID="c50f572766dfe47fa7cf643f88689917873c0d626808c28f8ff6da17c4c39b88" exitCode=0 Oct 11 02:03:32 crc kubenswrapper[4743]: I1011 02:03:32.614022 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-95blz" event={"ID":"abfc61b4-eac9-412b-aacb-5faf5773509d","Type":"ContainerDied","Data":"c50f572766dfe47fa7cf643f88689917873c0d626808c28f8ff6da17c4c39b88"} Oct 11 02:03:33 crc kubenswrapper[4743]: I1011 02:03:33.625825 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-95blz" event={"ID":"abfc61b4-eac9-412b-aacb-5faf5773509d","Type":"ContainerStarted","Data":"9398b8963f35bb69d202f2fcd62f89f77865d5eac9ef22862593fadc9c44ca0b"} Oct 11 02:03:33 crc kubenswrapper[4743]: I1011 02:03:33.651555 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-95blz" podStartSLOduration=2.162286606 podStartE2EDuration="4.651538448s" podCreationTimestamp="2025-10-11 02:03:29 +0000 UTC" firstStartedPulling="2025-10-11 02:03:30.592100258 +0000 UTC m=+4305.245080675" lastFinishedPulling="2025-10-11 02:03:33.08135212 +0000 UTC m=+4307.734332517" observedRunningTime="2025-10-11 02:03:33.64314006 +0000 UTC m=+4308.296120457" watchObservedRunningTime="2025-10-11 02:03:33.651538448 +0000 UTC 
m=+4308.304518845" Oct 11 02:03:39 crc kubenswrapper[4743]: I1011 02:03:39.790043 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-95blz" Oct 11 02:03:39 crc kubenswrapper[4743]: I1011 02:03:39.790631 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-95blz" Oct 11 02:03:39 crc kubenswrapper[4743]: I1011 02:03:39.885917 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-95blz" Oct 11 02:03:40 crc kubenswrapper[4743]: I1011 02:03:40.756648 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-95blz" Oct 11 02:03:40 crc kubenswrapper[4743]: I1011 02:03:40.822162 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-95blz"] Oct 11 02:03:42 crc kubenswrapper[4743]: I1011 02:03:42.092393 4743 scope.go:117] "RemoveContainer" containerID="97301fbf68f97bd61a41cd45f206f9ebf6f1a6c4391a359908693157e0acfc57" Oct 11 02:03:42 crc kubenswrapper[4743]: E1011 02:03:42.093025 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:03:42 crc kubenswrapper[4743]: I1011 02:03:42.737197 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-95blz" podUID="abfc61b4-eac9-412b-aacb-5faf5773509d" containerName="registry-server" containerID="cri-o://9398b8963f35bb69d202f2fcd62f89f77865d5eac9ef22862593fadc9c44ca0b" gracePeriod=2 Oct 11 
02:03:43 crc kubenswrapper[4743]: I1011 02:03:43.311018 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-95blz" Oct 11 02:03:43 crc kubenswrapper[4743]: I1011 02:03:43.485154 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abfc61b4-eac9-412b-aacb-5faf5773509d-catalog-content\") pod \"abfc61b4-eac9-412b-aacb-5faf5773509d\" (UID: \"abfc61b4-eac9-412b-aacb-5faf5773509d\") " Oct 11 02:03:43 crc kubenswrapper[4743]: I1011 02:03:43.485385 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abfc61b4-eac9-412b-aacb-5faf5773509d-utilities\") pod \"abfc61b4-eac9-412b-aacb-5faf5773509d\" (UID: \"abfc61b4-eac9-412b-aacb-5faf5773509d\") " Oct 11 02:03:43 crc kubenswrapper[4743]: I1011 02:03:43.485459 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-97d92\" (UniqueName: \"kubernetes.io/projected/abfc61b4-eac9-412b-aacb-5faf5773509d-kube-api-access-97d92\") pod \"abfc61b4-eac9-412b-aacb-5faf5773509d\" (UID: \"abfc61b4-eac9-412b-aacb-5faf5773509d\") " Oct 11 02:03:43 crc kubenswrapper[4743]: I1011 02:03:43.486556 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/abfc61b4-eac9-412b-aacb-5faf5773509d-utilities" (OuterVolumeSpecName: "utilities") pod "abfc61b4-eac9-412b-aacb-5faf5773509d" (UID: "abfc61b4-eac9-412b-aacb-5faf5773509d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 02:03:43 crc kubenswrapper[4743]: I1011 02:03:43.493056 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abfc61b4-eac9-412b-aacb-5faf5773509d-kube-api-access-97d92" (OuterVolumeSpecName: "kube-api-access-97d92") pod "abfc61b4-eac9-412b-aacb-5faf5773509d" (UID: "abfc61b4-eac9-412b-aacb-5faf5773509d"). InnerVolumeSpecName "kube-api-access-97d92". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 02:03:43 crc kubenswrapper[4743]: I1011 02:03:43.497162 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/abfc61b4-eac9-412b-aacb-5faf5773509d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "abfc61b4-eac9-412b-aacb-5faf5773509d" (UID: "abfc61b4-eac9-412b-aacb-5faf5773509d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 02:03:43 crc kubenswrapper[4743]: I1011 02:03:43.588016 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abfc61b4-eac9-412b-aacb-5faf5773509d-utilities\") on node \"crc\" DevicePath \"\"" Oct 11 02:03:43 crc kubenswrapper[4743]: I1011 02:03:43.588436 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-97d92\" (UniqueName: \"kubernetes.io/projected/abfc61b4-eac9-412b-aacb-5faf5773509d-kube-api-access-97d92\") on node \"crc\" DevicePath \"\"" Oct 11 02:03:43 crc kubenswrapper[4743]: I1011 02:03:43.588594 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abfc61b4-eac9-412b-aacb-5faf5773509d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 11 02:03:43 crc kubenswrapper[4743]: I1011 02:03:43.748575 4743 generic.go:334] "Generic (PLEG): container finished" podID="abfc61b4-eac9-412b-aacb-5faf5773509d" 
containerID="9398b8963f35bb69d202f2fcd62f89f77865d5eac9ef22862593fadc9c44ca0b" exitCode=0 Oct 11 02:03:43 crc kubenswrapper[4743]: I1011 02:03:43.748622 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-95blz" event={"ID":"abfc61b4-eac9-412b-aacb-5faf5773509d","Type":"ContainerDied","Data":"9398b8963f35bb69d202f2fcd62f89f77865d5eac9ef22862593fadc9c44ca0b"} Oct 11 02:03:43 crc kubenswrapper[4743]: I1011 02:03:43.748631 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-95blz" Oct 11 02:03:43 crc kubenswrapper[4743]: I1011 02:03:43.748654 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-95blz" event={"ID":"abfc61b4-eac9-412b-aacb-5faf5773509d","Type":"ContainerDied","Data":"f81adfa027ba8511fe853d08e6d8e896e7a5430ccd7c969ec1e48b6c32a73036"} Oct 11 02:03:43 crc kubenswrapper[4743]: I1011 02:03:43.748674 4743 scope.go:117] "RemoveContainer" containerID="9398b8963f35bb69d202f2fcd62f89f77865d5eac9ef22862593fadc9c44ca0b" Oct 11 02:03:43 crc kubenswrapper[4743]: I1011 02:03:43.783599 4743 scope.go:117] "RemoveContainer" containerID="c50f572766dfe47fa7cf643f88689917873c0d626808c28f8ff6da17c4c39b88" Oct 11 02:03:43 crc kubenswrapper[4743]: I1011 02:03:43.784687 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-95blz"] Oct 11 02:03:43 crc kubenswrapper[4743]: I1011 02:03:43.793945 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-95blz"] Oct 11 02:03:43 crc kubenswrapper[4743]: I1011 02:03:43.817174 4743 scope.go:117] "RemoveContainer" containerID="c9c19ca6d37c3a2accfac44740e7f8c638125d8c8626e4de68cd4d631bc6657f" Oct 11 02:03:43 crc kubenswrapper[4743]: I1011 02:03:43.876910 4743 scope.go:117] "RemoveContainer" containerID="9398b8963f35bb69d202f2fcd62f89f77865d5eac9ef22862593fadc9c44ca0b" Oct 11 
02:03:43 crc kubenswrapper[4743]: E1011 02:03:43.877502 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9398b8963f35bb69d202f2fcd62f89f77865d5eac9ef22862593fadc9c44ca0b\": container with ID starting with 9398b8963f35bb69d202f2fcd62f89f77865d5eac9ef22862593fadc9c44ca0b not found: ID does not exist" containerID="9398b8963f35bb69d202f2fcd62f89f77865d5eac9ef22862593fadc9c44ca0b" Oct 11 02:03:43 crc kubenswrapper[4743]: I1011 02:03:43.877549 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9398b8963f35bb69d202f2fcd62f89f77865d5eac9ef22862593fadc9c44ca0b"} err="failed to get container status \"9398b8963f35bb69d202f2fcd62f89f77865d5eac9ef22862593fadc9c44ca0b\": rpc error: code = NotFound desc = could not find container \"9398b8963f35bb69d202f2fcd62f89f77865d5eac9ef22862593fadc9c44ca0b\": container with ID starting with 9398b8963f35bb69d202f2fcd62f89f77865d5eac9ef22862593fadc9c44ca0b not found: ID does not exist" Oct 11 02:03:43 crc kubenswrapper[4743]: I1011 02:03:43.877575 4743 scope.go:117] "RemoveContainer" containerID="c50f572766dfe47fa7cf643f88689917873c0d626808c28f8ff6da17c4c39b88" Oct 11 02:03:43 crc kubenswrapper[4743]: E1011 02:03:43.878070 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c50f572766dfe47fa7cf643f88689917873c0d626808c28f8ff6da17c4c39b88\": container with ID starting with c50f572766dfe47fa7cf643f88689917873c0d626808c28f8ff6da17c4c39b88 not found: ID does not exist" containerID="c50f572766dfe47fa7cf643f88689917873c0d626808c28f8ff6da17c4c39b88" Oct 11 02:03:43 crc kubenswrapper[4743]: I1011 02:03:43.878098 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c50f572766dfe47fa7cf643f88689917873c0d626808c28f8ff6da17c4c39b88"} err="failed to get container status 
\"c50f572766dfe47fa7cf643f88689917873c0d626808c28f8ff6da17c4c39b88\": rpc error: code = NotFound desc = could not find container \"c50f572766dfe47fa7cf643f88689917873c0d626808c28f8ff6da17c4c39b88\": container with ID starting with c50f572766dfe47fa7cf643f88689917873c0d626808c28f8ff6da17c4c39b88 not found: ID does not exist" Oct 11 02:03:43 crc kubenswrapper[4743]: I1011 02:03:43.878116 4743 scope.go:117] "RemoveContainer" containerID="c9c19ca6d37c3a2accfac44740e7f8c638125d8c8626e4de68cd4d631bc6657f" Oct 11 02:03:43 crc kubenswrapper[4743]: E1011 02:03:43.878469 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9c19ca6d37c3a2accfac44740e7f8c638125d8c8626e4de68cd4d631bc6657f\": container with ID starting with c9c19ca6d37c3a2accfac44740e7f8c638125d8c8626e4de68cd4d631bc6657f not found: ID does not exist" containerID="c9c19ca6d37c3a2accfac44740e7f8c638125d8c8626e4de68cd4d631bc6657f" Oct 11 02:03:43 crc kubenswrapper[4743]: I1011 02:03:43.878510 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9c19ca6d37c3a2accfac44740e7f8c638125d8c8626e4de68cd4d631bc6657f"} err="failed to get container status \"c9c19ca6d37c3a2accfac44740e7f8c638125d8c8626e4de68cd4d631bc6657f\": rpc error: code = NotFound desc = could not find container \"c9c19ca6d37c3a2accfac44740e7f8c638125d8c8626e4de68cd4d631bc6657f\": container with ID starting with c9c19ca6d37c3a2accfac44740e7f8c638125d8c8626e4de68cd4d631bc6657f not found: ID does not exist" Oct 11 02:03:44 crc kubenswrapper[4743]: I1011 02:03:44.104694 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abfc61b4-eac9-412b-aacb-5faf5773509d" path="/var/lib/kubelet/pods/abfc61b4-eac9-412b-aacb-5faf5773509d/volumes" Oct 11 02:03:56 crc kubenswrapper[4743]: I1011 02:03:56.105923 4743 scope.go:117] "RemoveContainer" containerID="97301fbf68f97bd61a41cd45f206f9ebf6f1a6c4391a359908693157e0acfc57" Oct 11 
02:03:56 crc kubenswrapper[4743]: E1011 02:03:56.107127 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:04:08 crc kubenswrapper[4743]: I1011 02:04:08.091755 4743 scope.go:117] "RemoveContainer" containerID="97301fbf68f97bd61a41cd45f206f9ebf6f1a6c4391a359908693157e0acfc57" Oct 11 02:04:08 crc kubenswrapper[4743]: E1011 02:04:08.092847 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:04:22 crc kubenswrapper[4743]: I1011 02:04:22.091825 4743 scope.go:117] "RemoveContainer" containerID="97301fbf68f97bd61a41cd45f206f9ebf6f1a6c4391a359908693157e0acfc57" Oct 11 02:04:22 crc kubenswrapper[4743]: E1011 02:04:22.092600 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:04:35 crc kubenswrapper[4743]: I1011 02:04:35.091794 4743 scope.go:117] "RemoveContainer" 
containerID="97301fbf68f97bd61a41cd45f206f9ebf6f1a6c4391a359908693157e0acfc57" Oct 11 02:04:35 crc kubenswrapper[4743]: E1011 02:04:35.092705 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:04:50 crc kubenswrapper[4743]: I1011 02:04:50.092624 4743 scope.go:117] "RemoveContainer" containerID="97301fbf68f97bd61a41cd45f206f9ebf6f1a6c4391a359908693157e0acfc57" Oct 11 02:04:50 crc kubenswrapper[4743]: E1011 02:04:50.093765 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:04:55 crc kubenswrapper[4743]: I1011 02:04:55.500349 4743 scope.go:117] "RemoveContainer" containerID="4f87eb815b086fa6a9da34e53bdc5de23638c2fea6b50577b61135c5a65a28fb" Oct 11 02:04:55 crc kubenswrapper[4743]: I1011 02:04:55.532639 4743 scope.go:117] "RemoveContainer" containerID="28e3d0944841653e74907c8e526eec311a2d3be97b435178ed06559a1eed0af4" Oct 11 02:04:55 crc kubenswrapper[4743]: I1011 02:04:55.685203 4743 scope.go:117] "RemoveContainer" containerID="857c01630f84b2e6694ebdad4ce0a148f2a27bb07db1bf1a9d67371e951d1fc8" Oct 11 02:05:04 crc kubenswrapper[4743]: I1011 02:05:04.092237 4743 scope.go:117] "RemoveContainer" containerID="97301fbf68f97bd61a41cd45f206f9ebf6f1a6c4391a359908693157e0acfc57" Oct 11 02:05:04 crc 
kubenswrapper[4743]: E1011 02:05:04.093435 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:05:19 crc kubenswrapper[4743]: I1011 02:05:19.091713 4743 scope.go:117] "RemoveContainer" containerID="97301fbf68f97bd61a41cd45f206f9ebf6f1a6c4391a359908693157e0acfc57" Oct 11 02:05:19 crc kubenswrapper[4743]: E1011 02:05:19.092324 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:05:32 crc kubenswrapper[4743]: I1011 02:05:32.091986 4743 scope.go:117] "RemoveContainer" containerID="97301fbf68f97bd61a41cd45f206f9ebf6f1a6c4391a359908693157e0acfc57" Oct 11 02:05:32 crc kubenswrapper[4743]: E1011 02:05:32.092824 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:05:38 crc kubenswrapper[4743]: I1011 02:05:38.199142 4743 generic.go:334] "Generic (PLEG): container finished" podID="75f90fbf-75a7-4b2a-af1a-7693cebeaea3" 
containerID="e868714e6758abdf54dea905e5f34aa7cb66137b677d135912f09d644860a77f" exitCode=0 Oct 11 02:05:38 crc kubenswrapper[4743]: I1011 02:05:38.199207 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kt9ff" event={"ID":"75f90fbf-75a7-4b2a-af1a-7693cebeaea3","Type":"ContainerDied","Data":"e868714e6758abdf54dea905e5f34aa7cb66137b677d135912f09d644860a77f"} Oct 11 02:05:39 crc kubenswrapper[4743]: I1011 02:05:39.730354 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kt9ff" Oct 11 02:05:39 crc kubenswrapper[4743]: I1011 02:05:39.786900 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/75f90fbf-75a7-4b2a-af1a-7693cebeaea3-nova-cell1-compute-config-0\") pod \"75f90fbf-75a7-4b2a-af1a-7693cebeaea3\" (UID: \"75f90fbf-75a7-4b2a-af1a-7693cebeaea3\") " Oct 11 02:05:39 crc kubenswrapper[4743]: I1011 02:05:39.787018 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75f90fbf-75a7-4b2a-af1a-7693cebeaea3-nova-custom-ceph-combined-ca-bundle\") pod \"75f90fbf-75a7-4b2a-af1a-7693cebeaea3\" (UID: \"75f90fbf-75a7-4b2a-af1a-7693cebeaea3\") " Oct 11 02:05:39 crc kubenswrapper[4743]: I1011 02:05:39.787100 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/75f90fbf-75a7-4b2a-af1a-7693cebeaea3-ssh-key\") pod \"75f90fbf-75a7-4b2a-af1a-7693cebeaea3\" (UID: \"75f90fbf-75a7-4b2a-af1a-7693cebeaea3\") " Oct 11 02:05:39 crc kubenswrapper[4743]: I1011 02:05:39.787150 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: 
\"kubernetes.io/secret/75f90fbf-75a7-4b2a-af1a-7693cebeaea3-nova-cell1-compute-config-1\") pod \"75f90fbf-75a7-4b2a-af1a-7693cebeaea3\" (UID: \"75f90fbf-75a7-4b2a-af1a-7693cebeaea3\") " Oct 11 02:05:39 crc kubenswrapper[4743]: I1011 02:05:39.787178 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/75f90fbf-75a7-4b2a-af1a-7693cebeaea3-ceph\") pod \"75f90fbf-75a7-4b2a-af1a-7693cebeaea3\" (UID: \"75f90fbf-75a7-4b2a-af1a-7693cebeaea3\") " Oct 11 02:05:39 crc kubenswrapper[4743]: I1011 02:05:39.787205 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/75f90fbf-75a7-4b2a-af1a-7693cebeaea3-nova-migration-ssh-key-1\") pod \"75f90fbf-75a7-4b2a-af1a-7693cebeaea3\" (UID: \"75f90fbf-75a7-4b2a-af1a-7693cebeaea3\") " Oct 11 02:05:39 crc kubenswrapper[4743]: I1011 02:05:39.787261 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6kgt\" (UniqueName: \"kubernetes.io/projected/75f90fbf-75a7-4b2a-af1a-7693cebeaea3-kube-api-access-h6kgt\") pod \"75f90fbf-75a7-4b2a-af1a-7693cebeaea3\" (UID: \"75f90fbf-75a7-4b2a-af1a-7693cebeaea3\") " Oct 11 02:05:39 crc kubenswrapper[4743]: I1011 02:05:39.787342 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/75f90fbf-75a7-4b2a-af1a-7693cebeaea3-nova-extra-config-0\") pod \"75f90fbf-75a7-4b2a-af1a-7693cebeaea3\" (UID: \"75f90fbf-75a7-4b2a-af1a-7693cebeaea3\") " Oct 11 02:05:39 crc kubenswrapper[4743]: I1011 02:05:39.787383 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/75f90fbf-75a7-4b2a-af1a-7693cebeaea3-ceph-nova-0\") pod \"75f90fbf-75a7-4b2a-af1a-7693cebeaea3\" (UID: \"75f90fbf-75a7-4b2a-af1a-7693cebeaea3\") " Oct 11 02:05:39 crc 
kubenswrapper[4743]: I1011 02:05:39.803023 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75f90fbf-75a7-4b2a-af1a-7693cebeaea3-nova-custom-ceph-combined-ca-bundle" (OuterVolumeSpecName: "nova-custom-ceph-combined-ca-bundle") pod "75f90fbf-75a7-4b2a-af1a-7693cebeaea3" (UID: "75f90fbf-75a7-4b2a-af1a-7693cebeaea3"). InnerVolumeSpecName "nova-custom-ceph-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 02:05:39 crc kubenswrapper[4743]: I1011 02:05:39.804632 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75f90fbf-75a7-4b2a-af1a-7693cebeaea3-ceph" (OuterVolumeSpecName: "ceph") pod "75f90fbf-75a7-4b2a-af1a-7693cebeaea3" (UID: "75f90fbf-75a7-4b2a-af1a-7693cebeaea3"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 02:05:39 crc kubenswrapper[4743]: I1011 02:05:39.809542 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75f90fbf-75a7-4b2a-af1a-7693cebeaea3-kube-api-access-h6kgt" (OuterVolumeSpecName: "kube-api-access-h6kgt") pod "75f90fbf-75a7-4b2a-af1a-7693cebeaea3" (UID: "75f90fbf-75a7-4b2a-af1a-7693cebeaea3"). InnerVolumeSpecName "kube-api-access-h6kgt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 02:05:39 crc kubenswrapper[4743]: I1011 02:05:39.839771 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75f90fbf-75a7-4b2a-af1a-7693cebeaea3-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "75f90fbf-75a7-4b2a-af1a-7693cebeaea3" (UID: "75f90fbf-75a7-4b2a-af1a-7693cebeaea3"). InnerVolumeSpecName "nova-extra-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 02:05:39 crc kubenswrapper[4743]: I1011 02:05:39.843088 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75f90fbf-75a7-4b2a-af1a-7693cebeaea3-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "75f90fbf-75a7-4b2a-af1a-7693cebeaea3" (UID: "75f90fbf-75a7-4b2a-af1a-7693cebeaea3"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 02:05:39 crc kubenswrapper[4743]: I1011 02:05:39.844483 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75f90fbf-75a7-4b2a-af1a-7693cebeaea3-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "75f90fbf-75a7-4b2a-af1a-7693cebeaea3" (UID: "75f90fbf-75a7-4b2a-af1a-7693cebeaea3"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 02:05:39 crc kubenswrapper[4743]: I1011 02:05:39.868001 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75f90fbf-75a7-4b2a-af1a-7693cebeaea3-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "75f90fbf-75a7-4b2a-af1a-7693cebeaea3" (UID: "75f90fbf-75a7-4b2a-af1a-7693cebeaea3"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 02:05:39 crc kubenswrapper[4743]: I1011 02:05:39.868731 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75f90fbf-75a7-4b2a-af1a-7693cebeaea3-ceph-nova-0" (OuterVolumeSpecName: "ceph-nova-0") pod "75f90fbf-75a7-4b2a-af1a-7693cebeaea3" (UID: "75f90fbf-75a7-4b2a-af1a-7693cebeaea3"). InnerVolumeSpecName "ceph-nova-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 02:05:39 crc kubenswrapper[4743]: I1011 02:05:39.887173 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75f90fbf-75a7-4b2a-af1a-7693cebeaea3-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "75f90fbf-75a7-4b2a-af1a-7693cebeaea3" (UID: "75f90fbf-75a7-4b2a-af1a-7693cebeaea3"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 02:05:39 crc kubenswrapper[4743]: I1011 02:05:39.892573 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/75f90fbf-75a7-4b2a-af1a-7693cebeaea3-nova-migration-ssh-key-0\") pod \"75f90fbf-75a7-4b2a-af1a-7693cebeaea3\" (UID: \"75f90fbf-75a7-4b2a-af1a-7693cebeaea3\") " Oct 11 02:05:39 crc kubenswrapper[4743]: I1011 02:05:39.892626 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/75f90fbf-75a7-4b2a-af1a-7693cebeaea3-inventory\") pod \"75f90fbf-75a7-4b2a-af1a-7693cebeaea3\" (UID: \"75f90fbf-75a7-4b2a-af1a-7693cebeaea3\") " Oct 11 02:05:39 crc kubenswrapper[4743]: I1011 02:05:39.893210 4743 reconciler_common.go:293] "Volume detached for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75f90fbf-75a7-4b2a-af1a-7693cebeaea3-nova-custom-ceph-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 02:05:39 crc kubenswrapper[4743]: I1011 02:05:39.893248 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/75f90fbf-75a7-4b2a-af1a-7693cebeaea3-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 11 02:05:39 crc kubenswrapper[4743]: I1011 02:05:39.893261 4743 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/75f90fbf-75a7-4b2a-af1a-7693cebeaea3-nova-cell1-compute-config-1\") 
on node \"crc\" DevicePath \"\"" Oct 11 02:05:39 crc kubenswrapper[4743]: I1011 02:05:39.893281 4743 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/75f90fbf-75a7-4b2a-af1a-7693cebeaea3-ceph\") on node \"crc\" DevicePath \"\"" Oct 11 02:05:39 crc kubenswrapper[4743]: I1011 02:05:39.893290 4743 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/75f90fbf-75a7-4b2a-af1a-7693cebeaea3-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Oct 11 02:05:39 crc kubenswrapper[4743]: I1011 02:05:39.893299 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6kgt\" (UniqueName: \"kubernetes.io/projected/75f90fbf-75a7-4b2a-af1a-7693cebeaea3-kube-api-access-h6kgt\") on node \"crc\" DevicePath \"\"" Oct 11 02:05:39 crc kubenswrapper[4743]: I1011 02:05:39.893308 4743 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/75f90fbf-75a7-4b2a-af1a-7693cebeaea3-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Oct 11 02:05:39 crc kubenswrapper[4743]: I1011 02:05:39.893319 4743 reconciler_common.go:293] "Volume detached for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/75f90fbf-75a7-4b2a-af1a-7693cebeaea3-ceph-nova-0\") on node \"crc\" DevicePath \"\"" Oct 11 02:05:39 crc kubenswrapper[4743]: I1011 02:05:39.893328 4743 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/75f90fbf-75a7-4b2a-af1a-7693cebeaea3-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Oct 11 02:05:39 crc kubenswrapper[4743]: I1011 02:05:39.920272 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75f90fbf-75a7-4b2a-af1a-7693cebeaea3-inventory" (OuterVolumeSpecName: "inventory") pod "75f90fbf-75a7-4b2a-af1a-7693cebeaea3" (UID: 
"75f90fbf-75a7-4b2a-af1a-7693cebeaea3"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 02:05:39 crc kubenswrapper[4743]: I1011 02:05:39.922849 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75f90fbf-75a7-4b2a-af1a-7693cebeaea3-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "75f90fbf-75a7-4b2a-af1a-7693cebeaea3" (UID: "75f90fbf-75a7-4b2a-af1a-7693cebeaea3"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 02:05:39 crc kubenswrapper[4743]: I1011 02:05:39.994679 4743 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/75f90fbf-75a7-4b2a-af1a-7693cebeaea3-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Oct 11 02:05:39 crc kubenswrapper[4743]: I1011 02:05:39.994708 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/75f90fbf-75a7-4b2a-af1a-7693cebeaea3-inventory\") on node \"crc\" DevicePath \"\"" Oct 11 02:05:40 crc kubenswrapper[4743]: I1011 02:05:40.220039 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kt9ff" event={"ID":"75f90fbf-75a7-4b2a-af1a-7693cebeaea3","Type":"ContainerDied","Data":"2758ff1d07190f9ffa13ed85a5afbee76a80882b0b1daad0b7aa646b1e6964d2"} Oct 11 02:05:40 crc kubenswrapper[4743]: I1011 02:05:40.220086 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2758ff1d07190f9ffa13ed85a5afbee76a80882b0b1daad0b7aa646b1e6964d2" Oct 11 02:05:40 crc kubenswrapper[4743]: I1011 02:05:40.220090 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kt9ff" Oct 11 02:05:40 crc kubenswrapper[4743]: I1011 02:05:40.340872 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4tnjd"] Oct 11 02:05:40 crc kubenswrapper[4743]: E1011 02:05:40.341587 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75f90fbf-75a7-4b2a-af1a-7693cebeaea3" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Oct 11 02:05:40 crc kubenswrapper[4743]: I1011 02:05:40.341729 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="75f90fbf-75a7-4b2a-af1a-7693cebeaea3" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Oct 11 02:05:40 crc kubenswrapper[4743]: E1011 02:05:40.341845 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abfc61b4-eac9-412b-aacb-5faf5773509d" containerName="registry-server" Oct 11 02:05:40 crc kubenswrapper[4743]: I1011 02:05:40.341945 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="abfc61b4-eac9-412b-aacb-5faf5773509d" containerName="registry-server" Oct 11 02:05:40 crc kubenswrapper[4743]: E1011 02:05:40.342068 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abfc61b4-eac9-412b-aacb-5faf5773509d" containerName="extract-utilities" Oct 11 02:05:40 crc kubenswrapper[4743]: I1011 02:05:40.342144 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="abfc61b4-eac9-412b-aacb-5faf5773509d" containerName="extract-utilities" Oct 11 02:05:40 crc kubenswrapper[4743]: E1011 02:05:40.342219 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abfc61b4-eac9-412b-aacb-5faf5773509d" containerName="extract-content" Oct 11 02:05:40 crc kubenswrapper[4743]: I1011 02:05:40.342289 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="abfc61b4-eac9-412b-aacb-5faf5773509d" containerName="extract-content" Oct 11 02:05:40 crc kubenswrapper[4743]: I1011 
02:05:40.342621 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="75f90fbf-75a7-4b2a-af1a-7693cebeaea3" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Oct 11 02:05:40 crc kubenswrapper[4743]: I1011 02:05:40.342716 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="abfc61b4-eac9-412b-aacb-5faf5773509d" containerName="registry-server" Oct 11 02:05:40 crc kubenswrapper[4743]: I1011 02:05:40.343689 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4tnjd" Oct 11 02:05:40 crc kubenswrapper[4743]: I1011 02:05:40.346902 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 11 02:05:40 crc kubenswrapper[4743]: I1011 02:05:40.347290 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 11 02:05:40 crc kubenswrapper[4743]: I1011 02:05:40.348013 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 11 02:05:40 crc kubenswrapper[4743]: I1011 02:05:40.348414 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Oct 11 02:05:40 crc kubenswrapper[4743]: I1011 02:05:40.348835 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 11 02:05:40 crc kubenswrapper[4743]: I1011 02:05:40.349187 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8vmmn" Oct 11 02:05:40 crc kubenswrapper[4743]: I1011 02:05:40.359403 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4tnjd"] Oct 11 02:05:40 crc kubenswrapper[4743]: I1011 02:05:40.402152 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7585335b-a755-40a1-b388-d90e2fa07121-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4tnjd\" (UID: \"7585335b-a755-40a1-b388-d90e2fa07121\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4tnjd" Oct 11 02:05:40 crc kubenswrapper[4743]: I1011 02:05:40.402206 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7585335b-a755-40a1-b388-d90e2fa07121-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4tnjd\" (UID: \"7585335b-a755-40a1-b388-d90e2fa07121\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4tnjd" Oct 11 02:05:40 crc kubenswrapper[4743]: I1011 02:05:40.402275 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/7585335b-a755-40a1-b388-d90e2fa07121-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4tnjd\" (UID: \"7585335b-a755-40a1-b388-d90e2fa07121\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4tnjd" Oct 11 02:05:40 crc kubenswrapper[4743]: I1011 02:05:40.402337 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/7585335b-a755-40a1-b388-d90e2fa07121-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4tnjd\" (UID: \"7585335b-a755-40a1-b388-d90e2fa07121\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4tnjd" Oct 11 02:05:40 crc kubenswrapper[4743]: I1011 02:05:40.402407 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7585335b-a755-40a1-b388-d90e2fa07121-ceph\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-4tnjd\" (UID: \"7585335b-a755-40a1-b388-d90e2fa07121\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4tnjd" Oct 11 02:05:40 crc kubenswrapper[4743]: I1011 02:05:40.402446 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/7585335b-a755-40a1-b388-d90e2fa07121-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4tnjd\" (UID: \"7585335b-a755-40a1-b388-d90e2fa07121\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4tnjd" Oct 11 02:05:40 crc kubenswrapper[4743]: I1011 02:05:40.402615 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7585335b-a755-40a1-b388-d90e2fa07121-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4tnjd\" (UID: \"7585335b-a755-40a1-b388-d90e2fa07121\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4tnjd" Oct 11 02:05:40 crc kubenswrapper[4743]: I1011 02:05:40.402845 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xj56f\" (UniqueName: \"kubernetes.io/projected/7585335b-a755-40a1-b388-d90e2fa07121-kube-api-access-xj56f\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4tnjd\" (UID: \"7585335b-a755-40a1-b388-d90e2fa07121\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4tnjd" Oct 11 02:05:40 crc kubenswrapper[4743]: I1011 02:05:40.504414 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7585335b-a755-40a1-b388-d90e2fa07121-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4tnjd\" (UID: \"7585335b-a755-40a1-b388-d90e2fa07121\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4tnjd" Oct 11 02:05:40 crc kubenswrapper[4743]: I1011 02:05:40.504452 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7585335b-a755-40a1-b388-d90e2fa07121-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4tnjd\" (UID: \"7585335b-a755-40a1-b388-d90e2fa07121\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4tnjd" Oct 11 02:05:40 crc kubenswrapper[4743]: I1011 02:05:40.504477 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/7585335b-a755-40a1-b388-d90e2fa07121-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4tnjd\" (UID: \"7585335b-a755-40a1-b388-d90e2fa07121\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4tnjd" Oct 11 02:05:40 crc kubenswrapper[4743]: I1011 02:05:40.504507 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/7585335b-a755-40a1-b388-d90e2fa07121-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4tnjd\" (UID: \"7585335b-a755-40a1-b388-d90e2fa07121\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4tnjd" Oct 11 02:05:40 crc kubenswrapper[4743]: I1011 02:05:40.504539 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7585335b-a755-40a1-b388-d90e2fa07121-ceph\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4tnjd\" (UID: \"7585335b-a755-40a1-b388-d90e2fa07121\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4tnjd" Oct 11 02:05:40 crc kubenswrapper[4743]: I1011 02:05:40.504561 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/7585335b-a755-40a1-b388-d90e2fa07121-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4tnjd\" (UID: \"7585335b-a755-40a1-b388-d90e2fa07121\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4tnjd" Oct 11 02:05:40 crc kubenswrapper[4743]: I1011 02:05:40.504622 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7585335b-a755-40a1-b388-d90e2fa07121-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4tnjd\" (UID: \"7585335b-a755-40a1-b388-d90e2fa07121\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4tnjd" Oct 11 02:05:40 crc kubenswrapper[4743]: I1011 02:05:40.504684 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xj56f\" (UniqueName: \"kubernetes.io/projected/7585335b-a755-40a1-b388-d90e2fa07121-kube-api-access-xj56f\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4tnjd\" (UID: \"7585335b-a755-40a1-b388-d90e2fa07121\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4tnjd" Oct 11 02:05:40 crc kubenswrapper[4743]: I1011 02:05:40.508094 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/7585335b-a755-40a1-b388-d90e2fa07121-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4tnjd\" (UID: \"7585335b-a755-40a1-b388-d90e2fa07121\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4tnjd" Oct 11 02:05:40 crc kubenswrapper[4743]: I1011 02:05:40.508963 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7585335b-a755-40a1-b388-d90e2fa07121-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4tnjd\" (UID: \"7585335b-a755-40a1-b388-d90e2fa07121\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4tnjd" Oct 11 02:05:40 crc kubenswrapper[4743]: I1011 02:05:40.509237 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7585335b-a755-40a1-b388-d90e2fa07121-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4tnjd\" (UID: \"7585335b-a755-40a1-b388-d90e2fa07121\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4tnjd" Oct 11 02:05:40 crc kubenswrapper[4743]: I1011 02:05:40.509720 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/7585335b-a755-40a1-b388-d90e2fa07121-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4tnjd\" (UID: \"7585335b-a755-40a1-b388-d90e2fa07121\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4tnjd" Oct 11 02:05:40 crc kubenswrapper[4743]: I1011 02:05:40.509909 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/7585335b-a755-40a1-b388-d90e2fa07121-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4tnjd\" (UID: \"7585335b-a755-40a1-b388-d90e2fa07121\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4tnjd" Oct 11 02:05:40 crc kubenswrapper[4743]: I1011 02:05:40.510456 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7585335b-a755-40a1-b388-d90e2fa07121-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4tnjd\" (UID: \"7585335b-a755-40a1-b388-d90e2fa07121\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4tnjd" Oct 11 02:05:40 crc kubenswrapper[4743]: I1011 02:05:40.511507 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" 
(UniqueName: \"kubernetes.io/secret/7585335b-a755-40a1-b388-d90e2fa07121-ceph\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4tnjd\" (UID: \"7585335b-a755-40a1-b388-d90e2fa07121\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4tnjd" Oct 11 02:05:40 crc kubenswrapper[4743]: I1011 02:05:40.521406 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xj56f\" (UniqueName: \"kubernetes.io/projected/7585335b-a755-40a1-b388-d90e2fa07121-kube-api-access-xj56f\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4tnjd\" (UID: \"7585335b-a755-40a1-b388-d90e2fa07121\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4tnjd" Oct 11 02:05:40 crc kubenswrapper[4743]: I1011 02:05:40.681254 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4tnjd" Oct 11 02:05:41 crc kubenswrapper[4743]: I1011 02:05:41.277839 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4tnjd"] Oct 11 02:05:42 crc kubenswrapper[4743]: I1011 02:05:42.239265 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4tnjd" event={"ID":"7585335b-a755-40a1-b388-d90e2fa07121","Type":"ContainerStarted","Data":"96e301c9162236e45c4a63a0b0e7d58a52d1f6e36882c63236e28534d22dcdc4"} Oct 11 02:05:42 crc kubenswrapper[4743]: I1011 02:05:42.239906 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4tnjd" event={"ID":"7585335b-a755-40a1-b388-d90e2fa07121","Type":"ContainerStarted","Data":"3fc0cf747490795cef9c411bcb2852ba23efb6a87149d2d5b9dbb9dd32fd1659"} Oct 11 02:05:42 crc kubenswrapper[4743]: I1011 02:05:42.260746 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4tnjd" 
podStartSLOduration=1.722100135 podStartE2EDuration="2.260730631s" podCreationTimestamp="2025-10-11 02:05:40 +0000 UTC" firstStartedPulling="2025-10-11 02:05:41.281852934 +0000 UTC m=+4435.934833331" lastFinishedPulling="2025-10-11 02:05:41.82048343 +0000 UTC m=+4436.473463827" observedRunningTime="2025-10-11 02:05:42.252446386 +0000 UTC m=+4436.905426783" watchObservedRunningTime="2025-10-11 02:05:42.260730631 +0000 UTC m=+4436.913711028" Oct 11 02:05:45 crc kubenswrapper[4743]: I1011 02:05:45.091533 4743 scope.go:117] "RemoveContainer" containerID="97301fbf68f97bd61a41cd45f206f9ebf6f1a6c4391a359908693157e0acfc57" Oct 11 02:05:45 crc kubenswrapper[4743]: E1011 02:05:45.092300 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:05:56 crc kubenswrapper[4743]: I1011 02:05:56.109131 4743 scope.go:117] "RemoveContainer" containerID="97301fbf68f97bd61a41cd45f206f9ebf6f1a6c4391a359908693157e0acfc57" Oct 11 02:05:56 crc kubenswrapper[4743]: E1011 02:05:56.110455 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:06:09 crc kubenswrapper[4743]: I1011 02:06:09.091999 4743 scope.go:117] "RemoveContainer" containerID="97301fbf68f97bd61a41cd45f206f9ebf6f1a6c4391a359908693157e0acfc57" Oct 11 02:06:09 crc 
kubenswrapper[4743]: E1011 02:06:09.093340 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:06:24 crc kubenswrapper[4743]: I1011 02:06:24.091971 4743 scope.go:117] "RemoveContainer" containerID="97301fbf68f97bd61a41cd45f206f9ebf6f1a6c4391a359908693157e0acfc57" Oct 11 02:06:24 crc kubenswrapper[4743]: E1011 02:06:24.092990 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:06:37 crc kubenswrapper[4743]: I1011 02:06:37.092750 4743 scope.go:117] "RemoveContainer" containerID="97301fbf68f97bd61a41cd45f206f9ebf6f1a6c4391a359908693157e0acfc57" Oct 11 02:06:37 crc kubenswrapper[4743]: E1011 02:06:37.094051 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:06:48 crc kubenswrapper[4743]: I1011 02:06:48.093431 4743 scope.go:117] "RemoveContainer" containerID="97301fbf68f97bd61a41cd45f206f9ebf6f1a6c4391a359908693157e0acfc57" Oct 
11 02:06:48 crc kubenswrapper[4743]: E1011 02:06:48.094763 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:07:01 crc kubenswrapper[4743]: I1011 02:07:01.092697 4743 scope.go:117] "RemoveContainer" containerID="97301fbf68f97bd61a41cd45f206f9ebf6f1a6c4391a359908693157e0acfc57" Oct 11 02:07:01 crc kubenswrapper[4743]: E1011 02:07:01.093613 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:07:13 crc kubenswrapper[4743]: I1011 02:07:13.091780 4743 scope.go:117] "RemoveContainer" containerID="97301fbf68f97bd61a41cd45f206f9ebf6f1a6c4391a359908693157e0acfc57" Oct 11 02:07:13 crc kubenswrapper[4743]: E1011 02:07:13.092823 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:07:28 crc kubenswrapper[4743]: I1011 02:07:28.093466 4743 scope.go:117] "RemoveContainer" 
containerID="97301fbf68f97bd61a41cd45f206f9ebf6f1a6c4391a359908693157e0acfc57" Oct 11 02:07:28 crc kubenswrapper[4743]: E1011 02:07:28.094307 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:07:43 crc kubenswrapper[4743]: I1011 02:07:43.091466 4743 scope.go:117] "RemoveContainer" containerID="97301fbf68f97bd61a41cd45f206f9ebf6f1a6c4391a359908693157e0acfc57" Oct 11 02:07:43 crc kubenswrapper[4743]: E1011 02:07:43.092218 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:07:56 crc kubenswrapper[4743]: I1011 02:07:56.101365 4743 scope.go:117] "RemoveContainer" containerID="97301fbf68f97bd61a41cd45f206f9ebf6f1a6c4391a359908693157e0acfc57" Oct 11 02:07:56 crc kubenswrapper[4743]: E1011 02:07:56.102318 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:08:07 crc kubenswrapper[4743]: I1011 02:08:07.180285 4743 trace.go:236] 
Trace[1831962929]: "Calculate volume metrics of registry-storage for pod openshift-image-registry/image-registry-66df7c8f76-lhnj4" (11-Oct-2025 02:08:05.903) (total time: 1276ms): Oct 11 02:08:07 crc kubenswrapper[4743]: Trace[1831962929]: [1.276877585s] [1.276877585s] END Oct 11 02:08:09 crc kubenswrapper[4743]: I1011 02:08:09.092329 4743 scope.go:117] "RemoveContainer" containerID="97301fbf68f97bd61a41cd45f206f9ebf6f1a6c4391a359908693157e0acfc57" Oct 11 02:08:09 crc kubenswrapper[4743]: E1011 02:08:09.093172 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:08:21 crc kubenswrapper[4743]: I1011 02:08:21.092193 4743 scope.go:117] "RemoveContainer" containerID="97301fbf68f97bd61a41cd45f206f9ebf6f1a6c4391a359908693157e0acfc57" Oct 11 02:08:22 crc kubenswrapper[4743]: I1011 02:08:22.205055 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" event={"ID":"add92263-e252-446b-95de-092585b4357f","Type":"ContainerStarted","Data":"d142e23db0b3ed02fb38edd4569166021ea8fee0edf9a6ae9288591f32b91fb4"} Oct 11 02:09:33 crc kubenswrapper[4743]: I1011 02:09:33.038843 4743 generic.go:334] "Generic (PLEG): container finished" podID="7585335b-a755-40a1-b388-d90e2fa07121" containerID="96e301c9162236e45c4a63a0b0e7d58a52d1f6e36882c63236e28534d22dcdc4" exitCode=0 Oct 11 02:09:33 crc kubenswrapper[4743]: I1011 02:09:33.038959 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4tnjd" 
event={"ID":"7585335b-a755-40a1-b388-d90e2fa07121","Type":"ContainerDied","Data":"96e301c9162236e45c4a63a0b0e7d58a52d1f6e36882c63236e28534d22dcdc4"} Oct 11 02:09:34 crc kubenswrapper[4743]: I1011 02:09:34.546413 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4tnjd" Oct 11 02:09:34 crc kubenswrapper[4743]: I1011 02:09:34.584784 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/7585335b-a755-40a1-b388-d90e2fa07121-ceilometer-compute-config-data-0\") pod \"7585335b-a755-40a1-b388-d90e2fa07121\" (UID: \"7585335b-a755-40a1-b388-d90e2fa07121\") " Oct 11 02:09:34 crc kubenswrapper[4743]: I1011 02:09:34.584918 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7585335b-a755-40a1-b388-d90e2fa07121-ssh-key\") pod \"7585335b-a755-40a1-b388-d90e2fa07121\" (UID: \"7585335b-a755-40a1-b388-d90e2fa07121\") " Oct 11 02:09:34 crc kubenswrapper[4743]: I1011 02:09:34.584994 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/7585335b-a755-40a1-b388-d90e2fa07121-ceilometer-compute-config-data-2\") pod \"7585335b-a755-40a1-b388-d90e2fa07121\" (UID: \"7585335b-a755-40a1-b388-d90e2fa07121\") " Oct 11 02:09:34 crc kubenswrapper[4743]: I1011 02:09:34.585072 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xj56f\" (UniqueName: \"kubernetes.io/projected/7585335b-a755-40a1-b388-d90e2fa07121-kube-api-access-xj56f\") pod \"7585335b-a755-40a1-b388-d90e2fa07121\" (UID: \"7585335b-a755-40a1-b388-d90e2fa07121\") " Oct 11 02:09:34 crc kubenswrapper[4743]: I1011 02:09:34.585111 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/7585335b-a755-40a1-b388-d90e2fa07121-ceilometer-compute-config-data-1\") pod \"7585335b-a755-40a1-b388-d90e2fa07121\" (UID: \"7585335b-a755-40a1-b388-d90e2fa07121\") " Oct 11 02:09:34 crc kubenswrapper[4743]: I1011 02:09:34.585144 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7585335b-a755-40a1-b388-d90e2fa07121-inventory\") pod \"7585335b-a755-40a1-b388-d90e2fa07121\" (UID: \"7585335b-a755-40a1-b388-d90e2fa07121\") " Oct 11 02:09:34 crc kubenswrapper[4743]: I1011 02:09:34.585201 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7585335b-a755-40a1-b388-d90e2fa07121-telemetry-combined-ca-bundle\") pod \"7585335b-a755-40a1-b388-d90e2fa07121\" (UID: \"7585335b-a755-40a1-b388-d90e2fa07121\") " Oct 11 02:09:34 crc kubenswrapper[4743]: I1011 02:09:34.585225 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7585335b-a755-40a1-b388-d90e2fa07121-ceph\") pod \"7585335b-a755-40a1-b388-d90e2fa07121\" (UID: \"7585335b-a755-40a1-b388-d90e2fa07121\") " Oct 11 02:09:34 crc kubenswrapper[4743]: I1011 02:09:34.591019 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7585335b-a755-40a1-b388-d90e2fa07121-kube-api-access-xj56f" (OuterVolumeSpecName: "kube-api-access-xj56f") pod "7585335b-a755-40a1-b388-d90e2fa07121" (UID: "7585335b-a755-40a1-b388-d90e2fa07121"). InnerVolumeSpecName "kube-api-access-xj56f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 02:09:34 crc kubenswrapper[4743]: I1011 02:09:34.593071 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7585335b-a755-40a1-b388-d90e2fa07121-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "7585335b-a755-40a1-b388-d90e2fa07121" (UID: "7585335b-a755-40a1-b388-d90e2fa07121"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 02:09:34 crc kubenswrapper[4743]: I1011 02:09:34.593423 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7585335b-a755-40a1-b388-d90e2fa07121-ceph" (OuterVolumeSpecName: "ceph") pod "7585335b-a755-40a1-b388-d90e2fa07121" (UID: "7585335b-a755-40a1-b388-d90e2fa07121"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 02:09:34 crc kubenswrapper[4743]: I1011 02:09:34.617808 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7585335b-a755-40a1-b388-d90e2fa07121-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "7585335b-a755-40a1-b388-d90e2fa07121" (UID: "7585335b-a755-40a1-b388-d90e2fa07121"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 02:09:34 crc kubenswrapper[4743]: I1011 02:09:34.619053 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7585335b-a755-40a1-b388-d90e2fa07121-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "7585335b-a755-40a1-b388-d90e2fa07121" (UID: "7585335b-a755-40a1-b388-d90e2fa07121"). InnerVolumeSpecName "ceilometer-compute-config-data-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 02:09:34 crc kubenswrapper[4743]: I1011 02:09:34.632893 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7585335b-a755-40a1-b388-d90e2fa07121-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "7585335b-a755-40a1-b388-d90e2fa07121" (UID: "7585335b-a755-40a1-b388-d90e2fa07121"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 02:09:34 crc kubenswrapper[4743]: I1011 02:09:34.653932 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7585335b-a755-40a1-b388-d90e2fa07121-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7585335b-a755-40a1-b388-d90e2fa07121" (UID: "7585335b-a755-40a1-b388-d90e2fa07121"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 02:09:34 crc kubenswrapper[4743]: I1011 02:09:34.660044 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7585335b-a755-40a1-b388-d90e2fa07121-inventory" (OuterVolumeSpecName: "inventory") pod "7585335b-a755-40a1-b388-d90e2fa07121" (UID: "7585335b-a755-40a1-b388-d90e2fa07121"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 02:09:34 crc kubenswrapper[4743]: I1011 02:09:34.687995 4743 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7585335b-a755-40a1-b388-d90e2fa07121-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 02:09:34 crc kubenswrapper[4743]: I1011 02:09:34.688034 4743 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7585335b-a755-40a1-b388-d90e2fa07121-ceph\") on node \"crc\" DevicePath \"\"" Oct 11 02:09:34 crc kubenswrapper[4743]: I1011 02:09:34.688045 4743 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/7585335b-a755-40a1-b388-d90e2fa07121-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Oct 11 02:09:34 crc kubenswrapper[4743]: I1011 02:09:34.688069 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7585335b-a755-40a1-b388-d90e2fa07121-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 11 02:09:34 crc kubenswrapper[4743]: I1011 02:09:34.688081 4743 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/7585335b-a755-40a1-b388-d90e2fa07121-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Oct 11 02:09:34 crc kubenswrapper[4743]: I1011 02:09:34.688091 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xj56f\" (UniqueName: \"kubernetes.io/projected/7585335b-a755-40a1-b388-d90e2fa07121-kube-api-access-xj56f\") on node \"crc\" DevicePath \"\"" Oct 11 02:09:34 crc kubenswrapper[4743]: I1011 02:09:34.688101 4743 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/7585335b-a755-40a1-b388-d90e2fa07121-ceilometer-compute-config-data-1\") on 
node \"crc\" DevicePath \"\"" Oct 11 02:09:34 crc kubenswrapper[4743]: I1011 02:09:34.688114 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7585335b-a755-40a1-b388-d90e2fa07121-inventory\") on node \"crc\" DevicePath \"\"" Oct 11 02:09:35 crc kubenswrapper[4743]: I1011 02:09:35.061344 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4tnjd" event={"ID":"7585335b-a755-40a1-b388-d90e2fa07121","Type":"ContainerDied","Data":"3fc0cf747490795cef9c411bcb2852ba23efb6a87149d2d5b9dbb9dd32fd1659"} Oct 11 02:09:35 crc kubenswrapper[4743]: I1011 02:09:35.061730 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3fc0cf747490795cef9c411bcb2852ba23efb6a87149d2d5b9dbb9dd32fd1659" Oct 11 02:09:35 crc kubenswrapper[4743]: I1011 02:09:35.061440 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4tnjd" Oct 11 02:09:35 crc kubenswrapper[4743]: I1011 02:09:35.162801 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-bwgfg"] Oct 11 02:09:35 crc kubenswrapper[4743]: E1011 02:09:35.163248 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7585335b-a755-40a1-b388-d90e2fa07121" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 11 02:09:35 crc kubenswrapper[4743]: I1011 02:09:35.163265 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="7585335b-a755-40a1-b388-d90e2fa07121" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 11 02:09:35 crc kubenswrapper[4743]: I1011 02:09:35.163500 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="7585335b-a755-40a1-b388-d90e2fa07121" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 11 02:09:35 crc kubenswrapper[4743]: I1011 02:09:35.164244 
4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-bwgfg" Oct 11 02:09:35 crc kubenswrapper[4743]: I1011 02:09:35.166321 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8vmmn" Oct 11 02:09:35 crc kubenswrapper[4743]: I1011 02:09:35.166578 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 11 02:09:35 crc kubenswrapper[4743]: I1011 02:09:35.166721 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 11 02:09:35 crc kubenswrapper[4743]: I1011 02:09:35.166893 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 11 02:09:35 crc kubenswrapper[4743]: I1011 02:09:35.169120 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 11 02:09:35 crc kubenswrapper[4743]: I1011 02:09:35.179682 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-ipmi-config-data" Oct 11 02:09:35 crc kubenswrapper[4743]: I1011 02:09:35.180706 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-bwgfg"] Oct 11 02:09:35 crc kubenswrapper[4743]: I1011 02:09:35.205525 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/f10a464d-943b-4c74-88f8-7d76dbdac358-ceilometer-ipmi-config-data-1\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-bwgfg\" (UID: \"f10a464d-943b-4c74-88f8-7d76dbdac358\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-bwgfg" Oct 11 02:09:35 crc kubenswrapper[4743]: I1011 02:09:35.205707 4743 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ts2vx\" (UniqueName: \"kubernetes.io/projected/f10a464d-943b-4c74-88f8-7d76dbdac358-kube-api-access-ts2vx\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-bwgfg\" (UID: \"f10a464d-943b-4c74-88f8-7d76dbdac358\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-bwgfg" Oct 11 02:09:35 crc kubenswrapper[4743]: I1011 02:09:35.206331 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/f10a464d-943b-4c74-88f8-7d76dbdac358-ceilometer-ipmi-config-data-0\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-bwgfg\" (UID: \"f10a464d-943b-4c74-88f8-7d76dbdac358\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-bwgfg" Oct 11 02:09:35 crc kubenswrapper[4743]: I1011 02:09:35.206616 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f10a464d-943b-4c74-88f8-7d76dbdac358-ssh-key\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-bwgfg\" (UID: \"f10a464d-943b-4c74-88f8-7d76dbdac358\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-bwgfg" Oct 11 02:09:35 crc kubenswrapper[4743]: I1011 02:09:35.206851 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f10a464d-943b-4c74-88f8-7d76dbdac358-ceph\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-bwgfg\" (UID: \"f10a464d-943b-4c74-88f8-7d76dbdac358\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-bwgfg" Oct 11 02:09:35 crc kubenswrapper[4743]: I1011 02:09:35.206942 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f10a464d-943b-4c74-88f8-7d76dbdac358-inventory\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-bwgfg\" (UID: \"f10a464d-943b-4c74-88f8-7d76dbdac358\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-bwgfg" Oct 11 02:09:35 crc kubenswrapper[4743]: I1011 02:09:35.207047 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f10a464d-943b-4c74-88f8-7d76dbdac358-telemetry-power-monitoring-combined-ca-bundle\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-bwgfg\" (UID: \"f10a464d-943b-4c74-88f8-7d76dbdac358\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-bwgfg" Oct 11 02:09:35 crc kubenswrapper[4743]: I1011 02:09:35.208755 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/f10a464d-943b-4c74-88f8-7d76dbdac358-ceilometer-ipmi-config-data-2\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-bwgfg\" (UID: \"f10a464d-943b-4c74-88f8-7d76dbdac358\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-bwgfg" Oct 11 02:09:35 crc kubenswrapper[4743]: I1011 02:09:35.310272 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/f10a464d-943b-4c74-88f8-7d76dbdac358-ceilometer-ipmi-config-data-0\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-bwgfg\" (UID: \"f10a464d-943b-4c74-88f8-7d76dbdac358\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-bwgfg" Oct 11 02:09:35 crc kubenswrapper[4743]: I1011 02:09:35.310387 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" 
(UniqueName: \"kubernetes.io/secret/f10a464d-943b-4c74-88f8-7d76dbdac358-ssh-key\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-bwgfg\" (UID: \"f10a464d-943b-4c74-88f8-7d76dbdac358\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-bwgfg" Oct 11 02:09:35 crc kubenswrapper[4743]: I1011 02:09:35.310442 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f10a464d-943b-4c74-88f8-7d76dbdac358-ceph\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-bwgfg\" (UID: \"f10a464d-943b-4c74-88f8-7d76dbdac358\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-bwgfg" Oct 11 02:09:35 crc kubenswrapper[4743]: I1011 02:09:35.310466 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f10a464d-943b-4c74-88f8-7d76dbdac358-inventory\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-bwgfg\" (UID: \"f10a464d-943b-4c74-88f8-7d76dbdac358\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-bwgfg" Oct 11 02:09:35 crc kubenswrapper[4743]: I1011 02:09:35.310486 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f10a464d-943b-4c74-88f8-7d76dbdac358-telemetry-power-monitoring-combined-ca-bundle\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-bwgfg\" (UID: \"f10a464d-943b-4c74-88f8-7d76dbdac358\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-bwgfg" Oct 11 02:09:35 crc kubenswrapper[4743]: I1011 02:09:35.310550 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/f10a464d-943b-4c74-88f8-7d76dbdac358-ceilometer-ipmi-config-data-2\") pod 
\"telemetry-power-monitoring-edpm-deployment-openstack-edpm-bwgfg\" (UID: \"f10a464d-943b-4c74-88f8-7d76dbdac358\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-bwgfg" Oct 11 02:09:35 crc kubenswrapper[4743]: I1011 02:09:35.310580 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/f10a464d-943b-4c74-88f8-7d76dbdac358-ceilometer-ipmi-config-data-1\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-bwgfg\" (UID: \"f10a464d-943b-4c74-88f8-7d76dbdac358\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-bwgfg" Oct 11 02:09:35 crc kubenswrapper[4743]: I1011 02:09:35.310614 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ts2vx\" (UniqueName: \"kubernetes.io/projected/f10a464d-943b-4c74-88f8-7d76dbdac358-kube-api-access-ts2vx\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-bwgfg\" (UID: \"f10a464d-943b-4c74-88f8-7d76dbdac358\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-bwgfg" Oct 11 02:09:35 crc kubenswrapper[4743]: I1011 02:09:35.316985 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f10a464d-943b-4c74-88f8-7d76dbdac358-inventory\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-bwgfg\" (UID: \"f10a464d-943b-4c74-88f8-7d76dbdac358\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-bwgfg" Oct 11 02:09:35 crc kubenswrapper[4743]: I1011 02:09:35.317128 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/f10a464d-943b-4c74-88f8-7d76dbdac358-ceilometer-ipmi-config-data-1\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-bwgfg\" (UID: \"f10a464d-943b-4c74-88f8-7d76dbdac358\") " 
pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-bwgfg" Oct 11 02:09:35 crc kubenswrapper[4743]: I1011 02:09:35.317372 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f10a464d-943b-4c74-88f8-7d76dbdac358-telemetry-power-monitoring-combined-ca-bundle\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-bwgfg\" (UID: \"f10a464d-943b-4c74-88f8-7d76dbdac358\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-bwgfg" Oct 11 02:09:35 crc kubenswrapper[4743]: I1011 02:09:35.317482 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f10a464d-943b-4c74-88f8-7d76dbdac358-ssh-key\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-bwgfg\" (UID: \"f10a464d-943b-4c74-88f8-7d76dbdac358\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-bwgfg" Oct 11 02:09:35 crc kubenswrapper[4743]: I1011 02:09:35.318575 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/f10a464d-943b-4c74-88f8-7d76dbdac358-ceilometer-ipmi-config-data-0\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-bwgfg\" (UID: \"f10a464d-943b-4c74-88f8-7d76dbdac358\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-bwgfg" Oct 11 02:09:35 crc kubenswrapper[4743]: I1011 02:09:35.319216 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/f10a464d-943b-4c74-88f8-7d76dbdac358-ceilometer-ipmi-config-data-2\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-bwgfg\" (UID: \"f10a464d-943b-4c74-88f8-7d76dbdac358\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-bwgfg" Oct 11 02:09:35 crc 
kubenswrapper[4743]: I1011 02:09:35.326608 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f10a464d-943b-4c74-88f8-7d76dbdac358-ceph\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-bwgfg\" (UID: \"f10a464d-943b-4c74-88f8-7d76dbdac358\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-bwgfg" Oct 11 02:09:35 crc kubenswrapper[4743]: I1011 02:09:35.341005 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ts2vx\" (UniqueName: \"kubernetes.io/projected/f10a464d-943b-4c74-88f8-7d76dbdac358-kube-api-access-ts2vx\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-bwgfg\" (UID: \"f10a464d-943b-4c74-88f8-7d76dbdac358\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-bwgfg" Oct 11 02:09:35 crc kubenswrapper[4743]: I1011 02:09:35.484618 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-bwgfg" Oct 11 02:09:36 crc kubenswrapper[4743]: I1011 02:09:36.113848 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-bwgfg"] Oct 11 02:09:36 crc kubenswrapper[4743]: I1011 02:09:36.125500 4743 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 11 02:09:37 crc kubenswrapper[4743]: I1011 02:09:37.081923 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-bwgfg" event={"ID":"f10a464d-943b-4c74-88f8-7d76dbdac358","Type":"ContainerStarted","Data":"d1a320585dd1540b8feaab8e12baeea8512edd207ad7486a11c800e70d7a941b"} Oct 11 02:09:38 crc kubenswrapper[4743]: I1011 02:09:38.110785 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-bwgfg" event={"ID":"f10a464d-943b-4c74-88f8-7d76dbdac358","Type":"ContainerStarted","Data":"62c71862e75a3f0f504c234eb5bbec7e907dbf3aca7959f5ea11b7d9e19a6ff5"} Oct 11 02:09:38 crc kubenswrapper[4743]: I1011 02:09:38.129102 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-bwgfg" podStartSLOduration=2.384372818 podStartE2EDuration="3.129083137s" podCreationTimestamp="2025-10-11 02:09:35 +0000 UTC" firstStartedPulling="2025-10-11 02:09:36.125240694 +0000 UTC m=+4670.778221091" lastFinishedPulling="2025-10-11 02:09:36.869951013 +0000 UTC m=+4671.522931410" observedRunningTime="2025-10-11 02:09:38.124207647 +0000 UTC m=+4672.777188084" watchObservedRunningTime="2025-10-11 02:09:38.129083137 +0000 UTC m=+4672.782063554" Oct 11 02:09:45 crc kubenswrapper[4743]: I1011 02:09:45.849168 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xgg8n"] Oct 11 02:09:45 crc kubenswrapper[4743]: I1011 02:09:45.852366 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xgg8n" Oct 11 02:09:45 crc kubenswrapper[4743]: I1011 02:09:45.863818 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xgg8n"] Oct 11 02:09:45 crc kubenswrapper[4743]: I1011 02:09:45.958921 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9n8mn\" (UniqueName: \"kubernetes.io/projected/270c7eec-829c-45d1-917a-b5282cc246e7-kube-api-access-9n8mn\") pod \"certified-operators-xgg8n\" (UID: \"270c7eec-829c-45d1-917a-b5282cc246e7\") " pod="openshift-marketplace/certified-operators-xgg8n" Oct 11 02:09:45 crc kubenswrapper[4743]: I1011 02:09:45.959246 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/270c7eec-829c-45d1-917a-b5282cc246e7-utilities\") pod \"certified-operators-xgg8n\" (UID: \"270c7eec-829c-45d1-917a-b5282cc246e7\") " pod="openshift-marketplace/certified-operators-xgg8n" Oct 11 02:09:45 crc kubenswrapper[4743]: I1011 02:09:45.959387 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/270c7eec-829c-45d1-917a-b5282cc246e7-catalog-content\") pod \"certified-operators-xgg8n\" (UID: \"270c7eec-829c-45d1-917a-b5282cc246e7\") " pod="openshift-marketplace/certified-operators-xgg8n" Oct 11 02:09:46 crc kubenswrapper[4743]: I1011 02:09:46.061665 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/270c7eec-829c-45d1-917a-b5282cc246e7-utilities\") pod \"certified-operators-xgg8n\" (UID: \"270c7eec-829c-45d1-917a-b5282cc246e7\") " pod="openshift-marketplace/certified-operators-xgg8n" Oct 11 02:09:46 crc kubenswrapper[4743]: I1011 02:09:46.061793 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/270c7eec-829c-45d1-917a-b5282cc246e7-catalog-content\") pod \"certified-operators-xgg8n\" (UID: \"270c7eec-829c-45d1-917a-b5282cc246e7\") " pod="openshift-marketplace/certified-operators-xgg8n" Oct 11 02:09:46 crc kubenswrapper[4743]: I1011 02:09:46.061846 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9n8mn\" (UniqueName: \"kubernetes.io/projected/270c7eec-829c-45d1-917a-b5282cc246e7-kube-api-access-9n8mn\") pod \"certified-operators-xgg8n\" (UID: \"270c7eec-829c-45d1-917a-b5282cc246e7\") " pod="openshift-marketplace/certified-operators-xgg8n" Oct 11 02:09:46 crc kubenswrapper[4743]: I1011 02:09:46.062214 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/270c7eec-829c-45d1-917a-b5282cc246e7-utilities\") pod \"certified-operators-xgg8n\" (UID: \"270c7eec-829c-45d1-917a-b5282cc246e7\") " pod="openshift-marketplace/certified-operators-xgg8n" Oct 11 02:09:46 crc kubenswrapper[4743]: I1011 02:09:46.062462 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/270c7eec-829c-45d1-917a-b5282cc246e7-catalog-content\") pod \"certified-operators-xgg8n\" (UID: \"270c7eec-829c-45d1-917a-b5282cc246e7\") " pod="openshift-marketplace/certified-operators-xgg8n" Oct 11 02:09:46 crc kubenswrapper[4743]: I1011 02:09:46.084493 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9n8mn\" (UniqueName: \"kubernetes.io/projected/270c7eec-829c-45d1-917a-b5282cc246e7-kube-api-access-9n8mn\") pod \"certified-operators-xgg8n\" (UID: \"270c7eec-829c-45d1-917a-b5282cc246e7\") " pod="openshift-marketplace/certified-operators-xgg8n" Oct 11 02:09:46 crc kubenswrapper[4743]: I1011 02:09:46.182709 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xgg8n" Oct 11 02:09:46 crc kubenswrapper[4743]: I1011 02:09:46.681452 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xgg8n"] Oct 11 02:09:47 crc kubenswrapper[4743]: I1011 02:09:47.201038 4743 generic.go:334] "Generic (PLEG): container finished" podID="270c7eec-829c-45d1-917a-b5282cc246e7" containerID="773aec117266b64f9e22db2900b7edb4ef62487363d82967be05cfcfdf076bc8" exitCode=0 Oct 11 02:09:47 crc kubenswrapper[4743]: I1011 02:09:47.201443 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xgg8n" event={"ID":"270c7eec-829c-45d1-917a-b5282cc246e7","Type":"ContainerDied","Data":"773aec117266b64f9e22db2900b7edb4ef62487363d82967be05cfcfdf076bc8"} Oct 11 02:09:47 crc kubenswrapper[4743]: I1011 02:09:47.201485 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xgg8n" event={"ID":"270c7eec-829c-45d1-917a-b5282cc246e7","Type":"ContainerStarted","Data":"a5bfe5e6a688008c52af2b439524b229b3e4479c3b5a1e9191ded6428bcbecd6"} Oct 11 02:09:48 crc kubenswrapper[4743]: I1011 02:09:48.213738 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xgg8n" event={"ID":"270c7eec-829c-45d1-917a-b5282cc246e7","Type":"ContainerStarted","Data":"ea55c65bd5230d692023425bd5173225ede83da49afe878f55d9dcb7ebdaa203"} Oct 11 02:09:50 crc kubenswrapper[4743]: I1011 02:09:50.239140 4743 generic.go:334] "Generic (PLEG): container finished" podID="270c7eec-829c-45d1-917a-b5282cc246e7" containerID="ea55c65bd5230d692023425bd5173225ede83da49afe878f55d9dcb7ebdaa203" exitCode=0 Oct 11 02:09:50 crc kubenswrapper[4743]: I1011 02:09:50.239265 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xgg8n" 
event={"ID":"270c7eec-829c-45d1-917a-b5282cc246e7","Type":"ContainerDied","Data":"ea55c65bd5230d692023425bd5173225ede83da49afe878f55d9dcb7ebdaa203"} Oct 11 02:09:51 crc kubenswrapper[4743]: I1011 02:09:51.251580 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xgg8n" event={"ID":"270c7eec-829c-45d1-917a-b5282cc246e7","Type":"ContainerStarted","Data":"40f5e3e648fc267e34d142f525bd0fd00b6494f3dfd253854051e65ea50c2068"} Oct 11 02:09:51 crc kubenswrapper[4743]: I1011 02:09:51.291884 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xgg8n" podStartSLOduration=2.864041609 podStartE2EDuration="6.291833359s" podCreationTimestamp="2025-10-11 02:09:45 +0000 UTC" firstStartedPulling="2025-10-11 02:09:47.205628897 +0000 UTC m=+4681.858609324" lastFinishedPulling="2025-10-11 02:09:50.633420627 +0000 UTC m=+4685.286401074" observedRunningTime="2025-10-11 02:09:51.282469827 +0000 UTC m=+4685.935450264" watchObservedRunningTime="2025-10-11 02:09:51.291833359 +0000 UTC m=+4685.944813766" Oct 11 02:09:56 crc kubenswrapper[4743]: I1011 02:09:56.183958 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xgg8n" Oct 11 02:09:56 crc kubenswrapper[4743]: I1011 02:09:56.184722 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xgg8n" Oct 11 02:09:57 crc kubenswrapper[4743]: I1011 02:09:57.248566 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-xgg8n" podUID="270c7eec-829c-45d1-917a-b5282cc246e7" containerName="registry-server" probeResult="failure" output=< Oct 11 02:09:57 crc kubenswrapper[4743]: timeout: failed to connect service ":50051" within 1s Oct 11 02:09:57 crc kubenswrapper[4743]: > Oct 11 02:10:06 crc kubenswrapper[4743]: I1011 02:10:06.250471 4743 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xgg8n" Oct 11 02:10:06 crc kubenswrapper[4743]: I1011 02:10:06.324072 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xgg8n" Oct 11 02:10:06 crc kubenswrapper[4743]: I1011 02:10:06.503945 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xgg8n"] Oct 11 02:10:07 crc kubenswrapper[4743]: I1011 02:10:07.432409 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xgg8n" podUID="270c7eec-829c-45d1-917a-b5282cc246e7" containerName="registry-server" containerID="cri-o://40f5e3e648fc267e34d142f525bd0fd00b6494f3dfd253854051e65ea50c2068" gracePeriod=2 Oct 11 02:10:08 crc kubenswrapper[4743]: I1011 02:10:08.444360 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xgg8n" Oct 11 02:10:08 crc kubenswrapper[4743]: I1011 02:10:08.447142 4743 generic.go:334] "Generic (PLEG): container finished" podID="270c7eec-829c-45d1-917a-b5282cc246e7" containerID="40f5e3e648fc267e34d142f525bd0fd00b6494f3dfd253854051e65ea50c2068" exitCode=0 Oct 11 02:10:08 crc kubenswrapper[4743]: I1011 02:10:08.447185 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xgg8n" event={"ID":"270c7eec-829c-45d1-917a-b5282cc246e7","Type":"ContainerDied","Data":"40f5e3e648fc267e34d142f525bd0fd00b6494f3dfd253854051e65ea50c2068"} Oct 11 02:10:08 crc kubenswrapper[4743]: I1011 02:10:08.447212 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xgg8n" event={"ID":"270c7eec-829c-45d1-917a-b5282cc246e7","Type":"ContainerDied","Data":"a5bfe5e6a688008c52af2b439524b229b3e4479c3b5a1e9191ded6428bcbecd6"} Oct 11 02:10:08 crc kubenswrapper[4743]: I1011 
02:10:08.447231 4743 scope.go:117] "RemoveContainer" containerID="40f5e3e648fc267e34d142f525bd0fd00b6494f3dfd253854051e65ea50c2068" Oct 11 02:10:08 crc kubenswrapper[4743]: I1011 02:10:08.482574 4743 scope.go:117] "RemoveContainer" containerID="ea55c65bd5230d692023425bd5173225ede83da49afe878f55d9dcb7ebdaa203" Oct 11 02:10:08 crc kubenswrapper[4743]: I1011 02:10:08.511010 4743 scope.go:117] "RemoveContainer" containerID="773aec117266b64f9e22db2900b7edb4ef62487363d82967be05cfcfdf076bc8" Oct 11 02:10:08 crc kubenswrapper[4743]: I1011 02:10:08.560849 4743 scope.go:117] "RemoveContainer" containerID="40f5e3e648fc267e34d142f525bd0fd00b6494f3dfd253854051e65ea50c2068" Oct 11 02:10:08 crc kubenswrapper[4743]: E1011 02:10:08.561364 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40f5e3e648fc267e34d142f525bd0fd00b6494f3dfd253854051e65ea50c2068\": container with ID starting with 40f5e3e648fc267e34d142f525bd0fd00b6494f3dfd253854051e65ea50c2068 not found: ID does not exist" containerID="40f5e3e648fc267e34d142f525bd0fd00b6494f3dfd253854051e65ea50c2068" Oct 11 02:10:08 crc kubenswrapper[4743]: I1011 02:10:08.561411 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40f5e3e648fc267e34d142f525bd0fd00b6494f3dfd253854051e65ea50c2068"} err="failed to get container status \"40f5e3e648fc267e34d142f525bd0fd00b6494f3dfd253854051e65ea50c2068\": rpc error: code = NotFound desc = could not find container \"40f5e3e648fc267e34d142f525bd0fd00b6494f3dfd253854051e65ea50c2068\": container with ID starting with 40f5e3e648fc267e34d142f525bd0fd00b6494f3dfd253854051e65ea50c2068 not found: ID does not exist" Oct 11 02:10:08 crc kubenswrapper[4743]: I1011 02:10:08.561442 4743 scope.go:117] "RemoveContainer" containerID="ea55c65bd5230d692023425bd5173225ede83da49afe878f55d9dcb7ebdaa203" Oct 11 02:10:08 crc kubenswrapper[4743]: E1011 02:10:08.562198 4743 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea55c65bd5230d692023425bd5173225ede83da49afe878f55d9dcb7ebdaa203\": container with ID starting with ea55c65bd5230d692023425bd5173225ede83da49afe878f55d9dcb7ebdaa203 not found: ID does not exist" containerID="ea55c65bd5230d692023425bd5173225ede83da49afe878f55d9dcb7ebdaa203" Oct 11 02:10:08 crc kubenswrapper[4743]: I1011 02:10:08.562260 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea55c65bd5230d692023425bd5173225ede83da49afe878f55d9dcb7ebdaa203"} err="failed to get container status \"ea55c65bd5230d692023425bd5173225ede83da49afe878f55d9dcb7ebdaa203\": rpc error: code = NotFound desc = could not find container \"ea55c65bd5230d692023425bd5173225ede83da49afe878f55d9dcb7ebdaa203\": container with ID starting with ea55c65bd5230d692023425bd5173225ede83da49afe878f55d9dcb7ebdaa203 not found: ID does not exist" Oct 11 02:10:08 crc kubenswrapper[4743]: I1011 02:10:08.562316 4743 scope.go:117] "RemoveContainer" containerID="773aec117266b64f9e22db2900b7edb4ef62487363d82967be05cfcfdf076bc8" Oct 11 02:10:08 crc kubenswrapper[4743]: E1011 02:10:08.562739 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"773aec117266b64f9e22db2900b7edb4ef62487363d82967be05cfcfdf076bc8\": container with ID starting with 773aec117266b64f9e22db2900b7edb4ef62487363d82967be05cfcfdf076bc8 not found: ID does not exist" containerID="773aec117266b64f9e22db2900b7edb4ef62487363d82967be05cfcfdf076bc8" Oct 11 02:10:08 crc kubenswrapper[4743]: I1011 02:10:08.562791 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"773aec117266b64f9e22db2900b7edb4ef62487363d82967be05cfcfdf076bc8"} err="failed to get container status \"773aec117266b64f9e22db2900b7edb4ef62487363d82967be05cfcfdf076bc8\": rpc error: code = NotFound desc = could 
not find container \"773aec117266b64f9e22db2900b7edb4ef62487363d82967be05cfcfdf076bc8\": container with ID starting with 773aec117266b64f9e22db2900b7edb4ef62487363d82967be05cfcfdf076bc8 not found: ID does not exist" Oct 11 02:10:08 crc kubenswrapper[4743]: I1011 02:10:08.566754 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/270c7eec-829c-45d1-917a-b5282cc246e7-utilities\") pod \"270c7eec-829c-45d1-917a-b5282cc246e7\" (UID: \"270c7eec-829c-45d1-917a-b5282cc246e7\") " Oct 11 02:10:08 crc kubenswrapper[4743]: I1011 02:10:08.566804 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/270c7eec-829c-45d1-917a-b5282cc246e7-catalog-content\") pod \"270c7eec-829c-45d1-917a-b5282cc246e7\" (UID: \"270c7eec-829c-45d1-917a-b5282cc246e7\") " Oct 11 02:10:08 crc kubenswrapper[4743]: I1011 02:10:08.566912 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9n8mn\" (UniqueName: \"kubernetes.io/projected/270c7eec-829c-45d1-917a-b5282cc246e7-kube-api-access-9n8mn\") pod \"270c7eec-829c-45d1-917a-b5282cc246e7\" (UID: \"270c7eec-829c-45d1-917a-b5282cc246e7\") " Oct 11 02:10:08 crc kubenswrapper[4743]: I1011 02:10:08.568334 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/270c7eec-829c-45d1-917a-b5282cc246e7-utilities" (OuterVolumeSpecName: "utilities") pod "270c7eec-829c-45d1-917a-b5282cc246e7" (UID: "270c7eec-829c-45d1-917a-b5282cc246e7"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 02:10:08 crc kubenswrapper[4743]: I1011 02:10:08.573616 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/270c7eec-829c-45d1-917a-b5282cc246e7-kube-api-access-9n8mn" (OuterVolumeSpecName: "kube-api-access-9n8mn") pod "270c7eec-829c-45d1-917a-b5282cc246e7" (UID: "270c7eec-829c-45d1-917a-b5282cc246e7"). InnerVolumeSpecName "kube-api-access-9n8mn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 02:10:08 crc kubenswrapper[4743]: I1011 02:10:08.607630 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/270c7eec-829c-45d1-917a-b5282cc246e7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "270c7eec-829c-45d1-917a-b5282cc246e7" (UID: "270c7eec-829c-45d1-917a-b5282cc246e7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 02:10:08 crc kubenswrapper[4743]: I1011 02:10:08.669915 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/270c7eec-829c-45d1-917a-b5282cc246e7-utilities\") on node \"crc\" DevicePath \"\"" Oct 11 02:10:08 crc kubenswrapper[4743]: I1011 02:10:08.669959 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/270c7eec-829c-45d1-917a-b5282cc246e7-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 11 02:10:08 crc kubenswrapper[4743]: I1011 02:10:08.669973 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9n8mn\" (UniqueName: \"kubernetes.io/projected/270c7eec-829c-45d1-917a-b5282cc246e7-kube-api-access-9n8mn\") on node \"crc\" DevicePath \"\"" Oct 11 02:10:09 crc kubenswrapper[4743]: I1011 02:10:09.458199 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xgg8n" Oct 11 02:10:09 crc kubenswrapper[4743]: I1011 02:10:09.500543 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xgg8n"] Oct 11 02:10:09 crc kubenswrapper[4743]: I1011 02:10:09.517355 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xgg8n"] Oct 11 02:10:10 crc kubenswrapper[4743]: I1011 02:10:10.107487 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="270c7eec-829c-45d1-917a-b5282cc246e7" path="/var/lib/kubelet/pods/270c7eec-829c-45d1-917a-b5282cc246e7/volumes" Oct 11 02:10:44 crc kubenswrapper[4743]: I1011 02:10:44.458239 4743 patch_prober.go:28] interesting pod/machine-config-daemon-cvm72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 02:10:44 crc kubenswrapper[4743]: I1011 02:10:44.458761 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 02:11:14 crc kubenswrapper[4743]: I1011 02:11:14.458257 4743 patch_prober.go:28] interesting pod/machine-config-daemon-cvm72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 02:11:14 crc kubenswrapper[4743]: I1011 02:11:14.458926 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" 
podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 02:11:44 crc kubenswrapper[4743]: I1011 02:11:44.458522 4743 patch_prober.go:28] interesting pod/machine-config-daemon-cvm72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 02:11:44 crc kubenswrapper[4743]: I1011 02:11:44.459153 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 02:11:44 crc kubenswrapper[4743]: I1011 02:11:44.459210 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" Oct 11 02:11:44 crc kubenswrapper[4743]: I1011 02:11:44.460071 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d142e23db0b3ed02fb38edd4569166021ea8fee0edf9a6ae9288591f32b91fb4"} pod="openshift-machine-config-operator/machine-config-daemon-cvm72" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 11 02:11:44 crc kubenswrapper[4743]: I1011 02:11:44.460127 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" containerID="cri-o://d142e23db0b3ed02fb38edd4569166021ea8fee0edf9a6ae9288591f32b91fb4" gracePeriod=600 Oct 11 
02:11:44 crc kubenswrapper[4743]: I1011 02:11:44.596153 4743 generic.go:334] "Generic (PLEG): container finished" podID="add92263-e252-446b-95de-092585b4357f" containerID="d142e23db0b3ed02fb38edd4569166021ea8fee0edf9a6ae9288591f32b91fb4" exitCode=0 Oct 11 02:11:44 crc kubenswrapper[4743]: I1011 02:11:44.596212 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" event={"ID":"add92263-e252-446b-95de-092585b4357f","Type":"ContainerDied","Data":"d142e23db0b3ed02fb38edd4569166021ea8fee0edf9a6ae9288591f32b91fb4"} Oct 11 02:11:44 crc kubenswrapper[4743]: I1011 02:11:44.596620 4743 scope.go:117] "RemoveContainer" containerID="97301fbf68f97bd61a41cd45f206f9ebf6f1a6c4391a359908693157e0acfc57" Oct 11 02:11:45 crc kubenswrapper[4743]: I1011 02:11:45.612674 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" event={"ID":"add92263-e252-446b-95de-092585b4357f","Type":"ContainerStarted","Data":"8db380be5bc3ef378ff0592922955ef88122acec712f90fde34b6491175535db"} Oct 11 02:12:29 crc kubenswrapper[4743]: I1011 02:12:29.184816 4743 generic.go:334] "Generic (PLEG): container finished" podID="f10a464d-943b-4c74-88f8-7d76dbdac358" containerID="62c71862e75a3f0f504c234eb5bbec7e907dbf3aca7959f5ea11b7d9e19a6ff5" exitCode=0 Oct 11 02:12:29 crc kubenswrapper[4743]: I1011 02:12:29.184928 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-bwgfg" event={"ID":"f10a464d-943b-4c74-88f8-7d76dbdac358","Type":"ContainerDied","Data":"62c71862e75a3f0f504c234eb5bbec7e907dbf3aca7959f5ea11b7d9e19a6ff5"} Oct 11 02:12:30 crc kubenswrapper[4743]: I1011 02:12:30.820923 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-bwgfg" Oct 11 02:12:30 crc kubenswrapper[4743]: I1011 02:12:30.931293 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f10a464d-943b-4c74-88f8-7d76dbdac358-telemetry-power-monitoring-combined-ca-bundle\") pod \"f10a464d-943b-4c74-88f8-7d76dbdac358\" (UID: \"f10a464d-943b-4c74-88f8-7d76dbdac358\") " Oct 11 02:12:30 crc kubenswrapper[4743]: I1011 02:12:30.931679 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/f10a464d-943b-4c74-88f8-7d76dbdac358-ceilometer-ipmi-config-data-0\") pod \"f10a464d-943b-4c74-88f8-7d76dbdac358\" (UID: \"f10a464d-943b-4c74-88f8-7d76dbdac358\") " Oct 11 02:12:30 crc kubenswrapper[4743]: I1011 02:12:30.931730 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f10a464d-943b-4c74-88f8-7d76dbdac358-ssh-key\") pod \"f10a464d-943b-4c74-88f8-7d76dbdac358\" (UID: \"f10a464d-943b-4c74-88f8-7d76dbdac358\") " Oct 11 02:12:30 crc kubenswrapper[4743]: I1011 02:12:30.931802 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f10a464d-943b-4c74-88f8-7d76dbdac358-ceph\") pod \"f10a464d-943b-4c74-88f8-7d76dbdac358\" (UID: \"f10a464d-943b-4c74-88f8-7d76dbdac358\") " Oct 11 02:12:30 crc kubenswrapper[4743]: I1011 02:12:30.932058 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ts2vx\" (UniqueName: \"kubernetes.io/projected/f10a464d-943b-4c74-88f8-7d76dbdac358-kube-api-access-ts2vx\") pod \"f10a464d-943b-4c74-88f8-7d76dbdac358\" (UID: \"f10a464d-943b-4c74-88f8-7d76dbdac358\") " Oct 11 02:12:30 crc kubenswrapper[4743]: I1011 02:12:30.932144 
4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f10a464d-943b-4c74-88f8-7d76dbdac358-inventory\") pod \"f10a464d-943b-4c74-88f8-7d76dbdac358\" (UID: \"f10a464d-943b-4c74-88f8-7d76dbdac358\") " Oct 11 02:12:30 crc kubenswrapper[4743]: I1011 02:12:30.932199 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/f10a464d-943b-4c74-88f8-7d76dbdac358-ceilometer-ipmi-config-data-1\") pod \"f10a464d-943b-4c74-88f8-7d76dbdac358\" (UID: \"f10a464d-943b-4c74-88f8-7d76dbdac358\") " Oct 11 02:12:30 crc kubenswrapper[4743]: I1011 02:12:30.932732 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/f10a464d-943b-4c74-88f8-7d76dbdac358-ceilometer-ipmi-config-data-2\") pod \"f10a464d-943b-4c74-88f8-7d76dbdac358\" (UID: \"f10a464d-943b-4c74-88f8-7d76dbdac358\") " Oct 11 02:12:30 crc kubenswrapper[4743]: I1011 02:12:30.940155 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f10a464d-943b-4c74-88f8-7d76dbdac358-telemetry-power-monitoring-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-power-monitoring-combined-ca-bundle") pod "f10a464d-943b-4c74-88f8-7d76dbdac358" (UID: "f10a464d-943b-4c74-88f8-7d76dbdac358"). InnerVolumeSpecName "telemetry-power-monitoring-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 02:12:30 crc kubenswrapper[4743]: I1011 02:12:30.940749 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f10a464d-943b-4c74-88f8-7d76dbdac358-kube-api-access-ts2vx" (OuterVolumeSpecName: "kube-api-access-ts2vx") pod "f10a464d-943b-4c74-88f8-7d76dbdac358" (UID: "f10a464d-943b-4c74-88f8-7d76dbdac358"). InnerVolumeSpecName "kube-api-access-ts2vx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 02:12:30 crc kubenswrapper[4743]: I1011 02:12:30.943309 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f10a464d-943b-4c74-88f8-7d76dbdac358-ceph" (OuterVolumeSpecName: "ceph") pod "f10a464d-943b-4c74-88f8-7d76dbdac358" (UID: "f10a464d-943b-4c74-88f8-7d76dbdac358"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 02:12:30 crc kubenswrapper[4743]: I1011 02:12:30.969272 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f10a464d-943b-4c74-88f8-7d76dbdac358-ceilometer-ipmi-config-data-0" (OuterVolumeSpecName: "ceilometer-ipmi-config-data-0") pod "f10a464d-943b-4c74-88f8-7d76dbdac358" (UID: "f10a464d-943b-4c74-88f8-7d76dbdac358"). InnerVolumeSpecName "ceilometer-ipmi-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 02:12:30 crc kubenswrapper[4743]: I1011 02:12:30.981090 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f10a464d-943b-4c74-88f8-7d76dbdac358-ceilometer-ipmi-config-data-1" (OuterVolumeSpecName: "ceilometer-ipmi-config-data-1") pod "f10a464d-943b-4c74-88f8-7d76dbdac358" (UID: "f10a464d-943b-4c74-88f8-7d76dbdac358"). InnerVolumeSpecName "ceilometer-ipmi-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 02:12:30 crc kubenswrapper[4743]: I1011 02:12:30.985327 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f10a464d-943b-4c74-88f8-7d76dbdac358-ceilometer-ipmi-config-data-2" (OuterVolumeSpecName: "ceilometer-ipmi-config-data-2") pod "f10a464d-943b-4c74-88f8-7d76dbdac358" (UID: "f10a464d-943b-4c74-88f8-7d76dbdac358"). InnerVolumeSpecName "ceilometer-ipmi-config-data-2". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 02:12:31 crc kubenswrapper[4743]: I1011 02:12:31.006768 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f10a464d-943b-4c74-88f8-7d76dbdac358-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f10a464d-943b-4c74-88f8-7d76dbdac358" (UID: "f10a464d-943b-4c74-88f8-7d76dbdac358"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 02:12:31 crc kubenswrapper[4743]: I1011 02:12:31.007556 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f10a464d-943b-4c74-88f8-7d76dbdac358-inventory" (OuterVolumeSpecName: "inventory") pod "f10a464d-943b-4c74-88f8-7d76dbdac358" (UID: "f10a464d-943b-4c74-88f8-7d76dbdac358"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 02:12:31 crc kubenswrapper[4743]: I1011 02:12:31.036632 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ts2vx\" (UniqueName: \"kubernetes.io/projected/f10a464d-943b-4c74-88f8-7d76dbdac358-kube-api-access-ts2vx\") on node \"crc\" DevicePath \"\"" Oct 11 02:12:31 crc kubenswrapper[4743]: I1011 02:12:31.036681 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f10a464d-943b-4c74-88f8-7d76dbdac358-inventory\") on node \"crc\" DevicePath \"\"" Oct 11 02:12:31 crc kubenswrapper[4743]: I1011 02:12:31.036703 4743 reconciler_common.go:293] "Volume detached for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/f10a464d-943b-4c74-88f8-7d76dbdac358-ceilometer-ipmi-config-data-1\") on node \"crc\" DevicePath \"\"" Oct 11 02:12:31 crc kubenswrapper[4743]: I1011 02:12:31.036724 4743 reconciler_common.go:293] "Volume detached for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/f10a464d-943b-4c74-88f8-7d76dbdac358-ceilometer-ipmi-config-data-2\") on node \"crc\" 
DevicePath \"\"" Oct 11 02:12:31 crc kubenswrapper[4743]: I1011 02:12:31.036746 4743 reconciler_common.go:293] "Volume detached for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f10a464d-943b-4c74-88f8-7d76dbdac358-telemetry-power-monitoring-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 02:12:31 crc kubenswrapper[4743]: I1011 02:12:31.036768 4743 reconciler_common.go:293] "Volume detached for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/f10a464d-943b-4c74-88f8-7d76dbdac358-ceilometer-ipmi-config-data-0\") on node \"crc\" DevicePath \"\"" Oct 11 02:12:31 crc kubenswrapper[4743]: I1011 02:12:31.036786 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f10a464d-943b-4c74-88f8-7d76dbdac358-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 11 02:12:31 crc kubenswrapper[4743]: I1011 02:12:31.036807 4743 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f10a464d-943b-4c74-88f8-7d76dbdac358-ceph\") on node \"crc\" DevicePath \"\"" Oct 11 02:12:31 crc kubenswrapper[4743]: I1011 02:12:31.210756 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-bwgfg" event={"ID":"f10a464d-943b-4c74-88f8-7d76dbdac358","Type":"ContainerDied","Data":"d1a320585dd1540b8feaab8e12baeea8512edd207ad7486a11c800e70d7a941b"} Oct 11 02:12:31 crc kubenswrapper[4743]: I1011 02:12:31.210805 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d1a320585dd1540b8feaab8e12baeea8512edd207ad7486a11c800e70d7a941b" Oct 11 02:12:31 crc kubenswrapper[4743]: I1011 02:12:31.210904 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-bwgfg" Oct 11 02:12:31 crc kubenswrapper[4743]: I1011 02:12:31.408645 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-bvnfw"] Oct 11 02:12:31 crc kubenswrapper[4743]: E1011 02:12:31.409325 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="270c7eec-829c-45d1-917a-b5282cc246e7" containerName="extract-content" Oct 11 02:12:31 crc kubenswrapper[4743]: I1011 02:12:31.409348 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="270c7eec-829c-45d1-917a-b5282cc246e7" containerName="extract-content" Oct 11 02:12:31 crc kubenswrapper[4743]: E1011 02:12:31.409374 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="270c7eec-829c-45d1-917a-b5282cc246e7" containerName="extract-utilities" Oct 11 02:12:31 crc kubenswrapper[4743]: I1011 02:12:31.409384 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="270c7eec-829c-45d1-917a-b5282cc246e7" containerName="extract-utilities" Oct 11 02:12:31 crc kubenswrapper[4743]: E1011 02:12:31.409423 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="270c7eec-829c-45d1-917a-b5282cc246e7" containerName="registry-server" Oct 11 02:12:31 crc kubenswrapper[4743]: I1011 02:12:31.409434 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="270c7eec-829c-45d1-917a-b5282cc246e7" containerName="registry-server" Oct 11 02:12:31 crc kubenswrapper[4743]: E1011 02:12:31.409449 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f10a464d-943b-4c74-88f8-7d76dbdac358" containerName="telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam" Oct 11 02:12:31 crc kubenswrapper[4743]: I1011 02:12:31.409461 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f10a464d-943b-4c74-88f8-7d76dbdac358" containerName="telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam" Oct 11 02:12:31 crc 
kubenswrapper[4743]: I1011 02:12:31.409803 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="270c7eec-829c-45d1-917a-b5282cc246e7" containerName="registry-server" Oct 11 02:12:31 crc kubenswrapper[4743]: I1011 02:12:31.409841 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="f10a464d-943b-4c74-88f8-7d76dbdac358" containerName="telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam" Oct 11 02:12:31 crc kubenswrapper[4743]: I1011 02:12:31.410936 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-bvnfw" Oct 11 02:12:31 crc kubenswrapper[4743]: I1011 02:12:31.417283 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 11 02:12:31 crc kubenswrapper[4743]: I1011 02:12:31.417300 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 11 02:12:31 crc kubenswrapper[4743]: I1011 02:12:31.417429 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 11 02:12:31 crc kubenswrapper[4743]: I1011 02:12:31.417313 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8vmmn" Oct 11 02:12:31 crc kubenswrapper[4743]: I1011 02:12:31.417578 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 11 02:12:31 crc kubenswrapper[4743]: I1011 02:12:31.417733 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"logging-compute-config-data" Oct 11 02:12:31 crc kubenswrapper[4743]: I1011 02:12:31.421164 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-bvnfw"] Oct 11 02:12:31 crc kubenswrapper[4743]: I1011 02:12:31.554701 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/abded9bf-eca7-43d5-bd5b-531d44751777-logging-compute-config-data-1\") pod \"logging-edpm-deployment-openstack-edpm-ipam-bvnfw\" (UID: \"abded9bf-eca7-43d5-bd5b-531d44751777\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-bvnfw" Oct 11 02:12:31 crc kubenswrapper[4743]: I1011 02:12:31.554756 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/abded9bf-eca7-43d5-bd5b-531d44751777-logging-compute-config-data-0\") pod \"logging-edpm-deployment-openstack-edpm-ipam-bvnfw\" (UID: \"abded9bf-eca7-43d5-bd5b-531d44751777\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-bvnfw" Oct 11 02:12:31 crc kubenswrapper[4743]: I1011 02:12:31.554813 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/abded9bf-eca7-43d5-bd5b-531d44751777-ceph\") pod \"logging-edpm-deployment-openstack-edpm-ipam-bvnfw\" (UID: \"abded9bf-eca7-43d5-bd5b-531d44751777\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-bvnfw" Oct 11 02:12:31 crc kubenswrapper[4743]: I1011 02:12:31.554842 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/abded9bf-eca7-43d5-bd5b-531d44751777-inventory\") pod \"logging-edpm-deployment-openstack-edpm-ipam-bvnfw\" (UID: \"abded9bf-eca7-43d5-bd5b-531d44751777\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-bvnfw" Oct 11 02:12:31 crc kubenswrapper[4743]: I1011 02:12:31.554905 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/abded9bf-eca7-43d5-bd5b-531d44751777-ssh-key\") pod 
\"logging-edpm-deployment-openstack-edpm-ipam-bvnfw\" (UID: \"abded9bf-eca7-43d5-bd5b-531d44751777\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-bvnfw" Oct 11 02:12:31 crc kubenswrapper[4743]: I1011 02:12:31.554985 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-db5hf\" (UniqueName: \"kubernetes.io/projected/abded9bf-eca7-43d5-bd5b-531d44751777-kube-api-access-db5hf\") pod \"logging-edpm-deployment-openstack-edpm-ipam-bvnfw\" (UID: \"abded9bf-eca7-43d5-bd5b-531d44751777\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-bvnfw" Oct 11 02:12:31 crc kubenswrapper[4743]: I1011 02:12:31.656780 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/abded9bf-eca7-43d5-bd5b-531d44751777-ssh-key\") pod \"logging-edpm-deployment-openstack-edpm-ipam-bvnfw\" (UID: \"abded9bf-eca7-43d5-bd5b-531d44751777\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-bvnfw" Oct 11 02:12:31 crc kubenswrapper[4743]: I1011 02:12:31.656877 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-db5hf\" (UniqueName: \"kubernetes.io/projected/abded9bf-eca7-43d5-bd5b-531d44751777-kube-api-access-db5hf\") pod \"logging-edpm-deployment-openstack-edpm-ipam-bvnfw\" (UID: \"abded9bf-eca7-43d5-bd5b-531d44751777\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-bvnfw" Oct 11 02:12:31 crc kubenswrapper[4743]: I1011 02:12:31.657045 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/abded9bf-eca7-43d5-bd5b-531d44751777-logging-compute-config-data-1\") pod \"logging-edpm-deployment-openstack-edpm-ipam-bvnfw\" (UID: \"abded9bf-eca7-43d5-bd5b-531d44751777\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-bvnfw" Oct 11 02:12:31 crc kubenswrapper[4743]: I1011 
02:12:31.657074 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/abded9bf-eca7-43d5-bd5b-531d44751777-logging-compute-config-data-0\") pod \"logging-edpm-deployment-openstack-edpm-ipam-bvnfw\" (UID: \"abded9bf-eca7-43d5-bd5b-531d44751777\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-bvnfw" Oct 11 02:12:31 crc kubenswrapper[4743]: I1011 02:12:31.657123 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/abded9bf-eca7-43d5-bd5b-531d44751777-ceph\") pod \"logging-edpm-deployment-openstack-edpm-ipam-bvnfw\" (UID: \"abded9bf-eca7-43d5-bd5b-531d44751777\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-bvnfw" Oct 11 02:12:31 crc kubenswrapper[4743]: I1011 02:12:31.657146 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/abded9bf-eca7-43d5-bd5b-531d44751777-inventory\") pod \"logging-edpm-deployment-openstack-edpm-ipam-bvnfw\" (UID: \"abded9bf-eca7-43d5-bd5b-531d44751777\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-bvnfw" Oct 11 02:12:31 crc kubenswrapper[4743]: I1011 02:12:31.661897 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/abded9bf-eca7-43d5-bd5b-531d44751777-ssh-key\") pod \"logging-edpm-deployment-openstack-edpm-ipam-bvnfw\" (UID: \"abded9bf-eca7-43d5-bd5b-531d44751777\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-bvnfw" Oct 11 02:12:31 crc kubenswrapper[4743]: I1011 02:12:31.661974 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/abded9bf-eca7-43d5-bd5b-531d44751777-inventory\") pod \"logging-edpm-deployment-openstack-edpm-ipam-bvnfw\" (UID: \"abded9bf-eca7-43d5-bd5b-531d44751777\") " 
pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-bvnfw" Oct 11 02:12:31 crc kubenswrapper[4743]: I1011 02:12:31.662393 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/abded9bf-eca7-43d5-bd5b-531d44751777-logging-compute-config-data-0\") pod \"logging-edpm-deployment-openstack-edpm-ipam-bvnfw\" (UID: \"abded9bf-eca7-43d5-bd5b-531d44751777\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-bvnfw" Oct 11 02:12:31 crc kubenswrapper[4743]: I1011 02:12:31.663147 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/abded9bf-eca7-43d5-bd5b-531d44751777-ceph\") pod \"logging-edpm-deployment-openstack-edpm-ipam-bvnfw\" (UID: \"abded9bf-eca7-43d5-bd5b-531d44751777\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-bvnfw" Oct 11 02:12:31 crc kubenswrapper[4743]: I1011 02:12:31.675351 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/abded9bf-eca7-43d5-bd5b-531d44751777-logging-compute-config-data-1\") pod \"logging-edpm-deployment-openstack-edpm-ipam-bvnfw\" (UID: \"abded9bf-eca7-43d5-bd5b-531d44751777\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-bvnfw" Oct 11 02:12:31 crc kubenswrapper[4743]: I1011 02:12:31.684279 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-db5hf\" (UniqueName: \"kubernetes.io/projected/abded9bf-eca7-43d5-bd5b-531d44751777-kube-api-access-db5hf\") pod \"logging-edpm-deployment-openstack-edpm-ipam-bvnfw\" (UID: \"abded9bf-eca7-43d5-bd5b-531d44751777\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-bvnfw" Oct 11 02:12:31 crc kubenswrapper[4743]: I1011 02:12:31.728573 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-bvnfw" Oct 11 02:12:32 crc kubenswrapper[4743]: I1011 02:12:32.329891 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-bvnfw"] Oct 11 02:12:32 crc kubenswrapper[4743]: W1011 02:12:32.582199 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podabded9bf_eca7_43d5_bd5b_531d44751777.slice/crio-3edd983524284cb43341e3a38e4b80a407a588dc34dbda2d3bdb5b9820ad711f WatchSource:0}: Error finding container 3edd983524284cb43341e3a38e4b80a407a588dc34dbda2d3bdb5b9820ad711f: Status 404 returned error can't find the container with id 3edd983524284cb43341e3a38e4b80a407a588dc34dbda2d3bdb5b9820ad711f Oct 11 02:12:33 crc kubenswrapper[4743]: I1011 02:12:33.234271 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-bvnfw" event={"ID":"abded9bf-eca7-43d5-bd5b-531d44751777","Type":"ContainerStarted","Data":"3edd983524284cb43341e3a38e4b80a407a588dc34dbda2d3bdb5b9820ad711f"} Oct 11 02:12:34 crc kubenswrapper[4743]: I1011 02:12:34.245051 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-bvnfw" event={"ID":"abded9bf-eca7-43d5-bd5b-531d44751777","Type":"ContainerStarted","Data":"2ad599fc28e992e46e8ff7282e67a85537cae804f63f186fa8c69127109ba7be"} Oct 11 02:12:34 crc kubenswrapper[4743]: I1011 02:12:34.275590 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-bvnfw" podStartSLOduration=2.754931019 podStartE2EDuration="3.275562907s" podCreationTimestamp="2025-10-11 02:12:31 +0000 UTC" firstStartedPulling="2025-10-11 02:12:32.585133046 +0000 UTC m=+4847.238113453" lastFinishedPulling="2025-10-11 02:12:33.105764904 +0000 UTC m=+4847.758745341" 
observedRunningTime="2025-10-11 02:12:34.268535563 +0000 UTC m=+4848.921515970" watchObservedRunningTime="2025-10-11 02:12:34.275562907 +0000 UTC m=+4848.928543334" Oct 11 02:12:50 crc kubenswrapper[4743]: I1011 02:12:50.436691 4743 generic.go:334] "Generic (PLEG): container finished" podID="abded9bf-eca7-43d5-bd5b-531d44751777" containerID="2ad599fc28e992e46e8ff7282e67a85537cae804f63f186fa8c69127109ba7be" exitCode=0 Oct 11 02:12:50 crc kubenswrapper[4743]: I1011 02:12:50.436838 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-bvnfw" event={"ID":"abded9bf-eca7-43d5-bd5b-531d44751777","Type":"ContainerDied","Data":"2ad599fc28e992e46e8ff7282e67a85537cae804f63f186fa8c69127109ba7be"} Oct 11 02:12:51 crc kubenswrapper[4743]: I1011 02:12:51.925538 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-bvnfw" Oct 11 02:12:52 crc kubenswrapper[4743]: I1011 02:12:52.038796 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/abded9bf-eca7-43d5-bd5b-531d44751777-logging-compute-config-data-1\") pod \"abded9bf-eca7-43d5-bd5b-531d44751777\" (UID: \"abded9bf-eca7-43d5-bd5b-531d44751777\") " Oct 11 02:12:52 crc kubenswrapper[4743]: I1011 02:12:52.038922 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/abded9bf-eca7-43d5-bd5b-531d44751777-ssh-key\") pod \"abded9bf-eca7-43d5-bd5b-531d44751777\" (UID: \"abded9bf-eca7-43d5-bd5b-531d44751777\") " Oct 11 02:12:52 crc kubenswrapper[4743]: I1011 02:12:52.038974 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-db5hf\" (UniqueName: \"kubernetes.io/projected/abded9bf-eca7-43d5-bd5b-531d44751777-kube-api-access-db5hf\") pod 
\"abded9bf-eca7-43d5-bd5b-531d44751777\" (UID: \"abded9bf-eca7-43d5-bd5b-531d44751777\") " Oct 11 02:12:52 crc kubenswrapper[4743]: I1011 02:12:52.039164 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/abded9bf-eca7-43d5-bd5b-531d44751777-inventory\") pod \"abded9bf-eca7-43d5-bd5b-531d44751777\" (UID: \"abded9bf-eca7-43d5-bd5b-531d44751777\") " Oct 11 02:12:52 crc kubenswrapper[4743]: I1011 02:12:52.039231 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/abded9bf-eca7-43d5-bd5b-531d44751777-ceph\") pod \"abded9bf-eca7-43d5-bd5b-531d44751777\" (UID: \"abded9bf-eca7-43d5-bd5b-531d44751777\") " Oct 11 02:12:52 crc kubenswrapper[4743]: I1011 02:12:52.039272 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/abded9bf-eca7-43d5-bd5b-531d44751777-logging-compute-config-data-0\") pod \"abded9bf-eca7-43d5-bd5b-531d44751777\" (UID: \"abded9bf-eca7-43d5-bd5b-531d44751777\") " Oct 11 02:12:52 crc kubenswrapper[4743]: I1011 02:12:52.047001 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abded9bf-eca7-43d5-bd5b-531d44751777-ceph" (OuterVolumeSpecName: "ceph") pod "abded9bf-eca7-43d5-bd5b-531d44751777" (UID: "abded9bf-eca7-43d5-bd5b-531d44751777"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 02:12:52 crc kubenswrapper[4743]: I1011 02:12:52.062359 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abded9bf-eca7-43d5-bd5b-531d44751777-kube-api-access-db5hf" (OuterVolumeSpecName: "kube-api-access-db5hf") pod "abded9bf-eca7-43d5-bd5b-531d44751777" (UID: "abded9bf-eca7-43d5-bd5b-531d44751777"). InnerVolumeSpecName "kube-api-access-db5hf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 02:12:52 crc kubenswrapper[4743]: I1011 02:12:52.072921 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abded9bf-eca7-43d5-bd5b-531d44751777-logging-compute-config-data-0" (OuterVolumeSpecName: "logging-compute-config-data-0") pod "abded9bf-eca7-43d5-bd5b-531d44751777" (UID: "abded9bf-eca7-43d5-bd5b-531d44751777"). InnerVolumeSpecName "logging-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 02:12:52 crc kubenswrapper[4743]: I1011 02:12:52.074745 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abded9bf-eca7-43d5-bd5b-531d44751777-logging-compute-config-data-1" (OuterVolumeSpecName: "logging-compute-config-data-1") pod "abded9bf-eca7-43d5-bd5b-531d44751777" (UID: "abded9bf-eca7-43d5-bd5b-531d44751777"). InnerVolumeSpecName "logging-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 02:12:52 crc kubenswrapper[4743]: I1011 02:12:52.086763 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abded9bf-eca7-43d5-bd5b-531d44751777-inventory" (OuterVolumeSpecName: "inventory") pod "abded9bf-eca7-43d5-bd5b-531d44751777" (UID: "abded9bf-eca7-43d5-bd5b-531d44751777"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 02:12:52 crc kubenswrapper[4743]: I1011 02:12:52.088245 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abded9bf-eca7-43d5-bd5b-531d44751777-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "abded9bf-eca7-43d5-bd5b-531d44751777" (UID: "abded9bf-eca7-43d5-bd5b-531d44751777"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 02:12:52 crc kubenswrapper[4743]: I1011 02:12:52.144362 4743 reconciler_common.go:293] "Volume detached for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/abded9bf-eca7-43d5-bd5b-531d44751777-logging-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Oct 11 02:12:52 crc kubenswrapper[4743]: I1011 02:12:52.144400 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/abded9bf-eca7-43d5-bd5b-531d44751777-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 11 02:12:52 crc kubenswrapper[4743]: I1011 02:12:52.144412 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-db5hf\" (UniqueName: \"kubernetes.io/projected/abded9bf-eca7-43d5-bd5b-531d44751777-kube-api-access-db5hf\") on node \"crc\" DevicePath \"\"" Oct 11 02:12:52 crc kubenswrapper[4743]: I1011 02:12:52.144422 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/abded9bf-eca7-43d5-bd5b-531d44751777-inventory\") on node \"crc\" DevicePath \"\"" Oct 11 02:12:52 crc kubenswrapper[4743]: I1011 02:12:52.144434 4743 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/abded9bf-eca7-43d5-bd5b-531d44751777-ceph\") on node \"crc\" DevicePath \"\"" Oct 11 02:12:52 crc kubenswrapper[4743]: I1011 02:12:52.144442 4743 reconciler_common.go:293] "Volume detached for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/abded9bf-eca7-43d5-bd5b-531d44751777-logging-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Oct 11 02:12:52 crc kubenswrapper[4743]: I1011 02:12:52.476746 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-bvnfw" 
event={"ID":"abded9bf-eca7-43d5-bd5b-531d44751777","Type":"ContainerDied","Data":"3edd983524284cb43341e3a38e4b80a407a588dc34dbda2d3bdb5b9820ad711f"} Oct 11 02:12:52 crc kubenswrapper[4743]: I1011 02:12:52.476793 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3edd983524284cb43341e3a38e4b80a407a588dc34dbda2d3bdb5b9820ad711f" Oct 11 02:12:52 crc kubenswrapper[4743]: I1011 02:12:52.476816 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-bvnfw" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.626262 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-volume1-0"] Oct 11 02:13:08 crc kubenswrapper[4743]: E1011 02:13:08.627206 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abded9bf-eca7-43d5-bd5b-531d44751777" containerName="logging-edpm-deployment-openstack-edpm-ipam" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.627222 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="abded9bf-eca7-43d5-bd5b-531d44751777" containerName="logging-edpm-deployment-openstack-edpm-ipam" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.627452 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="abded9bf-eca7-43d5-bd5b-531d44751777" containerName="logging-edpm-deployment-openstack-edpm-ipam" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.628629 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-volume-volume1-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.630805 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.631936 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.639599 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.648742 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.650550 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.658997 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.668212 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.749085 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f333d397-070a-4624-8b2d-856964010b75-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"f333d397-070a-4624-8b2d-856964010b75\") " pod="openstack/cinder-backup-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.749141 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f333d397-070a-4624-8b2d-856964010b75-lib-modules\") pod \"cinder-backup-0\" (UID: \"f333d397-070a-4624-8b2d-856964010b75\") " pod="openstack/cinder-backup-0" Oct 11 02:13:08 crc 
kubenswrapper[4743]: I1011 02:13:08.749165 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/f333d397-070a-4624-8b2d-856964010b75-dev\") pod \"cinder-backup-0\" (UID: \"f333d397-070a-4624-8b2d-856964010b75\") " pod="openstack/cinder-backup-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.749187 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/129685c1-9de5-4c18-9219-172fe359aa89-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"129685c1-9de5-4c18-9219-172fe359aa89\") " pod="openstack/cinder-volume-volume1-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.749214 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vw49v\" (UniqueName: \"kubernetes.io/projected/f333d397-070a-4624-8b2d-856964010b75-kube-api-access-vw49v\") pod \"cinder-backup-0\" (UID: \"f333d397-070a-4624-8b2d-856964010b75\") " pod="openstack/cinder-backup-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.749239 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/129685c1-9de5-4c18-9219-172fe359aa89-run\") pod \"cinder-volume-volume1-0\" (UID: \"129685c1-9de5-4c18-9219-172fe359aa89\") " pod="openstack/cinder-volume-volume1-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.749317 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f333d397-070a-4624-8b2d-856964010b75-run\") pod \"cinder-backup-0\" (UID: \"f333d397-070a-4624-8b2d-856964010b75\") " pod="openstack/cinder-backup-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.749417 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f333d397-070a-4624-8b2d-856964010b75-config-data-custom\") pod \"cinder-backup-0\" (UID: \"f333d397-070a-4624-8b2d-856964010b75\") " pod="openstack/cinder-backup-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.749455 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxls6\" (UniqueName: \"kubernetes.io/projected/129685c1-9de5-4c18-9219-172fe359aa89-kube-api-access-cxls6\") pod \"cinder-volume-volume1-0\" (UID: \"129685c1-9de5-4c18-9219-172fe359aa89\") " pod="openstack/cinder-volume-volume1-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.749475 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/f333d397-070a-4624-8b2d-856964010b75-etc-nvme\") pod \"cinder-backup-0\" (UID: \"f333d397-070a-4624-8b2d-856964010b75\") " pod="openstack/cinder-backup-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.749501 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/f333d397-070a-4624-8b2d-856964010b75-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"f333d397-070a-4624-8b2d-856964010b75\") " pod="openstack/cinder-backup-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.749517 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/129685c1-9de5-4c18-9219-172fe359aa89-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"129685c1-9de5-4c18-9219-172fe359aa89\") " pod="openstack/cinder-volume-volume1-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.749535 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/129685c1-9de5-4c18-9219-172fe359aa89-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"129685c1-9de5-4c18-9219-172fe359aa89\") " pod="openstack/cinder-volume-volume1-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.749565 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/f333d397-070a-4624-8b2d-856964010b75-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"f333d397-070a-4624-8b2d-856964010b75\") " pod="openstack/cinder-backup-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.749582 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/129685c1-9de5-4c18-9219-172fe359aa89-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"129685c1-9de5-4c18-9219-172fe359aa89\") " pod="openstack/cinder-volume-volume1-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.749602 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/129685c1-9de5-4c18-9219-172fe359aa89-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"129685c1-9de5-4c18-9219-172fe359aa89\") " pod="openstack/cinder-volume-volume1-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.749616 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/129685c1-9de5-4c18-9219-172fe359aa89-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"129685c1-9de5-4c18-9219-172fe359aa89\") " pod="openstack/cinder-volume-volume1-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.749631 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f333d397-070a-4624-8b2d-856964010b75-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"f333d397-070a-4624-8b2d-856964010b75\") " pod="openstack/cinder-backup-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.749646 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f333d397-070a-4624-8b2d-856964010b75-scripts\") pod \"cinder-backup-0\" (UID: \"f333d397-070a-4624-8b2d-856964010b75\") " pod="openstack/cinder-backup-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.749665 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f333d397-070a-4624-8b2d-856964010b75-sys\") pod \"cinder-backup-0\" (UID: \"f333d397-070a-4624-8b2d-856964010b75\") " pod="openstack/cinder-backup-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.749683 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/129685c1-9de5-4c18-9219-172fe359aa89-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"129685c1-9de5-4c18-9219-172fe359aa89\") " pod="openstack/cinder-volume-volume1-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.749744 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/129685c1-9de5-4c18-9219-172fe359aa89-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"129685c1-9de5-4c18-9219-172fe359aa89\") " pod="openstack/cinder-volume-volume1-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.749828 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/129685c1-9de5-4c18-9219-172fe359aa89-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"129685c1-9de5-4c18-9219-172fe359aa89\") " pod="openstack/cinder-volume-volume1-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.750009 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f333d397-070a-4624-8b2d-856964010b75-ceph\") pod \"cinder-backup-0\" (UID: \"f333d397-070a-4624-8b2d-856964010b75\") " pod="openstack/cinder-backup-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.750053 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/129685c1-9de5-4c18-9219-172fe359aa89-dev\") pod \"cinder-volume-volume1-0\" (UID: \"129685c1-9de5-4c18-9219-172fe359aa89\") " pod="openstack/cinder-volume-volume1-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.750108 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/129685c1-9de5-4c18-9219-172fe359aa89-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"129685c1-9de5-4c18-9219-172fe359aa89\") " pod="openstack/cinder-volume-volume1-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.750134 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/129685c1-9de5-4c18-9219-172fe359aa89-sys\") pod \"cinder-volume-volume1-0\" (UID: \"129685c1-9de5-4c18-9219-172fe359aa89\") " pod="openstack/cinder-volume-volume1-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.750176 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/129685c1-9de5-4c18-9219-172fe359aa89-scripts\") pod 
\"cinder-volume-volume1-0\" (UID: \"129685c1-9de5-4c18-9219-172fe359aa89\") " pod="openstack/cinder-volume-volume1-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.750201 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f333d397-070a-4624-8b2d-856964010b75-config-data\") pod \"cinder-backup-0\" (UID: \"f333d397-070a-4624-8b2d-856964010b75\") " pod="openstack/cinder-backup-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.750216 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/f333d397-070a-4624-8b2d-856964010b75-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"f333d397-070a-4624-8b2d-856964010b75\") " pod="openstack/cinder-backup-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.750237 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/129685c1-9de5-4c18-9219-172fe359aa89-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"129685c1-9de5-4c18-9219-172fe359aa89\") " pod="openstack/cinder-volume-volume1-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.750252 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/f333d397-070a-4624-8b2d-856964010b75-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"f333d397-070a-4624-8b2d-856964010b75\") " pod="openstack/cinder-backup-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.851719 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/129685c1-9de5-4c18-9219-172fe359aa89-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"129685c1-9de5-4c18-9219-172fe359aa89\") " 
pod="openstack/cinder-volume-volume1-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.851816 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f333d397-070a-4624-8b2d-856964010b75-ceph\") pod \"cinder-backup-0\" (UID: \"f333d397-070a-4624-8b2d-856964010b75\") " pod="openstack/cinder-backup-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.851872 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/129685c1-9de5-4c18-9219-172fe359aa89-dev\") pod \"cinder-volume-volume1-0\" (UID: \"129685c1-9de5-4c18-9219-172fe359aa89\") " pod="openstack/cinder-volume-volume1-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.851893 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/129685c1-9de5-4c18-9219-172fe359aa89-sys\") pod \"cinder-volume-volume1-0\" (UID: \"129685c1-9de5-4c18-9219-172fe359aa89\") " pod="openstack/cinder-volume-volume1-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.851910 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/129685c1-9de5-4c18-9219-172fe359aa89-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"129685c1-9de5-4c18-9219-172fe359aa89\") " pod="openstack/cinder-volume-volume1-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.851926 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/129685c1-9de5-4c18-9219-172fe359aa89-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"129685c1-9de5-4c18-9219-172fe359aa89\") " pod="openstack/cinder-volume-volume1-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.851946 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/f333d397-070a-4624-8b2d-856964010b75-config-data\") pod \"cinder-backup-0\" (UID: \"f333d397-070a-4624-8b2d-856964010b75\") " pod="openstack/cinder-backup-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.851967 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/f333d397-070a-4624-8b2d-856964010b75-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"f333d397-070a-4624-8b2d-856964010b75\") " pod="openstack/cinder-backup-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.851986 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/129685c1-9de5-4c18-9219-172fe359aa89-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"129685c1-9de5-4c18-9219-172fe359aa89\") " pod="openstack/cinder-volume-volume1-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.852000 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/f333d397-070a-4624-8b2d-856964010b75-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"f333d397-070a-4624-8b2d-856964010b75\") " pod="openstack/cinder-backup-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.852029 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f333d397-070a-4624-8b2d-856964010b75-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"f333d397-070a-4624-8b2d-856964010b75\") " pod="openstack/cinder-backup-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.852050 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f333d397-070a-4624-8b2d-856964010b75-lib-modules\") pod \"cinder-backup-0\" (UID: \"f333d397-070a-4624-8b2d-856964010b75\") " 
pod="openstack/cinder-backup-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.852068 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/f333d397-070a-4624-8b2d-856964010b75-dev\") pod \"cinder-backup-0\" (UID: \"f333d397-070a-4624-8b2d-856964010b75\") " pod="openstack/cinder-backup-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.852087 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/129685c1-9de5-4c18-9219-172fe359aa89-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"129685c1-9de5-4c18-9219-172fe359aa89\") " pod="openstack/cinder-volume-volume1-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.852112 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vw49v\" (UniqueName: \"kubernetes.io/projected/f333d397-070a-4624-8b2d-856964010b75-kube-api-access-vw49v\") pod \"cinder-backup-0\" (UID: \"f333d397-070a-4624-8b2d-856964010b75\") " pod="openstack/cinder-backup-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.852135 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/129685c1-9de5-4c18-9219-172fe359aa89-run\") pod \"cinder-volume-volume1-0\" (UID: \"129685c1-9de5-4c18-9219-172fe359aa89\") " pod="openstack/cinder-volume-volume1-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.852152 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f333d397-070a-4624-8b2d-856964010b75-run\") pod \"cinder-backup-0\" (UID: \"f333d397-070a-4624-8b2d-856964010b75\") " pod="openstack/cinder-backup-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.852184 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/f333d397-070a-4624-8b2d-856964010b75-config-data-custom\") pod \"cinder-backup-0\" (UID: \"f333d397-070a-4624-8b2d-856964010b75\") " pod="openstack/cinder-backup-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.852219 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxls6\" (UniqueName: \"kubernetes.io/projected/129685c1-9de5-4c18-9219-172fe359aa89-kube-api-access-cxls6\") pod \"cinder-volume-volume1-0\" (UID: \"129685c1-9de5-4c18-9219-172fe359aa89\") " pod="openstack/cinder-volume-volume1-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.852238 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/f333d397-070a-4624-8b2d-856964010b75-etc-nvme\") pod \"cinder-backup-0\" (UID: \"f333d397-070a-4624-8b2d-856964010b75\") " pod="openstack/cinder-backup-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.852258 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/f333d397-070a-4624-8b2d-856964010b75-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"f333d397-070a-4624-8b2d-856964010b75\") " pod="openstack/cinder-backup-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.852275 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/129685c1-9de5-4c18-9219-172fe359aa89-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"129685c1-9de5-4c18-9219-172fe359aa89\") " pod="openstack/cinder-volume-volume1-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.852289 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/129685c1-9de5-4c18-9219-172fe359aa89-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: 
\"129685c1-9de5-4c18-9219-172fe359aa89\") " pod="openstack/cinder-volume-volume1-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.852306 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/f333d397-070a-4624-8b2d-856964010b75-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"f333d397-070a-4624-8b2d-856964010b75\") " pod="openstack/cinder-backup-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.852321 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/129685c1-9de5-4c18-9219-172fe359aa89-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"129685c1-9de5-4c18-9219-172fe359aa89\") " pod="openstack/cinder-volume-volume1-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.852341 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/129685c1-9de5-4c18-9219-172fe359aa89-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"129685c1-9de5-4c18-9219-172fe359aa89\") " pod="openstack/cinder-volume-volume1-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.852357 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/129685c1-9de5-4c18-9219-172fe359aa89-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"129685c1-9de5-4c18-9219-172fe359aa89\") " pod="openstack/cinder-volume-volume1-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.852372 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f333d397-070a-4624-8b2d-856964010b75-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"f333d397-070a-4624-8b2d-856964010b75\") " pod="openstack/cinder-backup-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.852389 4743 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f333d397-070a-4624-8b2d-856964010b75-scripts\") pod \"cinder-backup-0\" (UID: \"f333d397-070a-4624-8b2d-856964010b75\") " pod="openstack/cinder-backup-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.852405 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f333d397-070a-4624-8b2d-856964010b75-sys\") pod \"cinder-backup-0\" (UID: \"f333d397-070a-4624-8b2d-856964010b75\") " pod="openstack/cinder-backup-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.852423 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/129685c1-9de5-4c18-9219-172fe359aa89-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"129685c1-9de5-4c18-9219-172fe359aa89\") " pod="openstack/cinder-volume-volume1-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.852438 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/129685c1-9de5-4c18-9219-172fe359aa89-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"129685c1-9de5-4c18-9219-172fe359aa89\") " pod="openstack/cinder-volume-volume1-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.852733 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/129685c1-9de5-4c18-9219-172fe359aa89-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"129685c1-9de5-4c18-9219-172fe359aa89\") " pod="openstack/cinder-volume-volume1-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.853068 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/129685c1-9de5-4c18-9219-172fe359aa89-run\") pod \"cinder-volume-volume1-0\" (UID: 
\"129685c1-9de5-4c18-9219-172fe359aa89\") " pod="openstack/cinder-volume-volume1-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.853098 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f333d397-070a-4624-8b2d-856964010b75-run\") pod \"cinder-backup-0\" (UID: \"f333d397-070a-4624-8b2d-856964010b75\") " pod="openstack/cinder-backup-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.853160 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/129685c1-9de5-4c18-9219-172fe359aa89-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"129685c1-9de5-4c18-9219-172fe359aa89\") " pod="openstack/cinder-volume-volume1-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.853221 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/f333d397-070a-4624-8b2d-856964010b75-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"f333d397-070a-4624-8b2d-856964010b75\") " pod="openstack/cinder-backup-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.853243 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/129685c1-9de5-4c18-9219-172fe359aa89-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"129685c1-9de5-4c18-9219-172fe359aa89\") " pod="openstack/cinder-volume-volume1-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.853612 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/129685c1-9de5-4c18-9219-172fe359aa89-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"129685c1-9de5-4c18-9219-172fe359aa89\") " pod="openstack/cinder-volume-volume1-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.853691 4743 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/f333d397-070a-4624-8b2d-856964010b75-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"f333d397-070a-4624-8b2d-856964010b75\") " pod="openstack/cinder-backup-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.853724 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/129685c1-9de5-4c18-9219-172fe359aa89-dev\") pod \"cinder-volume-volume1-0\" (UID: \"129685c1-9de5-4c18-9219-172fe359aa89\") " pod="openstack/cinder-volume-volume1-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.853782 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/129685c1-9de5-4c18-9219-172fe359aa89-sys\") pod \"cinder-volume-volume1-0\" (UID: \"129685c1-9de5-4c18-9219-172fe359aa89\") " pod="openstack/cinder-volume-volume1-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.853817 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/129685c1-9de5-4c18-9219-172fe359aa89-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"129685c1-9de5-4c18-9219-172fe359aa89\") " pod="openstack/cinder-volume-volume1-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.854300 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/f333d397-070a-4624-8b2d-856964010b75-etc-nvme\") pod \"cinder-backup-0\" (UID: \"f333d397-070a-4624-8b2d-856964010b75\") " pod="openstack/cinder-backup-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.854402 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f333d397-070a-4624-8b2d-856964010b75-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"f333d397-070a-4624-8b2d-856964010b75\") " pod="openstack/cinder-backup-0" Oct 11 
02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.854541 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/f333d397-070a-4624-8b2d-856964010b75-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"f333d397-070a-4624-8b2d-856964010b75\") " pod="openstack/cinder-backup-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.854624 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/129685c1-9de5-4c18-9219-172fe359aa89-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"129685c1-9de5-4c18-9219-172fe359aa89\") " pod="openstack/cinder-volume-volume1-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.854664 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/129685c1-9de5-4c18-9219-172fe359aa89-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"129685c1-9de5-4c18-9219-172fe359aa89\") " pod="openstack/cinder-volume-volume1-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.854692 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f333d397-070a-4624-8b2d-856964010b75-lib-modules\") pod \"cinder-backup-0\" (UID: \"f333d397-070a-4624-8b2d-856964010b75\") " pod="openstack/cinder-backup-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.854714 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/f333d397-070a-4624-8b2d-856964010b75-dev\") pod \"cinder-backup-0\" (UID: \"f333d397-070a-4624-8b2d-856964010b75\") " pod="openstack/cinder-backup-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.854737 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f333d397-070a-4624-8b2d-856964010b75-sys\") pod 
\"cinder-backup-0\" (UID: \"f333d397-070a-4624-8b2d-856964010b75\") " pod="openstack/cinder-backup-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.854732 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/f333d397-070a-4624-8b2d-856964010b75-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"f333d397-070a-4624-8b2d-856964010b75\") " pod="openstack/cinder-backup-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.857936 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f333d397-070a-4624-8b2d-856964010b75-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"f333d397-070a-4624-8b2d-856964010b75\") " pod="openstack/cinder-backup-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.858940 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f333d397-070a-4624-8b2d-856964010b75-config-data-custom\") pod \"cinder-backup-0\" (UID: \"f333d397-070a-4624-8b2d-856964010b75\") " pod="openstack/cinder-backup-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.860141 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/129685c1-9de5-4c18-9219-172fe359aa89-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"129685c1-9de5-4c18-9219-172fe359aa89\") " pod="openstack/cinder-volume-volume1-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.860154 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f333d397-070a-4624-8b2d-856964010b75-scripts\") pod \"cinder-backup-0\" (UID: \"f333d397-070a-4624-8b2d-856964010b75\") " pod="openstack/cinder-backup-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.860212 4743 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/129685c1-9de5-4c18-9219-172fe359aa89-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"129685c1-9de5-4c18-9219-172fe359aa89\") " pod="openstack/cinder-volume-volume1-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.861278 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f333d397-070a-4624-8b2d-856964010b75-config-data\") pod \"cinder-backup-0\" (UID: \"f333d397-070a-4624-8b2d-856964010b75\") " pod="openstack/cinder-backup-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.861375 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/129685c1-9de5-4c18-9219-172fe359aa89-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"129685c1-9de5-4c18-9219-172fe359aa89\") " pod="openstack/cinder-volume-volume1-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.863305 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/129685c1-9de5-4c18-9219-172fe359aa89-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"129685c1-9de5-4c18-9219-172fe359aa89\") " pod="openstack/cinder-volume-volume1-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.867174 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f333d397-070a-4624-8b2d-856964010b75-ceph\") pod \"cinder-backup-0\" (UID: \"f333d397-070a-4624-8b2d-856964010b75\") " pod="openstack/cinder-backup-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.869477 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vw49v\" (UniqueName: \"kubernetes.io/projected/f333d397-070a-4624-8b2d-856964010b75-kube-api-access-vw49v\") pod \"cinder-backup-0\" (UID: \"f333d397-070a-4624-8b2d-856964010b75\") " 
pod="openstack/cinder-backup-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.872817 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/129685c1-9de5-4c18-9219-172fe359aa89-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"129685c1-9de5-4c18-9219-172fe359aa89\") " pod="openstack/cinder-volume-volume1-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.877386 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxls6\" (UniqueName: \"kubernetes.io/projected/129685c1-9de5-4c18-9219-172fe359aa89-kube-api-access-cxls6\") pod \"cinder-volume-volume1-0\" (UID: \"129685c1-9de5-4c18-9219-172fe359aa89\") " pod="openstack/cinder-volume-volume1-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.949552 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Oct 11 02:13:08 crc kubenswrapper[4743]: I1011 02:13:08.978271 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.368082 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-create-qfwq7"] Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.375409 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-qfwq7" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.377395 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-qfwq7"] Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.469952 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.475727 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prr8b\" (UniqueName: \"kubernetes.io/projected/c0f86bb5-1ce9-44d0-a03a-ef624592cdc4-kube-api-access-prr8b\") pod \"manila-db-create-qfwq7\" (UID: \"c0f86bb5-1ce9-44d0-a03a-ef624592cdc4\") " pod="openstack/manila-db-create-qfwq7" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.486560 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.531252 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.531492 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-9vxpt" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.531713 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.542975 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.546422 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.552893 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-859df554d7-5h8zd"] Oct 11 
02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.559631 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-859df554d7-5h8zd" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.566296 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.566468 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.566582 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-2t5qd" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.570509 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.586747 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd5ead6a-c090-4c5f-8759-6080e99b8753-logs\") pod \"glance-default-external-api-0\" (UID: \"fd5ead6a-c090-4c5f-8759-6080e99b8753\") " pod="openstack/glance-default-external-api-0" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.586806 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd5ead6a-c090-4c5f-8759-6080e99b8753-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"fd5ead6a-c090-4c5f-8759-6080e99b8753\") " pod="openstack/glance-default-external-api-0" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.586987 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd5ead6a-c090-4c5f-8759-6080e99b8753-scripts\") pod \"glance-default-external-api-0\" (UID: \"fd5ead6a-c090-4c5f-8759-6080e99b8753\") " 
pod="openstack/glance-default-external-api-0" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.587035 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd5ead6a-c090-4c5f-8759-6080e99b8753-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fd5ead6a-c090-4c5f-8759-6080e99b8753\") " pod="openstack/glance-default-external-api-0" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.587182 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd5ead6a-c090-4c5f-8759-6080e99b8753-config-data\") pod \"glance-default-external-api-0\" (UID: \"fd5ead6a-c090-4c5f-8759-6080e99b8753\") " pod="openstack/glance-default-external-api-0" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.587321 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/fd5ead6a-c090-4c5f-8759-6080e99b8753-ceph\") pod \"glance-default-external-api-0\" (UID: \"fd5ead6a-c090-4c5f-8759-6080e99b8753\") " pod="openstack/glance-default-external-api-0" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.587380 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gr4fb\" (UniqueName: \"kubernetes.io/projected/fd5ead6a-c090-4c5f-8759-6080e99b8753-kube-api-access-gr4fb\") pod \"glance-default-external-api-0\" (UID: \"fd5ead6a-c090-4c5f-8759-6080e99b8753\") " pod="openstack/glance-default-external-api-0" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.587417 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: 
\"fd5ead6a-c090-4c5f-8759-6080e99b8753\") " pod="openstack/glance-default-external-api-0" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.587507 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fd5ead6a-c090-4c5f-8759-6080e99b8753-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fd5ead6a-c090-4c5f-8759-6080e99b8753\") " pod="openstack/glance-default-external-api-0" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.587545 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prr8b\" (UniqueName: \"kubernetes.io/projected/c0f86bb5-1ce9-44d0-a03a-ef624592cdc4-kube-api-access-prr8b\") pod \"manila-db-create-qfwq7\" (UID: \"c0f86bb5-1ce9-44d0-a03a-ef624592cdc4\") " pod="openstack/manila-db-create-qfwq7" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.617114 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prr8b\" (UniqueName: \"kubernetes.io/projected/c0f86bb5-1ce9-44d0-a03a-ef624592cdc4-kube-api-access-prr8b\") pod \"manila-db-create-qfwq7\" (UID: \"c0f86bb5-1ce9-44d0-a03a-ef624592cdc4\") " pod="openstack/manila-db-create-qfwq7" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.621236 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-859df554d7-5h8zd"] Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.648596 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.650968 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.653522 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.653724 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.666492 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.678124 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-9bcfcb9cc-5wvcg"] Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.680221 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-9bcfcb9cc-5wvcg" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.693486 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"fd5ead6a-c090-4c5f-8759-6080e99b8753\") " pod="openstack/glance-default-external-api-0" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.693535 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c58f083e-feb9-4f5c-961d-f61af07d794a-ceph\") pod \"glance-default-internal-api-0\" (UID: \"c58f083e-feb9-4f5c-961d-f61af07d794a\") " pod="openstack/glance-default-internal-api-0" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.693593 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fd5ead6a-c090-4c5f-8759-6080e99b8753-httpd-run\") pod \"glance-default-external-api-0\" (UID: 
\"fd5ead6a-c090-4c5f-8759-6080e99b8753\") " pod="openstack/glance-default-external-api-0" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.693615 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c58f083e-feb9-4f5c-961d-f61af07d794a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c58f083e-feb9-4f5c-961d-f61af07d794a\") " pod="openstack/glance-default-internal-api-0" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.693637 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c920be26-905f-40c5-a4b6-c80b9428e662-logs\") pod \"horizon-859df554d7-5h8zd\" (UID: \"c920be26-905f-40c5-a4b6-c80b9428e662\") " pod="openstack/horizon-859df554d7-5h8zd" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.693663 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c58f083e-feb9-4f5c-961d-f61af07d794a-logs\") pod \"glance-default-internal-api-0\" (UID: \"c58f083e-feb9-4f5c-961d-f61af07d794a\") " pod="openstack/glance-default-internal-api-0" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.693680 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c920be26-905f-40c5-a4b6-c80b9428e662-horizon-secret-key\") pod \"horizon-859df554d7-5h8zd\" (UID: \"c920be26-905f-40c5-a4b6-c80b9428e662\") " pod="openstack/horizon-859df554d7-5h8zd" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.693704 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c58f083e-feb9-4f5c-961d-f61af07d794a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: 
\"c58f083e-feb9-4f5c-961d-f61af07d794a\") " pod="openstack/glance-default-internal-api-0" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.693728 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd5ead6a-c090-4c5f-8759-6080e99b8753-logs\") pod \"glance-default-external-api-0\" (UID: \"fd5ead6a-c090-4c5f-8759-6080e99b8753\") " pod="openstack/glance-default-external-api-0" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.693771 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd5ead6a-c090-4c5f-8759-6080e99b8753-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"fd5ead6a-c090-4c5f-8759-6080e99b8753\") " pod="openstack/glance-default-external-api-0" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.693789 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rccm8\" (UniqueName: \"kubernetes.io/projected/c920be26-905f-40c5-a4b6-c80b9428e662-kube-api-access-rccm8\") pod \"horizon-859df554d7-5h8zd\" (UID: \"c920be26-905f-40c5-a4b6-c80b9428e662\") " pod="openstack/horizon-859df554d7-5h8zd" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.693811 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd5ead6a-c090-4c5f-8759-6080e99b8753-scripts\") pod \"glance-default-external-api-0\" (UID: \"fd5ead6a-c090-4c5f-8759-6080e99b8753\") " pod="openstack/glance-default-external-api-0" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.693837 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd5ead6a-c090-4c5f-8759-6080e99b8753-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fd5ead6a-c090-4c5f-8759-6080e99b8753\") " 
pod="openstack/glance-default-external-api-0" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.693890 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c58f083e-feb9-4f5c-961d-f61af07d794a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c58f083e-feb9-4f5c-961d-f61af07d794a\") " pod="openstack/glance-default-internal-api-0" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.693920 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksw5q\" (UniqueName: \"kubernetes.io/projected/c58f083e-feb9-4f5c-961d-f61af07d794a-kube-api-access-ksw5q\") pod \"glance-default-internal-api-0\" (UID: \"c58f083e-feb9-4f5c-961d-f61af07d794a\") " pod="openstack/glance-default-internal-api-0" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.693938 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c920be26-905f-40c5-a4b6-c80b9428e662-config-data\") pod \"horizon-859df554d7-5h8zd\" (UID: \"c920be26-905f-40c5-a4b6-c80b9428e662\") " pod="openstack/horizon-859df554d7-5h8zd" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.693961 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c58f083e-feb9-4f5c-961d-f61af07d794a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c58f083e-feb9-4f5c-961d-f61af07d794a\") " pod="openstack/glance-default-internal-api-0" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.693981 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd5ead6a-c090-4c5f-8759-6080e99b8753-config-data\") pod \"glance-default-external-api-0\" (UID: 
\"fd5ead6a-c090-4c5f-8759-6080e99b8753\") " pod="openstack/glance-default-external-api-0" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.694019 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c58f083e-feb9-4f5c-961d-f61af07d794a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c58f083e-feb9-4f5c-961d-f61af07d794a\") " pod="openstack/glance-default-internal-api-0" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.694067 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c920be26-905f-40c5-a4b6-c80b9428e662-scripts\") pod \"horizon-859df554d7-5h8zd\" (UID: \"c920be26-905f-40c5-a4b6-c80b9428e662\") " pod="openstack/horizon-859df554d7-5h8zd" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.694091 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/fd5ead6a-c090-4c5f-8759-6080e99b8753-ceph\") pod \"glance-default-external-api-0\" (UID: \"fd5ead6a-c090-4c5f-8759-6080e99b8753\") " pod="openstack/glance-default-external-api-0" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.694117 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"c58f083e-feb9-4f5c-961d-f61af07d794a\") " pod="openstack/glance-default-internal-api-0" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.694145 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gr4fb\" (UniqueName: \"kubernetes.io/projected/fd5ead6a-c090-4c5f-8759-6080e99b8753-kube-api-access-gr4fb\") pod \"glance-default-external-api-0\" (UID: \"fd5ead6a-c090-4c5f-8759-6080e99b8753\") " 
pod="openstack/glance-default-external-api-0" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.694662 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"fd5ead6a-c090-4c5f-8759-6080e99b8753\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-external-api-0" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.698720 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd5ead6a-c090-4c5f-8759-6080e99b8753-scripts\") pod \"glance-default-external-api-0\" (UID: \"fd5ead6a-c090-4c5f-8759-6080e99b8753\") " pod="openstack/glance-default-external-api-0" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.698939 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 11 02:13:09 crc kubenswrapper[4743]: E1011 02:13:09.699746 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[ceph combined-ca-bundle config-data glance httpd-run kube-api-access-gr4fb logs public-tls-certs], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/glance-default-external-api-0" podUID="fd5ead6a-c090-4c5f-8759-6080e99b8753" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.702837 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd5ead6a-c090-4c5f-8759-6080e99b8753-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fd5ead6a-c090-4c5f-8759-6080e99b8753\") " pod="openstack/glance-default-external-api-0" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.703368 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-qfwq7" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.704098 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd5ead6a-c090-4c5f-8759-6080e99b8753-logs\") pod \"glance-default-external-api-0\" (UID: \"fd5ead6a-c090-4c5f-8759-6080e99b8753\") " pod="openstack/glance-default-external-api-0" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.713031 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd5ead6a-c090-4c5f-8759-6080e99b8753-config-data\") pod \"glance-default-external-api-0\" (UID: \"fd5ead6a-c090-4c5f-8759-6080e99b8753\") " pod="openstack/glance-default-external-api-0" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.713812 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fd5ead6a-c090-4c5f-8759-6080e99b8753-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fd5ead6a-c090-4c5f-8759-6080e99b8753\") " pod="openstack/glance-default-external-api-0" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.714670 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/fd5ead6a-c090-4c5f-8759-6080e99b8753-ceph\") pod \"glance-default-external-api-0\" (UID: \"fd5ead6a-c090-4c5f-8759-6080e99b8753\") " pod="openstack/glance-default-external-api-0" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.716060 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gr4fb\" (UniqueName: \"kubernetes.io/projected/fd5ead6a-c090-4c5f-8759-6080e99b8753-kube-api-access-gr4fb\") pod \"glance-default-external-api-0\" (UID: \"fd5ead6a-c090-4c5f-8759-6080e99b8753\") " pod="openstack/glance-default-external-api-0" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.716208 
4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd5ead6a-c090-4c5f-8759-6080e99b8753-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"fd5ead6a-c090-4c5f-8759-6080e99b8753\") " pod="openstack/glance-default-external-api-0" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.722770 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-9bcfcb9cc-5wvcg"] Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.734273 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"fd5ead6a-c090-4c5f-8759-6080e99b8753\") " pod="openstack/glance-default-external-api-0" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.798608 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7r84v\" (UniqueName: \"kubernetes.io/projected/d39c4e33-bfe6-4c48-bc00-f2713e45103b-kube-api-access-7r84v\") pod \"horizon-9bcfcb9cc-5wvcg\" (UID: \"d39c4e33-bfe6-4c48-bc00-f2713e45103b\") " pod="openstack/horizon-9bcfcb9cc-5wvcg" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.798794 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c58f083e-feb9-4f5c-961d-f61af07d794a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c58f083e-feb9-4f5c-961d-f61af07d794a\") " pod="openstack/glance-default-internal-api-0" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.798814 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c920be26-905f-40c5-a4b6-c80b9428e662-logs\") pod \"horizon-859df554d7-5h8zd\" (UID: \"c920be26-905f-40c5-a4b6-c80b9428e662\") " 
pod="openstack/horizon-859df554d7-5h8zd" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.798849 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c58f083e-feb9-4f5c-961d-f61af07d794a-logs\") pod \"glance-default-internal-api-0\" (UID: \"c58f083e-feb9-4f5c-961d-f61af07d794a\") " pod="openstack/glance-default-internal-api-0" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.798878 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c920be26-905f-40c5-a4b6-c80b9428e662-horizon-secret-key\") pod \"horizon-859df554d7-5h8zd\" (UID: \"c920be26-905f-40c5-a4b6-c80b9428e662\") " pod="openstack/horizon-859df554d7-5h8zd" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.798899 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c58f083e-feb9-4f5c-961d-f61af07d794a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c58f083e-feb9-4f5c-961d-f61af07d794a\") " pod="openstack/glance-default-internal-api-0" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.798918 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d39c4e33-bfe6-4c48-bc00-f2713e45103b-logs\") pod \"horizon-9bcfcb9cc-5wvcg\" (UID: \"d39c4e33-bfe6-4c48-bc00-f2713e45103b\") " pod="openstack/horizon-9bcfcb9cc-5wvcg" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.798936 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d39c4e33-bfe6-4c48-bc00-f2713e45103b-scripts\") pod \"horizon-9bcfcb9cc-5wvcg\" (UID: \"d39c4e33-bfe6-4c48-bc00-f2713e45103b\") " pod="openstack/horizon-9bcfcb9cc-5wvcg" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 
02:13:09.798952 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d39c4e33-bfe6-4c48-bc00-f2713e45103b-horizon-secret-key\") pod \"horizon-9bcfcb9cc-5wvcg\" (UID: \"d39c4e33-bfe6-4c48-bc00-f2713e45103b\") " pod="openstack/horizon-9bcfcb9cc-5wvcg" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.798989 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rccm8\" (UniqueName: \"kubernetes.io/projected/c920be26-905f-40c5-a4b6-c80b9428e662-kube-api-access-rccm8\") pod \"horizon-859df554d7-5h8zd\" (UID: \"c920be26-905f-40c5-a4b6-c80b9428e662\") " pod="openstack/horizon-859df554d7-5h8zd" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.799017 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d39c4e33-bfe6-4c48-bc00-f2713e45103b-config-data\") pod \"horizon-9bcfcb9cc-5wvcg\" (UID: \"d39c4e33-bfe6-4c48-bc00-f2713e45103b\") " pod="openstack/horizon-9bcfcb9cc-5wvcg" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.799044 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c58f083e-feb9-4f5c-961d-f61af07d794a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c58f083e-feb9-4f5c-961d-f61af07d794a\") " pod="openstack/glance-default-internal-api-0" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.799072 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksw5q\" (UniqueName: \"kubernetes.io/projected/c58f083e-feb9-4f5c-961d-f61af07d794a-kube-api-access-ksw5q\") pod \"glance-default-internal-api-0\" (UID: \"c58f083e-feb9-4f5c-961d-f61af07d794a\") " pod="openstack/glance-default-internal-api-0" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 
02:13:09.799090 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c920be26-905f-40c5-a4b6-c80b9428e662-config-data\") pod \"horizon-859df554d7-5h8zd\" (UID: \"c920be26-905f-40c5-a4b6-c80b9428e662\") " pod="openstack/horizon-859df554d7-5h8zd" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.799105 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c58f083e-feb9-4f5c-961d-f61af07d794a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c58f083e-feb9-4f5c-961d-f61af07d794a\") " pod="openstack/glance-default-internal-api-0" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.799133 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c58f083e-feb9-4f5c-961d-f61af07d794a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c58f083e-feb9-4f5c-961d-f61af07d794a\") " pod="openstack/glance-default-internal-api-0" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.799170 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c920be26-905f-40c5-a4b6-c80b9428e662-scripts\") pod \"horizon-859df554d7-5h8zd\" (UID: \"c920be26-905f-40c5-a4b6-c80b9428e662\") " pod="openstack/horizon-859df554d7-5h8zd" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.799197 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"c58f083e-feb9-4f5c-961d-f61af07d794a\") " pod="openstack/glance-default-internal-api-0" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.799230 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/projected/c58f083e-feb9-4f5c-961d-f61af07d794a-ceph\") pod \"glance-default-internal-api-0\" (UID: \"c58f083e-feb9-4f5c-961d-f61af07d794a\") " pod="openstack/glance-default-internal-api-0" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.801745 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"c58f083e-feb9-4f5c-961d-f61af07d794a\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-internal-api-0" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.802029 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c920be26-905f-40c5-a4b6-c80b9428e662-config-data\") pod \"horizon-859df554d7-5h8zd\" (UID: \"c920be26-905f-40c5-a4b6-c80b9428e662\") " pod="openstack/horizon-859df554d7-5h8zd" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.802283 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c920be26-905f-40c5-a4b6-c80b9428e662-logs\") pod \"horizon-859df554d7-5h8zd\" (UID: \"c920be26-905f-40c5-a4b6-c80b9428e662\") " pod="openstack/horizon-859df554d7-5h8zd" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.802331 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c920be26-905f-40c5-a4b6-c80b9428e662-scripts\") pod \"horizon-859df554d7-5h8zd\" (UID: \"c920be26-905f-40c5-a4b6-c80b9428e662\") " pod="openstack/horizon-859df554d7-5h8zd" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.802344 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c58f083e-feb9-4f5c-961d-f61af07d794a-logs\") pod \"glance-default-internal-api-0\" (UID: 
\"c58f083e-feb9-4f5c-961d-f61af07d794a\") " pod="openstack/glance-default-internal-api-0" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.802522 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c58f083e-feb9-4f5c-961d-f61af07d794a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c58f083e-feb9-4f5c-961d-f61af07d794a\") " pod="openstack/glance-default-internal-api-0" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.804762 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c58f083e-feb9-4f5c-961d-f61af07d794a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c58f083e-feb9-4f5c-961d-f61af07d794a\") " pod="openstack/glance-default-internal-api-0" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.807600 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c58f083e-feb9-4f5c-961d-f61af07d794a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c58f083e-feb9-4f5c-961d-f61af07d794a\") " pod="openstack/glance-default-internal-api-0" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.811435 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c58f083e-feb9-4f5c-961d-f61af07d794a-ceph\") pod \"glance-default-internal-api-0\" (UID: \"c58f083e-feb9-4f5c-961d-f61af07d794a\") " pod="openstack/glance-default-internal-api-0" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.814256 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c920be26-905f-40c5-a4b6-c80b9428e662-horizon-secret-key\") pod \"horizon-859df554d7-5h8zd\" (UID: \"c920be26-905f-40c5-a4b6-c80b9428e662\") " pod="openstack/horizon-859df554d7-5h8zd" Oct 11 02:13:09 crc 
kubenswrapper[4743]: I1011 02:13:09.816153 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c58f083e-feb9-4f5c-961d-f61af07d794a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c58f083e-feb9-4f5c-961d-f61af07d794a\") " pod="openstack/glance-default-internal-api-0" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.817732 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksw5q\" (UniqueName: \"kubernetes.io/projected/c58f083e-feb9-4f5c-961d-f61af07d794a-kube-api-access-ksw5q\") pod \"glance-default-internal-api-0\" (UID: \"c58f083e-feb9-4f5c-961d-f61af07d794a\") " pod="openstack/glance-default-internal-api-0" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.820423 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c58f083e-feb9-4f5c-961d-f61af07d794a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c58f083e-feb9-4f5c-961d-f61af07d794a\") " pod="openstack/glance-default-internal-api-0" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.847403 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rccm8\" (UniqueName: \"kubernetes.io/projected/c920be26-905f-40c5-a4b6-c80b9428e662-kube-api-access-rccm8\") pod \"horizon-859df554d7-5h8zd\" (UID: \"c920be26-905f-40c5-a4b6-c80b9428e662\") " pod="openstack/horizon-859df554d7-5h8zd" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.900537 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d39c4e33-bfe6-4c48-bc00-f2713e45103b-config-data\") pod \"horizon-9bcfcb9cc-5wvcg\" (UID: \"d39c4e33-bfe6-4c48-bc00-f2713e45103b\") " pod="openstack/horizon-9bcfcb9cc-5wvcg" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.900699 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-7r84v\" (UniqueName: \"kubernetes.io/projected/d39c4e33-bfe6-4c48-bc00-f2713e45103b-kube-api-access-7r84v\") pod \"horizon-9bcfcb9cc-5wvcg\" (UID: \"d39c4e33-bfe6-4c48-bc00-f2713e45103b\") " pod="openstack/horizon-9bcfcb9cc-5wvcg" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.900742 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d39c4e33-bfe6-4c48-bc00-f2713e45103b-logs\") pod \"horizon-9bcfcb9cc-5wvcg\" (UID: \"d39c4e33-bfe6-4c48-bc00-f2713e45103b\") " pod="openstack/horizon-9bcfcb9cc-5wvcg" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.900759 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d39c4e33-bfe6-4c48-bc00-f2713e45103b-scripts\") pod \"horizon-9bcfcb9cc-5wvcg\" (UID: \"d39c4e33-bfe6-4c48-bc00-f2713e45103b\") " pod="openstack/horizon-9bcfcb9cc-5wvcg" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.900776 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d39c4e33-bfe6-4c48-bc00-f2713e45103b-horizon-secret-key\") pod \"horizon-9bcfcb9cc-5wvcg\" (UID: \"d39c4e33-bfe6-4c48-bc00-f2713e45103b\") " pod="openstack/horizon-9bcfcb9cc-5wvcg" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.903237 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d39c4e33-bfe6-4c48-bc00-f2713e45103b-logs\") pod \"horizon-9bcfcb9cc-5wvcg\" (UID: \"d39c4e33-bfe6-4c48-bc00-f2713e45103b\") " pod="openstack/horizon-9bcfcb9cc-5wvcg" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.903736 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/d39c4e33-bfe6-4c48-bc00-f2713e45103b-config-data\") pod \"horizon-9bcfcb9cc-5wvcg\" (UID: \"d39c4e33-bfe6-4c48-bc00-f2713e45103b\") " pod="openstack/horizon-9bcfcb9cc-5wvcg" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.904379 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d39c4e33-bfe6-4c48-bc00-f2713e45103b-scripts\") pod \"horizon-9bcfcb9cc-5wvcg\" (UID: \"d39c4e33-bfe6-4c48-bc00-f2713e45103b\") " pod="openstack/horizon-9bcfcb9cc-5wvcg" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.910072 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d39c4e33-bfe6-4c48-bc00-f2713e45103b-horizon-secret-key\") pod \"horizon-9bcfcb9cc-5wvcg\" (UID: \"d39c4e33-bfe6-4c48-bc00-f2713e45103b\") " pod="openstack/horizon-9bcfcb9cc-5wvcg" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.911453 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-859df554d7-5h8zd" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.922491 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7r84v\" (UniqueName: \"kubernetes.io/projected/d39c4e33-bfe6-4c48-bc00-f2713e45103b-kube-api-access-7r84v\") pod \"horizon-9bcfcb9cc-5wvcg\" (UID: \"d39c4e33-bfe6-4c48-bc00-f2713e45103b\") " pod="openstack/horizon-9bcfcb9cc-5wvcg" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.957056 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"c58f083e-feb9-4f5c-961d-f61af07d794a\") " pod="openstack/glance-default-internal-api-0" Oct 11 02:13:09 crc kubenswrapper[4743]: I1011 02:13:09.985869 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 11 02:13:10 crc kubenswrapper[4743]: I1011 02:13:10.007426 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-9bcfcb9cc-5wvcg" Oct 11 02:13:10 crc kubenswrapper[4743]: I1011 02:13:10.436656 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Oct 11 02:13:10 crc kubenswrapper[4743]: W1011 02:13:10.448997 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod129685c1_9de5_4c18_9219_172fe359aa89.slice/crio-d5da2cf514594805a29c9d58cf777c1b831daef9dd62e8d5810f39a645612c3a WatchSource:0}: Error finding container d5da2cf514594805a29c9d58cf777c1b831daef9dd62e8d5810f39a645612c3a: Status 404 returned error can't find the container with id d5da2cf514594805a29c9d58cf777c1b831daef9dd62e8d5810f39a645612c3a Oct 11 02:13:10 crc kubenswrapper[4743]: I1011 02:13:10.559003 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-qfwq7"] Oct 11 02:13:10 crc kubenswrapper[4743]: I1011 02:13:10.666430 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-qfwq7" event={"ID":"c0f86bb5-1ce9-44d0-a03a-ef624592cdc4","Type":"ContainerStarted","Data":"1dd9207d031b971a3d0ecb54e13a6493e4961ac58eeca85bc9d6d8d7b6bd5c05"} Oct 11 02:13:10 crc kubenswrapper[4743]: I1011 02:13:10.668919 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 11 02:13:10 crc kubenswrapper[4743]: I1011 02:13:10.669553 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"129685c1-9de5-4c18-9219-172fe359aa89","Type":"ContainerStarted","Data":"d5da2cf514594805a29c9d58cf777c1b831daef9dd62e8d5810f39a645612c3a"} Oct 11 02:13:10 crc kubenswrapper[4743]: I1011 02:13:10.695052 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 11 02:13:10 crc kubenswrapper[4743]: I1011 02:13:10.726346 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd5ead6a-c090-4c5f-8759-6080e99b8753-scripts\") pod \"fd5ead6a-c090-4c5f-8759-6080e99b8753\" (UID: \"fd5ead6a-c090-4c5f-8759-6080e99b8753\") " Oct 11 02:13:10 crc kubenswrapper[4743]: I1011 02:13:10.726389 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd5ead6a-c090-4c5f-8759-6080e99b8753-public-tls-certs\") pod \"fd5ead6a-c090-4c5f-8759-6080e99b8753\" (UID: \"fd5ead6a-c090-4c5f-8759-6080e99b8753\") " Oct 11 02:13:10 crc kubenswrapper[4743]: I1011 02:13:10.726424 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd5ead6a-c090-4c5f-8759-6080e99b8753-logs\") pod \"fd5ead6a-c090-4c5f-8759-6080e99b8753\" (UID: \"fd5ead6a-c090-4c5f-8759-6080e99b8753\") " Oct 11 02:13:10 crc kubenswrapper[4743]: I1011 02:13:10.726466 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gr4fb\" (UniqueName: \"kubernetes.io/projected/fd5ead6a-c090-4c5f-8759-6080e99b8753-kube-api-access-gr4fb\") pod \"fd5ead6a-c090-4c5f-8759-6080e99b8753\" (UID: \"fd5ead6a-c090-4c5f-8759-6080e99b8753\") " Oct 11 02:13:10 crc 
kubenswrapper[4743]: I1011 02:13:10.726500 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/fd5ead6a-c090-4c5f-8759-6080e99b8753-ceph\") pod \"fd5ead6a-c090-4c5f-8759-6080e99b8753\" (UID: \"fd5ead6a-c090-4c5f-8759-6080e99b8753\") " Oct 11 02:13:10 crc kubenswrapper[4743]: I1011 02:13:10.726536 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd5ead6a-c090-4c5f-8759-6080e99b8753-combined-ca-bundle\") pod \"fd5ead6a-c090-4c5f-8759-6080e99b8753\" (UID: \"fd5ead6a-c090-4c5f-8759-6080e99b8753\") " Oct 11 02:13:10 crc kubenswrapper[4743]: I1011 02:13:10.726563 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd5ead6a-c090-4c5f-8759-6080e99b8753-config-data\") pod \"fd5ead6a-c090-4c5f-8759-6080e99b8753\" (UID: \"fd5ead6a-c090-4c5f-8759-6080e99b8753\") " Oct 11 02:13:10 crc kubenswrapper[4743]: I1011 02:13:10.726599 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fd5ead6a-c090-4c5f-8759-6080e99b8753-httpd-run\") pod \"fd5ead6a-c090-4c5f-8759-6080e99b8753\" (UID: \"fd5ead6a-c090-4c5f-8759-6080e99b8753\") " Oct 11 02:13:10 crc kubenswrapper[4743]: I1011 02:13:10.726646 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"fd5ead6a-c090-4c5f-8759-6080e99b8753\" (UID: \"fd5ead6a-c090-4c5f-8759-6080e99b8753\") " Oct 11 02:13:10 crc kubenswrapper[4743]: I1011 02:13:10.731807 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd5ead6a-c090-4c5f-8759-6080e99b8753-logs" (OuterVolumeSpecName: "logs") pod "fd5ead6a-c090-4c5f-8759-6080e99b8753" (UID: 
"fd5ead6a-c090-4c5f-8759-6080e99b8753"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 02:13:10 crc kubenswrapper[4743]: I1011 02:13:10.732656 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd5ead6a-c090-4c5f-8759-6080e99b8753-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "fd5ead6a-c090-4c5f-8759-6080e99b8753" (UID: "fd5ead6a-c090-4c5f-8759-6080e99b8753"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 02:13:10 crc kubenswrapper[4743]: I1011 02:13:10.732754 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "fd5ead6a-c090-4c5f-8759-6080e99b8753" (UID: "fd5ead6a-c090-4c5f-8759-6080e99b8753"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 11 02:13:10 crc kubenswrapper[4743]: I1011 02:13:10.735291 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd5ead6a-c090-4c5f-8759-6080e99b8753-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fd5ead6a-c090-4c5f-8759-6080e99b8753" (UID: "fd5ead6a-c090-4c5f-8759-6080e99b8753"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 02:13:10 crc kubenswrapper[4743]: I1011 02:13:10.736583 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd5ead6a-c090-4c5f-8759-6080e99b8753-scripts" (OuterVolumeSpecName: "scripts") pod "fd5ead6a-c090-4c5f-8759-6080e99b8753" (UID: "fd5ead6a-c090-4c5f-8759-6080e99b8753"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 02:13:10 crc kubenswrapper[4743]: I1011 02:13:10.739229 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd5ead6a-c090-4c5f-8759-6080e99b8753-ceph" (OuterVolumeSpecName: "ceph") pod "fd5ead6a-c090-4c5f-8759-6080e99b8753" (UID: "fd5ead6a-c090-4c5f-8759-6080e99b8753"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 02:13:10 crc kubenswrapper[4743]: I1011 02:13:10.740041 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd5ead6a-c090-4c5f-8759-6080e99b8753-kube-api-access-gr4fb" (OuterVolumeSpecName: "kube-api-access-gr4fb") pod "fd5ead6a-c090-4c5f-8759-6080e99b8753" (UID: "fd5ead6a-c090-4c5f-8759-6080e99b8753"). InnerVolumeSpecName "kube-api-access-gr4fb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 02:13:10 crc kubenswrapper[4743]: I1011 02:13:10.740526 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd5ead6a-c090-4c5f-8759-6080e99b8753-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "fd5ead6a-c090-4c5f-8759-6080e99b8753" (UID: "fd5ead6a-c090-4c5f-8759-6080e99b8753"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 02:13:10 crc kubenswrapper[4743]: I1011 02:13:10.745226 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd5ead6a-c090-4c5f-8759-6080e99b8753-config-data" (OuterVolumeSpecName: "config-data") pod "fd5ead6a-c090-4c5f-8759-6080e99b8753" (UID: "fd5ead6a-c090-4c5f-8759-6080e99b8753"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 02:13:10 crc kubenswrapper[4743]: I1011 02:13:10.806656 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-859df554d7-5h8zd"] Oct 11 02:13:10 crc kubenswrapper[4743]: I1011 02:13:10.829297 4743 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fd5ead6a-c090-4c5f-8759-6080e99b8753-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 11 02:13:10 crc kubenswrapper[4743]: I1011 02:13:10.829361 4743 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Oct 11 02:13:10 crc kubenswrapper[4743]: I1011 02:13:10.829372 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd5ead6a-c090-4c5f-8759-6080e99b8753-scripts\") on node \"crc\" DevicePath \"\"" Oct 11 02:13:10 crc kubenswrapper[4743]: I1011 02:13:10.829383 4743 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd5ead6a-c090-4c5f-8759-6080e99b8753-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 11 02:13:10 crc kubenswrapper[4743]: I1011 02:13:10.829393 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd5ead6a-c090-4c5f-8759-6080e99b8753-logs\") on node \"crc\" DevicePath \"\"" Oct 11 02:13:10 crc kubenswrapper[4743]: I1011 02:13:10.829402 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gr4fb\" (UniqueName: \"kubernetes.io/projected/fd5ead6a-c090-4c5f-8759-6080e99b8753-kube-api-access-gr4fb\") on node \"crc\" DevicePath \"\"" Oct 11 02:13:10 crc kubenswrapper[4743]: I1011 02:13:10.829412 4743 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/fd5ead6a-c090-4c5f-8759-6080e99b8753-ceph\") on node 
\"crc\" DevicePath \"\"" Oct 11 02:13:10 crc kubenswrapper[4743]: I1011 02:13:10.829420 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd5ead6a-c090-4c5f-8759-6080e99b8753-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 02:13:10 crc kubenswrapper[4743]: I1011 02:13:10.829432 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd5ead6a-c090-4c5f-8759-6080e99b8753-config-data\") on node \"crc\" DevicePath \"\"" Oct 11 02:13:10 crc kubenswrapper[4743]: I1011 02:13:10.859712 4743 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Oct 11 02:13:10 crc kubenswrapper[4743]: I1011 02:13:10.928697 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 11 02:13:10 crc kubenswrapper[4743]: I1011 02:13:10.931001 4743 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Oct 11 02:13:10 crc kubenswrapper[4743]: I1011 02:13:10.979282 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-9bcfcb9cc-5wvcg"] Oct 11 02:13:10 crc kubenswrapper[4743]: W1011 02:13:10.996432 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd39c4e33_bfe6_4c48_bc00_f2713e45103b.slice/crio-6d18370203a2225428f46f70802e9854a7715cc03e791b89a5c53edea4a34ee8 WatchSource:0}: Error finding container 6d18370203a2225428f46f70802e9854a7715cc03e791b89a5c53edea4a34ee8: Status 404 returned error can't find the container with id 6d18370203a2225428f46f70802e9854a7715cc03e791b89a5c53edea4a34ee8 Oct 11 02:13:11 crc kubenswrapper[4743]: I1011 02:13:11.239645 4743 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/cinder-backup-0"] Oct 11 02:13:11 crc kubenswrapper[4743]: W1011 02:13:11.262184 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf333d397_070a_4624_8b2d_856964010b75.slice/crio-7e84f7f4c998788047ea512d0aeed8aa0590e4f5500fde95b550a49836a36146 WatchSource:0}: Error finding container 7e84f7f4c998788047ea512d0aeed8aa0590e4f5500fde95b550a49836a36146: Status 404 returned error can't find the container with id 7e84f7f4c998788047ea512d0aeed8aa0590e4f5500fde95b550a49836a36146 Oct 11 02:13:11 crc kubenswrapper[4743]: I1011 02:13:11.714018 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c58f083e-feb9-4f5c-961d-f61af07d794a","Type":"ContainerStarted","Data":"2786b7ed75bb6eb011ca8a6d2b7e81ce11ab85b8f751e0b2b063cb8d49e22ed1"} Oct 11 02:13:11 crc kubenswrapper[4743]: I1011 02:13:11.717630 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-859df554d7-5h8zd" event={"ID":"c920be26-905f-40c5-a4b6-c80b9428e662","Type":"ContainerStarted","Data":"859de03282d67c6cada03fe841cb450631c1709afa53e8d0e2d467389ad31c9f"} Oct 11 02:13:11 crc kubenswrapper[4743]: I1011 02:13:11.719375 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9bcfcb9cc-5wvcg" event={"ID":"d39c4e33-bfe6-4c48-bc00-f2713e45103b","Type":"ContainerStarted","Data":"6d18370203a2225428f46f70802e9854a7715cc03e791b89a5c53edea4a34ee8"} Oct 11 02:13:11 crc kubenswrapper[4743]: I1011 02:13:11.721681 4743 generic.go:334] "Generic (PLEG): container finished" podID="c0f86bb5-1ce9-44d0-a03a-ef624592cdc4" containerID="c20d4ec301e1c2892e1e29c4f4911cf5b7ec93056223930f00414901e82ddc0d" exitCode=0 Oct 11 02:13:11 crc kubenswrapper[4743]: I1011 02:13:11.721740 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-qfwq7" 
event={"ID":"c0f86bb5-1ce9-44d0-a03a-ef624592cdc4","Type":"ContainerDied","Data":"c20d4ec301e1c2892e1e29c4f4911cf5b7ec93056223930f00414901e82ddc0d"} Oct 11 02:13:11 crc kubenswrapper[4743]: I1011 02:13:11.727270 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"f333d397-070a-4624-8b2d-856964010b75","Type":"ContainerStarted","Data":"7e84f7f4c998788047ea512d0aeed8aa0590e4f5500fde95b550a49836a36146"} Oct 11 02:13:11 crc kubenswrapper[4743]: I1011 02:13:11.727293 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 11 02:13:11 crc kubenswrapper[4743]: I1011 02:13:11.825077 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 11 02:13:11 crc kubenswrapper[4743]: I1011 02:13:11.834090 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 11 02:13:11 crc kubenswrapper[4743]: I1011 02:13:11.843542 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 11 02:13:11 crc kubenswrapper[4743]: I1011 02:13:11.846405 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 11 02:13:11 crc kubenswrapper[4743]: I1011 02:13:11.854200 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 11 02:13:11 crc kubenswrapper[4743]: I1011 02:13:11.855425 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 11 02:13:11 crc kubenswrapper[4743]: I1011 02:13:11.856805 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 11 02:13:11 crc kubenswrapper[4743]: I1011 02:13:11.964079 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e55c62fb-39e0-41ac-8f79-c82c962b035f-scripts\") pod \"glance-default-external-api-0\" (UID: \"e55c62fb-39e0-41ac-8f79-c82c962b035f\") " pod="openstack/glance-default-external-api-0" Oct 11 02:13:11 crc kubenswrapper[4743]: I1011 02:13:11.964335 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e55c62fb-39e0-41ac-8f79-c82c962b035f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e55c62fb-39e0-41ac-8f79-c82c962b035f\") " pod="openstack/glance-default-external-api-0" Oct 11 02:13:11 crc kubenswrapper[4743]: I1011 02:13:11.964354 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e55c62fb-39e0-41ac-8f79-c82c962b035f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e55c62fb-39e0-41ac-8f79-c82c962b035f\") " pod="openstack/glance-default-external-api-0" Oct 11 02:13:11 crc kubenswrapper[4743]: I1011 02:13:11.964422 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e55c62fb-39e0-41ac-8f79-c82c962b035f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e55c62fb-39e0-41ac-8f79-c82c962b035f\") " pod="openstack/glance-default-external-api-0" Oct 11 02:13:11 crc kubenswrapper[4743]: I1011 02:13:11.964450 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"e55c62fb-39e0-41ac-8f79-c82c962b035f\") " pod="openstack/glance-default-external-api-0" Oct 11 02:13:11 crc kubenswrapper[4743]: I1011 02:13:11.964519 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e55c62fb-39e0-41ac-8f79-c82c962b035f-logs\") pod \"glance-default-external-api-0\" (UID: \"e55c62fb-39e0-41ac-8f79-c82c962b035f\") " pod="openstack/glance-default-external-api-0" Oct 11 02:13:11 crc kubenswrapper[4743]: I1011 02:13:11.964538 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e55c62fb-39e0-41ac-8f79-c82c962b035f-ceph\") pod \"glance-default-external-api-0\" (UID: \"e55c62fb-39e0-41ac-8f79-c82c962b035f\") " pod="openstack/glance-default-external-api-0" Oct 11 02:13:11 crc kubenswrapper[4743]: I1011 02:13:11.964566 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87krr\" (UniqueName: \"kubernetes.io/projected/e55c62fb-39e0-41ac-8f79-c82c962b035f-kube-api-access-87krr\") pod \"glance-default-external-api-0\" (UID: \"e55c62fb-39e0-41ac-8f79-c82c962b035f\") " pod="openstack/glance-default-external-api-0" Oct 11 02:13:11 crc kubenswrapper[4743]: I1011 02:13:11.964608 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e55c62fb-39e0-41ac-8f79-c82c962b035f-config-data\") pod \"glance-default-external-api-0\" (UID: \"e55c62fb-39e0-41ac-8f79-c82c962b035f\") " pod="openstack/glance-default-external-api-0" Oct 11 02:13:12 crc kubenswrapper[4743]: I1011 02:13:12.065822 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e55c62fb-39e0-41ac-8f79-c82c962b035f-config-data\") pod \"glance-default-external-api-0\" (UID: \"e55c62fb-39e0-41ac-8f79-c82c962b035f\") " pod="openstack/glance-default-external-api-0" Oct 11 02:13:12 crc kubenswrapper[4743]: I1011 02:13:12.066125 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e55c62fb-39e0-41ac-8f79-c82c962b035f-scripts\") pod \"glance-default-external-api-0\" (UID: \"e55c62fb-39e0-41ac-8f79-c82c962b035f\") " pod="openstack/glance-default-external-api-0" Oct 11 02:13:12 crc kubenswrapper[4743]: I1011 02:13:12.066156 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e55c62fb-39e0-41ac-8f79-c82c962b035f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e55c62fb-39e0-41ac-8f79-c82c962b035f\") " pod="openstack/glance-default-external-api-0" Oct 11 02:13:12 crc kubenswrapper[4743]: I1011 02:13:12.066170 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e55c62fb-39e0-41ac-8f79-c82c962b035f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e55c62fb-39e0-41ac-8f79-c82c962b035f\") " pod="openstack/glance-default-external-api-0" Oct 11 02:13:12 crc kubenswrapper[4743]: I1011 02:13:12.066204 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e55c62fb-39e0-41ac-8f79-c82c962b035f-combined-ca-bundle\") 
pod \"glance-default-external-api-0\" (UID: \"e55c62fb-39e0-41ac-8f79-c82c962b035f\") " pod="openstack/glance-default-external-api-0" Oct 11 02:13:12 crc kubenswrapper[4743]: I1011 02:13:12.066230 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"e55c62fb-39e0-41ac-8f79-c82c962b035f\") " pod="openstack/glance-default-external-api-0" Oct 11 02:13:12 crc kubenswrapper[4743]: I1011 02:13:12.066297 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e55c62fb-39e0-41ac-8f79-c82c962b035f-logs\") pod \"glance-default-external-api-0\" (UID: \"e55c62fb-39e0-41ac-8f79-c82c962b035f\") " pod="openstack/glance-default-external-api-0" Oct 11 02:13:12 crc kubenswrapper[4743]: I1011 02:13:12.066314 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e55c62fb-39e0-41ac-8f79-c82c962b035f-ceph\") pod \"glance-default-external-api-0\" (UID: \"e55c62fb-39e0-41ac-8f79-c82c962b035f\") " pod="openstack/glance-default-external-api-0" Oct 11 02:13:12 crc kubenswrapper[4743]: I1011 02:13:12.066342 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87krr\" (UniqueName: \"kubernetes.io/projected/e55c62fb-39e0-41ac-8f79-c82c962b035f-kube-api-access-87krr\") pod \"glance-default-external-api-0\" (UID: \"e55c62fb-39e0-41ac-8f79-c82c962b035f\") " pod="openstack/glance-default-external-api-0" Oct 11 02:13:12 crc kubenswrapper[4743]: I1011 02:13:12.066683 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"e55c62fb-39e0-41ac-8f79-c82c962b035f\") device mount path 
\"/mnt/openstack/pv03\"" pod="openstack/glance-default-external-api-0" Oct 11 02:13:12 crc kubenswrapper[4743]: I1011 02:13:12.067421 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e55c62fb-39e0-41ac-8f79-c82c962b035f-logs\") pod \"glance-default-external-api-0\" (UID: \"e55c62fb-39e0-41ac-8f79-c82c962b035f\") " pod="openstack/glance-default-external-api-0" Oct 11 02:13:12 crc kubenswrapper[4743]: I1011 02:13:12.067448 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e55c62fb-39e0-41ac-8f79-c82c962b035f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e55c62fb-39e0-41ac-8f79-c82c962b035f\") " pod="openstack/glance-default-external-api-0" Oct 11 02:13:12 crc kubenswrapper[4743]: I1011 02:13:12.074143 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e55c62fb-39e0-41ac-8f79-c82c962b035f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e55c62fb-39e0-41ac-8f79-c82c962b035f\") " pod="openstack/glance-default-external-api-0" Oct 11 02:13:12 crc kubenswrapper[4743]: I1011 02:13:12.074349 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e55c62fb-39e0-41ac-8f79-c82c962b035f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e55c62fb-39e0-41ac-8f79-c82c962b035f\") " pod="openstack/glance-default-external-api-0" Oct 11 02:13:12 crc kubenswrapper[4743]: I1011 02:13:12.075454 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e55c62fb-39e0-41ac-8f79-c82c962b035f-scripts\") pod \"glance-default-external-api-0\" (UID: \"e55c62fb-39e0-41ac-8f79-c82c962b035f\") " pod="openstack/glance-default-external-api-0" Oct 11 02:13:12 crc kubenswrapper[4743]: I1011 
02:13:12.075535 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e55c62fb-39e0-41ac-8f79-c82c962b035f-ceph\") pod \"glance-default-external-api-0\" (UID: \"e55c62fb-39e0-41ac-8f79-c82c962b035f\") " pod="openstack/glance-default-external-api-0" Oct 11 02:13:12 crc kubenswrapper[4743]: I1011 02:13:12.082079 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e55c62fb-39e0-41ac-8f79-c82c962b035f-config-data\") pod \"glance-default-external-api-0\" (UID: \"e55c62fb-39e0-41ac-8f79-c82c962b035f\") " pod="openstack/glance-default-external-api-0" Oct 11 02:13:12 crc kubenswrapper[4743]: I1011 02:13:12.089755 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87krr\" (UniqueName: \"kubernetes.io/projected/e55c62fb-39e0-41ac-8f79-c82c962b035f-kube-api-access-87krr\") pod \"glance-default-external-api-0\" (UID: \"e55c62fb-39e0-41ac-8f79-c82c962b035f\") " pod="openstack/glance-default-external-api-0" Oct 11 02:13:12 crc kubenswrapper[4743]: I1011 02:13:12.138133 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd5ead6a-c090-4c5f-8759-6080e99b8753" path="/var/lib/kubelet/pods/fd5ead6a-c090-4c5f-8759-6080e99b8753/volumes" Oct 11 02:13:12 crc kubenswrapper[4743]: I1011 02:13:12.138589 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-859df554d7-5h8zd"] Oct 11 02:13:12 crc kubenswrapper[4743]: I1011 02:13:12.173320 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5bc9759f8-b6qgh"] Oct 11 02:13:12 crc kubenswrapper[4743]: I1011 02:13:12.175926 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5bc9759f8-b6qgh" Oct 11 02:13:12 crc kubenswrapper[4743]: I1011 02:13:12.187951 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Oct 11 02:13:12 crc kubenswrapper[4743]: I1011 02:13:12.216733 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 11 02:13:12 crc kubenswrapper[4743]: E1011 02:13:12.218263 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[glance], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/glance-default-external-api-0" podUID="e55c62fb-39e0-41ac-8f79-c82c962b035f" Oct 11 02:13:12 crc kubenswrapper[4743]: I1011 02:13:12.236045 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5bc9759f8-b6qgh"] Oct 11 02:13:12 crc kubenswrapper[4743]: I1011 02:13:12.270013 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-9bcfcb9cc-5wvcg"] Oct 11 02:13:12 crc kubenswrapper[4743]: I1011 02:13:12.271433 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd-scripts\") pod \"horizon-5bc9759f8-b6qgh\" (UID: \"d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd\") " pod="openstack/horizon-5bc9759f8-b6qgh" Oct 11 02:13:12 crc kubenswrapper[4743]: I1011 02:13:12.271475 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxw7h\" (UniqueName: \"kubernetes.io/projected/d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd-kube-api-access-wxw7h\") pod \"horizon-5bc9759f8-b6qgh\" (UID: \"d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd\") " pod="openstack/horizon-5bc9759f8-b6qgh" Oct 11 02:13:12 crc kubenswrapper[4743]: I1011 02:13:12.271539 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/configmap/d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd-config-data\") pod \"horizon-5bc9759f8-b6qgh\" (UID: \"d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd\") " pod="openstack/horizon-5bc9759f8-b6qgh" Oct 11 02:13:12 crc kubenswrapper[4743]: I1011 02:13:12.271632 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd-horizon-secret-key\") pod \"horizon-5bc9759f8-b6qgh\" (UID: \"d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd\") " pod="openstack/horizon-5bc9759f8-b6qgh" Oct 11 02:13:12 crc kubenswrapper[4743]: I1011 02:13:12.271652 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd-combined-ca-bundle\") pod \"horizon-5bc9759f8-b6qgh\" (UID: \"d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd\") " pod="openstack/horizon-5bc9759f8-b6qgh" Oct 11 02:13:12 crc kubenswrapper[4743]: I1011 02:13:12.271673 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd-horizon-tls-certs\") pod \"horizon-5bc9759f8-b6qgh\" (UID: \"d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd\") " pod="openstack/horizon-5bc9759f8-b6qgh" Oct 11 02:13:12 crc kubenswrapper[4743]: I1011 02:13:12.271748 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd-logs\") pod \"horizon-5bc9759f8-b6qgh\" (UID: \"d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd\") " pod="openstack/horizon-5bc9759f8-b6qgh" Oct 11 02:13:12 crc kubenswrapper[4743]: I1011 02:13:12.272190 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 11 02:13:12 crc 
kubenswrapper[4743]: I1011 02:13:12.341944 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-f46b79456-dm9d6"] Oct 11 02:13:12 crc kubenswrapper[4743]: I1011 02:13:12.343815 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-f46b79456-dm9d6" Oct 11 02:13:12 crc kubenswrapper[4743]: I1011 02:13:12.348279 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"e55c62fb-39e0-41ac-8f79-c82c962b035f\") " pod="openstack/glance-default-external-api-0" Oct 11 02:13:12 crc kubenswrapper[4743]: I1011 02:13:12.353818 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-f46b79456-dm9d6"] Oct 11 02:13:12 crc kubenswrapper[4743]: I1011 02:13:12.379844 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd-logs\") pod \"horizon-5bc9759f8-b6qgh\" (UID: \"d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd\") " pod="openstack/horizon-5bc9759f8-b6qgh" Oct 11 02:13:12 crc kubenswrapper[4743]: I1011 02:13:12.380334 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd-scripts\") pod \"horizon-5bc9759f8-b6qgh\" (UID: \"d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd\") " pod="openstack/horizon-5bc9759f8-b6qgh" Oct 11 02:13:12 crc kubenswrapper[4743]: I1011 02:13:12.380375 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxw7h\" (UniqueName: \"kubernetes.io/projected/d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd-kube-api-access-wxw7h\") pod \"horizon-5bc9759f8-b6qgh\" (UID: \"d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd\") " pod="openstack/horizon-5bc9759f8-b6qgh" Oct 11 02:13:12 crc kubenswrapper[4743]: I1011 
02:13:12.380418 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd-config-data\") pod \"horizon-5bc9759f8-b6qgh\" (UID: \"d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd\") " pod="openstack/horizon-5bc9759f8-b6qgh" Oct 11 02:13:12 crc kubenswrapper[4743]: I1011 02:13:12.380482 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd-horizon-secret-key\") pod \"horizon-5bc9759f8-b6qgh\" (UID: \"d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd\") " pod="openstack/horizon-5bc9759f8-b6qgh" Oct 11 02:13:12 crc kubenswrapper[4743]: I1011 02:13:12.380499 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd-combined-ca-bundle\") pod \"horizon-5bc9759f8-b6qgh\" (UID: \"d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd\") " pod="openstack/horizon-5bc9759f8-b6qgh" Oct 11 02:13:12 crc kubenswrapper[4743]: I1011 02:13:12.380518 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd-horizon-tls-certs\") pod \"horizon-5bc9759f8-b6qgh\" (UID: \"d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd\") " pod="openstack/horizon-5bc9759f8-b6qgh" Oct 11 02:13:12 crc kubenswrapper[4743]: I1011 02:13:12.381657 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd-logs\") pod \"horizon-5bc9759f8-b6qgh\" (UID: \"d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd\") " pod="openstack/horizon-5bc9759f8-b6qgh" Oct 11 02:13:12 crc kubenswrapper[4743]: I1011 02:13:12.389698 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd-scripts\") pod \"horizon-5bc9759f8-b6qgh\" (UID: \"d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd\") " pod="openstack/horizon-5bc9759f8-b6qgh" Oct 11 02:13:12 crc kubenswrapper[4743]: I1011 02:13:12.392442 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd-config-data\") pod \"horizon-5bc9759f8-b6qgh\" (UID: \"d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd\") " pod="openstack/horizon-5bc9759f8-b6qgh" Oct 11 02:13:12 crc kubenswrapper[4743]: I1011 02:13:12.435957 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd-horizon-tls-certs\") pod \"horizon-5bc9759f8-b6qgh\" (UID: \"d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd\") " pod="openstack/horizon-5bc9759f8-b6qgh" Oct 11 02:13:12 crc kubenswrapper[4743]: I1011 02:13:12.436098 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd-horizon-secret-key\") pod \"horizon-5bc9759f8-b6qgh\" (UID: \"d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd\") " pod="openstack/horizon-5bc9759f8-b6qgh" Oct 11 02:13:12 crc kubenswrapper[4743]: I1011 02:13:12.437434 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxw7h\" (UniqueName: \"kubernetes.io/projected/d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd-kube-api-access-wxw7h\") pod \"horizon-5bc9759f8-b6qgh\" (UID: \"d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd\") " pod="openstack/horizon-5bc9759f8-b6qgh" Oct 11 02:13:12 crc kubenswrapper[4743]: I1011 02:13:12.446469 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd-combined-ca-bundle\") pod \"horizon-5bc9759f8-b6qgh\" (UID: 
\"d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd\") " pod="openstack/horizon-5bc9759f8-b6qgh" Oct 11 02:13:12 crc kubenswrapper[4743]: I1011 02:13:12.452742 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5bc9759f8-b6qgh" Oct 11 02:13:12 crc kubenswrapper[4743]: I1011 02:13:12.484888 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36f566d2-9c6b-4bc3-a1a3-47a11e6eee45-combined-ca-bundle\") pod \"horizon-f46b79456-dm9d6\" (UID: \"36f566d2-9c6b-4bc3-a1a3-47a11e6eee45\") " pod="openstack/horizon-f46b79456-dm9d6" Oct 11 02:13:12 crc kubenswrapper[4743]: I1011 02:13:12.485082 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/36f566d2-9c6b-4bc3-a1a3-47a11e6eee45-scripts\") pod \"horizon-f46b79456-dm9d6\" (UID: \"36f566d2-9c6b-4bc3-a1a3-47a11e6eee45\") " pod="openstack/horizon-f46b79456-dm9d6" Oct 11 02:13:12 crc kubenswrapper[4743]: I1011 02:13:12.485235 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lr5s\" (UniqueName: \"kubernetes.io/projected/36f566d2-9c6b-4bc3-a1a3-47a11e6eee45-kube-api-access-8lr5s\") pod \"horizon-f46b79456-dm9d6\" (UID: \"36f566d2-9c6b-4bc3-a1a3-47a11e6eee45\") " pod="openstack/horizon-f46b79456-dm9d6" Oct 11 02:13:12 crc kubenswrapper[4743]: I1011 02:13:12.485300 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/36f566d2-9c6b-4bc3-a1a3-47a11e6eee45-horizon-secret-key\") pod \"horizon-f46b79456-dm9d6\" (UID: \"36f566d2-9c6b-4bc3-a1a3-47a11e6eee45\") " pod="openstack/horizon-f46b79456-dm9d6" Oct 11 02:13:12 crc kubenswrapper[4743]: I1011 02:13:12.485368 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/36f566d2-9c6b-4bc3-a1a3-47a11e6eee45-config-data\") pod \"horizon-f46b79456-dm9d6\" (UID: \"36f566d2-9c6b-4bc3-a1a3-47a11e6eee45\") " pod="openstack/horizon-f46b79456-dm9d6" Oct 11 02:13:12 crc kubenswrapper[4743]: I1011 02:13:12.485387 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/36f566d2-9c6b-4bc3-a1a3-47a11e6eee45-horizon-tls-certs\") pod \"horizon-f46b79456-dm9d6\" (UID: \"36f566d2-9c6b-4bc3-a1a3-47a11e6eee45\") " pod="openstack/horizon-f46b79456-dm9d6" Oct 11 02:13:12 crc kubenswrapper[4743]: I1011 02:13:12.485555 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36f566d2-9c6b-4bc3-a1a3-47a11e6eee45-logs\") pod \"horizon-f46b79456-dm9d6\" (UID: \"36f566d2-9c6b-4bc3-a1a3-47a11e6eee45\") " pod="openstack/horizon-f46b79456-dm9d6" Oct 11 02:13:12 crc kubenswrapper[4743]: I1011 02:13:12.587658 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/36f566d2-9c6b-4bc3-a1a3-47a11e6eee45-scripts\") pod \"horizon-f46b79456-dm9d6\" (UID: \"36f566d2-9c6b-4bc3-a1a3-47a11e6eee45\") " pod="openstack/horizon-f46b79456-dm9d6" Oct 11 02:13:12 crc kubenswrapper[4743]: I1011 02:13:12.587755 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lr5s\" (UniqueName: \"kubernetes.io/projected/36f566d2-9c6b-4bc3-a1a3-47a11e6eee45-kube-api-access-8lr5s\") pod \"horizon-f46b79456-dm9d6\" (UID: \"36f566d2-9c6b-4bc3-a1a3-47a11e6eee45\") " pod="openstack/horizon-f46b79456-dm9d6" Oct 11 02:13:12 crc kubenswrapper[4743]: I1011 02:13:12.587793 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/36f566d2-9c6b-4bc3-a1a3-47a11e6eee45-horizon-secret-key\") pod \"horizon-f46b79456-dm9d6\" (UID: \"36f566d2-9c6b-4bc3-a1a3-47a11e6eee45\") " pod="openstack/horizon-f46b79456-dm9d6" Oct 11 02:13:12 crc kubenswrapper[4743]: I1011 02:13:12.587826 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/36f566d2-9c6b-4bc3-a1a3-47a11e6eee45-config-data\") pod \"horizon-f46b79456-dm9d6\" (UID: \"36f566d2-9c6b-4bc3-a1a3-47a11e6eee45\") " pod="openstack/horizon-f46b79456-dm9d6" Oct 11 02:13:12 crc kubenswrapper[4743]: I1011 02:13:12.587873 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/36f566d2-9c6b-4bc3-a1a3-47a11e6eee45-horizon-tls-certs\") pod \"horizon-f46b79456-dm9d6\" (UID: \"36f566d2-9c6b-4bc3-a1a3-47a11e6eee45\") " pod="openstack/horizon-f46b79456-dm9d6" Oct 11 02:13:12 crc kubenswrapper[4743]: I1011 02:13:12.587968 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36f566d2-9c6b-4bc3-a1a3-47a11e6eee45-logs\") pod \"horizon-f46b79456-dm9d6\" (UID: \"36f566d2-9c6b-4bc3-a1a3-47a11e6eee45\") " pod="openstack/horizon-f46b79456-dm9d6" Oct 11 02:13:12 crc kubenswrapper[4743]: I1011 02:13:12.588005 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36f566d2-9c6b-4bc3-a1a3-47a11e6eee45-combined-ca-bundle\") pod \"horizon-f46b79456-dm9d6\" (UID: \"36f566d2-9c6b-4bc3-a1a3-47a11e6eee45\") " pod="openstack/horizon-f46b79456-dm9d6" Oct 11 02:13:12 crc kubenswrapper[4743]: I1011 02:13:12.588575 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36f566d2-9c6b-4bc3-a1a3-47a11e6eee45-logs\") pod \"horizon-f46b79456-dm9d6\" 
(UID: \"36f566d2-9c6b-4bc3-a1a3-47a11e6eee45\") " pod="openstack/horizon-f46b79456-dm9d6" Oct 11 02:13:12 crc kubenswrapper[4743]: I1011 02:13:12.589994 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/36f566d2-9c6b-4bc3-a1a3-47a11e6eee45-config-data\") pod \"horizon-f46b79456-dm9d6\" (UID: \"36f566d2-9c6b-4bc3-a1a3-47a11e6eee45\") " pod="openstack/horizon-f46b79456-dm9d6" Oct 11 02:13:12 crc kubenswrapper[4743]: I1011 02:13:12.597348 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/36f566d2-9c6b-4bc3-a1a3-47a11e6eee45-horizon-secret-key\") pod \"horizon-f46b79456-dm9d6\" (UID: \"36f566d2-9c6b-4bc3-a1a3-47a11e6eee45\") " pod="openstack/horizon-f46b79456-dm9d6" Oct 11 02:13:12 crc kubenswrapper[4743]: I1011 02:13:12.599120 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/36f566d2-9c6b-4bc3-a1a3-47a11e6eee45-scripts\") pod \"horizon-f46b79456-dm9d6\" (UID: \"36f566d2-9c6b-4bc3-a1a3-47a11e6eee45\") " pod="openstack/horizon-f46b79456-dm9d6" Oct 11 02:13:12 crc kubenswrapper[4743]: I1011 02:13:12.602418 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/36f566d2-9c6b-4bc3-a1a3-47a11e6eee45-horizon-tls-certs\") pod \"horizon-f46b79456-dm9d6\" (UID: \"36f566d2-9c6b-4bc3-a1a3-47a11e6eee45\") " pod="openstack/horizon-f46b79456-dm9d6" Oct 11 02:13:12 crc kubenswrapper[4743]: I1011 02:13:12.610904 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36f566d2-9c6b-4bc3-a1a3-47a11e6eee45-combined-ca-bundle\") pod \"horizon-f46b79456-dm9d6\" (UID: \"36f566d2-9c6b-4bc3-a1a3-47a11e6eee45\") " pod="openstack/horizon-f46b79456-dm9d6" Oct 11 02:13:12 crc kubenswrapper[4743]: I1011 
02:13:12.666686 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lr5s\" (UniqueName: \"kubernetes.io/projected/36f566d2-9c6b-4bc3-a1a3-47a11e6eee45-kube-api-access-8lr5s\") pod \"horizon-f46b79456-dm9d6\" (UID: \"36f566d2-9c6b-4bc3-a1a3-47a11e6eee45\") " pod="openstack/horizon-f46b79456-dm9d6" Oct 11 02:13:12 crc kubenswrapper[4743]: I1011 02:13:12.768159 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-f46b79456-dm9d6" Oct 11 02:13:12 crc kubenswrapper[4743]: I1011 02:13:12.790137 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"129685c1-9de5-4c18-9219-172fe359aa89","Type":"ContainerStarted","Data":"25cdaaf2df5cabbbc72e7fe06d3aa065256e653b7d4fcfb2eea13fed4de11b3e"} Oct 11 02:13:12 crc kubenswrapper[4743]: I1011 02:13:12.790174 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"129685c1-9de5-4c18-9219-172fe359aa89","Type":"ContainerStarted","Data":"b2aa87f49fbbcc2cf014175b0ea07da23db8e1db9473c3985b1807b7711d5fa6"} Oct 11 02:13:12 crc kubenswrapper[4743]: I1011 02:13:12.821968 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c58f083e-feb9-4f5c-961d-f61af07d794a","Type":"ContainerStarted","Data":"8654aa9afa4d26fdd386a0b3c789b262c909c546dc573b53c67f3ba0409e4775"} Oct 11 02:13:12 crc kubenswrapper[4743]: I1011 02:13:12.822040 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 11 02:13:12 crc kubenswrapper[4743]: I1011 02:13:12.827069 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-volume1-0" podStartSLOduration=3.796224471 podStartE2EDuration="4.827054518s" podCreationTimestamp="2025-10-11 02:13:08 +0000 UTC" firstStartedPulling="2025-10-11 02:13:10.4595153 +0000 UTC m=+4885.112495697" lastFinishedPulling="2025-10-11 02:13:11.490345347 +0000 UTC m=+4886.143325744" observedRunningTime="2025-10-11 02:13:12.826675098 +0000 UTC m=+4887.479655495" watchObservedRunningTime="2025-10-11 02:13:12.827054518 +0000 UTC m=+4887.480034915" Oct 11 02:13:12 crc kubenswrapper[4743]: I1011 02:13:12.839399 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 11 02:13:13 crc kubenswrapper[4743]: I1011 02:13:13.008676 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e55c62fb-39e0-41ac-8f79-c82c962b035f-scripts\") pod \"e55c62fb-39e0-41ac-8f79-c82c962b035f\" (UID: \"e55c62fb-39e0-41ac-8f79-c82c962b035f\") " Oct 11 02:13:13 crc kubenswrapper[4743]: I1011 02:13:13.009401 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e55c62fb-39e0-41ac-8f79-c82c962b035f-httpd-run\") pod \"e55c62fb-39e0-41ac-8f79-c82c962b035f\" (UID: \"e55c62fb-39e0-41ac-8f79-c82c962b035f\") " Oct 11 02:13:13 crc kubenswrapper[4743]: I1011 02:13:13.009446 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e55c62fb-39e0-41ac-8f79-c82c962b035f-public-tls-certs\") pod \"e55c62fb-39e0-41ac-8f79-c82c962b035f\" (UID: \"e55c62fb-39e0-41ac-8f79-c82c962b035f\") " Oct 11 02:13:13 crc kubenswrapper[4743]: I1011 02:13:13.009506 4743 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e55c62fb-39e0-41ac-8f79-c82c962b035f-logs\") pod \"e55c62fb-39e0-41ac-8f79-c82c962b035f\" (UID: \"e55c62fb-39e0-41ac-8f79-c82c962b035f\") " Oct 11 02:13:13 crc kubenswrapper[4743]: I1011 02:13:13.009541 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"e55c62fb-39e0-41ac-8f79-c82c962b035f\" (UID: \"e55c62fb-39e0-41ac-8f79-c82c962b035f\") " Oct 11 02:13:13 crc kubenswrapper[4743]: I1011 02:13:13.009638 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e55c62fb-39e0-41ac-8f79-c82c962b035f-config-data\") pod \"e55c62fb-39e0-41ac-8f79-c82c962b035f\" (UID: \"e55c62fb-39e0-41ac-8f79-c82c962b035f\") " Oct 11 02:13:13 crc kubenswrapper[4743]: I1011 02:13:13.009728 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87krr\" (UniqueName: \"kubernetes.io/projected/e55c62fb-39e0-41ac-8f79-c82c962b035f-kube-api-access-87krr\") pod \"e55c62fb-39e0-41ac-8f79-c82c962b035f\" (UID: \"e55c62fb-39e0-41ac-8f79-c82c962b035f\") " Oct 11 02:13:13 crc kubenswrapper[4743]: I1011 02:13:13.009769 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e55c62fb-39e0-41ac-8f79-c82c962b035f-ceph\") pod \"e55c62fb-39e0-41ac-8f79-c82c962b035f\" (UID: \"e55c62fb-39e0-41ac-8f79-c82c962b035f\") " Oct 11 02:13:13 crc kubenswrapper[4743]: I1011 02:13:13.009815 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e55c62fb-39e0-41ac-8f79-c82c962b035f-combined-ca-bundle\") pod \"e55c62fb-39e0-41ac-8f79-c82c962b035f\" (UID: \"e55c62fb-39e0-41ac-8f79-c82c962b035f\") " Oct 11 
02:13:13 crc kubenswrapper[4743]: I1011 02:13:13.024283 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e55c62fb-39e0-41ac-8f79-c82c962b035f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e55c62fb-39e0-41ac-8f79-c82c962b035f" (UID: "e55c62fb-39e0-41ac-8f79-c82c962b035f"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 02:13:13 crc kubenswrapper[4743]: I1011 02:13:13.024570 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e55c62fb-39e0-41ac-8f79-c82c962b035f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e55c62fb-39e0-41ac-8f79-c82c962b035f" (UID: "e55c62fb-39e0-41ac-8f79-c82c962b035f"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 02:13:13 crc kubenswrapper[4743]: I1011 02:13:13.027530 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e55c62fb-39e0-41ac-8f79-c82c962b035f-scripts" (OuterVolumeSpecName: "scripts") pod "e55c62fb-39e0-41ac-8f79-c82c962b035f" (UID: "e55c62fb-39e0-41ac-8f79-c82c962b035f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 02:13:13 crc kubenswrapper[4743]: I1011 02:13:13.027606 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e55c62fb-39e0-41ac-8f79-c82c962b035f-logs" (OuterVolumeSpecName: "logs") pod "e55c62fb-39e0-41ac-8f79-c82c962b035f" (UID: "e55c62fb-39e0-41ac-8f79-c82c962b035f"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 02:13:13 crc kubenswrapper[4743]: I1011 02:13:13.028103 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e55c62fb-39e0-41ac-8f79-c82c962b035f-config-data" (OuterVolumeSpecName: "config-data") pod "e55c62fb-39e0-41ac-8f79-c82c962b035f" (UID: "e55c62fb-39e0-41ac-8f79-c82c962b035f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 02:13:13 crc kubenswrapper[4743]: I1011 02:13:13.034117 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "e55c62fb-39e0-41ac-8f79-c82c962b035f" (UID: "e55c62fb-39e0-41ac-8f79-c82c962b035f"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 11 02:13:13 crc kubenswrapper[4743]: I1011 02:13:13.034147 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e55c62fb-39e0-41ac-8f79-c82c962b035f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e55c62fb-39e0-41ac-8f79-c82c962b035f" (UID: "e55c62fb-39e0-41ac-8f79-c82c962b035f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 02:13:13 crc kubenswrapper[4743]: I1011 02:13:13.034188 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e55c62fb-39e0-41ac-8f79-c82c962b035f-kube-api-access-87krr" (OuterVolumeSpecName: "kube-api-access-87krr") pod "e55c62fb-39e0-41ac-8f79-c82c962b035f" (UID: "e55c62fb-39e0-41ac-8f79-c82c962b035f"). InnerVolumeSpecName "kube-api-access-87krr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 02:13:13 crc kubenswrapper[4743]: I1011 02:13:13.042051 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e55c62fb-39e0-41ac-8f79-c82c962b035f-ceph" (OuterVolumeSpecName: "ceph") pod "e55c62fb-39e0-41ac-8f79-c82c962b035f" (UID: "e55c62fb-39e0-41ac-8f79-c82c962b035f"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 02:13:13 crc kubenswrapper[4743]: I1011 02:13:13.112445 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e55c62fb-39e0-41ac-8f79-c82c962b035f-config-data\") on node \"crc\" DevicePath \"\"" Oct 11 02:13:13 crc kubenswrapper[4743]: I1011 02:13:13.112474 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87krr\" (UniqueName: \"kubernetes.io/projected/e55c62fb-39e0-41ac-8f79-c82c962b035f-kube-api-access-87krr\") on node \"crc\" DevicePath \"\"" Oct 11 02:13:13 crc kubenswrapper[4743]: I1011 02:13:13.112485 4743 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e55c62fb-39e0-41ac-8f79-c82c962b035f-ceph\") on node \"crc\" DevicePath \"\"" Oct 11 02:13:13 crc kubenswrapper[4743]: I1011 02:13:13.112494 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e55c62fb-39e0-41ac-8f79-c82c962b035f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 02:13:13 crc kubenswrapper[4743]: I1011 02:13:13.112503 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e55c62fb-39e0-41ac-8f79-c82c962b035f-scripts\") on node \"crc\" DevicePath \"\"" Oct 11 02:13:13 crc kubenswrapper[4743]: I1011 02:13:13.112510 4743 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/e55c62fb-39e0-41ac-8f79-c82c962b035f-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 11 02:13:13 crc kubenswrapper[4743]: I1011 02:13:13.112521 4743 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e55c62fb-39e0-41ac-8f79-c82c962b035f-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 11 02:13:13 crc kubenswrapper[4743]: I1011 02:13:13.112528 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e55c62fb-39e0-41ac-8f79-c82c962b035f-logs\") on node \"crc\" DevicePath \"\"" Oct 11 02:13:13 crc kubenswrapper[4743]: I1011 02:13:13.112558 4743 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Oct 11 02:13:13 crc kubenswrapper[4743]: I1011 02:13:13.189974 4743 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Oct 11 02:13:13 crc kubenswrapper[4743]: I1011 02:13:13.218323 4743 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Oct 11 02:13:13 crc kubenswrapper[4743]: I1011 02:13:13.556112 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-qfwq7" Oct 11 02:13:13 crc kubenswrapper[4743]: I1011 02:13:13.730436 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-prr8b\" (UniqueName: \"kubernetes.io/projected/c0f86bb5-1ce9-44d0-a03a-ef624592cdc4-kube-api-access-prr8b\") pod \"c0f86bb5-1ce9-44d0-a03a-ef624592cdc4\" (UID: \"c0f86bb5-1ce9-44d0-a03a-ef624592cdc4\") " Oct 11 02:13:13 crc kubenswrapper[4743]: I1011 02:13:13.734416 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0f86bb5-1ce9-44d0-a03a-ef624592cdc4-kube-api-access-prr8b" (OuterVolumeSpecName: "kube-api-access-prr8b") pod "c0f86bb5-1ce9-44d0-a03a-ef624592cdc4" (UID: "c0f86bb5-1ce9-44d0-a03a-ef624592cdc4"). InnerVolumeSpecName "kube-api-access-prr8b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 02:13:13 crc kubenswrapper[4743]: I1011 02:13:13.736124 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-f46b79456-dm9d6"] Oct 11 02:13:13 crc kubenswrapper[4743]: I1011 02:13:13.808400 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5bc9759f8-b6qgh"] Oct 11 02:13:13 crc kubenswrapper[4743]: I1011 02:13:13.833743 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-prr8b\" (UniqueName: \"kubernetes.io/projected/c0f86bb5-1ce9-44d0-a03a-ef624592cdc4-kube-api-access-prr8b\") on node \"crc\" DevicePath \"\"" Oct 11 02:13:13 crc kubenswrapper[4743]: I1011 02:13:13.863165 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-qfwq7" event={"ID":"c0f86bb5-1ce9-44d0-a03a-ef624592cdc4","Type":"ContainerDied","Data":"1dd9207d031b971a3d0ecb54e13a6493e4961ac58eeca85bc9d6d8d7b6bd5c05"} Oct 11 02:13:13 crc kubenswrapper[4743]: I1011 02:13:13.864280 4743 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="1dd9207d031b971a3d0ecb54e13a6493e4961ac58eeca85bc9d6d8d7b6bd5c05" Oct 11 02:13:13 crc kubenswrapper[4743]: I1011 02:13:13.864398 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-qfwq7" Oct 11 02:13:13 crc kubenswrapper[4743]: I1011 02:13:13.881496 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"f333d397-070a-4624-8b2d-856964010b75","Type":"ContainerStarted","Data":"b0d86c907d6695da98c8ad0f825dcab7f699a761ea0e9b8f440233a9b774539b"} Oct 11 02:13:13 crc kubenswrapper[4743]: I1011 02:13:13.881535 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"f333d397-070a-4624-8b2d-856964010b75","Type":"ContainerStarted","Data":"f644e526dc996f4909d183dc78a9734149d87ae9a70469906b6c2ee2383b9325"} Oct 11 02:13:13 crc kubenswrapper[4743]: I1011 02:13:13.888330 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f46b79456-dm9d6" event={"ID":"36f566d2-9c6b-4bc3-a1a3-47a11e6eee45","Type":"ContainerStarted","Data":"8e2c5d6dbbecef9473555c2b56ea011d86b75ffedd0d3be0460a50f26398147e"} Oct 11 02:13:13 crc kubenswrapper[4743]: I1011 02:13:13.889957 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c58f083e-feb9-4f5c-961d-f61af07d794a","Type":"ContainerStarted","Data":"286bf0def7f9b24806f0126af365526d89817a60633bb16c7c9e1667946a9b48"} Oct 11 02:13:13 crc kubenswrapper[4743]: I1011 02:13:13.890292 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="c58f083e-feb9-4f5c-961d-f61af07d794a" containerName="glance-log" containerID="cri-o://8654aa9afa4d26fdd386a0b3c789b262c909c546dc573b53c67f3ba0409e4775" gracePeriod=30 Oct 11 02:13:13 crc kubenswrapper[4743]: I1011 02:13:13.890797 4743 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/glance-default-internal-api-0" podUID="c58f083e-feb9-4f5c-961d-f61af07d794a" containerName="glance-httpd" containerID="cri-o://286bf0def7f9b24806f0126af365526d89817a60633bb16c7c9e1667946a9b48" gracePeriod=30 Oct 11 02:13:13 crc kubenswrapper[4743]: I1011 02:13:13.896340 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 11 02:13:13 crc kubenswrapper[4743]: I1011 02:13:13.896390 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5bc9759f8-b6qgh" event={"ID":"d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd","Type":"ContainerStarted","Data":"b000cfaab42b0bf540f76658399b367b4b7a3dd49d79845db87bebcddd1b7ff5"} Oct 11 02:13:13 crc kubenswrapper[4743]: I1011 02:13:13.918921 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=4.348672197 podStartE2EDuration="5.918902468s" podCreationTimestamp="2025-10-11 02:13:08 +0000 UTC" firstStartedPulling="2025-10-11 02:13:11.267378489 +0000 UTC m=+4885.920358886" lastFinishedPulling="2025-10-11 02:13:12.83760876 +0000 UTC m=+4887.490589157" observedRunningTime="2025-10-11 02:13:13.901239881 +0000 UTC m=+4888.554220278" watchObservedRunningTime="2025-10-11 02:13:13.918902468 +0000 UTC m=+4888.571882855" Oct 11 02:13:13 crc kubenswrapper[4743]: I1011 02:13:13.938988 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.938967716 podStartE2EDuration="4.938967716s" podCreationTimestamp="2025-10-11 02:13:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 02:13:13.930565168 +0000 UTC m=+4888.583545565" watchObservedRunningTime="2025-10-11 02:13:13.938967716 +0000 UTC m=+4888.591948113" Oct 11 02:13:13 crc kubenswrapper[4743]: I1011 02:13:13.950237 4743 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0" Oct 11 02:13:13 crc kubenswrapper[4743]: I1011 02:13:13.982294 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Oct 11 02:13:14 crc kubenswrapper[4743]: I1011 02:13:14.031622 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 11 02:13:14 crc kubenswrapper[4743]: I1011 02:13:14.043410 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 11 02:13:14 crc kubenswrapper[4743]: I1011 02:13:14.055310 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 11 02:13:14 crc kubenswrapper[4743]: E1011 02:13:14.055790 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0f86bb5-1ce9-44d0-a03a-ef624592cdc4" containerName="mariadb-database-create" Oct 11 02:13:14 crc kubenswrapper[4743]: I1011 02:13:14.055802 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0f86bb5-1ce9-44d0-a03a-ef624592cdc4" containerName="mariadb-database-create" Oct 11 02:13:14 crc kubenswrapper[4743]: I1011 02:13:14.056155 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0f86bb5-1ce9-44d0-a03a-ef624592cdc4" containerName="mariadb-database-create" Oct 11 02:13:14 crc kubenswrapper[4743]: I1011 02:13:14.057443 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 11 02:13:14 crc kubenswrapper[4743]: I1011 02:13:14.064215 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 11 02:13:14 crc kubenswrapper[4743]: I1011 02:13:14.064434 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 11 02:13:14 crc kubenswrapper[4743]: I1011 02:13:14.080939 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 11 02:13:14 crc kubenswrapper[4743]: I1011 02:13:14.112017 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e55c62fb-39e0-41ac-8f79-c82c962b035f" path="/var/lib/kubelet/pods/e55c62fb-39e0-41ac-8f79-c82c962b035f/volumes" Oct 11 02:13:14 crc kubenswrapper[4743]: I1011 02:13:14.246412 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"2e1cee17-cf14-4bf2-bda0-2f651412f042\") " pod="openstack/glance-default-external-api-0" Oct 11 02:13:14 crc kubenswrapper[4743]: I1011 02:13:14.246460 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e1cee17-cf14-4bf2-bda0-2f651412f042-logs\") pod \"glance-default-external-api-0\" (UID: \"2e1cee17-cf14-4bf2-bda0-2f651412f042\") " pod="openstack/glance-default-external-api-0" Oct 11 02:13:14 crc kubenswrapper[4743]: I1011 02:13:14.246503 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e1cee17-cf14-4bf2-bda0-2f651412f042-scripts\") pod \"glance-default-external-api-0\" (UID: \"2e1cee17-cf14-4bf2-bda0-2f651412f042\") " 
pod="openstack/glance-default-external-api-0" Oct 11 02:13:14 crc kubenswrapper[4743]: I1011 02:13:14.246554 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfswj\" (UniqueName: \"kubernetes.io/projected/2e1cee17-cf14-4bf2-bda0-2f651412f042-kube-api-access-nfswj\") pod \"glance-default-external-api-0\" (UID: \"2e1cee17-cf14-4bf2-bda0-2f651412f042\") " pod="openstack/glance-default-external-api-0" Oct 11 02:13:14 crc kubenswrapper[4743]: I1011 02:13:14.246611 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2e1cee17-cf14-4bf2-bda0-2f651412f042-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2e1cee17-cf14-4bf2-bda0-2f651412f042\") " pod="openstack/glance-default-external-api-0" Oct 11 02:13:14 crc kubenswrapper[4743]: I1011 02:13:14.246694 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e1cee17-cf14-4bf2-bda0-2f651412f042-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2e1cee17-cf14-4bf2-bda0-2f651412f042\") " pod="openstack/glance-default-external-api-0" Oct 11 02:13:14 crc kubenswrapper[4743]: I1011 02:13:14.246718 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e1cee17-cf14-4bf2-bda0-2f651412f042-config-data\") pod \"glance-default-external-api-0\" (UID: \"2e1cee17-cf14-4bf2-bda0-2f651412f042\") " pod="openstack/glance-default-external-api-0" Oct 11 02:13:14 crc kubenswrapper[4743]: I1011 02:13:14.246743 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e1cee17-cf14-4bf2-bda0-2f651412f042-combined-ca-bundle\") pod 
\"glance-default-external-api-0\" (UID: \"2e1cee17-cf14-4bf2-bda0-2f651412f042\") " pod="openstack/glance-default-external-api-0" Oct 11 02:13:14 crc kubenswrapper[4743]: I1011 02:13:14.246762 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2e1cee17-cf14-4bf2-bda0-2f651412f042-ceph\") pod \"glance-default-external-api-0\" (UID: \"2e1cee17-cf14-4bf2-bda0-2f651412f042\") " pod="openstack/glance-default-external-api-0" Oct 11 02:13:14 crc kubenswrapper[4743]: I1011 02:13:14.351217 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e1cee17-cf14-4bf2-bda0-2f651412f042-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2e1cee17-cf14-4bf2-bda0-2f651412f042\") " pod="openstack/glance-default-external-api-0" Oct 11 02:13:14 crc kubenswrapper[4743]: I1011 02:13:14.352666 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e1cee17-cf14-4bf2-bda0-2f651412f042-config-data\") pod \"glance-default-external-api-0\" (UID: \"2e1cee17-cf14-4bf2-bda0-2f651412f042\") " pod="openstack/glance-default-external-api-0" Oct 11 02:13:14 crc kubenswrapper[4743]: I1011 02:13:14.352777 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e1cee17-cf14-4bf2-bda0-2f651412f042-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2e1cee17-cf14-4bf2-bda0-2f651412f042\") " pod="openstack/glance-default-external-api-0" Oct 11 02:13:14 crc kubenswrapper[4743]: I1011 02:13:14.358528 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2e1cee17-cf14-4bf2-bda0-2f651412f042-ceph\") pod \"glance-default-external-api-0\" (UID: 
\"2e1cee17-cf14-4bf2-bda0-2f651412f042\") " pod="openstack/glance-default-external-api-0" Oct 11 02:13:14 crc kubenswrapper[4743]: I1011 02:13:14.358720 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"2e1cee17-cf14-4bf2-bda0-2f651412f042\") " pod="openstack/glance-default-external-api-0" Oct 11 02:13:14 crc kubenswrapper[4743]: I1011 02:13:14.358752 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e1cee17-cf14-4bf2-bda0-2f651412f042-logs\") pod \"glance-default-external-api-0\" (UID: \"2e1cee17-cf14-4bf2-bda0-2f651412f042\") " pod="openstack/glance-default-external-api-0" Oct 11 02:13:14 crc kubenswrapper[4743]: I1011 02:13:14.358810 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e1cee17-cf14-4bf2-bda0-2f651412f042-scripts\") pod \"glance-default-external-api-0\" (UID: \"2e1cee17-cf14-4bf2-bda0-2f651412f042\") " pod="openstack/glance-default-external-api-0" Oct 11 02:13:14 crc kubenswrapper[4743]: I1011 02:13:14.358929 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfswj\" (UniqueName: \"kubernetes.io/projected/2e1cee17-cf14-4bf2-bda0-2f651412f042-kube-api-access-nfswj\") pod \"glance-default-external-api-0\" (UID: \"2e1cee17-cf14-4bf2-bda0-2f651412f042\") " pod="openstack/glance-default-external-api-0" Oct 11 02:13:14 crc kubenswrapper[4743]: I1011 02:13:14.359077 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2e1cee17-cf14-4bf2-bda0-2f651412f042-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2e1cee17-cf14-4bf2-bda0-2f651412f042\") " pod="openstack/glance-default-external-api-0" Oct 11 
02:13:14 crc kubenswrapper[4743]: I1011 02:13:14.359682 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2e1cee17-cf14-4bf2-bda0-2f651412f042-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2e1cee17-cf14-4bf2-bda0-2f651412f042\") " pod="openstack/glance-default-external-api-0" Oct 11 02:13:14 crc kubenswrapper[4743]: I1011 02:13:14.360150 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e1cee17-cf14-4bf2-bda0-2f651412f042-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2e1cee17-cf14-4bf2-bda0-2f651412f042\") " pod="openstack/glance-default-external-api-0" Oct 11 02:13:14 crc kubenswrapper[4743]: I1011 02:13:14.360588 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e1cee17-cf14-4bf2-bda0-2f651412f042-logs\") pod \"glance-default-external-api-0\" (UID: \"2e1cee17-cf14-4bf2-bda0-2f651412f042\") " pod="openstack/glance-default-external-api-0" Oct 11 02:13:14 crc kubenswrapper[4743]: I1011 02:13:14.360873 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e1cee17-cf14-4bf2-bda0-2f651412f042-config-data\") pod \"glance-default-external-api-0\" (UID: \"2e1cee17-cf14-4bf2-bda0-2f651412f042\") " pod="openstack/glance-default-external-api-0" Oct 11 02:13:14 crc kubenswrapper[4743]: I1011 02:13:14.361013 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"2e1cee17-cf14-4bf2-bda0-2f651412f042\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-external-api-0" Oct 11 02:13:14 crc kubenswrapper[4743]: I1011 02:13:14.365737 4743 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e1cee17-cf14-4bf2-bda0-2f651412f042-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2e1cee17-cf14-4bf2-bda0-2f651412f042\") " pod="openstack/glance-default-external-api-0" Oct 11 02:13:14 crc kubenswrapper[4743]: I1011 02:13:14.367914 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e1cee17-cf14-4bf2-bda0-2f651412f042-scripts\") pod \"glance-default-external-api-0\" (UID: \"2e1cee17-cf14-4bf2-bda0-2f651412f042\") " pod="openstack/glance-default-external-api-0" Oct 11 02:13:14 crc kubenswrapper[4743]: I1011 02:13:14.369932 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2e1cee17-cf14-4bf2-bda0-2f651412f042-ceph\") pod \"glance-default-external-api-0\" (UID: \"2e1cee17-cf14-4bf2-bda0-2f651412f042\") " pod="openstack/glance-default-external-api-0" Oct 11 02:13:14 crc kubenswrapper[4743]: I1011 02:13:14.410217 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfswj\" (UniqueName: \"kubernetes.io/projected/2e1cee17-cf14-4bf2-bda0-2f651412f042-kube-api-access-nfswj\") pod \"glance-default-external-api-0\" (UID: \"2e1cee17-cf14-4bf2-bda0-2f651412f042\") " pod="openstack/glance-default-external-api-0" Oct 11 02:13:14 crc kubenswrapper[4743]: I1011 02:13:14.414950 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"2e1cee17-cf14-4bf2-bda0-2f651412f042\") " pod="openstack/glance-default-external-api-0" Oct 11 02:13:14 crc kubenswrapper[4743]: I1011 02:13:14.710875 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 11 02:13:14 crc kubenswrapper[4743]: I1011 02:13:14.727683 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 11 02:13:14 crc kubenswrapper[4743]: I1011 02:13:14.875526 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"c58f083e-feb9-4f5c-961d-f61af07d794a\" (UID: \"c58f083e-feb9-4f5c-961d-f61af07d794a\") " Oct 11 02:13:14 crc kubenswrapper[4743]: I1011 02:13:14.875596 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ksw5q\" (UniqueName: \"kubernetes.io/projected/c58f083e-feb9-4f5c-961d-f61af07d794a-kube-api-access-ksw5q\") pod \"c58f083e-feb9-4f5c-961d-f61af07d794a\" (UID: \"c58f083e-feb9-4f5c-961d-f61af07d794a\") " Oct 11 02:13:14 crc kubenswrapper[4743]: I1011 02:13:14.875663 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c58f083e-feb9-4f5c-961d-f61af07d794a-internal-tls-certs\") pod \"c58f083e-feb9-4f5c-961d-f61af07d794a\" (UID: \"c58f083e-feb9-4f5c-961d-f61af07d794a\") " Oct 11 02:13:14 crc kubenswrapper[4743]: I1011 02:13:14.875785 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c58f083e-feb9-4f5c-961d-f61af07d794a-config-data\") pod \"c58f083e-feb9-4f5c-961d-f61af07d794a\" (UID: \"c58f083e-feb9-4f5c-961d-f61af07d794a\") " Oct 11 02:13:14 crc kubenswrapper[4743]: I1011 02:13:14.875844 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c58f083e-feb9-4f5c-961d-f61af07d794a-httpd-run\") pod \"c58f083e-feb9-4f5c-961d-f61af07d794a\" (UID: \"c58f083e-feb9-4f5c-961d-f61af07d794a\") " Oct 
11 02:13:14 crc kubenswrapper[4743]: I1011 02:13:14.875895 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c58f083e-feb9-4f5c-961d-f61af07d794a-scripts\") pod \"c58f083e-feb9-4f5c-961d-f61af07d794a\" (UID: \"c58f083e-feb9-4f5c-961d-f61af07d794a\") " Oct 11 02:13:14 crc kubenswrapper[4743]: I1011 02:13:14.875989 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c58f083e-feb9-4f5c-961d-f61af07d794a-combined-ca-bundle\") pod \"c58f083e-feb9-4f5c-961d-f61af07d794a\" (UID: \"c58f083e-feb9-4f5c-961d-f61af07d794a\") " Oct 11 02:13:14 crc kubenswrapper[4743]: I1011 02:13:14.876016 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c58f083e-feb9-4f5c-961d-f61af07d794a-logs\") pod \"c58f083e-feb9-4f5c-961d-f61af07d794a\" (UID: \"c58f083e-feb9-4f5c-961d-f61af07d794a\") " Oct 11 02:13:14 crc kubenswrapper[4743]: I1011 02:13:14.876032 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c58f083e-feb9-4f5c-961d-f61af07d794a-ceph\") pod \"c58f083e-feb9-4f5c-961d-f61af07d794a\" (UID: \"c58f083e-feb9-4f5c-961d-f61af07d794a\") " Oct 11 02:13:14 crc kubenswrapper[4743]: I1011 02:13:14.878405 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c58f083e-feb9-4f5c-961d-f61af07d794a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c58f083e-feb9-4f5c-961d-f61af07d794a" (UID: "c58f083e-feb9-4f5c-961d-f61af07d794a"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 02:13:14 crc kubenswrapper[4743]: I1011 02:13:14.878594 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c58f083e-feb9-4f5c-961d-f61af07d794a-logs" (OuterVolumeSpecName: "logs") pod "c58f083e-feb9-4f5c-961d-f61af07d794a" (UID: "c58f083e-feb9-4f5c-961d-f61af07d794a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 02:13:14 crc kubenswrapper[4743]: I1011 02:13:14.883188 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "c58f083e-feb9-4f5c-961d-f61af07d794a" (UID: "c58f083e-feb9-4f5c-961d-f61af07d794a"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 11 02:13:14 crc kubenswrapper[4743]: I1011 02:13:14.885598 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c58f083e-feb9-4f5c-961d-f61af07d794a-scripts" (OuterVolumeSpecName: "scripts") pod "c58f083e-feb9-4f5c-961d-f61af07d794a" (UID: "c58f083e-feb9-4f5c-961d-f61af07d794a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 02:13:14 crc kubenswrapper[4743]: I1011 02:13:14.891539 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c58f083e-feb9-4f5c-961d-f61af07d794a-ceph" (OuterVolumeSpecName: "ceph") pod "c58f083e-feb9-4f5c-961d-f61af07d794a" (UID: "c58f083e-feb9-4f5c-961d-f61af07d794a"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 02:13:14 crc kubenswrapper[4743]: I1011 02:13:14.902151 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c58f083e-feb9-4f5c-961d-f61af07d794a-kube-api-access-ksw5q" (OuterVolumeSpecName: "kube-api-access-ksw5q") pod "c58f083e-feb9-4f5c-961d-f61af07d794a" (UID: "c58f083e-feb9-4f5c-961d-f61af07d794a"). InnerVolumeSpecName "kube-api-access-ksw5q". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 02:13:14 crc kubenswrapper[4743]: I1011 02:13:14.944560 4743 generic.go:334] "Generic (PLEG): container finished" podID="c58f083e-feb9-4f5c-961d-f61af07d794a" containerID="286bf0def7f9b24806f0126af365526d89817a60633bb16c7c9e1667946a9b48" exitCode=0 Oct 11 02:13:14 crc kubenswrapper[4743]: I1011 02:13:14.944758 4743 generic.go:334] "Generic (PLEG): container finished" podID="c58f083e-feb9-4f5c-961d-f61af07d794a" containerID="8654aa9afa4d26fdd386a0b3c789b262c909c546dc573b53c67f3ba0409e4775" exitCode=143 Oct 11 02:13:14 crc kubenswrapper[4743]: I1011 02:13:14.945830 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 11 02:13:14 crc kubenswrapper[4743]: I1011 02:13:14.947620 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c58f083e-feb9-4f5c-961d-f61af07d794a","Type":"ContainerDied","Data":"286bf0def7f9b24806f0126af365526d89817a60633bb16c7c9e1667946a9b48"} Oct 11 02:13:14 crc kubenswrapper[4743]: I1011 02:13:14.947753 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c58f083e-feb9-4f5c-961d-f61af07d794a","Type":"ContainerDied","Data":"8654aa9afa4d26fdd386a0b3c789b262c909c546dc573b53c67f3ba0409e4775"} Oct 11 02:13:14 crc kubenswrapper[4743]: I1011 02:13:14.947812 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c58f083e-feb9-4f5c-961d-f61af07d794a","Type":"ContainerDied","Data":"2786b7ed75bb6eb011ca8a6d2b7e81ce11ab85b8f751e0b2b063cb8d49e22ed1"} Oct 11 02:13:14 crc kubenswrapper[4743]: I1011 02:13:14.952942 4743 scope.go:117] "RemoveContainer" containerID="286bf0def7f9b24806f0126af365526d89817a60633bb16c7c9e1667946a9b48" Oct 11 02:13:14 crc kubenswrapper[4743]: I1011 02:13:14.955053 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c58f083e-feb9-4f5c-961d-f61af07d794a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c58f083e-feb9-4f5c-961d-f61af07d794a" (UID: "c58f083e-feb9-4f5c-961d-f61af07d794a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 02:13:14 crc kubenswrapper[4743]: I1011 02:13:14.978778 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c58f083e-feb9-4f5c-961d-f61af07d794a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 02:13:14 crc kubenswrapper[4743]: I1011 02:13:14.978805 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c58f083e-feb9-4f5c-961d-f61af07d794a-logs\") on node \"crc\" DevicePath \"\"" Oct 11 02:13:14 crc kubenswrapper[4743]: I1011 02:13:14.978836 4743 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c58f083e-feb9-4f5c-961d-f61af07d794a-ceph\") on node \"crc\" DevicePath \"\"" Oct 11 02:13:14 crc kubenswrapper[4743]: I1011 02:13:14.979011 4743 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Oct 11 02:13:14 crc kubenswrapper[4743]: I1011 02:13:14.979031 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ksw5q\" (UniqueName: \"kubernetes.io/projected/c58f083e-feb9-4f5c-961d-f61af07d794a-kube-api-access-ksw5q\") on node \"crc\" DevicePath \"\"" Oct 11 02:13:14 crc kubenswrapper[4743]: I1011 02:13:14.979042 4743 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c58f083e-feb9-4f5c-961d-f61af07d794a-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 11 02:13:14 crc kubenswrapper[4743]: I1011 02:13:14.979053 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c58f083e-feb9-4f5c-961d-f61af07d794a-scripts\") on node \"crc\" DevicePath \"\"" Oct 11 02:13:14 crc kubenswrapper[4743]: I1011 02:13:14.986717 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/secret/c58f083e-feb9-4f5c-961d-f61af07d794a-config-data" (OuterVolumeSpecName: "config-data") pod "c58f083e-feb9-4f5c-961d-f61af07d794a" (UID: "c58f083e-feb9-4f5c-961d-f61af07d794a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 02:13:15 crc kubenswrapper[4743]: I1011 02:13:15.017554 4743 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Oct 11 02:13:15 crc kubenswrapper[4743]: I1011 02:13:15.054983 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c58f083e-feb9-4f5c-961d-f61af07d794a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c58f083e-feb9-4f5c-961d-f61af07d794a" (UID: "c58f083e-feb9-4f5c-961d-f61af07d794a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 02:13:15 crc kubenswrapper[4743]: I1011 02:13:15.084434 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c58f083e-feb9-4f5c-961d-f61af07d794a-config-data\") on node \"crc\" DevicePath \"\"" Oct 11 02:13:15 crc kubenswrapper[4743]: I1011 02:13:15.084701 4743 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Oct 11 02:13:15 crc kubenswrapper[4743]: I1011 02:13:15.084712 4743 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c58f083e-feb9-4f5c-961d-f61af07d794a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 11 02:13:15 crc kubenswrapper[4743]: I1011 02:13:15.294640 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 11 02:13:15 crc kubenswrapper[4743]: I1011 02:13:15.315944 4743 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 11 02:13:15 crc kubenswrapper[4743]: I1011 02:13:15.330581 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 11 02:13:15 crc kubenswrapper[4743]: E1011 02:13:15.332063 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c58f083e-feb9-4f5c-961d-f61af07d794a" containerName="glance-log" Oct 11 02:13:15 crc kubenswrapper[4743]: I1011 02:13:15.332085 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="c58f083e-feb9-4f5c-961d-f61af07d794a" containerName="glance-log" Oct 11 02:13:15 crc kubenswrapper[4743]: E1011 02:13:15.332123 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c58f083e-feb9-4f5c-961d-f61af07d794a" containerName="glance-httpd" Oct 11 02:13:15 crc kubenswrapper[4743]: I1011 02:13:15.332131 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="c58f083e-feb9-4f5c-961d-f61af07d794a" containerName="glance-httpd" Oct 11 02:13:15 crc kubenswrapper[4743]: I1011 02:13:15.332445 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="c58f083e-feb9-4f5c-961d-f61af07d794a" containerName="glance-log" Oct 11 02:13:15 crc kubenswrapper[4743]: I1011 02:13:15.332463 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="c58f083e-feb9-4f5c-961d-f61af07d794a" containerName="glance-httpd" Oct 11 02:13:15 crc kubenswrapper[4743]: I1011 02:13:15.332815 4743 scope.go:117] "RemoveContainer" containerID="8654aa9afa4d26fdd386a0b3c789b262c909c546dc573b53c67f3ba0409e4775" Oct 11 02:13:15 crc kubenswrapper[4743]: I1011 02:13:15.338587 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 11 02:13:15 crc kubenswrapper[4743]: I1011 02:13:15.345722 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 11 02:13:15 crc kubenswrapper[4743]: I1011 02:13:15.346532 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 11 02:13:15 crc kubenswrapper[4743]: I1011 02:13:15.360797 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 11 02:13:15 crc kubenswrapper[4743]: I1011 02:13:15.375613 4743 scope.go:117] "RemoveContainer" containerID="286bf0def7f9b24806f0126af365526d89817a60633bb16c7c9e1667946a9b48" Oct 11 02:13:15 crc kubenswrapper[4743]: E1011 02:13:15.376243 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"286bf0def7f9b24806f0126af365526d89817a60633bb16c7c9e1667946a9b48\": container with ID starting with 286bf0def7f9b24806f0126af365526d89817a60633bb16c7c9e1667946a9b48 not found: ID does not exist" containerID="286bf0def7f9b24806f0126af365526d89817a60633bb16c7c9e1667946a9b48" Oct 11 02:13:15 crc kubenswrapper[4743]: I1011 02:13:15.376270 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"286bf0def7f9b24806f0126af365526d89817a60633bb16c7c9e1667946a9b48"} err="failed to get container status \"286bf0def7f9b24806f0126af365526d89817a60633bb16c7c9e1667946a9b48\": rpc error: code = NotFound desc = could not find container \"286bf0def7f9b24806f0126af365526d89817a60633bb16c7c9e1667946a9b48\": container with ID starting with 286bf0def7f9b24806f0126af365526d89817a60633bb16c7c9e1667946a9b48 not found: ID does not exist" Oct 11 02:13:15 crc kubenswrapper[4743]: I1011 02:13:15.376288 4743 scope.go:117] "RemoveContainer" 
containerID="8654aa9afa4d26fdd386a0b3c789b262c909c546dc573b53c67f3ba0409e4775" Oct 11 02:13:15 crc kubenswrapper[4743]: E1011 02:13:15.376550 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8654aa9afa4d26fdd386a0b3c789b262c909c546dc573b53c67f3ba0409e4775\": container with ID starting with 8654aa9afa4d26fdd386a0b3c789b262c909c546dc573b53c67f3ba0409e4775 not found: ID does not exist" containerID="8654aa9afa4d26fdd386a0b3c789b262c909c546dc573b53c67f3ba0409e4775" Oct 11 02:13:15 crc kubenswrapper[4743]: I1011 02:13:15.376567 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8654aa9afa4d26fdd386a0b3c789b262c909c546dc573b53c67f3ba0409e4775"} err="failed to get container status \"8654aa9afa4d26fdd386a0b3c789b262c909c546dc573b53c67f3ba0409e4775\": rpc error: code = NotFound desc = could not find container \"8654aa9afa4d26fdd386a0b3c789b262c909c546dc573b53c67f3ba0409e4775\": container with ID starting with 8654aa9afa4d26fdd386a0b3c789b262c909c546dc573b53c67f3ba0409e4775 not found: ID does not exist" Oct 11 02:13:15 crc kubenswrapper[4743]: I1011 02:13:15.376578 4743 scope.go:117] "RemoveContainer" containerID="286bf0def7f9b24806f0126af365526d89817a60633bb16c7c9e1667946a9b48" Oct 11 02:13:15 crc kubenswrapper[4743]: I1011 02:13:15.376839 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"286bf0def7f9b24806f0126af365526d89817a60633bb16c7c9e1667946a9b48"} err="failed to get container status \"286bf0def7f9b24806f0126af365526d89817a60633bb16c7c9e1667946a9b48\": rpc error: code = NotFound desc = could not find container \"286bf0def7f9b24806f0126af365526d89817a60633bb16c7c9e1667946a9b48\": container with ID starting with 286bf0def7f9b24806f0126af365526d89817a60633bb16c7c9e1667946a9b48 not found: ID does not exist" Oct 11 02:13:15 crc kubenswrapper[4743]: I1011 02:13:15.376883 4743 scope.go:117] 
"RemoveContainer" containerID="8654aa9afa4d26fdd386a0b3c789b262c909c546dc573b53c67f3ba0409e4775" Oct 11 02:13:15 crc kubenswrapper[4743]: I1011 02:13:15.377230 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8654aa9afa4d26fdd386a0b3c789b262c909c546dc573b53c67f3ba0409e4775"} err="failed to get container status \"8654aa9afa4d26fdd386a0b3c789b262c909c546dc573b53c67f3ba0409e4775\": rpc error: code = NotFound desc = could not find container \"8654aa9afa4d26fdd386a0b3c789b262c909c546dc573b53c67f3ba0409e4775\": container with ID starting with 8654aa9afa4d26fdd386a0b3c789b262c909c546dc573b53c67f3ba0409e4775 not found: ID does not exist" Oct 11 02:13:15 crc kubenswrapper[4743]: I1011 02:13:15.504549 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5ae3fcd-4a6d-42e5-9ab1-69aae9e8f964-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d5ae3fcd-4a6d-42e5-9ab1-69aae9e8f964\") " pod="openstack/glance-default-internal-api-0" Oct 11 02:13:15 crc kubenswrapper[4743]: I1011 02:13:15.504842 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5ae3fcd-4a6d-42e5-9ab1-69aae9e8f964-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d5ae3fcd-4a6d-42e5-9ab1-69aae9e8f964\") " pod="openstack/glance-default-internal-api-0" Oct 11 02:13:15 crc kubenswrapper[4743]: I1011 02:13:15.504928 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5ae3fcd-4a6d-42e5-9ab1-69aae9e8f964-logs\") pod \"glance-default-internal-api-0\" (UID: \"d5ae3fcd-4a6d-42e5-9ab1-69aae9e8f964\") " pod="openstack/glance-default-internal-api-0" Oct 11 02:13:15 crc kubenswrapper[4743]: I1011 02:13:15.504947 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d5ae3fcd-4a6d-42e5-9ab1-69aae9e8f964-ceph\") pod \"glance-default-internal-api-0\" (UID: \"d5ae3fcd-4a6d-42e5-9ab1-69aae9e8f964\") " pod="openstack/glance-default-internal-api-0" Oct 11 02:13:15 crc kubenswrapper[4743]: I1011 02:13:15.505116 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d5ae3fcd-4a6d-42e5-9ab1-69aae9e8f964-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d5ae3fcd-4a6d-42e5-9ab1-69aae9e8f964\") " pod="openstack/glance-default-internal-api-0" Oct 11 02:13:15 crc kubenswrapper[4743]: I1011 02:13:15.505278 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrrnf\" (UniqueName: \"kubernetes.io/projected/d5ae3fcd-4a6d-42e5-9ab1-69aae9e8f964-kube-api-access-rrrnf\") pod \"glance-default-internal-api-0\" (UID: \"d5ae3fcd-4a6d-42e5-9ab1-69aae9e8f964\") " pod="openstack/glance-default-internal-api-0" Oct 11 02:13:15 crc kubenswrapper[4743]: I1011 02:13:15.505346 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5ae3fcd-4a6d-42e5-9ab1-69aae9e8f964-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d5ae3fcd-4a6d-42e5-9ab1-69aae9e8f964\") " pod="openstack/glance-default-internal-api-0" Oct 11 02:13:15 crc kubenswrapper[4743]: I1011 02:13:15.505418 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"d5ae3fcd-4a6d-42e5-9ab1-69aae9e8f964\") " pod="openstack/glance-default-internal-api-0" Oct 11 02:13:15 crc kubenswrapper[4743]: I1011 02:13:15.505600 4743 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5ae3fcd-4a6d-42e5-9ab1-69aae9e8f964-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d5ae3fcd-4a6d-42e5-9ab1-69aae9e8f964\") " pod="openstack/glance-default-internal-api-0" Oct 11 02:13:15 crc kubenswrapper[4743]: I1011 02:13:15.607434 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5ae3fcd-4a6d-42e5-9ab1-69aae9e8f964-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d5ae3fcd-4a6d-42e5-9ab1-69aae9e8f964\") " pod="openstack/glance-default-internal-api-0" Oct 11 02:13:15 crc kubenswrapper[4743]: I1011 02:13:15.607519 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5ae3fcd-4a6d-42e5-9ab1-69aae9e8f964-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d5ae3fcd-4a6d-42e5-9ab1-69aae9e8f964\") " pod="openstack/glance-default-internal-api-0" Oct 11 02:13:15 crc kubenswrapper[4743]: I1011 02:13:15.607547 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5ae3fcd-4a6d-42e5-9ab1-69aae9e8f964-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d5ae3fcd-4a6d-42e5-9ab1-69aae9e8f964\") " pod="openstack/glance-default-internal-api-0" Oct 11 02:13:15 crc kubenswrapper[4743]: I1011 02:13:15.607605 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5ae3fcd-4a6d-42e5-9ab1-69aae9e8f964-logs\") pod \"glance-default-internal-api-0\" (UID: \"d5ae3fcd-4a6d-42e5-9ab1-69aae9e8f964\") " pod="openstack/glance-default-internal-api-0" Oct 11 02:13:15 crc kubenswrapper[4743]: I1011 02:13:15.607627 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ceph\" (UniqueName: \"kubernetes.io/projected/d5ae3fcd-4a6d-42e5-9ab1-69aae9e8f964-ceph\") pod \"glance-default-internal-api-0\" (UID: \"d5ae3fcd-4a6d-42e5-9ab1-69aae9e8f964\") " pod="openstack/glance-default-internal-api-0" Oct 11 02:13:15 crc kubenswrapper[4743]: I1011 02:13:15.607675 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d5ae3fcd-4a6d-42e5-9ab1-69aae9e8f964-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d5ae3fcd-4a6d-42e5-9ab1-69aae9e8f964\") " pod="openstack/glance-default-internal-api-0" Oct 11 02:13:15 crc kubenswrapper[4743]: I1011 02:13:15.607702 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrrnf\" (UniqueName: \"kubernetes.io/projected/d5ae3fcd-4a6d-42e5-9ab1-69aae9e8f964-kube-api-access-rrrnf\") pod \"glance-default-internal-api-0\" (UID: \"d5ae3fcd-4a6d-42e5-9ab1-69aae9e8f964\") " pod="openstack/glance-default-internal-api-0" Oct 11 02:13:15 crc kubenswrapper[4743]: I1011 02:13:15.607722 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5ae3fcd-4a6d-42e5-9ab1-69aae9e8f964-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d5ae3fcd-4a6d-42e5-9ab1-69aae9e8f964\") " pod="openstack/glance-default-internal-api-0" Oct 11 02:13:15 crc kubenswrapper[4743]: I1011 02:13:15.607751 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"d5ae3fcd-4a6d-42e5-9ab1-69aae9e8f964\") " pod="openstack/glance-default-internal-api-0" Oct 11 02:13:15 crc kubenswrapper[4743]: I1011 02:13:15.608126 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"d5ae3fcd-4a6d-42e5-9ab1-69aae9e8f964\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-internal-api-0" Oct 11 02:13:15 crc kubenswrapper[4743]: I1011 02:13:15.608241 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d5ae3fcd-4a6d-42e5-9ab1-69aae9e8f964-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d5ae3fcd-4a6d-42e5-9ab1-69aae9e8f964\") " pod="openstack/glance-default-internal-api-0" Oct 11 02:13:15 crc kubenswrapper[4743]: I1011 02:13:15.608346 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5ae3fcd-4a6d-42e5-9ab1-69aae9e8f964-logs\") pod \"glance-default-internal-api-0\" (UID: \"d5ae3fcd-4a6d-42e5-9ab1-69aae9e8f964\") " pod="openstack/glance-default-internal-api-0" Oct 11 02:13:15 crc kubenswrapper[4743]: I1011 02:13:15.615488 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5ae3fcd-4a6d-42e5-9ab1-69aae9e8f964-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d5ae3fcd-4a6d-42e5-9ab1-69aae9e8f964\") " pod="openstack/glance-default-internal-api-0" Oct 11 02:13:15 crc kubenswrapper[4743]: I1011 02:13:15.616659 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5ae3fcd-4a6d-42e5-9ab1-69aae9e8f964-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d5ae3fcd-4a6d-42e5-9ab1-69aae9e8f964\") " pod="openstack/glance-default-internal-api-0" Oct 11 02:13:15 crc kubenswrapper[4743]: I1011 02:13:15.618471 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5ae3fcd-4a6d-42e5-9ab1-69aae9e8f964-scripts\") pod \"glance-default-internal-api-0\" 
(UID: \"d5ae3fcd-4a6d-42e5-9ab1-69aae9e8f964\") " pod="openstack/glance-default-internal-api-0" Oct 11 02:13:15 crc kubenswrapper[4743]: I1011 02:13:15.619515 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d5ae3fcd-4a6d-42e5-9ab1-69aae9e8f964-ceph\") pod \"glance-default-internal-api-0\" (UID: \"d5ae3fcd-4a6d-42e5-9ab1-69aae9e8f964\") " pod="openstack/glance-default-internal-api-0" Oct 11 02:13:15 crc kubenswrapper[4743]: I1011 02:13:15.626134 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5ae3fcd-4a6d-42e5-9ab1-69aae9e8f964-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d5ae3fcd-4a6d-42e5-9ab1-69aae9e8f964\") " pod="openstack/glance-default-internal-api-0" Oct 11 02:13:15 crc kubenswrapper[4743]: I1011 02:13:15.632343 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 11 02:13:15 crc kubenswrapper[4743]: I1011 02:13:15.649593 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrrnf\" (UniqueName: \"kubernetes.io/projected/d5ae3fcd-4a6d-42e5-9ab1-69aae9e8f964-kube-api-access-rrrnf\") pod \"glance-default-internal-api-0\" (UID: \"d5ae3fcd-4a6d-42e5-9ab1-69aae9e8f964\") " pod="openstack/glance-default-internal-api-0" Oct 11 02:13:15 crc kubenswrapper[4743]: W1011 02:13:15.690231 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e1cee17_cf14_4bf2_bda0_2f651412f042.slice/crio-db519a16aec5a515e21b1c80aa70fe9bebc05cb488fd5cd1791d9699d297df04 WatchSource:0}: Error finding container db519a16aec5a515e21b1c80aa70fe9bebc05cb488fd5cd1791d9699d297df04: Status 404 returned error can't find the container with id db519a16aec5a515e21b1c80aa70fe9bebc05cb488fd5cd1791d9699d297df04 Oct 11 02:13:15 crc kubenswrapper[4743]: I1011 
02:13:15.721194 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"d5ae3fcd-4a6d-42e5-9ab1-69aae9e8f964\") " pod="openstack/glance-default-internal-api-0" Oct 11 02:13:15 crc kubenswrapper[4743]: I1011 02:13:15.963788 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 11 02:13:15 crc kubenswrapper[4743]: I1011 02:13:15.985127 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2e1cee17-cf14-4bf2-bda0-2f651412f042","Type":"ContainerStarted","Data":"db519a16aec5a515e21b1c80aa70fe9bebc05cb488fd5cd1791d9699d297df04"} Oct 11 02:13:16 crc kubenswrapper[4743]: I1011 02:13:16.191584 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c58f083e-feb9-4f5c-961d-f61af07d794a" path="/var/lib/kubelet/pods/c58f083e-feb9-4f5c-961d-f61af07d794a/volumes" Oct 11 02:13:16 crc kubenswrapper[4743]: I1011 02:13:16.643971 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 11 02:13:17 crc kubenswrapper[4743]: I1011 02:13:17.008677 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d5ae3fcd-4a6d-42e5-9ab1-69aae9e8f964","Type":"ContainerStarted","Data":"97056eab6553cbeb32aaf32ba7c0cc54302c1158904e8e18297c7339ae05a655"} Oct 11 02:13:17 crc kubenswrapper[4743]: I1011 02:13:17.011779 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2e1cee17-cf14-4bf2-bda0-2f651412f042","Type":"ContainerStarted","Data":"39ea01bc9aadbf76a5ebde41ab8e9c5a15af71ccfa0ef4fdddcd0d17a94d8b75"} Oct 11 02:13:18 crc kubenswrapper[4743]: I1011 02:13:18.042063 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-external-api-0" event={"ID":"2e1cee17-cf14-4bf2-bda0-2f651412f042","Type":"ContainerStarted","Data":"ab33198fd874822b9625b11dbb804eef425216427a65efdca258b8ffb989143e"} Oct 11 02:13:18 crc kubenswrapper[4743]: I1011 02:13:18.045063 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d5ae3fcd-4a6d-42e5-9ab1-69aae9e8f964","Type":"ContainerStarted","Data":"d2277fc80e43d56c555976f21843063c559a40f4177d791f5b32d509658be5ce"} Oct 11 02:13:18 crc kubenswrapper[4743]: I1011 02:13:18.074553 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.07453467 podStartE2EDuration="5.07453467s" podCreationTimestamp="2025-10-11 02:13:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 02:13:18.061766503 +0000 UTC m=+4892.714746890" watchObservedRunningTime="2025-10-11 02:13:18.07453467 +0000 UTC m=+4892.727515067" Oct 11 02:13:19 crc kubenswrapper[4743]: I1011 02:13:19.085052 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d5ae3fcd-4a6d-42e5-9ab1-69aae9e8f964","Type":"ContainerStarted","Data":"e1c4fa0592e233336ed43e0463ac6bc721f16a3e8dd364ab2a04bae0f939020e"} Oct 11 02:13:19 crc kubenswrapper[4743]: I1011 02:13:19.117669 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.117650181 podStartE2EDuration="4.117650181s" podCreationTimestamp="2025-10-11 02:13:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 02:13:19.111759905 +0000 UTC m=+4893.764740302" watchObservedRunningTime="2025-10-11 02:13:19.117650181 +0000 UTC m=+4893.770630578" Oct 11 02:13:19 crc 
kubenswrapper[4743]: I1011 02:13:19.166608 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0" Oct 11 02:13:19 crc kubenswrapper[4743]: I1011 02:13:19.270730 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Oct 11 02:13:24 crc kubenswrapper[4743]: I1011 02:13:24.145914 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-859df554d7-5h8zd" event={"ID":"c920be26-905f-40c5-a4b6-c80b9428e662","Type":"ContainerStarted","Data":"fee3ecff320c6c393ce9c7d5f264d8d7e967976cdd6a7ff78943733e6965bb28"} Oct 11 02:13:24 crc kubenswrapper[4743]: I1011 02:13:24.146356 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-859df554d7-5h8zd" event={"ID":"c920be26-905f-40c5-a4b6-c80b9428e662","Type":"ContainerStarted","Data":"ea6c2dcdccf12c57ed883eab65300e78c176d46beb6bce3a1f66dea3733277f5"} Oct 11 02:13:24 crc kubenswrapper[4743]: I1011 02:13:24.146132 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-859df554d7-5h8zd" podUID="c920be26-905f-40c5-a4b6-c80b9428e662" containerName="horizon" containerID="cri-o://fee3ecff320c6c393ce9c7d5f264d8d7e967976cdd6a7ff78943733e6965bb28" gracePeriod=30 Oct 11 02:13:24 crc kubenswrapper[4743]: I1011 02:13:24.146022 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-859df554d7-5h8zd" podUID="c920be26-905f-40c5-a4b6-c80b9428e662" containerName="horizon-log" containerID="cri-o://ea6c2dcdccf12c57ed883eab65300e78c176d46beb6bce3a1f66dea3733277f5" gracePeriod=30 Oct 11 02:13:24 crc kubenswrapper[4743]: I1011 02:13:24.151067 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9bcfcb9cc-5wvcg" event={"ID":"d39c4e33-bfe6-4c48-bc00-f2713e45103b","Type":"ContainerStarted","Data":"960a4a30c95015be9bd9b9cafce3f72f8b46b567ff972d6cf002c7f8c8f49082"} Oct 11 02:13:24 crc 
kubenswrapper[4743]: I1011 02:13:24.155280 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f46b79456-dm9d6" event={"ID":"36f566d2-9c6b-4bc3-a1a3-47a11e6eee45","Type":"ContainerStarted","Data":"0e53ae0dc3facf55baa6c1a5ff57807c2867c4517b7e200e05fd576d6967848e"} Oct 11 02:13:24 crc kubenswrapper[4743]: I1011 02:13:24.156708 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5bc9759f8-b6qgh" event={"ID":"d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd","Type":"ContainerStarted","Data":"d1b484db2a51b4a01449527fdf8e1ed8a08f52dd1f27e1abe2a9f0f1d2e2b660"} Oct 11 02:13:24 crc kubenswrapper[4743]: I1011 02:13:24.169072 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-859df554d7-5h8zd" podStartSLOduration=2.599875543 podStartE2EDuration="15.169054232s" podCreationTimestamp="2025-10-11 02:13:09 +0000 UTC" firstStartedPulling="2025-10-11 02:13:10.825937404 +0000 UTC m=+4885.478917801" lastFinishedPulling="2025-10-11 02:13:23.395116053 +0000 UTC m=+4898.048096490" observedRunningTime="2025-10-11 02:13:24.163426662 +0000 UTC m=+4898.816407069" watchObservedRunningTime="2025-10-11 02:13:24.169054232 +0000 UTC m=+4898.822034629" Oct 11 02:13:24 crc kubenswrapper[4743]: I1011 02:13:24.711967 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 11 02:13:24 crc kubenswrapper[4743]: I1011 02:13:24.712347 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 11 02:13:24 crc kubenswrapper[4743]: I1011 02:13:24.748547 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 11 02:13:24 crc kubenswrapper[4743]: I1011 02:13:24.801635 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 11 02:13:25 crc kubenswrapper[4743]: 
I1011 02:13:25.168335 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9bcfcb9cc-5wvcg" event={"ID":"d39c4e33-bfe6-4c48-bc00-f2713e45103b","Type":"ContainerStarted","Data":"c70ade596b21bb0b30a5c94e6dc731d8e14182dcea30b227dee18d688140bc81"} Oct 11 02:13:25 crc kubenswrapper[4743]: I1011 02:13:25.168736 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-9bcfcb9cc-5wvcg" podUID="d39c4e33-bfe6-4c48-bc00-f2713e45103b" containerName="horizon-log" containerID="cri-o://960a4a30c95015be9bd9b9cafce3f72f8b46b567ff972d6cf002c7f8c8f49082" gracePeriod=30 Oct 11 02:13:25 crc kubenswrapper[4743]: I1011 02:13:25.168884 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-9bcfcb9cc-5wvcg" podUID="d39c4e33-bfe6-4c48-bc00-f2713e45103b" containerName="horizon" containerID="cri-o://c70ade596b21bb0b30a5c94e6dc731d8e14182dcea30b227dee18d688140bc81" gracePeriod=30 Oct 11 02:13:25 crc kubenswrapper[4743]: I1011 02:13:25.171091 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f46b79456-dm9d6" event={"ID":"36f566d2-9c6b-4bc3-a1a3-47a11e6eee45","Type":"ContainerStarted","Data":"c54b35c9bccfa8c18ad18c66be266860888956affc209f4d441856ded2284e56"} Oct 11 02:13:25 crc kubenswrapper[4743]: I1011 02:13:25.175342 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5bc9759f8-b6qgh" event={"ID":"d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd","Type":"ContainerStarted","Data":"1ae29a0313861a928c7541c1589ba7b739f50bb56d203301166706e6650049d2"} Oct 11 02:13:25 crc kubenswrapper[4743]: I1011 02:13:25.175543 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 11 02:13:25 crc kubenswrapper[4743]: I1011 02:13:25.175566 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 11 02:13:25 crc kubenswrapper[4743]: I1011 
02:13:25.198045 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-9bcfcb9cc-5wvcg" podStartSLOduration=3.670465116 podStartE2EDuration="16.198027083s" podCreationTimestamp="2025-10-11 02:13:09 +0000 UTC" firstStartedPulling="2025-10-11 02:13:10.998076982 +0000 UTC m=+4885.651057379" lastFinishedPulling="2025-10-11 02:13:23.525638949 +0000 UTC m=+4898.178619346" observedRunningTime="2025-10-11 02:13:25.186850376 +0000 UTC m=+4899.839830773" watchObservedRunningTime="2025-10-11 02:13:25.198027083 +0000 UTC m=+4899.851007480" Oct 11 02:13:25 crc kubenswrapper[4743]: I1011 02:13:25.212875 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5bc9759f8-b6qgh" podStartSLOduration=3.503201829 podStartE2EDuration="13.212841531s" podCreationTimestamp="2025-10-11 02:13:12 +0000 UTC" firstStartedPulling="2025-10-11 02:13:13.815064564 +0000 UTC m=+4888.468044961" lastFinishedPulling="2025-10-11 02:13:23.524704236 +0000 UTC m=+4898.177684663" observedRunningTime="2025-10-11 02:13:25.211865916 +0000 UTC m=+4899.864846313" watchObservedRunningTime="2025-10-11 02:13:25.212841531 +0000 UTC m=+4899.865821928" Oct 11 02:13:25 crc kubenswrapper[4743]: I1011 02:13:25.237387 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-f46b79456-dm9d6" podStartSLOduration=3.571193735 podStartE2EDuration="13.237370719s" podCreationTimestamp="2025-10-11 02:13:12 +0000 UTC" firstStartedPulling="2025-10-11 02:13:13.754364419 +0000 UTC m=+4888.407344816" lastFinishedPulling="2025-10-11 02:13:23.420541393 +0000 UTC m=+4898.073521800" observedRunningTime="2025-10-11 02:13:25.232450257 +0000 UTC m=+4899.885430664" watchObservedRunningTime="2025-10-11 02:13:25.237370719 +0000 UTC m=+4899.890351116" Oct 11 02:13:25 crc kubenswrapper[4743]: I1011 02:13:25.964310 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/glance-default-internal-api-0" Oct 11 02:13:25 crc kubenswrapper[4743]: I1011 02:13:25.964460 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 11 02:13:26 crc kubenswrapper[4743]: I1011 02:13:26.008086 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 11 02:13:26 crc kubenswrapper[4743]: I1011 02:13:26.009821 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 11 02:13:26 crc kubenswrapper[4743]: I1011 02:13:26.186277 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 11 02:13:26 crc kubenswrapper[4743]: I1011 02:13:26.186323 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 11 02:13:28 crc kubenswrapper[4743]: I1011 02:13:28.876916 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 11 02:13:28 crc kubenswrapper[4743]: I1011 02:13:28.877794 4743 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 11 02:13:28 crc kubenswrapper[4743]: I1011 02:13:28.881482 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 11 02:13:28 crc kubenswrapper[4743]: I1011 02:13:28.881643 4743 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 11 02:13:28 crc kubenswrapper[4743]: I1011 02:13:28.884229 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 11 02:13:28 crc kubenswrapper[4743]: I1011 02:13:28.953217 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 11 02:13:29 crc 
kubenswrapper[4743]: I1011 02:13:29.501815 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-d886-account-create-4g9gc"] Oct 11 02:13:29 crc kubenswrapper[4743]: I1011 02:13:29.503248 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-d886-account-create-4g9gc" Oct 11 02:13:29 crc kubenswrapper[4743]: I1011 02:13:29.506736 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-db-secret" Oct 11 02:13:29 crc kubenswrapper[4743]: I1011 02:13:29.517004 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-d886-account-create-4g9gc"] Oct 11 02:13:29 crc kubenswrapper[4743]: I1011 02:13:29.541697 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sc5g\" (UniqueName: \"kubernetes.io/projected/a02defd0-0720-4bc9-a2ba-c8262b4b4432-kube-api-access-9sc5g\") pod \"manila-d886-account-create-4g9gc\" (UID: \"a02defd0-0720-4bc9-a2ba-c8262b4b4432\") " pod="openstack/manila-d886-account-create-4g9gc" Oct 11 02:13:29 crc kubenswrapper[4743]: I1011 02:13:29.648917 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sc5g\" (UniqueName: \"kubernetes.io/projected/a02defd0-0720-4bc9-a2ba-c8262b4b4432-kube-api-access-9sc5g\") pod \"manila-d886-account-create-4g9gc\" (UID: \"a02defd0-0720-4bc9-a2ba-c8262b4b4432\") " pod="openstack/manila-d886-account-create-4g9gc" Oct 11 02:13:29 crc kubenswrapper[4743]: I1011 02:13:29.674071 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sc5g\" (UniqueName: \"kubernetes.io/projected/a02defd0-0720-4bc9-a2ba-c8262b4b4432-kube-api-access-9sc5g\") pod \"manila-d886-account-create-4g9gc\" (UID: \"a02defd0-0720-4bc9-a2ba-c8262b4b4432\") " pod="openstack/manila-d886-account-create-4g9gc" Oct 11 02:13:29 crc kubenswrapper[4743]: I1011 02:13:29.838435 4743 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/manila-d886-account-create-4g9gc" Oct 11 02:13:29 crc kubenswrapper[4743]: I1011 02:13:29.912083 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-859df554d7-5h8zd" Oct 11 02:13:30 crc kubenswrapper[4743]: I1011 02:13:30.009927 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-9bcfcb9cc-5wvcg" Oct 11 02:13:30 crc kubenswrapper[4743]: I1011 02:13:30.506920 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-d886-account-create-4g9gc"] Oct 11 02:13:31 crc kubenswrapper[4743]: I1011 02:13:31.245670 4743 generic.go:334] "Generic (PLEG): container finished" podID="a02defd0-0720-4bc9-a2ba-c8262b4b4432" containerID="c0c28bfcf170da34a692035c0335b91c84848516661ea1b2a7965f3180fdd070" exitCode=0 Oct 11 02:13:31 crc kubenswrapper[4743]: I1011 02:13:31.245770 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-d886-account-create-4g9gc" event={"ID":"a02defd0-0720-4bc9-a2ba-c8262b4b4432","Type":"ContainerDied","Data":"c0c28bfcf170da34a692035c0335b91c84848516661ea1b2a7965f3180fdd070"} Oct 11 02:13:31 crc kubenswrapper[4743]: I1011 02:13:31.246151 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-d886-account-create-4g9gc" event={"ID":"a02defd0-0720-4bc9-a2ba-c8262b4b4432","Type":"ContainerStarted","Data":"f4a5f0156ae9748ea01f75579fa0b1c6f9d8a163abf0a68e80536b435cd41f24"} Oct 11 02:13:32 crc kubenswrapper[4743]: I1011 02:13:32.454141 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-5bc9759f8-b6qgh" Oct 11 02:13:32 crc kubenswrapper[4743]: I1011 02:13:32.455262 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5bc9759f8-b6qgh" Oct 11 02:13:32 crc kubenswrapper[4743]: I1011 02:13:32.769148 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/horizon-f46b79456-dm9d6" Oct 11 02:13:32 crc kubenswrapper[4743]: I1011 02:13:32.769206 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-f46b79456-dm9d6" Oct 11 02:13:32 crc kubenswrapper[4743]: I1011 02:13:32.982053 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-d886-account-create-4g9gc" Oct 11 02:13:33 crc kubenswrapper[4743]: I1011 02:13:33.161162 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9sc5g\" (UniqueName: \"kubernetes.io/projected/a02defd0-0720-4bc9-a2ba-c8262b4b4432-kube-api-access-9sc5g\") pod \"a02defd0-0720-4bc9-a2ba-c8262b4b4432\" (UID: \"a02defd0-0720-4bc9-a2ba-c8262b4b4432\") " Oct 11 02:13:33 crc kubenswrapper[4743]: I1011 02:13:33.170073 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a02defd0-0720-4bc9-a2ba-c8262b4b4432-kube-api-access-9sc5g" (OuterVolumeSpecName: "kube-api-access-9sc5g") pod "a02defd0-0720-4bc9-a2ba-c8262b4b4432" (UID: "a02defd0-0720-4bc9-a2ba-c8262b4b4432"). InnerVolumeSpecName "kube-api-access-9sc5g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 02:13:33 crc kubenswrapper[4743]: I1011 02:13:33.264429 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9sc5g\" (UniqueName: \"kubernetes.io/projected/a02defd0-0720-4bc9-a2ba-c8262b4b4432-kube-api-access-9sc5g\") on node \"crc\" DevicePath \"\"" Oct 11 02:13:33 crc kubenswrapper[4743]: I1011 02:13:33.273003 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-d886-account-create-4g9gc" Oct 11 02:13:33 crc kubenswrapper[4743]: I1011 02:13:33.282827 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-d886-account-create-4g9gc" event={"ID":"a02defd0-0720-4bc9-a2ba-c8262b4b4432","Type":"ContainerDied","Data":"f4a5f0156ae9748ea01f75579fa0b1c6f9d8a163abf0a68e80536b435cd41f24"} Oct 11 02:13:33 crc kubenswrapper[4743]: I1011 02:13:33.282916 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4a5f0156ae9748ea01f75579fa0b1c6f9d8a163abf0a68e80536b435cd41f24" Oct 11 02:13:34 crc kubenswrapper[4743]: I1011 02:13:34.875945 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-sync-7s99b"] Oct 11 02:13:34 crc kubenswrapper[4743]: E1011 02:13:34.876612 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a02defd0-0720-4bc9-a2ba-c8262b4b4432" containerName="mariadb-account-create" Oct 11 02:13:34 crc kubenswrapper[4743]: I1011 02:13:34.876625 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="a02defd0-0720-4bc9-a2ba-c8262b4b4432" containerName="mariadb-account-create" Oct 11 02:13:34 crc kubenswrapper[4743]: I1011 02:13:34.876865 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="a02defd0-0720-4bc9-a2ba-c8262b4b4432" containerName="mariadb-account-create" Oct 11 02:13:34 crc kubenswrapper[4743]: I1011 02:13:34.877596 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-7s99b" Oct 11 02:13:34 crc kubenswrapper[4743]: I1011 02:13:34.884933 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-66b9l" Oct 11 02:13:34 crc kubenswrapper[4743]: I1011 02:13:34.885258 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Oct 11 02:13:34 crc kubenswrapper[4743]: I1011 02:13:34.924561 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-7s99b"] Oct 11 02:13:35 crc kubenswrapper[4743]: I1011 02:13:35.015213 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b37ccbe6-6cf3-4ee2-ae8d-a6c4acc66595-combined-ca-bundle\") pod \"manila-db-sync-7s99b\" (UID: \"b37ccbe6-6cf3-4ee2-ae8d-a6c4acc66595\") " pod="openstack/manila-db-sync-7s99b" Oct 11 02:13:35 crc kubenswrapper[4743]: I1011 02:13:35.015283 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/b37ccbe6-6cf3-4ee2-ae8d-a6c4acc66595-job-config-data\") pod \"manila-db-sync-7s99b\" (UID: \"b37ccbe6-6cf3-4ee2-ae8d-a6c4acc66595\") " pod="openstack/manila-db-sync-7s99b" Oct 11 02:13:35 crc kubenswrapper[4743]: I1011 02:13:35.015475 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjxq5\" (UniqueName: \"kubernetes.io/projected/b37ccbe6-6cf3-4ee2-ae8d-a6c4acc66595-kube-api-access-tjxq5\") pod \"manila-db-sync-7s99b\" (UID: \"b37ccbe6-6cf3-4ee2-ae8d-a6c4acc66595\") " pod="openstack/manila-db-sync-7s99b" Oct 11 02:13:35 crc kubenswrapper[4743]: I1011 02:13:35.015549 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b37ccbe6-6cf3-4ee2-ae8d-a6c4acc66595-config-data\") pod \"manila-db-sync-7s99b\" (UID: \"b37ccbe6-6cf3-4ee2-ae8d-a6c4acc66595\") " pod="openstack/manila-db-sync-7s99b" Oct 11 02:13:35 crc kubenswrapper[4743]: I1011 02:13:35.117919 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b37ccbe6-6cf3-4ee2-ae8d-a6c4acc66595-config-data\") pod \"manila-db-sync-7s99b\" (UID: \"b37ccbe6-6cf3-4ee2-ae8d-a6c4acc66595\") " pod="openstack/manila-db-sync-7s99b" Oct 11 02:13:35 crc kubenswrapper[4743]: I1011 02:13:35.118075 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b37ccbe6-6cf3-4ee2-ae8d-a6c4acc66595-combined-ca-bundle\") pod \"manila-db-sync-7s99b\" (UID: \"b37ccbe6-6cf3-4ee2-ae8d-a6c4acc66595\") " pod="openstack/manila-db-sync-7s99b" Oct 11 02:13:35 crc kubenswrapper[4743]: I1011 02:13:35.118116 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/b37ccbe6-6cf3-4ee2-ae8d-a6c4acc66595-job-config-data\") pod \"manila-db-sync-7s99b\" (UID: \"b37ccbe6-6cf3-4ee2-ae8d-a6c4acc66595\") " pod="openstack/manila-db-sync-7s99b" Oct 11 02:13:35 crc kubenswrapper[4743]: I1011 02:13:35.118189 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjxq5\" (UniqueName: \"kubernetes.io/projected/b37ccbe6-6cf3-4ee2-ae8d-a6c4acc66595-kube-api-access-tjxq5\") pod \"manila-db-sync-7s99b\" (UID: \"b37ccbe6-6cf3-4ee2-ae8d-a6c4acc66595\") " pod="openstack/manila-db-sync-7s99b" Oct 11 02:13:35 crc kubenswrapper[4743]: I1011 02:13:35.130520 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/b37ccbe6-6cf3-4ee2-ae8d-a6c4acc66595-job-config-data\") pod \"manila-db-sync-7s99b\" (UID: 
\"b37ccbe6-6cf3-4ee2-ae8d-a6c4acc66595\") " pod="openstack/manila-db-sync-7s99b" Oct 11 02:13:35 crc kubenswrapper[4743]: I1011 02:13:35.130636 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b37ccbe6-6cf3-4ee2-ae8d-a6c4acc66595-combined-ca-bundle\") pod \"manila-db-sync-7s99b\" (UID: \"b37ccbe6-6cf3-4ee2-ae8d-a6c4acc66595\") " pod="openstack/manila-db-sync-7s99b" Oct 11 02:13:35 crc kubenswrapper[4743]: I1011 02:13:35.131177 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b37ccbe6-6cf3-4ee2-ae8d-a6c4acc66595-config-data\") pod \"manila-db-sync-7s99b\" (UID: \"b37ccbe6-6cf3-4ee2-ae8d-a6c4acc66595\") " pod="openstack/manila-db-sync-7s99b" Oct 11 02:13:35 crc kubenswrapper[4743]: I1011 02:13:35.135051 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjxq5\" (UniqueName: \"kubernetes.io/projected/b37ccbe6-6cf3-4ee2-ae8d-a6c4acc66595-kube-api-access-tjxq5\") pod \"manila-db-sync-7s99b\" (UID: \"b37ccbe6-6cf3-4ee2-ae8d-a6c4acc66595\") " pod="openstack/manila-db-sync-7s99b" Oct 11 02:13:35 crc kubenswrapper[4743]: I1011 02:13:35.236884 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-7s99b" Oct 11 02:13:35 crc kubenswrapper[4743]: I1011 02:13:35.967779 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-7s99b"] Oct 11 02:13:36 crc kubenswrapper[4743]: I1011 02:13:36.312448 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-7s99b" event={"ID":"b37ccbe6-6cf3-4ee2-ae8d-a6c4acc66595","Type":"ContainerStarted","Data":"2f066bf589408cf20d19d0d1ba95cd7ada254f4bdd6785072b6024acfe5e5e8e"} Oct 11 02:13:42 crc kubenswrapper[4743]: I1011 02:13:42.455546 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-5bc9759f8-b6qgh" podUID="d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.67:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.67:8443: connect: connection refused" Oct 11 02:13:42 crc kubenswrapper[4743]: I1011 02:13:42.770762 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-f46b79456-dm9d6" podUID="36f566d2-9c6b-4bc3-a1a3-47a11e6eee45" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.68:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.68:8443: connect: connection refused" Oct 11 02:13:43 crc kubenswrapper[4743]: I1011 02:13:43.393084 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-7s99b" event={"ID":"b37ccbe6-6cf3-4ee2-ae8d-a6c4acc66595","Type":"ContainerStarted","Data":"94f1700a2ee279a090b01900d8d39c3b6841a8bd9bee1f675dddcb424b5a0fbc"} Oct 11 02:13:43 crc kubenswrapper[4743]: I1011 02:13:43.413992 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-sync-7s99b" podStartSLOduration=3.476022493 podStartE2EDuration="9.413972743s" podCreationTimestamp="2025-10-11 02:13:34 +0000 UTC" firstStartedPulling="2025-10-11 02:13:35.971882931 +0000 UTC m=+4910.624863338" 
lastFinishedPulling="2025-10-11 02:13:41.909833191 +0000 UTC m=+4916.562813588" observedRunningTime="2025-10-11 02:13:43.408695503 +0000 UTC m=+4918.061675900" watchObservedRunningTime="2025-10-11 02:13:43.413972743 +0000 UTC m=+4918.066953150" Oct 11 02:13:44 crc kubenswrapper[4743]: I1011 02:13:44.458147 4743 patch_prober.go:28] interesting pod/machine-config-daemon-cvm72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 02:13:44 crc kubenswrapper[4743]: I1011 02:13:44.458210 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 02:13:52 crc kubenswrapper[4743]: I1011 02:13:52.516159 4743 generic.go:334] "Generic (PLEG): container finished" podID="b37ccbe6-6cf3-4ee2-ae8d-a6c4acc66595" containerID="94f1700a2ee279a090b01900d8d39c3b6841a8bd9bee1f675dddcb424b5a0fbc" exitCode=0 Oct 11 02:13:52 crc kubenswrapper[4743]: I1011 02:13:52.516252 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-7s99b" event={"ID":"b37ccbe6-6cf3-4ee2-ae8d-a6c4acc66595","Type":"ContainerDied","Data":"94f1700a2ee279a090b01900d8d39c3b6841a8bd9bee1f675dddcb424b5a0fbc"} Oct 11 02:13:54 crc kubenswrapper[4743]: I1011 02:13:54.160734 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-7s99b" Oct 11 02:13:54 crc kubenswrapper[4743]: I1011 02:13:54.299617 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjxq5\" (UniqueName: \"kubernetes.io/projected/b37ccbe6-6cf3-4ee2-ae8d-a6c4acc66595-kube-api-access-tjxq5\") pod \"b37ccbe6-6cf3-4ee2-ae8d-a6c4acc66595\" (UID: \"b37ccbe6-6cf3-4ee2-ae8d-a6c4acc66595\") " Oct 11 02:13:54 crc kubenswrapper[4743]: I1011 02:13:54.300014 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b37ccbe6-6cf3-4ee2-ae8d-a6c4acc66595-config-data\") pod \"b37ccbe6-6cf3-4ee2-ae8d-a6c4acc66595\" (UID: \"b37ccbe6-6cf3-4ee2-ae8d-a6c4acc66595\") " Oct 11 02:13:54 crc kubenswrapper[4743]: I1011 02:13:54.300040 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b37ccbe6-6cf3-4ee2-ae8d-a6c4acc66595-combined-ca-bundle\") pod \"b37ccbe6-6cf3-4ee2-ae8d-a6c4acc66595\" (UID: \"b37ccbe6-6cf3-4ee2-ae8d-a6c4acc66595\") " Oct 11 02:13:54 crc kubenswrapper[4743]: I1011 02:13:54.300167 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/b37ccbe6-6cf3-4ee2-ae8d-a6c4acc66595-job-config-data\") pod \"b37ccbe6-6cf3-4ee2-ae8d-a6c4acc66595\" (UID: \"b37ccbe6-6cf3-4ee2-ae8d-a6c4acc66595\") " Oct 11 02:13:54 crc kubenswrapper[4743]: I1011 02:13:54.311513 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b37ccbe6-6cf3-4ee2-ae8d-a6c4acc66595-config-data" (OuterVolumeSpecName: "config-data") pod "b37ccbe6-6cf3-4ee2-ae8d-a6c4acc66595" (UID: "b37ccbe6-6cf3-4ee2-ae8d-a6c4acc66595"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 02:13:54 crc kubenswrapper[4743]: I1011 02:13:54.317385 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b37ccbe6-6cf3-4ee2-ae8d-a6c4acc66595-kube-api-access-tjxq5" (OuterVolumeSpecName: "kube-api-access-tjxq5") pod "b37ccbe6-6cf3-4ee2-ae8d-a6c4acc66595" (UID: "b37ccbe6-6cf3-4ee2-ae8d-a6c4acc66595"). InnerVolumeSpecName "kube-api-access-tjxq5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 02:13:54 crc kubenswrapper[4743]: I1011 02:13:54.318981 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b37ccbe6-6cf3-4ee2-ae8d-a6c4acc66595-job-config-data" (OuterVolumeSpecName: "job-config-data") pod "b37ccbe6-6cf3-4ee2-ae8d-a6c4acc66595" (UID: "b37ccbe6-6cf3-4ee2-ae8d-a6c4acc66595"). InnerVolumeSpecName "job-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 02:13:54 crc kubenswrapper[4743]: I1011 02:13:54.359196 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b37ccbe6-6cf3-4ee2-ae8d-a6c4acc66595-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b37ccbe6-6cf3-4ee2-ae8d-a6c4acc66595" (UID: "b37ccbe6-6cf3-4ee2-ae8d-a6c4acc66595"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 02:13:54 crc kubenswrapper[4743]: I1011 02:13:54.402801 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjxq5\" (UniqueName: \"kubernetes.io/projected/b37ccbe6-6cf3-4ee2-ae8d-a6c4acc66595-kube-api-access-tjxq5\") on node \"crc\" DevicePath \"\"" Oct 11 02:13:54 crc kubenswrapper[4743]: I1011 02:13:54.402842 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b37ccbe6-6cf3-4ee2-ae8d-a6c4acc66595-config-data\") on node \"crc\" DevicePath \"\"" Oct 11 02:13:54 crc kubenswrapper[4743]: I1011 02:13:54.402852 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b37ccbe6-6cf3-4ee2-ae8d-a6c4acc66595-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 02:13:54 crc kubenswrapper[4743]: I1011 02:13:54.402862 4743 reconciler_common.go:293] "Volume detached for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/b37ccbe6-6cf3-4ee2-ae8d-a6c4acc66595-job-config-data\") on node \"crc\" DevicePath \"\"" Oct 11 02:13:54 crc kubenswrapper[4743]: I1011 02:13:54.561472 4743 generic.go:334] "Generic (PLEG): container finished" podID="c920be26-905f-40c5-a4b6-c80b9428e662" containerID="fee3ecff320c6c393ce9c7d5f264d8d7e967976cdd6a7ff78943733e6965bb28" exitCode=137 Oct 11 02:13:54 crc kubenswrapper[4743]: I1011 02:13:54.561506 4743 generic.go:334] "Generic (PLEG): container finished" podID="c920be26-905f-40c5-a4b6-c80b9428e662" containerID="ea6c2dcdccf12c57ed883eab65300e78c176d46beb6bce3a1f66dea3733277f5" exitCode=137 Oct 11 02:13:54 crc kubenswrapper[4743]: I1011 02:13:54.561568 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-859df554d7-5h8zd" event={"ID":"c920be26-905f-40c5-a4b6-c80b9428e662","Type":"ContainerDied","Data":"fee3ecff320c6c393ce9c7d5f264d8d7e967976cdd6a7ff78943733e6965bb28"} Oct 11 02:13:54 crc kubenswrapper[4743]: 
I1011 02:13:54.561593 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-859df554d7-5h8zd" event={"ID":"c920be26-905f-40c5-a4b6-c80b9428e662","Type":"ContainerDied","Data":"ea6c2dcdccf12c57ed883eab65300e78c176d46beb6bce3a1f66dea3733277f5"} Oct 11 02:13:54 crc kubenswrapper[4743]: I1011 02:13:54.567319 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-7s99b" event={"ID":"b37ccbe6-6cf3-4ee2-ae8d-a6c4acc66595","Type":"ContainerDied","Data":"2f066bf589408cf20d19d0d1ba95cd7ada254f4bdd6785072b6024acfe5e5e8e"} Oct 11 02:13:54 crc kubenswrapper[4743]: I1011 02:13:54.567359 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f066bf589408cf20d19d0d1ba95cd7ada254f4bdd6785072b6024acfe5e5e8e" Oct 11 02:13:54 crc kubenswrapper[4743]: I1011 02:13:54.567412 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-7s99b" Oct 11 02:13:54 crc kubenswrapper[4743]: I1011 02:13:54.620529 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-859df554d7-5h8zd" Oct 11 02:13:54 crc kubenswrapper[4743]: I1011 02:13:54.707754 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c920be26-905f-40c5-a4b6-c80b9428e662-logs\") pod \"c920be26-905f-40c5-a4b6-c80b9428e662\" (UID: \"c920be26-905f-40c5-a4b6-c80b9428e662\") " Oct 11 02:13:54 crc kubenswrapper[4743]: I1011 02:13:54.707914 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c920be26-905f-40c5-a4b6-c80b9428e662-horizon-secret-key\") pod \"c920be26-905f-40c5-a4b6-c80b9428e662\" (UID: \"c920be26-905f-40c5-a4b6-c80b9428e662\") " Oct 11 02:13:54 crc kubenswrapper[4743]: I1011 02:13:54.707948 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rccm8\" (UniqueName: \"kubernetes.io/projected/c920be26-905f-40c5-a4b6-c80b9428e662-kube-api-access-rccm8\") pod \"c920be26-905f-40c5-a4b6-c80b9428e662\" (UID: \"c920be26-905f-40c5-a4b6-c80b9428e662\") " Oct 11 02:13:54 crc kubenswrapper[4743]: I1011 02:13:54.708068 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c920be26-905f-40c5-a4b6-c80b9428e662-config-data\") pod \"c920be26-905f-40c5-a4b6-c80b9428e662\" (UID: \"c920be26-905f-40c5-a4b6-c80b9428e662\") " Oct 11 02:13:54 crc kubenswrapper[4743]: I1011 02:13:54.709504 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c920be26-905f-40c5-a4b6-c80b9428e662-scripts\") pod \"c920be26-905f-40c5-a4b6-c80b9428e662\" (UID: \"c920be26-905f-40c5-a4b6-c80b9428e662\") " Oct 11 02:13:54 crc kubenswrapper[4743]: I1011 02:13:54.711057 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/c920be26-905f-40c5-a4b6-c80b9428e662-logs" (OuterVolumeSpecName: "logs") pod "c920be26-905f-40c5-a4b6-c80b9428e662" (UID: "c920be26-905f-40c5-a4b6-c80b9428e662"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 02:13:54 crc kubenswrapper[4743]: I1011 02:13:54.715645 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c920be26-905f-40c5-a4b6-c80b9428e662-kube-api-access-rccm8" (OuterVolumeSpecName: "kube-api-access-rccm8") pod "c920be26-905f-40c5-a4b6-c80b9428e662" (UID: "c920be26-905f-40c5-a4b6-c80b9428e662"). InnerVolumeSpecName "kube-api-access-rccm8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 02:13:54 crc kubenswrapper[4743]: I1011 02:13:54.721015 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c920be26-905f-40c5-a4b6-c80b9428e662-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "c920be26-905f-40c5-a4b6-c80b9428e662" (UID: "c920be26-905f-40c5-a4b6-c80b9428e662"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 02:13:54 crc kubenswrapper[4743]: I1011 02:13:54.737997 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c920be26-905f-40c5-a4b6-c80b9428e662-config-data" (OuterVolumeSpecName: "config-data") pod "c920be26-905f-40c5-a4b6-c80b9428e662" (UID: "c920be26-905f-40c5-a4b6-c80b9428e662"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 02:13:54 crc kubenswrapper[4743]: I1011 02:13:54.738862 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c920be26-905f-40c5-a4b6-c80b9428e662-scripts" (OuterVolumeSpecName: "scripts") pod "c920be26-905f-40c5-a4b6-c80b9428e662" (UID: "c920be26-905f-40c5-a4b6-c80b9428e662"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 02:13:54 crc kubenswrapper[4743]: I1011 02:13:54.793498 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-5bc9759f8-b6qgh" Oct 11 02:13:54 crc kubenswrapper[4743]: I1011 02:13:54.804365 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-f46b79456-dm9d6" Oct 11 02:13:54 crc kubenswrapper[4743]: I1011 02:13:54.811485 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c920be26-905f-40c5-a4b6-c80b9428e662-logs\") on node \"crc\" DevicePath \"\"" Oct 11 02:13:54 crc kubenswrapper[4743]: I1011 02:13:54.811513 4743 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c920be26-905f-40c5-a4b6-c80b9428e662-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 11 02:13:54 crc kubenswrapper[4743]: I1011 02:13:54.811525 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rccm8\" (UniqueName: \"kubernetes.io/projected/c920be26-905f-40c5-a4b6-c80b9428e662-kube-api-access-rccm8\") on node \"crc\" DevicePath \"\"" Oct 11 02:13:54 crc kubenswrapper[4743]: I1011 02:13:54.811535 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c920be26-905f-40c5-a4b6-c80b9428e662-config-data\") on node \"crc\" DevicePath \"\"" Oct 11 02:13:54 crc kubenswrapper[4743]: I1011 02:13:54.811544 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c920be26-905f-40c5-a4b6-c80b9428e662-scripts\") on node \"crc\" DevicePath \"\"" Oct 11 02:13:54 crc kubenswrapper[4743]: I1011 02:13:54.839959 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Oct 11 02:13:54 crc kubenswrapper[4743]: E1011 02:13:54.840492 4743 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="b37ccbe6-6cf3-4ee2-ae8d-a6c4acc66595" containerName="manila-db-sync" Oct 11 02:13:54 crc kubenswrapper[4743]: I1011 02:13:54.840517 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="b37ccbe6-6cf3-4ee2-ae8d-a6c4acc66595" containerName="manila-db-sync" Oct 11 02:13:54 crc kubenswrapper[4743]: E1011 02:13:54.840546 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c920be26-905f-40c5-a4b6-c80b9428e662" containerName="horizon-log" Oct 11 02:13:54 crc kubenswrapper[4743]: I1011 02:13:54.840554 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="c920be26-905f-40c5-a4b6-c80b9428e662" containerName="horizon-log" Oct 11 02:13:54 crc kubenswrapper[4743]: E1011 02:13:54.840598 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c920be26-905f-40c5-a4b6-c80b9428e662" containerName="horizon" Oct 11 02:13:54 crc kubenswrapper[4743]: I1011 02:13:54.840607 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="c920be26-905f-40c5-a4b6-c80b9428e662" containerName="horizon" Oct 11 02:13:54 crc kubenswrapper[4743]: I1011 02:13:54.840837 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="c920be26-905f-40c5-a4b6-c80b9428e662" containerName="horizon-log" Oct 11 02:13:54 crc kubenswrapper[4743]: I1011 02:13:54.840878 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="b37ccbe6-6cf3-4ee2-ae8d-a6c4acc66595" containerName="manila-db-sync" Oct 11 02:13:54 crc kubenswrapper[4743]: I1011 02:13:54.840912 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="c920be26-905f-40c5-a4b6-c80b9428e662" containerName="horizon" Oct 11 02:13:54 crc kubenswrapper[4743]: I1011 02:13:54.842498 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Oct 11 02:13:54 crc kubenswrapper[4743]: I1011 02:13:54.844750 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Oct 11 02:13:54 crc kubenswrapper[4743]: I1011 02:13:54.844778 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Oct 11 02:13:54 crc kubenswrapper[4743]: I1011 02:13:54.844752 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-66b9l" Oct 11 02:13:54 crc kubenswrapper[4743]: I1011 02:13:54.853429 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scripts" Oct 11 02:13:54 crc kubenswrapper[4743]: I1011 02:13:54.878119 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Oct 11 02:13:54 crc kubenswrapper[4743]: I1011 02:13:54.913881 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrpsd\" (UniqueName: \"kubernetes.io/projected/e1a8e9c6-a225-4017-8752-c980f44618c5-kube-api-access-wrpsd\") pod \"manila-scheduler-0\" (UID: \"e1a8e9c6-a225-4017-8752-c980f44618c5\") " pod="openstack/manila-scheduler-0" Oct 11 02:13:54 crc kubenswrapper[4743]: I1011 02:13:54.913937 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e1a8e9c6-a225-4017-8752-c980f44618c5-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"e1a8e9c6-a225-4017-8752-c980f44618c5\") " pod="openstack/manila-scheduler-0" Oct 11 02:13:54 crc kubenswrapper[4743]: I1011 02:13:54.913955 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1a8e9c6-a225-4017-8752-c980f44618c5-scripts\") pod \"manila-scheduler-0\" (UID: 
\"e1a8e9c6-a225-4017-8752-c980f44618c5\") " pod="openstack/manila-scheduler-0" Oct 11 02:13:54 crc kubenswrapper[4743]: I1011 02:13:54.914104 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1a8e9c6-a225-4017-8752-c980f44618c5-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"e1a8e9c6-a225-4017-8752-c980f44618c5\") " pod="openstack/manila-scheduler-0" Oct 11 02:13:54 crc kubenswrapper[4743]: I1011 02:13:54.915809 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e1a8e9c6-a225-4017-8752-c980f44618c5-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"e1a8e9c6-a225-4017-8752-c980f44618c5\") " pod="openstack/manila-scheduler-0" Oct 11 02:13:54 crc kubenswrapper[4743]: I1011 02:13:54.915844 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1a8e9c6-a225-4017-8752-c980f44618c5-config-data\") pod \"manila-scheduler-0\" (UID: \"e1a8e9c6-a225-4017-8752-c980f44618c5\") " pod="openstack/manila-scheduler-0" Oct 11 02:13:54 crc kubenswrapper[4743]: I1011 02:13:54.953478 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Oct 11 02:13:54 crc kubenswrapper[4743]: I1011 02:13:54.955565 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Oct 11 02:13:54 crc kubenswrapper[4743]: I1011 02:13:54.957724 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.018100 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d3cbb84-0280-484f-a137-47cfea670423-scripts\") pod \"manila-share-share1-0\" (UID: \"2d3cbb84-0280-484f-a137-47cfea670423\") " pod="openstack/manila-share-share1-0" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.018178 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2d3cbb84-0280-484f-a137-47cfea670423-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"2d3cbb84-0280-484f-a137-47cfea670423\") " pod="openstack/manila-share-share1-0" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.018232 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e1a8e9c6-a225-4017-8752-c980f44618c5-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"e1a8e9c6-a225-4017-8752-c980f44618c5\") " pod="openstack/manila-scheduler-0" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.018320 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1a8e9c6-a225-4017-8752-c980f44618c5-config-data\") pod \"manila-scheduler-0\" (UID: \"e1a8e9c6-a225-4017-8752-c980f44618c5\") " pod="openstack/manila-scheduler-0" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.018508 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrpsd\" (UniqueName: 
\"kubernetes.io/projected/e1a8e9c6-a225-4017-8752-c980f44618c5-kube-api-access-wrpsd\") pod \"manila-scheduler-0\" (UID: \"e1a8e9c6-a225-4017-8752-c980f44618c5\") " pod="openstack/manila-scheduler-0" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.018535 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1a8e9c6-a225-4017-8752-c980f44618c5-scripts\") pod \"manila-scheduler-0\" (UID: \"e1a8e9c6-a225-4017-8752-c980f44618c5\") " pod="openstack/manila-scheduler-0" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.018553 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e1a8e9c6-a225-4017-8752-c980f44618c5-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"e1a8e9c6-a225-4017-8752-c980f44618c5\") " pod="openstack/manila-scheduler-0" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.018634 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2d3cbb84-0280-484f-a137-47cfea670423-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"2d3cbb84-0280-484f-a137-47cfea670423\") " pod="openstack/manila-share-share1-0" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.018651 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvxwj\" (UniqueName: \"kubernetes.io/projected/2d3cbb84-0280-484f-a137-47cfea670423-kube-api-access-fvxwj\") pod \"manila-share-share1-0\" (UID: \"2d3cbb84-0280-484f-a137-47cfea670423\") " pod="openstack/manila-share-share1-0" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.018831 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: 
\"kubernetes.io/host-path/2d3cbb84-0280-484f-a137-47cfea670423-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"2d3cbb84-0280-484f-a137-47cfea670423\") " pod="openstack/manila-share-share1-0" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.018968 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1a8e9c6-a225-4017-8752-c980f44618c5-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"e1a8e9c6-a225-4017-8752-c980f44618c5\") " pod="openstack/manila-scheduler-0" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.019011 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d3cbb84-0280-484f-a137-47cfea670423-config-data\") pod \"manila-share-share1-0\" (UID: \"2d3cbb84-0280-484f-a137-47cfea670423\") " pod="openstack/manila-share-share1-0" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.019068 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d3cbb84-0280-484f-a137-47cfea670423-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"2d3cbb84-0280-484f-a137-47cfea670423\") " pod="openstack/manila-share-share1-0" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.019120 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2d3cbb84-0280-484f-a137-47cfea670423-ceph\") pod \"manila-share-share1-0\" (UID: \"2d3cbb84-0280-484f-a137-47cfea670423\") " pod="openstack/manila-share-share1-0" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.026103 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e1a8e9c6-a225-4017-8752-c980f44618c5-config-data-custom\") pod 
\"manila-scheduler-0\" (UID: \"e1a8e9c6-a225-4017-8752-c980f44618c5\") " pod="openstack/manila-scheduler-0" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.026235 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e1a8e9c6-a225-4017-8752-c980f44618c5-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"e1a8e9c6-a225-4017-8752-c980f44618c5\") " pod="openstack/manila-scheduler-0" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.029476 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1a8e9c6-a225-4017-8752-c980f44618c5-config-data\") pod \"manila-scheduler-0\" (UID: \"e1a8e9c6-a225-4017-8752-c980f44618c5\") " pod="openstack/manila-scheduler-0" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.038623 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1a8e9c6-a225-4017-8752-c980f44618c5-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"e1a8e9c6-a225-4017-8752-c980f44618c5\") " pod="openstack/manila-scheduler-0" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.043627 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1a8e9c6-a225-4017-8752-c980f44618c5-scripts\") pod \"manila-scheduler-0\" (UID: \"e1a8e9c6-a225-4017-8752-c980f44618c5\") " pod="openstack/manila-scheduler-0" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.050401 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrpsd\" (UniqueName: \"kubernetes.io/projected/e1a8e9c6-a225-4017-8752-c980f44618c5-kube-api-access-wrpsd\") pod \"manila-scheduler-0\" (UID: \"e1a8e9c6-a225-4017-8752-c980f44618c5\") " pod="openstack/manila-scheduler-0" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.063942 4743 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.085977 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-c8d8d886c-dsztl"] Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.114623 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c8d8d886c-dsztl" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.123163 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvxwj\" (UniqueName: \"kubernetes.io/projected/2d3cbb84-0280-484f-a137-47cfea670423-kube-api-access-fvxwj\") pod \"manila-share-share1-0\" (UID: \"2d3cbb84-0280-484f-a137-47cfea670423\") " pod="openstack/manila-share-share1-0" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.123197 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2d3cbb84-0280-484f-a137-47cfea670423-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"2d3cbb84-0280-484f-a137-47cfea670423\") " pod="openstack/manila-share-share1-0" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.123261 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/2d3cbb84-0280-484f-a137-47cfea670423-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"2d3cbb84-0280-484f-a137-47cfea670423\") " pod="openstack/manila-share-share1-0" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.123271 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-c8d8d886c-dsztl"] Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.123313 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d3cbb84-0280-484f-a137-47cfea670423-config-data\") pod \"manila-share-share1-0\" 
(UID: \"2d3cbb84-0280-484f-a137-47cfea670423\") " pod="openstack/manila-share-share1-0" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.123343 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d3cbb84-0280-484f-a137-47cfea670423-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"2d3cbb84-0280-484f-a137-47cfea670423\") " pod="openstack/manila-share-share1-0" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.123371 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2d3cbb84-0280-484f-a137-47cfea670423-ceph\") pod \"manila-share-share1-0\" (UID: \"2d3cbb84-0280-484f-a137-47cfea670423\") " pod="openstack/manila-share-share1-0" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.123392 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d3cbb84-0280-484f-a137-47cfea670423-scripts\") pod \"manila-share-share1-0\" (UID: \"2d3cbb84-0280-484f-a137-47cfea670423\") " pod="openstack/manila-share-share1-0" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.123413 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2d3cbb84-0280-484f-a137-47cfea670423-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"2d3cbb84-0280-484f-a137-47cfea670423\") " pod="openstack/manila-share-share1-0" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.123525 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2d3cbb84-0280-484f-a137-47cfea670423-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"2d3cbb84-0280-484f-a137-47cfea670423\") " pod="openstack/manila-share-share1-0" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.123663 4743 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/2d3cbb84-0280-484f-a137-47cfea670423-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"2d3cbb84-0280-484f-a137-47cfea670423\") " pod="openstack/manila-share-share1-0" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.134243 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d3cbb84-0280-484f-a137-47cfea670423-config-data\") pod \"manila-share-share1-0\" (UID: \"2d3cbb84-0280-484f-a137-47cfea670423\") " pod="openstack/manila-share-share1-0" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.135499 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d3cbb84-0280-484f-a137-47cfea670423-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"2d3cbb84-0280-484f-a137-47cfea670423\") " pod="openstack/manila-share-share1-0" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.136498 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2d3cbb84-0280-484f-a137-47cfea670423-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"2d3cbb84-0280-484f-a137-47cfea670423\") " pod="openstack/manila-share-share1-0" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.137026 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d3cbb84-0280-484f-a137-47cfea670423-scripts\") pod \"manila-share-share1-0\" (UID: \"2d3cbb84-0280-484f-a137-47cfea670423\") " pod="openstack/manila-share-share1-0" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.152491 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2d3cbb84-0280-484f-a137-47cfea670423-ceph\") pod 
\"manila-share-share1-0\" (UID: \"2d3cbb84-0280-484f-a137-47cfea670423\") " pod="openstack/manila-share-share1-0" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.155982 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvxwj\" (UniqueName: \"kubernetes.io/projected/2d3cbb84-0280-484f-a137-47cfea670423-kube-api-access-fvxwj\") pod \"manila-share-share1-0\" (UID: \"2d3cbb84-0280-484f-a137-47cfea670423\") " pod="openstack/manila-share-share1-0" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.169596 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.227098 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1654f6a5-1abf-4c9e-b956-3bfc60c7077c-dns-svc\") pod \"dnsmasq-dns-c8d8d886c-dsztl\" (UID: \"1654f6a5-1abf-4c9e-b956-3bfc60c7077c\") " pod="openstack/dnsmasq-dns-c8d8d886c-dsztl" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.227510 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9h2qp\" (UniqueName: \"kubernetes.io/projected/1654f6a5-1abf-4c9e-b956-3bfc60c7077c-kube-api-access-9h2qp\") pod \"dnsmasq-dns-c8d8d886c-dsztl\" (UID: \"1654f6a5-1abf-4c9e-b956-3bfc60c7077c\") " pod="openstack/dnsmasq-dns-c8d8d886c-dsztl" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.227537 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1654f6a5-1abf-4c9e-b956-3bfc60c7077c-dns-swift-storage-0\") pod \"dnsmasq-dns-c8d8d886c-dsztl\" (UID: \"1654f6a5-1abf-4c9e-b956-3bfc60c7077c\") " pod="openstack/dnsmasq-dns-c8d8d886c-dsztl" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.227569 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1654f6a5-1abf-4c9e-b956-3bfc60c7077c-ovsdbserver-sb\") pod \"dnsmasq-dns-c8d8d886c-dsztl\" (UID: \"1654f6a5-1abf-4c9e-b956-3bfc60c7077c\") " pod="openstack/dnsmasq-dns-c8d8d886c-dsztl" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.227590 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/1654f6a5-1abf-4c9e-b956-3bfc60c7077c-openstack-edpm-ipam\") pod \"dnsmasq-dns-c8d8d886c-dsztl\" (UID: \"1654f6a5-1abf-4c9e-b956-3bfc60c7077c\") " pod="openstack/dnsmasq-dns-c8d8d886c-dsztl" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.227662 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1654f6a5-1abf-4c9e-b956-3bfc60c7077c-config\") pod \"dnsmasq-dns-c8d8d886c-dsztl\" (UID: \"1654f6a5-1abf-4c9e-b956-3bfc60c7077c\") " pod="openstack/dnsmasq-dns-c8d8d886c-dsztl" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.227685 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1654f6a5-1abf-4c9e-b956-3bfc60c7077c-ovsdbserver-nb\") pod \"dnsmasq-dns-c8d8d886c-dsztl\" (UID: \"1654f6a5-1abf-4c9e-b956-3bfc60c7077c\") " pod="openstack/dnsmasq-dns-c8d8d886c-dsztl" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.247566 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.249474 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.251758 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.266473 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.287612 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.334979 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9b9n\" (UniqueName: \"kubernetes.io/projected/58be64fa-db82-4b4a-9b66-716bba3e27c8-kube-api-access-l9b9n\") pod \"manila-api-0\" (UID: \"58be64fa-db82-4b4a-9b66-716bba3e27c8\") " pod="openstack/manila-api-0" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.335066 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9h2qp\" (UniqueName: \"kubernetes.io/projected/1654f6a5-1abf-4c9e-b956-3bfc60c7077c-kube-api-access-9h2qp\") pod \"dnsmasq-dns-c8d8d886c-dsztl\" (UID: \"1654f6a5-1abf-4c9e-b956-3bfc60c7077c\") " pod="openstack/dnsmasq-dns-c8d8d886c-dsztl" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.335091 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1654f6a5-1abf-4c9e-b956-3bfc60c7077c-dns-swift-storage-0\") pod \"dnsmasq-dns-c8d8d886c-dsztl\" (UID: \"1654f6a5-1abf-4c9e-b956-3bfc60c7077c\") " pod="openstack/dnsmasq-dns-c8d8d886c-dsztl" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.335235 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/58be64fa-db82-4b4a-9b66-716bba3e27c8-config-data-custom\") pod \"manila-api-0\" (UID: \"58be64fa-db82-4b4a-9b66-716bba3e27c8\") " pod="openstack/manila-api-0" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.336261 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1654f6a5-1abf-4c9e-b956-3bfc60c7077c-ovsdbserver-sb\") pod \"dnsmasq-dns-c8d8d886c-dsztl\" (UID: \"1654f6a5-1abf-4c9e-b956-3bfc60c7077c\") " pod="openstack/dnsmasq-dns-c8d8d886c-dsztl" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.336267 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1654f6a5-1abf-4c9e-b956-3bfc60c7077c-dns-swift-storage-0\") pod \"dnsmasq-dns-c8d8d886c-dsztl\" (UID: \"1654f6a5-1abf-4c9e-b956-3bfc60c7077c\") " pod="openstack/dnsmasq-dns-c8d8d886c-dsztl" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.335264 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1654f6a5-1abf-4c9e-b956-3bfc60c7077c-ovsdbserver-sb\") pod \"dnsmasq-dns-c8d8d886c-dsztl\" (UID: \"1654f6a5-1abf-4c9e-b956-3bfc60c7077c\") " pod="openstack/dnsmasq-dns-c8d8d886c-dsztl" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.336430 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/1654f6a5-1abf-4c9e-b956-3bfc60c7077c-openstack-edpm-ipam\") pod \"dnsmasq-dns-c8d8d886c-dsztl\" (UID: \"1654f6a5-1abf-4c9e-b956-3bfc60c7077c\") " pod="openstack/dnsmasq-dns-c8d8d886c-dsztl" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.337094 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/1654f6a5-1abf-4c9e-b956-3bfc60c7077c-openstack-edpm-ipam\") pod 
\"dnsmasq-dns-c8d8d886c-dsztl\" (UID: \"1654f6a5-1abf-4c9e-b956-3bfc60c7077c\") " pod="openstack/dnsmasq-dns-c8d8d886c-dsztl" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.337393 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58be64fa-db82-4b4a-9b66-716bba3e27c8-scripts\") pod \"manila-api-0\" (UID: \"58be64fa-db82-4b4a-9b66-716bba3e27c8\") " pod="openstack/manila-api-0" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.337449 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/58be64fa-db82-4b4a-9b66-716bba3e27c8-etc-machine-id\") pod \"manila-api-0\" (UID: \"58be64fa-db82-4b4a-9b66-716bba3e27c8\") " pod="openstack/manila-api-0" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.337496 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1654f6a5-1abf-4c9e-b956-3bfc60c7077c-config\") pod \"dnsmasq-dns-c8d8d886c-dsztl\" (UID: \"1654f6a5-1abf-4c9e-b956-3bfc60c7077c\") " pod="openstack/dnsmasq-dns-c8d8d886c-dsztl" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.337541 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1654f6a5-1abf-4c9e-b956-3bfc60c7077c-ovsdbserver-nb\") pod \"dnsmasq-dns-c8d8d886c-dsztl\" (UID: \"1654f6a5-1abf-4c9e-b956-3bfc60c7077c\") " pod="openstack/dnsmasq-dns-c8d8d886c-dsztl" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.337567 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1654f6a5-1abf-4c9e-b956-3bfc60c7077c-dns-svc\") pod \"dnsmasq-dns-c8d8d886c-dsztl\" (UID: \"1654f6a5-1abf-4c9e-b956-3bfc60c7077c\") " pod="openstack/dnsmasq-dns-c8d8d886c-dsztl" Oct 11 
02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.337663 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58be64fa-db82-4b4a-9b66-716bba3e27c8-config-data\") pod \"manila-api-0\" (UID: \"58be64fa-db82-4b4a-9b66-716bba3e27c8\") " pod="openstack/manila-api-0" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.337800 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58be64fa-db82-4b4a-9b66-716bba3e27c8-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"58be64fa-db82-4b4a-9b66-716bba3e27c8\") " pod="openstack/manila-api-0" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.337817 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58be64fa-db82-4b4a-9b66-716bba3e27c8-logs\") pod \"manila-api-0\" (UID: \"58be64fa-db82-4b4a-9b66-716bba3e27c8\") " pod="openstack/manila-api-0" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.338433 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1654f6a5-1abf-4c9e-b956-3bfc60c7077c-config\") pod \"dnsmasq-dns-c8d8d886c-dsztl\" (UID: \"1654f6a5-1abf-4c9e-b956-3bfc60c7077c\") " pod="openstack/dnsmasq-dns-c8d8d886c-dsztl" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.338692 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1654f6a5-1abf-4c9e-b956-3bfc60c7077c-ovsdbserver-nb\") pod \"dnsmasq-dns-c8d8d886c-dsztl\" (UID: \"1654f6a5-1abf-4c9e-b956-3bfc60c7077c\") " pod="openstack/dnsmasq-dns-c8d8d886c-dsztl" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.339347 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/1654f6a5-1abf-4c9e-b956-3bfc60c7077c-dns-svc\") pod \"dnsmasq-dns-c8d8d886c-dsztl\" (UID: \"1654f6a5-1abf-4c9e-b956-3bfc60c7077c\") " pod="openstack/dnsmasq-dns-c8d8d886c-dsztl" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.354309 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9h2qp\" (UniqueName: \"kubernetes.io/projected/1654f6a5-1abf-4c9e-b956-3bfc60c7077c-kube-api-access-9h2qp\") pod \"dnsmasq-dns-c8d8d886c-dsztl\" (UID: \"1654f6a5-1abf-4c9e-b956-3bfc60c7077c\") " pod="openstack/dnsmasq-dns-c8d8d886c-dsztl" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.415071 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c8d8d886c-dsztl" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.440522 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58be64fa-db82-4b4a-9b66-716bba3e27c8-scripts\") pod \"manila-api-0\" (UID: \"58be64fa-db82-4b4a-9b66-716bba3e27c8\") " pod="openstack/manila-api-0" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.440808 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/58be64fa-db82-4b4a-9b66-716bba3e27c8-etc-machine-id\") pod \"manila-api-0\" (UID: \"58be64fa-db82-4b4a-9b66-716bba3e27c8\") " pod="openstack/manila-api-0" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.440905 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58be64fa-db82-4b4a-9b66-716bba3e27c8-config-data\") pod \"manila-api-0\" (UID: \"58be64fa-db82-4b4a-9b66-716bba3e27c8\") " pod="openstack/manila-api-0" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.440952 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/58be64fa-db82-4b4a-9b66-716bba3e27c8-logs\") pod \"manila-api-0\" (UID: \"58be64fa-db82-4b4a-9b66-716bba3e27c8\") " pod="openstack/manila-api-0" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.440969 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58be64fa-db82-4b4a-9b66-716bba3e27c8-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"58be64fa-db82-4b4a-9b66-716bba3e27c8\") " pod="openstack/manila-api-0" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.440997 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9b9n\" (UniqueName: \"kubernetes.io/projected/58be64fa-db82-4b4a-9b66-716bba3e27c8-kube-api-access-l9b9n\") pod \"manila-api-0\" (UID: \"58be64fa-db82-4b4a-9b66-716bba3e27c8\") " pod="openstack/manila-api-0" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.441058 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/58be64fa-db82-4b4a-9b66-716bba3e27c8-config-data-custom\") pod \"manila-api-0\" (UID: \"58be64fa-db82-4b4a-9b66-716bba3e27c8\") " pod="openstack/manila-api-0" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.445505 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58be64fa-db82-4b4a-9b66-716bba3e27c8-logs\") pod \"manila-api-0\" (UID: \"58be64fa-db82-4b4a-9b66-716bba3e27c8\") " pod="openstack/manila-api-0" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.445564 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/58be64fa-db82-4b4a-9b66-716bba3e27c8-etc-machine-id\") pod \"manila-api-0\" (UID: \"58be64fa-db82-4b4a-9b66-716bba3e27c8\") " pod="openstack/manila-api-0" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 
02:13:55.446957 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58be64fa-db82-4b4a-9b66-716bba3e27c8-config-data\") pod \"manila-api-0\" (UID: \"58be64fa-db82-4b4a-9b66-716bba3e27c8\") " pod="openstack/manila-api-0" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.448460 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58be64fa-db82-4b4a-9b66-716bba3e27c8-scripts\") pod \"manila-api-0\" (UID: \"58be64fa-db82-4b4a-9b66-716bba3e27c8\") " pod="openstack/manila-api-0" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.448618 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58be64fa-db82-4b4a-9b66-716bba3e27c8-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"58be64fa-db82-4b4a-9b66-716bba3e27c8\") " pod="openstack/manila-api-0" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.474918 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9b9n\" (UniqueName: \"kubernetes.io/projected/58be64fa-db82-4b4a-9b66-716bba3e27c8-kube-api-access-l9b9n\") pod \"manila-api-0\" (UID: \"58be64fa-db82-4b4a-9b66-716bba3e27c8\") " pod="openstack/manila-api-0" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.477609 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/58be64fa-db82-4b4a-9b66-716bba3e27c8-config-data-custom\") pod \"manila-api-0\" (UID: \"58be64fa-db82-4b4a-9b66-716bba3e27c8\") " pod="openstack/manila-api-0" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.487845 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.653903 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-859df554d7-5h8zd" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.653135 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-859df554d7-5h8zd" event={"ID":"c920be26-905f-40c5-a4b6-c80b9428e662","Type":"ContainerDied","Data":"859de03282d67c6cada03fe841cb450631c1709afa53e8d0e2d467389ad31c9f"} Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.654070 4743 scope.go:117] "RemoveContainer" containerID="fee3ecff320c6c393ce9c7d5f264d8d7e967976cdd6a7ff78943733e6965bb28" Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.705069 4743 generic.go:334] "Generic (PLEG): container finished" podID="d39c4e33-bfe6-4c48-bc00-f2713e45103b" containerID="c70ade596b21bb0b30a5c94e6dc731d8e14182dcea30b227dee18d688140bc81" exitCode=137 Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.705309 4743 generic.go:334] "Generic (PLEG): container finished" podID="d39c4e33-bfe6-4c48-bc00-f2713e45103b" containerID="960a4a30c95015be9bd9b9cafce3f72f8b46b567ff972d6cf002c7f8c8f49082" exitCode=137 Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.705136 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9bcfcb9cc-5wvcg" event={"ID":"d39c4e33-bfe6-4c48-bc00-f2713e45103b","Type":"ContainerDied","Data":"c70ade596b21bb0b30a5c94e6dc731d8e14182dcea30b227dee18d688140bc81"} Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.705341 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9bcfcb9cc-5wvcg" event={"ID":"d39c4e33-bfe6-4c48-bc00-f2713e45103b","Type":"ContainerDied","Data":"960a4a30c95015be9bd9b9cafce3f72f8b46b567ff972d6cf002c7f8c8f49082"} Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.839954 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-859df554d7-5h8zd"] Oct 11 02:13:55 crc kubenswrapper[4743]: I1011 02:13:55.855356 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/horizon-859df554d7-5h8zd"] Oct 11 02:13:56 crc kubenswrapper[4743]: I1011 02:13:56.000127 4743 scope.go:117] "RemoveContainer" containerID="ea6c2dcdccf12c57ed883eab65300e78c176d46beb6bce3a1f66dea3733277f5" Oct 11 02:13:56 crc kubenswrapper[4743]: I1011 02:13:56.189883 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c920be26-905f-40c5-a4b6-c80b9428e662" path="/var/lib/kubelet/pods/c920be26-905f-40c5-a4b6-c80b9428e662/volumes" Oct 11 02:13:56 crc kubenswrapper[4743]: I1011 02:13:56.280907 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Oct 11 02:13:56 crc kubenswrapper[4743]: I1011 02:13:56.487460 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-9bcfcb9cc-5wvcg" Oct 11 02:13:56 crc kubenswrapper[4743]: I1011 02:13:56.515832 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Oct 11 02:13:56 crc kubenswrapper[4743]: I1011 02:13:56.579120 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d39c4e33-bfe6-4c48-bc00-f2713e45103b-config-data\") pod \"d39c4e33-bfe6-4c48-bc00-f2713e45103b\" (UID: \"d39c4e33-bfe6-4c48-bc00-f2713e45103b\") " Oct 11 02:13:56 crc kubenswrapper[4743]: I1011 02:13:56.579351 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d39c4e33-bfe6-4c48-bc00-f2713e45103b-logs\") pod \"d39c4e33-bfe6-4c48-bc00-f2713e45103b\" (UID: \"d39c4e33-bfe6-4c48-bc00-f2713e45103b\") " Oct 11 02:13:56 crc kubenswrapper[4743]: I1011 02:13:56.579382 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d39c4e33-bfe6-4c48-bc00-f2713e45103b-horizon-secret-key\") pod \"d39c4e33-bfe6-4c48-bc00-f2713e45103b\" (UID: 
\"d39c4e33-bfe6-4c48-bc00-f2713e45103b\") " Oct 11 02:13:56 crc kubenswrapper[4743]: I1011 02:13:56.579475 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7r84v\" (UniqueName: \"kubernetes.io/projected/d39c4e33-bfe6-4c48-bc00-f2713e45103b-kube-api-access-7r84v\") pod \"d39c4e33-bfe6-4c48-bc00-f2713e45103b\" (UID: \"d39c4e33-bfe6-4c48-bc00-f2713e45103b\") " Oct 11 02:13:56 crc kubenswrapper[4743]: I1011 02:13:56.579518 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d39c4e33-bfe6-4c48-bc00-f2713e45103b-scripts\") pod \"d39c4e33-bfe6-4c48-bc00-f2713e45103b\" (UID: \"d39c4e33-bfe6-4c48-bc00-f2713e45103b\") " Oct 11 02:13:56 crc kubenswrapper[4743]: I1011 02:13:56.580842 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d39c4e33-bfe6-4c48-bc00-f2713e45103b-logs" (OuterVolumeSpecName: "logs") pod "d39c4e33-bfe6-4c48-bc00-f2713e45103b" (UID: "d39c4e33-bfe6-4c48-bc00-f2713e45103b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 02:13:56 crc kubenswrapper[4743]: I1011 02:13:56.581648 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d39c4e33-bfe6-4c48-bc00-f2713e45103b-logs\") on node \"crc\" DevicePath \"\"" Oct 11 02:13:56 crc kubenswrapper[4743]: I1011 02:13:56.589972 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d39c4e33-bfe6-4c48-bc00-f2713e45103b-kube-api-access-7r84v" (OuterVolumeSpecName: "kube-api-access-7r84v") pod "d39c4e33-bfe6-4c48-bc00-f2713e45103b" (UID: "d39c4e33-bfe6-4c48-bc00-f2713e45103b"). InnerVolumeSpecName "kube-api-access-7r84v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 02:13:56 crc kubenswrapper[4743]: I1011 02:13:56.590103 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d39c4e33-bfe6-4c48-bc00-f2713e45103b-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "d39c4e33-bfe6-4c48-bc00-f2713e45103b" (UID: "d39c4e33-bfe6-4c48-bc00-f2713e45103b"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 02:13:56 crc kubenswrapper[4743]: I1011 02:13:56.622517 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d39c4e33-bfe6-4c48-bc00-f2713e45103b-config-data" (OuterVolumeSpecName: "config-data") pod "d39c4e33-bfe6-4c48-bc00-f2713e45103b" (UID: "d39c4e33-bfe6-4c48-bc00-f2713e45103b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 02:13:56 crc kubenswrapper[4743]: I1011 02:13:56.624068 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d39c4e33-bfe6-4c48-bc00-f2713e45103b-scripts" (OuterVolumeSpecName: "scripts") pod "d39c4e33-bfe6-4c48-bc00-f2713e45103b" (UID: "d39c4e33-bfe6-4c48-bc00-f2713e45103b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 02:13:56 crc kubenswrapper[4743]: I1011 02:13:56.686252 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7r84v\" (UniqueName: \"kubernetes.io/projected/d39c4e33-bfe6-4c48-bc00-f2713e45103b-kube-api-access-7r84v\") on node \"crc\" DevicePath \"\"" Oct 11 02:13:56 crc kubenswrapper[4743]: I1011 02:13:56.686287 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d39c4e33-bfe6-4c48-bc00-f2713e45103b-scripts\") on node \"crc\" DevicePath \"\"" Oct 11 02:13:56 crc kubenswrapper[4743]: I1011 02:13:56.686301 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d39c4e33-bfe6-4c48-bc00-f2713e45103b-config-data\") on node \"crc\" DevicePath \"\"" Oct 11 02:13:56 crc kubenswrapper[4743]: I1011 02:13:56.686316 4743 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d39c4e33-bfe6-4c48-bc00-f2713e45103b-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 11 02:13:56 crc kubenswrapper[4743]: I1011 02:13:56.726167 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"e1a8e9c6-a225-4017-8752-c980f44618c5","Type":"ContainerStarted","Data":"b63757734f2b0d1fd774e0248013ab3bc0d6593761fc3f21f58e43c8c58da731"} Oct 11 02:13:56 crc kubenswrapper[4743]: I1011 02:13:56.727476 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-c8d8d886c-dsztl"] Oct 11 02:13:56 crc kubenswrapper[4743]: I1011 02:13:56.733350 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9bcfcb9cc-5wvcg" event={"ID":"d39c4e33-bfe6-4c48-bc00-f2713e45103b","Type":"ContainerDied","Data":"6d18370203a2225428f46f70802e9854a7715cc03e791b89a5c53edea4a34ee8"} Oct 11 02:13:56 crc kubenswrapper[4743]: I1011 02:13:56.733382 4743 scope.go:117] 
"RemoveContainer" containerID="c70ade596b21bb0b30a5c94e6dc731d8e14182dcea30b227dee18d688140bc81" Oct 11 02:13:56 crc kubenswrapper[4743]: I1011 02:13:56.733481 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-9bcfcb9cc-5wvcg" Oct 11 02:13:56 crc kubenswrapper[4743]: I1011 02:13:56.739604 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"2d3cbb84-0280-484f-a137-47cfea670423","Type":"ContainerStarted","Data":"5c0da06da740f7ffe876ffc857cb702d17fd1cf87bf98d3dfe6b1003cc1b2c21"} Oct 11 02:13:56 crc kubenswrapper[4743]: I1011 02:13:56.797646 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-9bcfcb9cc-5wvcg"] Oct 11 02:13:56 crc kubenswrapper[4743]: I1011 02:13:56.805906 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-9bcfcb9cc-5wvcg"] Oct 11 02:13:56 crc kubenswrapper[4743]: I1011 02:13:56.945291 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Oct 11 02:13:56 crc kubenswrapper[4743]: I1011 02:13:56.954084 4743 scope.go:117] "RemoveContainer" containerID="960a4a30c95015be9bd9b9cafce3f72f8b46b567ff972d6cf002c7f8c8f49082" Oct 11 02:13:57 crc kubenswrapper[4743]: I1011 02:13:57.553689 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-5bc9759f8-b6qgh" Oct 11 02:13:57 crc kubenswrapper[4743]: I1011 02:13:57.803155 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"58be64fa-db82-4b4a-9b66-716bba3e27c8","Type":"ContainerStarted","Data":"5b5fbb7d545226ef809bd5b8e30d4eff28355851ab3983cb462f20ae64142ed0"} Oct 11 02:13:57 crc kubenswrapper[4743]: I1011 02:13:57.803482 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" 
event={"ID":"58be64fa-db82-4b4a-9b66-716bba3e27c8","Type":"ContainerStarted","Data":"8fbe4ed73a2c4a935d1a96dc7e79843e3a1d6f221825c11856ad257bcb0b5cb4"} Oct 11 02:13:57 crc kubenswrapper[4743]: I1011 02:13:57.830171 4743 generic.go:334] "Generic (PLEG): container finished" podID="1654f6a5-1abf-4c9e-b956-3bfc60c7077c" containerID="ec9d4ded50135bd465a8ebf17a2694f311353d94e6ebe0bbd26049744489e5b5" exitCode=0 Oct 11 02:13:57 crc kubenswrapper[4743]: I1011 02:13:57.830216 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c8d8d886c-dsztl" event={"ID":"1654f6a5-1abf-4c9e-b956-3bfc60c7077c","Type":"ContainerDied","Data":"ec9d4ded50135bd465a8ebf17a2694f311353d94e6ebe0bbd26049744489e5b5"} Oct 11 02:13:57 crc kubenswrapper[4743]: I1011 02:13:57.830240 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c8d8d886c-dsztl" event={"ID":"1654f6a5-1abf-4c9e-b956-3bfc60c7077c","Type":"ContainerStarted","Data":"1b7d425f382971b70d0242cfbe66adfb0a6ba2d00a2240fafcaee5c2519518c9"} Oct 11 02:13:58 crc kubenswrapper[4743]: I1011 02:13:58.182668 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d39c4e33-bfe6-4c48-bc00-f2713e45103b" path="/var/lib/kubelet/pods/d39c4e33-bfe6-4c48-bc00-f2713e45103b/volumes" Oct 11 02:13:58 crc kubenswrapper[4743]: I1011 02:13:58.238062 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-f46b79456-dm9d6" Oct 11 02:13:58 crc kubenswrapper[4743]: I1011 02:13:58.351300 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5bc9759f8-b6qgh"] Oct 11 02:13:58 crc kubenswrapper[4743]: I1011 02:13:58.351822 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5bc9759f8-b6qgh" podUID="d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd" containerName="horizon-log" containerID="cri-o://d1b484db2a51b4a01449527fdf8e1ed8a08f52dd1f27e1abe2a9f0f1d2e2b660" gracePeriod=30 Oct 11 02:13:58 crc 
kubenswrapper[4743]: I1011 02:13:58.352214 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5bc9759f8-b6qgh" podUID="d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd" containerName="horizon" containerID="cri-o://1ae29a0313861a928c7541c1589ba7b739f50bb56d203301166706e6650049d2" gracePeriod=30 Oct 11 02:13:58 crc kubenswrapper[4743]: I1011 02:13:58.541403 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-api-0"] Oct 11 02:13:58 crc kubenswrapper[4743]: I1011 02:13:58.848075 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"58be64fa-db82-4b4a-9b66-716bba3e27c8","Type":"ContainerStarted","Data":"65ed5539fd10598e9c4f0f177fdde4ab6b24f0dfbf85872aebb023c2e92bdcbd"} Oct 11 02:13:58 crc kubenswrapper[4743]: I1011 02:13:58.848185 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Oct 11 02:13:58 crc kubenswrapper[4743]: I1011 02:13:58.848238 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-api-0" podUID="58be64fa-db82-4b4a-9b66-716bba3e27c8" containerName="manila-api" containerID="cri-o://65ed5539fd10598e9c4f0f177fdde4ab6b24f0dfbf85872aebb023c2e92bdcbd" gracePeriod=30 Oct 11 02:13:58 crc kubenswrapper[4743]: I1011 02:13:58.848196 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-api-0" podUID="58be64fa-db82-4b4a-9b66-716bba3e27c8" containerName="manila-api-log" containerID="cri-o://5b5fbb7d545226ef809bd5b8e30d4eff28355851ab3983cb462f20ae64142ed0" gracePeriod=30 Oct 11 02:13:58 crc kubenswrapper[4743]: I1011 02:13:58.853292 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c8d8d886c-dsztl" event={"ID":"1654f6a5-1abf-4c9e-b956-3bfc60c7077c","Type":"ContainerStarted","Data":"d071f9e62c51c3509c0cba82bb163faff17bd94796ff628df94018b78ce5a6d2"} Oct 11 02:13:58 crc kubenswrapper[4743]: I1011 
02:13:58.853623 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-c8d8d886c-dsztl" Oct 11 02:13:58 crc kubenswrapper[4743]: I1011 02:13:58.856846 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"e1a8e9c6-a225-4017-8752-c980f44618c5","Type":"ContainerStarted","Data":"a76fabdfdc6e7cd76ae905a7c8f4ea711990506bc58c9dc7cb4fbe954418498a"} Oct 11 02:13:58 crc kubenswrapper[4743]: I1011 02:13:58.856907 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"e1a8e9c6-a225-4017-8752-c980f44618c5","Type":"ContainerStarted","Data":"eccefbca5db579ad67191df996781478f4dff0f0d65c0d4218c5288d73943c07"} Oct 11 02:13:58 crc kubenswrapper[4743]: I1011 02:13:58.870172 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=3.87015485 podStartE2EDuration="3.87015485s" podCreationTimestamp="2025-10-11 02:13:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 02:13:58.867377741 +0000 UTC m=+4933.520358138" watchObservedRunningTime="2025-10-11 02:13:58.87015485 +0000 UTC m=+4933.523135237" Oct 11 02:13:58 crc kubenswrapper[4743]: I1011 02:13:58.893771 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-c8d8d886c-dsztl" podStartSLOduration=3.893748515 podStartE2EDuration="3.893748515s" podCreationTimestamp="2025-10-11 02:13:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 02:13:58.887295905 +0000 UTC m=+4933.540276312" watchObservedRunningTime="2025-10-11 02:13:58.893748515 +0000 UTC m=+4933.546728902" Oct 11 02:13:58 crc kubenswrapper[4743]: I1011 02:13:58.917537 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/manila-scheduler-0" podStartSLOduration=3.955180405 podStartE2EDuration="4.917515994s" podCreationTimestamp="2025-10-11 02:13:54 +0000 UTC" firstStartedPulling="2025-10-11 02:13:56.251512436 +0000 UTC m=+4930.904492843" lastFinishedPulling="2025-10-11 02:13:57.213848035 +0000 UTC m=+4931.866828432" observedRunningTime="2025-10-11 02:13:58.909818623 +0000 UTC m=+4933.562799020" watchObservedRunningTime="2025-10-11 02:13:58.917515994 +0000 UTC m=+4933.570496381" Oct 11 02:13:59 crc kubenswrapper[4743]: I1011 02:13:59.719923 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Oct 11 02:13:59 crc kubenswrapper[4743]: I1011 02:13:59.808122 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/58be64fa-db82-4b4a-9b66-716bba3e27c8-etc-machine-id\") pod \"58be64fa-db82-4b4a-9b66-716bba3e27c8\" (UID: \"58be64fa-db82-4b4a-9b66-716bba3e27c8\") " Oct 11 02:13:59 crc kubenswrapper[4743]: I1011 02:13:59.808235 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58be64fa-db82-4b4a-9b66-716bba3e27c8-scripts\") pod \"58be64fa-db82-4b4a-9b66-716bba3e27c8\" (UID: \"58be64fa-db82-4b4a-9b66-716bba3e27c8\") " Oct 11 02:13:59 crc kubenswrapper[4743]: I1011 02:13:59.808283 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58be64fa-db82-4b4a-9b66-716bba3e27c8-logs\") pod \"58be64fa-db82-4b4a-9b66-716bba3e27c8\" (UID: \"58be64fa-db82-4b4a-9b66-716bba3e27c8\") " Oct 11 02:13:59 crc kubenswrapper[4743]: I1011 02:13:59.808322 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58be64fa-db82-4b4a-9b66-716bba3e27c8-combined-ca-bundle\") pod \"58be64fa-db82-4b4a-9b66-716bba3e27c8\" (UID: 
\"58be64fa-db82-4b4a-9b66-716bba3e27c8\") " Oct 11 02:13:59 crc kubenswrapper[4743]: I1011 02:13:59.808442 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58be64fa-db82-4b4a-9b66-716bba3e27c8-config-data\") pod \"58be64fa-db82-4b4a-9b66-716bba3e27c8\" (UID: \"58be64fa-db82-4b4a-9b66-716bba3e27c8\") " Oct 11 02:13:59 crc kubenswrapper[4743]: I1011 02:13:59.808495 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9b9n\" (UniqueName: \"kubernetes.io/projected/58be64fa-db82-4b4a-9b66-716bba3e27c8-kube-api-access-l9b9n\") pod \"58be64fa-db82-4b4a-9b66-716bba3e27c8\" (UID: \"58be64fa-db82-4b4a-9b66-716bba3e27c8\") " Oct 11 02:13:59 crc kubenswrapper[4743]: I1011 02:13:59.808543 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/58be64fa-db82-4b4a-9b66-716bba3e27c8-config-data-custom\") pod \"58be64fa-db82-4b4a-9b66-716bba3e27c8\" (UID: \"58be64fa-db82-4b4a-9b66-716bba3e27c8\") " Oct 11 02:13:59 crc kubenswrapper[4743]: I1011 02:13:59.808983 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58be64fa-db82-4b4a-9b66-716bba3e27c8-logs" (OuterVolumeSpecName: "logs") pod "58be64fa-db82-4b4a-9b66-716bba3e27c8" (UID: "58be64fa-db82-4b4a-9b66-716bba3e27c8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 02:13:59 crc kubenswrapper[4743]: I1011 02:13:59.809024 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/58be64fa-db82-4b4a-9b66-716bba3e27c8-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "58be64fa-db82-4b4a-9b66-716bba3e27c8" (UID: "58be64fa-db82-4b4a-9b66-716bba3e27c8"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 02:13:59 crc kubenswrapper[4743]: I1011 02:13:59.816251 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58be64fa-db82-4b4a-9b66-716bba3e27c8-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "58be64fa-db82-4b4a-9b66-716bba3e27c8" (UID: "58be64fa-db82-4b4a-9b66-716bba3e27c8"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 02:13:59 crc kubenswrapper[4743]: I1011 02:13:59.824991 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58be64fa-db82-4b4a-9b66-716bba3e27c8-kube-api-access-l9b9n" (OuterVolumeSpecName: "kube-api-access-l9b9n") pod "58be64fa-db82-4b4a-9b66-716bba3e27c8" (UID: "58be64fa-db82-4b4a-9b66-716bba3e27c8"). InnerVolumeSpecName "kube-api-access-l9b9n". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 02:13:59 crc kubenswrapper[4743]: I1011 02:13:59.827849 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58be64fa-db82-4b4a-9b66-716bba3e27c8-scripts" (OuterVolumeSpecName: "scripts") pod "58be64fa-db82-4b4a-9b66-716bba3e27c8" (UID: "58be64fa-db82-4b4a-9b66-716bba3e27c8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 02:13:59 crc kubenswrapper[4743]: I1011 02:13:59.867368 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58be64fa-db82-4b4a-9b66-716bba3e27c8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "58be64fa-db82-4b4a-9b66-716bba3e27c8" (UID: "58be64fa-db82-4b4a-9b66-716bba3e27c8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 02:13:59 crc kubenswrapper[4743]: I1011 02:13:59.878821 4743 generic.go:334] "Generic (PLEG): container finished" podID="58be64fa-db82-4b4a-9b66-716bba3e27c8" containerID="65ed5539fd10598e9c4f0f177fdde4ab6b24f0dfbf85872aebb023c2e92bdcbd" exitCode=143 Oct 11 02:13:59 crc kubenswrapper[4743]: I1011 02:13:59.879126 4743 generic.go:334] "Generic (PLEG): container finished" podID="58be64fa-db82-4b4a-9b66-716bba3e27c8" containerID="5b5fbb7d545226ef809bd5b8e30d4eff28355851ab3983cb462f20ae64142ed0" exitCode=143 Oct 11 02:13:59 crc kubenswrapper[4743]: I1011 02:13:59.878933 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"58be64fa-db82-4b4a-9b66-716bba3e27c8","Type":"ContainerDied","Data":"65ed5539fd10598e9c4f0f177fdde4ab6b24f0dfbf85872aebb023c2e92bdcbd"} Oct 11 02:13:59 crc kubenswrapper[4743]: I1011 02:13:59.879242 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"58be64fa-db82-4b4a-9b66-716bba3e27c8","Type":"ContainerDied","Data":"5b5fbb7d545226ef809bd5b8e30d4eff28355851ab3983cb462f20ae64142ed0"} Oct 11 02:13:59 crc kubenswrapper[4743]: I1011 02:13:59.879274 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"58be64fa-db82-4b4a-9b66-716bba3e27c8","Type":"ContainerDied","Data":"8fbe4ed73a2c4a935d1a96dc7e79843e3a1d6f221825c11856ad257bcb0b5cb4"} Oct 11 02:13:59 crc kubenswrapper[4743]: I1011 02:13:59.879292 4743 scope.go:117] "RemoveContainer" containerID="65ed5539fd10598e9c4f0f177fdde4ab6b24f0dfbf85872aebb023c2e92bdcbd" Oct 11 02:13:59 crc kubenswrapper[4743]: I1011 02:13:59.878917 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Oct 11 02:13:59 crc kubenswrapper[4743]: I1011 02:13:59.882090 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58be64fa-db82-4b4a-9b66-716bba3e27c8-config-data" (OuterVolumeSpecName: "config-data") pod "58be64fa-db82-4b4a-9b66-716bba3e27c8" (UID: "58be64fa-db82-4b4a-9b66-716bba3e27c8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 02:13:59 crc kubenswrapper[4743]: I1011 02:13:59.911639 4743 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/58be64fa-db82-4b4a-9b66-716bba3e27c8-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 11 02:13:59 crc kubenswrapper[4743]: I1011 02:13:59.911866 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58be64fa-db82-4b4a-9b66-716bba3e27c8-scripts\") on node \"crc\" DevicePath \"\"" Oct 11 02:13:59 crc kubenswrapper[4743]: I1011 02:13:59.911985 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58be64fa-db82-4b4a-9b66-716bba3e27c8-logs\") on node \"crc\" DevicePath \"\"" Oct 11 02:13:59 crc kubenswrapper[4743]: I1011 02:13:59.912046 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58be64fa-db82-4b4a-9b66-716bba3e27c8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 02:13:59 crc kubenswrapper[4743]: I1011 02:13:59.912101 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58be64fa-db82-4b4a-9b66-716bba3e27c8-config-data\") on node \"crc\" DevicePath \"\"" Oct 11 02:13:59 crc kubenswrapper[4743]: I1011 02:13:59.912152 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9b9n\" (UniqueName: 
\"kubernetes.io/projected/58be64fa-db82-4b4a-9b66-716bba3e27c8-kube-api-access-l9b9n\") on node \"crc\" DevicePath \"\"" Oct 11 02:13:59 crc kubenswrapper[4743]: I1011 02:13:59.912205 4743 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/58be64fa-db82-4b4a-9b66-716bba3e27c8-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 11 02:13:59 crc kubenswrapper[4743]: I1011 02:13:59.969329 4743 scope.go:117] "RemoveContainer" containerID="5b5fbb7d545226ef809bd5b8e30d4eff28355851ab3983cb462f20ae64142ed0" Oct 11 02:13:59 crc kubenswrapper[4743]: I1011 02:13:59.991425 4743 scope.go:117] "RemoveContainer" containerID="65ed5539fd10598e9c4f0f177fdde4ab6b24f0dfbf85872aebb023c2e92bdcbd" Oct 11 02:13:59 crc kubenswrapper[4743]: E1011 02:13:59.991813 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65ed5539fd10598e9c4f0f177fdde4ab6b24f0dfbf85872aebb023c2e92bdcbd\": container with ID starting with 65ed5539fd10598e9c4f0f177fdde4ab6b24f0dfbf85872aebb023c2e92bdcbd not found: ID does not exist" containerID="65ed5539fd10598e9c4f0f177fdde4ab6b24f0dfbf85872aebb023c2e92bdcbd" Oct 11 02:13:59 crc kubenswrapper[4743]: I1011 02:13:59.991842 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65ed5539fd10598e9c4f0f177fdde4ab6b24f0dfbf85872aebb023c2e92bdcbd"} err="failed to get container status \"65ed5539fd10598e9c4f0f177fdde4ab6b24f0dfbf85872aebb023c2e92bdcbd\": rpc error: code = NotFound desc = could not find container \"65ed5539fd10598e9c4f0f177fdde4ab6b24f0dfbf85872aebb023c2e92bdcbd\": container with ID starting with 65ed5539fd10598e9c4f0f177fdde4ab6b24f0dfbf85872aebb023c2e92bdcbd not found: ID does not exist" Oct 11 02:13:59 crc kubenswrapper[4743]: I1011 02:13:59.991874 4743 scope.go:117] "RemoveContainer" containerID="5b5fbb7d545226ef809bd5b8e30d4eff28355851ab3983cb462f20ae64142ed0" Oct 11 
02:13:59 crc kubenswrapper[4743]: E1011 02:13:59.992053 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b5fbb7d545226ef809bd5b8e30d4eff28355851ab3983cb462f20ae64142ed0\": container with ID starting with 5b5fbb7d545226ef809bd5b8e30d4eff28355851ab3983cb462f20ae64142ed0 not found: ID does not exist" containerID="5b5fbb7d545226ef809bd5b8e30d4eff28355851ab3983cb462f20ae64142ed0" Oct 11 02:13:59 crc kubenswrapper[4743]: I1011 02:13:59.992075 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b5fbb7d545226ef809bd5b8e30d4eff28355851ab3983cb462f20ae64142ed0"} err="failed to get container status \"5b5fbb7d545226ef809bd5b8e30d4eff28355851ab3983cb462f20ae64142ed0\": rpc error: code = NotFound desc = could not find container \"5b5fbb7d545226ef809bd5b8e30d4eff28355851ab3983cb462f20ae64142ed0\": container with ID starting with 5b5fbb7d545226ef809bd5b8e30d4eff28355851ab3983cb462f20ae64142ed0 not found: ID does not exist" Oct 11 02:13:59 crc kubenswrapper[4743]: I1011 02:13:59.992088 4743 scope.go:117] "RemoveContainer" containerID="65ed5539fd10598e9c4f0f177fdde4ab6b24f0dfbf85872aebb023c2e92bdcbd" Oct 11 02:13:59 crc kubenswrapper[4743]: I1011 02:13:59.992275 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65ed5539fd10598e9c4f0f177fdde4ab6b24f0dfbf85872aebb023c2e92bdcbd"} err="failed to get container status \"65ed5539fd10598e9c4f0f177fdde4ab6b24f0dfbf85872aebb023c2e92bdcbd\": rpc error: code = NotFound desc = could not find container \"65ed5539fd10598e9c4f0f177fdde4ab6b24f0dfbf85872aebb023c2e92bdcbd\": container with ID starting with 65ed5539fd10598e9c4f0f177fdde4ab6b24f0dfbf85872aebb023c2e92bdcbd not found: ID does not exist" Oct 11 02:13:59 crc kubenswrapper[4743]: I1011 02:13:59.992292 4743 scope.go:117] "RemoveContainer" 
containerID="5b5fbb7d545226ef809bd5b8e30d4eff28355851ab3983cb462f20ae64142ed0" Oct 11 02:13:59 crc kubenswrapper[4743]: I1011 02:13:59.992460 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b5fbb7d545226ef809bd5b8e30d4eff28355851ab3983cb462f20ae64142ed0"} err="failed to get container status \"5b5fbb7d545226ef809bd5b8e30d4eff28355851ab3983cb462f20ae64142ed0\": rpc error: code = NotFound desc = could not find container \"5b5fbb7d545226ef809bd5b8e30d4eff28355851ab3983cb462f20ae64142ed0\": container with ID starting with 5b5fbb7d545226ef809bd5b8e30d4eff28355851ab3983cb462f20ae64142ed0 not found: ID does not exist" Oct 11 02:14:00 crc kubenswrapper[4743]: I1011 02:14:00.206972 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-api-0"] Oct 11 02:14:00 crc kubenswrapper[4743]: I1011 02:14:00.228611 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-api-0"] Oct 11 02:14:00 crc kubenswrapper[4743]: I1011 02:14:00.246047 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Oct 11 02:14:00 crc kubenswrapper[4743]: E1011 02:14:00.246583 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58be64fa-db82-4b4a-9b66-716bba3e27c8" containerName="manila-api-log" Oct 11 02:14:00 crc kubenswrapper[4743]: I1011 02:14:00.246604 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="58be64fa-db82-4b4a-9b66-716bba3e27c8" containerName="manila-api-log" Oct 11 02:14:00 crc kubenswrapper[4743]: E1011 02:14:00.246620 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58be64fa-db82-4b4a-9b66-716bba3e27c8" containerName="manila-api" Oct 11 02:14:00 crc kubenswrapper[4743]: I1011 02:14:00.246628 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="58be64fa-db82-4b4a-9b66-716bba3e27c8" containerName="manila-api" Oct 11 02:14:00 crc kubenswrapper[4743]: E1011 02:14:00.246646 4743 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="d39c4e33-bfe6-4c48-bc00-f2713e45103b" containerName="horizon-log" Oct 11 02:14:00 crc kubenswrapper[4743]: I1011 02:14:00.246655 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d39c4e33-bfe6-4c48-bc00-f2713e45103b" containerName="horizon-log" Oct 11 02:14:00 crc kubenswrapper[4743]: E1011 02:14:00.246688 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d39c4e33-bfe6-4c48-bc00-f2713e45103b" containerName="horizon" Oct 11 02:14:00 crc kubenswrapper[4743]: I1011 02:14:00.246697 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d39c4e33-bfe6-4c48-bc00-f2713e45103b" containerName="horizon" Oct 11 02:14:00 crc kubenswrapper[4743]: I1011 02:14:00.246974 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="58be64fa-db82-4b4a-9b66-716bba3e27c8" containerName="manila-api" Oct 11 02:14:00 crc kubenswrapper[4743]: I1011 02:14:00.247001 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="58be64fa-db82-4b4a-9b66-716bba3e27c8" containerName="manila-api-log" Oct 11 02:14:00 crc kubenswrapper[4743]: I1011 02:14:00.247030 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d39c4e33-bfe6-4c48-bc00-f2713e45103b" containerName="horizon" Oct 11 02:14:00 crc kubenswrapper[4743]: I1011 02:14:00.247045 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d39c4e33-bfe6-4c48-bc00-f2713e45103b" containerName="horizon-log" Oct 11 02:14:00 crc kubenswrapper[4743]: I1011 02:14:00.248543 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Oct 11 02:14:00 crc kubenswrapper[4743]: I1011 02:14:00.253966 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-manila-public-svc" Oct 11 02:14:00 crc kubenswrapper[4743]: I1011 02:14:00.254192 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Oct 11 02:14:00 crc kubenswrapper[4743]: I1011 02:14:00.254533 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-manila-internal-svc" Oct 11 02:14:00 crc kubenswrapper[4743]: I1011 02:14:00.256586 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Oct 11 02:14:00 crc kubenswrapper[4743]: I1011 02:14:00.322191 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7959191f-6ca5-4f63-84e9-815b7378c505-config-data-custom\") pod \"manila-api-0\" (UID: \"7959191f-6ca5-4f63-84e9-815b7378c505\") " pod="openstack/manila-api-0" Oct 11 02:14:00 crc kubenswrapper[4743]: I1011 02:14:00.322250 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7959191f-6ca5-4f63-84e9-815b7378c505-scripts\") pod \"manila-api-0\" (UID: \"7959191f-6ca5-4f63-84e9-815b7378c505\") " pod="openstack/manila-api-0" Oct 11 02:14:00 crc kubenswrapper[4743]: I1011 02:14:00.322377 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7959191f-6ca5-4f63-84e9-815b7378c505-config-data\") pod \"manila-api-0\" (UID: \"7959191f-6ca5-4f63-84e9-815b7378c505\") " pod="openstack/manila-api-0" Oct 11 02:14:00 crc kubenswrapper[4743]: I1011 02:14:00.322528 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/7959191f-6ca5-4f63-84e9-815b7378c505-internal-tls-certs\") pod \"manila-api-0\" (UID: \"7959191f-6ca5-4f63-84e9-815b7378c505\") " pod="openstack/manila-api-0" Oct 11 02:14:00 crc kubenswrapper[4743]: I1011 02:14:00.322721 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whgfj\" (UniqueName: \"kubernetes.io/projected/7959191f-6ca5-4f63-84e9-815b7378c505-kube-api-access-whgfj\") pod \"manila-api-0\" (UID: \"7959191f-6ca5-4f63-84e9-815b7378c505\") " pod="openstack/manila-api-0" Oct 11 02:14:00 crc kubenswrapper[4743]: I1011 02:14:00.322765 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7959191f-6ca5-4f63-84e9-815b7378c505-etc-machine-id\") pod \"manila-api-0\" (UID: \"7959191f-6ca5-4f63-84e9-815b7378c505\") " pod="openstack/manila-api-0" Oct 11 02:14:00 crc kubenswrapper[4743]: I1011 02:14:00.322914 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7959191f-6ca5-4f63-84e9-815b7378c505-logs\") pod \"manila-api-0\" (UID: \"7959191f-6ca5-4f63-84e9-815b7378c505\") " pod="openstack/manila-api-0" Oct 11 02:14:00 crc kubenswrapper[4743]: I1011 02:14:00.323009 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7959191f-6ca5-4f63-84e9-815b7378c505-public-tls-certs\") pod \"manila-api-0\" (UID: \"7959191f-6ca5-4f63-84e9-815b7378c505\") " pod="openstack/manila-api-0" Oct 11 02:14:00 crc kubenswrapper[4743]: I1011 02:14:00.323066 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7959191f-6ca5-4f63-84e9-815b7378c505-combined-ca-bundle\") pod \"manila-api-0\" 
(UID: \"7959191f-6ca5-4f63-84e9-815b7378c505\") " pod="openstack/manila-api-0" Oct 11 02:14:00 crc kubenswrapper[4743]: I1011 02:14:00.424673 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7959191f-6ca5-4f63-84e9-815b7378c505-logs\") pod \"manila-api-0\" (UID: \"7959191f-6ca5-4f63-84e9-815b7378c505\") " pod="openstack/manila-api-0" Oct 11 02:14:00 crc kubenswrapper[4743]: I1011 02:14:00.424739 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7959191f-6ca5-4f63-84e9-815b7378c505-public-tls-certs\") pod \"manila-api-0\" (UID: \"7959191f-6ca5-4f63-84e9-815b7378c505\") " pod="openstack/manila-api-0" Oct 11 02:14:00 crc kubenswrapper[4743]: I1011 02:14:00.424770 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7959191f-6ca5-4f63-84e9-815b7378c505-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"7959191f-6ca5-4f63-84e9-815b7378c505\") " pod="openstack/manila-api-0" Oct 11 02:14:00 crc kubenswrapper[4743]: I1011 02:14:00.424809 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7959191f-6ca5-4f63-84e9-815b7378c505-config-data-custom\") pod \"manila-api-0\" (UID: \"7959191f-6ca5-4f63-84e9-815b7378c505\") " pod="openstack/manila-api-0" Oct 11 02:14:00 crc kubenswrapper[4743]: I1011 02:14:00.424836 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7959191f-6ca5-4f63-84e9-815b7378c505-scripts\") pod \"manila-api-0\" (UID: \"7959191f-6ca5-4f63-84e9-815b7378c505\") " pod="openstack/manila-api-0" Oct 11 02:14:00 crc kubenswrapper[4743]: I1011 02:14:00.424859 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/7959191f-6ca5-4f63-84e9-815b7378c505-config-data\") pod \"manila-api-0\" (UID: \"7959191f-6ca5-4f63-84e9-815b7378c505\") " pod="openstack/manila-api-0" Oct 11 02:14:00 crc kubenswrapper[4743]: I1011 02:14:00.424935 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7959191f-6ca5-4f63-84e9-815b7378c505-internal-tls-certs\") pod \"manila-api-0\" (UID: \"7959191f-6ca5-4f63-84e9-815b7378c505\") " pod="openstack/manila-api-0" Oct 11 02:14:00 crc kubenswrapper[4743]: I1011 02:14:00.424994 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whgfj\" (UniqueName: \"kubernetes.io/projected/7959191f-6ca5-4f63-84e9-815b7378c505-kube-api-access-whgfj\") pod \"manila-api-0\" (UID: \"7959191f-6ca5-4f63-84e9-815b7378c505\") " pod="openstack/manila-api-0" Oct 11 02:14:00 crc kubenswrapper[4743]: I1011 02:14:00.425018 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7959191f-6ca5-4f63-84e9-815b7378c505-etc-machine-id\") pod \"manila-api-0\" (UID: \"7959191f-6ca5-4f63-84e9-815b7378c505\") " pod="openstack/manila-api-0" Oct 11 02:14:00 crc kubenswrapper[4743]: I1011 02:14:00.425098 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7959191f-6ca5-4f63-84e9-815b7378c505-etc-machine-id\") pod \"manila-api-0\" (UID: \"7959191f-6ca5-4f63-84e9-815b7378c505\") " pod="openstack/manila-api-0" Oct 11 02:14:00 crc kubenswrapper[4743]: I1011 02:14:00.425657 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7959191f-6ca5-4f63-84e9-815b7378c505-logs\") pod \"manila-api-0\" (UID: \"7959191f-6ca5-4f63-84e9-815b7378c505\") " pod="openstack/manila-api-0" Oct 11 02:14:01 crc kubenswrapper[4743]: 
I1011 02:14:01.056349 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7959191f-6ca5-4f63-84e9-815b7378c505-scripts\") pod \"manila-api-0\" (UID: \"7959191f-6ca5-4f63-84e9-815b7378c505\") " pod="openstack/manila-api-0" Oct 11 02:14:01 crc kubenswrapper[4743]: I1011 02:14:01.056530 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7959191f-6ca5-4f63-84e9-815b7378c505-public-tls-certs\") pod \"manila-api-0\" (UID: \"7959191f-6ca5-4f63-84e9-815b7378c505\") " pod="openstack/manila-api-0" Oct 11 02:14:01 crc kubenswrapper[4743]: I1011 02:14:01.056734 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7959191f-6ca5-4f63-84e9-815b7378c505-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"7959191f-6ca5-4f63-84e9-815b7378c505\") " pod="openstack/manila-api-0" Oct 11 02:14:01 crc kubenswrapper[4743]: I1011 02:14:01.057281 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7959191f-6ca5-4f63-84e9-815b7378c505-config-data\") pod \"manila-api-0\" (UID: \"7959191f-6ca5-4f63-84e9-815b7378c505\") " pod="openstack/manila-api-0" Oct 11 02:14:01 crc kubenswrapper[4743]: I1011 02:14:01.058637 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whgfj\" (UniqueName: \"kubernetes.io/projected/7959191f-6ca5-4f63-84e9-815b7378c505-kube-api-access-whgfj\") pod \"manila-api-0\" (UID: \"7959191f-6ca5-4f63-84e9-815b7378c505\") " pod="openstack/manila-api-0" Oct 11 02:14:01 crc kubenswrapper[4743]: I1011 02:14:01.059324 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7959191f-6ca5-4f63-84e9-815b7378c505-internal-tls-certs\") pod \"manila-api-0\" (UID: 
\"7959191f-6ca5-4f63-84e9-815b7378c505\") " pod="openstack/manila-api-0" Oct 11 02:14:01 crc kubenswrapper[4743]: I1011 02:14:01.059949 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7959191f-6ca5-4f63-84e9-815b7378c505-config-data-custom\") pod \"manila-api-0\" (UID: \"7959191f-6ca5-4f63-84e9-815b7378c505\") " pod="openstack/manila-api-0" Oct 11 02:14:01 crc kubenswrapper[4743]: I1011 02:14:01.182596 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Oct 11 02:14:01 crc kubenswrapper[4743]: I1011 02:14:01.826762 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Oct 11 02:14:01 crc kubenswrapper[4743]: I1011 02:14:01.913011 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"7959191f-6ca5-4f63-84e9-815b7378c505","Type":"ContainerStarted","Data":"4eb4cb9ea9e2a1c65769c0adbccdfca619a9bf0ad271d199b6415b9196c698e7"} Oct 11 02:14:01 crc kubenswrapper[4743]: I1011 02:14:01.916024 4743 generic.go:334] "Generic (PLEG): container finished" podID="d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd" containerID="1ae29a0313861a928c7541c1589ba7b739f50bb56d203301166706e6650049d2" exitCode=0 Oct 11 02:14:01 crc kubenswrapper[4743]: I1011 02:14:01.916068 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5bc9759f8-b6qgh" event={"ID":"d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd","Type":"ContainerDied","Data":"1ae29a0313861a928c7541c1589ba7b739f50bb56d203301166706e6650049d2"} Oct 11 02:14:01 crc kubenswrapper[4743]: I1011 02:14:01.943569 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 11 02:14:01 crc kubenswrapper[4743]: I1011 02:14:01.943840 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d54c922a-5193-47c9-9148-7fe897065885" 
containerName="ceilometer-central-agent" containerID="cri-o://b5df9e25f81c30a9e8d87999631a1284a92d5a9f28aa2a96a6785f0d24640fde" gracePeriod=30 Oct 11 02:14:01 crc kubenswrapper[4743]: I1011 02:14:01.944329 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d54c922a-5193-47c9-9148-7fe897065885" containerName="proxy-httpd" containerID="cri-o://974cbe5c248e09ea2b0b16ba354ab112eba57fa4bbda53a9ab34bddc530fc71f" gracePeriod=30 Oct 11 02:14:01 crc kubenswrapper[4743]: I1011 02:14:01.944382 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d54c922a-5193-47c9-9148-7fe897065885" containerName="sg-core" containerID="cri-o://611e40249e0b290add3d2eec42c4b75c490cb1a3909644d026855afd4c4cd9a1" gracePeriod=30 Oct 11 02:14:01 crc kubenswrapper[4743]: I1011 02:14:01.944415 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d54c922a-5193-47c9-9148-7fe897065885" containerName="ceilometer-notification-agent" containerID="cri-o://eaa60174b8128222df5518c4c906a810bde6bf618a4dc1f13b7192b8d1bb7381" gracePeriod=30 Oct 11 02:14:02 crc kubenswrapper[4743]: I1011 02:14:02.112596 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58be64fa-db82-4b4a-9b66-716bba3e27c8" path="/var/lib/kubelet/pods/58be64fa-db82-4b4a-9b66-716bba3e27c8/volumes" Oct 11 02:14:02 crc kubenswrapper[4743]: I1011 02:14:02.454585 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5bc9759f8-b6qgh" podUID="d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.67:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.67:8443: connect: connection refused" Oct 11 02:14:02 crc kubenswrapper[4743]: I1011 02:14:02.932170 4743 generic.go:334] "Generic (PLEG): container finished" podID="d54c922a-5193-47c9-9148-7fe897065885" 
containerID="974cbe5c248e09ea2b0b16ba354ab112eba57fa4bbda53a9ab34bddc530fc71f" exitCode=0 Oct 11 02:14:02 crc kubenswrapper[4743]: I1011 02:14:02.932199 4743 generic.go:334] "Generic (PLEG): container finished" podID="d54c922a-5193-47c9-9148-7fe897065885" containerID="611e40249e0b290add3d2eec42c4b75c490cb1a3909644d026855afd4c4cd9a1" exitCode=2 Oct 11 02:14:02 crc kubenswrapper[4743]: I1011 02:14:02.932219 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d54c922a-5193-47c9-9148-7fe897065885","Type":"ContainerDied","Data":"974cbe5c248e09ea2b0b16ba354ab112eba57fa4bbda53a9ab34bddc530fc71f"} Oct 11 02:14:02 crc kubenswrapper[4743]: I1011 02:14:02.932242 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d54c922a-5193-47c9-9148-7fe897065885","Type":"ContainerDied","Data":"611e40249e0b290add3d2eec42c4b75c490cb1a3909644d026855afd4c4cd9a1"} Oct 11 02:14:03 crc kubenswrapper[4743]: I1011 02:14:03.953733 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"7959191f-6ca5-4f63-84e9-815b7378c505","Type":"ContainerStarted","Data":"36538a55c76d64c4319bbe6c3d82b075601a609ffb34d269bdd2556e3f7b0617"} Oct 11 02:14:03 crc kubenswrapper[4743]: I1011 02:14:03.958653 4743 generic.go:334] "Generic (PLEG): container finished" podID="d54c922a-5193-47c9-9148-7fe897065885" containerID="b5df9e25f81c30a9e8d87999631a1284a92d5a9f28aa2a96a6785f0d24640fde" exitCode=0 Oct 11 02:14:03 crc kubenswrapper[4743]: I1011 02:14:03.958709 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d54c922a-5193-47c9-9148-7fe897065885","Type":"ContainerDied","Data":"b5df9e25f81c30a9e8d87999631a1284a92d5a9f28aa2a96a6785f0d24640fde"} Oct 11 02:14:05 crc kubenswrapper[4743]: I1011 02:14:05.170948 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Oct 11 02:14:05 crc 
kubenswrapper[4743]: I1011 02:14:05.417096 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-c8d8d886c-dsztl" Oct 11 02:14:05 crc kubenswrapper[4743]: I1011 02:14:05.481644 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5cf7b6cbf7-7wrx8"] Oct 11 02:14:05 crc kubenswrapper[4743]: I1011 02:14:05.482126 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5cf7b6cbf7-7wrx8" podUID="69b6d455-26e7-498f-9dd4-4a96b7327f62" containerName="dnsmasq-dns" containerID="cri-o://d00168936a54cc20e1f81b72604436656d7eb0bcfadcddde09b963478e9239e9" gracePeriod=10 Oct 11 02:14:05 crc kubenswrapper[4743]: I1011 02:14:05.987368 4743 generic.go:334] "Generic (PLEG): container finished" podID="69b6d455-26e7-498f-9dd4-4a96b7327f62" containerID="d00168936a54cc20e1f81b72604436656d7eb0bcfadcddde09b963478e9239e9" exitCode=0 Oct 11 02:14:05 crc kubenswrapper[4743]: I1011 02:14:05.987408 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cf7b6cbf7-7wrx8" event={"ID":"69b6d455-26e7-498f-9dd4-4a96b7327f62","Type":"ContainerDied","Data":"d00168936a54cc20e1f81b72604436656d7eb0bcfadcddde09b963478e9239e9"} Oct 11 02:14:06 crc kubenswrapper[4743]: I1011 02:14:06.888511 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5cf7b6cbf7-7wrx8" Oct 11 02:14:06 crc kubenswrapper[4743]: I1011 02:14:06.921152 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/69b6d455-26e7-498f-9dd4-4a96b7327f62-ovsdbserver-nb\") pod \"69b6d455-26e7-498f-9dd4-4a96b7327f62\" (UID: \"69b6d455-26e7-498f-9dd4-4a96b7327f62\") " Oct 11 02:14:06 crc kubenswrapper[4743]: I1011 02:14:06.921378 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69b6d455-26e7-498f-9dd4-4a96b7327f62-dns-svc\") pod \"69b6d455-26e7-498f-9dd4-4a96b7327f62\" (UID: \"69b6d455-26e7-498f-9dd4-4a96b7327f62\") " Oct 11 02:14:06 crc kubenswrapper[4743]: I1011 02:14:06.921410 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gbjhh\" (UniqueName: \"kubernetes.io/projected/69b6d455-26e7-498f-9dd4-4a96b7327f62-kube-api-access-gbjhh\") pod \"69b6d455-26e7-498f-9dd4-4a96b7327f62\" (UID: \"69b6d455-26e7-498f-9dd4-4a96b7327f62\") " Oct 11 02:14:06 crc kubenswrapper[4743]: I1011 02:14:06.921425 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/69b6d455-26e7-498f-9dd4-4a96b7327f62-openstack-edpm-ipam\") pod \"69b6d455-26e7-498f-9dd4-4a96b7327f62\" (UID: \"69b6d455-26e7-498f-9dd4-4a96b7327f62\") " Oct 11 02:14:06 crc kubenswrapper[4743]: I1011 02:14:06.921466 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/69b6d455-26e7-498f-9dd4-4a96b7327f62-dns-swift-storage-0\") pod \"69b6d455-26e7-498f-9dd4-4a96b7327f62\" (UID: \"69b6d455-26e7-498f-9dd4-4a96b7327f62\") " Oct 11 02:14:06 crc kubenswrapper[4743]: I1011 02:14:06.921514 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/69b6d455-26e7-498f-9dd4-4a96b7327f62-ovsdbserver-sb\") pod \"69b6d455-26e7-498f-9dd4-4a96b7327f62\" (UID: \"69b6d455-26e7-498f-9dd4-4a96b7327f62\") " Oct 11 02:14:06 crc kubenswrapper[4743]: I1011 02:14:06.921557 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69b6d455-26e7-498f-9dd4-4a96b7327f62-config\") pod \"69b6d455-26e7-498f-9dd4-4a96b7327f62\" (UID: \"69b6d455-26e7-498f-9dd4-4a96b7327f62\") " Oct 11 02:14:06 crc kubenswrapper[4743]: I1011 02:14:06.940729 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69b6d455-26e7-498f-9dd4-4a96b7327f62-kube-api-access-gbjhh" (OuterVolumeSpecName: "kube-api-access-gbjhh") pod "69b6d455-26e7-498f-9dd4-4a96b7327f62" (UID: "69b6d455-26e7-498f-9dd4-4a96b7327f62"). InnerVolumeSpecName "kube-api-access-gbjhh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 02:14:06 crc kubenswrapper[4743]: I1011 02:14:06.994460 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69b6d455-26e7-498f-9dd4-4a96b7327f62-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "69b6d455-26e7-498f-9dd4-4a96b7327f62" (UID: "69b6d455-26e7-498f-9dd4-4a96b7327f62"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 02:14:06 crc kubenswrapper[4743]: I1011 02:14:06.998821 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69b6d455-26e7-498f-9dd4-4a96b7327f62-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "69b6d455-26e7-498f-9dd4-4a96b7327f62" (UID: "69b6d455-26e7-498f-9dd4-4a96b7327f62"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 02:14:07 crc kubenswrapper[4743]: I1011 02:14:07.009015 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69b6d455-26e7-498f-9dd4-4a96b7327f62-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "69b6d455-26e7-498f-9dd4-4a96b7327f62" (UID: "69b6d455-26e7-498f-9dd4-4a96b7327f62"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 02:14:07 crc kubenswrapper[4743]: I1011 02:14:07.013271 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"7959191f-6ca5-4f63-84e9-815b7378c505","Type":"ContainerStarted","Data":"27aae379d62f5a201c4ff2a9ac2d313cf9de5cbbc8dd07747d83b326b0d8c286"} Oct 11 02:14:07 crc kubenswrapper[4743]: I1011 02:14:07.013786 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Oct 11 02:14:07 crc kubenswrapper[4743]: I1011 02:14:07.013809 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69b6d455-26e7-498f-9dd4-4a96b7327f62-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "69b6d455-26e7-498f-9dd4-4a96b7327f62" (UID: "69b6d455-26e7-498f-9dd4-4a96b7327f62"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 02:14:07 crc kubenswrapper[4743]: I1011 02:14:07.041112 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69b6d455-26e7-498f-9dd4-4a96b7327f62-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 11 02:14:07 crc kubenswrapper[4743]: I1011 02:14:07.041137 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gbjhh\" (UniqueName: \"kubernetes.io/projected/69b6d455-26e7-498f-9dd4-4a96b7327f62-kube-api-access-gbjhh\") on node \"crc\" DevicePath \"\"" Oct 11 02:14:07 crc kubenswrapper[4743]: I1011 02:14:07.041147 4743 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/69b6d455-26e7-498f-9dd4-4a96b7327f62-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 11 02:14:07 crc kubenswrapper[4743]: I1011 02:14:07.041156 4743 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/69b6d455-26e7-498f-9dd4-4a96b7327f62-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 11 02:14:07 crc kubenswrapper[4743]: I1011 02:14:07.041171 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/69b6d455-26e7-498f-9dd4-4a96b7327f62-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 11 02:14:07 crc kubenswrapper[4743]: I1011 02:14:07.064792 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69b6d455-26e7-498f-9dd4-4a96b7327f62-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "69b6d455-26e7-498f-9dd4-4a96b7327f62" (UID: "69b6d455-26e7-498f-9dd4-4a96b7327f62"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 02:14:07 crc kubenswrapper[4743]: I1011 02:14:07.065999 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=7.06598291 podStartE2EDuration="7.06598291s" podCreationTimestamp="2025-10-11 02:14:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 02:14:07.065518708 +0000 UTC m=+4941.718499105" watchObservedRunningTime="2025-10-11 02:14:07.06598291 +0000 UTC m=+4941.718963297" Oct 11 02:14:07 crc kubenswrapper[4743]: I1011 02:14:07.067192 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cf7b6cbf7-7wrx8" event={"ID":"69b6d455-26e7-498f-9dd4-4a96b7327f62","Type":"ContainerDied","Data":"359512a022bcb9e36283cdc14cfa3263bfa9fe3d8d214f86241513a54d3f81b3"} Oct 11 02:14:07 crc kubenswrapper[4743]: I1011 02:14:07.067240 4743 scope.go:117] "RemoveContainer" containerID="d00168936a54cc20e1f81b72604436656d7eb0bcfadcddde09b963478e9239e9" Oct 11 02:14:07 crc kubenswrapper[4743]: I1011 02:14:07.067371 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5cf7b6cbf7-7wrx8" Oct 11 02:14:07 crc kubenswrapper[4743]: I1011 02:14:07.105779 4743 scope.go:117] "RemoveContainer" containerID="49fabfdcfab5ad7e3997023f0c6601548d101517477040c8268fbaf7110775a4" Oct 11 02:14:07 crc kubenswrapper[4743]: I1011 02:14:07.143873 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/69b6d455-26e7-498f-9dd4-4a96b7327f62-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 11 02:14:07 crc kubenswrapper[4743]: I1011 02:14:07.161538 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69b6d455-26e7-498f-9dd4-4a96b7327f62-config" (OuterVolumeSpecName: "config") pod "69b6d455-26e7-498f-9dd4-4a96b7327f62" (UID: "69b6d455-26e7-498f-9dd4-4a96b7327f62"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 02:14:07 crc kubenswrapper[4743]: I1011 02:14:07.246641 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69b6d455-26e7-498f-9dd4-4a96b7327f62-config\") on node \"crc\" DevicePath \"\"" Oct 11 02:14:07 crc kubenswrapper[4743]: I1011 02:14:07.416969 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5cf7b6cbf7-7wrx8"] Oct 11 02:14:07 crc kubenswrapper[4743]: I1011 02:14:07.428620 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5cf7b6cbf7-7wrx8"] Oct 11 02:14:08 crc kubenswrapper[4743]: I1011 02:14:08.077545 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"2d3cbb84-0280-484f-a137-47cfea670423","Type":"ContainerStarted","Data":"61b6e8e98922429a828089232abc6e60bb65b6a56dffe7bfc9b8bfd18634b9d5"} Oct 11 02:14:08 crc kubenswrapper[4743]: I1011 02:14:08.077919 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" 
event={"ID":"2d3cbb84-0280-484f-a137-47cfea670423","Type":"ContainerStarted","Data":"358eb58248df8616733c81077c03922409626e6cb1fc708be47f26baf0fc6e54"} Oct 11 02:14:08 crc kubenswrapper[4743]: I1011 02:14:08.101015 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=4.035023796 podStartE2EDuration="14.100998412s" podCreationTimestamp="2025-10-11 02:13:54 +0000 UTC" firstStartedPulling="2025-10-11 02:13:56.53521387 +0000 UTC m=+4931.188194267" lastFinishedPulling="2025-10-11 02:14:06.601188486 +0000 UTC m=+4941.254168883" observedRunningTime="2025-10-11 02:14:08.094560652 +0000 UTC m=+4942.747541039" watchObservedRunningTime="2025-10-11 02:14:08.100998412 +0000 UTC m=+4942.753978809" Oct 11 02:14:08 crc kubenswrapper[4743]: I1011 02:14:08.103559 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69b6d455-26e7-498f-9dd4-4a96b7327f62" path="/var/lib/kubelet/pods/69b6d455-26e7-498f-9dd4-4a96b7327f62/volumes" Oct 11 02:14:09 crc kubenswrapper[4743]: I1011 02:14:09.888418 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 11 02:14:09 crc kubenswrapper[4743]: I1011 02:14:09.930578 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d54c922a-5193-47c9-9148-7fe897065885-ceilometer-tls-certs\") pod \"d54c922a-5193-47c9-9148-7fe897065885\" (UID: \"d54c922a-5193-47c9-9148-7fe897065885\") " Oct 11 02:14:09 crc kubenswrapper[4743]: I1011 02:14:09.930629 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d54c922a-5193-47c9-9148-7fe897065885-run-httpd\") pod \"d54c922a-5193-47c9-9148-7fe897065885\" (UID: \"d54c922a-5193-47c9-9148-7fe897065885\") " Oct 11 02:14:09 crc kubenswrapper[4743]: I1011 02:14:09.930741 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d54c922a-5193-47c9-9148-7fe897065885-combined-ca-bundle\") pod \"d54c922a-5193-47c9-9148-7fe897065885\" (UID: \"d54c922a-5193-47c9-9148-7fe897065885\") " Oct 11 02:14:09 crc kubenswrapper[4743]: I1011 02:14:09.930812 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d54c922a-5193-47c9-9148-7fe897065885-sg-core-conf-yaml\") pod \"d54c922a-5193-47c9-9148-7fe897065885\" (UID: \"d54c922a-5193-47c9-9148-7fe897065885\") " Oct 11 02:14:09 crc kubenswrapper[4743]: I1011 02:14:09.930893 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d54c922a-5193-47c9-9148-7fe897065885-log-httpd\") pod \"d54c922a-5193-47c9-9148-7fe897065885\" (UID: \"d54c922a-5193-47c9-9148-7fe897065885\") " Oct 11 02:14:09 crc kubenswrapper[4743]: I1011 02:14:09.930941 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/d54c922a-5193-47c9-9148-7fe897065885-scripts\") pod \"d54c922a-5193-47c9-9148-7fe897065885\" (UID: \"d54c922a-5193-47c9-9148-7fe897065885\") " Oct 11 02:14:09 crc kubenswrapper[4743]: I1011 02:14:09.930959 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d54c922a-5193-47c9-9148-7fe897065885-config-data\") pod \"d54c922a-5193-47c9-9148-7fe897065885\" (UID: \"d54c922a-5193-47c9-9148-7fe897065885\") " Oct 11 02:14:09 crc kubenswrapper[4743]: I1011 02:14:09.931010 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8z4p\" (UniqueName: \"kubernetes.io/projected/d54c922a-5193-47c9-9148-7fe897065885-kube-api-access-p8z4p\") pod \"d54c922a-5193-47c9-9148-7fe897065885\" (UID: \"d54c922a-5193-47c9-9148-7fe897065885\") " Oct 11 02:14:09 crc kubenswrapper[4743]: I1011 02:14:09.934413 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d54c922a-5193-47c9-9148-7fe897065885-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d54c922a-5193-47c9-9148-7fe897065885" (UID: "d54c922a-5193-47c9-9148-7fe897065885"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 02:14:09 crc kubenswrapper[4743]: I1011 02:14:09.934528 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d54c922a-5193-47c9-9148-7fe897065885-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d54c922a-5193-47c9-9148-7fe897065885" (UID: "d54c922a-5193-47c9-9148-7fe897065885"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 02:14:09 crc kubenswrapper[4743]: I1011 02:14:09.937829 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d54c922a-5193-47c9-9148-7fe897065885-scripts" (OuterVolumeSpecName: "scripts") pod "d54c922a-5193-47c9-9148-7fe897065885" (UID: "d54c922a-5193-47c9-9148-7fe897065885"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 02:14:09 crc kubenswrapper[4743]: I1011 02:14:09.939079 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d54c922a-5193-47c9-9148-7fe897065885-kube-api-access-p8z4p" (OuterVolumeSpecName: "kube-api-access-p8z4p") pod "d54c922a-5193-47c9-9148-7fe897065885" (UID: "d54c922a-5193-47c9-9148-7fe897065885"). InnerVolumeSpecName "kube-api-access-p8z4p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 02:14:09 crc kubenswrapper[4743]: I1011 02:14:09.975140 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d54c922a-5193-47c9-9148-7fe897065885-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d54c922a-5193-47c9-9148-7fe897065885" (UID: "d54c922a-5193-47c9-9148-7fe897065885"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 02:14:10 crc kubenswrapper[4743]: I1011 02:14:10.000938 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d54c922a-5193-47c9-9148-7fe897065885-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "d54c922a-5193-47c9-9148-7fe897065885" (UID: "d54c922a-5193-47c9-9148-7fe897065885"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 02:14:10 crc kubenswrapper[4743]: I1011 02:14:10.034230 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p8z4p\" (UniqueName: \"kubernetes.io/projected/d54c922a-5193-47c9-9148-7fe897065885-kube-api-access-p8z4p\") on node \"crc\" DevicePath \"\"" Oct 11 02:14:10 crc kubenswrapper[4743]: I1011 02:14:10.034266 4743 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d54c922a-5193-47c9-9148-7fe897065885-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 11 02:14:10 crc kubenswrapper[4743]: I1011 02:14:10.034279 4743 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d54c922a-5193-47c9-9148-7fe897065885-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 11 02:14:10 crc kubenswrapper[4743]: I1011 02:14:10.035175 4743 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d54c922a-5193-47c9-9148-7fe897065885-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 11 02:14:10 crc kubenswrapper[4743]: I1011 02:14:10.035200 4743 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d54c922a-5193-47c9-9148-7fe897065885-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 11 02:14:10 crc kubenswrapper[4743]: I1011 02:14:10.035214 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d54c922a-5193-47c9-9148-7fe897065885-scripts\") on node \"crc\" DevicePath \"\"" Oct 11 02:14:10 crc kubenswrapper[4743]: I1011 02:14:10.061732 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d54c922a-5193-47c9-9148-7fe897065885-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d54c922a-5193-47c9-9148-7fe897065885" (UID: 
"d54c922a-5193-47c9-9148-7fe897065885"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 02:14:10 crc kubenswrapper[4743]: I1011 02:14:10.065657 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d54c922a-5193-47c9-9148-7fe897065885-config-data" (OuterVolumeSpecName: "config-data") pod "d54c922a-5193-47c9-9148-7fe897065885" (UID: "d54c922a-5193-47c9-9148-7fe897065885"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 02:14:10 crc kubenswrapper[4743]: I1011 02:14:10.108901 4743 generic.go:334] "Generic (PLEG): container finished" podID="d54c922a-5193-47c9-9148-7fe897065885" containerID="eaa60174b8128222df5518c4c906a810bde6bf618a4dc1f13b7192b8d1bb7381" exitCode=0 Oct 11 02:14:10 crc kubenswrapper[4743]: I1011 02:14:10.108988 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 11 02:14:10 crc kubenswrapper[4743]: I1011 02:14:10.124719 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d54c922a-5193-47c9-9148-7fe897065885","Type":"ContainerDied","Data":"eaa60174b8128222df5518c4c906a810bde6bf618a4dc1f13b7192b8d1bb7381"} Oct 11 02:14:10 crc kubenswrapper[4743]: I1011 02:14:10.124772 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d54c922a-5193-47c9-9148-7fe897065885","Type":"ContainerDied","Data":"515415f7e3246ac39bf40a24d7a66fa8a689384792eede987f63a33b3c55a763"} Oct 11 02:14:10 crc kubenswrapper[4743]: I1011 02:14:10.124796 4743 scope.go:117] "RemoveContainer" containerID="974cbe5c248e09ea2b0b16ba354ab112eba57fa4bbda53a9ab34bddc530fc71f" Oct 11 02:14:10 crc kubenswrapper[4743]: I1011 02:14:10.137063 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d54c922a-5193-47c9-9148-7fe897065885-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 02:14:10 crc kubenswrapper[4743]: I1011 02:14:10.137107 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d54c922a-5193-47c9-9148-7fe897065885-config-data\") on node \"crc\" DevicePath \"\"" Oct 11 02:14:10 crc kubenswrapper[4743]: I1011 02:14:10.161997 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 11 02:14:10 crc kubenswrapper[4743]: I1011 02:14:10.168671 4743 scope.go:117] "RemoveContainer" containerID="611e40249e0b290add3d2eec42c4b75c490cb1a3909644d026855afd4c4cd9a1" Oct 11 02:14:10 crc kubenswrapper[4743]: I1011 02:14:10.170896 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 11 02:14:10 crc kubenswrapper[4743]: I1011 02:14:10.191286 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 11 02:14:10 crc kubenswrapper[4743]: E1011 02:14:10.192742 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d54c922a-5193-47c9-9148-7fe897065885" containerName="ceilometer-central-agent" Oct 11 02:14:10 crc kubenswrapper[4743]: I1011 02:14:10.192765 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d54c922a-5193-47c9-9148-7fe897065885" containerName="ceilometer-central-agent" Oct 11 02:14:10 crc kubenswrapper[4743]: E1011 02:14:10.192783 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d54c922a-5193-47c9-9148-7fe897065885" containerName="ceilometer-notification-agent" Oct 11 02:14:10 crc kubenswrapper[4743]: I1011 02:14:10.192792 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d54c922a-5193-47c9-9148-7fe897065885" containerName="ceilometer-notification-agent" Oct 11 02:14:10 crc kubenswrapper[4743]: E1011 02:14:10.192814 4743 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="69b6d455-26e7-498f-9dd4-4a96b7327f62" containerName="dnsmasq-dns" Oct 11 02:14:10 crc kubenswrapper[4743]: I1011 02:14:10.192821 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="69b6d455-26e7-498f-9dd4-4a96b7327f62" containerName="dnsmasq-dns" Oct 11 02:14:10 crc kubenswrapper[4743]: E1011 02:14:10.192841 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d54c922a-5193-47c9-9148-7fe897065885" containerName="sg-core" Oct 11 02:14:10 crc kubenswrapper[4743]: I1011 02:14:10.192854 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d54c922a-5193-47c9-9148-7fe897065885" containerName="sg-core" Oct 11 02:14:10 crc kubenswrapper[4743]: E1011 02:14:10.192881 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69b6d455-26e7-498f-9dd4-4a96b7327f62" containerName="init" Oct 11 02:14:10 crc kubenswrapper[4743]: I1011 02:14:10.192889 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="69b6d455-26e7-498f-9dd4-4a96b7327f62" containerName="init" Oct 11 02:14:10 crc kubenswrapper[4743]: E1011 02:14:10.192911 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d54c922a-5193-47c9-9148-7fe897065885" containerName="proxy-httpd" Oct 11 02:14:10 crc kubenswrapper[4743]: I1011 02:14:10.192919 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d54c922a-5193-47c9-9148-7fe897065885" containerName="proxy-httpd" Oct 11 02:14:10 crc kubenswrapper[4743]: I1011 02:14:10.193185 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d54c922a-5193-47c9-9148-7fe897065885" containerName="sg-core" Oct 11 02:14:10 crc kubenswrapper[4743]: I1011 02:14:10.193212 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="69b6d455-26e7-498f-9dd4-4a96b7327f62" containerName="dnsmasq-dns" Oct 11 02:14:10 crc kubenswrapper[4743]: I1011 02:14:10.193227 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d54c922a-5193-47c9-9148-7fe897065885" 
containerName="ceilometer-central-agent" Oct 11 02:14:10 crc kubenswrapper[4743]: I1011 02:14:10.193244 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d54c922a-5193-47c9-9148-7fe897065885" containerName="proxy-httpd" Oct 11 02:14:10 crc kubenswrapper[4743]: I1011 02:14:10.193258 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d54c922a-5193-47c9-9148-7fe897065885" containerName="ceilometer-notification-agent" Oct 11 02:14:10 crc kubenswrapper[4743]: I1011 02:14:10.195813 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 11 02:14:10 crc kubenswrapper[4743]: I1011 02:14:10.204882 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 11 02:14:10 crc kubenswrapper[4743]: I1011 02:14:10.243834 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 11 02:14:10 crc kubenswrapper[4743]: I1011 02:14:10.244052 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 11 02:14:10 crc kubenswrapper[4743]: I1011 02:14:10.244831 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 11 02:14:10 crc kubenswrapper[4743]: I1011 02:14:10.247446 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/203c6ffe-1d21-44d9-8dc5-a97806174cd0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"203c6ffe-1d21-44d9-8dc5-a97806174cd0\") " pod="openstack/ceilometer-0" Oct 11 02:14:10 crc kubenswrapper[4743]: I1011 02:14:10.247497 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/203c6ffe-1d21-44d9-8dc5-a97806174cd0-log-httpd\") pod \"ceilometer-0\" (UID: \"203c6ffe-1d21-44d9-8dc5-a97806174cd0\") " 
pod="openstack/ceilometer-0" Oct 11 02:14:10 crc kubenswrapper[4743]: I1011 02:14:10.247519 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/203c6ffe-1d21-44d9-8dc5-a97806174cd0-config-data\") pod \"ceilometer-0\" (UID: \"203c6ffe-1d21-44d9-8dc5-a97806174cd0\") " pod="openstack/ceilometer-0" Oct 11 02:14:10 crc kubenswrapper[4743]: I1011 02:14:10.247627 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkxwb\" (UniqueName: \"kubernetes.io/projected/203c6ffe-1d21-44d9-8dc5-a97806174cd0-kube-api-access-lkxwb\") pod \"ceilometer-0\" (UID: \"203c6ffe-1d21-44d9-8dc5-a97806174cd0\") " pod="openstack/ceilometer-0" Oct 11 02:14:10 crc kubenswrapper[4743]: I1011 02:14:10.247719 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/203c6ffe-1d21-44d9-8dc5-a97806174cd0-run-httpd\") pod \"ceilometer-0\" (UID: \"203c6ffe-1d21-44d9-8dc5-a97806174cd0\") " pod="openstack/ceilometer-0" Oct 11 02:14:10 crc kubenswrapper[4743]: I1011 02:14:10.247811 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/203c6ffe-1d21-44d9-8dc5-a97806174cd0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"203c6ffe-1d21-44d9-8dc5-a97806174cd0\") " pod="openstack/ceilometer-0" Oct 11 02:14:10 crc kubenswrapper[4743]: I1011 02:14:10.247936 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/203c6ffe-1d21-44d9-8dc5-a97806174cd0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"203c6ffe-1d21-44d9-8dc5-a97806174cd0\") " pod="openstack/ceilometer-0" Oct 11 02:14:10 crc kubenswrapper[4743]: I1011 02:14:10.248075 4743 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/203c6ffe-1d21-44d9-8dc5-a97806174cd0-scripts\") pod \"ceilometer-0\" (UID: \"203c6ffe-1d21-44d9-8dc5-a97806174cd0\") " pod="openstack/ceilometer-0" Oct 11 02:14:10 crc kubenswrapper[4743]: I1011 02:14:10.260590 4743 scope.go:117] "RemoveContainer" containerID="eaa60174b8128222df5518c4c906a810bde6bf618a4dc1f13b7192b8d1bb7381" Oct 11 02:14:10 crc kubenswrapper[4743]: I1011 02:14:10.290206 4743 scope.go:117] "RemoveContainer" containerID="b5df9e25f81c30a9e8d87999631a1284a92d5a9f28aa2a96a6785f0d24640fde" Oct 11 02:14:10 crc kubenswrapper[4743]: I1011 02:14:10.311675 4743 scope.go:117] "RemoveContainer" containerID="974cbe5c248e09ea2b0b16ba354ab112eba57fa4bbda53a9ab34bddc530fc71f" Oct 11 02:14:10 crc kubenswrapper[4743]: E1011 02:14:10.312132 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"974cbe5c248e09ea2b0b16ba354ab112eba57fa4bbda53a9ab34bddc530fc71f\": container with ID starting with 974cbe5c248e09ea2b0b16ba354ab112eba57fa4bbda53a9ab34bddc530fc71f not found: ID does not exist" containerID="974cbe5c248e09ea2b0b16ba354ab112eba57fa4bbda53a9ab34bddc530fc71f" Oct 11 02:14:10 crc kubenswrapper[4743]: I1011 02:14:10.312163 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"974cbe5c248e09ea2b0b16ba354ab112eba57fa4bbda53a9ab34bddc530fc71f"} err="failed to get container status \"974cbe5c248e09ea2b0b16ba354ab112eba57fa4bbda53a9ab34bddc530fc71f\": rpc error: code = NotFound desc = could not find container \"974cbe5c248e09ea2b0b16ba354ab112eba57fa4bbda53a9ab34bddc530fc71f\": container with ID starting with 974cbe5c248e09ea2b0b16ba354ab112eba57fa4bbda53a9ab34bddc530fc71f not found: ID does not exist" Oct 11 02:14:10 crc kubenswrapper[4743]: I1011 02:14:10.312183 4743 scope.go:117] "RemoveContainer" 
containerID="611e40249e0b290add3d2eec42c4b75c490cb1a3909644d026855afd4c4cd9a1" Oct 11 02:14:10 crc kubenswrapper[4743]: E1011 02:14:10.312593 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"611e40249e0b290add3d2eec42c4b75c490cb1a3909644d026855afd4c4cd9a1\": container with ID starting with 611e40249e0b290add3d2eec42c4b75c490cb1a3909644d026855afd4c4cd9a1 not found: ID does not exist" containerID="611e40249e0b290add3d2eec42c4b75c490cb1a3909644d026855afd4c4cd9a1" Oct 11 02:14:10 crc kubenswrapper[4743]: I1011 02:14:10.312619 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"611e40249e0b290add3d2eec42c4b75c490cb1a3909644d026855afd4c4cd9a1"} err="failed to get container status \"611e40249e0b290add3d2eec42c4b75c490cb1a3909644d026855afd4c4cd9a1\": rpc error: code = NotFound desc = could not find container \"611e40249e0b290add3d2eec42c4b75c490cb1a3909644d026855afd4c4cd9a1\": container with ID starting with 611e40249e0b290add3d2eec42c4b75c490cb1a3909644d026855afd4c4cd9a1 not found: ID does not exist" Oct 11 02:14:10 crc kubenswrapper[4743]: I1011 02:14:10.312633 4743 scope.go:117] "RemoveContainer" containerID="eaa60174b8128222df5518c4c906a810bde6bf618a4dc1f13b7192b8d1bb7381" Oct 11 02:14:10 crc kubenswrapper[4743]: E1011 02:14:10.312935 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eaa60174b8128222df5518c4c906a810bde6bf618a4dc1f13b7192b8d1bb7381\": container with ID starting with eaa60174b8128222df5518c4c906a810bde6bf618a4dc1f13b7192b8d1bb7381 not found: ID does not exist" containerID="eaa60174b8128222df5518c4c906a810bde6bf618a4dc1f13b7192b8d1bb7381" Oct 11 02:14:10 crc kubenswrapper[4743]: I1011 02:14:10.312981 4743 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"eaa60174b8128222df5518c4c906a810bde6bf618a4dc1f13b7192b8d1bb7381"} err="failed to get container status \"eaa60174b8128222df5518c4c906a810bde6bf618a4dc1f13b7192b8d1bb7381\": rpc error: code = NotFound desc = could not find container \"eaa60174b8128222df5518c4c906a810bde6bf618a4dc1f13b7192b8d1bb7381\": container with ID starting with eaa60174b8128222df5518c4c906a810bde6bf618a4dc1f13b7192b8d1bb7381 not found: ID does not exist" Oct 11 02:14:10 crc kubenswrapper[4743]: I1011 02:14:10.313009 4743 scope.go:117] "RemoveContainer" containerID="b5df9e25f81c30a9e8d87999631a1284a92d5a9f28aa2a96a6785f0d24640fde" Oct 11 02:14:10 crc kubenswrapper[4743]: E1011 02:14:10.314752 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5df9e25f81c30a9e8d87999631a1284a92d5a9f28aa2a96a6785f0d24640fde\": container with ID starting with b5df9e25f81c30a9e8d87999631a1284a92d5a9f28aa2a96a6785f0d24640fde not found: ID does not exist" containerID="b5df9e25f81c30a9e8d87999631a1284a92d5a9f28aa2a96a6785f0d24640fde" Oct 11 02:14:10 crc kubenswrapper[4743]: I1011 02:14:10.314781 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5df9e25f81c30a9e8d87999631a1284a92d5a9f28aa2a96a6785f0d24640fde"} err="failed to get container status \"b5df9e25f81c30a9e8d87999631a1284a92d5a9f28aa2a96a6785f0d24640fde\": rpc error: code = NotFound desc = could not find container \"b5df9e25f81c30a9e8d87999631a1284a92d5a9f28aa2a96a6785f0d24640fde\": container with ID starting with b5df9e25f81c30a9e8d87999631a1284a92d5a9f28aa2a96a6785f0d24640fde not found: ID does not exist" Oct 11 02:14:10 crc kubenswrapper[4743]: I1011 02:14:10.349356 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/203c6ffe-1d21-44d9-8dc5-a97806174cd0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: 
\"203c6ffe-1d21-44d9-8dc5-a97806174cd0\") " pod="openstack/ceilometer-0" Oct 11 02:14:10 crc kubenswrapper[4743]: I1011 02:14:10.349420 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/203c6ffe-1d21-44d9-8dc5-a97806174cd0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"203c6ffe-1d21-44d9-8dc5-a97806174cd0\") " pod="openstack/ceilometer-0" Oct 11 02:14:10 crc kubenswrapper[4743]: I1011 02:14:10.349483 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/203c6ffe-1d21-44d9-8dc5-a97806174cd0-scripts\") pod \"ceilometer-0\" (UID: \"203c6ffe-1d21-44d9-8dc5-a97806174cd0\") " pod="openstack/ceilometer-0" Oct 11 02:14:10 crc kubenswrapper[4743]: I1011 02:14:10.349506 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/203c6ffe-1d21-44d9-8dc5-a97806174cd0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"203c6ffe-1d21-44d9-8dc5-a97806174cd0\") " pod="openstack/ceilometer-0" Oct 11 02:14:10 crc kubenswrapper[4743]: I1011 02:14:10.349526 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/203c6ffe-1d21-44d9-8dc5-a97806174cd0-log-httpd\") pod \"ceilometer-0\" (UID: \"203c6ffe-1d21-44d9-8dc5-a97806174cd0\") " pod="openstack/ceilometer-0" Oct 11 02:14:10 crc kubenswrapper[4743]: I1011 02:14:10.349547 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/203c6ffe-1d21-44d9-8dc5-a97806174cd0-config-data\") pod \"ceilometer-0\" (UID: \"203c6ffe-1d21-44d9-8dc5-a97806174cd0\") " pod="openstack/ceilometer-0" Oct 11 02:14:10 crc kubenswrapper[4743]: I1011 02:14:10.349584 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkxwb\" 
(UniqueName: \"kubernetes.io/projected/203c6ffe-1d21-44d9-8dc5-a97806174cd0-kube-api-access-lkxwb\") pod \"ceilometer-0\" (UID: \"203c6ffe-1d21-44d9-8dc5-a97806174cd0\") " pod="openstack/ceilometer-0" Oct 11 02:14:10 crc kubenswrapper[4743]: I1011 02:14:10.349628 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/203c6ffe-1d21-44d9-8dc5-a97806174cd0-run-httpd\") pod \"ceilometer-0\" (UID: \"203c6ffe-1d21-44d9-8dc5-a97806174cd0\") " pod="openstack/ceilometer-0" Oct 11 02:14:10 crc kubenswrapper[4743]: I1011 02:14:10.350052 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/203c6ffe-1d21-44d9-8dc5-a97806174cd0-run-httpd\") pod \"ceilometer-0\" (UID: \"203c6ffe-1d21-44d9-8dc5-a97806174cd0\") " pod="openstack/ceilometer-0" Oct 11 02:14:10 crc kubenswrapper[4743]: I1011 02:14:10.351460 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/203c6ffe-1d21-44d9-8dc5-a97806174cd0-log-httpd\") pod \"ceilometer-0\" (UID: \"203c6ffe-1d21-44d9-8dc5-a97806174cd0\") " pod="openstack/ceilometer-0" Oct 11 02:14:10 crc kubenswrapper[4743]: I1011 02:14:10.606807 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 11 02:14:10 crc kubenswrapper[4743]: E1011 02:14:10.607680 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[ceilometer-tls-certs combined-ca-bundle config-data kube-api-access-lkxwb scripts sg-core-conf-yaml], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/ceilometer-0" podUID="203c6ffe-1d21-44d9-8dc5-a97806174cd0" Oct 11 02:14:11 crc kubenswrapper[4743]: I1011 02:14:11.124082 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 11 02:14:11 crc kubenswrapper[4743]: I1011 02:14:11.155963 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/203c6ffe-1d21-44d9-8dc5-a97806174cd0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"203c6ffe-1d21-44d9-8dc5-a97806174cd0\") " pod="openstack/ceilometer-0" Oct 11 02:14:11 crc kubenswrapper[4743]: I1011 02:14:11.155978 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/203c6ffe-1d21-44d9-8dc5-a97806174cd0-scripts\") pod \"ceilometer-0\" (UID: \"203c6ffe-1d21-44d9-8dc5-a97806174cd0\") " pod="openstack/ceilometer-0" Oct 11 02:14:11 crc kubenswrapper[4743]: I1011 02:14:11.156317 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/203c6ffe-1d21-44d9-8dc5-a97806174cd0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"203c6ffe-1d21-44d9-8dc5-a97806174cd0\") " pod="openstack/ceilometer-0" Oct 11 02:14:11 crc kubenswrapper[4743]: I1011 02:14:11.157480 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/203c6ffe-1d21-44d9-8dc5-a97806174cd0-config-data\") pod \"ceilometer-0\" (UID: \"203c6ffe-1d21-44d9-8dc5-a97806174cd0\") " pod="openstack/ceilometer-0" Oct 11 02:14:11 crc kubenswrapper[4743]: I1011 02:14:11.157816 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/203c6ffe-1d21-44d9-8dc5-a97806174cd0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"203c6ffe-1d21-44d9-8dc5-a97806174cd0\") " pod="openstack/ceilometer-0" Oct 11 02:14:11 crc kubenswrapper[4743]: I1011 02:14:11.158218 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkxwb\" (UniqueName: 
\"kubernetes.io/projected/203c6ffe-1d21-44d9-8dc5-a97806174cd0-kube-api-access-lkxwb\") pod \"ceilometer-0\" (UID: \"203c6ffe-1d21-44d9-8dc5-a97806174cd0\") " pod="openstack/ceilometer-0" Oct 11 02:14:11 crc kubenswrapper[4743]: I1011 02:14:11.401894 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 11 02:14:11 crc kubenswrapper[4743]: I1011 02:14:11.576773 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/203c6ffe-1d21-44d9-8dc5-a97806174cd0-combined-ca-bundle\") pod \"203c6ffe-1d21-44d9-8dc5-a97806174cd0\" (UID: \"203c6ffe-1d21-44d9-8dc5-a97806174cd0\") " Oct 11 02:14:11 crc kubenswrapper[4743]: I1011 02:14:11.576841 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lkxwb\" (UniqueName: \"kubernetes.io/projected/203c6ffe-1d21-44d9-8dc5-a97806174cd0-kube-api-access-lkxwb\") pod \"203c6ffe-1d21-44d9-8dc5-a97806174cd0\" (UID: \"203c6ffe-1d21-44d9-8dc5-a97806174cd0\") " Oct 11 02:14:11 crc kubenswrapper[4743]: I1011 02:14:11.576915 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/203c6ffe-1d21-44d9-8dc5-a97806174cd0-ceilometer-tls-certs\") pod \"203c6ffe-1d21-44d9-8dc5-a97806174cd0\" (UID: \"203c6ffe-1d21-44d9-8dc5-a97806174cd0\") " Oct 11 02:14:11 crc kubenswrapper[4743]: I1011 02:14:11.577009 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/203c6ffe-1d21-44d9-8dc5-a97806174cd0-scripts\") pod \"203c6ffe-1d21-44d9-8dc5-a97806174cd0\" (UID: \"203c6ffe-1d21-44d9-8dc5-a97806174cd0\") " Oct 11 02:14:11 crc kubenswrapper[4743]: I1011 02:14:11.577057 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/203c6ffe-1d21-44d9-8dc5-a97806174cd0-sg-core-conf-yaml\") pod \"203c6ffe-1d21-44d9-8dc5-a97806174cd0\" (UID: \"203c6ffe-1d21-44d9-8dc5-a97806174cd0\") " Oct 11 02:14:11 crc kubenswrapper[4743]: I1011 02:14:11.577091 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/203c6ffe-1d21-44d9-8dc5-a97806174cd0-log-httpd\") pod \"203c6ffe-1d21-44d9-8dc5-a97806174cd0\" (UID: \"203c6ffe-1d21-44d9-8dc5-a97806174cd0\") " Oct 11 02:14:11 crc kubenswrapper[4743]: I1011 02:14:11.577201 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/203c6ffe-1d21-44d9-8dc5-a97806174cd0-config-data\") pod \"203c6ffe-1d21-44d9-8dc5-a97806174cd0\" (UID: \"203c6ffe-1d21-44d9-8dc5-a97806174cd0\") " Oct 11 02:14:11 crc kubenswrapper[4743]: I1011 02:14:11.577244 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/203c6ffe-1d21-44d9-8dc5-a97806174cd0-run-httpd\") pod \"203c6ffe-1d21-44d9-8dc5-a97806174cd0\" (UID: \"203c6ffe-1d21-44d9-8dc5-a97806174cd0\") " Oct 11 02:14:11 crc kubenswrapper[4743]: I1011 02:14:11.577824 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/203c6ffe-1d21-44d9-8dc5-a97806174cd0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "203c6ffe-1d21-44d9-8dc5-a97806174cd0" (UID: "203c6ffe-1d21-44d9-8dc5-a97806174cd0"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 02:14:11 crc kubenswrapper[4743]: I1011 02:14:11.578124 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/203c6ffe-1d21-44d9-8dc5-a97806174cd0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "203c6ffe-1d21-44d9-8dc5-a97806174cd0" (UID: "203c6ffe-1d21-44d9-8dc5-a97806174cd0"). 
InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 02:14:11 crc kubenswrapper[4743]: I1011 02:14:11.578823 4743 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/203c6ffe-1d21-44d9-8dc5-a97806174cd0-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 11 02:14:11 crc kubenswrapper[4743]: I1011 02:14:11.578884 4743 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/203c6ffe-1d21-44d9-8dc5-a97806174cd0-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 11 02:14:11 crc kubenswrapper[4743]: I1011 02:14:11.582933 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/203c6ffe-1d21-44d9-8dc5-a97806174cd0-scripts" (OuterVolumeSpecName: "scripts") pod "203c6ffe-1d21-44d9-8dc5-a97806174cd0" (UID: "203c6ffe-1d21-44d9-8dc5-a97806174cd0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 02:14:11 crc kubenswrapper[4743]: I1011 02:14:11.583149 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/203c6ffe-1d21-44d9-8dc5-a97806174cd0-kube-api-access-lkxwb" (OuterVolumeSpecName: "kube-api-access-lkxwb") pod "203c6ffe-1d21-44d9-8dc5-a97806174cd0" (UID: "203c6ffe-1d21-44d9-8dc5-a97806174cd0"). InnerVolumeSpecName "kube-api-access-lkxwb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 02:14:11 crc kubenswrapper[4743]: I1011 02:14:11.584927 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/203c6ffe-1d21-44d9-8dc5-a97806174cd0-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "203c6ffe-1d21-44d9-8dc5-a97806174cd0" (UID: "203c6ffe-1d21-44d9-8dc5-a97806174cd0"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 02:14:11 crc kubenswrapper[4743]: I1011 02:14:11.585770 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/203c6ffe-1d21-44d9-8dc5-a97806174cd0-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "203c6ffe-1d21-44d9-8dc5-a97806174cd0" (UID: "203c6ffe-1d21-44d9-8dc5-a97806174cd0"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 02:14:11 crc kubenswrapper[4743]: I1011 02:14:11.586470 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/203c6ffe-1d21-44d9-8dc5-a97806174cd0-config-data" (OuterVolumeSpecName: "config-data") pod "203c6ffe-1d21-44d9-8dc5-a97806174cd0" (UID: "203c6ffe-1d21-44d9-8dc5-a97806174cd0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 02:14:11 crc kubenswrapper[4743]: I1011 02:14:11.586537 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/203c6ffe-1d21-44d9-8dc5-a97806174cd0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "203c6ffe-1d21-44d9-8dc5-a97806174cd0" (UID: "203c6ffe-1d21-44d9-8dc5-a97806174cd0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 02:14:11 crc kubenswrapper[4743]: I1011 02:14:11.679831 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/203c6ffe-1d21-44d9-8dc5-a97806174cd0-scripts\") on node \"crc\" DevicePath \"\"" Oct 11 02:14:11 crc kubenswrapper[4743]: I1011 02:14:11.679884 4743 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/203c6ffe-1d21-44d9-8dc5-a97806174cd0-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 11 02:14:11 crc kubenswrapper[4743]: I1011 02:14:11.679897 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/203c6ffe-1d21-44d9-8dc5-a97806174cd0-config-data\") on node \"crc\" DevicePath \"\"" Oct 11 02:14:11 crc kubenswrapper[4743]: I1011 02:14:11.679910 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/203c6ffe-1d21-44d9-8dc5-a97806174cd0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 02:14:11 crc kubenswrapper[4743]: I1011 02:14:11.679924 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lkxwb\" (UniqueName: \"kubernetes.io/projected/203c6ffe-1d21-44d9-8dc5-a97806174cd0-kube-api-access-lkxwb\") on node \"crc\" DevicePath \"\"" Oct 11 02:14:11 crc kubenswrapper[4743]: I1011 02:14:11.679936 4743 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/203c6ffe-1d21-44d9-8dc5-a97806174cd0-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 11 02:14:12 crc kubenswrapper[4743]: I1011 02:14:12.110574 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d54c922a-5193-47c9-9148-7fe897065885" path="/var/lib/kubelet/pods/d54c922a-5193-47c9-9148-7fe897065885/volumes" Oct 11 02:14:12 crc kubenswrapper[4743]: I1011 02:14:12.134180 4743 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 11 02:14:12 crc kubenswrapper[4743]: I1011 02:14:12.191507 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 11 02:14:12 crc kubenswrapper[4743]: I1011 02:14:12.241116 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 11 02:14:12 crc kubenswrapper[4743]: I1011 02:14:12.263613 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 11 02:14:12 crc kubenswrapper[4743]: I1011 02:14:12.266569 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 11 02:14:12 crc kubenswrapper[4743]: I1011 02:14:12.269326 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 11 02:14:12 crc kubenswrapper[4743]: I1011 02:14:12.269688 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 11 02:14:12 crc kubenswrapper[4743]: I1011 02:14:12.269954 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 11 02:14:12 crc kubenswrapper[4743]: I1011 02:14:12.276502 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 11 02:14:12 crc kubenswrapper[4743]: I1011 02:14:12.397229 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkrqb\" (UniqueName: \"kubernetes.io/projected/0a98090b-ea2d-4b45-98ca-cdb8d619e42d-kube-api-access-pkrqb\") pod \"ceilometer-0\" (UID: \"0a98090b-ea2d-4b45-98ca-cdb8d619e42d\") " pod="openstack/ceilometer-0" Oct 11 02:14:12 crc kubenswrapper[4743]: I1011 02:14:12.397297 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/0a98090b-ea2d-4b45-98ca-cdb8d619e42d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0a98090b-ea2d-4b45-98ca-cdb8d619e42d\") " pod="openstack/ceilometer-0" Oct 11 02:14:12 crc kubenswrapper[4743]: I1011 02:14:12.397389 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a98090b-ea2d-4b45-98ca-cdb8d619e42d-log-httpd\") pod \"ceilometer-0\" (UID: \"0a98090b-ea2d-4b45-98ca-cdb8d619e42d\") " pod="openstack/ceilometer-0" Oct 11 02:14:12 crc kubenswrapper[4743]: I1011 02:14:12.397429 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a98090b-ea2d-4b45-98ca-cdb8d619e42d-run-httpd\") pod \"ceilometer-0\" (UID: \"0a98090b-ea2d-4b45-98ca-cdb8d619e42d\") " pod="openstack/ceilometer-0" Oct 11 02:14:12 crc kubenswrapper[4743]: I1011 02:14:12.397485 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a98090b-ea2d-4b45-98ca-cdb8d619e42d-config-data\") pod \"ceilometer-0\" (UID: \"0a98090b-ea2d-4b45-98ca-cdb8d619e42d\") " pod="openstack/ceilometer-0" Oct 11 02:14:12 crc kubenswrapper[4743]: I1011 02:14:12.397537 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a98090b-ea2d-4b45-98ca-cdb8d619e42d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0a98090b-ea2d-4b45-98ca-cdb8d619e42d\") " pod="openstack/ceilometer-0" Oct 11 02:14:12 crc kubenswrapper[4743]: I1011 02:14:12.397558 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a98090b-ea2d-4b45-98ca-cdb8d619e42d-scripts\") pod \"ceilometer-0\" (UID: \"0a98090b-ea2d-4b45-98ca-cdb8d619e42d\") " 
pod="openstack/ceilometer-0" Oct 11 02:14:12 crc kubenswrapper[4743]: I1011 02:14:12.397575 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a98090b-ea2d-4b45-98ca-cdb8d619e42d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0a98090b-ea2d-4b45-98ca-cdb8d619e42d\") " pod="openstack/ceilometer-0" Oct 11 02:14:12 crc kubenswrapper[4743]: I1011 02:14:12.454132 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5bc9759f8-b6qgh" podUID="d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.67:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.67:8443: connect: connection refused" Oct 11 02:14:12 crc kubenswrapper[4743]: I1011 02:14:12.498981 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkrqb\" (UniqueName: \"kubernetes.io/projected/0a98090b-ea2d-4b45-98ca-cdb8d619e42d-kube-api-access-pkrqb\") pod \"ceilometer-0\" (UID: \"0a98090b-ea2d-4b45-98ca-cdb8d619e42d\") " pod="openstack/ceilometer-0" Oct 11 02:14:12 crc kubenswrapper[4743]: I1011 02:14:12.499025 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0a98090b-ea2d-4b45-98ca-cdb8d619e42d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0a98090b-ea2d-4b45-98ca-cdb8d619e42d\") " pod="openstack/ceilometer-0" Oct 11 02:14:12 crc kubenswrapper[4743]: I1011 02:14:12.499103 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a98090b-ea2d-4b45-98ca-cdb8d619e42d-log-httpd\") pod \"ceilometer-0\" (UID: \"0a98090b-ea2d-4b45-98ca-cdb8d619e42d\") " pod="openstack/ceilometer-0" Oct 11 02:14:12 crc kubenswrapper[4743]: I1011 02:14:12.499148 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a98090b-ea2d-4b45-98ca-cdb8d619e42d-run-httpd\") pod \"ceilometer-0\" (UID: \"0a98090b-ea2d-4b45-98ca-cdb8d619e42d\") " pod="openstack/ceilometer-0" Oct 11 02:14:12 crc kubenswrapper[4743]: I1011 02:14:12.499192 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a98090b-ea2d-4b45-98ca-cdb8d619e42d-config-data\") pod \"ceilometer-0\" (UID: \"0a98090b-ea2d-4b45-98ca-cdb8d619e42d\") " pod="openstack/ceilometer-0" Oct 11 02:14:12 crc kubenswrapper[4743]: I1011 02:14:12.499244 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a98090b-ea2d-4b45-98ca-cdb8d619e42d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0a98090b-ea2d-4b45-98ca-cdb8d619e42d\") " pod="openstack/ceilometer-0" Oct 11 02:14:12 crc kubenswrapper[4743]: I1011 02:14:12.499269 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a98090b-ea2d-4b45-98ca-cdb8d619e42d-scripts\") pod \"ceilometer-0\" (UID: \"0a98090b-ea2d-4b45-98ca-cdb8d619e42d\") " pod="openstack/ceilometer-0" Oct 11 02:14:12 crc kubenswrapper[4743]: I1011 02:14:12.499290 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a98090b-ea2d-4b45-98ca-cdb8d619e42d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0a98090b-ea2d-4b45-98ca-cdb8d619e42d\") " pod="openstack/ceilometer-0" Oct 11 02:14:12 crc kubenswrapper[4743]: I1011 02:14:12.500512 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a98090b-ea2d-4b45-98ca-cdb8d619e42d-run-httpd\") pod \"ceilometer-0\" (UID: \"0a98090b-ea2d-4b45-98ca-cdb8d619e42d\") " 
pod="openstack/ceilometer-0" Oct 11 02:14:12 crc kubenswrapper[4743]: I1011 02:14:12.500525 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a98090b-ea2d-4b45-98ca-cdb8d619e42d-log-httpd\") pod \"ceilometer-0\" (UID: \"0a98090b-ea2d-4b45-98ca-cdb8d619e42d\") " pod="openstack/ceilometer-0" Oct 11 02:14:12 crc kubenswrapper[4743]: I1011 02:14:12.504291 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a98090b-ea2d-4b45-98ca-cdb8d619e42d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0a98090b-ea2d-4b45-98ca-cdb8d619e42d\") " pod="openstack/ceilometer-0" Oct 11 02:14:12 crc kubenswrapper[4743]: I1011 02:14:12.504562 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a98090b-ea2d-4b45-98ca-cdb8d619e42d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0a98090b-ea2d-4b45-98ca-cdb8d619e42d\") " pod="openstack/ceilometer-0" Oct 11 02:14:12 crc kubenswrapper[4743]: I1011 02:14:12.505456 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0a98090b-ea2d-4b45-98ca-cdb8d619e42d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0a98090b-ea2d-4b45-98ca-cdb8d619e42d\") " pod="openstack/ceilometer-0" Oct 11 02:14:12 crc kubenswrapper[4743]: I1011 02:14:12.505599 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a98090b-ea2d-4b45-98ca-cdb8d619e42d-config-data\") pod \"ceilometer-0\" (UID: \"0a98090b-ea2d-4b45-98ca-cdb8d619e42d\") " pod="openstack/ceilometer-0" Oct 11 02:14:12 crc kubenswrapper[4743]: I1011 02:14:12.507343 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/0a98090b-ea2d-4b45-98ca-cdb8d619e42d-scripts\") pod \"ceilometer-0\" (UID: \"0a98090b-ea2d-4b45-98ca-cdb8d619e42d\") " pod="openstack/ceilometer-0" Oct 11 02:14:12 crc kubenswrapper[4743]: I1011 02:14:12.516681 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkrqb\" (UniqueName: \"kubernetes.io/projected/0a98090b-ea2d-4b45-98ca-cdb8d619e42d-kube-api-access-pkrqb\") pod \"ceilometer-0\" (UID: \"0a98090b-ea2d-4b45-98ca-cdb8d619e42d\") " pod="openstack/ceilometer-0" Oct 11 02:14:12 crc kubenswrapper[4743]: I1011 02:14:12.588994 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 11 02:14:13 crc kubenswrapper[4743]: I1011 02:14:13.105770 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 11 02:14:14 crc kubenswrapper[4743]: I1011 02:14:14.106289 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="203c6ffe-1d21-44d9-8dc5-a97806174cd0" path="/var/lib/kubelet/pods/203c6ffe-1d21-44d9-8dc5-a97806174cd0/volumes" Oct 11 02:14:14 crc kubenswrapper[4743]: I1011 02:14:14.153823 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a98090b-ea2d-4b45-98ca-cdb8d619e42d","Type":"ContainerStarted","Data":"7c013e0503f1de7cec0b08d454dcd8fd415cdd988bfb09a4d5bcb8da62f42091"} Oct 11 02:14:14 crc kubenswrapper[4743]: I1011 02:14:14.153892 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a98090b-ea2d-4b45-98ca-cdb8d619e42d","Type":"ContainerStarted","Data":"9d0d3ec130773f54f62690c3a901228aefb19426d98a22981247df0c73b9d312"} Oct 11 02:14:14 crc kubenswrapper[4743]: I1011 02:14:14.457939 4743 patch_prober.go:28] interesting pod/machine-config-daemon-cvm72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 02:14:14 crc kubenswrapper[4743]: I1011 02:14:14.457983 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 02:14:15 crc kubenswrapper[4743]: I1011 02:14:15.166655 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a98090b-ea2d-4b45-98ca-cdb8d619e42d","Type":"ContainerStarted","Data":"a3b071eb6e74545d46ff09add42110daa6ced657df48f6b6cbb42d81d0cdd00d"} Oct 11 02:14:15 crc kubenswrapper[4743]: I1011 02:14:15.288836 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Oct 11 02:14:16 crc kubenswrapper[4743]: I1011 02:14:16.179381 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a98090b-ea2d-4b45-98ca-cdb8d619e42d","Type":"ContainerStarted","Data":"c9eb77286ad32ecb3b142dfd515d474720906335351b0b9c147ea86e34282b8d"} Oct 11 02:14:16 crc kubenswrapper[4743]: I1011 02:14:16.801786 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Oct 11 02:14:16 crc kubenswrapper[4743]: I1011 02:14:16.870359 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-scheduler-0"] Oct 11 02:14:17 crc kubenswrapper[4743]: I1011 02:14:17.189397 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-scheduler-0" podUID="e1a8e9c6-a225-4017-8752-c980f44618c5" containerName="manila-scheduler" containerID="cri-o://eccefbca5db579ad67191df996781478f4dff0f0d65c0d4218c5288d73943c07" gracePeriod=30 Oct 11 02:14:17 crc kubenswrapper[4743]: I1011 02:14:17.189540 4743 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-scheduler-0" podUID="e1a8e9c6-a225-4017-8752-c980f44618c5" containerName="probe" containerID="cri-o://a76fabdfdc6e7cd76ae905a7c8f4ea711990506bc58c9dc7cb4fbe954418498a" gracePeriod=30 Oct 11 02:14:18 crc kubenswrapper[4743]: I1011 02:14:18.200188 4743 generic.go:334] "Generic (PLEG): container finished" podID="e1a8e9c6-a225-4017-8752-c980f44618c5" containerID="a76fabdfdc6e7cd76ae905a7c8f4ea711990506bc58c9dc7cb4fbe954418498a" exitCode=0 Oct 11 02:14:18 crc kubenswrapper[4743]: I1011 02:14:18.200267 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"e1a8e9c6-a225-4017-8752-c980f44618c5","Type":"ContainerDied","Data":"a76fabdfdc6e7cd76ae905a7c8f4ea711990506bc58c9dc7cb4fbe954418498a"} Oct 11 02:14:18 crc kubenswrapper[4743]: I1011 02:14:18.202939 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a98090b-ea2d-4b45-98ca-cdb8d619e42d","Type":"ContainerStarted","Data":"4011afa8773b040e0e3f5926a9c7304a39708b1aef6f5eff9a2849ae504e0f53"} Oct 11 02:14:18 crc kubenswrapper[4743]: I1011 02:14:18.203053 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 11 02:14:18 crc kubenswrapper[4743]: I1011 02:14:18.244998 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.130954182 podStartE2EDuration="6.244977942s" podCreationTimestamp="2025-10-11 02:14:12 +0000 UTC" firstStartedPulling="2025-10-11 02:14:13.166415738 +0000 UTC m=+4947.819396135" lastFinishedPulling="2025-10-11 02:14:17.280439498 +0000 UTC m=+4951.933419895" observedRunningTime="2025-10-11 02:14:18.234937133 +0000 UTC m=+4952.887917530" watchObservedRunningTime="2025-10-11 02:14:18.244977942 +0000 UTC m=+4952.897958339" Oct 11 02:14:18 crc kubenswrapper[4743]: I1011 02:14:18.660095 4743 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Oct 11 02:14:18 crc kubenswrapper[4743]: I1011 02:14:18.853555 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrpsd\" (UniqueName: \"kubernetes.io/projected/e1a8e9c6-a225-4017-8752-c980f44618c5-kube-api-access-wrpsd\") pod \"e1a8e9c6-a225-4017-8752-c980f44618c5\" (UID: \"e1a8e9c6-a225-4017-8752-c980f44618c5\") " Oct 11 02:14:18 crc kubenswrapper[4743]: I1011 02:14:18.853671 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1a8e9c6-a225-4017-8752-c980f44618c5-scripts\") pod \"e1a8e9c6-a225-4017-8752-c980f44618c5\" (UID: \"e1a8e9c6-a225-4017-8752-c980f44618c5\") " Oct 11 02:14:18 crc kubenswrapper[4743]: I1011 02:14:18.853754 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e1a8e9c6-a225-4017-8752-c980f44618c5-etc-machine-id\") pod \"e1a8e9c6-a225-4017-8752-c980f44618c5\" (UID: \"e1a8e9c6-a225-4017-8752-c980f44618c5\") " Oct 11 02:14:18 crc kubenswrapper[4743]: I1011 02:14:18.853832 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e1a8e9c6-a225-4017-8752-c980f44618c5-config-data-custom\") pod \"e1a8e9c6-a225-4017-8752-c980f44618c5\" (UID: \"e1a8e9c6-a225-4017-8752-c980f44618c5\") " Oct 11 02:14:18 crc kubenswrapper[4743]: I1011 02:14:18.853888 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1a8e9c6-a225-4017-8752-c980f44618c5-config-data\") pod \"e1a8e9c6-a225-4017-8752-c980f44618c5\" (UID: \"e1a8e9c6-a225-4017-8752-c980f44618c5\") " Oct 11 02:14:18 crc kubenswrapper[4743]: I1011 02:14:18.853958 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1a8e9c6-a225-4017-8752-c980f44618c5-combined-ca-bundle\") pod \"e1a8e9c6-a225-4017-8752-c980f44618c5\" (UID: \"e1a8e9c6-a225-4017-8752-c980f44618c5\") " Oct 11 02:14:18 crc kubenswrapper[4743]: I1011 02:14:18.854136 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e1a8e9c6-a225-4017-8752-c980f44618c5-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "e1a8e9c6-a225-4017-8752-c980f44618c5" (UID: "e1a8e9c6-a225-4017-8752-c980f44618c5"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 02:14:18 crc kubenswrapper[4743]: I1011 02:14:18.854630 4743 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e1a8e9c6-a225-4017-8752-c980f44618c5-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 11 02:14:18 crc kubenswrapper[4743]: I1011 02:14:18.860669 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1a8e9c6-a225-4017-8752-c980f44618c5-scripts" (OuterVolumeSpecName: "scripts") pod "e1a8e9c6-a225-4017-8752-c980f44618c5" (UID: "e1a8e9c6-a225-4017-8752-c980f44618c5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 02:14:18 crc kubenswrapper[4743]: I1011 02:14:18.861112 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1a8e9c6-a225-4017-8752-c980f44618c5-kube-api-access-wrpsd" (OuterVolumeSpecName: "kube-api-access-wrpsd") pod "e1a8e9c6-a225-4017-8752-c980f44618c5" (UID: "e1a8e9c6-a225-4017-8752-c980f44618c5"). InnerVolumeSpecName "kube-api-access-wrpsd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 02:14:18 crc kubenswrapper[4743]: I1011 02:14:18.861145 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1a8e9c6-a225-4017-8752-c980f44618c5-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e1a8e9c6-a225-4017-8752-c980f44618c5" (UID: "e1a8e9c6-a225-4017-8752-c980f44618c5"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 02:14:18 crc kubenswrapper[4743]: I1011 02:14:18.922354 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1a8e9c6-a225-4017-8752-c980f44618c5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e1a8e9c6-a225-4017-8752-c980f44618c5" (UID: "e1a8e9c6-a225-4017-8752-c980f44618c5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 02:14:18 crc kubenswrapper[4743]: I1011 02:14:18.956414 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1a8e9c6-a225-4017-8752-c980f44618c5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 02:14:18 crc kubenswrapper[4743]: I1011 02:14:18.956443 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrpsd\" (UniqueName: \"kubernetes.io/projected/e1a8e9c6-a225-4017-8752-c980f44618c5-kube-api-access-wrpsd\") on node \"crc\" DevicePath \"\"" Oct 11 02:14:18 crc kubenswrapper[4743]: I1011 02:14:18.956454 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1a8e9c6-a225-4017-8752-c980f44618c5-scripts\") on node \"crc\" DevicePath \"\"" Oct 11 02:14:18 crc kubenswrapper[4743]: I1011 02:14:18.956462 4743 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e1a8e9c6-a225-4017-8752-c980f44618c5-config-data-custom\") 
on node \"crc\" DevicePath \"\"" Oct 11 02:14:18 crc kubenswrapper[4743]: I1011 02:14:18.968528 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1a8e9c6-a225-4017-8752-c980f44618c5-config-data" (OuterVolumeSpecName: "config-data") pod "e1a8e9c6-a225-4017-8752-c980f44618c5" (UID: "e1a8e9c6-a225-4017-8752-c980f44618c5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 02:14:19 crc kubenswrapper[4743]: I1011 02:14:19.058621 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1a8e9c6-a225-4017-8752-c980f44618c5-config-data\") on node \"crc\" DevicePath \"\"" Oct 11 02:14:19 crc kubenswrapper[4743]: I1011 02:14:19.217557 4743 generic.go:334] "Generic (PLEG): container finished" podID="e1a8e9c6-a225-4017-8752-c980f44618c5" containerID="eccefbca5db579ad67191df996781478f4dff0f0d65c0d4218c5288d73943c07" exitCode=0 Oct 11 02:14:19 crc kubenswrapper[4743]: I1011 02:14:19.218005 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"e1a8e9c6-a225-4017-8752-c980f44618c5","Type":"ContainerDied","Data":"eccefbca5db579ad67191df996781478f4dff0f0d65c0d4218c5288d73943c07"} Oct 11 02:14:19 crc kubenswrapper[4743]: I1011 02:14:19.218054 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"e1a8e9c6-a225-4017-8752-c980f44618c5","Type":"ContainerDied","Data":"b63757734f2b0d1fd774e0248013ab3bc0d6593761fc3f21f58e43c8c58da731"} Oct 11 02:14:19 crc kubenswrapper[4743]: I1011 02:14:19.218072 4743 scope.go:117] "RemoveContainer" containerID="a76fabdfdc6e7cd76ae905a7c8f4ea711990506bc58c9dc7cb4fbe954418498a" Oct 11 02:14:19 crc kubenswrapper[4743]: I1011 02:14:19.218119 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Oct 11 02:14:19 crc kubenswrapper[4743]: I1011 02:14:19.263136 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-scheduler-0"] Oct 11 02:14:19 crc kubenswrapper[4743]: I1011 02:14:19.269288 4743 scope.go:117] "RemoveContainer" containerID="eccefbca5db579ad67191df996781478f4dff0f0d65c0d4218c5288d73943c07" Oct 11 02:14:19 crc kubenswrapper[4743]: I1011 02:14:19.284950 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-scheduler-0"] Oct 11 02:14:19 crc kubenswrapper[4743]: I1011 02:14:19.297891 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Oct 11 02:14:19 crc kubenswrapper[4743]: E1011 02:14:19.298561 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1a8e9c6-a225-4017-8752-c980f44618c5" containerName="probe" Oct 11 02:14:19 crc kubenswrapper[4743]: I1011 02:14:19.298579 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1a8e9c6-a225-4017-8752-c980f44618c5" containerName="probe" Oct 11 02:14:19 crc kubenswrapper[4743]: E1011 02:14:19.298637 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1a8e9c6-a225-4017-8752-c980f44618c5" containerName="manila-scheduler" Oct 11 02:14:19 crc kubenswrapper[4743]: I1011 02:14:19.298646 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1a8e9c6-a225-4017-8752-c980f44618c5" containerName="manila-scheduler" Oct 11 02:14:19 crc kubenswrapper[4743]: I1011 02:14:19.298998 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1a8e9c6-a225-4017-8752-c980f44618c5" containerName="probe" Oct 11 02:14:19 crc kubenswrapper[4743]: I1011 02:14:19.299048 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1a8e9c6-a225-4017-8752-c980f44618c5" containerName="manila-scheduler" Oct 11 02:14:19 crc kubenswrapper[4743]: I1011 02:14:19.300512 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Oct 11 02:14:19 crc kubenswrapper[4743]: I1011 02:14:19.304477 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Oct 11 02:14:19 crc kubenswrapper[4743]: I1011 02:14:19.309701 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Oct 11 02:14:19 crc kubenswrapper[4743]: I1011 02:14:19.317120 4743 scope.go:117] "RemoveContainer" containerID="a76fabdfdc6e7cd76ae905a7c8f4ea711990506bc58c9dc7cb4fbe954418498a" Oct 11 02:14:19 crc kubenswrapper[4743]: E1011 02:14:19.317671 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a76fabdfdc6e7cd76ae905a7c8f4ea711990506bc58c9dc7cb4fbe954418498a\": container with ID starting with a76fabdfdc6e7cd76ae905a7c8f4ea711990506bc58c9dc7cb4fbe954418498a not found: ID does not exist" containerID="a76fabdfdc6e7cd76ae905a7c8f4ea711990506bc58c9dc7cb4fbe954418498a" Oct 11 02:14:19 crc kubenswrapper[4743]: I1011 02:14:19.317708 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a76fabdfdc6e7cd76ae905a7c8f4ea711990506bc58c9dc7cb4fbe954418498a"} err="failed to get container status \"a76fabdfdc6e7cd76ae905a7c8f4ea711990506bc58c9dc7cb4fbe954418498a\": rpc error: code = NotFound desc = could not find container \"a76fabdfdc6e7cd76ae905a7c8f4ea711990506bc58c9dc7cb4fbe954418498a\": container with ID starting with a76fabdfdc6e7cd76ae905a7c8f4ea711990506bc58c9dc7cb4fbe954418498a not found: ID does not exist" Oct 11 02:14:19 crc kubenswrapper[4743]: I1011 02:14:19.317736 4743 scope.go:117] "RemoveContainer" containerID="eccefbca5db579ad67191df996781478f4dff0f0d65c0d4218c5288d73943c07" Oct 11 02:14:19 crc kubenswrapper[4743]: E1011 02:14:19.318159 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"eccefbca5db579ad67191df996781478f4dff0f0d65c0d4218c5288d73943c07\": container with ID starting with eccefbca5db579ad67191df996781478f4dff0f0d65c0d4218c5288d73943c07 not found: ID does not exist" containerID="eccefbca5db579ad67191df996781478f4dff0f0d65c0d4218c5288d73943c07" Oct 11 02:14:19 crc kubenswrapper[4743]: I1011 02:14:19.318239 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eccefbca5db579ad67191df996781478f4dff0f0d65c0d4218c5288d73943c07"} err="failed to get container status \"eccefbca5db579ad67191df996781478f4dff0f0d65c0d4218c5288d73943c07\": rpc error: code = NotFound desc = could not find container \"eccefbca5db579ad67191df996781478f4dff0f0d65c0d4218c5288d73943c07\": container with ID starting with eccefbca5db579ad67191df996781478f4dff0f0d65c0d4218c5288d73943c07 not found: ID does not exist" Oct 11 02:14:19 crc kubenswrapper[4743]: I1011 02:14:19.468955 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/00706963-dfb0-45d7-a0be-5875e1ae0a8f-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"00706963-dfb0-45d7-a0be-5875e1ae0a8f\") " pod="openstack/manila-scheduler-0" Oct 11 02:14:19 crc kubenswrapper[4743]: I1011 02:14:19.469151 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00706963-dfb0-45d7-a0be-5875e1ae0a8f-config-data\") pod \"manila-scheduler-0\" (UID: \"00706963-dfb0-45d7-a0be-5875e1ae0a8f\") " pod="openstack/manila-scheduler-0" Oct 11 02:14:19 crc kubenswrapper[4743]: I1011 02:14:19.469211 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00706963-dfb0-45d7-a0be-5875e1ae0a8f-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"00706963-dfb0-45d7-a0be-5875e1ae0a8f\") " 
pod="openstack/manila-scheduler-0" Oct 11 02:14:19 crc kubenswrapper[4743]: I1011 02:14:19.469265 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00706963-dfb0-45d7-a0be-5875e1ae0a8f-scripts\") pod \"manila-scheduler-0\" (UID: \"00706963-dfb0-45d7-a0be-5875e1ae0a8f\") " pod="openstack/manila-scheduler-0" Oct 11 02:14:19 crc kubenswrapper[4743]: I1011 02:14:19.469331 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fskr\" (UniqueName: \"kubernetes.io/projected/00706963-dfb0-45d7-a0be-5875e1ae0a8f-kube-api-access-5fskr\") pod \"manila-scheduler-0\" (UID: \"00706963-dfb0-45d7-a0be-5875e1ae0a8f\") " pod="openstack/manila-scheduler-0" Oct 11 02:14:19 crc kubenswrapper[4743]: I1011 02:14:19.469389 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/00706963-dfb0-45d7-a0be-5875e1ae0a8f-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"00706963-dfb0-45d7-a0be-5875e1ae0a8f\") " pod="openstack/manila-scheduler-0" Oct 11 02:14:19 crc kubenswrapper[4743]: I1011 02:14:19.574295 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00706963-dfb0-45d7-a0be-5875e1ae0a8f-config-data\") pod \"manila-scheduler-0\" (UID: \"00706963-dfb0-45d7-a0be-5875e1ae0a8f\") " pod="openstack/manila-scheduler-0" Oct 11 02:14:19 crc kubenswrapper[4743]: I1011 02:14:19.574353 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00706963-dfb0-45d7-a0be-5875e1ae0a8f-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"00706963-dfb0-45d7-a0be-5875e1ae0a8f\") " pod="openstack/manila-scheduler-0" Oct 11 02:14:19 crc kubenswrapper[4743]: I1011 02:14:19.574380 
4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00706963-dfb0-45d7-a0be-5875e1ae0a8f-scripts\") pod \"manila-scheduler-0\" (UID: \"00706963-dfb0-45d7-a0be-5875e1ae0a8f\") " pod="openstack/manila-scheduler-0" Oct 11 02:14:19 crc kubenswrapper[4743]: I1011 02:14:19.574406 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fskr\" (UniqueName: \"kubernetes.io/projected/00706963-dfb0-45d7-a0be-5875e1ae0a8f-kube-api-access-5fskr\") pod \"manila-scheduler-0\" (UID: \"00706963-dfb0-45d7-a0be-5875e1ae0a8f\") " pod="openstack/manila-scheduler-0" Oct 11 02:14:19 crc kubenswrapper[4743]: I1011 02:14:19.574448 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/00706963-dfb0-45d7-a0be-5875e1ae0a8f-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"00706963-dfb0-45d7-a0be-5875e1ae0a8f\") " pod="openstack/manila-scheduler-0" Oct 11 02:14:19 crc kubenswrapper[4743]: I1011 02:14:19.574671 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/00706963-dfb0-45d7-a0be-5875e1ae0a8f-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"00706963-dfb0-45d7-a0be-5875e1ae0a8f\") " pod="openstack/manila-scheduler-0" Oct 11 02:14:19 crc kubenswrapper[4743]: I1011 02:14:19.574827 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/00706963-dfb0-45d7-a0be-5875e1ae0a8f-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"00706963-dfb0-45d7-a0be-5875e1ae0a8f\") " pod="openstack/manila-scheduler-0" Oct 11 02:14:19 crc kubenswrapper[4743]: I1011 02:14:19.579272 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/00706963-dfb0-45d7-a0be-5875e1ae0a8f-scripts\") pod \"manila-scheduler-0\" (UID: \"00706963-dfb0-45d7-a0be-5875e1ae0a8f\") " pod="openstack/manila-scheduler-0" Oct 11 02:14:19 crc kubenswrapper[4743]: I1011 02:14:19.579656 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/00706963-dfb0-45d7-a0be-5875e1ae0a8f-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"00706963-dfb0-45d7-a0be-5875e1ae0a8f\") " pod="openstack/manila-scheduler-0" Oct 11 02:14:19 crc kubenswrapper[4743]: I1011 02:14:19.580850 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00706963-dfb0-45d7-a0be-5875e1ae0a8f-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"00706963-dfb0-45d7-a0be-5875e1ae0a8f\") " pod="openstack/manila-scheduler-0" Oct 11 02:14:19 crc kubenswrapper[4743]: I1011 02:14:19.582737 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00706963-dfb0-45d7-a0be-5875e1ae0a8f-config-data\") pod \"manila-scheduler-0\" (UID: \"00706963-dfb0-45d7-a0be-5875e1ae0a8f\") " pod="openstack/manila-scheduler-0" Oct 11 02:14:19 crc kubenswrapper[4743]: I1011 02:14:19.590322 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fskr\" (UniqueName: \"kubernetes.io/projected/00706963-dfb0-45d7-a0be-5875e1ae0a8f-kube-api-access-5fskr\") pod \"manila-scheduler-0\" (UID: \"00706963-dfb0-45d7-a0be-5875e1ae0a8f\") " pod="openstack/manila-scheduler-0" Oct 11 02:14:19 crc kubenswrapper[4743]: I1011 02:14:19.656143 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Oct 11 02:14:20 crc kubenswrapper[4743]: I1011 02:14:20.115081 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1a8e9c6-a225-4017-8752-c980f44618c5" path="/var/lib/kubelet/pods/e1a8e9c6-a225-4017-8752-c980f44618c5/volumes" Oct 11 02:14:20 crc kubenswrapper[4743]: I1011 02:14:20.157960 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Oct 11 02:14:20 crc kubenswrapper[4743]: W1011 02:14:20.159811 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod00706963_dfb0_45d7_a0be_5875e1ae0a8f.slice/crio-c0b2b892044f2738df03cd50b57f246100cbb3912ddf6eecbd1c3489a3024535 WatchSource:0}: Error finding container c0b2b892044f2738df03cd50b57f246100cbb3912ddf6eecbd1c3489a3024535: Status 404 returned error can't find the container with id c0b2b892044f2738df03cd50b57f246100cbb3912ddf6eecbd1c3489a3024535 Oct 11 02:14:20 crc kubenswrapper[4743]: I1011 02:14:20.231316 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"00706963-dfb0-45d7-a0be-5875e1ae0a8f","Type":"ContainerStarted","Data":"c0b2b892044f2738df03cd50b57f246100cbb3912ddf6eecbd1c3489a3024535"} Oct 11 02:14:21 crc kubenswrapper[4743]: I1011 02:14:21.245102 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"00706963-dfb0-45d7-a0be-5875e1ae0a8f","Type":"ContainerStarted","Data":"01900982915e00d356ee00349a44903c37b3198721d345278bf6bd938a77a1a5"} Oct 11 02:14:21 crc kubenswrapper[4743]: I1011 02:14:21.245579 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"00706963-dfb0-45d7-a0be-5875e1ae0a8f","Type":"ContainerStarted","Data":"a021de02cee885a47f72099382a8cbfb2a371c6030d29045ad2f525f06d1162a"} Oct 11 02:14:21 crc kubenswrapper[4743]: I1011 02:14:21.268502 4743 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=2.2684812340000002 podStartE2EDuration="2.268481234s" podCreationTimestamp="2025-10-11 02:14:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 02:14:21.262129797 +0000 UTC m=+4955.915110224" watchObservedRunningTime="2025-10-11 02:14:21.268481234 +0000 UTC m=+4955.921461651" Oct 11 02:14:22 crc kubenswrapper[4743]: I1011 02:14:22.462517 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5bc9759f8-b6qgh" podUID="d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.67:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.67:8443: connect: connection refused" Oct 11 02:14:22 crc kubenswrapper[4743]: I1011 02:14:22.463035 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5bc9759f8-b6qgh" Oct 11 02:14:22 crc kubenswrapper[4743]: I1011 02:14:22.539208 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/manila-api-0" Oct 11 02:14:27 crc kubenswrapper[4743]: I1011 02:14:27.050098 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Oct 11 02:14:27 crc kubenswrapper[4743]: I1011 02:14:27.122240 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-share-share1-0"] Oct 11 02:14:27 crc kubenswrapper[4743]: I1011 02:14:27.306490 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-share-share1-0" podUID="2d3cbb84-0280-484f-a137-47cfea670423" containerName="manila-share" containerID="cri-o://358eb58248df8616733c81077c03922409626e6cb1fc708be47f26baf0fc6e54" gracePeriod=30 Oct 11 02:14:27 crc kubenswrapper[4743]: I1011 02:14:27.306543 4743 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-share-share1-0" podUID="2d3cbb84-0280-484f-a137-47cfea670423" containerName="probe" containerID="cri-o://61b6e8e98922429a828089232abc6e60bb65b6a56dffe7bfc9b8bfd18634b9d5" gracePeriod=30 Oct 11 02:14:28 crc kubenswrapper[4743]: I1011 02:14:28.243843 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Oct 11 02:14:28 crc kubenswrapper[4743]: I1011 02:14:28.337298 4743 generic.go:334] "Generic (PLEG): container finished" podID="2d3cbb84-0280-484f-a137-47cfea670423" containerID="61b6e8e98922429a828089232abc6e60bb65b6a56dffe7bfc9b8bfd18634b9d5" exitCode=0 Oct 11 02:14:28 crc kubenswrapper[4743]: I1011 02:14:28.337327 4743 generic.go:334] "Generic (PLEG): container finished" podID="2d3cbb84-0280-484f-a137-47cfea670423" containerID="358eb58248df8616733c81077c03922409626e6cb1fc708be47f26baf0fc6e54" exitCode=1 Oct 11 02:14:28 crc kubenswrapper[4743]: I1011 02:14:28.337347 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"2d3cbb84-0280-484f-a137-47cfea670423","Type":"ContainerDied","Data":"61b6e8e98922429a828089232abc6e60bb65b6a56dffe7bfc9b8bfd18634b9d5"} Oct 11 02:14:28 crc kubenswrapper[4743]: I1011 02:14:28.337372 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"2d3cbb84-0280-484f-a137-47cfea670423","Type":"ContainerDied","Data":"358eb58248df8616733c81077c03922409626e6cb1fc708be47f26baf0fc6e54"} Oct 11 02:14:28 crc kubenswrapper[4743]: I1011 02:14:28.337383 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"2d3cbb84-0280-484f-a137-47cfea670423","Type":"ContainerDied","Data":"5c0da06da740f7ffe876ffc857cb702d17fd1cf87bf98d3dfe6b1003cc1b2c21"} Oct 11 02:14:28 crc kubenswrapper[4743]: I1011 02:14:28.337383 4743 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openstack/manila-share-share1-0" Oct 11 02:14:28 crc kubenswrapper[4743]: I1011 02:14:28.337398 4743 scope.go:117] "RemoveContainer" containerID="61b6e8e98922429a828089232abc6e60bb65b6a56dffe7bfc9b8bfd18634b9d5" Oct 11 02:14:28 crc kubenswrapper[4743]: I1011 02:14:28.363209 4743 scope.go:117] "RemoveContainer" containerID="358eb58248df8616733c81077c03922409626e6cb1fc708be47f26baf0fc6e54" Oct 11 02:14:28 crc kubenswrapper[4743]: I1011 02:14:28.384943 4743 scope.go:117] "RemoveContainer" containerID="61b6e8e98922429a828089232abc6e60bb65b6a56dffe7bfc9b8bfd18634b9d5" Oct 11 02:14:28 crc kubenswrapper[4743]: E1011 02:14:28.385330 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61b6e8e98922429a828089232abc6e60bb65b6a56dffe7bfc9b8bfd18634b9d5\": container with ID starting with 61b6e8e98922429a828089232abc6e60bb65b6a56dffe7bfc9b8bfd18634b9d5 not found: ID does not exist" containerID="61b6e8e98922429a828089232abc6e60bb65b6a56dffe7bfc9b8bfd18634b9d5" Oct 11 02:14:28 crc kubenswrapper[4743]: I1011 02:14:28.385357 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61b6e8e98922429a828089232abc6e60bb65b6a56dffe7bfc9b8bfd18634b9d5"} err="failed to get container status \"61b6e8e98922429a828089232abc6e60bb65b6a56dffe7bfc9b8bfd18634b9d5\": rpc error: code = NotFound desc = could not find container \"61b6e8e98922429a828089232abc6e60bb65b6a56dffe7bfc9b8bfd18634b9d5\": container with ID starting with 61b6e8e98922429a828089232abc6e60bb65b6a56dffe7bfc9b8bfd18634b9d5 not found: ID does not exist" Oct 11 02:14:28 crc kubenswrapper[4743]: I1011 02:14:28.385379 4743 scope.go:117] "RemoveContainer" containerID="358eb58248df8616733c81077c03922409626e6cb1fc708be47f26baf0fc6e54" Oct 11 02:14:28 crc kubenswrapper[4743]: E1011 02:14:28.385620 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"358eb58248df8616733c81077c03922409626e6cb1fc708be47f26baf0fc6e54\": container with ID starting with 358eb58248df8616733c81077c03922409626e6cb1fc708be47f26baf0fc6e54 not found: ID does not exist" containerID="358eb58248df8616733c81077c03922409626e6cb1fc708be47f26baf0fc6e54" Oct 11 02:14:28 crc kubenswrapper[4743]: I1011 02:14:28.385645 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"358eb58248df8616733c81077c03922409626e6cb1fc708be47f26baf0fc6e54"} err="failed to get container status \"358eb58248df8616733c81077c03922409626e6cb1fc708be47f26baf0fc6e54\": rpc error: code = NotFound desc = could not find container \"358eb58248df8616733c81077c03922409626e6cb1fc708be47f26baf0fc6e54\": container with ID starting with 358eb58248df8616733c81077c03922409626e6cb1fc708be47f26baf0fc6e54 not found: ID does not exist" Oct 11 02:14:28 crc kubenswrapper[4743]: I1011 02:14:28.385660 4743 scope.go:117] "RemoveContainer" containerID="61b6e8e98922429a828089232abc6e60bb65b6a56dffe7bfc9b8bfd18634b9d5" Oct 11 02:14:28 crc kubenswrapper[4743]: I1011 02:14:28.385832 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61b6e8e98922429a828089232abc6e60bb65b6a56dffe7bfc9b8bfd18634b9d5"} err="failed to get container status \"61b6e8e98922429a828089232abc6e60bb65b6a56dffe7bfc9b8bfd18634b9d5\": rpc error: code = NotFound desc = could not find container \"61b6e8e98922429a828089232abc6e60bb65b6a56dffe7bfc9b8bfd18634b9d5\": container with ID starting with 61b6e8e98922429a828089232abc6e60bb65b6a56dffe7bfc9b8bfd18634b9d5 not found: ID does not exist" Oct 11 02:14:28 crc kubenswrapper[4743]: I1011 02:14:28.385847 4743 scope.go:117] "RemoveContainer" containerID="358eb58248df8616733c81077c03922409626e6cb1fc708be47f26baf0fc6e54" Oct 11 02:14:28 crc kubenswrapper[4743]: I1011 02:14:28.386035 4743 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"358eb58248df8616733c81077c03922409626e6cb1fc708be47f26baf0fc6e54"} err="failed to get container status \"358eb58248df8616733c81077c03922409626e6cb1fc708be47f26baf0fc6e54\": rpc error: code = NotFound desc = could not find container \"358eb58248df8616733c81077c03922409626e6cb1fc708be47f26baf0fc6e54\": container with ID starting with 358eb58248df8616733c81077c03922409626e6cb1fc708be47f26baf0fc6e54 not found: ID does not exist" Oct 11 02:14:28 crc kubenswrapper[4743]: I1011 02:14:28.386277 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2d3cbb84-0280-484f-a137-47cfea670423-etc-machine-id\") pod \"2d3cbb84-0280-484f-a137-47cfea670423\" (UID: \"2d3cbb84-0280-484f-a137-47cfea670423\") " Oct 11 02:14:28 crc kubenswrapper[4743]: I1011 02:14:28.386377 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d3cbb84-0280-484f-a137-47cfea670423-config-data\") pod \"2d3cbb84-0280-484f-a137-47cfea670423\" (UID: \"2d3cbb84-0280-484f-a137-47cfea670423\") " Oct 11 02:14:28 crc kubenswrapper[4743]: I1011 02:14:28.386452 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2d3cbb84-0280-484f-a137-47cfea670423-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "2d3cbb84-0280-484f-a137-47cfea670423" (UID: "2d3cbb84-0280-484f-a137-47cfea670423"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 02:14:28 crc kubenswrapper[4743]: I1011 02:14:28.386576 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2d3cbb84-0280-484f-a137-47cfea670423-config-data-custom\") pod \"2d3cbb84-0280-484f-a137-47cfea670423\" (UID: \"2d3cbb84-0280-484f-a137-47cfea670423\") " Oct 11 02:14:28 crc kubenswrapper[4743]: I1011 02:14:28.386635 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2d3cbb84-0280-484f-a137-47cfea670423-ceph\") pod \"2d3cbb84-0280-484f-a137-47cfea670423\" (UID: \"2d3cbb84-0280-484f-a137-47cfea670423\") " Oct 11 02:14:28 crc kubenswrapper[4743]: I1011 02:14:28.386653 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d3cbb84-0280-484f-a137-47cfea670423-scripts\") pod \"2d3cbb84-0280-484f-a137-47cfea670423\" (UID: \"2d3cbb84-0280-484f-a137-47cfea670423\") " Oct 11 02:14:28 crc kubenswrapper[4743]: I1011 02:14:28.386692 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/2d3cbb84-0280-484f-a137-47cfea670423-var-lib-manila\") pod \"2d3cbb84-0280-484f-a137-47cfea670423\" (UID: \"2d3cbb84-0280-484f-a137-47cfea670423\") " Oct 11 02:14:28 crc kubenswrapper[4743]: I1011 02:14:28.386724 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvxwj\" (UniqueName: \"kubernetes.io/projected/2d3cbb84-0280-484f-a137-47cfea670423-kube-api-access-fvxwj\") pod \"2d3cbb84-0280-484f-a137-47cfea670423\" (UID: \"2d3cbb84-0280-484f-a137-47cfea670423\") " Oct 11 02:14:28 crc kubenswrapper[4743]: I1011 02:14:28.386779 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2d3cbb84-0280-484f-a137-47cfea670423-combined-ca-bundle\") pod \"2d3cbb84-0280-484f-a137-47cfea670423\" (UID: \"2d3cbb84-0280-484f-a137-47cfea670423\") " Oct 11 02:14:28 crc kubenswrapper[4743]: I1011 02:14:28.387052 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2d3cbb84-0280-484f-a137-47cfea670423-var-lib-manila" (OuterVolumeSpecName: "var-lib-manila") pod "2d3cbb84-0280-484f-a137-47cfea670423" (UID: "2d3cbb84-0280-484f-a137-47cfea670423"). InnerVolumeSpecName "var-lib-manila". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 02:14:28 crc kubenswrapper[4743]: I1011 02:14:28.388144 4743 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2d3cbb84-0280-484f-a137-47cfea670423-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 11 02:14:28 crc kubenswrapper[4743]: I1011 02:14:28.388165 4743 reconciler_common.go:293] "Volume detached for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/2d3cbb84-0280-484f-a137-47cfea670423-var-lib-manila\") on node \"crc\" DevicePath \"\"" Oct 11 02:14:28 crc kubenswrapper[4743]: I1011 02:14:28.394419 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d3cbb84-0280-484f-a137-47cfea670423-ceph" (OuterVolumeSpecName: "ceph") pod "2d3cbb84-0280-484f-a137-47cfea670423" (UID: "2d3cbb84-0280-484f-a137-47cfea670423"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 02:14:28 crc kubenswrapper[4743]: I1011 02:14:28.395090 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d3cbb84-0280-484f-a137-47cfea670423-kube-api-access-fvxwj" (OuterVolumeSpecName: "kube-api-access-fvxwj") pod "2d3cbb84-0280-484f-a137-47cfea670423" (UID: "2d3cbb84-0280-484f-a137-47cfea670423"). InnerVolumeSpecName "kube-api-access-fvxwj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 02:14:28 crc kubenswrapper[4743]: I1011 02:14:28.395170 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d3cbb84-0280-484f-a137-47cfea670423-scripts" (OuterVolumeSpecName: "scripts") pod "2d3cbb84-0280-484f-a137-47cfea670423" (UID: "2d3cbb84-0280-484f-a137-47cfea670423"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 02:14:28 crc kubenswrapper[4743]: I1011 02:14:28.410047 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d3cbb84-0280-484f-a137-47cfea670423-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2d3cbb84-0280-484f-a137-47cfea670423" (UID: "2d3cbb84-0280-484f-a137-47cfea670423"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 02:14:28 crc kubenswrapper[4743]: I1011 02:14:28.475955 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d3cbb84-0280-484f-a137-47cfea670423-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2d3cbb84-0280-484f-a137-47cfea670423" (UID: "2d3cbb84-0280-484f-a137-47cfea670423"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 02:14:28 crc kubenswrapper[4743]: I1011 02:14:28.495067 4743 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2d3cbb84-0280-484f-a137-47cfea670423-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 11 02:14:28 crc kubenswrapper[4743]: I1011 02:14:28.495099 4743 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2d3cbb84-0280-484f-a137-47cfea670423-ceph\") on node \"crc\" DevicePath \"\"" Oct 11 02:14:28 crc kubenswrapper[4743]: I1011 02:14:28.495112 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d3cbb84-0280-484f-a137-47cfea670423-scripts\") on node \"crc\" DevicePath \"\"" Oct 11 02:14:28 crc kubenswrapper[4743]: I1011 02:14:28.495121 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fvxwj\" (UniqueName: \"kubernetes.io/projected/2d3cbb84-0280-484f-a137-47cfea670423-kube-api-access-fvxwj\") on node \"crc\" DevicePath \"\"" Oct 11 02:14:28 crc kubenswrapper[4743]: I1011 02:14:28.495132 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d3cbb84-0280-484f-a137-47cfea670423-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 02:14:28 crc kubenswrapper[4743]: I1011 02:14:28.547086 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d3cbb84-0280-484f-a137-47cfea670423-config-data" (OuterVolumeSpecName: "config-data") pod "2d3cbb84-0280-484f-a137-47cfea670423" (UID: "2d3cbb84-0280-484f-a137-47cfea670423"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 02:14:28 crc kubenswrapper[4743]: I1011 02:14:28.597553 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d3cbb84-0280-484f-a137-47cfea670423-config-data\") on node \"crc\" DevicePath \"\"" Oct 11 02:14:28 crc kubenswrapper[4743]: I1011 02:14:28.680301 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-share-share1-0"] Oct 11 02:14:28 crc kubenswrapper[4743]: I1011 02:14:28.734101 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-share-share1-0"] Oct 11 02:14:28 crc kubenswrapper[4743]: I1011 02:14:28.752102 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Oct 11 02:14:28 crc kubenswrapper[4743]: E1011 02:14:28.753623 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d3cbb84-0280-484f-a137-47cfea670423" containerName="manila-share" Oct 11 02:14:28 crc kubenswrapper[4743]: I1011 02:14:28.753645 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d3cbb84-0280-484f-a137-47cfea670423" containerName="manila-share" Oct 11 02:14:28 crc kubenswrapper[4743]: E1011 02:14:28.753704 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d3cbb84-0280-484f-a137-47cfea670423" containerName="probe" Oct 11 02:14:28 crc kubenswrapper[4743]: I1011 02:14:28.753712 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d3cbb84-0280-484f-a137-47cfea670423" containerName="probe" Oct 11 02:14:28 crc kubenswrapper[4743]: I1011 02:14:28.754181 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d3cbb84-0280-484f-a137-47cfea670423" containerName="manila-share" Oct 11 02:14:28 crc kubenswrapper[4743]: I1011 02:14:28.754207 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d3cbb84-0280-484f-a137-47cfea670423" containerName="probe" Oct 11 02:14:28 crc kubenswrapper[4743]: I1011 
02:14:28.755964 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Oct 11 02:14:28 crc kubenswrapper[4743]: I1011 02:14:28.759163 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Oct 11 02:14:28 crc kubenswrapper[4743]: I1011 02:14:28.767826 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Oct 11 02:14:28 crc kubenswrapper[4743]: I1011 02:14:28.903462 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/0c713aa6-9d10-4baa-855c-a05256d83be7-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"0c713aa6-9d10-4baa-855c-a05256d83be7\") " pod="openstack/manila-share-share1-0" Oct 11 02:14:28 crc kubenswrapper[4743]: I1011 02:14:28.903695 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0c713aa6-9d10-4baa-855c-a05256d83be7-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"0c713aa6-9d10-4baa-855c-a05256d83be7\") " pod="openstack/manila-share-share1-0" Oct 11 02:14:28 crc kubenswrapper[4743]: I1011 02:14:28.903729 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c713aa6-9d10-4baa-855c-a05256d83be7-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"0c713aa6-9d10-4baa-855c-a05256d83be7\") " pod="openstack/manila-share-share1-0" Oct 11 02:14:28 crc kubenswrapper[4743]: I1011 02:14:28.903809 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-487rk\" (UniqueName: \"kubernetes.io/projected/0c713aa6-9d10-4baa-855c-a05256d83be7-kube-api-access-487rk\") pod \"manila-share-share1-0\" (UID: 
\"0c713aa6-9d10-4baa-855c-a05256d83be7\") " pod="openstack/manila-share-share1-0" Oct 11 02:14:28 crc kubenswrapper[4743]: I1011 02:14:28.903950 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/0c713aa6-9d10-4baa-855c-a05256d83be7-ceph\") pod \"manila-share-share1-0\" (UID: \"0c713aa6-9d10-4baa-855c-a05256d83be7\") " pod="openstack/manila-share-share1-0" Oct 11 02:14:28 crc kubenswrapper[4743]: I1011 02:14:28.904075 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c713aa6-9d10-4baa-855c-a05256d83be7-scripts\") pod \"manila-share-share1-0\" (UID: \"0c713aa6-9d10-4baa-855c-a05256d83be7\") " pod="openstack/manila-share-share1-0" Oct 11 02:14:28 crc kubenswrapper[4743]: I1011 02:14:28.904295 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0c713aa6-9d10-4baa-855c-a05256d83be7-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"0c713aa6-9d10-4baa-855c-a05256d83be7\") " pod="openstack/manila-share-share1-0" Oct 11 02:14:28 crc kubenswrapper[4743]: I1011 02:14:28.904376 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c713aa6-9d10-4baa-855c-a05256d83be7-config-data\") pod \"manila-share-share1-0\" (UID: \"0c713aa6-9d10-4baa-855c-a05256d83be7\") " pod="openstack/manila-share-share1-0" Oct 11 02:14:28 crc kubenswrapper[4743]: I1011 02:14:28.959567 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5bc9759f8-b6qgh" Oct 11 02:14:29 crc kubenswrapper[4743]: I1011 02:14:29.006100 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd-horizon-secret-key\") pod \"d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd\" (UID: \"d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd\") " Oct 11 02:14:29 crc kubenswrapper[4743]: I1011 02:14:29.006180 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd-combined-ca-bundle\") pod \"d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd\" (UID: \"d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd\") " Oct 11 02:14:29 crc kubenswrapper[4743]: I1011 02:14:29.006247 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd-logs\") pod \"d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd\" (UID: \"d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd\") " Oct 11 02:14:29 crc kubenswrapper[4743]: I1011 02:14:29.006272 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd-config-data\") pod \"d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd\" (UID: \"d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd\") " Oct 11 02:14:29 crc kubenswrapper[4743]: I1011 02:14:29.006448 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxw7h\" (UniqueName: \"kubernetes.io/projected/d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd-kube-api-access-wxw7h\") pod \"d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd\" (UID: \"d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd\") " Oct 11 02:14:29 crc kubenswrapper[4743]: I1011 02:14:29.006481 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd-horizon-tls-certs\") pod \"d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd\" (UID: \"d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd\") " Oct 11 02:14:29 crc kubenswrapper[4743]: I1011 02:14:29.006522 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd-scripts\") pod \"d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd\" (UID: \"d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd\") " Oct 11 02:14:29 crc kubenswrapper[4743]: I1011 02:14:29.006986 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0c713aa6-9d10-4baa-855c-a05256d83be7-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"0c713aa6-9d10-4baa-855c-a05256d83be7\") " pod="openstack/manila-share-share1-0" Oct 11 02:14:29 crc kubenswrapper[4743]: I1011 02:14:29.007012 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c713aa6-9d10-4baa-855c-a05256d83be7-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"0c713aa6-9d10-4baa-855c-a05256d83be7\") " pod="openstack/manila-share-share1-0" Oct 11 02:14:29 crc kubenswrapper[4743]: I1011 02:14:29.007047 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-487rk\" (UniqueName: \"kubernetes.io/projected/0c713aa6-9d10-4baa-855c-a05256d83be7-kube-api-access-487rk\") pod \"manila-share-share1-0\" (UID: \"0c713aa6-9d10-4baa-855c-a05256d83be7\") " pod="openstack/manila-share-share1-0" Oct 11 02:14:29 crc kubenswrapper[4743]: I1011 02:14:29.007083 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/0c713aa6-9d10-4baa-855c-a05256d83be7-ceph\") pod \"manila-share-share1-0\" (UID: 
\"0c713aa6-9d10-4baa-855c-a05256d83be7\") " pod="openstack/manila-share-share1-0" Oct 11 02:14:29 crc kubenswrapper[4743]: I1011 02:14:29.007269 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c713aa6-9d10-4baa-855c-a05256d83be7-scripts\") pod \"manila-share-share1-0\" (UID: \"0c713aa6-9d10-4baa-855c-a05256d83be7\") " pod="openstack/manila-share-share1-0" Oct 11 02:14:29 crc kubenswrapper[4743]: I1011 02:14:29.007337 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0c713aa6-9d10-4baa-855c-a05256d83be7-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"0c713aa6-9d10-4baa-855c-a05256d83be7\") " pod="openstack/manila-share-share1-0" Oct 11 02:14:29 crc kubenswrapper[4743]: I1011 02:14:29.007387 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c713aa6-9d10-4baa-855c-a05256d83be7-config-data\") pod \"manila-share-share1-0\" (UID: \"0c713aa6-9d10-4baa-855c-a05256d83be7\") " pod="openstack/manila-share-share1-0" Oct 11 02:14:29 crc kubenswrapper[4743]: I1011 02:14:29.007429 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/0c713aa6-9d10-4baa-855c-a05256d83be7-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"0c713aa6-9d10-4baa-855c-a05256d83be7\") " pod="openstack/manila-share-share1-0" Oct 11 02:14:29 crc kubenswrapper[4743]: I1011 02:14:29.007592 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/0c713aa6-9d10-4baa-855c-a05256d83be7-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"0c713aa6-9d10-4baa-855c-a05256d83be7\") " pod="openstack/manila-share-share1-0" Oct 11 02:14:29 crc kubenswrapper[4743]: I1011 02:14:29.008056 4743 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd-logs" (OuterVolumeSpecName: "logs") pod "d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd" (UID: "d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 02:14:29 crc kubenswrapper[4743]: I1011 02:14:29.010317 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd" (UID: "d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 02:14:29 crc kubenswrapper[4743]: I1011 02:14:29.013738 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c713aa6-9d10-4baa-855c-a05256d83be7-scripts\") pod \"manila-share-share1-0\" (UID: \"0c713aa6-9d10-4baa-855c-a05256d83be7\") " pod="openstack/manila-share-share1-0" Oct 11 02:14:29 crc kubenswrapper[4743]: I1011 02:14:29.013965 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0c713aa6-9d10-4baa-855c-a05256d83be7-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"0c713aa6-9d10-4baa-855c-a05256d83be7\") " pod="openstack/manila-share-share1-0" Oct 11 02:14:29 crc kubenswrapper[4743]: I1011 02:14:29.019548 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c713aa6-9d10-4baa-855c-a05256d83be7-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"0c713aa6-9d10-4baa-855c-a05256d83be7\") " pod="openstack/manila-share-share1-0" Oct 11 02:14:29 crc kubenswrapper[4743]: I1011 02:14:29.020644 4743 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/projected/d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd-kube-api-access-wxw7h" (OuterVolumeSpecName: "kube-api-access-wxw7h") pod "d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd" (UID: "d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd"). InnerVolumeSpecName "kube-api-access-wxw7h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 02:14:29 crc kubenswrapper[4743]: I1011 02:14:29.021478 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c713aa6-9d10-4baa-855c-a05256d83be7-config-data\") pod \"manila-share-share1-0\" (UID: \"0c713aa6-9d10-4baa-855c-a05256d83be7\") " pod="openstack/manila-share-share1-0" Oct 11 02:14:29 crc kubenswrapper[4743]: I1011 02:14:29.022004 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0c713aa6-9d10-4baa-855c-a05256d83be7-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"0c713aa6-9d10-4baa-855c-a05256d83be7\") " pod="openstack/manila-share-share1-0" Oct 11 02:14:29 crc kubenswrapper[4743]: I1011 02:14:29.022849 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/0c713aa6-9d10-4baa-855c-a05256d83be7-ceph\") pod \"manila-share-share1-0\" (UID: \"0c713aa6-9d10-4baa-855c-a05256d83be7\") " pod="openstack/manila-share-share1-0" Oct 11 02:14:29 crc kubenswrapper[4743]: I1011 02:14:29.030607 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-487rk\" (UniqueName: \"kubernetes.io/projected/0c713aa6-9d10-4baa-855c-a05256d83be7-kube-api-access-487rk\") pod \"manila-share-share1-0\" (UID: \"0c713aa6-9d10-4baa-855c-a05256d83be7\") " pod="openstack/manila-share-share1-0" Oct 11 02:14:29 crc kubenswrapper[4743]: I1011 02:14:29.055412 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd-scripts" (OuterVolumeSpecName: "scripts") pod "d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd" (UID: "d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 02:14:29 crc kubenswrapper[4743]: I1011 02:14:29.060290 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd-config-data" (OuterVolumeSpecName: "config-data") pod "d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd" (UID: "d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 02:14:29 crc kubenswrapper[4743]: I1011 02:14:29.082249 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd" (UID: "d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 02:14:29 crc kubenswrapper[4743]: I1011 02:14:29.099838 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Oct 11 02:14:29 crc kubenswrapper[4743]: I1011 02:14:29.113496 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd" (UID: "d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 02:14:29 crc kubenswrapper[4743]: I1011 02:14:29.122193 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxw7h\" (UniqueName: \"kubernetes.io/projected/d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd-kube-api-access-wxw7h\") on node \"crc\" DevicePath \"\"" Oct 11 02:14:29 crc kubenswrapper[4743]: I1011 02:14:29.122228 4743 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 11 02:14:29 crc kubenswrapper[4743]: I1011 02:14:29.122242 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd-scripts\") on node \"crc\" DevicePath \"\"" Oct 11 02:14:29 crc kubenswrapper[4743]: I1011 02:14:29.122254 4743 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 11 02:14:29 crc kubenswrapper[4743]: I1011 02:14:29.122266 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 02:14:29 crc kubenswrapper[4743]: I1011 02:14:29.122279 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd-logs\") on node \"crc\" DevicePath \"\"" Oct 11 02:14:29 crc kubenswrapper[4743]: I1011 02:14:29.122290 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd-config-data\") on node \"crc\" DevicePath \"\"" Oct 11 02:14:29 crc kubenswrapper[4743]: I1011 02:14:29.351163 4743 generic.go:334] 
"Generic (PLEG): container finished" podID="d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd" containerID="d1b484db2a51b4a01449527fdf8e1ed8a08f52dd1f27e1abe2a9f0f1d2e2b660" exitCode=137 Oct 11 02:14:29 crc kubenswrapper[4743]: I1011 02:14:29.351496 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5bc9759f8-b6qgh" event={"ID":"d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd","Type":"ContainerDied","Data":"d1b484db2a51b4a01449527fdf8e1ed8a08f52dd1f27e1abe2a9f0f1d2e2b660"} Oct 11 02:14:29 crc kubenswrapper[4743]: I1011 02:14:29.351539 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5bc9759f8-b6qgh" event={"ID":"d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd","Type":"ContainerDied","Data":"b000cfaab42b0bf540f76658399b367b4b7a3dd49d79845db87bebcddd1b7ff5"} Oct 11 02:14:29 crc kubenswrapper[4743]: I1011 02:14:29.351558 4743 scope.go:117] "RemoveContainer" containerID="1ae29a0313861a928c7541c1589ba7b739f50bb56d203301166706e6650049d2" Oct 11 02:14:29 crc kubenswrapper[4743]: I1011 02:14:29.351656 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5bc9759f8-b6qgh" Oct 11 02:14:29 crc kubenswrapper[4743]: I1011 02:14:29.400777 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5bc9759f8-b6qgh"] Oct 11 02:14:29 crc kubenswrapper[4743]: I1011 02:14:29.414195 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5bc9759f8-b6qgh"] Oct 11 02:14:29 crc kubenswrapper[4743]: I1011 02:14:29.521903 4743 scope.go:117] "RemoveContainer" containerID="d1b484db2a51b4a01449527fdf8e1ed8a08f52dd1f27e1abe2a9f0f1d2e2b660" Oct 11 02:14:29 crc kubenswrapper[4743]: I1011 02:14:29.544554 4743 scope.go:117] "RemoveContainer" containerID="1ae29a0313861a928c7541c1589ba7b739f50bb56d203301166706e6650049d2" Oct 11 02:14:29 crc kubenswrapper[4743]: E1011 02:14:29.551989 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ae29a0313861a928c7541c1589ba7b739f50bb56d203301166706e6650049d2\": container with ID starting with 1ae29a0313861a928c7541c1589ba7b739f50bb56d203301166706e6650049d2 not found: ID does not exist" containerID="1ae29a0313861a928c7541c1589ba7b739f50bb56d203301166706e6650049d2" Oct 11 02:14:29 crc kubenswrapper[4743]: I1011 02:14:29.552116 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ae29a0313861a928c7541c1589ba7b739f50bb56d203301166706e6650049d2"} err="failed to get container status \"1ae29a0313861a928c7541c1589ba7b739f50bb56d203301166706e6650049d2\": rpc error: code = NotFound desc = could not find container \"1ae29a0313861a928c7541c1589ba7b739f50bb56d203301166706e6650049d2\": container with ID starting with 1ae29a0313861a928c7541c1589ba7b739f50bb56d203301166706e6650049d2 not found: ID does not exist" Oct 11 02:14:29 crc kubenswrapper[4743]: I1011 02:14:29.552165 4743 scope.go:117] "RemoveContainer" containerID="d1b484db2a51b4a01449527fdf8e1ed8a08f52dd1f27e1abe2a9f0f1d2e2b660" Oct 11 02:14:29 crc 
kubenswrapper[4743]: E1011 02:14:29.557518 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1b484db2a51b4a01449527fdf8e1ed8a08f52dd1f27e1abe2a9f0f1d2e2b660\": container with ID starting with d1b484db2a51b4a01449527fdf8e1ed8a08f52dd1f27e1abe2a9f0f1d2e2b660 not found: ID does not exist" containerID="d1b484db2a51b4a01449527fdf8e1ed8a08f52dd1f27e1abe2a9f0f1d2e2b660" Oct 11 02:14:29 crc kubenswrapper[4743]: I1011 02:14:29.557556 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1b484db2a51b4a01449527fdf8e1ed8a08f52dd1f27e1abe2a9f0f1d2e2b660"} err="failed to get container status \"d1b484db2a51b4a01449527fdf8e1ed8a08f52dd1f27e1abe2a9f0f1d2e2b660\": rpc error: code = NotFound desc = could not find container \"d1b484db2a51b4a01449527fdf8e1ed8a08f52dd1f27e1abe2a9f0f1d2e2b660\": container with ID starting with d1b484db2a51b4a01449527fdf8e1ed8a08f52dd1f27e1abe2a9f0f1d2e2b660 not found: ID does not exist" Oct 11 02:14:29 crc kubenswrapper[4743]: I1011 02:14:29.657157 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Oct 11 02:14:29 crc kubenswrapper[4743]: I1011 02:14:29.685680 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Oct 11 02:14:29 crc kubenswrapper[4743]: W1011 02:14:29.692269 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0c713aa6_9d10_4baa_855c_a05256d83be7.slice/crio-6c52129ee086d678b4263121f9011443a43eee644b397e322ad884ab8fe0cdcd WatchSource:0}: Error finding container 6c52129ee086d678b4263121f9011443a43eee644b397e322ad884ab8fe0cdcd: Status 404 returned error can't find the container with id 6c52129ee086d678b4263121f9011443a43eee644b397e322ad884ab8fe0cdcd Oct 11 02:14:30 crc kubenswrapper[4743]: I1011 02:14:30.114605 4743 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d3cbb84-0280-484f-a137-47cfea670423" path="/var/lib/kubelet/pods/2d3cbb84-0280-484f-a137-47cfea670423/volumes" Oct 11 02:14:30 crc kubenswrapper[4743]: I1011 02:14:30.116168 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd" path="/var/lib/kubelet/pods/d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd/volumes" Oct 11 02:14:30 crc kubenswrapper[4743]: I1011 02:14:30.364266 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"0c713aa6-9d10-4baa-855c-a05256d83be7","Type":"ContainerStarted","Data":"871ac4ad99370173d8cecba5af42db28bfe1f6f7b40378413ed697b407ebcb11"} Oct 11 02:14:30 crc kubenswrapper[4743]: I1011 02:14:30.364306 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"0c713aa6-9d10-4baa-855c-a05256d83be7","Type":"ContainerStarted","Data":"6c52129ee086d678b4263121f9011443a43eee644b397e322ad884ab8fe0cdcd"} Oct 11 02:14:31 crc kubenswrapper[4743]: I1011 02:14:31.381118 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"0c713aa6-9d10-4baa-855c-a05256d83be7","Type":"ContainerStarted","Data":"e2e259c32427cd374065f19e29361ae5a1b73747114e975d020f8fd1c1676241"} Oct 11 02:14:39 crc kubenswrapper[4743]: I1011 02:14:39.100520 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Oct 11 02:14:41 crc kubenswrapper[4743]: I1011 02:14:41.105993 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Oct 11 02:14:41 crc kubenswrapper[4743]: I1011 02:14:41.126385 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=13.126361364 podStartE2EDuration="13.126361364s" podCreationTimestamp="2025-10-11 02:14:28 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 02:14:31.413801838 +0000 UTC m=+4966.066782245" watchObservedRunningTime="2025-10-11 02:14:41.126361364 +0000 UTC m=+4975.779341781" Oct 11 02:14:42 crc kubenswrapper[4743]: I1011 02:14:42.608605 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 11 02:14:43 crc kubenswrapper[4743]: I1011 02:14:43.630033 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qq9bb"] Oct 11 02:14:43 crc kubenswrapper[4743]: E1011 02:14:43.630767 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd" containerName="horizon" Oct 11 02:14:43 crc kubenswrapper[4743]: I1011 02:14:43.630787 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd" containerName="horizon" Oct 11 02:14:43 crc kubenswrapper[4743]: E1011 02:14:43.630822 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd" containerName="horizon-log" Oct 11 02:14:43 crc kubenswrapper[4743]: I1011 02:14:43.630830 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd" containerName="horizon-log" Oct 11 02:14:43 crc kubenswrapper[4743]: I1011 02:14:43.633719 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd" containerName="horizon" Oct 11 02:14:43 crc kubenswrapper[4743]: I1011 02:14:43.633773 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d53d5bd7-af12-4bc6-bc62-e9bf5fe953fd" containerName="horizon-log" Oct 11 02:14:43 crc kubenswrapper[4743]: I1011 02:14:43.635495 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qq9bb" Oct 11 02:14:43 crc kubenswrapper[4743]: I1011 02:14:43.646394 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qq9bb"] Oct 11 02:14:43 crc kubenswrapper[4743]: I1011 02:14:43.717981 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zd474\" (UniqueName: \"kubernetes.io/projected/bb529b02-f3ba-464c-9b94-3699be070a6c-kube-api-access-zd474\") pod \"redhat-marketplace-qq9bb\" (UID: \"bb529b02-f3ba-464c-9b94-3699be070a6c\") " pod="openshift-marketplace/redhat-marketplace-qq9bb" Oct 11 02:14:43 crc kubenswrapper[4743]: I1011 02:14:43.718108 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb529b02-f3ba-464c-9b94-3699be070a6c-utilities\") pod \"redhat-marketplace-qq9bb\" (UID: \"bb529b02-f3ba-464c-9b94-3699be070a6c\") " pod="openshift-marketplace/redhat-marketplace-qq9bb" Oct 11 02:14:43 crc kubenswrapper[4743]: I1011 02:14:43.718203 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb529b02-f3ba-464c-9b94-3699be070a6c-catalog-content\") pod \"redhat-marketplace-qq9bb\" (UID: \"bb529b02-f3ba-464c-9b94-3699be070a6c\") " pod="openshift-marketplace/redhat-marketplace-qq9bb" Oct 11 02:14:43 crc kubenswrapper[4743]: I1011 02:14:43.820800 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb529b02-f3ba-464c-9b94-3699be070a6c-catalog-content\") pod \"redhat-marketplace-qq9bb\" (UID: \"bb529b02-f3ba-464c-9b94-3699be070a6c\") " pod="openshift-marketplace/redhat-marketplace-qq9bb" Oct 11 02:14:43 crc kubenswrapper[4743]: I1011 02:14:43.821227 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-zd474\" (UniqueName: \"kubernetes.io/projected/bb529b02-f3ba-464c-9b94-3699be070a6c-kube-api-access-zd474\") pod \"redhat-marketplace-qq9bb\" (UID: \"bb529b02-f3ba-464c-9b94-3699be070a6c\") " pod="openshift-marketplace/redhat-marketplace-qq9bb" Oct 11 02:14:43 crc kubenswrapper[4743]: I1011 02:14:43.821417 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb529b02-f3ba-464c-9b94-3699be070a6c-catalog-content\") pod \"redhat-marketplace-qq9bb\" (UID: \"bb529b02-f3ba-464c-9b94-3699be070a6c\") " pod="openshift-marketplace/redhat-marketplace-qq9bb" Oct 11 02:14:43 crc kubenswrapper[4743]: I1011 02:14:43.821431 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb529b02-f3ba-464c-9b94-3699be070a6c-utilities\") pod \"redhat-marketplace-qq9bb\" (UID: \"bb529b02-f3ba-464c-9b94-3699be070a6c\") " pod="openshift-marketplace/redhat-marketplace-qq9bb" Oct 11 02:14:43 crc kubenswrapper[4743]: I1011 02:14:43.822032 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb529b02-f3ba-464c-9b94-3699be070a6c-utilities\") pod \"redhat-marketplace-qq9bb\" (UID: \"bb529b02-f3ba-464c-9b94-3699be070a6c\") " pod="openshift-marketplace/redhat-marketplace-qq9bb" Oct 11 02:14:43 crc kubenswrapper[4743]: I1011 02:14:43.850288 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zd474\" (UniqueName: \"kubernetes.io/projected/bb529b02-f3ba-464c-9b94-3699be070a6c-kube-api-access-zd474\") pod \"redhat-marketplace-qq9bb\" (UID: \"bb529b02-f3ba-464c-9b94-3699be070a6c\") " pod="openshift-marketplace/redhat-marketplace-qq9bb" Oct 11 02:14:43 crc kubenswrapper[4743]: I1011 02:14:43.955733 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qq9bb" Oct 11 02:14:44 crc kubenswrapper[4743]: I1011 02:14:44.458668 4743 patch_prober.go:28] interesting pod/machine-config-daemon-cvm72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 02:14:44 crc kubenswrapper[4743]: I1011 02:14:44.458956 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 02:14:44 crc kubenswrapper[4743]: I1011 02:14:44.459000 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" Oct 11 02:14:44 crc kubenswrapper[4743]: I1011 02:14:44.459876 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8db380be5bc3ef378ff0592922955ef88122acec712f90fde34b6491175535db"} pod="openshift-machine-config-operator/machine-config-daemon-cvm72" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 11 02:14:44 crc kubenswrapper[4743]: I1011 02:14:44.459930 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" containerID="cri-o://8db380be5bc3ef378ff0592922955ef88122acec712f90fde34b6491175535db" gracePeriod=600 Oct 11 02:14:44 crc kubenswrapper[4743]: I1011 02:14:44.507394 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-qq9bb"] Oct 11 02:14:44 crc kubenswrapper[4743]: W1011 02:14:44.515351 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb529b02_f3ba_464c_9b94_3699be070a6c.slice/crio-1181c44c23a47a7d6aff916010b51f11ec2b50ef8a83b3b57709f193a4252e61 WatchSource:0}: Error finding container 1181c44c23a47a7d6aff916010b51f11ec2b50ef8a83b3b57709f193a4252e61: Status 404 returned error can't find the container with id 1181c44c23a47a7d6aff916010b51f11ec2b50ef8a83b3b57709f193a4252e61 Oct 11 02:14:44 crc kubenswrapper[4743]: I1011 02:14:44.532396 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qq9bb" event={"ID":"bb529b02-f3ba-464c-9b94-3699be070a6c","Type":"ContainerStarted","Data":"1181c44c23a47a7d6aff916010b51f11ec2b50ef8a83b3b57709f193a4252e61"} Oct 11 02:14:44 crc kubenswrapper[4743]: E1011 02:14:44.581660 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:14:45 crc kubenswrapper[4743]: I1011 02:14:45.543398 4743 generic.go:334] "Generic (PLEG): container finished" podID="bb529b02-f3ba-464c-9b94-3699be070a6c" containerID="bc0866e6913db317f2f29bd97cda5b25233b0c7d6b80b7898008f1384be3012b" exitCode=0 Oct 11 02:14:45 crc kubenswrapper[4743]: I1011 02:14:45.543518 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qq9bb" event={"ID":"bb529b02-f3ba-464c-9b94-3699be070a6c","Type":"ContainerDied","Data":"bc0866e6913db317f2f29bd97cda5b25233b0c7d6b80b7898008f1384be3012b"} Oct 11 
02:14:45 crc kubenswrapper[4743]: I1011 02:14:45.545952 4743 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 11 02:14:45 crc kubenswrapper[4743]: I1011 02:14:45.546846 4743 generic.go:334] "Generic (PLEG): container finished" podID="add92263-e252-446b-95de-092585b4357f" containerID="8db380be5bc3ef378ff0592922955ef88122acec712f90fde34b6491175535db" exitCode=0 Oct 11 02:14:45 crc kubenswrapper[4743]: I1011 02:14:45.546883 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" event={"ID":"add92263-e252-446b-95de-092585b4357f","Type":"ContainerDied","Data":"8db380be5bc3ef378ff0592922955ef88122acec712f90fde34b6491175535db"} Oct 11 02:14:45 crc kubenswrapper[4743]: I1011 02:14:45.546933 4743 scope.go:117] "RemoveContainer" containerID="d142e23db0b3ed02fb38edd4569166021ea8fee0edf9a6ae9288591f32b91fb4" Oct 11 02:14:45 crc kubenswrapper[4743]: I1011 02:14:45.548081 4743 scope.go:117] "RemoveContainer" containerID="8db380be5bc3ef378ff0592922955ef88122acec712f90fde34b6491175535db" Oct 11 02:14:45 crc kubenswrapper[4743]: E1011 02:14:45.548413 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:14:47 crc kubenswrapper[4743]: E1011 02:14:47.396795 4743 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb529b02_f3ba_464c_9b94_3699be070a6c.slice/crio-conmon-a72a4701eed12e388452b7517ac3a163357622b596579916f96ae0522a2d3a13.scope\": RecentStats: unable to find 
data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb529b02_f3ba_464c_9b94_3699be070a6c.slice/crio-a72a4701eed12e388452b7517ac3a163357622b596579916f96ae0522a2d3a13.scope\": RecentStats: unable to find data in memory cache]" Oct 11 02:14:47 crc kubenswrapper[4743]: I1011 02:14:47.570083 4743 generic.go:334] "Generic (PLEG): container finished" podID="bb529b02-f3ba-464c-9b94-3699be070a6c" containerID="a72a4701eed12e388452b7517ac3a163357622b596579916f96ae0522a2d3a13" exitCode=0 Oct 11 02:14:47 crc kubenswrapper[4743]: I1011 02:14:47.570171 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qq9bb" event={"ID":"bb529b02-f3ba-464c-9b94-3699be070a6c","Type":"ContainerDied","Data":"a72a4701eed12e388452b7517ac3a163357622b596579916f96ae0522a2d3a13"} Oct 11 02:14:48 crc kubenswrapper[4743]: I1011 02:14:48.583205 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qq9bb" event={"ID":"bb529b02-f3ba-464c-9b94-3699be070a6c","Type":"ContainerStarted","Data":"b2c369928f410101837522bc32c66d6b74e1bdd10e36e3fb8151ec7c3e6204d6"} Oct 11 02:14:48 crc kubenswrapper[4743]: I1011 02:14:48.610258 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qq9bb" podStartSLOduration=3.147250435 podStartE2EDuration="5.610236101s" podCreationTimestamp="2025-10-11 02:14:43 +0000 UTC" firstStartedPulling="2025-10-11 02:14:45.545724222 +0000 UTC m=+4980.198704619" lastFinishedPulling="2025-10-11 02:14:48.008709868 +0000 UTC m=+4982.661690285" observedRunningTime="2025-10-11 02:14:48.603915234 +0000 UTC m=+4983.256895641" watchObservedRunningTime="2025-10-11 02:14:48.610236101 +0000 UTC m=+4983.263216498" Oct 11 02:14:50 crc kubenswrapper[4743]: I1011 02:14:50.612255 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Oct 11 02:14:53 crc 
kubenswrapper[4743]: I1011 02:14:53.956282 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qq9bb" Oct 11 02:14:53 crc kubenswrapper[4743]: I1011 02:14:53.956679 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qq9bb" Oct 11 02:14:54 crc kubenswrapper[4743]: I1011 02:14:54.026964 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qq9bb" Oct 11 02:14:54 crc kubenswrapper[4743]: I1011 02:14:54.716009 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qq9bb" Oct 11 02:14:54 crc kubenswrapper[4743]: I1011 02:14:54.801565 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qq9bb"] Oct 11 02:14:56 crc kubenswrapper[4743]: I1011 02:14:56.676326 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-qq9bb" podUID="bb529b02-f3ba-464c-9b94-3699be070a6c" containerName="registry-server" containerID="cri-o://b2c369928f410101837522bc32c66d6b74e1bdd10e36e3fb8151ec7c3e6204d6" gracePeriod=2 Oct 11 02:14:57 crc kubenswrapper[4743]: I1011 02:14:57.712131 4743 generic.go:334] "Generic (PLEG): container finished" podID="bb529b02-f3ba-464c-9b94-3699be070a6c" containerID="b2c369928f410101837522bc32c66d6b74e1bdd10e36e3fb8151ec7c3e6204d6" exitCode=0 Oct 11 02:14:57 crc kubenswrapper[4743]: I1011 02:14:57.712482 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qq9bb" event={"ID":"bb529b02-f3ba-464c-9b94-3699be070a6c","Type":"ContainerDied","Data":"b2c369928f410101837522bc32c66d6b74e1bdd10e36e3fb8151ec7c3e6204d6"} Oct 11 02:14:57 crc kubenswrapper[4743]: I1011 02:14:57.712514 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-qq9bb" event={"ID":"bb529b02-f3ba-464c-9b94-3699be070a6c","Type":"ContainerDied","Data":"1181c44c23a47a7d6aff916010b51f11ec2b50ef8a83b3b57709f193a4252e61"}
Oct 11 02:14:57 crc kubenswrapper[4743]: I1011 02:14:57.712536 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1181c44c23a47a7d6aff916010b51f11ec2b50ef8a83b3b57709f193a4252e61"
Oct 11 02:14:57 crc kubenswrapper[4743]: I1011 02:14:57.787175 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qq9bb"
Oct 11 02:14:57 crc kubenswrapper[4743]: I1011 02:14:57.886495 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb529b02-f3ba-464c-9b94-3699be070a6c-utilities\") pod \"bb529b02-f3ba-464c-9b94-3699be070a6c\" (UID: \"bb529b02-f3ba-464c-9b94-3699be070a6c\") "
Oct 11 02:14:57 crc kubenswrapper[4743]: I1011 02:14:57.886606 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zd474\" (UniqueName: \"kubernetes.io/projected/bb529b02-f3ba-464c-9b94-3699be070a6c-kube-api-access-zd474\") pod \"bb529b02-f3ba-464c-9b94-3699be070a6c\" (UID: \"bb529b02-f3ba-464c-9b94-3699be070a6c\") "
Oct 11 02:14:57 crc kubenswrapper[4743]: I1011 02:14:57.886748 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb529b02-f3ba-464c-9b94-3699be070a6c-catalog-content\") pod \"bb529b02-f3ba-464c-9b94-3699be070a6c\" (UID: \"bb529b02-f3ba-464c-9b94-3699be070a6c\") "
Oct 11 02:14:57 crc kubenswrapper[4743]: I1011 02:14:57.887801 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb529b02-f3ba-464c-9b94-3699be070a6c-utilities" (OuterVolumeSpecName: "utilities") pod "bb529b02-f3ba-464c-9b94-3699be070a6c" (UID: "bb529b02-f3ba-464c-9b94-3699be070a6c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 11 02:14:57 crc kubenswrapper[4743]: I1011 02:14:57.893939 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb529b02-f3ba-464c-9b94-3699be070a6c-kube-api-access-zd474" (OuterVolumeSpecName: "kube-api-access-zd474") pod "bb529b02-f3ba-464c-9b94-3699be070a6c" (UID: "bb529b02-f3ba-464c-9b94-3699be070a6c"). InnerVolumeSpecName "kube-api-access-zd474". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 11 02:14:57 crc kubenswrapper[4743]: I1011 02:14:57.949047 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb529b02-f3ba-464c-9b94-3699be070a6c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bb529b02-f3ba-464c-9b94-3699be070a6c" (UID: "bb529b02-f3ba-464c-9b94-3699be070a6c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 11 02:14:57 crc kubenswrapper[4743]: I1011 02:14:57.989346 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zd474\" (UniqueName: \"kubernetes.io/projected/bb529b02-f3ba-464c-9b94-3699be070a6c-kube-api-access-zd474\") on node \"crc\" DevicePath \"\""
Oct 11 02:14:57 crc kubenswrapper[4743]: I1011 02:14:57.989382 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb529b02-f3ba-464c-9b94-3699be070a6c-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 11 02:14:57 crc kubenswrapper[4743]: I1011 02:14:57.989392 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb529b02-f3ba-464c-9b94-3699be070a6c-utilities\") on node \"crc\" DevicePath \"\""
Oct 11 02:14:58 crc kubenswrapper[4743]: I1011 02:14:58.091979 4743 scope.go:117] "RemoveContainer" containerID="8db380be5bc3ef378ff0592922955ef88122acec712f90fde34b6491175535db"
Oct 11 02:14:58 crc kubenswrapper[4743]: E1011 02:14:58.092350 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f"
Oct 11 02:14:58 crc kubenswrapper[4743]: I1011 02:14:58.723369 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qq9bb"
Oct 11 02:14:58 crc kubenswrapper[4743]: I1011 02:14:58.754645 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qq9bb"]
Oct 11 02:14:58 crc kubenswrapper[4743]: I1011 02:14:58.767460 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-qq9bb"]
Oct 11 02:15:00 crc kubenswrapper[4743]: I1011 02:15:00.103482 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb529b02-f3ba-464c-9b94-3699be070a6c" path="/var/lib/kubelet/pods/bb529b02-f3ba-464c-9b94-3699be070a6c/volumes"
Oct 11 02:15:00 crc kubenswrapper[4743]: I1011 02:15:00.166181 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29335815-q7tlq"]
Oct 11 02:15:00 crc kubenswrapper[4743]: E1011 02:15:00.166782 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb529b02-f3ba-464c-9b94-3699be070a6c" containerName="extract-utilities"
Oct 11 02:15:00 crc kubenswrapper[4743]: I1011 02:15:00.166808 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb529b02-f3ba-464c-9b94-3699be070a6c" containerName="extract-utilities"
Oct 11 02:15:00 crc kubenswrapper[4743]: E1011 02:15:00.166846 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb529b02-f3ba-464c-9b94-3699be070a6c" containerName="extract-content"
Oct 11 02:15:00 crc kubenswrapper[4743]: I1011 02:15:00.166872 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb529b02-f3ba-464c-9b94-3699be070a6c" containerName="extract-content"
Oct 11 02:15:00 crc kubenswrapper[4743]: E1011 02:15:00.166891 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb529b02-f3ba-464c-9b94-3699be070a6c" containerName="registry-server"
Oct 11 02:15:00 crc kubenswrapper[4743]: I1011 02:15:00.166900 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb529b02-f3ba-464c-9b94-3699be070a6c" containerName="registry-server"
Oct 11 02:15:00 crc kubenswrapper[4743]: I1011 02:15:00.167155 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb529b02-f3ba-464c-9b94-3699be070a6c" containerName="registry-server"
Oct 11 02:15:00 crc kubenswrapper[4743]: I1011 02:15:00.167963 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29335815-q7tlq"
Oct 11 02:15:00 crc kubenswrapper[4743]: I1011 02:15:00.170284 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Oct 11 02:15:00 crc kubenswrapper[4743]: I1011 02:15:00.170293 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Oct 11 02:15:00 crc kubenswrapper[4743]: I1011 02:15:00.180436 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29335815-q7tlq"]
Oct 11 02:15:00 crc kubenswrapper[4743]: I1011 02:15:00.346457 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/da838f8d-81f6-4239-8b89-607a7dcace70-config-volume\") pod \"collect-profiles-29335815-q7tlq\" (UID: \"da838f8d-81f6-4239-8b89-607a7dcace70\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335815-q7tlq"
Oct 11 02:15:00 crc kubenswrapper[4743]: I1011 02:15:00.346517 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/da838f8d-81f6-4239-8b89-607a7dcace70-secret-volume\") pod \"collect-profiles-29335815-q7tlq\" (UID: \"da838f8d-81f6-4239-8b89-607a7dcace70\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335815-q7tlq"
Oct 11 02:15:00 crc kubenswrapper[4743]: I1011 02:15:00.346967 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzqnj\" (UniqueName: \"kubernetes.io/projected/da838f8d-81f6-4239-8b89-607a7dcace70-kube-api-access-dzqnj\") pod \"collect-profiles-29335815-q7tlq\" (UID: \"da838f8d-81f6-4239-8b89-607a7dcace70\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335815-q7tlq"
Oct 11 02:15:00 crc kubenswrapper[4743]: I1011 02:15:00.449478 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzqnj\" (UniqueName: \"kubernetes.io/projected/da838f8d-81f6-4239-8b89-607a7dcace70-kube-api-access-dzqnj\") pod \"collect-profiles-29335815-q7tlq\" (UID: \"da838f8d-81f6-4239-8b89-607a7dcace70\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335815-q7tlq"
Oct 11 02:15:00 crc kubenswrapper[4743]: I1011 02:15:00.449689 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/da838f8d-81f6-4239-8b89-607a7dcace70-config-volume\") pod \"collect-profiles-29335815-q7tlq\" (UID: \"da838f8d-81f6-4239-8b89-607a7dcace70\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335815-q7tlq"
Oct 11 02:15:00 crc kubenswrapper[4743]: I1011 02:15:00.449729 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/da838f8d-81f6-4239-8b89-607a7dcace70-secret-volume\") pod \"collect-profiles-29335815-q7tlq\" (UID: \"da838f8d-81f6-4239-8b89-607a7dcace70\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335815-q7tlq"
Oct 11 02:15:00 crc kubenswrapper[4743]: I1011 02:15:00.452590 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/da838f8d-81f6-4239-8b89-607a7dcace70-config-volume\") pod \"collect-profiles-29335815-q7tlq\" (UID: \"da838f8d-81f6-4239-8b89-607a7dcace70\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335815-q7tlq"
Oct 11 02:15:00 crc kubenswrapper[4743]: I1011 02:15:00.465443 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/da838f8d-81f6-4239-8b89-607a7dcace70-secret-volume\") pod \"collect-profiles-29335815-q7tlq\" (UID: \"da838f8d-81f6-4239-8b89-607a7dcace70\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335815-q7tlq"
Oct 11 02:15:00 crc kubenswrapper[4743]: I1011 02:15:00.467106 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzqnj\" (UniqueName: \"kubernetes.io/projected/da838f8d-81f6-4239-8b89-607a7dcace70-kube-api-access-dzqnj\") pod \"collect-profiles-29335815-q7tlq\" (UID: \"da838f8d-81f6-4239-8b89-607a7dcace70\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335815-q7tlq"
Oct 11 02:15:00 crc kubenswrapper[4743]: I1011 02:15:00.489395 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29335815-q7tlq"
Oct 11 02:15:01 crc kubenswrapper[4743]: I1011 02:15:01.071184 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29335815-q7tlq"]
Oct 11 02:15:01 crc kubenswrapper[4743]: I1011 02:15:01.760572 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29335815-q7tlq" event={"ID":"da838f8d-81f6-4239-8b89-607a7dcace70","Type":"ContainerStarted","Data":"09108208f1ab277a815b8850e624542cec2f1d5bfd37faf75b8ac7c49c434ce7"}
Oct 11 02:15:01 crc kubenswrapper[4743]: I1011 02:15:01.761850 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29335815-q7tlq" event={"ID":"da838f8d-81f6-4239-8b89-607a7dcace70","Type":"ContainerStarted","Data":"76f6bd5e5162f419951311c0105fce53bea13cc0dd87c536ed6cdb12851a8700"}
Oct 11 02:15:01 crc kubenswrapper[4743]: I1011 02:15:01.786192 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29335815-q7tlq" podStartSLOduration=1.786170284 podStartE2EDuration="1.786170284s" podCreationTimestamp="2025-10-11 02:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 02:15:01.784990875 +0000 UTC m=+4996.437971312" watchObservedRunningTime="2025-10-11 02:15:01.786170284 +0000 UTC m=+4996.439150681"
Oct 11 02:15:02 crc kubenswrapper[4743]: I1011 02:15:02.777609 4743 generic.go:334] "Generic (PLEG): container finished" podID="da838f8d-81f6-4239-8b89-607a7dcace70" containerID="09108208f1ab277a815b8850e624542cec2f1d5bfd37faf75b8ac7c49c434ce7" exitCode=0
Oct 11 02:15:02 crc kubenswrapper[4743]: I1011 02:15:02.777825 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29335815-q7tlq" event={"ID":"da838f8d-81f6-4239-8b89-607a7dcace70","Type":"ContainerDied","Data":"09108208f1ab277a815b8850e624542cec2f1d5bfd37faf75b8ac7c49c434ce7"}
Oct 11 02:15:04 crc kubenswrapper[4743]: I1011 02:15:04.236218 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29335815-q7tlq"
Oct 11 02:15:04 crc kubenswrapper[4743]: I1011 02:15:04.355011 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/da838f8d-81f6-4239-8b89-607a7dcace70-config-volume\") pod \"da838f8d-81f6-4239-8b89-607a7dcace70\" (UID: \"da838f8d-81f6-4239-8b89-607a7dcace70\") "
Oct 11 02:15:04 crc kubenswrapper[4743]: I1011 02:15:04.355082 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/da838f8d-81f6-4239-8b89-607a7dcace70-secret-volume\") pod \"da838f8d-81f6-4239-8b89-607a7dcace70\" (UID: \"da838f8d-81f6-4239-8b89-607a7dcace70\") "
Oct 11 02:15:04 crc kubenswrapper[4743]: I1011 02:15:04.355323 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dzqnj\" (UniqueName: \"kubernetes.io/projected/da838f8d-81f6-4239-8b89-607a7dcace70-kube-api-access-dzqnj\") pod \"da838f8d-81f6-4239-8b89-607a7dcace70\" (UID: \"da838f8d-81f6-4239-8b89-607a7dcace70\") "
Oct 11 02:15:04 crc kubenswrapper[4743]: I1011 02:15:04.355935 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da838f8d-81f6-4239-8b89-607a7dcace70-config-volume" (OuterVolumeSpecName: "config-volume") pod "da838f8d-81f6-4239-8b89-607a7dcace70" (UID: "da838f8d-81f6-4239-8b89-607a7dcace70"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 11 02:15:04 crc kubenswrapper[4743]: I1011 02:15:04.361547 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da838f8d-81f6-4239-8b89-607a7dcace70-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "da838f8d-81f6-4239-8b89-607a7dcace70" (UID: "da838f8d-81f6-4239-8b89-607a7dcace70"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 11 02:15:04 crc kubenswrapper[4743]: I1011 02:15:04.362672 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da838f8d-81f6-4239-8b89-607a7dcace70-kube-api-access-dzqnj" (OuterVolumeSpecName: "kube-api-access-dzqnj") pod "da838f8d-81f6-4239-8b89-607a7dcace70" (UID: "da838f8d-81f6-4239-8b89-607a7dcace70"). InnerVolumeSpecName "kube-api-access-dzqnj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 11 02:15:04 crc kubenswrapper[4743]: I1011 02:15:04.458326 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dzqnj\" (UniqueName: \"kubernetes.io/projected/da838f8d-81f6-4239-8b89-607a7dcace70-kube-api-access-dzqnj\") on node \"crc\" DevicePath \"\""
Oct 11 02:15:04 crc kubenswrapper[4743]: I1011 02:15:04.458621 4743 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/da838f8d-81f6-4239-8b89-607a7dcace70-config-volume\") on node \"crc\" DevicePath \"\""
Oct 11 02:15:04 crc kubenswrapper[4743]: I1011 02:15:04.458636 4743 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/da838f8d-81f6-4239-8b89-607a7dcace70-secret-volume\") on node \"crc\" DevicePath \"\""
Oct 11 02:15:04 crc kubenswrapper[4743]: I1011 02:15:04.805142 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29335815-q7tlq" event={"ID":"da838f8d-81f6-4239-8b89-607a7dcace70","Type":"ContainerDied","Data":"76f6bd5e5162f419951311c0105fce53bea13cc0dd87c536ed6cdb12851a8700"}
Oct 11 02:15:04 crc kubenswrapper[4743]: I1011 02:15:04.805185 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29335815-q7tlq"
Oct 11 02:15:04 crc kubenswrapper[4743]: I1011 02:15:04.805191 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76f6bd5e5162f419951311c0105fce53bea13cc0dd87c536ed6cdb12851a8700"
Oct 11 02:15:04 crc kubenswrapper[4743]: I1011 02:15:04.875040 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29335770-kjbb9"]
Oct 11 02:15:04 crc kubenswrapper[4743]: I1011 02:15:04.884032 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29335770-kjbb9"]
Oct 11 02:15:06 crc kubenswrapper[4743]: I1011 02:15:06.108368 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fec83ae-8524-4dcd-9983-e207576458c5" path="/var/lib/kubelet/pods/7fec83ae-8524-4dcd-9983-e207576458c5/volumes"
Oct 11 02:15:10 crc kubenswrapper[4743]: I1011 02:15:10.092757 4743 scope.go:117] "RemoveContainer" containerID="8db380be5bc3ef378ff0592922955ef88122acec712f90fde34b6491175535db"
Oct 11 02:15:10 crc kubenswrapper[4743]: E1011 02:15:10.093376 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f"
Oct 11 02:15:23 crc kubenswrapper[4743]: I1011 02:15:23.091921 4743 scope.go:117] "RemoveContainer" containerID="8db380be5bc3ef378ff0592922955ef88122acec712f90fde34b6491175535db"
Oct 11 02:15:23 crc kubenswrapper[4743]: E1011 02:15:23.093191 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f"
Oct 11 02:15:27 crc kubenswrapper[4743]: I1011 02:15:27.481076 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ch2hk"]
Oct 11 02:15:27 crc kubenswrapper[4743]: E1011 02:15:27.482266 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da838f8d-81f6-4239-8b89-607a7dcace70" containerName="collect-profiles"
Oct 11 02:15:27 crc kubenswrapper[4743]: I1011 02:15:27.482288 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="da838f8d-81f6-4239-8b89-607a7dcace70" containerName="collect-profiles"
Oct 11 02:15:27 crc kubenswrapper[4743]: I1011 02:15:27.482576 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="da838f8d-81f6-4239-8b89-607a7dcace70" containerName="collect-profiles"
Oct 11 02:15:27 crc kubenswrapper[4743]: I1011 02:15:27.484688 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ch2hk"
Oct 11 02:15:27 crc kubenswrapper[4743]: I1011 02:15:27.488411 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ch2hk"]
Oct 11 02:15:27 crc kubenswrapper[4743]: I1011 02:15:27.576464 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmzjx\" (UniqueName: \"kubernetes.io/projected/93e84206-cdac-4dc5-8ceb-9140126d467c-kube-api-access-tmzjx\") pod \"community-operators-ch2hk\" (UID: \"93e84206-cdac-4dc5-8ceb-9140126d467c\") " pod="openshift-marketplace/community-operators-ch2hk"
Oct 11 02:15:27 crc kubenswrapper[4743]: I1011 02:15:27.577263 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93e84206-cdac-4dc5-8ceb-9140126d467c-catalog-content\") pod \"community-operators-ch2hk\" (UID: \"93e84206-cdac-4dc5-8ceb-9140126d467c\") " pod="openshift-marketplace/community-operators-ch2hk"
Oct 11 02:15:27 crc kubenswrapper[4743]: I1011 02:15:27.577559 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93e84206-cdac-4dc5-8ceb-9140126d467c-utilities\") pod \"community-operators-ch2hk\" (UID: \"93e84206-cdac-4dc5-8ceb-9140126d467c\") " pod="openshift-marketplace/community-operators-ch2hk"
Oct 11 02:15:27 crc kubenswrapper[4743]: I1011 02:15:27.680169 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93e84206-cdac-4dc5-8ceb-9140126d467c-catalog-content\") pod \"community-operators-ch2hk\" (UID: \"93e84206-cdac-4dc5-8ceb-9140126d467c\") " pod="openshift-marketplace/community-operators-ch2hk"
Oct 11 02:15:27 crc kubenswrapper[4743]: I1011 02:15:27.680285 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93e84206-cdac-4dc5-8ceb-9140126d467c-utilities\") pod \"community-operators-ch2hk\" (UID: \"93e84206-cdac-4dc5-8ceb-9140126d467c\") " pod="openshift-marketplace/community-operators-ch2hk"
Oct 11 02:15:27 crc kubenswrapper[4743]: I1011 02:15:27.680408 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmzjx\" (UniqueName: \"kubernetes.io/projected/93e84206-cdac-4dc5-8ceb-9140126d467c-kube-api-access-tmzjx\") pod \"community-operators-ch2hk\" (UID: \"93e84206-cdac-4dc5-8ceb-9140126d467c\") " pod="openshift-marketplace/community-operators-ch2hk"
Oct 11 02:15:27 crc kubenswrapper[4743]: I1011 02:15:27.680819 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93e84206-cdac-4dc5-8ceb-9140126d467c-catalog-content\") pod \"community-operators-ch2hk\" (UID: \"93e84206-cdac-4dc5-8ceb-9140126d467c\") " pod="openshift-marketplace/community-operators-ch2hk"
Oct 11 02:15:27 crc kubenswrapper[4743]: I1011 02:15:27.681040 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93e84206-cdac-4dc5-8ceb-9140126d467c-utilities\") pod \"community-operators-ch2hk\" (UID: \"93e84206-cdac-4dc5-8ceb-9140126d467c\") " pod="openshift-marketplace/community-operators-ch2hk"
Oct 11 02:15:27 crc kubenswrapper[4743]: I1011 02:15:27.702459 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmzjx\" (UniqueName: \"kubernetes.io/projected/93e84206-cdac-4dc5-8ceb-9140126d467c-kube-api-access-tmzjx\") pod \"community-operators-ch2hk\" (UID: \"93e84206-cdac-4dc5-8ceb-9140126d467c\") " pod="openshift-marketplace/community-operators-ch2hk"
Oct 11 02:15:27 crc kubenswrapper[4743]: I1011 02:15:27.809599 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ch2hk"
Oct 11 02:15:28 crc kubenswrapper[4743]: I1011 02:15:28.440206 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ch2hk"]
Oct 11 02:15:29 crc kubenswrapper[4743]: I1011 02:15:29.124169 4743 generic.go:334] "Generic (PLEG): container finished" podID="93e84206-cdac-4dc5-8ceb-9140126d467c" containerID="97ea8295d158aa41cc5a215266f562acaac20fba2504bcecb8498d7d648f3bdd" exitCode=0
Oct 11 02:15:29 crc kubenswrapper[4743]: I1011 02:15:29.124290 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ch2hk" event={"ID":"93e84206-cdac-4dc5-8ceb-9140126d467c","Type":"ContainerDied","Data":"97ea8295d158aa41cc5a215266f562acaac20fba2504bcecb8498d7d648f3bdd"}
Oct 11 02:15:29 crc kubenswrapper[4743]: I1011 02:15:29.124581 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ch2hk" event={"ID":"93e84206-cdac-4dc5-8ceb-9140126d467c","Type":"ContainerStarted","Data":"8fb4434cb4673ea0202f0738277672b820c668f10c46db27522672622fdd02b2"}
Oct 11 02:15:29 crc kubenswrapper[4743]: I1011 02:15:29.892176 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-brh7l"]
Oct 11 02:15:29 crc kubenswrapper[4743]: I1011 02:15:29.897396 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-brh7l"
Oct 11 02:15:29 crc kubenswrapper[4743]: I1011 02:15:29.915670 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-brh7l"]
Oct 11 02:15:29 crc kubenswrapper[4743]: I1011 02:15:29.941471 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79fd62ad-6966-460a-a08d-22af255e53ce-catalog-content\") pod \"redhat-operators-brh7l\" (UID: \"79fd62ad-6966-460a-a08d-22af255e53ce\") " pod="openshift-marketplace/redhat-operators-brh7l"
Oct 11 02:15:29 crc kubenswrapper[4743]: I1011 02:15:29.941638 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79fd62ad-6966-460a-a08d-22af255e53ce-utilities\") pod \"redhat-operators-brh7l\" (UID: \"79fd62ad-6966-460a-a08d-22af255e53ce\") " pod="openshift-marketplace/redhat-operators-brh7l"
Oct 11 02:15:29 crc kubenswrapper[4743]: I1011 02:15:29.941719 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfx7d\" (UniqueName: \"kubernetes.io/projected/79fd62ad-6966-460a-a08d-22af255e53ce-kube-api-access-qfx7d\") pod \"redhat-operators-brh7l\" (UID: \"79fd62ad-6966-460a-a08d-22af255e53ce\") " pod="openshift-marketplace/redhat-operators-brh7l"
Oct 11 02:15:30 crc kubenswrapper[4743]: I1011 02:15:30.044069 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79fd62ad-6966-460a-a08d-22af255e53ce-catalog-content\") pod \"redhat-operators-brh7l\" (UID: \"79fd62ad-6966-460a-a08d-22af255e53ce\") " pod="openshift-marketplace/redhat-operators-brh7l"
Oct 11 02:15:30 crc kubenswrapper[4743]: I1011 02:15:30.044165 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79fd62ad-6966-460a-a08d-22af255e53ce-utilities\") pod \"redhat-operators-brh7l\" (UID: \"79fd62ad-6966-460a-a08d-22af255e53ce\") " pod="openshift-marketplace/redhat-operators-brh7l"
Oct 11 02:15:30 crc kubenswrapper[4743]: I1011 02:15:30.044208 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfx7d\" (UniqueName: \"kubernetes.io/projected/79fd62ad-6966-460a-a08d-22af255e53ce-kube-api-access-qfx7d\") pod \"redhat-operators-brh7l\" (UID: \"79fd62ad-6966-460a-a08d-22af255e53ce\") " pod="openshift-marketplace/redhat-operators-brh7l"
Oct 11 02:15:30 crc kubenswrapper[4743]: I1011 02:15:30.045013 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79fd62ad-6966-460a-a08d-22af255e53ce-catalog-content\") pod \"redhat-operators-brh7l\" (UID: \"79fd62ad-6966-460a-a08d-22af255e53ce\") " pod="openshift-marketplace/redhat-operators-brh7l"
Oct 11 02:15:30 crc kubenswrapper[4743]: I1011 02:15:30.045271 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79fd62ad-6966-460a-a08d-22af255e53ce-utilities\") pod \"redhat-operators-brh7l\" (UID: \"79fd62ad-6966-460a-a08d-22af255e53ce\") " pod="openshift-marketplace/redhat-operators-brh7l"
Oct 11 02:15:30 crc kubenswrapper[4743]: I1011 02:15:30.070252 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfx7d\" (UniqueName: \"kubernetes.io/projected/79fd62ad-6966-460a-a08d-22af255e53ce-kube-api-access-qfx7d\") pod \"redhat-operators-brh7l\" (UID: \"79fd62ad-6966-460a-a08d-22af255e53ce\") " pod="openshift-marketplace/redhat-operators-brh7l"
Oct 11 02:15:30 crc kubenswrapper[4743]: I1011 02:15:30.135906 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ch2hk" event={"ID":"93e84206-cdac-4dc5-8ceb-9140126d467c","Type":"ContainerStarted","Data":"7ae8de9791a7a37ed3a1764fd20b60307b70f22effa8681268b349ce543cdb9f"}
Oct 11 02:15:30 crc kubenswrapper[4743]: I1011 02:15:30.241187 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-brh7l"
Oct 11 02:15:30 crc kubenswrapper[4743]: I1011 02:15:30.727296 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-brh7l"]
Oct 11 02:15:31 crc kubenswrapper[4743]: W1011 02:15:31.062157 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79fd62ad_6966_460a_a08d_22af255e53ce.slice/crio-6fa8f922de5a5863bfc60d671c4cecde807b0bab500c132e1b19e98e34fd81c1 WatchSource:0}: Error finding container 6fa8f922de5a5863bfc60d671c4cecde807b0bab500c132e1b19e98e34fd81c1: Status 404 returned error can't find the container with id 6fa8f922de5a5863bfc60d671c4cecde807b0bab500c132e1b19e98e34fd81c1
Oct 11 02:15:31 crc kubenswrapper[4743]: I1011 02:15:31.160642 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-brh7l" event={"ID":"79fd62ad-6966-460a-a08d-22af255e53ce","Type":"ContainerStarted","Data":"6fa8f922de5a5863bfc60d671c4cecde807b0bab500c132e1b19e98e34fd81c1"}
Oct 11 02:15:32 crc kubenswrapper[4743]: I1011 02:15:32.170336 4743 generic.go:334] "Generic (PLEG): container finished" podID="79fd62ad-6966-460a-a08d-22af255e53ce" containerID="5b06029ee204912de58571b8f31433ad8f9979c7ac2f5b73f58b23cb223232a6" exitCode=0
Oct 11 02:15:32 crc kubenswrapper[4743]: I1011 02:15:32.170418 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-brh7l" event={"ID":"79fd62ad-6966-460a-a08d-22af255e53ce","Type":"ContainerDied","Data":"5b06029ee204912de58571b8f31433ad8f9979c7ac2f5b73f58b23cb223232a6"}
Oct 11 02:15:32 crc kubenswrapper[4743]: I1011 02:15:32.173089 4743 generic.go:334] "Generic (PLEG): container finished" podID="93e84206-cdac-4dc5-8ceb-9140126d467c" containerID="7ae8de9791a7a37ed3a1764fd20b60307b70f22effa8681268b349ce543cdb9f" exitCode=0
Oct 11 02:15:32 crc kubenswrapper[4743]: I1011 02:15:32.173132 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ch2hk" event={"ID":"93e84206-cdac-4dc5-8ceb-9140126d467c","Type":"ContainerDied","Data":"7ae8de9791a7a37ed3a1764fd20b60307b70f22effa8681268b349ce543cdb9f"}
Oct 11 02:15:34 crc kubenswrapper[4743]: I1011 02:15:34.193732 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ch2hk" event={"ID":"93e84206-cdac-4dc5-8ceb-9140126d467c","Type":"ContainerStarted","Data":"305e36687037ad4a2b031cadeb69a3a8b3428c1e8096da79735ee65ddc70dfd1"}
Oct 11 02:15:34 crc kubenswrapper[4743]: I1011 02:15:34.197017 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-brh7l" event={"ID":"79fd62ad-6966-460a-a08d-22af255e53ce","Type":"ContainerStarted","Data":"f3f3ac91f6f842121f9e2511d0c735b3df92ef2d847bc147c32928a8d88a2263"}
Oct 11 02:15:34 crc kubenswrapper[4743]: I1011 02:15:34.219835 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ch2hk" podStartSLOduration=3.690090652 podStartE2EDuration="7.219816385s" podCreationTimestamp="2025-10-11 02:15:27 +0000 UTC" firstStartedPulling="2025-10-11 02:15:29.129405678 +0000 UTC m=+5023.782386115" lastFinishedPulling="2025-10-11 02:15:32.659131451 +0000 UTC m=+5027.312111848" observedRunningTime="2025-10-11 02:15:34.215353745 +0000 UTC m=+5028.868334142" watchObservedRunningTime="2025-10-11 02:15:34.219816385 +0000 UTC m=+5028.872796782"
Oct 11 02:15:35 crc kubenswrapper[4743]: I1011 02:15:35.092941 4743 scope.go:117] "RemoveContainer" containerID="8db380be5bc3ef378ff0592922955ef88122acec712f90fde34b6491175535db"
Oct 11 02:15:35 crc kubenswrapper[4743]: E1011 02:15:35.094095 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f"
Oct 11 02:15:37 crc kubenswrapper[4743]: I1011 02:15:37.246317 4743 generic.go:334] "Generic (PLEG): container finished" podID="79fd62ad-6966-460a-a08d-22af255e53ce" containerID="f3f3ac91f6f842121f9e2511d0c735b3df92ef2d847bc147c32928a8d88a2263" exitCode=0
Oct 11 02:15:37 crc kubenswrapper[4743]: I1011 02:15:37.246377 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-brh7l" event={"ID":"79fd62ad-6966-460a-a08d-22af255e53ce","Type":"ContainerDied","Data":"f3f3ac91f6f842121f9e2511d0c735b3df92ef2d847bc147c32928a8d88a2263"}
Oct 11 02:15:37 crc kubenswrapper[4743]: I1011 02:15:37.810659 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ch2hk"
Oct 11 02:15:37 crc kubenswrapper[4743]: I1011 02:15:37.811036 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ch2hk"
Oct 11 02:15:38 crc kubenswrapper[4743]: I1011 02:15:38.257459 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-brh7l" event={"ID":"79fd62ad-6966-460a-a08d-22af255e53ce","Type":"ContainerStarted","Data":"d33d23af5244173d0a9e64b66e7fe7ddcc64434896144278b8758a8ae02305ac"}
Oct 11 02:15:38 crc kubenswrapper[4743]: I1011 02:15:38.284350 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-brh7l" podStartSLOduration=3.651758607 podStartE2EDuration="9.284328437s" podCreationTimestamp="2025-10-11 02:15:29 +0000 UTC" firstStartedPulling="2025-10-11 02:15:32.172583837 +0000 UTC m=+5026.825564234" lastFinishedPulling="2025-10-11 02:15:37.805153667 +0000 UTC m=+5032.458134064" observedRunningTime="2025-10-11 02:15:38.275783856 +0000 UTC m=+5032.928764273" watchObservedRunningTime="2025-10-11 02:15:38.284328437 +0000 UTC m=+5032.937308844"
Oct 11 02:15:38 crc kubenswrapper[4743]: I1011 02:15:38.873003 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-ch2hk" podUID="93e84206-cdac-4dc5-8ceb-9140126d467c" containerName="registry-server" probeResult="failure" output=<
Oct 11 02:15:38 crc kubenswrapper[4743]: timeout: failed to connect service ":50051" within 1s
Oct 11 02:15:38 crc kubenswrapper[4743]: >
Oct 11 02:15:40 crc kubenswrapper[4743]: I1011 02:15:40.242032 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-brh7l"
Oct 11 02:15:40 crc kubenswrapper[4743]: I1011 02:15:40.242334 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-brh7l"
Oct 11 02:15:41 crc kubenswrapper[4743]: I1011 02:15:41.319079 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-brh7l" podUID="79fd62ad-6966-460a-a08d-22af255e53ce" containerName="registry-server" probeResult="failure" output=<
Oct 11 02:15:41 crc kubenswrapper[4743]: timeout: failed to connect service ":50051" within 1s
Oct 11 02:15:41 crc kubenswrapper[4743]: >
Oct 11 02:15:47 crc kubenswrapper[4743]: I1011 02:15:47.884985 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ch2hk"
Oct 11 02:15:47 crc kubenswrapper[4743]: I1011 02:15:47.965452 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ch2hk"
Oct 11 02:15:48 crc kubenswrapper[4743]: I1011 02:15:48.092465 4743 scope.go:117] "RemoveContainer" containerID="8db380be5bc3ef378ff0592922955ef88122acec712f90fde34b6491175535db"
Oct 11 02:15:48 crc kubenswrapper[4743]: E1011 02:15:48.093211 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f"
Oct 11 02:15:50 crc kubenswrapper[4743]: I1011 02:15:50.063677 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ch2hk"]
Oct 11 02:15:50 crc kubenswrapper[4743]: I1011 02:15:50.064610 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ch2hk" podUID="93e84206-cdac-4dc5-8ceb-9140126d467c" containerName="registry-server" containerID="cri-o://305e36687037ad4a2b031cadeb69a3a8b3428c1e8096da79735ee65ddc70dfd1" gracePeriod=2
Oct 11 02:15:50 crc kubenswrapper[4743]: I1011 02:15:50.306241 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-brh7l"
Oct 11 02:15:50 crc kubenswrapper[4743]: I1011 02:15:50.398229 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-brh7l"
Oct 11 02:15:50 crc kubenswrapper[4743]: I1011 02:15:50.427632 4743 generic.go:334] "Generic (PLEG): container finished" podID="93e84206-cdac-4dc5-8ceb-9140126d467c" containerID="305e36687037ad4a2b031cadeb69a3a8b3428c1e8096da79735ee65ddc70dfd1" exitCode=0
Oct 11 02:15:50 crc kubenswrapper[4743]: I1011 02:15:50.427704 4743 kubelet.go:2453] "SyncLoop
(PLEG): event for pod" pod="openshift-marketplace/community-operators-ch2hk" event={"ID":"93e84206-cdac-4dc5-8ceb-9140126d467c","Type":"ContainerDied","Data":"305e36687037ad4a2b031cadeb69a3a8b3428c1e8096da79735ee65ddc70dfd1"} Oct 11 02:15:50 crc kubenswrapper[4743]: I1011 02:15:50.667966 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ch2hk" Oct 11 02:15:50 crc kubenswrapper[4743]: I1011 02:15:50.801467 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmzjx\" (UniqueName: \"kubernetes.io/projected/93e84206-cdac-4dc5-8ceb-9140126d467c-kube-api-access-tmzjx\") pod \"93e84206-cdac-4dc5-8ceb-9140126d467c\" (UID: \"93e84206-cdac-4dc5-8ceb-9140126d467c\") " Oct 11 02:15:50 crc kubenswrapper[4743]: I1011 02:15:50.801820 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93e84206-cdac-4dc5-8ceb-9140126d467c-utilities\") pod \"93e84206-cdac-4dc5-8ceb-9140126d467c\" (UID: \"93e84206-cdac-4dc5-8ceb-9140126d467c\") " Oct 11 02:15:50 crc kubenswrapper[4743]: I1011 02:15:50.801908 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93e84206-cdac-4dc5-8ceb-9140126d467c-catalog-content\") pod \"93e84206-cdac-4dc5-8ceb-9140126d467c\" (UID: \"93e84206-cdac-4dc5-8ceb-9140126d467c\") " Oct 11 02:15:50 crc kubenswrapper[4743]: I1011 02:15:50.802938 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93e84206-cdac-4dc5-8ceb-9140126d467c-utilities" (OuterVolumeSpecName: "utilities") pod "93e84206-cdac-4dc5-8ceb-9140126d467c" (UID: "93e84206-cdac-4dc5-8ceb-9140126d467c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 02:15:50 crc kubenswrapper[4743]: I1011 02:15:50.812249 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93e84206-cdac-4dc5-8ceb-9140126d467c-kube-api-access-tmzjx" (OuterVolumeSpecName: "kube-api-access-tmzjx") pod "93e84206-cdac-4dc5-8ceb-9140126d467c" (UID: "93e84206-cdac-4dc5-8ceb-9140126d467c"). InnerVolumeSpecName "kube-api-access-tmzjx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 02:15:50 crc kubenswrapper[4743]: I1011 02:15:50.851723 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93e84206-cdac-4dc5-8ceb-9140126d467c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "93e84206-cdac-4dc5-8ceb-9140126d467c" (UID: "93e84206-cdac-4dc5-8ceb-9140126d467c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 02:15:50 crc kubenswrapper[4743]: I1011 02:15:50.905118 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tmzjx\" (UniqueName: \"kubernetes.io/projected/93e84206-cdac-4dc5-8ceb-9140126d467c-kube-api-access-tmzjx\") on node \"crc\" DevicePath \"\"" Oct 11 02:15:50 crc kubenswrapper[4743]: I1011 02:15:50.905164 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93e84206-cdac-4dc5-8ceb-9140126d467c-utilities\") on node \"crc\" DevicePath \"\"" Oct 11 02:15:50 crc kubenswrapper[4743]: I1011 02:15:50.905181 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93e84206-cdac-4dc5-8ceb-9140126d467c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 11 02:15:51 crc kubenswrapper[4743]: I1011 02:15:51.443034 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ch2hk" 
event={"ID":"93e84206-cdac-4dc5-8ceb-9140126d467c","Type":"ContainerDied","Data":"8fb4434cb4673ea0202f0738277672b820c668f10c46db27522672622fdd02b2"} Oct 11 02:15:51 crc kubenswrapper[4743]: I1011 02:15:51.443087 4743 scope.go:117] "RemoveContainer" containerID="305e36687037ad4a2b031cadeb69a3a8b3428c1e8096da79735ee65ddc70dfd1" Oct 11 02:15:51 crc kubenswrapper[4743]: I1011 02:15:51.443117 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ch2hk" Oct 11 02:15:51 crc kubenswrapper[4743]: I1011 02:15:51.472129 4743 scope.go:117] "RemoveContainer" containerID="7ae8de9791a7a37ed3a1764fd20b60307b70f22effa8681268b349ce543cdb9f" Oct 11 02:15:51 crc kubenswrapper[4743]: I1011 02:15:51.495361 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ch2hk"] Oct 11 02:15:51 crc kubenswrapper[4743]: I1011 02:15:51.508676 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ch2hk"] Oct 11 02:15:51 crc kubenswrapper[4743]: I1011 02:15:51.518305 4743 scope.go:117] "RemoveContainer" containerID="97ea8295d158aa41cc5a215266f562acaac20fba2504bcecb8498d7d648f3bdd" Oct 11 02:15:52 crc kubenswrapper[4743]: I1011 02:15:52.115141 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93e84206-cdac-4dc5-8ceb-9140126d467c" path="/var/lib/kubelet/pods/93e84206-cdac-4dc5-8ceb-9140126d467c/volumes" Oct 11 02:15:52 crc kubenswrapper[4743]: I1011 02:15:52.655175 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-brh7l"] Oct 11 02:15:52 crc kubenswrapper[4743]: I1011 02:15:52.655472 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-brh7l" podUID="79fd62ad-6966-460a-a08d-22af255e53ce" containerName="registry-server" 
containerID="cri-o://d33d23af5244173d0a9e64b66e7fe7ddcc64434896144278b8758a8ae02305ac" gracePeriod=2 Oct 11 02:15:53 crc kubenswrapper[4743]: I1011 02:15:53.254375 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-brh7l" Oct 11 02:15:53 crc kubenswrapper[4743]: I1011 02:15:53.375488 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfx7d\" (UniqueName: \"kubernetes.io/projected/79fd62ad-6966-460a-a08d-22af255e53ce-kube-api-access-qfx7d\") pod \"79fd62ad-6966-460a-a08d-22af255e53ce\" (UID: \"79fd62ad-6966-460a-a08d-22af255e53ce\") " Oct 11 02:15:53 crc kubenswrapper[4743]: I1011 02:15:53.375633 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79fd62ad-6966-460a-a08d-22af255e53ce-catalog-content\") pod \"79fd62ad-6966-460a-a08d-22af255e53ce\" (UID: \"79fd62ad-6966-460a-a08d-22af255e53ce\") " Oct 11 02:15:53 crc kubenswrapper[4743]: I1011 02:15:53.375713 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79fd62ad-6966-460a-a08d-22af255e53ce-utilities\") pod \"79fd62ad-6966-460a-a08d-22af255e53ce\" (UID: \"79fd62ad-6966-460a-a08d-22af255e53ce\") " Oct 11 02:15:53 crc kubenswrapper[4743]: I1011 02:15:53.376611 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79fd62ad-6966-460a-a08d-22af255e53ce-utilities" (OuterVolumeSpecName: "utilities") pod "79fd62ad-6966-460a-a08d-22af255e53ce" (UID: "79fd62ad-6966-460a-a08d-22af255e53ce"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 02:15:53 crc kubenswrapper[4743]: I1011 02:15:53.381436 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79fd62ad-6966-460a-a08d-22af255e53ce-kube-api-access-qfx7d" (OuterVolumeSpecName: "kube-api-access-qfx7d") pod "79fd62ad-6966-460a-a08d-22af255e53ce" (UID: "79fd62ad-6966-460a-a08d-22af255e53ce"). InnerVolumeSpecName "kube-api-access-qfx7d". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 02:15:53 crc kubenswrapper[4743]: I1011 02:15:53.450098 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79fd62ad-6966-460a-a08d-22af255e53ce-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "79fd62ad-6966-460a-a08d-22af255e53ce" (UID: "79fd62ad-6966-460a-a08d-22af255e53ce"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 02:15:53 crc kubenswrapper[4743]: I1011 02:15:53.468210 4743 generic.go:334] "Generic (PLEG): container finished" podID="79fd62ad-6966-460a-a08d-22af255e53ce" containerID="d33d23af5244173d0a9e64b66e7fe7ddcc64434896144278b8758a8ae02305ac" exitCode=0 Oct 11 02:15:53 crc kubenswrapper[4743]: I1011 02:15:53.468288 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-brh7l" Oct 11 02:15:53 crc kubenswrapper[4743]: I1011 02:15:53.468271 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-brh7l" event={"ID":"79fd62ad-6966-460a-a08d-22af255e53ce","Type":"ContainerDied","Data":"d33d23af5244173d0a9e64b66e7fe7ddcc64434896144278b8758a8ae02305ac"} Oct 11 02:15:53 crc kubenswrapper[4743]: I1011 02:15:53.468550 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-brh7l" event={"ID":"79fd62ad-6966-460a-a08d-22af255e53ce","Type":"ContainerDied","Data":"6fa8f922de5a5863bfc60d671c4cecde807b0bab500c132e1b19e98e34fd81c1"} Oct 11 02:15:53 crc kubenswrapper[4743]: I1011 02:15:53.468608 4743 scope.go:117] "RemoveContainer" containerID="d33d23af5244173d0a9e64b66e7fe7ddcc64434896144278b8758a8ae02305ac" Oct 11 02:15:53 crc kubenswrapper[4743]: I1011 02:15:53.478803 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qfx7d\" (UniqueName: \"kubernetes.io/projected/79fd62ad-6966-460a-a08d-22af255e53ce-kube-api-access-qfx7d\") on node \"crc\" DevicePath \"\"" Oct 11 02:15:53 crc kubenswrapper[4743]: I1011 02:15:53.478831 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79fd62ad-6966-460a-a08d-22af255e53ce-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 11 02:15:53 crc kubenswrapper[4743]: I1011 02:15:53.478842 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79fd62ad-6966-460a-a08d-22af255e53ce-utilities\") on node \"crc\" DevicePath \"\"" Oct 11 02:15:53 crc kubenswrapper[4743]: I1011 02:15:53.500203 4743 scope.go:117] "RemoveContainer" containerID="f3f3ac91f6f842121f9e2511d0c735b3df92ef2d847bc147c32928a8d88a2263" Oct 11 02:15:53 crc kubenswrapper[4743]: I1011 02:15:53.510207 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-brh7l"] Oct 11 02:15:53 crc kubenswrapper[4743]: I1011 02:15:53.520561 4743 scope.go:117] "RemoveContainer" containerID="5b06029ee204912de58571b8f31433ad8f9979c7ac2f5b73f58b23cb223232a6" Oct 11 02:15:53 crc kubenswrapper[4743]: I1011 02:15:53.523096 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-brh7l"] Oct 11 02:15:53 crc kubenswrapper[4743]: I1011 02:15:53.572263 4743 scope.go:117] "RemoveContainer" containerID="d33d23af5244173d0a9e64b66e7fe7ddcc64434896144278b8758a8ae02305ac" Oct 11 02:15:53 crc kubenswrapper[4743]: E1011 02:15:53.572779 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d33d23af5244173d0a9e64b66e7fe7ddcc64434896144278b8758a8ae02305ac\": container with ID starting with d33d23af5244173d0a9e64b66e7fe7ddcc64434896144278b8758a8ae02305ac not found: ID does not exist" containerID="d33d23af5244173d0a9e64b66e7fe7ddcc64434896144278b8758a8ae02305ac" Oct 11 02:15:53 crc kubenswrapper[4743]: I1011 02:15:53.572820 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d33d23af5244173d0a9e64b66e7fe7ddcc64434896144278b8758a8ae02305ac"} err="failed to get container status \"d33d23af5244173d0a9e64b66e7fe7ddcc64434896144278b8758a8ae02305ac\": rpc error: code = NotFound desc = could not find container \"d33d23af5244173d0a9e64b66e7fe7ddcc64434896144278b8758a8ae02305ac\": container with ID starting with d33d23af5244173d0a9e64b66e7fe7ddcc64434896144278b8758a8ae02305ac not found: ID does not exist" Oct 11 02:15:53 crc kubenswrapper[4743]: I1011 02:15:53.572845 4743 scope.go:117] "RemoveContainer" containerID="f3f3ac91f6f842121f9e2511d0c735b3df92ef2d847bc147c32928a8d88a2263" Oct 11 02:15:53 crc kubenswrapper[4743]: E1011 02:15:53.573296 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"f3f3ac91f6f842121f9e2511d0c735b3df92ef2d847bc147c32928a8d88a2263\": container with ID starting with f3f3ac91f6f842121f9e2511d0c735b3df92ef2d847bc147c32928a8d88a2263 not found: ID does not exist" containerID="f3f3ac91f6f842121f9e2511d0c735b3df92ef2d847bc147c32928a8d88a2263" Oct 11 02:15:53 crc kubenswrapper[4743]: I1011 02:15:53.573319 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3f3ac91f6f842121f9e2511d0c735b3df92ef2d847bc147c32928a8d88a2263"} err="failed to get container status \"f3f3ac91f6f842121f9e2511d0c735b3df92ef2d847bc147c32928a8d88a2263\": rpc error: code = NotFound desc = could not find container \"f3f3ac91f6f842121f9e2511d0c735b3df92ef2d847bc147c32928a8d88a2263\": container with ID starting with f3f3ac91f6f842121f9e2511d0c735b3df92ef2d847bc147c32928a8d88a2263 not found: ID does not exist" Oct 11 02:15:53 crc kubenswrapper[4743]: I1011 02:15:53.573333 4743 scope.go:117] "RemoveContainer" containerID="5b06029ee204912de58571b8f31433ad8f9979c7ac2f5b73f58b23cb223232a6" Oct 11 02:15:53 crc kubenswrapper[4743]: E1011 02:15:53.573568 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b06029ee204912de58571b8f31433ad8f9979c7ac2f5b73f58b23cb223232a6\": container with ID starting with 5b06029ee204912de58571b8f31433ad8f9979c7ac2f5b73f58b23cb223232a6 not found: ID does not exist" containerID="5b06029ee204912de58571b8f31433ad8f9979c7ac2f5b73f58b23cb223232a6" Oct 11 02:15:53 crc kubenswrapper[4743]: I1011 02:15:53.573592 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b06029ee204912de58571b8f31433ad8f9979c7ac2f5b73f58b23cb223232a6"} err="failed to get container status \"5b06029ee204912de58571b8f31433ad8f9979c7ac2f5b73f58b23cb223232a6\": rpc error: code = NotFound desc = could not find container \"5b06029ee204912de58571b8f31433ad8f9979c7ac2f5b73f58b23cb223232a6\": container 
with ID starting with 5b06029ee204912de58571b8f31433ad8f9979c7ac2f5b73f58b23cb223232a6 not found: ID does not exist" Oct 11 02:15:54 crc kubenswrapper[4743]: I1011 02:15:54.106606 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79fd62ad-6966-460a-a08d-22af255e53ce" path="/var/lib/kubelet/pods/79fd62ad-6966-460a-a08d-22af255e53ce/volumes" Oct 11 02:15:56 crc kubenswrapper[4743]: I1011 02:15:56.275124 4743 scope.go:117] "RemoveContainer" containerID="f538fd74ba3a7de967eea3554782d8114f61dfb9b43b9ae2c7cd1a2ee0332969" Oct 11 02:16:01 crc kubenswrapper[4743]: I1011 02:16:01.092511 4743 scope.go:117] "RemoveContainer" containerID="8db380be5bc3ef378ff0592922955ef88122acec712f90fde34b6491175535db" Oct 11 02:16:01 crc kubenswrapper[4743]: E1011 02:16:01.093522 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:16:12 crc kubenswrapper[4743]: I1011 02:16:12.092363 4743 scope.go:117] "RemoveContainer" containerID="8db380be5bc3ef378ff0592922955ef88122acec712f90fde34b6491175535db" Oct 11 02:16:12 crc kubenswrapper[4743]: E1011 02:16:12.094116 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:16:24 crc kubenswrapper[4743]: I1011 02:16:24.093496 4743 scope.go:117] "RemoveContainer" 
containerID="8db380be5bc3ef378ff0592922955ef88122acec712f90fde34b6491175535db" Oct 11 02:16:24 crc kubenswrapper[4743]: E1011 02:16:24.094854 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:16:38 crc kubenswrapper[4743]: I1011 02:16:38.095240 4743 scope.go:117] "RemoveContainer" containerID="8db380be5bc3ef378ff0592922955ef88122acec712f90fde34b6491175535db" Oct 11 02:16:38 crc kubenswrapper[4743]: E1011 02:16:38.096631 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:16:53 crc kubenswrapper[4743]: I1011 02:16:53.091983 4743 scope.go:117] "RemoveContainer" containerID="8db380be5bc3ef378ff0592922955ef88122acec712f90fde34b6491175535db" Oct 11 02:16:53 crc kubenswrapper[4743]: E1011 02:16:53.092763 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:17:05 crc kubenswrapper[4743]: I1011 02:17:05.094052 4743 scope.go:117] 
"RemoveContainer" containerID="8db380be5bc3ef378ff0592922955ef88122acec712f90fde34b6491175535db" Oct 11 02:17:05 crc kubenswrapper[4743]: E1011 02:17:05.095051 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:17:19 crc kubenswrapper[4743]: I1011 02:17:19.093085 4743 scope.go:117] "RemoveContainer" containerID="8db380be5bc3ef378ff0592922955ef88122acec712f90fde34b6491175535db" Oct 11 02:17:19 crc kubenswrapper[4743]: E1011 02:17:19.094394 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:17:27 crc kubenswrapper[4743]: E1011 02:17:27.523247 4743 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.106:44342->38.102.83.106:39201: write tcp 38.102.83.106:44342->38.102.83.106:39201: write: broken pipe Oct 11 02:17:31 crc kubenswrapper[4743]: I1011 02:17:31.092421 4743 scope.go:117] "RemoveContainer" containerID="8db380be5bc3ef378ff0592922955ef88122acec712f90fde34b6491175535db" Oct 11 02:17:31 crc kubenswrapper[4743]: E1011 02:17:31.093932 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:17:43 crc kubenswrapper[4743]: I1011 02:17:43.092270 4743 scope.go:117] "RemoveContainer" containerID="8db380be5bc3ef378ff0592922955ef88122acec712f90fde34b6491175535db" Oct 11 02:17:43 crc kubenswrapper[4743]: E1011 02:17:43.093262 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:17:58 crc kubenswrapper[4743]: I1011 02:17:58.092220 4743 scope.go:117] "RemoveContainer" containerID="8db380be5bc3ef378ff0592922955ef88122acec712f90fde34b6491175535db" Oct 11 02:17:58 crc kubenswrapper[4743]: E1011 02:17:58.093090 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:18:13 crc kubenswrapper[4743]: I1011 02:18:13.092702 4743 scope.go:117] "RemoveContainer" containerID="8db380be5bc3ef378ff0592922955ef88122acec712f90fde34b6491175535db" Oct 11 02:18:13 crc kubenswrapper[4743]: E1011 02:18:13.093432 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:18:27 crc kubenswrapper[4743]: I1011 02:18:27.093203 4743 scope.go:117] "RemoveContainer" containerID="8db380be5bc3ef378ff0592922955ef88122acec712f90fde34b6491175535db" Oct 11 02:18:27 crc kubenswrapper[4743]: E1011 02:18:27.094540 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:18:41 crc kubenswrapper[4743]: I1011 02:18:41.092781 4743 scope.go:117] "RemoveContainer" containerID="8db380be5bc3ef378ff0592922955ef88122acec712f90fde34b6491175535db" Oct 11 02:18:41 crc kubenswrapper[4743]: E1011 02:18:41.094310 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:18:56 crc kubenswrapper[4743]: I1011 02:18:56.102390 4743 scope.go:117] "RemoveContainer" containerID="8db380be5bc3ef378ff0592922955ef88122acec712f90fde34b6491175535db" Oct 11 02:18:56 crc kubenswrapper[4743]: E1011 02:18:56.103537 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:19:10 crc kubenswrapper[4743]: I1011 02:19:10.092599 4743 scope.go:117] "RemoveContainer" containerID="8db380be5bc3ef378ff0592922955ef88122acec712f90fde34b6491175535db" Oct 11 02:19:10 crc kubenswrapper[4743]: E1011 02:19:10.093512 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:19:24 crc kubenswrapper[4743]: I1011 02:19:24.092649 4743 scope.go:117] "RemoveContainer" containerID="8db380be5bc3ef378ff0592922955ef88122acec712f90fde34b6491175535db" Oct 11 02:19:24 crc kubenswrapper[4743]: E1011 02:19:24.093571 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:19:35 crc kubenswrapper[4743]: I1011 02:19:35.091770 4743 scope.go:117] "RemoveContainer" containerID="8db380be5bc3ef378ff0592922955ef88122acec712f90fde34b6491175535db" Oct 11 02:19:35 crc kubenswrapper[4743]: E1011 02:19:35.092534 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:19:49 crc kubenswrapper[4743]: I1011 02:19:49.092984 4743 scope.go:117] "RemoveContainer" containerID="8db380be5bc3ef378ff0592922955ef88122acec712f90fde34b6491175535db" Oct 11 02:19:49 crc kubenswrapper[4743]: I1011 02:19:49.567415 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" event={"ID":"add92263-e252-446b-95de-092585b4357f","Type":"ContainerStarted","Data":"7be48f3a0d24dfda2d79ce91f5eb41137b96d1b2979c2ec2a35c21bcecafda99"} Oct 11 02:19:59 crc kubenswrapper[4743]: I1011 02:19:59.622726 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-scrmv"] Oct 11 02:19:59 crc kubenswrapper[4743]: E1011 02:19:59.625233 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79fd62ad-6966-460a-a08d-22af255e53ce" containerName="registry-server" Oct 11 02:19:59 crc kubenswrapper[4743]: I1011 02:19:59.625541 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="79fd62ad-6966-460a-a08d-22af255e53ce" containerName="registry-server" Oct 11 02:19:59 crc kubenswrapper[4743]: E1011 02:19:59.625634 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79fd62ad-6966-460a-a08d-22af255e53ce" containerName="extract-utilities" Oct 11 02:19:59 crc kubenswrapper[4743]: I1011 02:19:59.625706 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="79fd62ad-6966-460a-a08d-22af255e53ce" containerName="extract-utilities" Oct 11 02:19:59 crc kubenswrapper[4743]: E1011 02:19:59.625804 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93e84206-cdac-4dc5-8ceb-9140126d467c" containerName="extract-content" Oct 11 02:19:59 crc 
kubenswrapper[4743]: I1011 02:19:59.625900 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="93e84206-cdac-4dc5-8ceb-9140126d467c" containerName="extract-content" Oct 11 02:19:59 crc kubenswrapper[4743]: E1011 02:19:59.625996 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79fd62ad-6966-460a-a08d-22af255e53ce" containerName="extract-content" Oct 11 02:19:59 crc kubenswrapper[4743]: I1011 02:19:59.626082 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="79fd62ad-6966-460a-a08d-22af255e53ce" containerName="extract-content" Oct 11 02:19:59 crc kubenswrapper[4743]: E1011 02:19:59.626195 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93e84206-cdac-4dc5-8ceb-9140126d467c" containerName="extract-utilities" Oct 11 02:19:59 crc kubenswrapper[4743]: I1011 02:19:59.626281 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="93e84206-cdac-4dc5-8ceb-9140126d467c" containerName="extract-utilities" Oct 11 02:19:59 crc kubenswrapper[4743]: E1011 02:19:59.626381 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93e84206-cdac-4dc5-8ceb-9140126d467c" containerName="registry-server" Oct 11 02:19:59 crc kubenswrapper[4743]: I1011 02:19:59.626471 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="93e84206-cdac-4dc5-8ceb-9140126d467c" containerName="registry-server" Oct 11 02:19:59 crc kubenswrapper[4743]: I1011 02:19:59.626905 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="79fd62ad-6966-460a-a08d-22af255e53ce" containerName="registry-server" Oct 11 02:19:59 crc kubenswrapper[4743]: I1011 02:19:59.627045 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="93e84206-cdac-4dc5-8ceb-9140126d467c" containerName="registry-server" Oct 11 02:19:59 crc kubenswrapper[4743]: I1011 02:19:59.629057 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-scrmv" Oct 11 02:19:59 crc kubenswrapper[4743]: I1011 02:19:59.662920 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-scrmv"] Oct 11 02:19:59 crc kubenswrapper[4743]: I1011 02:19:59.810468 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6cb7120-39e0-4a16-abf8-401c9fe3c73a-catalog-content\") pod \"certified-operators-scrmv\" (UID: \"d6cb7120-39e0-4a16-abf8-401c9fe3c73a\") " pod="openshift-marketplace/certified-operators-scrmv" Oct 11 02:19:59 crc kubenswrapper[4743]: I1011 02:19:59.810559 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8gnk\" (UniqueName: \"kubernetes.io/projected/d6cb7120-39e0-4a16-abf8-401c9fe3c73a-kube-api-access-w8gnk\") pod \"certified-operators-scrmv\" (UID: \"d6cb7120-39e0-4a16-abf8-401c9fe3c73a\") " pod="openshift-marketplace/certified-operators-scrmv" Oct 11 02:19:59 crc kubenswrapper[4743]: I1011 02:19:59.810700 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6cb7120-39e0-4a16-abf8-401c9fe3c73a-utilities\") pod \"certified-operators-scrmv\" (UID: \"d6cb7120-39e0-4a16-abf8-401c9fe3c73a\") " pod="openshift-marketplace/certified-operators-scrmv" Oct 11 02:19:59 crc kubenswrapper[4743]: I1011 02:19:59.913101 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6cb7120-39e0-4a16-abf8-401c9fe3c73a-utilities\") pod \"certified-operators-scrmv\" (UID: \"d6cb7120-39e0-4a16-abf8-401c9fe3c73a\") " pod="openshift-marketplace/certified-operators-scrmv" Oct 11 02:19:59 crc kubenswrapper[4743]: I1011 02:19:59.913246 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6cb7120-39e0-4a16-abf8-401c9fe3c73a-catalog-content\") pod \"certified-operators-scrmv\" (UID: \"d6cb7120-39e0-4a16-abf8-401c9fe3c73a\") " pod="openshift-marketplace/certified-operators-scrmv" Oct 11 02:19:59 crc kubenswrapper[4743]: I1011 02:19:59.913295 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8gnk\" (UniqueName: \"kubernetes.io/projected/d6cb7120-39e0-4a16-abf8-401c9fe3c73a-kube-api-access-w8gnk\") pod \"certified-operators-scrmv\" (UID: \"d6cb7120-39e0-4a16-abf8-401c9fe3c73a\") " pod="openshift-marketplace/certified-operators-scrmv" Oct 11 02:19:59 crc kubenswrapper[4743]: I1011 02:19:59.913896 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6cb7120-39e0-4a16-abf8-401c9fe3c73a-catalog-content\") pod \"certified-operators-scrmv\" (UID: \"d6cb7120-39e0-4a16-abf8-401c9fe3c73a\") " pod="openshift-marketplace/certified-operators-scrmv" Oct 11 02:19:59 crc kubenswrapper[4743]: I1011 02:19:59.914238 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6cb7120-39e0-4a16-abf8-401c9fe3c73a-utilities\") pod \"certified-operators-scrmv\" (UID: \"d6cb7120-39e0-4a16-abf8-401c9fe3c73a\") " pod="openshift-marketplace/certified-operators-scrmv" Oct 11 02:19:59 crc kubenswrapper[4743]: I1011 02:19:59.950542 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8gnk\" (UniqueName: \"kubernetes.io/projected/d6cb7120-39e0-4a16-abf8-401c9fe3c73a-kube-api-access-w8gnk\") pod \"certified-operators-scrmv\" (UID: \"d6cb7120-39e0-4a16-abf8-401c9fe3c73a\") " pod="openshift-marketplace/certified-operators-scrmv" Oct 11 02:19:59 crc kubenswrapper[4743]: I1011 02:19:59.953798 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-scrmv" Oct 11 02:20:00 crc kubenswrapper[4743]: I1011 02:20:00.484553 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-scrmv"] Oct 11 02:20:00 crc kubenswrapper[4743]: I1011 02:20:00.695991 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-scrmv" event={"ID":"d6cb7120-39e0-4a16-abf8-401c9fe3c73a","Type":"ContainerStarted","Data":"70877aabac0d104771c12f854caa6aeee80d8937946cbf58779e842e2aa47276"} Oct 11 02:20:01 crc kubenswrapper[4743]: I1011 02:20:01.710162 4743 generic.go:334] "Generic (PLEG): container finished" podID="d6cb7120-39e0-4a16-abf8-401c9fe3c73a" containerID="d89a25cd9d0f0c1875fb1994fa4eb16182fea6b4c390bc6742920cf19391f9c4" exitCode=0 Oct 11 02:20:01 crc kubenswrapper[4743]: I1011 02:20:01.710261 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-scrmv" event={"ID":"d6cb7120-39e0-4a16-abf8-401c9fe3c73a","Type":"ContainerDied","Data":"d89a25cd9d0f0c1875fb1994fa4eb16182fea6b4c390bc6742920cf19391f9c4"} Oct 11 02:20:01 crc kubenswrapper[4743]: I1011 02:20:01.713098 4743 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 11 02:20:03 crc kubenswrapper[4743]: I1011 02:20:03.736653 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-scrmv" event={"ID":"d6cb7120-39e0-4a16-abf8-401c9fe3c73a","Type":"ContainerStarted","Data":"b3aa5dc5afad50a22ffaf8ba07035118358e37b6d8ee1ce6546f9d537bf787e2"} Oct 11 02:20:04 crc kubenswrapper[4743]: I1011 02:20:04.752398 4743 generic.go:334] "Generic (PLEG): container finished" podID="d6cb7120-39e0-4a16-abf8-401c9fe3c73a" containerID="b3aa5dc5afad50a22ffaf8ba07035118358e37b6d8ee1ce6546f9d537bf787e2" exitCode=0 Oct 11 02:20:04 crc kubenswrapper[4743]: I1011 02:20:04.752516 4743 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-scrmv" event={"ID":"d6cb7120-39e0-4a16-abf8-401c9fe3c73a","Type":"ContainerDied","Data":"b3aa5dc5afad50a22ffaf8ba07035118358e37b6d8ee1ce6546f9d537bf787e2"} Oct 11 02:20:05 crc kubenswrapper[4743]: I1011 02:20:05.768947 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-scrmv" event={"ID":"d6cb7120-39e0-4a16-abf8-401c9fe3c73a","Type":"ContainerStarted","Data":"5da3b417670bd0f31005bf31963ee54c15c6d940dca0203bd97bf251ad278f59"} Oct 11 02:20:05 crc kubenswrapper[4743]: I1011 02:20:05.797115 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-scrmv" podStartSLOduration=3.3422955610000002 podStartE2EDuration="6.797092682s" podCreationTimestamp="2025-10-11 02:19:59 +0000 UTC" firstStartedPulling="2025-10-11 02:20:01.712422306 +0000 UTC m=+5296.365402743" lastFinishedPulling="2025-10-11 02:20:05.167219457 +0000 UTC m=+5299.820199864" observedRunningTime="2025-10-11 02:20:05.786812396 +0000 UTC m=+5300.439792793" watchObservedRunningTime="2025-10-11 02:20:05.797092682 +0000 UTC m=+5300.450073079" Oct 11 02:20:09 crc kubenswrapper[4743]: I1011 02:20:09.954991 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-scrmv" Oct 11 02:20:09 crc kubenswrapper[4743]: I1011 02:20:09.956842 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-scrmv" Oct 11 02:20:10 crc kubenswrapper[4743]: I1011 02:20:10.034486 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-scrmv" Oct 11 02:20:10 crc kubenswrapper[4743]: I1011 02:20:10.878727 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-scrmv" Oct 11 02:20:10 crc kubenswrapper[4743]: I1011 
02:20:10.938271 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-scrmv"] Oct 11 02:20:12 crc kubenswrapper[4743]: I1011 02:20:12.848158 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-scrmv" podUID="d6cb7120-39e0-4a16-abf8-401c9fe3c73a" containerName="registry-server" containerID="cri-o://5da3b417670bd0f31005bf31963ee54c15c6d940dca0203bd97bf251ad278f59" gracePeriod=2 Oct 11 02:20:13 crc kubenswrapper[4743]: I1011 02:20:13.383286 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-scrmv" Oct 11 02:20:13 crc kubenswrapper[4743]: I1011 02:20:13.447134 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6cb7120-39e0-4a16-abf8-401c9fe3c73a-catalog-content\") pod \"d6cb7120-39e0-4a16-abf8-401c9fe3c73a\" (UID: \"d6cb7120-39e0-4a16-abf8-401c9fe3c73a\") " Oct 11 02:20:13 crc kubenswrapper[4743]: I1011 02:20:13.447211 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6cb7120-39e0-4a16-abf8-401c9fe3c73a-utilities\") pod \"d6cb7120-39e0-4a16-abf8-401c9fe3c73a\" (UID: \"d6cb7120-39e0-4a16-abf8-401c9fe3c73a\") " Oct 11 02:20:13 crc kubenswrapper[4743]: I1011 02:20:13.447235 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8gnk\" (UniqueName: \"kubernetes.io/projected/d6cb7120-39e0-4a16-abf8-401c9fe3c73a-kube-api-access-w8gnk\") pod \"d6cb7120-39e0-4a16-abf8-401c9fe3c73a\" (UID: \"d6cb7120-39e0-4a16-abf8-401c9fe3c73a\") " Oct 11 02:20:13 crc kubenswrapper[4743]: I1011 02:20:13.448955 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6cb7120-39e0-4a16-abf8-401c9fe3c73a-utilities" (OuterVolumeSpecName: 
"utilities") pod "d6cb7120-39e0-4a16-abf8-401c9fe3c73a" (UID: "d6cb7120-39e0-4a16-abf8-401c9fe3c73a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 02:20:13 crc kubenswrapper[4743]: I1011 02:20:13.454219 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6cb7120-39e0-4a16-abf8-401c9fe3c73a-kube-api-access-w8gnk" (OuterVolumeSpecName: "kube-api-access-w8gnk") pod "d6cb7120-39e0-4a16-abf8-401c9fe3c73a" (UID: "d6cb7120-39e0-4a16-abf8-401c9fe3c73a"). InnerVolumeSpecName "kube-api-access-w8gnk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 02:20:13 crc kubenswrapper[4743]: I1011 02:20:13.550567 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8gnk\" (UniqueName: \"kubernetes.io/projected/d6cb7120-39e0-4a16-abf8-401c9fe3c73a-kube-api-access-w8gnk\") on node \"crc\" DevicePath \"\"" Oct 11 02:20:13 crc kubenswrapper[4743]: I1011 02:20:13.550598 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6cb7120-39e0-4a16-abf8-401c9fe3c73a-utilities\") on node \"crc\" DevicePath \"\"" Oct 11 02:20:13 crc kubenswrapper[4743]: I1011 02:20:13.863406 4743 generic.go:334] "Generic (PLEG): container finished" podID="d6cb7120-39e0-4a16-abf8-401c9fe3c73a" containerID="5da3b417670bd0f31005bf31963ee54c15c6d940dca0203bd97bf251ad278f59" exitCode=0 Oct 11 02:20:13 crc kubenswrapper[4743]: I1011 02:20:13.863456 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-scrmv" event={"ID":"d6cb7120-39e0-4a16-abf8-401c9fe3c73a","Type":"ContainerDied","Data":"5da3b417670bd0f31005bf31963ee54c15c6d940dca0203bd97bf251ad278f59"} Oct 11 02:20:13 crc kubenswrapper[4743]: I1011 02:20:13.863483 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-scrmv" 
event={"ID":"d6cb7120-39e0-4a16-abf8-401c9fe3c73a","Type":"ContainerDied","Data":"70877aabac0d104771c12f854caa6aeee80d8937946cbf58779e842e2aa47276"} Oct 11 02:20:13 crc kubenswrapper[4743]: I1011 02:20:13.863498 4743 scope.go:117] "RemoveContainer" containerID="5da3b417670bd0f31005bf31963ee54c15c6d940dca0203bd97bf251ad278f59" Oct 11 02:20:13 crc kubenswrapper[4743]: I1011 02:20:13.863633 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-scrmv" Oct 11 02:20:13 crc kubenswrapper[4743]: I1011 02:20:13.923974 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6cb7120-39e0-4a16-abf8-401c9fe3c73a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d6cb7120-39e0-4a16-abf8-401c9fe3c73a" (UID: "d6cb7120-39e0-4a16-abf8-401c9fe3c73a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 02:20:13 crc kubenswrapper[4743]: I1011 02:20:13.961575 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6cb7120-39e0-4a16-abf8-401c9fe3c73a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 11 02:20:14 crc kubenswrapper[4743]: I1011 02:20:14.527771 4743 scope.go:117] "RemoveContainer" containerID="b3aa5dc5afad50a22ffaf8ba07035118358e37b6d8ee1ce6546f9d537bf787e2" Oct 11 02:20:14 crc kubenswrapper[4743]: I1011 02:20:14.536673 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-scrmv"] Oct 11 02:20:14 crc kubenswrapper[4743]: I1011 02:20:14.550790 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-scrmv"] Oct 11 02:20:14 crc kubenswrapper[4743]: I1011 02:20:14.555260 4743 scope.go:117] "RemoveContainer" containerID="d89a25cd9d0f0c1875fb1994fa4eb16182fea6b4c390bc6742920cf19391f9c4" Oct 11 02:20:14 crc kubenswrapper[4743]: I1011 
02:20:14.605840 4743 scope.go:117] "RemoveContainer" containerID="5da3b417670bd0f31005bf31963ee54c15c6d940dca0203bd97bf251ad278f59" Oct 11 02:20:14 crc kubenswrapper[4743]: E1011 02:20:14.606365 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5da3b417670bd0f31005bf31963ee54c15c6d940dca0203bd97bf251ad278f59\": container with ID starting with 5da3b417670bd0f31005bf31963ee54c15c6d940dca0203bd97bf251ad278f59 not found: ID does not exist" containerID="5da3b417670bd0f31005bf31963ee54c15c6d940dca0203bd97bf251ad278f59" Oct 11 02:20:14 crc kubenswrapper[4743]: I1011 02:20:14.606396 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5da3b417670bd0f31005bf31963ee54c15c6d940dca0203bd97bf251ad278f59"} err="failed to get container status \"5da3b417670bd0f31005bf31963ee54c15c6d940dca0203bd97bf251ad278f59\": rpc error: code = NotFound desc = could not find container \"5da3b417670bd0f31005bf31963ee54c15c6d940dca0203bd97bf251ad278f59\": container with ID starting with 5da3b417670bd0f31005bf31963ee54c15c6d940dca0203bd97bf251ad278f59 not found: ID does not exist" Oct 11 02:20:14 crc kubenswrapper[4743]: I1011 02:20:14.606416 4743 scope.go:117] "RemoveContainer" containerID="b3aa5dc5afad50a22ffaf8ba07035118358e37b6d8ee1ce6546f9d537bf787e2" Oct 11 02:20:14 crc kubenswrapper[4743]: E1011 02:20:14.606673 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3aa5dc5afad50a22ffaf8ba07035118358e37b6d8ee1ce6546f9d537bf787e2\": container with ID starting with b3aa5dc5afad50a22ffaf8ba07035118358e37b6d8ee1ce6546f9d537bf787e2 not found: ID does not exist" containerID="b3aa5dc5afad50a22ffaf8ba07035118358e37b6d8ee1ce6546f9d537bf787e2" Oct 11 02:20:14 crc kubenswrapper[4743]: I1011 02:20:14.606700 4743 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b3aa5dc5afad50a22ffaf8ba07035118358e37b6d8ee1ce6546f9d537bf787e2"} err="failed to get container status \"b3aa5dc5afad50a22ffaf8ba07035118358e37b6d8ee1ce6546f9d537bf787e2\": rpc error: code = NotFound desc = could not find container \"b3aa5dc5afad50a22ffaf8ba07035118358e37b6d8ee1ce6546f9d537bf787e2\": container with ID starting with b3aa5dc5afad50a22ffaf8ba07035118358e37b6d8ee1ce6546f9d537bf787e2 not found: ID does not exist" Oct 11 02:20:14 crc kubenswrapper[4743]: I1011 02:20:14.606714 4743 scope.go:117] "RemoveContainer" containerID="d89a25cd9d0f0c1875fb1994fa4eb16182fea6b4c390bc6742920cf19391f9c4" Oct 11 02:20:14 crc kubenswrapper[4743]: E1011 02:20:14.606969 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d89a25cd9d0f0c1875fb1994fa4eb16182fea6b4c390bc6742920cf19391f9c4\": container with ID starting with d89a25cd9d0f0c1875fb1994fa4eb16182fea6b4c390bc6742920cf19391f9c4 not found: ID does not exist" containerID="d89a25cd9d0f0c1875fb1994fa4eb16182fea6b4c390bc6742920cf19391f9c4" Oct 11 02:20:14 crc kubenswrapper[4743]: I1011 02:20:14.606988 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d89a25cd9d0f0c1875fb1994fa4eb16182fea6b4c390bc6742920cf19391f9c4"} err="failed to get container status \"d89a25cd9d0f0c1875fb1994fa4eb16182fea6b4c390bc6742920cf19391f9c4\": rpc error: code = NotFound desc = could not find container \"d89a25cd9d0f0c1875fb1994fa4eb16182fea6b4c390bc6742920cf19391f9c4\": container with ID starting with d89a25cd9d0f0c1875fb1994fa4eb16182fea6b4c390bc6742920cf19391f9c4 not found: ID does not exist" Oct 11 02:20:16 crc kubenswrapper[4743]: I1011 02:20:16.121227 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6cb7120-39e0-4a16-abf8-401c9fe3c73a" path="/var/lib/kubelet/pods/d6cb7120-39e0-4a16-abf8-401c9fe3c73a/volumes" Oct 11 02:20:57 crc kubenswrapper[4743]: I1011 
02:20:57.377285 4743 scope.go:117] "RemoveContainer" containerID="b2c369928f410101837522bc32c66d6b74e1bdd10e36e3fb8151ec7c3e6204d6" Oct 11 02:20:57 crc kubenswrapper[4743]: I1011 02:20:57.421055 4743 scope.go:117] "RemoveContainer" containerID="a72a4701eed12e388452b7517ac3a163357622b596579916f96ae0522a2d3a13" Oct 11 02:20:57 crc kubenswrapper[4743]: I1011 02:20:57.458384 4743 scope.go:117] "RemoveContainer" containerID="bc0866e6913db317f2f29bd97cda5b25233b0c7d6b80b7898008f1384be3012b" Oct 11 02:22:14 crc kubenswrapper[4743]: I1011 02:22:14.458508 4743 patch_prober.go:28] interesting pod/machine-config-daemon-cvm72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 02:22:14 crc kubenswrapper[4743]: I1011 02:22:14.459285 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 02:22:44 crc kubenswrapper[4743]: I1011 02:22:44.458656 4743 patch_prober.go:28] interesting pod/machine-config-daemon-cvm72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 02:22:44 crc kubenswrapper[4743]: I1011 02:22:44.459347 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 02:23:14 
crc kubenswrapper[4743]: I1011 02:23:14.077847 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-create-qfwq7"] Oct 11 02:23:14 crc kubenswrapper[4743]: I1011 02:23:14.090839 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-create-qfwq7"] Oct 11 02:23:14 crc kubenswrapper[4743]: I1011 02:23:14.130297 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0f86bb5-1ce9-44d0-a03a-ef624592cdc4" path="/var/lib/kubelet/pods/c0f86bb5-1ce9-44d0-a03a-ef624592cdc4/volumes" Oct 11 02:23:14 crc kubenswrapper[4743]: I1011 02:23:14.458311 4743 patch_prober.go:28] interesting pod/machine-config-daemon-cvm72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 02:23:14 crc kubenswrapper[4743]: I1011 02:23:14.458701 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 02:23:14 crc kubenswrapper[4743]: I1011 02:23:14.458757 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" Oct 11 02:23:14 crc kubenswrapper[4743]: I1011 02:23:14.459848 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7be48f3a0d24dfda2d79ce91f5eb41137b96d1b2979c2ec2a35c21bcecafda99"} pod="openshift-machine-config-operator/machine-config-daemon-cvm72" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 11 02:23:14 crc kubenswrapper[4743]: I1011 
02:23:14.459996 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" containerID="cri-o://7be48f3a0d24dfda2d79ce91f5eb41137b96d1b2979c2ec2a35c21bcecafda99" gracePeriod=600 Oct 11 02:23:15 crc kubenswrapper[4743]: I1011 02:23:15.156907 4743 generic.go:334] "Generic (PLEG): container finished" podID="add92263-e252-446b-95de-092585b4357f" containerID="7be48f3a0d24dfda2d79ce91f5eb41137b96d1b2979c2ec2a35c21bcecafda99" exitCode=0 Oct 11 02:23:15 crc kubenswrapper[4743]: I1011 02:23:15.156974 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" event={"ID":"add92263-e252-446b-95de-092585b4357f","Type":"ContainerDied","Data":"7be48f3a0d24dfda2d79ce91f5eb41137b96d1b2979c2ec2a35c21bcecafda99"} Oct 11 02:23:15 crc kubenswrapper[4743]: I1011 02:23:15.157294 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" event={"ID":"add92263-e252-446b-95de-092585b4357f","Type":"ContainerStarted","Data":"2e5d27e1167f6676643c09a4d46ec4454b12980bd38f5695ef580ca9ee0e294d"} Oct 11 02:23:15 crc kubenswrapper[4743]: I1011 02:23:15.157329 4743 scope.go:117] "RemoveContainer" containerID="8db380be5bc3ef378ff0592922955ef88122acec712f90fde34b6491175535db" Oct 11 02:23:34 crc kubenswrapper[4743]: I1011 02:23:34.051340 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-d886-account-create-4g9gc"] Oct 11 02:23:34 crc kubenswrapper[4743]: I1011 02:23:34.067761 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-d886-account-create-4g9gc"] Oct 11 02:23:34 crc kubenswrapper[4743]: I1011 02:23:34.114793 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a02defd0-0720-4bc9-a2ba-c8262b4b4432" 
path="/var/lib/kubelet/pods/a02defd0-0720-4bc9-a2ba-c8262b4b4432/volumes" Oct 11 02:23:54 crc kubenswrapper[4743]: I1011 02:23:54.037605 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-sync-7s99b"] Oct 11 02:23:54 crc kubenswrapper[4743]: I1011 02:23:54.054565 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-sync-7s99b"] Oct 11 02:23:54 crc kubenswrapper[4743]: I1011 02:23:54.108146 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b37ccbe6-6cf3-4ee2-ae8d-a6c4acc66595" path="/var/lib/kubelet/pods/b37ccbe6-6cf3-4ee2-ae8d-a6c4acc66595/volumes" Oct 11 02:23:57 crc kubenswrapper[4743]: I1011 02:23:57.593151 4743 scope.go:117] "RemoveContainer" containerID="c0c28bfcf170da34a692035c0335b91c84848516661ea1b2a7965f3180fdd070" Oct 11 02:23:57 crc kubenswrapper[4743]: I1011 02:23:57.630484 4743 scope.go:117] "RemoveContainer" containerID="94f1700a2ee279a090b01900d8d39c3b6841a8bd9bee1f675dddcb424b5a0fbc" Oct 11 02:23:57 crc kubenswrapper[4743]: I1011 02:23:57.710461 4743 scope.go:117] "RemoveContainer" containerID="c20d4ec301e1c2892e1e29c4f4911cf5b7ec93056223930f00414901e82ddc0d" Oct 11 02:25:14 crc kubenswrapper[4743]: I1011 02:25:14.458368 4743 patch_prober.go:28] interesting pod/machine-config-daemon-cvm72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 02:25:14 crc kubenswrapper[4743]: I1011 02:25:14.458959 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 02:25:44 crc kubenswrapper[4743]: I1011 02:25:44.457896 4743 
patch_prober.go:28] interesting pod/machine-config-daemon-cvm72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 02:25:44 crc kubenswrapper[4743]: I1011 02:25:44.458641 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 02:26:14 crc kubenswrapper[4743]: I1011 02:26:14.458773 4743 patch_prober.go:28] interesting pod/machine-config-daemon-cvm72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 02:26:14 crc kubenswrapper[4743]: I1011 02:26:14.459470 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 02:26:14 crc kubenswrapper[4743]: I1011 02:26:14.459536 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" Oct 11 02:26:14 crc kubenswrapper[4743]: I1011 02:26:14.460632 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2e5d27e1167f6676643c09a4d46ec4454b12980bd38f5695ef580ca9ee0e294d"} pod="openshift-machine-config-operator/machine-config-daemon-cvm72" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 11 02:26:14 crc kubenswrapper[4743]: I1011 02:26:14.460730 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" containerID="cri-o://2e5d27e1167f6676643c09a4d46ec4454b12980bd38f5695ef580ca9ee0e294d" gracePeriod=600 Oct 11 02:26:14 crc kubenswrapper[4743]: E1011 02:26:14.582496 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:26:15 crc kubenswrapper[4743]: I1011 02:26:15.284802 4743 generic.go:334] "Generic (PLEG): container finished" podID="add92263-e252-446b-95de-092585b4357f" containerID="2e5d27e1167f6676643c09a4d46ec4454b12980bd38f5695ef580ca9ee0e294d" exitCode=0 Oct 11 02:26:15 crc kubenswrapper[4743]: I1011 02:26:15.284913 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" event={"ID":"add92263-e252-446b-95de-092585b4357f","Type":"ContainerDied","Data":"2e5d27e1167f6676643c09a4d46ec4454b12980bd38f5695ef580ca9ee0e294d"} Oct 11 02:26:15 crc kubenswrapper[4743]: I1011 02:26:15.285325 4743 scope.go:117] "RemoveContainer" containerID="7be48f3a0d24dfda2d79ce91f5eb41137b96d1b2979c2ec2a35c21bcecafda99" Oct 11 02:26:15 crc kubenswrapper[4743]: I1011 02:26:15.287064 4743 scope.go:117] "RemoveContainer" containerID="2e5d27e1167f6676643c09a4d46ec4454b12980bd38f5695ef580ca9ee0e294d" Oct 11 02:26:15 crc kubenswrapper[4743]: E1011 02:26:15.287761 4743 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:26:27 crc kubenswrapper[4743]: I1011 02:26:27.092454 4743 scope.go:117] "RemoveContainer" containerID="2e5d27e1167f6676643c09a4d46ec4454b12980bd38f5695ef580ca9ee0e294d" Oct 11 02:26:27 crc kubenswrapper[4743]: E1011 02:26:27.093451 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:26:33 crc kubenswrapper[4743]: I1011 02:26:33.715047 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jtx96"] Oct 11 02:26:33 crc kubenswrapper[4743]: E1011 02:26:33.716328 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6cb7120-39e0-4a16-abf8-401c9fe3c73a" containerName="extract-utilities" Oct 11 02:26:33 crc kubenswrapper[4743]: I1011 02:26:33.716350 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6cb7120-39e0-4a16-abf8-401c9fe3c73a" containerName="extract-utilities" Oct 11 02:26:33 crc kubenswrapper[4743]: E1011 02:26:33.716383 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6cb7120-39e0-4a16-abf8-401c9fe3c73a" containerName="registry-server" Oct 11 02:26:33 crc kubenswrapper[4743]: I1011 02:26:33.716393 4743 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d6cb7120-39e0-4a16-abf8-401c9fe3c73a" containerName="registry-server" Oct 11 02:26:33 crc kubenswrapper[4743]: E1011 02:26:33.716437 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6cb7120-39e0-4a16-abf8-401c9fe3c73a" containerName="extract-content" Oct 11 02:26:33 crc kubenswrapper[4743]: I1011 02:26:33.716449 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6cb7120-39e0-4a16-abf8-401c9fe3c73a" containerName="extract-content" Oct 11 02:26:33 crc kubenswrapper[4743]: I1011 02:26:33.716756 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6cb7120-39e0-4a16-abf8-401c9fe3c73a" containerName="registry-server" Oct 11 02:26:33 crc kubenswrapper[4743]: I1011 02:26:33.730852 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jtx96" Oct 11 02:26:33 crc kubenswrapper[4743]: I1011 02:26:33.780415 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jtx96"] Oct 11 02:26:33 crc kubenswrapper[4743]: I1011 02:26:33.882012 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81708c78-25fa-4f77-aaac-78cde85d2114-utilities\") pod \"redhat-operators-jtx96\" (UID: \"81708c78-25fa-4f77-aaac-78cde85d2114\") " pod="openshift-marketplace/redhat-operators-jtx96" Oct 11 02:26:33 crc kubenswrapper[4743]: I1011 02:26:33.882152 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktjc8\" (UniqueName: \"kubernetes.io/projected/81708c78-25fa-4f77-aaac-78cde85d2114-kube-api-access-ktjc8\") pod \"redhat-operators-jtx96\" (UID: \"81708c78-25fa-4f77-aaac-78cde85d2114\") " pod="openshift-marketplace/redhat-operators-jtx96" Oct 11 02:26:33 crc kubenswrapper[4743]: I1011 02:26:33.882279 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81708c78-25fa-4f77-aaac-78cde85d2114-catalog-content\") pod \"redhat-operators-jtx96\" (UID: \"81708c78-25fa-4f77-aaac-78cde85d2114\") " pod="openshift-marketplace/redhat-operators-jtx96" Oct 11 02:26:33 crc kubenswrapper[4743]: I1011 02:26:33.984008 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktjc8\" (UniqueName: \"kubernetes.io/projected/81708c78-25fa-4f77-aaac-78cde85d2114-kube-api-access-ktjc8\") pod \"redhat-operators-jtx96\" (UID: \"81708c78-25fa-4f77-aaac-78cde85d2114\") " pod="openshift-marketplace/redhat-operators-jtx96" Oct 11 02:26:33 crc kubenswrapper[4743]: I1011 02:26:33.984197 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81708c78-25fa-4f77-aaac-78cde85d2114-catalog-content\") pod \"redhat-operators-jtx96\" (UID: \"81708c78-25fa-4f77-aaac-78cde85d2114\") " pod="openshift-marketplace/redhat-operators-jtx96" Oct 11 02:26:33 crc kubenswrapper[4743]: I1011 02:26:33.984358 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81708c78-25fa-4f77-aaac-78cde85d2114-utilities\") pod \"redhat-operators-jtx96\" (UID: \"81708c78-25fa-4f77-aaac-78cde85d2114\") " pod="openshift-marketplace/redhat-operators-jtx96" Oct 11 02:26:33 crc kubenswrapper[4743]: I1011 02:26:33.984707 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81708c78-25fa-4f77-aaac-78cde85d2114-catalog-content\") pod \"redhat-operators-jtx96\" (UID: \"81708c78-25fa-4f77-aaac-78cde85d2114\") " pod="openshift-marketplace/redhat-operators-jtx96" Oct 11 02:26:33 crc kubenswrapper[4743]: I1011 02:26:33.984823 4743 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81708c78-25fa-4f77-aaac-78cde85d2114-utilities\") pod \"redhat-operators-jtx96\" (UID: \"81708c78-25fa-4f77-aaac-78cde85d2114\") " pod="openshift-marketplace/redhat-operators-jtx96" Oct 11 02:26:34 crc kubenswrapper[4743]: I1011 02:26:34.007607 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktjc8\" (UniqueName: \"kubernetes.io/projected/81708c78-25fa-4f77-aaac-78cde85d2114-kube-api-access-ktjc8\") pod \"redhat-operators-jtx96\" (UID: \"81708c78-25fa-4f77-aaac-78cde85d2114\") " pod="openshift-marketplace/redhat-operators-jtx96" Oct 11 02:26:34 crc kubenswrapper[4743]: I1011 02:26:34.091526 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jtx96" Oct 11 02:26:34 crc kubenswrapper[4743]: I1011 02:26:34.589239 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jtx96"] Oct 11 02:26:35 crc kubenswrapper[4743]: I1011 02:26:35.556280 4743 generic.go:334] "Generic (PLEG): container finished" podID="81708c78-25fa-4f77-aaac-78cde85d2114" containerID="a43d11181f08a50e4a207a5eb196e5fdbccde6aba98a6019ce74f8c591f28412" exitCode=0 Oct 11 02:26:35 crc kubenswrapper[4743]: I1011 02:26:35.556375 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jtx96" event={"ID":"81708c78-25fa-4f77-aaac-78cde85d2114","Type":"ContainerDied","Data":"a43d11181f08a50e4a207a5eb196e5fdbccde6aba98a6019ce74f8c591f28412"} Oct 11 02:26:35 crc kubenswrapper[4743]: I1011 02:26:35.556585 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jtx96" event={"ID":"81708c78-25fa-4f77-aaac-78cde85d2114","Type":"ContainerStarted","Data":"e682ef06882aea34d48276a01d2271347072aa9eb18426030ebf33ebf8639f56"} Oct 11 02:26:35 crc kubenswrapper[4743]: I1011 02:26:35.558609 4743 provider.go:102] Refreshing cache 
for provider: *credentialprovider.defaultDockerConfigProvider Oct 11 02:26:36 crc kubenswrapper[4743]: I1011 02:26:36.568274 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jtx96" event={"ID":"81708c78-25fa-4f77-aaac-78cde85d2114","Type":"ContainerStarted","Data":"3d0cc4ad4b26152e6ee5dd704b62edefd41bac06354671902dbfe82033c084a2"} Oct 11 02:26:38 crc kubenswrapper[4743]: I1011 02:26:38.095836 4743 scope.go:117] "RemoveContainer" containerID="2e5d27e1167f6676643c09a4d46ec4454b12980bd38f5695ef580ca9ee0e294d" Oct 11 02:26:38 crc kubenswrapper[4743]: E1011 02:26:38.099498 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:26:40 crc kubenswrapper[4743]: I1011 02:26:40.620737 4743 generic.go:334] "Generic (PLEG): container finished" podID="81708c78-25fa-4f77-aaac-78cde85d2114" containerID="3d0cc4ad4b26152e6ee5dd704b62edefd41bac06354671902dbfe82033c084a2" exitCode=0 Oct 11 02:26:40 crc kubenswrapper[4743]: I1011 02:26:40.620809 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jtx96" event={"ID":"81708c78-25fa-4f77-aaac-78cde85d2114","Type":"ContainerDied","Data":"3d0cc4ad4b26152e6ee5dd704b62edefd41bac06354671902dbfe82033c084a2"} Oct 11 02:26:41 crc kubenswrapper[4743]: I1011 02:26:41.636507 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jtx96" event={"ID":"81708c78-25fa-4f77-aaac-78cde85d2114","Type":"ContainerStarted","Data":"ac6ee0386097c7bd0b56f1e58fcb54edf9440eaafdbffc087aa7015390839793"} Oct 11 02:26:41 crc kubenswrapper[4743]: I1011 
02:26:41.682083 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jtx96" podStartSLOduration=3.202855947 podStartE2EDuration="8.682052498s" podCreationTimestamp="2025-10-11 02:26:33 +0000 UTC" firstStartedPulling="2025-10-11 02:26:35.558397628 +0000 UTC m=+5690.211378025" lastFinishedPulling="2025-10-11 02:26:41.037594179 +0000 UTC m=+5695.690574576" observedRunningTime="2025-10-11 02:26:41.662346397 +0000 UTC m=+5696.315326804" watchObservedRunningTime="2025-10-11 02:26:41.682052498 +0000 UTC m=+5696.335032935" Oct 11 02:26:44 crc kubenswrapper[4743]: I1011 02:26:44.113136 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jtx96" Oct 11 02:26:44 crc kubenswrapper[4743]: I1011 02:26:44.114111 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jtx96" Oct 11 02:26:45 crc kubenswrapper[4743]: I1011 02:26:45.158318 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jtx96" podUID="81708c78-25fa-4f77-aaac-78cde85d2114" containerName="registry-server" probeResult="failure" output=< Oct 11 02:26:45 crc kubenswrapper[4743]: timeout: failed to connect service ":50051" within 1s Oct 11 02:26:45 crc kubenswrapper[4743]: > Oct 11 02:26:49 crc kubenswrapper[4743]: I1011 02:26:49.092527 4743 scope.go:117] "RemoveContainer" containerID="2e5d27e1167f6676643c09a4d46ec4454b12980bd38f5695ef580ca9ee0e294d" Oct 11 02:26:49 crc kubenswrapper[4743]: E1011 02:26:49.093309 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" 
podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:26:55 crc kubenswrapper[4743]: I1011 02:26:55.159311 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jtx96" podUID="81708c78-25fa-4f77-aaac-78cde85d2114" containerName="registry-server" probeResult="failure" output=< Oct 11 02:26:55 crc kubenswrapper[4743]: timeout: failed to connect service ":50051" within 1s Oct 11 02:26:55 crc kubenswrapper[4743]: > Oct 11 02:27:02 crc kubenswrapper[4743]: I1011 02:27:02.091829 4743 scope.go:117] "RemoveContainer" containerID="2e5d27e1167f6676643c09a4d46ec4454b12980bd38f5695ef580ca9ee0e294d" Oct 11 02:27:02 crc kubenswrapper[4743]: E1011 02:27:02.092889 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:27:04 crc kubenswrapper[4743]: I1011 02:27:04.166665 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jtx96" Oct 11 02:27:04 crc kubenswrapper[4743]: I1011 02:27:04.268930 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jtx96" Oct 11 02:27:04 crc kubenswrapper[4743]: I1011 02:27:04.920973 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jtx96"] Oct 11 02:27:05 crc kubenswrapper[4743]: I1011 02:27:05.930611 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jtx96" podUID="81708c78-25fa-4f77-aaac-78cde85d2114" containerName="registry-server" 
containerID="cri-o://ac6ee0386097c7bd0b56f1e58fcb54edf9440eaafdbffc087aa7015390839793" gracePeriod=2 Oct 11 02:27:06 crc kubenswrapper[4743]: I1011 02:27:06.949455 4743 generic.go:334] "Generic (PLEG): container finished" podID="81708c78-25fa-4f77-aaac-78cde85d2114" containerID="ac6ee0386097c7bd0b56f1e58fcb54edf9440eaafdbffc087aa7015390839793" exitCode=0 Oct 11 02:27:06 crc kubenswrapper[4743]: I1011 02:27:06.949529 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jtx96" event={"ID":"81708c78-25fa-4f77-aaac-78cde85d2114","Type":"ContainerDied","Data":"ac6ee0386097c7bd0b56f1e58fcb54edf9440eaafdbffc087aa7015390839793"} Oct 11 02:27:06 crc kubenswrapper[4743]: I1011 02:27:06.950926 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jtx96" event={"ID":"81708c78-25fa-4f77-aaac-78cde85d2114","Type":"ContainerDied","Data":"e682ef06882aea34d48276a01d2271347072aa9eb18426030ebf33ebf8639f56"} Oct 11 02:27:06 crc kubenswrapper[4743]: I1011 02:27:06.951008 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e682ef06882aea34d48276a01d2271347072aa9eb18426030ebf33ebf8639f56" Oct 11 02:27:07 crc kubenswrapper[4743]: I1011 02:27:07.096698 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jtx96" Oct 11 02:27:07 crc kubenswrapper[4743]: I1011 02:27:07.199655 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81708c78-25fa-4f77-aaac-78cde85d2114-utilities\") pod \"81708c78-25fa-4f77-aaac-78cde85d2114\" (UID: \"81708c78-25fa-4f77-aaac-78cde85d2114\") " Oct 11 02:27:07 crc kubenswrapper[4743]: I1011 02:27:07.199810 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81708c78-25fa-4f77-aaac-78cde85d2114-catalog-content\") pod \"81708c78-25fa-4f77-aaac-78cde85d2114\" (UID: \"81708c78-25fa-4f77-aaac-78cde85d2114\") " Oct 11 02:27:07 crc kubenswrapper[4743]: I1011 02:27:07.199899 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktjc8\" (UniqueName: \"kubernetes.io/projected/81708c78-25fa-4f77-aaac-78cde85d2114-kube-api-access-ktjc8\") pod \"81708c78-25fa-4f77-aaac-78cde85d2114\" (UID: \"81708c78-25fa-4f77-aaac-78cde85d2114\") " Oct 11 02:27:07 crc kubenswrapper[4743]: I1011 02:27:07.202441 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81708c78-25fa-4f77-aaac-78cde85d2114-utilities" (OuterVolumeSpecName: "utilities") pod "81708c78-25fa-4f77-aaac-78cde85d2114" (UID: "81708c78-25fa-4f77-aaac-78cde85d2114"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 02:27:07 crc kubenswrapper[4743]: I1011 02:27:07.205645 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81708c78-25fa-4f77-aaac-78cde85d2114-kube-api-access-ktjc8" (OuterVolumeSpecName: "kube-api-access-ktjc8") pod "81708c78-25fa-4f77-aaac-78cde85d2114" (UID: "81708c78-25fa-4f77-aaac-78cde85d2114"). InnerVolumeSpecName "kube-api-access-ktjc8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 02:27:07 crc kubenswrapper[4743]: I1011 02:27:07.302488 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81708c78-25fa-4f77-aaac-78cde85d2114-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "81708c78-25fa-4f77-aaac-78cde85d2114" (UID: "81708c78-25fa-4f77-aaac-78cde85d2114"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 02:27:07 crc kubenswrapper[4743]: I1011 02:27:07.302711 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81708c78-25fa-4f77-aaac-78cde85d2114-catalog-content\") pod \"81708c78-25fa-4f77-aaac-78cde85d2114\" (UID: \"81708c78-25fa-4f77-aaac-78cde85d2114\") " Oct 11 02:27:07 crc kubenswrapper[4743]: I1011 02:27:07.303444 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81708c78-25fa-4f77-aaac-78cde85d2114-utilities\") on node \"crc\" DevicePath \"\"" Oct 11 02:27:07 crc kubenswrapper[4743]: I1011 02:27:07.303472 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ktjc8\" (UniqueName: \"kubernetes.io/projected/81708c78-25fa-4f77-aaac-78cde85d2114-kube-api-access-ktjc8\") on node \"crc\" DevicePath \"\"" Oct 11 02:27:07 crc kubenswrapper[4743]: W1011 02:27:07.303906 4743 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/81708c78-25fa-4f77-aaac-78cde85d2114/volumes/kubernetes.io~empty-dir/catalog-content Oct 11 02:27:07 crc kubenswrapper[4743]: I1011 02:27:07.303981 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81708c78-25fa-4f77-aaac-78cde85d2114-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "81708c78-25fa-4f77-aaac-78cde85d2114" (UID: "81708c78-25fa-4f77-aaac-78cde85d2114"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 02:27:07 crc kubenswrapper[4743]: I1011 02:27:07.408253 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81708c78-25fa-4f77-aaac-78cde85d2114-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 11 02:27:07 crc kubenswrapper[4743]: I1011 02:27:07.960748 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jtx96" Oct 11 02:27:08 crc kubenswrapper[4743]: I1011 02:27:08.002720 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jtx96"] Oct 11 02:27:08 crc kubenswrapper[4743]: I1011 02:27:08.011300 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jtx96"] Oct 11 02:27:08 crc kubenswrapper[4743]: I1011 02:27:08.120286 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81708c78-25fa-4f77-aaac-78cde85d2114" path="/var/lib/kubelet/pods/81708c78-25fa-4f77-aaac-78cde85d2114/volumes" Oct 11 02:27:13 crc kubenswrapper[4743]: I1011 02:27:13.092764 4743 scope.go:117] "RemoveContainer" containerID="2e5d27e1167f6676643c09a4d46ec4454b12980bd38f5695ef580ca9ee0e294d" Oct 11 02:27:13 crc kubenswrapper[4743]: E1011 02:27:13.093694 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:27:25 crc kubenswrapper[4743]: I1011 02:27:25.092263 4743 scope.go:117] "RemoveContainer" 
containerID="2e5d27e1167f6676643c09a4d46ec4454b12980bd38f5695ef580ca9ee0e294d" Oct 11 02:27:25 crc kubenswrapper[4743]: E1011 02:27:25.094510 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:27:36 crc kubenswrapper[4743]: I1011 02:27:36.099152 4743 scope.go:117] "RemoveContainer" containerID="2e5d27e1167f6676643c09a4d46ec4454b12980bd38f5695ef580ca9ee0e294d" Oct 11 02:27:36 crc kubenswrapper[4743]: E1011 02:27:36.101148 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:27:48 crc kubenswrapper[4743]: I1011 02:27:48.092553 4743 scope.go:117] "RemoveContainer" containerID="2e5d27e1167f6676643c09a4d46ec4454b12980bd38f5695ef580ca9ee0e294d" Oct 11 02:27:48 crc kubenswrapper[4743]: E1011 02:27:48.094505 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:28:02 crc kubenswrapper[4743]: I1011 02:28:02.092596 4743 scope.go:117] 
"RemoveContainer" containerID="2e5d27e1167f6676643c09a4d46ec4454b12980bd38f5695ef580ca9ee0e294d" Oct 11 02:28:02 crc kubenswrapper[4743]: E1011 02:28:02.093518 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:28:16 crc kubenswrapper[4743]: I1011 02:28:16.109473 4743 scope.go:117] "RemoveContainer" containerID="2e5d27e1167f6676643c09a4d46ec4454b12980bd38f5695ef580ca9ee0e294d" Oct 11 02:28:16 crc kubenswrapper[4743]: E1011 02:28:16.110954 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:28:29 crc kubenswrapper[4743]: I1011 02:28:29.092660 4743 scope.go:117] "RemoveContainer" containerID="2e5d27e1167f6676643c09a4d46ec4454b12980bd38f5695ef580ca9ee0e294d" Oct 11 02:28:29 crc kubenswrapper[4743]: E1011 02:28:29.093788 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:28:43 crc kubenswrapper[4743]: I1011 02:28:43.093044 
4743 scope.go:117] "RemoveContainer" containerID="2e5d27e1167f6676643c09a4d46ec4454b12980bd38f5695ef580ca9ee0e294d" Oct 11 02:28:43 crc kubenswrapper[4743]: E1011 02:28:43.094217 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:28:58 crc kubenswrapper[4743]: I1011 02:28:58.092709 4743 scope.go:117] "RemoveContainer" containerID="2e5d27e1167f6676643c09a4d46ec4454b12980bd38f5695ef580ca9ee0e294d" Oct 11 02:28:58 crc kubenswrapper[4743]: E1011 02:28:58.093746 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:29:13 crc kubenswrapper[4743]: I1011 02:29:13.092211 4743 scope.go:117] "RemoveContainer" containerID="2e5d27e1167f6676643c09a4d46ec4454b12980bd38f5695ef580ca9ee0e294d" Oct 11 02:29:13 crc kubenswrapper[4743]: E1011 02:29:13.093016 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:29:27 crc kubenswrapper[4743]: I1011 
02:29:27.091935 4743 scope.go:117] "RemoveContainer" containerID="2e5d27e1167f6676643c09a4d46ec4454b12980bd38f5695ef580ca9ee0e294d" Oct 11 02:29:27 crc kubenswrapper[4743]: E1011 02:29:27.092684 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:29:42 crc kubenswrapper[4743]: I1011 02:29:42.092257 4743 scope.go:117] "RemoveContainer" containerID="2e5d27e1167f6676643c09a4d46ec4454b12980bd38f5695ef580ca9ee0e294d" Oct 11 02:29:42 crc kubenswrapper[4743]: E1011 02:29:42.093919 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:29:53 crc kubenswrapper[4743]: I1011 02:29:53.091994 4743 scope.go:117] "RemoveContainer" containerID="2e5d27e1167f6676643c09a4d46ec4454b12980bd38f5695ef580ca9ee0e294d" Oct 11 02:29:53 crc kubenswrapper[4743]: E1011 02:29:53.092834 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:30:00 crc 
kubenswrapper[4743]: I1011 02:30:00.169216 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29335830-zgds8"] Oct 11 02:30:00 crc kubenswrapper[4743]: E1011 02:30:00.170357 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81708c78-25fa-4f77-aaac-78cde85d2114" containerName="registry-server" Oct 11 02:30:00 crc kubenswrapper[4743]: I1011 02:30:00.170372 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="81708c78-25fa-4f77-aaac-78cde85d2114" containerName="registry-server" Oct 11 02:30:00 crc kubenswrapper[4743]: E1011 02:30:00.170394 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81708c78-25fa-4f77-aaac-78cde85d2114" containerName="extract-utilities" Oct 11 02:30:00 crc kubenswrapper[4743]: I1011 02:30:00.170400 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="81708c78-25fa-4f77-aaac-78cde85d2114" containerName="extract-utilities" Oct 11 02:30:00 crc kubenswrapper[4743]: E1011 02:30:00.170426 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81708c78-25fa-4f77-aaac-78cde85d2114" containerName="extract-content" Oct 11 02:30:00 crc kubenswrapper[4743]: I1011 02:30:00.170432 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="81708c78-25fa-4f77-aaac-78cde85d2114" containerName="extract-content" Oct 11 02:30:00 crc kubenswrapper[4743]: I1011 02:30:00.170666 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="81708c78-25fa-4f77-aaac-78cde85d2114" containerName="registry-server" Oct 11 02:30:00 crc kubenswrapper[4743]: I1011 02:30:00.171489 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29335830-zgds8" Oct 11 02:30:00 crc kubenswrapper[4743]: I1011 02:30:00.173208 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 11 02:30:00 crc kubenswrapper[4743]: I1011 02:30:00.173271 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 11 02:30:00 crc kubenswrapper[4743]: I1011 02:30:00.183747 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29335830-zgds8"] Oct 11 02:30:00 crc kubenswrapper[4743]: I1011 02:30:00.228785 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c8179836-97fd-419f-b226-c3cd3ebf1530-config-volume\") pod \"collect-profiles-29335830-zgds8\" (UID: \"c8179836-97fd-419f-b226-c3cd3ebf1530\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335830-zgds8" Oct 11 02:30:00 crc kubenswrapper[4743]: I1011 02:30:00.228933 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7l65\" (UniqueName: \"kubernetes.io/projected/c8179836-97fd-419f-b226-c3cd3ebf1530-kube-api-access-b7l65\") pod \"collect-profiles-29335830-zgds8\" (UID: \"c8179836-97fd-419f-b226-c3cd3ebf1530\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335830-zgds8" Oct 11 02:30:00 crc kubenswrapper[4743]: I1011 02:30:00.228967 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c8179836-97fd-419f-b226-c3cd3ebf1530-secret-volume\") pod \"collect-profiles-29335830-zgds8\" (UID: \"c8179836-97fd-419f-b226-c3cd3ebf1530\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29335830-zgds8" Oct 11 02:30:00 crc kubenswrapper[4743]: I1011 02:30:00.330670 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c8179836-97fd-419f-b226-c3cd3ebf1530-config-volume\") pod \"collect-profiles-29335830-zgds8\" (UID: \"c8179836-97fd-419f-b226-c3cd3ebf1530\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335830-zgds8" Oct 11 02:30:00 crc kubenswrapper[4743]: I1011 02:30:00.330787 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c8179836-97fd-419f-b226-c3cd3ebf1530-secret-volume\") pod \"collect-profiles-29335830-zgds8\" (UID: \"c8179836-97fd-419f-b226-c3cd3ebf1530\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335830-zgds8" Oct 11 02:30:00 crc kubenswrapper[4743]: I1011 02:30:00.330810 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7l65\" (UniqueName: \"kubernetes.io/projected/c8179836-97fd-419f-b226-c3cd3ebf1530-kube-api-access-b7l65\") pod \"collect-profiles-29335830-zgds8\" (UID: \"c8179836-97fd-419f-b226-c3cd3ebf1530\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335830-zgds8" Oct 11 02:30:00 crc kubenswrapper[4743]: I1011 02:30:00.331900 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c8179836-97fd-419f-b226-c3cd3ebf1530-config-volume\") pod \"collect-profiles-29335830-zgds8\" (UID: \"c8179836-97fd-419f-b226-c3cd3ebf1530\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335830-zgds8" Oct 11 02:30:00 crc kubenswrapper[4743]: I1011 02:30:00.337316 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/c8179836-97fd-419f-b226-c3cd3ebf1530-secret-volume\") pod \"collect-profiles-29335830-zgds8\" (UID: \"c8179836-97fd-419f-b226-c3cd3ebf1530\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335830-zgds8" Oct 11 02:30:00 crc kubenswrapper[4743]: I1011 02:30:00.349149 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7l65\" (UniqueName: \"kubernetes.io/projected/c8179836-97fd-419f-b226-c3cd3ebf1530-kube-api-access-b7l65\") pod \"collect-profiles-29335830-zgds8\" (UID: \"c8179836-97fd-419f-b226-c3cd3ebf1530\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335830-zgds8" Oct 11 02:30:00 crc kubenswrapper[4743]: I1011 02:30:00.502112 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29335830-zgds8" Oct 11 02:30:01 crc kubenswrapper[4743]: I1011 02:30:01.016008 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29335830-zgds8"] Oct 11 02:30:01 crc kubenswrapper[4743]: I1011 02:30:01.033016 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29335830-zgds8" event={"ID":"c8179836-97fd-419f-b226-c3cd3ebf1530","Type":"ContainerStarted","Data":"a463c07ebdfe67e5b7c8e50ba017a4362377a47b2210f7a8475975dc80af0236"} Oct 11 02:30:02 crc kubenswrapper[4743]: I1011 02:30:02.046666 4743 generic.go:334] "Generic (PLEG): container finished" podID="c8179836-97fd-419f-b226-c3cd3ebf1530" containerID="aa348e18c5bd9529e8a6d706d9bf377a98c25d878f637fa5fbd9b5ccfc2b0625" exitCode=0 Oct 11 02:30:02 crc kubenswrapper[4743]: I1011 02:30:02.046755 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29335830-zgds8" 
event={"ID":"c8179836-97fd-419f-b226-c3cd3ebf1530","Type":"ContainerDied","Data":"aa348e18c5bd9529e8a6d706d9bf377a98c25d878f637fa5fbd9b5ccfc2b0625"} Oct 11 02:30:03 crc kubenswrapper[4743]: I1011 02:30:03.508431 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29335830-zgds8" Oct 11 02:30:03 crc kubenswrapper[4743]: I1011 02:30:03.604703 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c8179836-97fd-419f-b226-c3cd3ebf1530-secret-volume\") pod \"c8179836-97fd-419f-b226-c3cd3ebf1530\" (UID: \"c8179836-97fd-419f-b226-c3cd3ebf1530\") " Oct 11 02:30:03 crc kubenswrapper[4743]: I1011 02:30:03.605016 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b7l65\" (UniqueName: \"kubernetes.io/projected/c8179836-97fd-419f-b226-c3cd3ebf1530-kube-api-access-b7l65\") pod \"c8179836-97fd-419f-b226-c3cd3ebf1530\" (UID: \"c8179836-97fd-419f-b226-c3cd3ebf1530\") " Oct 11 02:30:03 crc kubenswrapper[4743]: I1011 02:30:03.605071 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c8179836-97fd-419f-b226-c3cd3ebf1530-config-volume\") pod \"c8179836-97fd-419f-b226-c3cd3ebf1530\" (UID: \"c8179836-97fd-419f-b226-c3cd3ebf1530\") " Oct 11 02:30:03 crc kubenswrapper[4743]: I1011 02:30:03.605684 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8179836-97fd-419f-b226-c3cd3ebf1530-config-volume" (OuterVolumeSpecName: "config-volume") pod "c8179836-97fd-419f-b226-c3cd3ebf1530" (UID: "c8179836-97fd-419f-b226-c3cd3ebf1530"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 02:30:03 crc kubenswrapper[4743]: I1011 02:30:03.606450 4743 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c8179836-97fd-419f-b226-c3cd3ebf1530-config-volume\") on node \"crc\" DevicePath \"\"" Oct 11 02:30:03 crc kubenswrapper[4743]: I1011 02:30:03.611146 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8179836-97fd-419f-b226-c3cd3ebf1530-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c8179836-97fd-419f-b226-c3cd3ebf1530" (UID: "c8179836-97fd-419f-b226-c3cd3ebf1530"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 02:30:03 crc kubenswrapper[4743]: I1011 02:30:03.613115 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8179836-97fd-419f-b226-c3cd3ebf1530-kube-api-access-b7l65" (OuterVolumeSpecName: "kube-api-access-b7l65") pod "c8179836-97fd-419f-b226-c3cd3ebf1530" (UID: "c8179836-97fd-419f-b226-c3cd3ebf1530"). InnerVolumeSpecName "kube-api-access-b7l65". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 02:30:03 crc kubenswrapper[4743]: I1011 02:30:03.708533 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b7l65\" (UniqueName: \"kubernetes.io/projected/c8179836-97fd-419f-b226-c3cd3ebf1530-kube-api-access-b7l65\") on node \"crc\" DevicePath \"\"" Oct 11 02:30:03 crc kubenswrapper[4743]: I1011 02:30:03.708573 4743 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c8179836-97fd-419f-b226-c3cd3ebf1530-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 11 02:30:04 crc kubenswrapper[4743]: I1011 02:30:04.066885 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29335830-zgds8" event={"ID":"c8179836-97fd-419f-b226-c3cd3ebf1530","Type":"ContainerDied","Data":"a463c07ebdfe67e5b7c8e50ba017a4362377a47b2210f7a8475975dc80af0236"} Oct 11 02:30:04 crc kubenswrapper[4743]: I1011 02:30:04.067251 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a463c07ebdfe67e5b7c8e50ba017a4362377a47b2210f7a8475975dc80af0236" Oct 11 02:30:04 crc kubenswrapper[4743]: I1011 02:30:04.066934 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29335830-zgds8" Oct 11 02:30:04 crc kubenswrapper[4743]: I1011 02:30:04.602365 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29335785-z7bgx"] Oct 11 02:30:04 crc kubenswrapper[4743]: I1011 02:30:04.614142 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29335785-z7bgx"] Oct 11 02:30:06 crc kubenswrapper[4743]: I1011 02:30:06.122470 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fde7d0a-32a2-4eae-b215-ae5f6d819980" path="/var/lib/kubelet/pods/1fde7d0a-32a2-4eae-b215-ae5f6d819980/volumes" Oct 11 02:30:08 crc kubenswrapper[4743]: I1011 02:30:08.092368 4743 scope.go:117] "RemoveContainer" containerID="2e5d27e1167f6676643c09a4d46ec4454b12980bd38f5695ef580ca9ee0e294d" Oct 11 02:30:08 crc kubenswrapper[4743]: E1011 02:30:08.092999 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:30:19 crc kubenswrapper[4743]: I1011 02:30:19.092412 4743 scope.go:117] "RemoveContainer" containerID="2e5d27e1167f6676643c09a4d46ec4454b12980bd38f5695ef580ca9ee0e294d" Oct 11 02:30:19 crc kubenswrapper[4743]: E1011 02:30:19.093310 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:30:33 crc kubenswrapper[4743]: I1011 02:30:33.092318 4743 scope.go:117] "RemoveContainer" containerID="2e5d27e1167f6676643c09a4d46ec4454b12980bd38f5695ef580ca9ee0e294d" Oct 11 02:30:33 crc kubenswrapper[4743]: E1011 02:30:33.092915 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:30:38 crc kubenswrapper[4743]: I1011 02:30:38.010989 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zbmjv"] Oct 11 02:30:38 crc kubenswrapper[4743]: E1011 02:30:38.012072 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8179836-97fd-419f-b226-c3cd3ebf1530" containerName="collect-profiles" Oct 11 02:30:38 crc kubenswrapper[4743]: I1011 02:30:38.012093 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8179836-97fd-419f-b226-c3cd3ebf1530" containerName="collect-profiles" Oct 11 02:30:38 crc kubenswrapper[4743]: I1011 02:30:38.012384 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8179836-97fd-419f-b226-c3cd3ebf1530" containerName="collect-profiles" Oct 11 02:30:38 crc kubenswrapper[4743]: I1011 02:30:38.014111 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zbmjv" Oct 11 02:30:38 crc kubenswrapper[4743]: I1011 02:30:38.032682 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zbmjv"] Oct 11 02:30:38 crc kubenswrapper[4743]: I1011 02:30:38.188686 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c77deb4-d488-4e1d-a9a5-6d39ac9d27ad-catalog-content\") pod \"certified-operators-zbmjv\" (UID: \"3c77deb4-d488-4e1d-a9a5-6d39ac9d27ad\") " pod="openshift-marketplace/certified-operators-zbmjv" Oct 11 02:30:38 crc kubenswrapper[4743]: I1011 02:30:38.189067 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c77deb4-d488-4e1d-a9a5-6d39ac9d27ad-utilities\") pod \"certified-operators-zbmjv\" (UID: \"3c77deb4-d488-4e1d-a9a5-6d39ac9d27ad\") " pod="openshift-marketplace/certified-operators-zbmjv" Oct 11 02:30:38 crc kubenswrapper[4743]: I1011 02:30:38.189270 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxnv8\" (UniqueName: \"kubernetes.io/projected/3c77deb4-d488-4e1d-a9a5-6d39ac9d27ad-kube-api-access-zxnv8\") pod \"certified-operators-zbmjv\" (UID: \"3c77deb4-d488-4e1d-a9a5-6d39ac9d27ad\") " pod="openshift-marketplace/certified-operators-zbmjv" Oct 11 02:30:38 crc kubenswrapper[4743]: I1011 02:30:38.291320 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxnv8\" (UniqueName: \"kubernetes.io/projected/3c77deb4-d488-4e1d-a9a5-6d39ac9d27ad-kube-api-access-zxnv8\") pod \"certified-operators-zbmjv\" (UID: \"3c77deb4-d488-4e1d-a9a5-6d39ac9d27ad\") " pod="openshift-marketplace/certified-operators-zbmjv" Oct 11 02:30:38 crc kubenswrapper[4743]: I1011 02:30:38.291462 4743 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c77deb4-d488-4e1d-a9a5-6d39ac9d27ad-catalog-content\") pod \"certified-operators-zbmjv\" (UID: \"3c77deb4-d488-4e1d-a9a5-6d39ac9d27ad\") " pod="openshift-marketplace/certified-operators-zbmjv" Oct 11 02:30:38 crc kubenswrapper[4743]: I1011 02:30:38.291583 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c77deb4-d488-4e1d-a9a5-6d39ac9d27ad-utilities\") pod \"certified-operators-zbmjv\" (UID: \"3c77deb4-d488-4e1d-a9a5-6d39ac9d27ad\") " pod="openshift-marketplace/certified-operators-zbmjv" Oct 11 02:30:38 crc kubenswrapper[4743]: I1011 02:30:38.292193 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c77deb4-d488-4e1d-a9a5-6d39ac9d27ad-utilities\") pod \"certified-operators-zbmjv\" (UID: \"3c77deb4-d488-4e1d-a9a5-6d39ac9d27ad\") " pod="openshift-marketplace/certified-operators-zbmjv" Oct 11 02:30:38 crc kubenswrapper[4743]: I1011 02:30:38.292267 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c77deb4-d488-4e1d-a9a5-6d39ac9d27ad-catalog-content\") pod \"certified-operators-zbmjv\" (UID: \"3c77deb4-d488-4e1d-a9a5-6d39ac9d27ad\") " pod="openshift-marketplace/certified-operators-zbmjv" Oct 11 02:30:38 crc kubenswrapper[4743]: I1011 02:30:38.315080 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxnv8\" (UniqueName: \"kubernetes.io/projected/3c77deb4-d488-4e1d-a9a5-6d39ac9d27ad-kube-api-access-zxnv8\") pod \"certified-operators-zbmjv\" (UID: \"3c77deb4-d488-4e1d-a9a5-6d39ac9d27ad\") " pod="openshift-marketplace/certified-operators-zbmjv" Oct 11 02:30:38 crc kubenswrapper[4743]: I1011 02:30:38.338795 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zbmjv" Oct 11 02:30:38 crc kubenswrapper[4743]: I1011 02:30:38.903416 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zbmjv"] Oct 11 02:30:39 crc kubenswrapper[4743]: I1011 02:30:39.485269 4743 generic.go:334] "Generic (PLEG): container finished" podID="3c77deb4-d488-4e1d-a9a5-6d39ac9d27ad" containerID="c5a1b2a8558ee621fc906308c609f22a0d897bb2774996514ec1dab4dcdf8c4c" exitCode=0 Oct 11 02:30:39 crc kubenswrapper[4743]: I1011 02:30:39.485423 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zbmjv" event={"ID":"3c77deb4-d488-4e1d-a9a5-6d39ac9d27ad","Type":"ContainerDied","Data":"c5a1b2a8558ee621fc906308c609f22a0d897bb2774996514ec1dab4dcdf8c4c"} Oct 11 02:30:39 crc kubenswrapper[4743]: I1011 02:30:39.485619 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zbmjv" event={"ID":"3c77deb4-d488-4e1d-a9a5-6d39ac9d27ad","Type":"ContainerStarted","Data":"2e6468b012401c891c759d735bf9d0a637766a25dd154e1dd378b64d114b1311"} Oct 11 02:30:40 crc kubenswrapper[4743]: I1011 02:30:40.504742 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zbmjv" event={"ID":"3c77deb4-d488-4e1d-a9a5-6d39ac9d27ad","Type":"ContainerStarted","Data":"99c4c8fbb79ce08df7772de4be754ada54b8834dac00cb0076e9d1b5a798c357"} Oct 11 02:30:42 crc kubenswrapper[4743]: I1011 02:30:42.532118 4743 generic.go:334] "Generic (PLEG): container finished" podID="3c77deb4-d488-4e1d-a9a5-6d39ac9d27ad" containerID="99c4c8fbb79ce08df7772de4be754ada54b8834dac00cb0076e9d1b5a798c357" exitCode=0 Oct 11 02:30:42 crc kubenswrapper[4743]: I1011 02:30:42.532206 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zbmjv" 
event={"ID":"3c77deb4-d488-4e1d-a9a5-6d39ac9d27ad","Type":"ContainerDied","Data":"99c4c8fbb79ce08df7772de4be754ada54b8834dac00cb0076e9d1b5a798c357"} Oct 11 02:30:43 crc kubenswrapper[4743]: I1011 02:30:43.544688 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zbmjv" event={"ID":"3c77deb4-d488-4e1d-a9a5-6d39ac9d27ad","Type":"ContainerStarted","Data":"b9e11971dabbdbf41b69459570511a0d374955e40879019b7d38a7fcc537e115"} Oct 11 02:30:43 crc kubenswrapper[4743]: I1011 02:30:43.569574 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zbmjv" podStartSLOduration=3.071986982 podStartE2EDuration="6.569551704s" podCreationTimestamp="2025-10-11 02:30:37 +0000 UTC" firstStartedPulling="2025-10-11 02:30:39.487597332 +0000 UTC m=+5934.140577729" lastFinishedPulling="2025-10-11 02:30:42.985162054 +0000 UTC m=+5937.638142451" observedRunningTime="2025-10-11 02:30:43.564395896 +0000 UTC m=+5938.217376303" watchObservedRunningTime="2025-10-11 02:30:43.569551704 +0000 UTC m=+5938.222532101" Oct 11 02:30:48 crc kubenswrapper[4743]: I1011 02:30:48.093450 4743 scope.go:117] "RemoveContainer" containerID="2e5d27e1167f6676643c09a4d46ec4454b12980bd38f5695ef580ca9ee0e294d" Oct 11 02:30:48 crc kubenswrapper[4743]: E1011 02:30:48.094606 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:30:48 crc kubenswrapper[4743]: I1011 02:30:48.340502 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zbmjv" Oct 11 02:30:48 crc 
kubenswrapper[4743]: I1011 02:30:48.340548 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zbmjv" Oct 11 02:30:48 crc kubenswrapper[4743]: I1011 02:30:48.407270 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zbmjv" Oct 11 02:30:48 crc kubenswrapper[4743]: I1011 02:30:48.652722 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zbmjv" Oct 11 02:30:48 crc kubenswrapper[4743]: I1011 02:30:48.721714 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zbmjv"] Oct 11 02:30:50 crc kubenswrapper[4743]: I1011 02:30:50.631454 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zbmjv" podUID="3c77deb4-d488-4e1d-a9a5-6d39ac9d27ad" containerName="registry-server" containerID="cri-o://b9e11971dabbdbf41b69459570511a0d374955e40879019b7d38a7fcc537e115" gracePeriod=2 Oct 11 02:30:51 crc kubenswrapper[4743]: I1011 02:30:51.221220 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zbmjv" Oct 11 02:30:51 crc kubenswrapper[4743]: I1011 02:30:51.410821 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c77deb4-d488-4e1d-a9a5-6d39ac9d27ad-utilities\") pod \"3c77deb4-d488-4e1d-a9a5-6d39ac9d27ad\" (UID: \"3c77deb4-d488-4e1d-a9a5-6d39ac9d27ad\") " Oct 11 02:30:51 crc kubenswrapper[4743]: I1011 02:30:51.410893 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c77deb4-d488-4e1d-a9a5-6d39ac9d27ad-catalog-content\") pod \"3c77deb4-d488-4e1d-a9a5-6d39ac9d27ad\" (UID: \"3c77deb4-d488-4e1d-a9a5-6d39ac9d27ad\") " Oct 11 02:30:51 crc kubenswrapper[4743]: I1011 02:30:51.410920 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zxnv8\" (UniqueName: \"kubernetes.io/projected/3c77deb4-d488-4e1d-a9a5-6d39ac9d27ad-kube-api-access-zxnv8\") pod \"3c77deb4-d488-4e1d-a9a5-6d39ac9d27ad\" (UID: \"3c77deb4-d488-4e1d-a9a5-6d39ac9d27ad\") " Oct 11 02:30:51 crc kubenswrapper[4743]: I1011 02:30:51.412592 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c77deb4-d488-4e1d-a9a5-6d39ac9d27ad-utilities" (OuterVolumeSpecName: "utilities") pod "3c77deb4-d488-4e1d-a9a5-6d39ac9d27ad" (UID: "3c77deb4-d488-4e1d-a9a5-6d39ac9d27ad"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 02:30:51 crc kubenswrapper[4743]: I1011 02:30:51.419593 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c77deb4-d488-4e1d-a9a5-6d39ac9d27ad-kube-api-access-zxnv8" (OuterVolumeSpecName: "kube-api-access-zxnv8") pod "3c77deb4-d488-4e1d-a9a5-6d39ac9d27ad" (UID: "3c77deb4-d488-4e1d-a9a5-6d39ac9d27ad"). InnerVolumeSpecName "kube-api-access-zxnv8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 02:30:51 crc kubenswrapper[4743]: I1011 02:30:51.481190 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c77deb4-d488-4e1d-a9a5-6d39ac9d27ad-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3c77deb4-d488-4e1d-a9a5-6d39ac9d27ad" (UID: "3c77deb4-d488-4e1d-a9a5-6d39ac9d27ad"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 02:30:51 crc kubenswrapper[4743]: I1011 02:30:51.513554 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c77deb4-d488-4e1d-a9a5-6d39ac9d27ad-utilities\") on node \"crc\" DevicePath \"\"" Oct 11 02:30:51 crc kubenswrapper[4743]: I1011 02:30:51.513607 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c77deb4-d488-4e1d-a9a5-6d39ac9d27ad-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 11 02:30:51 crc kubenswrapper[4743]: I1011 02:30:51.513637 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zxnv8\" (UniqueName: \"kubernetes.io/projected/3c77deb4-d488-4e1d-a9a5-6d39ac9d27ad-kube-api-access-zxnv8\") on node \"crc\" DevicePath \"\"" Oct 11 02:30:51 crc kubenswrapper[4743]: I1011 02:30:51.648042 4743 generic.go:334] "Generic (PLEG): container finished" podID="3c77deb4-d488-4e1d-a9a5-6d39ac9d27ad" containerID="b9e11971dabbdbf41b69459570511a0d374955e40879019b7d38a7fcc537e115" exitCode=0 Oct 11 02:30:51 crc kubenswrapper[4743]: I1011 02:30:51.648106 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zbmjv" event={"ID":"3c77deb4-d488-4e1d-a9a5-6d39ac9d27ad","Type":"ContainerDied","Data":"b9e11971dabbdbf41b69459570511a0d374955e40879019b7d38a7fcc537e115"} Oct 11 02:30:51 crc kubenswrapper[4743]: I1011 02:30:51.648181 4743 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zbmjv" Oct 11 02:30:51 crc kubenswrapper[4743]: I1011 02:30:51.648834 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zbmjv" event={"ID":"3c77deb4-d488-4e1d-a9a5-6d39ac9d27ad","Type":"ContainerDied","Data":"2e6468b012401c891c759d735bf9d0a637766a25dd154e1dd378b64d114b1311"} Oct 11 02:30:51 crc kubenswrapper[4743]: I1011 02:30:51.648938 4743 scope.go:117] "RemoveContainer" containerID="b9e11971dabbdbf41b69459570511a0d374955e40879019b7d38a7fcc537e115" Oct 11 02:30:51 crc kubenswrapper[4743]: I1011 02:30:51.676774 4743 scope.go:117] "RemoveContainer" containerID="99c4c8fbb79ce08df7772de4be754ada54b8834dac00cb0076e9d1b5a798c357" Oct 11 02:30:51 crc kubenswrapper[4743]: I1011 02:30:51.698265 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zbmjv"] Oct 11 02:30:51 crc kubenswrapper[4743]: I1011 02:30:51.710752 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zbmjv"] Oct 11 02:30:51 crc kubenswrapper[4743]: I1011 02:30:51.716396 4743 scope.go:117] "RemoveContainer" containerID="c5a1b2a8558ee621fc906308c609f22a0d897bb2774996514ec1dab4dcdf8c4c" Oct 11 02:30:51 crc kubenswrapper[4743]: I1011 02:30:51.780908 4743 scope.go:117] "RemoveContainer" containerID="b9e11971dabbdbf41b69459570511a0d374955e40879019b7d38a7fcc537e115" Oct 11 02:30:51 crc kubenswrapper[4743]: E1011 02:30:51.781446 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9e11971dabbdbf41b69459570511a0d374955e40879019b7d38a7fcc537e115\": container with ID starting with b9e11971dabbdbf41b69459570511a0d374955e40879019b7d38a7fcc537e115 not found: ID does not exist" containerID="b9e11971dabbdbf41b69459570511a0d374955e40879019b7d38a7fcc537e115" Oct 11 02:30:51 crc kubenswrapper[4743]: I1011 02:30:51.781494 
4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9e11971dabbdbf41b69459570511a0d374955e40879019b7d38a7fcc537e115"} err="failed to get container status \"b9e11971dabbdbf41b69459570511a0d374955e40879019b7d38a7fcc537e115\": rpc error: code = NotFound desc = could not find container \"b9e11971dabbdbf41b69459570511a0d374955e40879019b7d38a7fcc537e115\": container with ID starting with b9e11971dabbdbf41b69459570511a0d374955e40879019b7d38a7fcc537e115 not found: ID does not exist" Oct 11 02:30:51 crc kubenswrapper[4743]: I1011 02:30:51.781522 4743 scope.go:117] "RemoveContainer" containerID="99c4c8fbb79ce08df7772de4be754ada54b8834dac00cb0076e9d1b5a798c357" Oct 11 02:30:51 crc kubenswrapper[4743]: E1011 02:30:51.781965 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99c4c8fbb79ce08df7772de4be754ada54b8834dac00cb0076e9d1b5a798c357\": container with ID starting with 99c4c8fbb79ce08df7772de4be754ada54b8834dac00cb0076e9d1b5a798c357 not found: ID does not exist" containerID="99c4c8fbb79ce08df7772de4be754ada54b8834dac00cb0076e9d1b5a798c357" Oct 11 02:30:51 crc kubenswrapper[4743]: I1011 02:30:51.782005 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99c4c8fbb79ce08df7772de4be754ada54b8834dac00cb0076e9d1b5a798c357"} err="failed to get container status \"99c4c8fbb79ce08df7772de4be754ada54b8834dac00cb0076e9d1b5a798c357\": rpc error: code = NotFound desc = could not find container \"99c4c8fbb79ce08df7772de4be754ada54b8834dac00cb0076e9d1b5a798c357\": container with ID starting with 99c4c8fbb79ce08df7772de4be754ada54b8834dac00cb0076e9d1b5a798c357 not found: ID does not exist" Oct 11 02:30:51 crc kubenswrapper[4743]: I1011 02:30:51.782027 4743 scope.go:117] "RemoveContainer" containerID="c5a1b2a8558ee621fc906308c609f22a0d897bb2774996514ec1dab4dcdf8c4c" Oct 11 02:30:51 crc kubenswrapper[4743]: E1011 
02:30:51.782331 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5a1b2a8558ee621fc906308c609f22a0d897bb2774996514ec1dab4dcdf8c4c\": container with ID starting with c5a1b2a8558ee621fc906308c609f22a0d897bb2774996514ec1dab4dcdf8c4c not found: ID does not exist" containerID="c5a1b2a8558ee621fc906308c609f22a0d897bb2774996514ec1dab4dcdf8c4c" Oct 11 02:30:51 crc kubenswrapper[4743]: I1011 02:30:51.782373 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5a1b2a8558ee621fc906308c609f22a0d897bb2774996514ec1dab4dcdf8c4c"} err="failed to get container status \"c5a1b2a8558ee621fc906308c609f22a0d897bb2774996514ec1dab4dcdf8c4c\": rpc error: code = NotFound desc = could not find container \"c5a1b2a8558ee621fc906308c609f22a0d897bb2774996514ec1dab4dcdf8c4c\": container with ID starting with c5a1b2a8558ee621fc906308c609f22a0d897bb2774996514ec1dab4dcdf8c4c not found: ID does not exist" Oct 11 02:30:52 crc kubenswrapper[4743]: I1011 02:30:52.104040 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c77deb4-d488-4e1d-a9a5-6d39ac9d27ad" path="/var/lib/kubelet/pods/3c77deb4-d488-4e1d-a9a5-6d39ac9d27ad/volumes" Oct 11 02:30:57 crc kubenswrapper[4743]: I1011 02:30:57.979588 4743 scope.go:117] "RemoveContainer" containerID="fed630d91b09ea66cf637b8d41962e061b07d0b0cdb8dcdeadc322e82ad10a97" Oct 11 02:31:03 crc kubenswrapper[4743]: I1011 02:31:03.091662 4743 scope.go:117] "RemoveContainer" containerID="2e5d27e1167f6676643c09a4d46ec4454b12980bd38f5695ef580ca9ee0e294d" Oct 11 02:31:03 crc kubenswrapper[4743]: E1011 02:31:03.093581 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:31:11 crc kubenswrapper[4743]: I1011 02:31:11.805937 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qdthz"] Oct 11 02:31:11 crc kubenswrapper[4743]: E1011 02:31:11.806939 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c77deb4-d488-4e1d-a9a5-6d39ac9d27ad" containerName="extract-content" Oct 11 02:31:11 crc kubenswrapper[4743]: I1011 02:31:11.806958 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c77deb4-d488-4e1d-a9a5-6d39ac9d27ad" containerName="extract-content" Oct 11 02:31:11 crc kubenswrapper[4743]: E1011 02:31:11.806977 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c77deb4-d488-4e1d-a9a5-6d39ac9d27ad" containerName="extract-utilities" Oct 11 02:31:11 crc kubenswrapper[4743]: I1011 02:31:11.806983 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c77deb4-d488-4e1d-a9a5-6d39ac9d27ad" containerName="extract-utilities" Oct 11 02:31:11 crc kubenswrapper[4743]: E1011 02:31:11.807007 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c77deb4-d488-4e1d-a9a5-6d39ac9d27ad" containerName="registry-server" Oct 11 02:31:11 crc kubenswrapper[4743]: I1011 02:31:11.807014 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c77deb4-d488-4e1d-a9a5-6d39ac9d27ad" containerName="registry-server" Oct 11 02:31:11 crc kubenswrapper[4743]: I1011 02:31:11.807256 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c77deb4-d488-4e1d-a9a5-6d39ac9d27ad" containerName="registry-server" Oct 11 02:31:11 crc kubenswrapper[4743]: I1011 02:31:11.809370 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qdthz" Oct 11 02:31:11 crc kubenswrapper[4743]: I1011 02:31:11.829726 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qdthz"] Oct 11 02:31:11 crc kubenswrapper[4743]: I1011 02:31:11.906610 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/071aa1a8-13b1-4c41-bbbe-939ea1ec648c-catalog-content\") pod \"community-operators-qdthz\" (UID: \"071aa1a8-13b1-4c41-bbbe-939ea1ec648c\") " pod="openshift-marketplace/community-operators-qdthz" Oct 11 02:31:11 crc kubenswrapper[4743]: I1011 02:31:11.906888 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/071aa1a8-13b1-4c41-bbbe-939ea1ec648c-utilities\") pod \"community-operators-qdthz\" (UID: \"071aa1a8-13b1-4c41-bbbe-939ea1ec648c\") " pod="openshift-marketplace/community-operators-qdthz" Oct 11 02:31:11 crc kubenswrapper[4743]: I1011 02:31:11.907033 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44z9f\" (UniqueName: \"kubernetes.io/projected/071aa1a8-13b1-4c41-bbbe-939ea1ec648c-kube-api-access-44z9f\") pod \"community-operators-qdthz\" (UID: \"071aa1a8-13b1-4c41-bbbe-939ea1ec648c\") " pod="openshift-marketplace/community-operators-qdthz" Oct 11 02:31:12 crc kubenswrapper[4743]: I1011 02:31:12.009935 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/071aa1a8-13b1-4c41-bbbe-939ea1ec648c-catalog-content\") pod \"community-operators-qdthz\" (UID: \"071aa1a8-13b1-4c41-bbbe-939ea1ec648c\") " pod="openshift-marketplace/community-operators-qdthz" Oct 11 02:31:12 crc kubenswrapper[4743]: I1011 02:31:12.010131 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/071aa1a8-13b1-4c41-bbbe-939ea1ec648c-utilities\") pod \"community-operators-qdthz\" (UID: \"071aa1a8-13b1-4c41-bbbe-939ea1ec648c\") " pod="openshift-marketplace/community-operators-qdthz" Oct 11 02:31:12 crc kubenswrapper[4743]: I1011 02:31:12.010211 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44z9f\" (UniqueName: \"kubernetes.io/projected/071aa1a8-13b1-4c41-bbbe-939ea1ec648c-kube-api-access-44z9f\") pod \"community-operators-qdthz\" (UID: \"071aa1a8-13b1-4c41-bbbe-939ea1ec648c\") " pod="openshift-marketplace/community-operators-qdthz" Oct 11 02:31:12 crc kubenswrapper[4743]: I1011 02:31:12.010827 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/071aa1a8-13b1-4c41-bbbe-939ea1ec648c-catalog-content\") pod \"community-operators-qdthz\" (UID: \"071aa1a8-13b1-4c41-bbbe-939ea1ec648c\") " pod="openshift-marketplace/community-operators-qdthz" Oct 11 02:31:12 crc kubenswrapper[4743]: I1011 02:31:12.010891 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/071aa1a8-13b1-4c41-bbbe-939ea1ec648c-utilities\") pod \"community-operators-qdthz\" (UID: \"071aa1a8-13b1-4c41-bbbe-939ea1ec648c\") " pod="openshift-marketplace/community-operators-qdthz" Oct 11 02:31:12 crc kubenswrapper[4743]: I1011 02:31:12.050207 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44z9f\" (UniqueName: \"kubernetes.io/projected/071aa1a8-13b1-4c41-bbbe-939ea1ec648c-kube-api-access-44z9f\") pod \"community-operators-qdthz\" (UID: \"071aa1a8-13b1-4c41-bbbe-939ea1ec648c\") " pod="openshift-marketplace/community-operators-qdthz" Oct 11 02:31:12 crc kubenswrapper[4743]: I1011 02:31:12.170594 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qdthz" Oct 11 02:31:12 crc kubenswrapper[4743]: I1011 02:31:12.402319 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-g4zpz"] Oct 11 02:31:12 crc kubenswrapper[4743]: I1011 02:31:12.404712 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g4zpz" Oct 11 02:31:12 crc kubenswrapper[4743]: I1011 02:31:12.434766 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-g4zpz"] Oct 11 02:31:12 crc kubenswrapper[4743]: I1011 02:31:12.520761 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa054b17-9f03-495d-920a-76175af779b9-catalog-content\") pod \"redhat-marketplace-g4zpz\" (UID: \"aa054b17-9f03-495d-920a-76175af779b9\") " pod="openshift-marketplace/redhat-marketplace-g4zpz" Oct 11 02:31:12 crc kubenswrapper[4743]: I1011 02:31:12.520844 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqpzt\" (UniqueName: \"kubernetes.io/projected/aa054b17-9f03-495d-920a-76175af779b9-kube-api-access-xqpzt\") pod \"redhat-marketplace-g4zpz\" (UID: \"aa054b17-9f03-495d-920a-76175af779b9\") " pod="openshift-marketplace/redhat-marketplace-g4zpz" Oct 11 02:31:12 crc kubenswrapper[4743]: I1011 02:31:12.521515 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa054b17-9f03-495d-920a-76175af779b9-utilities\") pod \"redhat-marketplace-g4zpz\" (UID: \"aa054b17-9f03-495d-920a-76175af779b9\") " pod="openshift-marketplace/redhat-marketplace-g4zpz" Oct 11 02:31:12 crc kubenswrapper[4743]: I1011 02:31:12.623764 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa054b17-9f03-495d-920a-76175af779b9-utilities\") pod \"redhat-marketplace-g4zpz\" (UID: \"aa054b17-9f03-495d-920a-76175af779b9\") " pod="openshift-marketplace/redhat-marketplace-g4zpz" Oct 11 02:31:12 crc kubenswrapper[4743]: I1011 02:31:12.624072 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa054b17-9f03-495d-920a-76175af779b9-catalog-content\") pod \"redhat-marketplace-g4zpz\" (UID: \"aa054b17-9f03-495d-920a-76175af779b9\") " pod="openshift-marketplace/redhat-marketplace-g4zpz" Oct 11 02:31:12 crc kubenswrapper[4743]: I1011 02:31:12.624096 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqpzt\" (UniqueName: \"kubernetes.io/projected/aa054b17-9f03-495d-920a-76175af779b9-kube-api-access-xqpzt\") pod \"redhat-marketplace-g4zpz\" (UID: \"aa054b17-9f03-495d-920a-76175af779b9\") " pod="openshift-marketplace/redhat-marketplace-g4zpz" Oct 11 02:31:12 crc kubenswrapper[4743]: I1011 02:31:12.624300 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa054b17-9f03-495d-920a-76175af779b9-utilities\") pod \"redhat-marketplace-g4zpz\" (UID: \"aa054b17-9f03-495d-920a-76175af779b9\") " pod="openshift-marketplace/redhat-marketplace-g4zpz" Oct 11 02:31:12 crc kubenswrapper[4743]: I1011 02:31:12.624443 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa054b17-9f03-495d-920a-76175af779b9-catalog-content\") pod \"redhat-marketplace-g4zpz\" (UID: \"aa054b17-9f03-495d-920a-76175af779b9\") " pod="openshift-marketplace/redhat-marketplace-g4zpz" Oct 11 02:31:12 crc kubenswrapper[4743]: I1011 02:31:12.642595 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqpzt\" (UniqueName: 
\"kubernetes.io/projected/aa054b17-9f03-495d-920a-76175af779b9-kube-api-access-xqpzt\") pod \"redhat-marketplace-g4zpz\" (UID: \"aa054b17-9f03-495d-920a-76175af779b9\") " pod="openshift-marketplace/redhat-marketplace-g4zpz" Oct 11 02:31:12 crc kubenswrapper[4743]: I1011 02:31:12.720441 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qdthz"] Oct 11 02:31:12 crc kubenswrapper[4743]: I1011 02:31:12.748718 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g4zpz" Oct 11 02:31:12 crc kubenswrapper[4743]: I1011 02:31:12.925377 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qdthz" event={"ID":"071aa1a8-13b1-4c41-bbbe-939ea1ec648c","Type":"ContainerStarted","Data":"4d1df81f6ead8eef3312dbc8553a1c7d7491731cfb6a1514e58c5fd7562e8cc0"} Oct 11 02:31:12 crc kubenswrapper[4743]: I1011 02:31:12.925686 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qdthz" event={"ID":"071aa1a8-13b1-4c41-bbbe-939ea1ec648c","Type":"ContainerStarted","Data":"69a91ffd310ad84336fd364ca8d4d6cb16d29246c3d9cfbec8b58c44d8cbc876"} Oct 11 02:31:13 crc kubenswrapper[4743]: I1011 02:31:13.249327 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-g4zpz"] Oct 11 02:31:13 crc kubenswrapper[4743]: I1011 02:31:13.940927 4743 generic.go:334] "Generic (PLEG): container finished" podID="aa054b17-9f03-495d-920a-76175af779b9" containerID="4ad5f4ba2ea9d6a3bd01270d6d58a0d9f5f0b620f680ba70c4174961416e71b2" exitCode=0 Oct 11 02:31:13 crc kubenswrapper[4743]: I1011 02:31:13.941402 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g4zpz" event={"ID":"aa054b17-9f03-495d-920a-76175af779b9","Type":"ContainerDied","Data":"4ad5f4ba2ea9d6a3bd01270d6d58a0d9f5f0b620f680ba70c4174961416e71b2"} Oct 11 
02:31:13 crc kubenswrapper[4743]: I1011 02:31:13.941438 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g4zpz" event={"ID":"aa054b17-9f03-495d-920a-76175af779b9","Type":"ContainerStarted","Data":"d3280c8f015f36cc6c46c580d2f80199335ba7bf6a6cfe55075cf9e7c5c19586"} Oct 11 02:31:13 crc kubenswrapper[4743]: I1011 02:31:13.945811 4743 generic.go:334] "Generic (PLEG): container finished" podID="071aa1a8-13b1-4c41-bbbe-939ea1ec648c" containerID="4d1df81f6ead8eef3312dbc8553a1c7d7491731cfb6a1514e58c5fd7562e8cc0" exitCode=0 Oct 11 02:31:13 crc kubenswrapper[4743]: I1011 02:31:13.945879 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qdthz" event={"ID":"071aa1a8-13b1-4c41-bbbe-939ea1ec648c","Type":"ContainerDied","Data":"4d1df81f6ead8eef3312dbc8553a1c7d7491731cfb6a1514e58c5fd7562e8cc0"} Oct 11 02:31:14 crc kubenswrapper[4743]: I1011 02:31:14.093291 4743 scope.go:117] "RemoveContainer" containerID="2e5d27e1167f6676643c09a4d46ec4454b12980bd38f5695ef580ca9ee0e294d" Oct 11 02:31:14 crc kubenswrapper[4743]: E1011 02:31:14.094639 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:31:14 crc kubenswrapper[4743]: I1011 02:31:14.959374 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qdthz" event={"ID":"071aa1a8-13b1-4c41-bbbe-939ea1ec648c","Type":"ContainerStarted","Data":"4bd1539f5147e0f5e0fee970373b053d7511cca7d650eed0ab8ea96f09e82283"} Oct 11 02:31:15 crc kubenswrapper[4743]: I1011 02:31:15.973817 4743 generic.go:334] "Generic (PLEG): container 
finished" podID="071aa1a8-13b1-4c41-bbbe-939ea1ec648c" containerID="4bd1539f5147e0f5e0fee970373b053d7511cca7d650eed0ab8ea96f09e82283" exitCode=0 Oct 11 02:31:15 crc kubenswrapper[4743]: I1011 02:31:15.973940 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qdthz" event={"ID":"071aa1a8-13b1-4c41-bbbe-939ea1ec648c","Type":"ContainerDied","Data":"4bd1539f5147e0f5e0fee970373b053d7511cca7d650eed0ab8ea96f09e82283"} Oct 11 02:31:15 crc kubenswrapper[4743]: I1011 02:31:15.978715 4743 generic.go:334] "Generic (PLEG): container finished" podID="aa054b17-9f03-495d-920a-76175af779b9" containerID="0cdcd255ff03856d8f698ab49b231ae84951af5d7744a18ae44291b3c7f1aa02" exitCode=0 Oct 11 02:31:15 crc kubenswrapper[4743]: I1011 02:31:15.978964 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g4zpz" event={"ID":"aa054b17-9f03-495d-920a-76175af779b9","Type":"ContainerDied","Data":"0cdcd255ff03856d8f698ab49b231ae84951af5d7744a18ae44291b3c7f1aa02"} Oct 11 02:31:16 crc kubenswrapper[4743]: I1011 02:31:16.992958 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qdthz" event={"ID":"071aa1a8-13b1-4c41-bbbe-939ea1ec648c","Type":"ContainerStarted","Data":"2032c8b1b80a77f2dbe1ee65e41f0335015392285d07df6e47d251a2a5b5958f"} Oct 11 02:31:16 crc kubenswrapper[4743]: I1011 02:31:16.997351 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g4zpz" event={"ID":"aa054b17-9f03-495d-920a-76175af779b9","Type":"ContainerStarted","Data":"e2f0dafbf0ffb78fd30e10f0807563991e9a48ea48dd62e7c9285bc94cb4a0f9"} Oct 11 02:31:17 crc kubenswrapper[4743]: I1011 02:31:17.047025 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qdthz" podStartSLOduration=2.528967652 podStartE2EDuration="6.046999935s" podCreationTimestamp="2025-10-11 02:31:11 +0000 UTC" 
firstStartedPulling="2025-10-11 02:31:12.92970288 +0000 UTC m=+5967.582683277" lastFinishedPulling="2025-10-11 02:31:16.447735163 +0000 UTC m=+5971.100715560" observedRunningTime="2025-10-11 02:31:17.020648987 +0000 UTC m=+5971.673629384" watchObservedRunningTime="2025-10-11 02:31:17.046999935 +0000 UTC m=+5971.699980342" Oct 11 02:31:17 crc kubenswrapper[4743]: I1011 02:31:17.067393 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-g4zpz" podStartSLOduration=2.553983525 podStartE2EDuration="5.067369433s" podCreationTimestamp="2025-10-11 02:31:12 +0000 UTC" firstStartedPulling="2025-10-11 02:31:13.945943205 +0000 UTC m=+5968.598923612" lastFinishedPulling="2025-10-11 02:31:16.459329123 +0000 UTC m=+5971.112309520" observedRunningTime="2025-10-11 02:31:17.035719083 +0000 UTC m=+5971.688699480" watchObservedRunningTime="2025-10-11 02:31:17.067369433 +0000 UTC m=+5971.720349830" Oct 11 02:31:22 crc kubenswrapper[4743]: I1011 02:31:22.171115 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qdthz" Oct 11 02:31:22 crc kubenswrapper[4743]: I1011 02:31:22.171804 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qdthz" Oct 11 02:31:22 crc kubenswrapper[4743]: I1011 02:31:22.241059 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qdthz" Oct 11 02:31:22 crc kubenswrapper[4743]: I1011 02:31:22.749634 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-g4zpz" Oct 11 02:31:22 crc kubenswrapper[4743]: I1011 02:31:22.750016 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-g4zpz" Oct 11 02:31:22 crc kubenswrapper[4743]: I1011 02:31:22.810789 4743 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-g4zpz" Oct 11 02:31:23 crc kubenswrapper[4743]: I1011 02:31:23.113074 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-g4zpz" Oct 11 02:31:23 crc kubenswrapper[4743]: I1011 02:31:23.121471 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qdthz" Oct 11 02:31:24 crc kubenswrapper[4743]: I1011 02:31:24.402699 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-g4zpz"] Oct 11 02:31:25 crc kubenswrapper[4743]: I1011 02:31:25.087980 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-g4zpz" podUID="aa054b17-9f03-495d-920a-76175af779b9" containerName="registry-server" containerID="cri-o://e2f0dafbf0ffb78fd30e10f0807563991e9a48ea48dd62e7c9285bc94cb4a0f9" gracePeriod=2 Oct 11 02:31:25 crc kubenswrapper[4743]: I1011 02:31:25.395250 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qdthz"] Oct 11 02:31:25 crc kubenswrapper[4743]: I1011 02:31:25.395874 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qdthz" podUID="071aa1a8-13b1-4c41-bbbe-939ea1ec648c" containerName="registry-server" containerID="cri-o://2032c8b1b80a77f2dbe1ee65e41f0335015392285d07df6e47d251a2a5b5958f" gracePeriod=2 Oct 11 02:31:25 crc kubenswrapper[4743]: I1011 02:31:25.638634 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g4zpz" Oct 11 02:31:25 crc kubenswrapper[4743]: I1011 02:31:25.746147 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa054b17-9f03-495d-920a-76175af779b9-catalog-content\") pod \"aa054b17-9f03-495d-920a-76175af779b9\" (UID: \"aa054b17-9f03-495d-920a-76175af779b9\") " Oct 11 02:31:25 crc kubenswrapper[4743]: I1011 02:31:25.746458 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa054b17-9f03-495d-920a-76175af779b9-utilities\") pod \"aa054b17-9f03-495d-920a-76175af779b9\" (UID: \"aa054b17-9f03-495d-920a-76175af779b9\") " Oct 11 02:31:25 crc kubenswrapper[4743]: I1011 02:31:25.746535 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqpzt\" (UniqueName: \"kubernetes.io/projected/aa054b17-9f03-495d-920a-76175af779b9-kube-api-access-xqpzt\") pod \"aa054b17-9f03-495d-920a-76175af779b9\" (UID: \"aa054b17-9f03-495d-920a-76175af779b9\") " Oct 11 02:31:25 crc kubenswrapper[4743]: I1011 02:31:25.748787 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa054b17-9f03-495d-920a-76175af779b9-utilities" (OuterVolumeSpecName: "utilities") pod "aa054b17-9f03-495d-920a-76175af779b9" (UID: "aa054b17-9f03-495d-920a-76175af779b9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 02:31:25 crc kubenswrapper[4743]: I1011 02:31:25.753627 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa054b17-9f03-495d-920a-76175af779b9-kube-api-access-xqpzt" (OuterVolumeSpecName: "kube-api-access-xqpzt") pod "aa054b17-9f03-495d-920a-76175af779b9" (UID: "aa054b17-9f03-495d-920a-76175af779b9"). InnerVolumeSpecName "kube-api-access-xqpzt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 02:31:25 crc kubenswrapper[4743]: I1011 02:31:25.759396 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa054b17-9f03-495d-920a-76175af779b9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "aa054b17-9f03-495d-920a-76175af779b9" (UID: "aa054b17-9f03-495d-920a-76175af779b9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 02:31:25 crc kubenswrapper[4743]: I1011 02:31:25.849517 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa054b17-9f03-495d-920a-76175af779b9-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 11 02:31:25 crc kubenswrapper[4743]: I1011 02:31:25.849548 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa054b17-9f03-495d-920a-76175af779b9-utilities\") on node \"crc\" DevicePath \"\"" Oct 11 02:31:25 crc kubenswrapper[4743]: I1011 02:31:25.849560 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xqpzt\" (UniqueName: \"kubernetes.io/projected/aa054b17-9f03-495d-920a-76175af779b9-kube-api-access-xqpzt\") on node \"crc\" DevicePath \"\"" Oct 11 02:31:25 crc kubenswrapper[4743]: I1011 02:31:25.944143 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qdthz" Oct 11 02:31:26 crc kubenswrapper[4743]: I1011 02:31:26.053713 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44z9f\" (UniqueName: \"kubernetes.io/projected/071aa1a8-13b1-4c41-bbbe-939ea1ec648c-kube-api-access-44z9f\") pod \"071aa1a8-13b1-4c41-bbbe-939ea1ec648c\" (UID: \"071aa1a8-13b1-4c41-bbbe-939ea1ec648c\") " Oct 11 02:31:26 crc kubenswrapper[4743]: I1011 02:31:26.053772 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/071aa1a8-13b1-4c41-bbbe-939ea1ec648c-utilities\") pod \"071aa1a8-13b1-4c41-bbbe-939ea1ec648c\" (UID: \"071aa1a8-13b1-4c41-bbbe-939ea1ec648c\") " Oct 11 02:31:26 crc kubenswrapper[4743]: I1011 02:31:26.053935 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/071aa1a8-13b1-4c41-bbbe-939ea1ec648c-catalog-content\") pod \"071aa1a8-13b1-4c41-bbbe-939ea1ec648c\" (UID: \"071aa1a8-13b1-4c41-bbbe-939ea1ec648c\") " Oct 11 02:31:26 crc kubenswrapper[4743]: I1011 02:31:26.054725 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/071aa1a8-13b1-4c41-bbbe-939ea1ec648c-utilities" (OuterVolumeSpecName: "utilities") pod "071aa1a8-13b1-4c41-bbbe-939ea1ec648c" (UID: "071aa1a8-13b1-4c41-bbbe-939ea1ec648c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 02:31:26 crc kubenswrapper[4743]: I1011 02:31:26.056434 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/071aa1a8-13b1-4c41-bbbe-939ea1ec648c-kube-api-access-44z9f" (OuterVolumeSpecName: "kube-api-access-44z9f") pod "071aa1a8-13b1-4c41-bbbe-939ea1ec648c" (UID: "071aa1a8-13b1-4c41-bbbe-939ea1ec648c"). InnerVolumeSpecName "kube-api-access-44z9f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 02:31:26 crc kubenswrapper[4743]: I1011 02:31:26.075029 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44z9f\" (UniqueName: \"kubernetes.io/projected/071aa1a8-13b1-4c41-bbbe-939ea1ec648c-kube-api-access-44z9f\") on node \"crc\" DevicePath \"\"" Oct 11 02:31:26 crc kubenswrapper[4743]: I1011 02:31:26.075317 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/071aa1a8-13b1-4c41-bbbe-939ea1ec648c-utilities\") on node \"crc\" DevicePath \"\"" Oct 11 02:31:26 crc kubenswrapper[4743]: I1011 02:31:26.099956 4743 generic.go:334] "Generic (PLEG): container finished" podID="aa054b17-9f03-495d-920a-76175af779b9" containerID="e2f0dafbf0ffb78fd30e10f0807563991e9a48ea48dd62e7c9285bc94cb4a0f9" exitCode=0 Oct 11 02:31:26 crc kubenswrapper[4743]: I1011 02:31:26.102963 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g4zpz" Oct 11 02:31:26 crc kubenswrapper[4743]: I1011 02:31:26.105253 4743 scope.go:117] "RemoveContainer" containerID="2e5d27e1167f6676643c09a4d46ec4454b12980bd38f5695ef580ca9ee0e294d" Oct 11 02:31:26 crc kubenswrapper[4743]: I1011 02:31:26.108255 4743 generic.go:334] "Generic (PLEG): container finished" podID="071aa1a8-13b1-4c41-bbbe-939ea1ec648c" containerID="2032c8b1b80a77f2dbe1ee65e41f0335015392285d07df6e47d251a2a5b5958f" exitCode=0 Oct 11 02:31:26 crc kubenswrapper[4743]: I1011 02:31:26.108700 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qdthz" Oct 11 02:31:26 crc kubenswrapper[4743]: I1011 02:31:26.112327 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g4zpz" event={"ID":"aa054b17-9f03-495d-920a-76175af779b9","Type":"ContainerDied","Data":"e2f0dafbf0ffb78fd30e10f0807563991e9a48ea48dd62e7c9285bc94cb4a0f9"} Oct 11 02:31:26 crc kubenswrapper[4743]: I1011 02:31:26.112436 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g4zpz" event={"ID":"aa054b17-9f03-495d-920a-76175af779b9","Type":"ContainerDied","Data":"d3280c8f015f36cc6c46c580d2f80199335ba7bf6a6cfe55075cf9e7c5c19586"} Oct 11 02:31:26 crc kubenswrapper[4743]: I1011 02:31:26.112553 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qdthz" event={"ID":"071aa1a8-13b1-4c41-bbbe-939ea1ec648c","Type":"ContainerDied","Data":"2032c8b1b80a77f2dbe1ee65e41f0335015392285d07df6e47d251a2a5b5958f"} Oct 11 02:31:26 crc kubenswrapper[4743]: I1011 02:31:26.112616 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qdthz" event={"ID":"071aa1a8-13b1-4c41-bbbe-939ea1ec648c","Type":"ContainerDied","Data":"69a91ffd310ad84336fd364ca8d4d6cb16d29246c3d9cfbec8b58c44d8cbc876"} Oct 11 02:31:26 crc kubenswrapper[4743]: I1011 02:31:26.112517 4743 scope.go:117] "RemoveContainer" containerID="e2f0dafbf0ffb78fd30e10f0807563991e9a48ea48dd62e7c9285bc94cb4a0f9" Oct 11 02:31:26 crc kubenswrapper[4743]: I1011 02:31:26.113817 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/071aa1a8-13b1-4c41-bbbe-939ea1ec648c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "071aa1a8-13b1-4c41-bbbe-939ea1ec648c" (UID: "071aa1a8-13b1-4c41-bbbe-939ea1ec648c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 02:31:26 crc kubenswrapper[4743]: I1011 02:31:26.143116 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-g4zpz"] Oct 11 02:31:26 crc kubenswrapper[4743]: I1011 02:31:26.152573 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-g4zpz"] Oct 11 02:31:26 crc kubenswrapper[4743]: I1011 02:31:26.169112 4743 scope.go:117] "RemoveContainer" containerID="0cdcd255ff03856d8f698ab49b231ae84951af5d7744a18ae44291b3c7f1aa02" Oct 11 02:31:26 crc kubenswrapper[4743]: I1011 02:31:26.177106 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/071aa1a8-13b1-4c41-bbbe-939ea1ec648c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 11 02:31:26 crc kubenswrapper[4743]: I1011 02:31:26.201883 4743 scope.go:117] "RemoveContainer" containerID="4ad5f4ba2ea9d6a3bd01270d6d58a0d9f5f0b620f680ba70c4174961416e71b2" Oct 11 02:31:26 crc kubenswrapper[4743]: I1011 02:31:26.274117 4743 scope.go:117] "RemoveContainer" containerID="e2f0dafbf0ffb78fd30e10f0807563991e9a48ea48dd62e7c9285bc94cb4a0f9" Oct 11 02:31:26 crc kubenswrapper[4743]: E1011 02:31:26.274491 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2f0dafbf0ffb78fd30e10f0807563991e9a48ea48dd62e7c9285bc94cb4a0f9\": container with ID starting with e2f0dafbf0ffb78fd30e10f0807563991e9a48ea48dd62e7c9285bc94cb4a0f9 not found: ID does not exist" containerID="e2f0dafbf0ffb78fd30e10f0807563991e9a48ea48dd62e7c9285bc94cb4a0f9" Oct 11 02:31:26 crc kubenswrapper[4743]: I1011 02:31:26.274534 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2f0dafbf0ffb78fd30e10f0807563991e9a48ea48dd62e7c9285bc94cb4a0f9"} err="failed to get container status 
\"e2f0dafbf0ffb78fd30e10f0807563991e9a48ea48dd62e7c9285bc94cb4a0f9\": rpc error: code = NotFound desc = could not find container \"e2f0dafbf0ffb78fd30e10f0807563991e9a48ea48dd62e7c9285bc94cb4a0f9\": container with ID starting with e2f0dafbf0ffb78fd30e10f0807563991e9a48ea48dd62e7c9285bc94cb4a0f9 not found: ID does not exist" Oct 11 02:31:26 crc kubenswrapper[4743]: I1011 02:31:26.274559 4743 scope.go:117] "RemoveContainer" containerID="0cdcd255ff03856d8f698ab49b231ae84951af5d7744a18ae44291b3c7f1aa02" Oct 11 02:31:26 crc kubenswrapper[4743]: E1011 02:31:26.274798 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0cdcd255ff03856d8f698ab49b231ae84951af5d7744a18ae44291b3c7f1aa02\": container with ID starting with 0cdcd255ff03856d8f698ab49b231ae84951af5d7744a18ae44291b3c7f1aa02 not found: ID does not exist" containerID="0cdcd255ff03856d8f698ab49b231ae84951af5d7744a18ae44291b3c7f1aa02" Oct 11 02:31:26 crc kubenswrapper[4743]: I1011 02:31:26.274827 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cdcd255ff03856d8f698ab49b231ae84951af5d7744a18ae44291b3c7f1aa02"} err="failed to get container status \"0cdcd255ff03856d8f698ab49b231ae84951af5d7744a18ae44291b3c7f1aa02\": rpc error: code = NotFound desc = could not find container \"0cdcd255ff03856d8f698ab49b231ae84951af5d7744a18ae44291b3c7f1aa02\": container with ID starting with 0cdcd255ff03856d8f698ab49b231ae84951af5d7744a18ae44291b3c7f1aa02 not found: ID does not exist" Oct 11 02:31:26 crc kubenswrapper[4743]: I1011 02:31:26.274846 4743 scope.go:117] "RemoveContainer" containerID="4ad5f4ba2ea9d6a3bd01270d6d58a0d9f5f0b620f680ba70c4174961416e71b2" Oct 11 02:31:26 crc kubenswrapper[4743]: E1011 02:31:26.275107 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"4ad5f4ba2ea9d6a3bd01270d6d58a0d9f5f0b620f680ba70c4174961416e71b2\": container with ID starting with 4ad5f4ba2ea9d6a3bd01270d6d58a0d9f5f0b620f680ba70c4174961416e71b2 not found: ID does not exist" containerID="4ad5f4ba2ea9d6a3bd01270d6d58a0d9f5f0b620f680ba70c4174961416e71b2" Oct 11 02:31:26 crc kubenswrapper[4743]: I1011 02:31:26.275135 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ad5f4ba2ea9d6a3bd01270d6d58a0d9f5f0b620f680ba70c4174961416e71b2"} err="failed to get container status \"4ad5f4ba2ea9d6a3bd01270d6d58a0d9f5f0b620f680ba70c4174961416e71b2\": rpc error: code = NotFound desc = could not find container \"4ad5f4ba2ea9d6a3bd01270d6d58a0d9f5f0b620f680ba70c4174961416e71b2\": container with ID starting with 4ad5f4ba2ea9d6a3bd01270d6d58a0d9f5f0b620f680ba70c4174961416e71b2 not found: ID does not exist" Oct 11 02:31:26 crc kubenswrapper[4743]: I1011 02:31:26.275154 4743 scope.go:117] "RemoveContainer" containerID="2032c8b1b80a77f2dbe1ee65e41f0335015392285d07df6e47d251a2a5b5958f" Oct 11 02:31:26 crc kubenswrapper[4743]: I1011 02:31:26.333978 4743 scope.go:117] "RemoveContainer" containerID="4bd1539f5147e0f5e0fee970373b053d7511cca7d650eed0ab8ea96f09e82283" Oct 11 02:31:26 crc kubenswrapper[4743]: I1011 02:31:26.360458 4743 scope.go:117] "RemoveContainer" containerID="4d1df81f6ead8eef3312dbc8553a1c7d7491731cfb6a1514e58c5fd7562e8cc0" Oct 11 02:31:26 crc kubenswrapper[4743]: I1011 02:31:26.408829 4743 scope.go:117] "RemoveContainer" containerID="2032c8b1b80a77f2dbe1ee65e41f0335015392285d07df6e47d251a2a5b5958f" Oct 11 02:31:26 crc kubenswrapper[4743]: E1011 02:31:26.410016 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2032c8b1b80a77f2dbe1ee65e41f0335015392285d07df6e47d251a2a5b5958f\": container with ID starting with 2032c8b1b80a77f2dbe1ee65e41f0335015392285d07df6e47d251a2a5b5958f not found: ID does not exist" 
containerID="2032c8b1b80a77f2dbe1ee65e41f0335015392285d07df6e47d251a2a5b5958f" Oct 11 02:31:26 crc kubenswrapper[4743]: I1011 02:31:26.410045 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2032c8b1b80a77f2dbe1ee65e41f0335015392285d07df6e47d251a2a5b5958f"} err="failed to get container status \"2032c8b1b80a77f2dbe1ee65e41f0335015392285d07df6e47d251a2a5b5958f\": rpc error: code = NotFound desc = could not find container \"2032c8b1b80a77f2dbe1ee65e41f0335015392285d07df6e47d251a2a5b5958f\": container with ID starting with 2032c8b1b80a77f2dbe1ee65e41f0335015392285d07df6e47d251a2a5b5958f not found: ID does not exist" Oct 11 02:31:26 crc kubenswrapper[4743]: I1011 02:31:26.410067 4743 scope.go:117] "RemoveContainer" containerID="4bd1539f5147e0f5e0fee970373b053d7511cca7d650eed0ab8ea96f09e82283" Oct 11 02:31:26 crc kubenswrapper[4743]: E1011 02:31:26.413335 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4bd1539f5147e0f5e0fee970373b053d7511cca7d650eed0ab8ea96f09e82283\": container with ID starting with 4bd1539f5147e0f5e0fee970373b053d7511cca7d650eed0ab8ea96f09e82283 not found: ID does not exist" containerID="4bd1539f5147e0f5e0fee970373b053d7511cca7d650eed0ab8ea96f09e82283" Oct 11 02:31:26 crc kubenswrapper[4743]: I1011 02:31:26.413415 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bd1539f5147e0f5e0fee970373b053d7511cca7d650eed0ab8ea96f09e82283"} err="failed to get container status \"4bd1539f5147e0f5e0fee970373b053d7511cca7d650eed0ab8ea96f09e82283\": rpc error: code = NotFound desc = could not find container \"4bd1539f5147e0f5e0fee970373b053d7511cca7d650eed0ab8ea96f09e82283\": container with ID starting with 4bd1539f5147e0f5e0fee970373b053d7511cca7d650eed0ab8ea96f09e82283 not found: ID does not exist" Oct 11 02:31:26 crc kubenswrapper[4743]: I1011 02:31:26.413430 4743 scope.go:117] 
"RemoveContainer" containerID="4d1df81f6ead8eef3312dbc8553a1c7d7491731cfb6a1514e58c5fd7562e8cc0" Oct 11 02:31:26 crc kubenswrapper[4743]: E1011 02:31:26.414796 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d1df81f6ead8eef3312dbc8553a1c7d7491731cfb6a1514e58c5fd7562e8cc0\": container with ID starting with 4d1df81f6ead8eef3312dbc8553a1c7d7491731cfb6a1514e58c5fd7562e8cc0 not found: ID does not exist" containerID="4d1df81f6ead8eef3312dbc8553a1c7d7491731cfb6a1514e58c5fd7562e8cc0" Oct 11 02:31:26 crc kubenswrapper[4743]: I1011 02:31:26.414820 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d1df81f6ead8eef3312dbc8553a1c7d7491731cfb6a1514e58c5fd7562e8cc0"} err="failed to get container status \"4d1df81f6ead8eef3312dbc8553a1c7d7491731cfb6a1514e58c5fd7562e8cc0\": rpc error: code = NotFound desc = could not find container \"4d1df81f6ead8eef3312dbc8553a1c7d7491731cfb6a1514e58c5fd7562e8cc0\": container with ID starting with 4d1df81f6ead8eef3312dbc8553a1c7d7491731cfb6a1514e58c5fd7562e8cc0 not found: ID does not exist" Oct 11 02:31:26 crc kubenswrapper[4743]: I1011 02:31:26.470479 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qdthz"] Oct 11 02:31:26 crc kubenswrapper[4743]: I1011 02:31:26.482084 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qdthz"] Oct 11 02:31:27 crc kubenswrapper[4743]: I1011 02:31:27.124189 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" event={"ID":"add92263-e252-446b-95de-092585b4357f","Type":"ContainerStarted","Data":"7428f79637ae147481ca81c99fa3a25d4c0eb43fab0c6c46b4b1e0515b9d596d"} Oct 11 02:31:28 crc kubenswrapper[4743]: I1011 02:31:28.124528 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="071aa1a8-13b1-4c41-bbbe-939ea1ec648c" path="/var/lib/kubelet/pods/071aa1a8-13b1-4c41-bbbe-939ea1ec648c/volumes" Oct 11 02:31:28 crc kubenswrapper[4743]: I1011 02:31:28.126293 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa054b17-9f03-495d-920a-76175af779b9" path="/var/lib/kubelet/pods/aa054b17-9f03-495d-920a-76175af779b9/volumes" Oct 11 02:32:58 crc kubenswrapper[4743]: I1011 02:32:58.165303 4743 scope.go:117] "RemoveContainer" containerID="a43d11181f08a50e4a207a5eb196e5fdbccde6aba98a6019ce74f8c591f28412" Oct 11 02:32:58 crc kubenswrapper[4743]: I1011 02:32:58.194142 4743 scope.go:117] "RemoveContainer" containerID="3d0cc4ad4b26152e6ee5dd704b62edefd41bac06354671902dbfe82033c084a2" Oct 11 02:32:58 crc kubenswrapper[4743]: I1011 02:32:58.286628 4743 scope.go:117] "RemoveContainer" containerID="ac6ee0386097c7bd0b56f1e58fcb54edf9440eaafdbffc087aa7015390839793" Oct 11 02:33:44 crc kubenswrapper[4743]: I1011 02:33:44.458830 4743 patch_prober.go:28] interesting pod/machine-config-daemon-cvm72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 02:33:44 crc kubenswrapper[4743]: I1011 02:33:44.459516 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 02:33:59 crc kubenswrapper[4743]: I1011 02:33:59.488788 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/swift-proxy-744b8cd687-p7lgl" podUID="68219217-d875-4eb2-9611-b9afb0f64c45" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Oct 11 02:34:14 crc 
kubenswrapper[4743]: I1011 02:34:14.461353 4743 patch_prober.go:28] interesting pod/machine-config-daemon-cvm72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 02:34:14 crc kubenswrapper[4743]: I1011 02:34:14.461990 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 02:34:44 crc kubenswrapper[4743]: I1011 02:34:44.457913 4743 patch_prober.go:28] interesting pod/machine-config-daemon-cvm72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 02:34:44 crc kubenswrapper[4743]: I1011 02:34:44.458661 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 02:34:44 crc kubenswrapper[4743]: I1011 02:34:44.458733 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" Oct 11 02:34:44 crc kubenswrapper[4743]: I1011 02:34:44.460175 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7428f79637ae147481ca81c99fa3a25d4c0eb43fab0c6c46b4b1e0515b9d596d"} 
pod="openshift-machine-config-operator/machine-config-daemon-cvm72" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 11 02:34:44 crc kubenswrapper[4743]: I1011 02:34:44.460292 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" containerID="cri-o://7428f79637ae147481ca81c99fa3a25d4c0eb43fab0c6c46b4b1e0515b9d596d" gracePeriod=600 Oct 11 02:34:45 crc kubenswrapper[4743]: I1011 02:34:45.433811 4743 generic.go:334] "Generic (PLEG): container finished" podID="add92263-e252-446b-95de-092585b4357f" containerID="7428f79637ae147481ca81c99fa3a25d4c0eb43fab0c6c46b4b1e0515b9d596d" exitCode=0 Oct 11 02:34:45 crc kubenswrapper[4743]: I1011 02:34:45.433881 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" event={"ID":"add92263-e252-446b-95de-092585b4357f","Type":"ContainerDied","Data":"7428f79637ae147481ca81c99fa3a25d4c0eb43fab0c6c46b4b1e0515b9d596d"} Oct 11 02:34:45 crc kubenswrapper[4743]: I1011 02:34:45.434291 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" event={"ID":"add92263-e252-446b-95de-092585b4357f","Type":"ContainerStarted","Data":"7d3e7e17173697b69ee1bb733cd6891f396ed2b62c761ae95305dcea918a8e2b"} Oct 11 02:34:45 crc kubenswrapper[4743]: I1011 02:34:45.434312 4743 scope.go:117] "RemoveContainer" containerID="2e5d27e1167f6676643c09a4d46ec4454b12980bd38f5695ef580ca9ee0e294d" Oct 11 02:36:44 crc kubenswrapper[4743]: I1011 02:36:44.458164 4743 patch_prober.go:28] interesting pod/machine-config-daemon-cvm72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Oct 11 02:36:44 crc kubenswrapper[4743]: I1011 02:36:44.458962 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 02:36:45 crc kubenswrapper[4743]: E1011 02:36:45.486688 4743 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.106:37608->38.102.83.106:39201: write tcp 38.102.83.106:37608->38.102.83.106:39201: write: broken pipe Oct 11 02:37:06 crc kubenswrapper[4743]: I1011 02:37:06.425288 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Oct 11 02:37:06 crc kubenswrapper[4743]: E1011 02:37:06.426389 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="071aa1a8-13b1-4c41-bbbe-939ea1ec648c" containerName="registry-server" Oct 11 02:37:06 crc kubenswrapper[4743]: I1011 02:37:06.426409 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="071aa1a8-13b1-4c41-bbbe-939ea1ec648c" containerName="registry-server" Oct 11 02:37:06 crc kubenswrapper[4743]: E1011 02:37:06.426437 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa054b17-9f03-495d-920a-76175af779b9" containerName="registry-server" Oct 11 02:37:06 crc kubenswrapper[4743]: I1011 02:37:06.426447 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa054b17-9f03-495d-920a-76175af779b9" containerName="registry-server" Oct 11 02:37:06 crc kubenswrapper[4743]: E1011 02:37:06.426468 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="071aa1a8-13b1-4c41-bbbe-939ea1ec648c" containerName="extract-content" Oct 11 02:37:06 crc kubenswrapper[4743]: I1011 02:37:06.426478 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="071aa1a8-13b1-4c41-bbbe-939ea1ec648c" 
containerName="extract-content" Oct 11 02:37:06 crc kubenswrapper[4743]: E1011 02:37:06.426490 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="071aa1a8-13b1-4c41-bbbe-939ea1ec648c" containerName="extract-utilities" Oct 11 02:37:06 crc kubenswrapper[4743]: I1011 02:37:06.426499 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="071aa1a8-13b1-4c41-bbbe-939ea1ec648c" containerName="extract-utilities" Oct 11 02:37:06 crc kubenswrapper[4743]: E1011 02:37:06.426524 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa054b17-9f03-495d-920a-76175af779b9" containerName="extract-content" Oct 11 02:37:06 crc kubenswrapper[4743]: I1011 02:37:06.426535 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa054b17-9f03-495d-920a-76175af779b9" containerName="extract-content" Oct 11 02:37:06 crc kubenswrapper[4743]: E1011 02:37:06.426561 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa054b17-9f03-495d-920a-76175af779b9" containerName="extract-utilities" Oct 11 02:37:06 crc kubenswrapper[4743]: I1011 02:37:06.426573 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa054b17-9f03-495d-920a-76175af779b9" containerName="extract-utilities" Oct 11 02:37:06 crc kubenswrapper[4743]: I1011 02:37:06.426914 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa054b17-9f03-495d-920a-76175af779b9" containerName="registry-server" Oct 11 02:37:06 crc kubenswrapper[4743]: I1011 02:37:06.426951 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="071aa1a8-13b1-4c41-bbbe-939ea1ec648c" containerName="registry-server" Oct 11 02:37:06 crc kubenswrapper[4743]: I1011 02:37:06.427846 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 11 02:37:06 crc kubenswrapper[4743]: I1011 02:37:06.429892 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Oct 11 02:37:06 crc kubenswrapper[4743]: I1011 02:37:06.430299 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Oct 11 02:37:06 crc kubenswrapper[4743]: I1011 02:37:06.431419 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-xtpzb" Oct 11 02:37:06 crc kubenswrapper[4743]: I1011 02:37:06.431670 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Oct 11 02:37:06 crc kubenswrapper[4743]: I1011 02:37:06.449453 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Oct 11 02:37:06 crc kubenswrapper[4743]: I1011 02:37:06.503205 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/59e812a1-677a-4aca-bb9a-c4f0d166710a-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"59e812a1-677a-4aca-bb9a-c4f0d166710a\") " pod="openstack/tempest-tests-tempest" Oct 11 02:37:06 crc kubenswrapper[4743]: I1011 02:37:06.503248 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/59e812a1-677a-4aca-bb9a-c4f0d166710a-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"59e812a1-677a-4aca-bb9a-c4f0d166710a\") " pod="openstack/tempest-tests-tempest" Oct 11 02:37:06 crc kubenswrapper[4743]: I1011 02:37:06.503322 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: 
\"kubernetes.io/empty-dir/59e812a1-677a-4aca-bb9a-c4f0d166710a-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"59e812a1-677a-4aca-bb9a-c4f0d166710a\") " pod="openstack/tempest-tests-tempest" Oct 11 02:37:06 crc kubenswrapper[4743]: I1011 02:37:06.503389 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/59e812a1-677a-4aca-bb9a-c4f0d166710a-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"59e812a1-677a-4aca-bb9a-c4f0d166710a\") " pod="openstack/tempest-tests-tempest" Oct 11 02:37:06 crc kubenswrapper[4743]: I1011 02:37:06.503450 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tbht\" (UniqueName: \"kubernetes.io/projected/59e812a1-677a-4aca-bb9a-c4f0d166710a-kube-api-access-2tbht\") pod \"tempest-tests-tempest\" (UID: \"59e812a1-677a-4aca-bb9a-c4f0d166710a\") " pod="openstack/tempest-tests-tempest" Oct 11 02:37:06 crc kubenswrapper[4743]: I1011 02:37:06.503489 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest\" (UID: \"59e812a1-677a-4aca-bb9a-c4f0d166710a\") " pod="openstack/tempest-tests-tempest" Oct 11 02:37:06 crc kubenswrapper[4743]: I1011 02:37:06.503559 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/59e812a1-677a-4aca-bb9a-c4f0d166710a-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"59e812a1-677a-4aca-bb9a-c4f0d166710a\") " pod="openstack/tempest-tests-tempest" Oct 11 02:37:06 crc kubenswrapper[4743]: I1011 02:37:06.503591 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/59e812a1-677a-4aca-bb9a-c4f0d166710a-config-data\") pod \"tempest-tests-tempest\" (UID: \"59e812a1-677a-4aca-bb9a-c4f0d166710a\") " pod="openstack/tempest-tests-tempest" Oct 11 02:37:06 crc kubenswrapper[4743]: I1011 02:37:06.503730 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/59e812a1-677a-4aca-bb9a-c4f0d166710a-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"59e812a1-677a-4aca-bb9a-c4f0d166710a\") " pod="openstack/tempest-tests-tempest" Oct 11 02:37:06 crc kubenswrapper[4743]: I1011 02:37:06.605093 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tbht\" (UniqueName: \"kubernetes.io/projected/59e812a1-677a-4aca-bb9a-c4f0d166710a-kube-api-access-2tbht\") pod \"tempest-tests-tempest\" (UID: \"59e812a1-677a-4aca-bb9a-c4f0d166710a\") " pod="openstack/tempest-tests-tempest" Oct 11 02:37:06 crc kubenswrapper[4743]: I1011 02:37:06.605151 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest\" (UID: \"59e812a1-677a-4aca-bb9a-c4f0d166710a\") " pod="openstack/tempest-tests-tempest" Oct 11 02:37:06 crc kubenswrapper[4743]: I1011 02:37:06.605198 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/59e812a1-677a-4aca-bb9a-c4f0d166710a-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"59e812a1-677a-4aca-bb9a-c4f0d166710a\") " pod="openstack/tempest-tests-tempest" Oct 11 02:37:06 crc kubenswrapper[4743]: I1011 02:37:06.605231 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/59e812a1-677a-4aca-bb9a-c4f0d166710a-config-data\") pod \"tempest-tests-tempest\" (UID: 
\"59e812a1-677a-4aca-bb9a-c4f0d166710a\") " pod="openstack/tempest-tests-tempest" Oct 11 02:37:06 crc kubenswrapper[4743]: I1011 02:37:06.605265 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/59e812a1-677a-4aca-bb9a-c4f0d166710a-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"59e812a1-677a-4aca-bb9a-c4f0d166710a\") " pod="openstack/tempest-tests-tempest" Oct 11 02:37:06 crc kubenswrapper[4743]: I1011 02:37:06.605340 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/59e812a1-677a-4aca-bb9a-c4f0d166710a-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"59e812a1-677a-4aca-bb9a-c4f0d166710a\") " pod="openstack/tempest-tests-tempest" Oct 11 02:37:06 crc kubenswrapper[4743]: I1011 02:37:06.605368 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/59e812a1-677a-4aca-bb9a-c4f0d166710a-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"59e812a1-677a-4aca-bb9a-c4f0d166710a\") " pod="openstack/tempest-tests-tempest" Oct 11 02:37:06 crc kubenswrapper[4743]: I1011 02:37:06.605471 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/59e812a1-677a-4aca-bb9a-c4f0d166710a-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"59e812a1-677a-4aca-bb9a-c4f0d166710a\") " pod="openstack/tempest-tests-tempest" Oct 11 02:37:06 crc kubenswrapper[4743]: I1011 02:37:06.605616 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/59e812a1-677a-4aca-bb9a-c4f0d166710a-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"59e812a1-677a-4aca-bb9a-c4f0d166710a\") " 
pod="openstack/tempest-tests-tempest" Oct 11 02:37:06 crc kubenswrapper[4743]: I1011 02:37:06.606357 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/59e812a1-677a-4aca-bb9a-c4f0d166710a-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"59e812a1-677a-4aca-bb9a-c4f0d166710a\") " pod="openstack/tempest-tests-tempest" Oct 11 02:37:06 crc kubenswrapper[4743]: I1011 02:37:06.606543 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/59e812a1-677a-4aca-bb9a-c4f0d166710a-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"59e812a1-677a-4aca-bb9a-c4f0d166710a\") " pod="openstack/tempest-tests-tempest" Oct 11 02:37:06 crc kubenswrapper[4743]: I1011 02:37:06.606561 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/59e812a1-677a-4aca-bb9a-c4f0d166710a-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"59e812a1-677a-4aca-bb9a-c4f0d166710a\") " pod="openstack/tempest-tests-tempest" Oct 11 02:37:06 crc kubenswrapper[4743]: I1011 02:37:06.606747 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/59e812a1-677a-4aca-bb9a-c4f0d166710a-config-data\") pod \"tempest-tests-tempest\" (UID: \"59e812a1-677a-4aca-bb9a-c4f0d166710a\") " pod="openstack/tempest-tests-tempest" Oct 11 02:37:06 crc kubenswrapper[4743]: I1011 02:37:06.607611 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest\" (UID: \"59e812a1-677a-4aca-bb9a-c4f0d166710a\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/tempest-tests-tempest" Oct 11 02:37:06 crc 
kubenswrapper[4743]: I1011 02:37:06.611248 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/59e812a1-677a-4aca-bb9a-c4f0d166710a-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"59e812a1-677a-4aca-bb9a-c4f0d166710a\") " pod="openstack/tempest-tests-tempest" Oct 11 02:37:06 crc kubenswrapper[4743]: I1011 02:37:06.616983 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/59e812a1-677a-4aca-bb9a-c4f0d166710a-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"59e812a1-677a-4aca-bb9a-c4f0d166710a\") " pod="openstack/tempest-tests-tempest" Oct 11 02:37:06 crc kubenswrapper[4743]: I1011 02:37:06.621114 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/59e812a1-677a-4aca-bb9a-c4f0d166710a-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"59e812a1-677a-4aca-bb9a-c4f0d166710a\") " pod="openstack/tempest-tests-tempest" Oct 11 02:37:06 crc kubenswrapper[4743]: I1011 02:37:06.622074 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tbht\" (UniqueName: \"kubernetes.io/projected/59e812a1-677a-4aca-bb9a-c4f0d166710a-kube-api-access-2tbht\") pod \"tempest-tests-tempest\" (UID: \"59e812a1-677a-4aca-bb9a-c4f0d166710a\") " pod="openstack/tempest-tests-tempest" Oct 11 02:37:06 crc kubenswrapper[4743]: I1011 02:37:06.665930 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest\" (UID: \"59e812a1-677a-4aca-bb9a-c4f0d166710a\") " pod="openstack/tempest-tests-tempest" Oct 11 02:37:06 crc kubenswrapper[4743]: I1011 02:37:06.755344 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 11 02:37:07 crc kubenswrapper[4743]: I1011 02:37:07.287348 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Oct 11 02:37:07 crc kubenswrapper[4743]: W1011 02:37:07.293873 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod59e812a1_677a_4aca_bb9a_c4f0d166710a.slice/crio-4cbcedab8669dc7e88e557e1077dcba47b8cba98b942e275eced85107128365a WatchSource:0}: Error finding container 4cbcedab8669dc7e88e557e1077dcba47b8cba98b942e275eced85107128365a: Status 404 returned error can't find the container with id 4cbcedab8669dc7e88e557e1077dcba47b8cba98b942e275eced85107128365a Oct 11 02:37:07 crc kubenswrapper[4743]: I1011 02:37:07.298035 4743 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 11 02:37:08 crc kubenswrapper[4743]: I1011 02:37:08.115466 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"59e812a1-677a-4aca-bb9a-c4f0d166710a","Type":"ContainerStarted","Data":"4cbcedab8669dc7e88e557e1077dcba47b8cba98b942e275eced85107128365a"} Oct 11 02:37:14 crc kubenswrapper[4743]: I1011 02:37:14.458120 4743 patch_prober.go:28] interesting pod/machine-config-daemon-cvm72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 02:37:14 crc kubenswrapper[4743]: I1011 02:37:14.460235 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 02:37:36 crc 
kubenswrapper[4743]: E1011 02:37:36.717069 4743 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Oct 11 02:37:36 crc kubenswrapper[4743]: E1011 02:37:36.720941 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,Recur
siveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2tbht,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(59e812a1-677a-4aca-bb9a-c4f0d166710a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 11 02:37:36 crc kubenswrapper[4743]: E1011 02:37:36.722262 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="59e812a1-677a-4aca-bb9a-c4f0d166710a" Oct 11 02:37:37 crc kubenswrapper[4743]: E1011 02:37:37.454502 
4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="59e812a1-677a-4aca-bb9a-c4f0d166710a" Oct 11 02:37:44 crc kubenswrapper[4743]: I1011 02:37:44.458843 4743 patch_prober.go:28] interesting pod/machine-config-daemon-cvm72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 02:37:44 crc kubenswrapper[4743]: I1011 02:37:44.460988 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 02:37:44 crc kubenswrapper[4743]: I1011 02:37:44.461080 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" Oct 11 02:37:44 crc kubenswrapper[4743]: I1011 02:37:44.462196 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7d3e7e17173697b69ee1bb733cd6891f396ed2b62c761ae95305dcea918a8e2b"} pod="openshift-machine-config-operator/machine-config-daemon-cvm72" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 11 02:37:44 crc kubenswrapper[4743]: I1011 02:37:44.462277 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" 
containerName="machine-config-daemon" containerID="cri-o://7d3e7e17173697b69ee1bb733cd6891f396ed2b62c761ae95305dcea918a8e2b" gracePeriod=600 Oct 11 02:37:44 crc kubenswrapper[4743]: E1011 02:37:44.602323 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:37:45 crc kubenswrapper[4743]: I1011 02:37:45.545734 4743 generic.go:334] "Generic (PLEG): container finished" podID="add92263-e252-446b-95de-092585b4357f" containerID="7d3e7e17173697b69ee1bb733cd6891f396ed2b62c761ae95305dcea918a8e2b" exitCode=0 Oct 11 02:37:45 crc kubenswrapper[4743]: I1011 02:37:45.545813 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" event={"ID":"add92263-e252-446b-95de-092585b4357f","Type":"ContainerDied","Data":"7d3e7e17173697b69ee1bb733cd6891f396ed2b62c761ae95305dcea918a8e2b"} Oct 11 02:37:45 crc kubenswrapper[4743]: I1011 02:37:45.546107 4743 scope.go:117] "RemoveContainer" containerID="7428f79637ae147481ca81c99fa3a25d4c0eb43fab0c6c46b4b1e0515b9d596d" Oct 11 02:37:45 crc kubenswrapper[4743]: I1011 02:37:45.546731 4743 scope.go:117] "RemoveContainer" containerID="7d3e7e17173697b69ee1bb733cd6891f396ed2b62c761ae95305dcea918a8e2b" Oct 11 02:37:45 crc kubenswrapper[4743]: E1011 02:37:45.547008 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:37:50 crc kubenswrapper[4743]: I1011 02:37:50.640077 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Oct 11 02:37:52 crc kubenswrapper[4743]: I1011 02:37:52.657929 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"59e812a1-677a-4aca-bb9a-c4f0d166710a","Type":"ContainerStarted","Data":"c1417cf18987330c309aba79a1f08fe5bcc5e606328a37a1d6bfcb964e4d4210"} Oct 11 02:37:52 crc kubenswrapper[4743]: I1011 02:37:52.684817 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.345199003 podStartE2EDuration="47.684796069s" podCreationTimestamp="2025-10-11 02:37:05 +0000 UTC" firstStartedPulling="2025-10-11 02:37:07.297142603 +0000 UTC m=+6321.950123040" lastFinishedPulling="2025-10-11 02:37:50.636739669 +0000 UTC m=+6365.289720106" observedRunningTime="2025-10-11 02:37:52.680700166 +0000 UTC m=+6367.333680613" watchObservedRunningTime="2025-10-11 02:37:52.684796069 +0000 UTC m=+6367.337776476" Oct 11 02:37:58 crc kubenswrapper[4743]: I1011 02:37:58.092131 4743 scope.go:117] "RemoveContainer" containerID="7d3e7e17173697b69ee1bb733cd6891f396ed2b62c761ae95305dcea918a8e2b" Oct 11 02:37:58 crc kubenswrapper[4743]: E1011 02:37:58.093126 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:38:10 crc kubenswrapper[4743]: I1011 02:38:10.140301 4743 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-marketplace/redhat-operators-49csn"] Oct 11 02:38:10 crc kubenswrapper[4743]: I1011 02:38:10.143838 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-49csn" Oct 11 02:38:10 crc kubenswrapper[4743]: I1011 02:38:10.165774 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-49csn"] Oct 11 02:38:10 crc kubenswrapper[4743]: I1011 02:38:10.262105 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c851d9bd-0f76-470c-afae-698dac4798db-catalog-content\") pod \"redhat-operators-49csn\" (UID: \"c851d9bd-0f76-470c-afae-698dac4798db\") " pod="openshift-marketplace/redhat-operators-49csn" Oct 11 02:38:10 crc kubenswrapper[4743]: I1011 02:38:10.262146 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69v6j\" (UniqueName: \"kubernetes.io/projected/c851d9bd-0f76-470c-afae-698dac4798db-kube-api-access-69v6j\") pod \"redhat-operators-49csn\" (UID: \"c851d9bd-0f76-470c-afae-698dac4798db\") " pod="openshift-marketplace/redhat-operators-49csn" Oct 11 02:38:10 crc kubenswrapper[4743]: I1011 02:38:10.262262 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c851d9bd-0f76-470c-afae-698dac4798db-utilities\") pod \"redhat-operators-49csn\" (UID: \"c851d9bd-0f76-470c-afae-698dac4798db\") " pod="openshift-marketplace/redhat-operators-49csn" Oct 11 02:38:10 crc kubenswrapper[4743]: I1011 02:38:10.364759 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c851d9bd-0f76-470c-afae-698dac4798db-utilities\") pod \"redhat-operators-49csn\" (UID: \"c851d9bd-0f76-470c-afae-698dac4798db\") " 
pod="openshift-marketplace/redhat-operators-49csn" Oct 11 02:38:10 crc kubenswrapper[4743]: I1011 02:38:10.365025 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c851d9bd-0f76-470c-afae-698dac4798db-catalog-content\") pod \"redhat-operators-49csn\" (UID: \"c851d9bd-0f76-470c-afae-698dac4798db\") " pod="openshift-marketplace/redhat-operators-49csn" Oct 11 02:38:10 crc kubenswrapper[4743]: I1011 02:38:10.365056 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69v6j\" (UniqueName: \"kubernetes.io/projected/c851d9bd-0f76-470c-afae-698dac4798db-kube-api-access-69v6j\") pod \"redhat-operators-49csn\" (UID: \"c851d9bd-0f76-470c-afae-698dac4798db\") " pod="openshift-marketplace/redhat-operators-49csn" Oct 11 02:38:10 crc kubenswrapper[4743]: I1011 02:38:10.365935 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c851d9bd-0f76-470c-afae-698dac4798db-utilities\") pod \"redhat-operators-49csn\" (UID: \"c851d9bd-0f76-470c-afae-698dac4798db\") " pod="openshift-marketplace/redhat-operators-49csn" Oct 11 02:38:10 crc kubenswrapper[4743]: I1011 02:38:10.366286 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c851d9bd-0f76-470c-afae-698dac4798db-catalog-content\") pod \"redhat-operators-49csn\" (UID: \"c851d9bd-0f76-470c-afae-698dac4798db\") " pod="openshift-marketplace/redhat-operators-49csn" Oct 11 02:38:10 crc kubenswrapper[4743]: I1011 02:38:10.384542 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69v6j\" (UniqueName: \"kubernetes.io/projected/c851d9bd-0f76-470c-afae-698dac4798db-kube-api-access-69v6j\") pod \"redhat-operators-49csn\" (UID: \"c851d9bd-0f76-470c-afae-698dac4798db\") " pod="openshift-marketplace/redhat-operators-49csn" Oct 
11 02:38:10 crc kubenswrapper[4743]: I1011 02:38:10.484034 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-49csn" Oct 11 02:38:11 crc kubenswrapper[4743]: I1011 02:38:11.092669 4743 scope.go:117] "RemoveContainer" containerID="7d3e7e17173697b69ee1bb733cd6891f396ed2b62c761ae95305dcea918a8e2b" Oct 11 02:38:11 crc kubenswrapper[4743]: E1011 02:38:11.093431 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:38:11 crc kubenswrapper[4743]: I1011 02:38:11.236388 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-49csn"] Oct 11 02:38:11 crc kubenswrapper[4743]: I1011 02:38:11.902919 4743 generic.go:334] "Generic (PLEG): container finished" podID="c851d9bd-0f76-470c-afae-698dac4798db" containerID="428e45cbc6ff517bec7a2bf0379808f700f0c69d709b9a1bab9022d77ba0112d" exitCode=0 Oct 11 02:38:11 crc kubenswrapper[4743]: I1011 02:38:11.903010 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-49csn" event={"ID":"c851d9bd-0f76-470c-afae-698dac4798db","Type":"ContainerDied","Data":"428e45cbc6ff517bec7a2bf0379808f700f0c69d709b9a1bab9022d77ba0112d"} Oct 11 02:38:11 crc kubenswrapper[4743]: I1011 02:38:11.903309 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-49csn" event={"ID":"c851d9bd-0f76-470c-afae-698dac4798db","Type":"ContainerStarted","Data":"49352da1640d6609f3cc310d0946a4c1336f8192ff20d647e0b3ddfadd9afdc7"} Oct 11 02:38:12 crc kubenswrapper[4743]: I1011 02:38:12.916877 4743 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-49csn" event={"ID":"c851d9bd-0f76-470c-afae-698dac4798db","Type":"ContainerStarted","Data":"a7780b4b2f1e317de9f3f42f2fcf41086d3afc3379047193116dfdf334ba0580"} Oct 11 02:38:15 crc kubenswrapper[4743]: I1011 02:38:15.953702 4743 generic.go:334] "Generic (PLEG): container finished" podID="c851d9bd-0f76-470c-afae-698dac4798db" containerID="a7780b4b2f1e317de9f3f42f2fcf41086d3afc3379047193116dfdf334ba0580" exitCode=0 Oct 11 02:38:15 crc kubenswrapper[4743]: I1011 02:38:15.953977 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-49csn" event={"ID":"c851d9bd-0f76-470c-afae-698dac4798db","Type":"ContainerDied","Data":"a7780b4b2f1e317de9f3f42f2fcf41086d3afc3379047193116dfdf334ba0580"} Oct 11 02:38:16 crc kubenswrapper[4743]: I1011 02:38:16.969827 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-49csn" event={"ID":"c851d9bd-0f76-470c-afae-698dac4798db","Type":"ContainerStarted","Data":"6ca47ac9d9cc48f5fe45ed4618623c13032ec791ba3b6b0019e286bab4048e6f"} Oct 11 02:38:16 crc kubenswrapper[4743]: I1011 02:38:16.994805 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-49csn" podStartSLOduration=2.480299464 podStartE2EDuration="6.994788728s" podCreationTimestamp="2025-10-11 02:38:10 +0000 UTC" firstStartedPulling="2025-10-11 02:38:11.905086503 +0000 UTC m=+6386.558066930" lastFinishedPulling="2025-10-11 02:38:16.419575787 +0000 UTC m=+6391.072556194" observedRunningTime="2025-10-11 02:38:16.990204734 +0000 UTC m=+6391.643185131" watchObservedRunningTime="2025-10-11 02:38:16.994788728 +0000 UTC m=+6391.647769125" Oct 11 02:38:20 crc kubenswrapper[4743]: I1011 02:38:20.484137 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-49csn" Oct 11 02:38:20 crc 
kubenswrapper[4743]: I1011 02:38:20.484812 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-49csn" Oct 11 02:38:21 crc kubenswrapper[4743]: I1011 02:38:21.552951 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-49csn" podUID="c851d9bd-0f76-470c-afae-698dac4798db" containerName="registry-server" probeResult="failure" output=< Oct 11 02:38:21 crc kubenswrapper[4743]: timeout: failed to connect service ":50051" within 1s Oct 11 02:38:21 crc kubenswrapper[4743]: > Oct 11 02:38:24 crc kubenswrapper[4743]: I1011 02:38:24.094583 4743 scope.go:117] "RemoveContainer" containerID="7d3e7e17173697b69ee1bb733cd6891f396ed2b62c761ae95305dcea918a8e2b" Oct 11 02:38:24 crc kubenswrapper[4743]: E1011 02:38:24.095432 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:38:30 crc kubenswrapper[4743]: I1011 02:38:30.562707 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-49csn" Oct 11 02:38:30 crc kubenswrapper[4743]: I1011 02:38:30.639510 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-49csn" Oct 11 02:38:30 crc kubenswrapper[4743]: I1011 02:38:30.800714 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-49csn"] Oct 11 02:38:32 crc kubenswrapper[4743]: I1011 02:38:32.136569 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-49csn" 
podUID="c851d9bd-0f76-470c-afae-698dac4798db" containerName="registry-server" containerID="cri-o://6ca47ac9d9cc48f5fe45ed4618623c13032ec791ba3b6b0019e286bab4048e6f" gracePeriod=2 Oct 11 02:38:32 crc kubenswrapper[4743]: I1011 02:38:32.982220 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-49csn" Oct 11 02:38:33 crc kubenswrapper[4743]: I1011 02:38:33.150334 4743 generic.go:334] "Generic (PLEG): container finished" podID="c851d9bd-0f76-470c-afae-698dac4798db" containerID="6ca47ac9d9cc48f5fe45ed4618623c13032ec791ba3b6b0019e286bab4048e6f" exitCode=0 Oct 11 02:38:33 crc kubenswrapper[4743]: I1011 02:38:33.150375 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-49csn" event={"ID":"c851d9bd-0f76-470c-afae-698dac4798db","Type":"ContainerDied","Data":"6ca47ac9d9cc48f5fe45ed4618623c13032ec791ba3b6b0019e286bab4048e6f"} Oct 11 02:38:33 crc kubenswrapper[4743]: I1011 02:38:33.150401 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-49csn" event={"ID":"c851d9bd-0f76-470c-afae-698dac4798db","Type":"ContainerDied","Data":"49352da1640d6609f3cc310d0946a4c1336f8192ff20d647e0b3ddfadd9afdc7"} Oct 11 02:38:33 crc kubenswrapper[4743]: I1011 02:38:33.150419 4743 scope.go:117] "RemoveContainer" containerID="6ca47ac9d9cc48f5fe45ed4618623c13032ec791ba3b6b0019e286bab4048e6f" Oct 11 02:38:33 crc kubenswrapper[4743]: I1011 02:38:33.150539 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-49csn" Oct 11 02:38:33 crc kubenswrapper[4743]: I1011 02:38:33.179979 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-69v6j\" (UniqueName: \"kubernetes.io/projected/c851d9bd-0f76-470c-afae-698dac4798db-kube-api-access-69v6j\") pod \"c851d9bd-0f76-470c-afae-698dac4798db\" (UID: \"c851d9bd-0f76-470c-afae-698dac4798db\") " Oct 11 02:38:33 crc kubenswrapper[4743]: I1011 02:38:33.180046 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c851d9bd-0f76-470c-afae-698dac4798db-catalog-content\") pod \"c851d9bd-0f76-470c-afae-698dac4798db\" (UID: \"c851d9bd-0f76-470c-afae-698dac4798db\") " Oct 11 02:38:33 crc kubenswrapper[4743]: I1011 02:38:33.180487 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c851d9bd-0f76-470c-afae-698dac4798db-utilities\") pod \"c851d9bd-0f76-470c-afae-698dac4798db\" (UID: \"c851d9bd-0f76-470c-afae-698dac4798db\") " Oct 11 02:38:33 crc kubenswrapper[4743]: I1011 02:38:33.180989 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c851d9bd-0f76-470c-afae-698dac4798db-utilities" (OuterVolumeSpecName: "utilities") pod "c851d9bd-0f76-470c-afae-698dac4798db" (UID: "c851d9bd-0f76-470c-afae-698dac4798db"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 02:38:33 crc kubenswrapper[4743]: I1011 02:38:33.181225 4743 scope.go:117] "RemoveContainer" containerID="a7780b4b2f1e317de9f3f42f2fcf41086d3afc3379047193116dfdf334ba0580" Oct 11 02:38:33 crc kubenswrapper[4743]: I1011 02:38:33.182140 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c851d9bd-0f76-470c-afae-698dac4798db-utilities\") on node \"crc\" DevicePath \"\"" Oct 11 02:38:33 crc kubenswrapper[4743]: I1011 02:38:33.195645 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c851d9bd-0f76-470c-afae-698dac4798db-kube-api-access-69v6j" (OuterVolumeSpecName: "kube-api-access-69v6j") pod "c851d9bd-0f76-470c-afae-698dac4798db" (UID: "c851d9bd-0f76-470c-afae-698dac4798db"). InnerVolumeSpecName "kube-api-access-69v6j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 02:38:33 crc kubenswrapper[4743]: I1011 02:38:33.229770 4743 scope.go:117] "RemoveContainer" containerID="428e45cbc6ff517bec7a2bf0379808f700f0c69d709b9a1bab9022d77ba0112d" Oct 11 02:38:33 crc kubenswrapper[4743]: I1011 02:38:33.236114 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c851d9bd-0f76-470c-afae-698dac4798db-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c851d9bd-0f76-470c-afae-698dac4798db" (UID: "c851d9bd-0f76-470c-afae-698dac4798db"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 02:38:33 crc kubenswrapper[4743]: I1011 02:38:33.284551 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-69v6j\" (UniqueName: \"kubernetes.io/projected/c851d9bd-0f76-470c-afae-698dac4798db-kube-api-access-69v6j\") on node \"crc\" DevicePath \"\"" Oct 11 02:38:33 crc kubenswrapper[4743]: I1011 02:38:33.284616 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c851d9bd-0f76-470c-afae-698dac4798db-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 11 02:38:33 crc kubenswrapper[4743]: I1011 02:38:33.326046 4743 scope.go:117] "RemoveContainer" containerID="6ca47ac9d9cc48f5fe45ed4618623c13032ec791ba3b6b0019e286bab4048e6f" Oct 11 02:38:33 crc kubenswrapper[4743]: E1011 02:38:33.326476 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ca47ac9d9cc48f5fe45ed4618623c13032ec791ba3b6b0019e286bab4048e6f\": container with ID starting with 6ca47ac9d9cc48f5fe45ed4618623c13032ec791ba3b6b0019e286bab4048e6f not found: ID does not exist" containerID="6ca47ac9d9cc48f5fe45ed4618623c13032ec791ba3b6b0019e286bab4048e6f" Oct 11 02:38:33 crc kubenswrapper[4743]: I1011 02:38:33.326535 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ca47ac9d9cc48f5fe45ed4618623c13032ec791ba3b6b0019e286bab4048e6f"} err="failed to get container status \"6ca47ac9d9cc48f5fe45ed4618623c13032ec791ba3b6b0019e286bab4048e6f\": rpc error: code = NotFound desc = could not find container \"6ca47ac9d9cc48f5fe45ed4618623c13032ec791ba3b6b0019e286bab4048e6f\": container with ID starting with 6ca47ac9d9cc48f5fe45ed4618623c13032ec791ba3b6b0019e286bab4048e6f not found: ID does not exist" Oct 11 02:38:33 crc kubenswrapper[4743]: I1011 02:38:33.326563 4743 scope.go:117] "RemoveContainer" 
containerID="a7780b4b2f1e317de9f3f42f2fcf41086d3afc3379047193116dfdf334ba0580" Oct 11 02:38:33 crc kubenswrapper[4743]: E1011 02:38:33.326820 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7780b4b2f1e317de9f3f42f2fcf41086d3afc3379047193116dfdf334ba0580\": container with ID starting with a7780b4b2f1e317de9f3f42f2fcf41086d3afc3379047193116dfdf334ba0580 not found: ID does not exist" containerID="a7780b4b2f1e317de9f3f42f2fcf41086d3afc3379047193116dfdf334ba0580" Oct 11 02:38:33 crc kubenswrapper[4743]: I1011 02:38:33.326848 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7780b4b2f1e317de9f3f42f2fcf41086d3afc3379047193116dfdf334ba0580"} err="failed to get container status \"a7780b4b2f1e317de9f3f42f2fcf41086d3afc3379047193116dfdf334ba0580\": rpc error: code = NotFound desc = could not find container \"a7780b4b2f1e317de9f3f42f2fcf41086d3afc3379047193116dfdf334ba0580\": container with ID starting with a7780b4b2f1e317de9f3f42f2fcf41086d3afc3379047193116dfdf334ba0580 not found: ID does not exist" Oct 11 02:38:33 crc kubenswrapper[4743]: I1011 02:38:33.326883 4743 scope.go:117] "RemoveContainer" containerID="428e45cbc6ff517bec7a2bf0379808f700f0c69d709b9a1bab9022d77ba0112d" Oct 11 02:38:33 crc kubenswrapper[4743]: E1011 02:38:33.327180 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"428e45cbc6ff517bec7a2bf0379808f700f0c69d709b9a1bab9022d77ba0112d\": container with ID starting with 428e45cbc6ff517bec7a2bf0379808f700f0c69d709b9a1bab9022d77ba0112d not found: ID does not exist" containerID="428e45cbc6ff517bec7a2bf0379808f700f0c69d709b9a1bab9022d77ba0112d" Oct 11 02:38:33 crc kubenswrapper[4743]: I1011 02:38:33.327208 4743 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"428e45cbc6ff517bec7a2bf0379808f700f0c69d709b9a1bab9022d77ba0112d"} err="failed to get container status \"428e45cbc6ff517bec7a2bf0379808f700f0c69d709b9a1bab9022d77ba0112d\": rpc error: code = NotFound desc = could not find container \"428e45cbc6ff517bec7a2bf0379808f700f0c69d709b9a1bab9022d77ba0112d\": container with ID starting with 428e45cbc6ff517bec7a2bf0379808f700f0c69d709b9a1bab9022d77ba0112d not found: ID does not exist" Oct 11 02:38:33 crc kubenswrapper[4743]: I1011 02:38:33.489825 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-49csn"] Oct 11 02:38:33 crc kubenswrapper[4743]: I1011 02:38:33.501679 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-49csn"] Oct 11 02:38:34 crc kubenswrapper[4743]: I1011 02:38:34.114793 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c851d9bd-0f76-470c-afae-698dac4798db" path="/var/lib/kubelet/pods/c851d9bd-0f76-470c-afae-698dac4798db/volumes" Oct 11 02:38:36 crc kubenswrapper[4743]: I1011 02:38:36.099845 4743 scope.go:117] "RemoveContainer" containerID="7d3e7e17173697b69ee1bb733cd6891f396ed2b62c761ae95305dcea918a8e2b" Oct 11 02:38:36 crc kubenswrapper[4743]: E1011 02:38:36.100468 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:38:49 crc kubenswrapper[4743]: I1011 02:38:49.093084 4743 scope.go:117] "RemoveContainer" containerID="7d3e7e17173697b69ee1bb733cd6891f396ed2b62c761ae95305dcea918a8e2b" Oct 11 02:38:49 crc kubenswrapper[4743]: E1011 02:38:49.094248 4743 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:39:01 crc kubenswrapper[4743]: I1011 02:39:01.091839 4743 scope.go:117] "RemoveContainer" containerID="7d3e7e17173697b69ee1bb733cd6891f396ed2b62c761ae95305dcea918a8e2b" Oct 11 02:39:01 crc kubenswrapper[4743]: E1011 02:39:01.092629 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:39:14 crc kubenswrapper[4743]: I1011 02:39:14.092144 4743 scope.go:117] "RemoveContainer" containerID="7d3e7e17173697b69ee1bb733cd6891f396ed2b62c761ae95305dcea918a8e2b" Oct 11 02:39:14 crc kubenswrapper[4743]: E1011 02:39:14.093078 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:39:28 crc kubenswrapper[4743]: I1011 02:39:28.093045 4743 scope.go:117] "RemoveContainer" containerID="7d3e7e17173697b69ee1bb733cd6891f396ed2b62c761ae95305dcea918a8e2b" Oct 11 02:39:28 crc kubenswrapper[4743]: E1011 02:39:28.093977 4743 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:39:43 crc kubenswrapper[4743]: I1011 02:39:43.092567 4743 scope.go:117] "RemoveContainer" containerID="7d3e7e17173697b69ee1bb733cd6891f396ed2b62c761ae95305dcea918a8e2b" Oct 11 02:39:43 crc kubenswrapper[4743]: E1011 02:39:43.093398 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:39:54 crc kubenswrapper[4743]: I1011 02:39:54.091939 4743 scope.go:117] "RemoveContainer" containerID="7d3e7e17173697b69ee1bb733cd6891f396ed2b62c761ae95305dcea918a8e2b" Oct 11 02:39:54 crc kubenswrapper[4743]: E1011 02:39:54.092714 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:40:08 crc kubenswrapper[4743]: I1011 02:40:08.091696 4743 scope.go:117] "RemoveContainer" containerID="7d3e7e17173697b69ee1bb733cd6891f396ed2b62c761ae95305dcea918a8e2b" Oct 11 02:40:08 crc kubenswrapper[4743]: E1011 
02:40:08.092472 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:40:23 crc kubenswrapper[4743]: I1011 02:40:23.091724 4743 scope.go:117] "RemoveContainer" containerID="7d3e7e17173697b69ee1bb733cd6891f396ed2b62c761ae95305dcea918a8e2b" Oct 11 02:40:23 crc kubenswrapper[4743]: E1011 02:40:23.092622 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:40:35 crc kubenswrapper[4743]: I1011 02:40:35.091897 4743 scope.go:117] "RemoveContainer" containerID="7d3e7e17173697b69ee1bb733cd6891f396ed2b62c761ae95305dcea918a8e2b" Oct 11 02:40:35 crc kubenswrapper[4743]: E1011 02:40:35.093029 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:40:49 crc kubenswrapper[4743]: I1011 02:40:49.092300 4743 scope.go:117] "RemoveContainer" containerID="7d3e7e17173697b69ee1bb733cd6891f396ed2b62c761ae95305dcea918a8e2b" Oct 11 02:40:49 crc 
kubenswrapper[4743]: E1011 02:40:49.093348 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:41:01 crc kubenswrapper[4743]: I1011 02:41:01.092204 4743 scope.go:117] "RemoveContainer" containerID="7d3e7e17173697b69ee1bb733cd6891f396ed2b62c761ae95305dcea918a8e2b" Oct 11 02:41:01 crc kubenswrapper[4743]: E1011 02:41:01.093135 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:41:12 crc kubenswrapper[4743]: I1011 02:41:12.093010 4743 scope.go:117] "RemoveContainer" containerID="7d3e7e17173697b69ee1bb733cd6891f396ed2b62c761ae95305dcea918a8e2b" Oct 11 02:41:12 crc kubenswrapper[4743]: E1011 02:41:12.094361 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:41:12 crc kubenswrapper[4743]: I1011 02:41:12.953657 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-tjm6q"] Oct 11 
02:41:12 crc kubenswrapper[4743]: E1011 02:41:12.956091 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c851d9bd-0f76-470c-afae-698dac4798db" containerName="registry-server" Oct 11 02:41:12 crc kubenswrapper[4743]: I1011 02:41:12.956133 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="c851d9bd-0f76-470c-afae-698dac4798db" containerName="registry-server" Oct 11 02:41:12 crc kubenswrapper[4743]: E1011 02:41:12.956173 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c851d9bd-0f76-470c-afae-698dac4798db" containerName="extract-utilities" Oct 11 02:41:12 crc kubenswrapper[4743]: I1011 02:41:12.956195 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="c851d9bd-0f76-470c-afae-698dac4798db" containerName="extract-utilities" Oct 11 02:41:12 crc kubenswrapper[4743]: E1011 02:41:12.956222 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c851d9bd-0f76-470c-afae-698dac4798db" containerName="extract-content" Oct 11 02:41:12 crc kubenswrapper[4743]: I1011 02:41:12.956230 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="c851d9bd-0f76-470c-afae-698dac4798db" containerName="extract-content" Oct 11 02:41:12 crc kubenswrapper[4743]: I1011 02:41:12.956522 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="c851d9bd-0f76-470c-afae-698dac4798db" containerName="registry-server" Oct 11 02:41:12 crc kubenswrapper[4743]: I1011 02:41:12.960945 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tjm6q" Oct 11 02:41:12 crc kubenswrapper[4743]: I1011 02:41:12.984878 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tjm6q"] Oct 11 02:41:13 crc kubenswrapper[4743]: I1011 02:41:13.067731 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf7e8361-7cb6-499e-aa63-bc2ef01b08d7-catalog-content\") pod \"redhat-marketplace-tjm6q\" (UID: \"cf7e8361-7cb6-499e-aa63-bc2ef01b08d7\") " pod="openshift-marketplace/redhat-marketplace-tjm6q" Oct 11 02:41:13 crc kubenswrapper[4743]: I1011 02:41:13.067818 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zx78p\" (UniqueName: \"kubernetes.io/projected/cf7e8361-7cb6-499e-aa63-bc2ef01b08d7-kube-api-access-zx78p\") pod \"redhat-marketplace-tjm6q\" (UID: \"cf7e8361-7cb6-499e-aa63-bc2ef01b08d7\") " pod="openshift-marketplace/redhat-marketplace-tjm6q" Oct 11 02:41:13 crc kubenswrapper[4743]: I1011 02:41:13.067991 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf7e8361-7cb6-499e-aa63-bc2ef01b08d7-utilities\") pod \"redhat-marketplace-tjm6q\" (UID: \"cf7e8361-7cb6-499e-aa63-bc2ef01b08d7\") " pod="openshift-marketplace/redhat-marketplace-tjm6q" Oct 11 02:41:13 crc kubenswrapper[4743]: I1011 02:41:13.170465 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf7e8361-7cb6-499e-aa63-bc2ef01b08d7-utilities\") pod \"redhat-marketplace-tjm6q\" (UID: \"cf7e8361-7cb6-499e-aa63-bc2ef01b08d7\") " pod="openshift-marketplace/redhat-marketplace-tjm6q" Oct 11 02:41:13 crc kubenswrapper[4743]: I1011 02:41:13.170629 4743 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf7e8361-7cb6-499e-aa63-bc2ef01b08d7-catalog-content\") pod \"redhat-marketplace-tjm6q\" (UID: \"cf7e8361-7cb6-499e-aa63-bc2ef01b08d7\") " pod="openshift-marketplace/redhat-marketplace-tjm6q" Oct 11 02:41:13 crc kubenswrapper[4743]: I1011 02:41:13.170713 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zx78p\" (UniqueName: \"kubernetes.io/projected/cf7e8361-7cb6-499e-aa63-bc2ef01b08d7-kube-api-access-zx78p\") pod \"redhat-marketplace-tjm6q\" (UID: \"cf7e8361-7cb6-499e-aa63-bc2ef01b08d7\") " pod="openshift-marketplace/redhat-marketplace-tjm6q" Oct 11 02:41:13 crc kubenswrapper[4743]: I1011 02:41:13.173461 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf7e8361-7cb6-499e-aa63-bc2ef01b08d7-catalog-content\") pod \"redhat-marketplace-tjm6q\" (UID: \"cf7e8361-7cb6-499e-aa63-bc2ef01b08d7\") " pod="openshift-marketplace/redhat-marketplace-tjm6q" Oct 11 02:41:13 crc kubenswrapper[4743]: I1011 02:41:13.173593 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf7e8361-7cb6-499e-aa63-bc2ef01b08d7-utilities\") pod \"redhat-marketplace-tjm6q\" (UID: \"cf7e8361-7cb6-499e-aa63-bc2ef01b08d7\") " pod="openshift-marketplace/redhat-marketplace-tjm6q" Oct 11 02:41:13 crc kubenswrapper[4743]: I1011 02:41:13.202731 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zx78p\" (UniqueName: \"kubernetes.io/projected/cf7e8361-7cb6-499e-aa63-bc2ef01b08d7-kube-api-access-zx78p\") pod \"redhat-marketplace-tjm6q\" (UID: \"cf7e8361-7cb6-499e-aa63-bc2ef01b08d7\") " pod="openshift-marketplace/redhat-marketplace-tjm6q" Oct 11 02:41:13 crc kubenswrapper[4743]: I1011 02:41:13.296773 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tjm6q" Oct 11 02:41:14 crc kubenswrapper[4743]: I1011 02:41:14.021526 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tjm6q"] Oct 11 02:41:14 crc kubenswrapper[4743]: I1011 02:41:14.975814 4743 generic.go:334] "Generic (PLEG): container finished" podID="cf7e8361-7cb6-499e-aa63-bc2ef01b08d7" containerID="6b3f01509d61dc313281b0fb1dae0860dad3680c5ec8349ddf4f45ed69a2774d" exitCode=0 Oct 11 02:41:14 crc kubenswrapper[4743]: I1011 02:41:14.975893 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tjm6q" event={"ID":"cf7e8361-7cb6-499e-aa63-bc2ef01b08d7","Type":"ContainerDied","Data":"6b3f01509d61dc313281b0fb1dae0860dad3680c5ec8349ddf4f45ed69a2774d"} Oct 11 02:41:14 crc kubenswrapper[4743]: I1011 02:41:14.976249 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tjm6q" event={"ID":"cf7e8361-7cb6-499e-aa63-bc2ef01b08d7","Type":"ContainerStarted","Data":"e908b47a97bc632374db8ecfc8062ee631618be0c282ce3ee8e6e860c0185ca4"} Oct 11 02:41:17 crc kubenswrapper[4743]: I1011 02:41:17.003250 4743 generic.go:334] "Generic (PLEG): container finished" podID="cf7e8361-7cb6-499e-aa63-bc2ef01b08d7" containerID="80de73716376061a40f2c0bc10c702c566d2aa94adde23f22b8d6c492b29d360" exitCode=0 Oct 11 02:41:17 crc kubenswrapper[4743]: I1011 02:41:17.004925 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tjm6q" event={"ID":"cf7e8361-7cb6-499e-aa63-bc2ef01b08d7","Type":"ContainerDied","Data":"80de73716376061a40f2c0bc10c702c566d2aa94adde23f22b8d6c492b29d360"} Oct 11 02:41:18 crc kubenswrapper[4743]: I1011 02:41:18.019129 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tjm6q" 
event={"ID":"cf7e8361-7cb6-499e-aa63-bc2ef01b08d7","Type":"ContainerStarted","Data":"a3ada3074887a3a4091e28f138aedb60bba19c16704b0dd6fb7494011b343285"} Oct 11 02:41:18 crc kubenswrapper[4743]: I1011 02:41:18.047997 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-tjm6q" podStartSLOduration=3.586952116 podStartE2EDuration="6.047977699s" podCreationTimestamp="2025-10-11 02:41:12 +0000 UTC" firstStartedPulling="2025-10-11 02:41:14.978560428 +0000 UTC m=+6569.631540825" lastFinishedPulling="2025-10-11 02:41:17.439586011 +0000 UTC m=+6572.092566408" observedRunningTime="2025-10-11 02:41:18.037415786 +0000 UTC m=+6572.690396203" watchObservedRunningTime="2025-10-11 02:41:18.047977699 +0000 UTC m=+6572.700958096" Oct 11 02:41:23 crc kubenswrapper[4743]: I1011 02:41:23.297490 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-tjm6q" Oct 11 02:41:23 crc kubenswrapper[4743]: I1011 02:41:23.298039 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-tjm6q" Oct 11 02:41:23 crc kubenswrapper[4743]: I1011 02:41:23.346971 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-tjm6q" Oct 11 02:41:24 crc kubenswrapper[4743]: I1011 02:41:24.144080 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-tjm6q" Oct 11 02:41:24 crc kubenswrapper[4743]: I1011 02:41:24.210724 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tjm6q"] Oct 11 02:41:25 crc kubenswrapper[4743]: I1011 02:41:25.091699 4743 scope.go:117] "RemoveContainer" containerID="7d3e7e17173697b69ee1bb733cd6891f396ed2b62c761ae95305dcea918a8e2b" Oct 11 02:41:25 crc kubenswrapper[4743]: E1011 02:41:25.093000 4743 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:41:26 crc kubenswrapper[4743]: I1011 02:41:26.103875 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-tjm6q" podUID="cf7e8361-7cb6-499e-aa63-bc2ef01b08d7" containerName="registry-server" containerID="cri-o://a3ada3074887a3a4091e28f138aedb60bba19c16704b0dd6fb7494011b343285" gracePeriod=2 Oct 11 02:41:26 crc kubenswrapper[4743]: I1011 02:41:26.744063 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tjm6q" Oct 11 02:41:26 crc kubenswrapper[4743]: I1011 02:41:26.813745 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zx78p\" (UniqueName: \"kubernetes.io/projected/cf7e8361-7cb6-499e-aa63-bc2ef01b08d7-kube-api-access-zx78p\") pod \"cf7e8361-7cb6-499e-aa63-bc2ef01b08d7\" (UID: \"cf7e8361-7cb6-499e-aa63-bc2ef01b08d7\") " Oct 11 02:41:26 crc kubenswrapper[4743]: I1011 02:41:26.813957 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf7e8361-7cb6-499e-aa63-bc2ef01b08d7-utilities\") pod \"cf7e8361-7cb6-499e-aa63-bc2ef01b08d7\" (UID: \"cf7e8361-7cb6-499e-aa63-bc2ef01b08d7\") " Oct 11 02:41:26 crc kubenswrapper[4743]: I1011 02:41:26.814079 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf7e8361-7cb6-499e-aa63-bc2ef01b08d7-catalog-content\") pod \"cf7e8361-7cb6-499e-aa63-bc2ef01b08d7\" (UID: 
\"cf7e8361-7cb6-499e-aa63-bc2ef01b08d7\") " Oct 11 02:41:26 crc kubenswrapper[4743]: I1011 02:41:26.814609 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf7e8361-7cb6-499e-aa63-bc2ef01b08d7-utilities" (OuterVolumeSpecName: "utilities") pod "cf7e8361-7cb6-499e-aa63-bc2ef01b08d7" (UID: "cf7e8361-7cb6-499e-aa63-bc2ef01b08d7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 02:41:26 crc kubenswrapper[4743]: I1011 02:41:26.826804 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf7e8361-7cb6-499e-aa63-bc2ef01b08d7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cf7e8361-7cb6-499e-aa63-bc2ef01b08d7" (UID: "cf7e8361-7cb6-499e-aa63-bc2ef01b08d7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 02:41:26 crc kubenswrapper[4743]: I1011 02:41:26.830371 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf7e8361-7cb6-499e-aa63-bc2ef01b08d7-kube-api-access-zx78p" (OuterVolumeSpecName: "kube-api-access-zx78p") pod "cf7e8361-7cb6-499e-aa63-bc2ef01b08d7" (UID: "cf7e8361-7cb6-499e-aa63-bc2ef01b08d7"). InnerVolumeSpecName "kube-api-access-zx78p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 02:41:26 crc kubenswrapper[4743]: I1011 02:41:26.916647 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zx78p\" (UniqueName: \"kubernetes.io/projected/cf7e8361-7cb6-499e-aa63-bc2ef01b08d7-kube-api-access-zx78p\") on node \"crc\" DevicePath \"\"" Oct 11 02:41:26 crc kubenswrapper[4743]: I1011 02:41:26.916690 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf7e8361-7cb6-499e-aa63-bc2ef01b08d7-utilities\") on node \"crc\" DevicePath \"\"" Oct 11 02:41:26 crc kubenswrapper[4743]: I1011 02:41:26.916704 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf7e8361-7cb6-499e-aa63-bc2ef01b08d7-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 11 02:41:27 crc kubenswrapper[4743]: I1011 02:41:27.116155 4743 generic.go:334] "Generic (PLEG): container finished" podID="cf7e8361-7cb6-499e-aa63-bc2ef01b08d7" containerID="a3ada3074887a3a4091e28f138aedb60bba19c16704b0dd6fb7494011b343285" exitCode=0 Oct 11 02:41:27 crc kubenswrapper[4743]: I1011 02:41:27.116195 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tjm6q" event={"ID":"cf7e8361-7cb6-499e-aa63-bc2ef01b08d7","Type":"ContainerDied","Data":"a3ada3074887a3a4091e28f138aedb60bba19c16704b0dd6fb7494011b343285"} Oct 11 02:41:27 crc kubenswrapper[4743]: I1011 02:41:27.116219 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tjm6q" event={"ID":"cf7e8361-7cb6-499e-aa63-bc2ef01b08d7","Type":"ContainerDied","Data":"e908b47a97bc632374db8ecfc8062ee631618be0c282ce3ee8e6e860c0185ca4"} Oct 11 02:41:27 crc kubenswrapper[4743]: I1011 02:41:27.116234 4743 scope.go:117] "RemoveContainer" containerID="a3ada3074887a3a4091e28f138aedb60bba19c16704b0dd6fb7494011b343285" Oct 11 02:41:27 crc kubenswrapper[4743]: I1011 
02:41:27.116357 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tjm6q" Oct 11 02:41:27 crc kubenswrapper[4743]: I1011 02:41:27.147983 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tjm6q"] Oct 11 02:41:27 crc kubenswrapper[4743]: I1011 02:41:27.149999 4743 scope.go:117] "RemoveContainer" containerID="80de73716376061a40f2c0bc10c702c566d2aa94adde23f22b8d6c492b29d360" Oct 11 02:41:27 crc kubenswrapper[4743]: I1011 02:41:27.161308 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-tjm6q"] Oct 11 02:41:27 crc kubenswrapper[4743]: I1011 02:41:27.182589 4743 scope.go:117] "RemoveContainer" containerID="6b3f01509d61dc313281b0fb1dae0860dad3680c5ec8349ddf4f45ed69a2774d" Oct 11 02:41:27 crc kubenswrapper[4743]: I1011 02:41:27.223263 4743 scope.go:117] "RemoveContainer" containerID="a3ada3074887a3a4091e28f138aedb60bba19c16704b0dd6fb7494011b343285" Oct 11 02:41:27 crc kubenswrapper[4743]: E1011 02:41:27.223645 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3ada3074887a3a4091e28f138aedb60bba19c16704b0dd6fb7494011b343285\": container with ID starting with a3ada3074887a3a4091e28f138aedb60bba19c16704b0dd6fb7494011b343285 not found: ID does not exist" containerID="a3ada3074887a3a4091e28f138aedb60bba19c16704b0dd6fb7494011b343285" Oct 11 02:41:27 crc kubenswrapper[4743]: I1011 02:41:27.223690 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3ada3074887a3a4091e28f138aedb60bba19c16704b0dd6fb7494011b343285"} err="failed to get container status \"a3ada3074887a3a4091e28f138aedb60bba19c16704b0dd6fb7494011b343285\": rpc error: code = NotFound desc = could not find container \"a3ada3074887a3a4091e28f138aedb60bba19c16704b0dd6fb7494011b343285\": container with ID starting with 
a3ada3074887a3a4091e28f138aedb60bba19c16704b0dd6fb7494011b343285 not found: ID does not exist" Oct 11 02:41:27 crc kubenswrapper[4743]: I1011 02:41:27.223725 4743 scope.go:117] "RemoveContainer" containerID="80de73716376061a40f2c0bc10c702c566d2aa94adde23f22b8d6c492b29d360" Oct 11 02:41:27 crc kubenswrapper[4743]: E1011 02:41:27.224120 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80de73716376061a40f2c0bc10c702c566d2aa94adde23f22b8d6c492b29d360\": container with ID starting with 80de73716376061a40f2c0bc10c702c566d2aa94adde23f22b8d6c492b29d360 not found: ID does not exist" containerID="80de73716376061a40f2c0bc10c702c566d2aa94adde23f22b8d6c492b29d360" Oct 11 02:41:27 crc kubenswrapper[4743]: I1011 02:41:27.224144 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80de73716376061a40f2c0bc10c702c566d2aa94adde23f22b8d6c492b29d360"} err="failed to get container status \"80de73716376061a40f2c0bc10c702c566d2aa94adde23f22b8d6c492b29d360\": rpc error: code = NotFound desc = could not find container \"80de73716376061a40f2c0bc10c702c566d2aa94adde23f22b8d6c492b29d360\": container with ID starting with 80de73716376061a40f2c0bc10c702c566d2aa94adde23f22b8d6c492b29d360 not found: ID does not exist" Oct 11 02:41:27 crc kubenswrapper[4743]: I1011 02:41:27.224157 4743 scope.go:117] "RemoveContainer" containerID="6b3f01509d61dc313281b0fb1dae0860dad3680c5ec8349ddf4f45ed69a2774d" Oct 11 02:41:27 crc kubenswrapper[4743]: E1011 02:41:27.224360 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b3f01509d61dc313281b0fb1dae0860dad3680c5ec8349ddf4f45ed69a2774d\": container with ID starting with 6b3f01509d61dc313281b0fb1dae0860dad3680c5ec8349ddf4f45ed69a2774d not found: ID does not exist" containerID="6b3f01509d61dc313281b0fb1dae0860dad3680c5ec8349ddf4f45ed69a2774d" Oct 11 02:41:27 crc 
kubenswrapper[4743]: I1011 02:41:27.224381 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b3f01509d61dc313281b0fb1dae0860dad3680c5ec8349ddf4f45ed69a2774d"} err="failed to get container status \"6b3f01509d61dc313281b0fb1dae0860dad3680c5ec8349ddf4f45ed69a2774d\": rpc error: code = NotFound desc = could not find container \"6b3f01509d61dc313281b0fb1dae0860dad3680c5ec8349ddf4f45ed69a2774d\": container with ID starting with 6b3f01509d61dc313281b0fb1dae0860dad3680c5ec8349ddf4f45ed69a2774d not found: ID does not exist" Oct 11 02:41:28 crc kubenswrapper[4743]: I1011 02:41:28.122689 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf7e8361-7cb6-499e-aa63-bc2ef01b08d7" path="/var/lib/kubelet/pods/cf7e8361-7cb6-499e-aa63-bc2ef01b08d7/volumes" Oct 11 02:41:36 crc kubenswrapper[4743]: I1011 02:41:36.100432 4743 scope.go:117] "RemoveContainer" containerID="7d3e7e17173697b69ee1bb733cd6891f396ed2b62c761ae95305dcea918a8e2b" Oct 11 02:41:36 crc kubenswrapper[4743]: E1011 02:41:36.101291 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:41:47 crc kubenswrapper[4743]: I1011 02:41:47.092606 4743 scope.go:117] "RemoveContainer" containerID="7d3e7e17173697b69ee1bb733cd6891f396ed2b62c761ae95305dcea918a8e2b" Oct 11 02:41:47 crc kubenswrapper[4743]: E1011 02:41:47.093471 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:41:58 crc kubenswrapper[4743]: I1011 02:41:58.092646 4743 scope.go:117] "RemoveContainer" containerID="7d3e7e17173697b69ee1bb733cd6891f396ed2b62c761ae95305dcea918a8e2b" Oct 11 02:41:58 crc kubenswrapper[4743]: E1011 02:41:58.093563 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:42:11 crc kubenswrapper[4743]: I1011 02:42:11.092164 4743 scope.go:117] "RemoveContainer" containerID="7d3e7e17173697b69ee1bb733cd6891f396ed2b62c761ae95305dcea918a8e2b" Oct 11 02:42:11 crc kubenswrapper[4743]: E1011 02:42:11.093126 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:42:24 crc kubenswrapper[4743]: I1011 02:42:24.092326 4743 scope.go:117] "RemoveContainer" containerID="7d3e7e17173697b69ee1bb733cd6891f396ed2b62c761ae95305dcea918a8e2b" Oct 11 02:42:24 crc kubenswrapper[4743]: E1011 02:42:24.093327 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:42:39 crc kubenswrapper[4743]: I1011 02:42:39.093347 4743 scope.go:117] "RemoveContainer" containerID="7d3e7e17173697b69ee1bb733cd6891f396ed2b62c761ae95305dcea918a8e2b" Oct 11 02:42:39 crc kubenswrapper[4743]: E1011 02:42:39.094317 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:42:53 crc kubenswrapper[4743]: I1011 02:42:53.092344 4743 scope.go:117] "RemoveContainer" containerID="7d3e7e17173697b69ee1bb733cd6891f396ed2b62c761ae95305dcea918a8e2b" Oct 11 02:42:54 crc kubenswrapper[4743]: I1011 02:42:54.135417 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" event={"ID":"add92263-e252-446b-95de-092585b4357f","Type":"ContainerStarted","Data":"9bad1aa167993888f3603cd19501049640480abd6eccd72436d7aac64304a3a5"} Oct 11 02:45:00 crc kubenswrapper[4743]: I1011 02:45:00.105833 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-tvnl9"] Oct 11 02:45:00 crc kubenswrapper[4743]: E1011 02:45:00.108224 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf7e8361-7cb6-499e-aa63-bc2ef01b08d7" containerName="extract-content" Oct 11 02:45:00 crc kubenswrapper[4743]: I1011 02:45:00.108330 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf7e8361-7cb6-499e-aa63-bc2ef01b08d7" 
containerName="extract-content" Oct 11 02:45:00 crc kubenswrapper[4743]: E1011 02:45:00.108414 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf7e8361-7cb6-499e-aa63-bc2ef01b08d7" containerName="extract-utilities" Oct 11 02:45:00 crc kubenswrapper[4743]: I1011 02:45:00.108473 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf7e8361-7cb6-499e-aa63-bc2ef01b08d7" containerName="extract-utilities" Oct 11 02:45:00 crc kubenswrapper[4743]: E1011 02:45:00.108562 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf7e8361-7cb6-499e-aa63-bc2ef01b08d7" containerName="registry-server" Oct 11 02:45:00 crc kubenswrapper[4743]: I1011 02:45:00.108629 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf7e8361-7cb6-499e-aa63-bc2ef01b08d7" containerName="registry-server" Oct 11 02:45:00 crc kubenswrapper[4743]: I1011 02:45:00.108997 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf7e8361-7cb6-499e-aa63-bc2ef01b08d7" containerName="registry-server" Oct 11 02:45:00 crc kubenswrapper[4743]: I1011 02:45:00.111299 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tvnl9" Oct 11 02:45:00 crc kubenswrapper[4743]: I1011 02:45:00.117981 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tvnl9"] Oct 11 02:45:00 crc kubenswrapper[4743]: I1011 02:45:00.218166 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29335845-bmmvj"] Oct 11 02:45:00 crc kubenswrapper[4743]: I1011 02:45:00.219776 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29335845-bmmvj" Oct 11 02:45:00 crc kubenswrapper[4743]: I1011 02:45:00.221405 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7bb91e0-822d-464b-b827-b5e8431a43d4-utilities\") pod \"certified-operators-tvnl9\" (UID: \"d7bb91e0-822d-464b-b827-b5e8431a43d4\") " pod="openshift-marketplace/certified-operators-tvnl9" Oct 11 02:45:00 crc kubenswrapper[4743]: I1011 02:45:00.221499 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdm22\" (UniqueName: \"kubernetes.io/projected/d7bb91e0-822d-464b-b827-b5e8431a43d4-kube-api-access-cdm22\") pod \"certified-operators-tvnl9\" (UID: \"d7bb91e0-822d-464b-b827-b5e8431a43d4\") " pod="openshift-marketplace/certified-operators-tvnl9" Oct 11 02:45:00 crc kubenswrapper[4743]: I1011 02:45:00.221586 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7bb91e0-822d-464b-b827-b5e8431a43d4-catalog-content\") pod \"certified-operators-tvnl9\" (UID: \"d7bb91e0-822d-464b-b827-b5e8431a43d4\") " pod="openshift-marketplace/certified-operators-tvnl9" Oct 11 02:45:00 crc kubenswrapper[4743]: I1011 02:45:00.223872 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 11 02:45:00 crc kubenswrapper[4743]: I1011 02:45:00.224268 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 11 02:45:00 crc kubenswrapper[4743]: I1011 02:45:00.241383 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29335845-bmmvj"] Oct 11 02:45:00 crc kubenswrapper[4743]: I1011 
02:45:00.324201 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7bb91e0-822d-464b-b827-b5e8431a43d4-catalog-content\") pod \"certified-operators-tvnl9\" (UID: \"d7bb91e0-822d-464b-b827-b5e8431a43d4\") " pod="openshift-marketplace/certified-operators-tvnl9" Oct 11 02:45:00 crc kubenswrapper[4743]: I1011 02:45:00.324264 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5464fb1d-4bf4-4e87-9ae4-9463cda9c553-secret-volume\") pod \"collect-profiles-29335845-bmmvj\" (UID: \"5464fb1d-4bf4-4e87-9ae4-9463cda9c553\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335845-bmmvj" Oct 11 02:45:00 crc kubenswrapper[4743]: I1011 02:45:00.324299 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5464fb1d-4bf4-4e87-9ae4-9463cda9c553-config-volume\") pod \"collect-profiles-29335845-bmmvj\" (UID: \"5464fb1d-4bf4-4e87-9ae4-9463cda9c553\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335845-bmmvj" Oct 11 02:45:00 crc kubenswrapper[4743]: I1011 02:45:00.324373 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sltsp\" (UniqueName: \"kubernetes.io/projected/5464fb1d-4bf4-4e87-9ae4-9463cda9c553-kube-api-access-sltsp\") pod \"collect-profiles-29335845-bmmvj\" (UID: \"5464fb1d-4bf4-4e87-9ae4-9463cda9c553\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335845-bmmvj" Oct 11 02:45:00 crc kubenswrapper[4743]: I1011 02:45:00.324603 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7bb91e0-822d-464b-b827-b5e8431a43d4-utilities\") pod \"certified-operators-tvnl9\" (UID: 
\"d7bb91e0-822d-464b-b827-b5e8431a43d4\") " pod="openshift-marketplace/certified-operators-tvnl9" Oct 11 02:45:00 crc kubenswrapper[4743]: I1011 02:45:00.324714 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdm22\" (UniqueName: \"kubernetes.io/projected/d7bb91e0-822d-464b-b827-b5e8431a43d4-kube-api-access-cdm22\") pod \"certified-operators-tvnl9\" (UID: \"d7bb91e0-822d-464b-b827-b5e8431a43d4\") " pod="openshift-marketplace/certified-operators-tvnl9" Oct 11 02:45:00 crc kubenswrapper[4743]: I1011 02:45:00.324930 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7bb91e0-822d-464b-b827-b5e8431a43d4-utilities\") pod \"certified-operators-tvnl9\" (UID: \"d7bb91e0-822d-464b-b827-b5e8431a43d4\") " pod="openshift-marketplace/certified-operators-tvnl9" Oct 11 02:45:00 crc kubenswrapper[4743]: I1011 02:45:00.325161 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7bb91e0-822d-464b-b827-b5e8431a43d4-catalog-content\") pod \"certified-operators-tvnl9\" (UID: \"d7bb91e0-822d-464b-b827-b5e8431a43d4\") " pod="openshift-marketplace/certified-operators-tvnl9" Oct 11 02:45:00 crc kubenswrapper[4743]: I1011 02:45:00.349058 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdm22\" (UniqueName: \"kubernetes.io/projected/d7bb91e0-822d-464b-b827-b5e8431a43d4-kube-api-access-cdm22\") pod \"certified-operators-tvnl9\" (UID: \"d7bb91e0-822d-464b-b827-b5e8431a43d4\") " pod="openshift-marketplace/certified-operators-tvnl9" Oct 11 02:45:00 crc kubenswrapper[4743]: I1011 02:45:00.427172 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5464fb1d-4bf4-4e87-9ae4-9463cda9c553-secret-volume\") pod \"collect-profiles-29335845-bmmvj\" (UID: 
\"5464fb1d-4bf4-4e87-9ae4-9463cda9c553\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335845-bmmvj" Oct 11 02:45:00 crc kubenswrapper[4743]: I1011 02:45:00.427745 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5464fb1d-4bf4-4e87-9ae4-9463cda9c553-config-volume\") pod \"collect-profiles-29335845-bmmvj\" (UID: \"5464fb1d-4bf4-4e87-9ae4-9463cda9c553\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335845-bmmvj" Oct 11 02:45:00 crc kubenswrapper[4743]: I1011 02:45:00.427839 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sltsp\" (UniqueName: \"kubernetes.io/projected/5464fb1d-4bf4-4e87-9ae4-9463cda9c553-kube-api-access-sltsp\") pod \"collect-profiles-29335845-bmmvj\" (UID: \"5464fb1d-4bf4-4e87-9ae4-9463cda9c553\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335845-bmmvj" Oct 11 02:45:00 crc kubenswrapper[4743]: I1011 02:45:00.428775 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5464fb1d-4bf4-4e87-9ae4-9463cda9c553-config-volume\") pod \"collect-profiles-29335845-bmmvj\" (UID: \"5464fb1d-4bf4-4e87-9ae4-9463cda9c553\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335845-bmmvj" Oct 11 02:45:00 crc kubenswrapper[4743]: I1011 02:45:00.430892 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5464fb1d-4bf4-4e87-9ae4-9463cda9c553-secret-volume\") pod \"collect-profiles-29335845-bmmvj\" (UID: \"5464fb1d-4bf4-4e87-9ae4-9463cda9c553\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335845-bmmvj" Oct 11 02:45:00 crc kubenswrapper[4743]: I1011 02:45:00.460633 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tvnl9" Oct 11 02:45:00 crc kubenswrapper[4743]: I1011 02:45:00.463296 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sltsp\" (UniqueName: \"kubernetes.io/projected/5464fb1d-4bf4-4e87-9ae4-9463cda9c553-kube-api-access-sltsp\") pod \"collect-profiles-29335845-bmmvj\" (UID: \"5464fb1d-4bf4-4e87-9ae4-9463cda9c553\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335845-bmmvj" Oct 11 02:45:00 crc kubenswrapper[4743]: I1011 02:45:00.546510 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29335845-bmmvj" Oct 11 02:45:01 crc kubenswrapper[4743]: I1011 02:45:01.123724 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tvnl9"] Oct 11 02:45:01 crc kubenswrapper[4743]: I1011 02:45:01.221527 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29335845-bmmvj"] Oct 11 02:45:01 crc kubenswrapper[4743]: W1011 02:45:01.234127 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5464fb1d_4bf4_4e87_9ae4_9463cda9c553.slice/crio-98087a4af820673b479014917e3455f38917a8905e94ebe51d085d9b4622c8a7 WatchSource:0}: Error finding container 98087a4af820673b479014917e3455f38917a8905e94ebe51d085d9b4622c8a7: Status 404 returned error can't find the container with id 98087a4af820673b479014917e3455f38917a8905e94ebe51d085d9b4622c8a7 Oct 11 02:45:01 crc kubenswrapper[4743]: I1011 02:45:01.620787 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29335845-bmmvj" event={"ID":"5464fb1d-4bf4-4e87-9ae4-9463cda9c553","Type":"ContainerStarted","Data":"4b48fe85a206b00969901f37d76fdc0714287eee8442253333a8627b4b8a9e77"} Oct 11 02:45:01 crc 
kubenswrapper[4743]: I1011 02:45:01.621049 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29335845-bmmvj" event={"ID":"5464fb1d-4bf4-4e87-9ae4-9463cda9c553","Type":"ContainerStarted","Data":"98087a4af820673b479014917e3455f38917a8905e94ebe51d085d9b4622c8a7"} Oct 11 02:45:01 crc kubenswrapper[4743]: I1011 02:45:01.622763 4743 generic.go:334] "Generic (PLEG): container finished" podID="d7bb91e0-822d-464b-b827-b5e8431a43d4" containerID="5bf74c3ed180589e68a6a9ee21709b15bfcd7a369474cf6a812fa81c5e9688d7" exitCode=0 Oct 11 02:45:01 crc kubenswrapper[4743]: I1011 02:45:01.622794 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tvnl9" event={"ID":"d7bb91e0-822d-464b-b827-b5e8431a43d4","Type":"ContainerDied","Data":"5bf74c3ed180589e68a6a9ee21709b15bfcd7a369474cf6a812fa81c5e9688d7"} Oct 11 02:45:01 crc kubenswrapper[4743]: I1011 02:45:01.622808 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tvnl9" event={"ID":"d7bb91e0-822d-464b-b827-b5e8431a43d4","Type":"ContainerStarted","Data":"c94ab0a236ed990c3665436bf621759dda91427c40a979c5de29c33b64f05282"} Oct 11 02:45:01 crc kubenswrapper[4743]: I1011 02:45:01.624432 4743 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 11 02:45:01 crc kubenswrapper[4743]: I1011 02:45:01.643539 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29335845-bmmvj" podStartSLOduration=1.643512348 podStartE2EDuration="1.643512348s" podCreationTimestamp="2025-10-11 02:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 02:45:01.640289887 +0000 UTC m=+6796.293270294" watchObservedRunningTime="2025-10-11 02:45:01.643512348 +0000 UTC m=+6796.296492745" Oct 11 
02:45:02 crc kubenswrapper[4743]: I1011 02:45:02.484376 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4rkg6"] Oct 11 02:45:02 crc kubenswrapper[4743]: I1011 02:45:02.488116 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4rkg6" Oct 11 02:45:02 crc kubenswrapper[4743]: I1011 02:45:02.499210 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4rkg6"] Oct 11 02:45:02 crc kubenswrapper[4743]: I1011 02:45:02.595603 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfdjs\" (UniqueName: \"kubernetes.io/projected/d7510726-3230-4f35-a9ee-311df506cd70-kube-api-access-gfdjs\") pod \"community-operators-4rkg6\" (UID: \"d7510726-3230-4f35-a9ee-311df506cd70\") " pod="openshift-marketplace/community-operators-4rkg6" Oct 11 02:45:02 crc kubenswrapper[4743]: I1011 02:45:02.595703 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7510726-3230-4f35-a9ee-311df506cd70-utilities\") pod \"community-operators-4rkg6\" (UID: \"d7510726-3230-4f35-a9ee-311df506cd70\") " pod="openshift-marketplace/community-operators-4rkg6" Oct 11 02:45:02 crc kubenswrapper[4743]: I1011 02:45:02.595870 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7510726-3230-4f35-a9ee-311df506cd70-catalog-content\") pod \"community-operators-4rkg6\" (UID: \"d7510726-3230-4f35-a9ee-311df506cd70\") " pod="openshift-marketplace/community-operators-4rkg6" Oct 11 02:45:02 crc kubenswrapper[4743]: I1011 02:45:02.635769 4743 generic.go:334] "Generic (PLEG): container finished" podID="5464fb1d-4bf4-4e87-9ae4-9463cda9c553" 
containerID="4b48fe85a206b00969901f37d76fdc0714287eee8442253333a8627b4b8a9e77" exitCode=0 Oct 11 02:45:02 crc kubenswrapper[4743]: I1011 02:45:02.635809 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29335845-bmmvj" event={"ID":"5464fb1d-4bf4-4e87-9ae4-9463cda9c553","Type":"ContainerDied","Data":"4b48fe85a206b00969901f37d76fdc0714287eee8442253333a8627b4b8a9e77"} Oct 11 02:45:02 crc kubenswrapper[4743]: I1011 02:45:02.698075 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfdjs\" (UniqueName: \"kubernetes.io/projected/d7510726-3230-4f35-a9ee-311df506cd70-kube-api-access-gfdjs\") pod \"community-operators-4rkg6\" (UID: \"d7510726-3230-4f35-a9ee-311df506cd70\") " pod="openshift-marketplace/community-operators-4rkg6" Oct 11 02:45:02 crc kubenswrapper[4743]: I1011 02:45:02.698387 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7510726-3230-4f35-a9ee-311df506cd70-utilities\") pod \"community-operators-4rkg6\" (UID: \"d7510726-3230-4f35-a9ee-311df506cd70\") " pod="openshift-marketplace/community-operators-4rkg6" Oct 11 02:45:02 crc kubenswrapper[4743]: I1011 02:45:02.698481 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7510726-3230-4f35-a9ee-311df506cd70-catalog-content\") pod \"community-operators-4rkg6\" (UID: \"d7510726-3230-4f35-a9ee-311df506cd70\") " pod="openshift-marketplace/community-operators-4rkg6" Oct 11 02:45:02 crc kubenswrapper[4743]: I1011 02:45:02.698912 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7510726-3230-4f35-a9ee-311df506cd70-utilities\") pod \"community-operators-4rkg6\" (UID: \"d7510726-3230-4f35-a9ee-311df506cd70\") " pod="openshift-marketplace/community-operators-4rkg6" 
Oct 11 02:45:02 crc kubenswrapper[4743]: I1011 02:45:02.700346 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7510726-3230-4f35-a9ee-311df506cd70-catalog-content\") pod \"community-operators-4rkg6\" (UID: \"d7510726-3230-4f35-a9ee-311df506cd70\") " pod="openshift-marketplace/community-operators-4rkg6" Oct 11 02:45:02 crc kubenswrapper[4743]: I1011 02:45:02.721714 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfdjs\" (UniqueName: \"kubernetes.io/projected/d7510726-3230-4f35-a9ee-311df506cd70-kube-api-access-gfdjs\") pod \"community-operators-4rkg6\" (UID: \"d7510726-3230-4f35-a9ee-311df506cd70\") " pod="openshift-marketplace/community-operators-4rkg6" Oct 11 02:45:02 crc kubenswrapper[4743]: I1011 02:45:02.815047 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4rkg6" Oct 11 02:45:03 crc kubenswrapper[4743]: I1011 02:45:03.287704 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4rkg6"] Oct 11 02:45:03 crc kubenswrapper[4743]: I1011 02:45:03.646923 4743 generic.go:334] "Generic (PLEG): container finished" podID="d7510726-3230-4f35-a9ee-311df506cd70" containerID="b14eef45adc24ab71e5e489cb756c50ef333de338083e06f845df2c840232e3e" exitCode=0 Oct 11 02:45:03 crc kubenswrapper[4743]: I1011 02:45:03.646993 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4rkg6" event={"ID":"d7510726-3230-4f35-a9ee-311df506cd70","Type":"ContainerDied","Data":"b14eef45adc24ab71e5e489cb756c50ef333de338083e06f845df2c840232e3e"} Oct 11 02:45:03 crc kubenswrapper[4743]: I1011 02:45:03.647233 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4rkg6" 
event={"ID":"d7510726-3230-4f35-a9ee-311df506cd70","Type":"ContainerStarted","Data":"96b746396499cbb1eb3da93c7c068cec10024a3f0ab9bbee9ded9e94646adf7d"} Oct 11 02:45:03 crc kubenswrapper[4743]: I1011 02:45:03.649079 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tvnl9" event={"ID":"d7bb91e0-822d-464b-b827-b5e8431a43d4","Type":"ContainerStarted","Data":"16e980e5d5d0e64a2834288b3f393c88fb7b1fa47c89be95268f98605258a9de"} Oct 11 02:45:04 crc kubenswrapper[4743]: I1011 02:45:04.046806 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29335845-bmmvj" Oct 11 02:45:04 crc kubenswrapper[4743]: I1011 02:45:04.123465 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5464fb1d-4bf4-4e87-9ae4-9463cda9c553-secret-volume\") pod \"5464fb1d-4bf4-4e87-9ae4-9463cda9c553\" (UID: \"5464fb1d-4bf4-4e87-9ae4-9463cda9c553\") " Oct 11 02:45:04 crc kubenswrapper[4743]: I1011 02:45:04.123525 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sltsp\" (UniqueName: \"kubernetes.io/projected/5464fb1d-4bf4-4e87-9ae4-9463cda9c553-kube-api-access-sltsp\") pod \"5464fb1d-4bf4-4e87-9ae4-9463cda9c553\" (UID: \"5464fb1d-4bf4-4e87-9ae4-9463cda9c553\") " Oct 11 02:45:04 crc kubenswrapper[4743]: I1011 02:45:04.123571 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5464fb1d-4bf4-4e87-9ae4-9463cda9c553-config-volume\") pod \"5464fb1d-4bf4-4e87-9ae4-9463cda9c553\" (UID: \"5464fb1d-4bf4-4e87-9ae4-9463cda9c553\") " Oct 11 02:45:04 crc kubenswrapper[4743]: I1011 02:45:04.124590 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5464fb1d-4bf4-4e87-9ae4-9463cda9c553-config-volume" 
(OuterVolumeSpecName: "config-volume") pod "5464fb1d-4bf4-4e87-9ae4-9463cda9c553" (UID: "5464fb1d-4bf4-4e87-9ae4-9463cda9c553"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 02:45:04 crc kubenswrapper[4743]: I1011 02:45:04.137559 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5464fb1d-4bf4-4e87-9ae4-9463cda9c553-kube-api-access-sltsp" (OuterVolumeSpecName: "kube-api-access-sltsp") pod "5464fb1d-4bf4-4e87-9ae4-9463cda9c553" (UID: "5464fb1d-4bf4-4e87-9ae4-9463cda9c553"). InnerVolumeSpecName "kube-api-access-sltsp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 02:45:04 crc kubenswrapper[4743]: I1011 02:45:04.142585 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5464fb1d-4bf4-4e87-9ae4-9463cda9c553-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5464fb1d-4bf4-4e87-9ae4-9463cda9c553" (UID: "5464fb1d-4bf4-4e87-9ae4-9463cda9c553"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 02:45:04 crc kubenswrapper[4743]: I1011 02:45:04.227124 4743 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5464fb1d-4bf4-4e87-9ae4-9463cda9c553-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 11 02:45:04 crc kubenswrapper[4743]: I1011 02:45:04.227378 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sltsp\" (UniqueName: \"kubernetes.io/projected/5464fb1d-4bf4-4e87-9ae4-9463cda9c553-kube-api-access-sltsp\") on node \"crc\" DevicePath \"\"" Oct 11 02:45:04 crc kubenswrapper[4743]: I1011 02:45:04.227439 4743 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5464fb1d-4bf4-4e87-9ae4-9463cda9c553-config-volume\") on node \"crc\" DevicePath \"\"" Oct 11 02:45:04 crc kubenswrapper[4743]: I1011 02:45:04.284393 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29335800-4bpgr"] Oct 11 02:45:04 crc kubenswrapper[4743]: I1011 02:45:04.297721 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29335800-4bpgr"] Oct 11 02:45:04 crc kubenswrapper[4743]: I1011 02:45:04.663986 4743 generic.go:334] "Generic (PLEG): container finished" podID="d7bb91e0-822d-464b-b827-b5e8431a43d4" containerID="16e980e5d5d0e64a2834288b3f393c88fb7b1fa47c89be95268f98605258a9de" exitCode=0 Oct 11 02:45:04 crc kubenswrapper[4743]: I1011 02:45:04.664041 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tvnl9" event={"ID":"d7bb91e0-822d-464b-b827-b5e8431a43d4","Type":"ContainerDied","Data":"16e980e5d5d0e64a2834288b3f393c88fb7b1fa47c89be95268f98605258a9de"} Oct 11 02:45:04 crc kubenswrapper[4743]: I1011 02:45:04.668348 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29335845-bmmvj" event={"ID":"5464fb1d-4bf4-4e87-9ae4-9463cda9c553","Type":"ContainerDied","Data":"98087a4af820673b479014917e3455f38917a8905e94ebe51d085d9b4622c8a7"} Oct 11 02:45:04 crc kubenswrapper[4743]: I1011 02:45:04.668438 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29335845-bmmvj" Oct 11 02:45:04 crc kubenswrapper[4743]: I1011 02:45:04.668383 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="98087a4af820673b479014917e3455f38917a8905e94ebe51d085d9b4622c8a7" Oct 11 02:45:05 crc kubenswrapper[4743]: I1011 02:45:05.693137 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4rkg6" event={"ID":"d7510726-3230-4f35-a9ee-311df506cd70","Type":"ContainerStarted","Data":"4c257e6e461c522784d5f6fa7f827edcee51ca0a53180d87185d6a6389c078b4"} Oct 11 02:45:05 crc kubenswrapper[4743]: I1011 02:45:05.695307 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tvnl9" event={"ID":"d7bb91e0-822d-464b-b827-b5e8431a43d4","Type":"ContainerStarted","Data":"a1291c7cabe69d18d844a08637a9823a5247acee6d4c5cf86f2cad555d89e49a"} Oct 11 02:45:05 crc kubenswrapper[4743]: I1011 02:45:05.749631 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-tvnl9" podStartSLOduration=2.2861217209999998 podStartE2EDuration="5.749611396s" podCreationTimestamp="2025-10-11 02:45:00 +0000 UTC" firstStartedPulling="2025-10-11 02:45:01.624148094 +0000 UTC m=+6796.277128491" lastFinishedPulling="2025-10-11 02:45:05.087637779 +0000 UTC m=+6799.740618166" observedRunningTime="2025-10-11 02:45:05.739060114 +0000 UTC m=+6800.392040531" watchObservedRunningTime="2025-10-11 02:45:05.749611396 +0000 UTC m=+6800.402591803" Oct 11 02:45:06 crc kubenswrapper[4743]: I1011 
02:45:06.103926 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e2de2bb-f463-4ca1-9666-34d2e889665c" path="/var/lib/kubelet/pods/0e2de2bb-f463-4ca1-9666-34d2e889665c/volumes" Oct 11 02:45:07 crc kubenswrapper[4743]: I1011 02:45:07.714056 4743 generic.go:334] "Generic (PLEG): container finished" podID="d7510726-3230-4f35-a9ee-311df506cd70" containerID="4c257e6e461c522784d5f6fa7f827edcee51ca0a53180d87185d6a6389c078b4" exitCode=0 Oct 11 02:45:07 crc kubenswrapper[4743]: I1011 02:45:07.714121 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4rkg6" event={"ID":"d7510726-3230-4f35-a9ee-311df506cd70","Type":"ContainerDied","Data":"4c257e6e461c522784d5f6fa7f827edcee51ca0a53180d87185d6a6389c078b4"} Oct 11 02:45:08 crc kubenswrapper[4743]: I1011 02:45:08.725362 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4rkg6" event={"ID":"d7510726-3230-4f35-a9ee-311df506cd70","Type":"ContainerStarted","Data":"78bed97dd7b710aa4ae6c72bddab565eb96caa0b33dad726d9fe00ec5ac502b0"} Oct 11 02:45:08 crc kubenswrapper[4743]: I1011 02:45:08.745537 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4rkg6" podStartSLOduration=2.173545075 podStartE2EDuration="6.74551977s" podCreationTimestamp="2025-10-11 02:45:02 +0000 UTC" firstStartedPulling="2025-10-11 02:45:03.648623229 +0000 UTC m=+6798.301603626" lastFinishedPulling="2025-10-11 02:45:08.220597924 +0000 UTC m=+6802.873578321" observedRunningTime="2025-10-11 02:45:08.742020733 +0000 UTC m=+6803.395001170" watchObservedRunningTime="2025-10-11 02:45:08.74551977 +0000 UTC m=+6803.398500167" Oct 11 02:45:10 crc kubenswrapper[4743]: I1011 02:45:10.461270 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-tvnl9" Oct 11 02:45:10 crc kubenswrapper[4743]: I1011 02:45:10.461886 4743 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-tvnl9" Oct 11 02:45:10 crc kubenswrapper[4743]: I1011 02:45:10.506043 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-tvnl9" Oct 11 02:45:10 crc kubenswrapper[4743]: I1011 02:45:10.802327 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-tvnl9" Oct 11 02:45:11 crc kubenswrapper[4743]: I1011 02:45:11.889502 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tvnl9"] Oct 11 02:45:12 crc kubenswrapper[4743]: I1011 02:45:12.767655 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-tvnl9" podUID="d7bb91e0-822d-464b-b827-b5e8431a43d4" containerName="registry-server" containerID="cri-o://a1291c7cabe69d18d844a08637a9823a5247acee6d4c5cf86f2cad555d89e49a" gracePeriod=2 Oct 11 02:45:12 crc kubenswrapper[4743]: I1011 02:45:12.816174 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4rkg6" Oct 11 02:45:12 crc kubenswrapper[4743]: I1011 02:45:12.816220 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4rkg6" Oct 11 02:45:12 crc kubenswrapper[4743]: I1011 02:45:12.869405 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4rkg6" Oct 11 02:45:13 crc kubenswrapper[4743]: I1011 02:45:13.369771 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tvnl9" Oct 11 02:45:13 crc kubenswrapper[4743]: I1011 02:45:13.463231 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7bb91e0-822d-464b-b827-b5e8431a43d4-utilities\") pod \"d7bb91e0-822d-464b-b827-b5e8431a43d4\" (UID: \"d7bb91e0-822d-464b-b827-b5e8431a43d4\") " Oct 11 02:45:13 crc kubenswrapper[4743]: I1011 02:45:13.463625 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7bb91e0-822d-464b-b827-b5e8431a43d4-catalog-content\") pod \"d7bb91e0-822d-464b-b827-b5e8431a43d4\" (UID: \"d7bb91e0-822d-464b-b827-b5e8431a43d4\") " Oct 11 02:45:13 crc kubenswrapper[4743]: I1011 02:45:13.463721 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cdm22\" (UniqueName: \"kubernetes.io/projected/d7bb91e0-822d-464b-b827-b5e8431a43d4-kube-api-access-cdm22\") pod \"d7bb91e0-822d-464b-b827-b5e8431a43d4\" (UID: \"d7bb91e0-822d-464b-b827-b5e8431a43d4\") " Oct 11 02:45:13 crc kubenswrapper[4743]: I1011 02:45:13.464402 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7bb91e0-822d-464b-b827-b5e8431a43d4-utilities" (OuterVolumeSpecName: "utilities") pod "d7bb91e0-822d-464b-b827-b5e8431a43d4" (UID: "d7bb91e0-822d-464b-b827-b5e8431a43d4"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 02:45:13 crc kubenswrapper[4743]: I1011 02:45:13.465465 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7bb91e0-822d-464b-b827-b5e8431a43d4-utilities\") on node \"crc\" DevicePath \"\"" Oct 11 02:45:13 crc kubenswrapper[4743]: I1011 02:45:13.470003 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7bb91e0-822d-464b-b827-b5e8431a43d4-kube-api-access-cdm22" (OuterVolumeSpecName: "kube-api-access-cdm22") pod "d7bb91e0-822d-464b-b827-b5e8431a43d4" (UID: "d7bb91e0-822d-464b-b827-b5e8431a43d4"). InnerVolumeSpecName "kube-api-access-cdm22". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 02:45:13 crc kubenswrapper[4743]: I1011 02:45:13.516001 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7bb91e0-822d-464b-b827-b5e8431a43d4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d7bb91e0-822d-464b-b827-b5e8431a43d4" (UID: "d7bb91e0-822d-464b-b827-b5e8431a43d4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 02:45:13 crc kubenswrapper[4743]: I1011 02:45:13.567601 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7bb91e0-822d-464b-b827-b5e8431a43d4-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 11 02:45:13 crc kubenswrapper[4743]: I1011 02:45:13.567660 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cdm22\" (UniqueName: \"kubernetes.io/projected/d7bb91e0-822d-464b-b827-b5e8431a43d4-kube-api-access-cdm22\") on node \"crc\" DevicePath \"\"" Oct 11 02:45:13 crc kubenswrapper[4743]: I1011 02:45:13.783121 4743 generic.go:334] "Generic (PLEG): container finished" podID="d7bb91e0-822d-464b-b827-b5e8431a43d4" containerID="a1291c7cabe69d18d844a08637a9823a5247acee6d4c5cf86f2cad555d89e49a" exitCode=0 Oct 11 02:45:13 crc kubenswrapper[4743]: I1011 02:45:13.783551 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tvnl9" Oct 11 02:45:13 crc kubenswrapper[4743]: I1011 02:45:13.783997 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tvnl9" event={"ID":"d7bb91e0-822d-464b-b827-b5e8431a43d4","Type":"ContainerDied","Data":"a1291c7cabe69d18d844a08637a9823a5247acee6d4c5cf86f2cad555d89e49a"} Oct 11 02:45:13 crc kubenswrapper[4743]: I1011 02:45:13.784053 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tvnl9" event={"ID":"d7bb91e0-822d-464b-b827-b5e8431a43d4","Type":"ContainerDied","Data":"c94ab0a236ed990c3665436bf621759dda91427c40a979c5de29c33b64f05282"} Oct 11 02:45:13 crc kubenswrapper[4743]: I1011 02:45:13.784071 4743 scope.go:117] "RemoveContainer" containerID="a1291c7cabe69d18d844a08637a9823a5247acee6d4c5cf86f2cad555d89e49a" Oct 11 02:45:13 crc kubenswrapper[4743]: I1011 02:45:13.818952 4743 scope.go:117] "RemoveContainer" 
containerID="16e980e5d5d0e64a2834288b3f393c88fb7b1fa47c89be95268f98605258a9de" Oct 11 02:45:13 crc kubenswrapper[4743]: I1011 02:45:13.834822 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tvnl9"] Oct 11 02:45:13 crc kubenswrapper[4743]: I1011 02:45:13.843989 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4rkg6" Oct 11 02:45:13 crc kubenswrapper[4743]: I1011 02:45:13.845597 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-tvnl9"] Oct 11 02:45:13 crc kubenswrapper[4743]: I1011 02:45:13.860945 4743 scope.go:117] "RemoveContainer" containerID="5bf74c3ed180589e68a6a9ee21709b15bfcd7a369474cf6a812fa81c5e9688d7" Oct 11 02:45:13 crc kubenswrapper[4743]: I1011 02:45:13.929874 4743 scope.go:117] "RemoveContainer" containerID="a1291c7cabe69d18d844a08637a9823a5247acee6d4c5cf86f2cad555d89e49a" Oct 11 02:45:13 crc kubenswrapper[4743]: E1011 02:45:13.930511 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1291c7cabe69d18d844a08637a9823a5247acee6d4c5cf86f2cad555d89e49a\": container with ID starting with a1291c7cabe69d18d844a08637a9823a5247acee6d4c5cf86f2cad555d89e49a not found: ID does not exist" containerID="a1291c7cabe69d18d844a08637a9823a5247acee6d4c5cf86f2cad555d89e49a" Oct 11 02:45:13 crc kubenswrapper[4743]: I1011 02:45:13.930571 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1291c7cabe69d18d844a08637a9823a5247acee6d4c5cf86f2cad555d89e49a"} err="failed to get container status \"a1291c7cabe69d18d844a08637a9823a5247acee6d4c5cf86f2cad555d89e49a\": rpc error: code = NotFound desc = could not find container \"a1291c7cabe69d18d844a08637a9823a5247acee6d4c5cf86f2cad555d89e49a\": container with ID starting with 
a1291c7cabe69d18d844a08637a9823a5247acee6d4c5cf86f2cad555d89e49a not found: ID does not exist" Oct 11 02:45:13 crc kubenswrapper[4743]: I1011 02:45:13.930618 4743 scope.go:117] "RemoveContainer" containerID="16e980e5d5d0e64a2834288b3f393c88fb7b1fa47c89be95268f98605258a9de" Oct 11 02:45:13 crc kubenswrapper[4743]: E1011 02:45:13.931135 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16e980e5d5d0e64a2834288b3f393c88fb7b1fa47c89be95268f98605258a9de\": container with ID starting with 16e980e5d5d0e64a2834288b3f393c88fb7b1fa47c89be95268f98605258a9de not found: ID does not exist" containerID="16e980e5d5d0e64a2834288b3f393c88fb7b1fa47c89be95268f98605258a9de" Oct 11 02:45:13 crc kubenswrapper[4743]: I1011 02:45:13.931184 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16e980e5d5d0e64a2834288b3f393c88fb7b1fa47c89be95268f98605258a9de"} err="failed to get container status \"16e980e5d5d0e64a2834288b3f393c88fb7b1fa47c89be95268f98605258a9de\": rpc error: code = NotFound desc = could not find container \"16e980e5d5d0e64a2834288b3f393c88fb7b1fa47c89be95268f98605258a9de\": container with ID starting with 16e980e5d5d0e64a2834288b3f393c88fb7b1fa47c89be95268f98605258a9de not found: ID does not exist" Oct 11 02:45:13 crc kubenswrapper[4743]: I1011 02:45:13.931239 4743 scope.go:117] "RemoveContainer" containerID="5bf74c3ed180589e68a6a9ee21709b15bfcd7a369474cf6a812fa81c5e9688d7" Oct 11 02:45:13 crc kubenswrapper[4743]: E1011 02:45:13.931725 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5bf74c3ed180589e68a6a9ee21709b15bfcd7a369474cf6a812fa81c5e9688d7\": container with ID starting with 5bf74c3ed180589e68a6a9ee21709b15bfcd7a369474cf6a812fa81c5e9688d7 not found: ID does not exist" containerID="5bf74c3ed180589e68a6a9ee21709b15bfcd7a369474cf6a812fa81c5e9688d7" Oct 11 02:45:13 crc 
kubenswrapper[4743]: I1011 02:45:13.931769 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bf74c3ed180589e68a6a9ee21709b15bfcd7a369474cf6a812fa81c5e9688d7"} err="failed to get container status \"5bf74c3ed180589e68a6a9ee21709b15bfcd7a369474cf6a812fa81c5e9688d7\": rpc error: code = NotFound desc = could not find container \"5bf74c3ed180589e68a6a9ee21709b15bfcd7a369474cf6a812fa81c5e9688d7\": container with ID starting with 5bf74c3ed180589e68a6a9ee21709b15bfcd7a369474cf6a812fa81c5e9688d7 not found: ID does not exist" Oct 11 02:45:14 crc kubenswrapper[4743]: I1011 02:45:14.106010 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7bb91e0-822d-464b-b827-b5e8431a43d4" path="/var/lib/kubelet/pods/d7bb91e0-822d-464b-b827-b5e8431a43d4/volumes" Oct 11 02:45:14 crc kubenswrapper[4743]: I1011 02:45:14.458274 4743 patch_prober.go:28] interesting pod/machine-config-daemon-cvm72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 02:45:14 crc kubenswrapper[4743]: I1011 02:45:14.458656 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 02:45:15 crc kubenswrapper[4743]: I1011 02:45:15.279769 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4rkg6"] Oct 11 02:45:15 crc kubenswrapper[4743]: I1011 02:45:15.805520 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4rkg6" podUID="d7510726-3230-4f35-a9ee-311df506cd70" 
containerName="registry-server" containerID="cri-o://78bed97dd7b710aa4ae6c72bddab565eb96caa0b33dad726d9fe00ec5ac502b0" gracePeriod=2 Oct 11 02:45:16 crc kubenswrapper[4743]: I1011 02:45:16.404118 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4rkg6" Oct 11 02:45:16 crc kubenswrapper[4743]: I1011 02:45:16.542322 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7510726-3230-4f35-a9ee-311df506cd70-catalog-content\") pod \"d7510726-3230-4f35-a9ee-311df506cd70\" (UID: \"d7510726-3230-4f35-a9ee-311df506cd70\") " Oct 11 02:45:16 crc kubenswrapper[4743]: I1011 02:45:16.542832 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7510726-3230-4f35-a9ee-311df506cd70-utilities\") pod \"d7510726-3230-4f35-a9ee-311df506cd70\" (UID: \"d7510726-3230-4f35-a9ee-311df506cd70\") " Oct 11 02:45:16 crc kubenswrapper[4743]: I1011 02:45:16.543159 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gfdjs\" (UniqueName: \"kubernetes.io/projected/d7510726-3230-4f35-a9ee-311df506cd70-kube-api-access-gfdjs\") pod \"d7510726-3230-4f35-a9ee-311df506cd70\" (UID: \"d7510726-3230-4f35-a9ee-311df506cd70\") " Oct 11 02:45:16 crc kubenswrapper[4743]: I1011 02:45:16.543475 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7510726-3230-4f35-a9ee-311df506cd70-utilities" (OuterVolumeSpecName: "utilities") pod "d7510726-3230-4f35-a9ee-311df506cd70" (UID: "d7510726-3230-4f35-a9ee-311df506cd70"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 02:45:16 crc kubenswrapper[4743]: I1011 02:45:16.544623 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7510726-3230-4f35-a9ee-311df506cd70-utilities\") on node \"crc\" DevicePath \"\"" Oct 11 02:45:16 crc kubenswrapper[4743]: I1011 02:45:16.548918 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7510726-3230-4f35-a9ee-311df506cd70-kube-api-access-gfdjs" (OuterVolumeSpecName: "kube-api-access-gfdjs") pod "d7510726-3230-4f35-a9ee-311df506cd70" (UID: "d7510726-3230-4f35-a9ee-311df506cd70"). InnerVolumeSpecName "kube-api-access-gfdjs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 02:45:16 crc kubenswrapper[4743]: I1011 02:45:16.597282 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7510726-3230-4f35-a9ee-311df506cd70-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d7510726-3230-4f35-a9ee-311df506cd70" (UID: "d7510726-3230-4f35-a9ee-311df506cd70"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 02:45:16 crc kubenswrapper[4743]: I1011 02:45:16.647516 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gfdjs\" (UniqueName: \"kubernetes.io/projected/d7510726-3230-4f35-a9ee-311df506cd70-kube-api-access-gfdjs\") on node \"crc\" DevicePath \"\"" Oct 11 02:45:16 crc kubenswrapper[4743]: I1011 02:45:16.647562 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7510726-3230-4f35-a9ee-311df506cd70-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 11 02:45:16 crc kubenswrapper[4743]: I1011 02:45:16.816211 4743 generic.go:334] "Generic (PLEG): container finished" podID="d7510726-3230-4f35-a9ee-311df506cd70" containerID="78bed97dd7b710aa4ae6c72bddab565eb96caa0b33dad726d9fe00ec5ac502b0" exitCode=0 Oct 11 02:45:16 crc kubenswrapper[4743]: I1011 02:45:16.816248 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4rkg6" event={"ID":"d7510726-3230-4f35-a9ee-311df506cd70","Type":"ContainerDied","Data":"78bed97dd7b710aa4ae6c72bddab565eb96caa0b33dad726d9fe00ec5ac502b0"} Oct 11 02:45:16 crc kubenswrapper[4743]: I1011 02:45:16.816273 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4rkg6" event={"ID":"d7510726-3230-4f35-a9ee-311df506cd70","Type":"ContainerDied","Data":"96b746396499cbb1eb3da93c7c068cec10024a3f0ab9bbee9ded9e94646adf7d"} Oct 11 02:45:16 crc kubenswrapper[4743]: I1011 02:45:16.816276 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4rkg6" Oct 11 02:45:16 crc kubenswrapper[4743]: I1011 02:45:16.816292 4743 scope.go:117] "RemoveContainer" containerID="78bed97dd7b710aa4ae6c72bddab565eb96caa0b33dad726d9fe00ec5ac502b0" Oct 11 02:45:16 crc kubenswrapper[4743]: I1011 02:45:16.846637 4743 scope.go:117] "RemoveContainer" containerID="4c257e6e461c522784d5f6fa7f827edcee51ca0a53180d87185d6a6389c078b4" Oct 11 02:45:16 crc kubenswrapper[4743]: I1011 02:45:16.872684 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4rkg6"] Oct 11 02:45:16 crc kubenswrapper[4743]: I1011 02:45:16.883303 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4rkg6"] Oct 11 02:45:16 crc kubenswrapper[4743]: I1011 02:45:16.901107 4743 scope.go:117] "RemoveContainer" containerID="b14eef45adc24ab71e5e489cb756c50ef333de338083e06f845df2c840232e3e" Oct 11 02:45:16 crc kubenswrapper[4743]: I1011 02:45:16.953164 4743 scope.go:117] "RemoveContainer" containerID="78bed97dd7b710aa4ae6c72bddab565eb96caa0b33dad726d9fe00ec5ac502b0" Oct 11 02:45:16 crc kubenswrapper[4743]: E1011 02:45:16.954058 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78bed97dd7b710aa4ae6c72bddab565eb96caa0b33dad726d9fe00ec5ac502b0\": container with ID starting with 78bed97dd7b710aa4ae6c72bddab565eb96caa0b33dad726d9fe00ec5ac502b0 not found: ID does not exist" containerID="78bed97dd7b710aa4ae6c72bddab565eb96caa0b33dad726d9fe00ec5ac502b0" Oct 11 02:45:16 crc kubenswrapper[4743]: I1011 02:45:16.954119 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78bed97dd7b710aa4ae6c72bddab565eb96caa0b33dad726d9fe00ec5ac502b0"} err="failed to get container status \"78bed97dd7b710aa4ae6c72bddab565eb96caa0b33dad726d9fe00ec5ac502b0\": rpc error: code = NotFound desc = could not find 
container \"78bed97dd7b710aa4ae6c72bddab565eb96caa0b33dad726d9fe00ec5ac502b0\": container with ID starting with 78bed97dd7b710aa4ae6c72bddab565eb96caa0b33dad726d9fe00ec5ac502b0 not found: ID does not exist" Oct 11 02:45:16 crc kubenswrapper[4743]: I1011 02:45:16.954153 4743 scope.go:117] "RemoveContainer" containerID="4c257e6e461c522784d5f6fa7f827edcee51ca0a53180d87185d6a6389c078b4" Oct 11 02:45:16 crc kubenswrapper[4743]: E1011 02:45:16.955645 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c257e6e461c522784d5f6fa7f827edcee51ca0a53180d87185d6a6389c078b4\": container with ID starting with 4c257e6e461c522784d5f6fa7f827edcee51ca0a53180d87185d6a6389c078b4 not found: ID does not exist" containerID="4c257e6e461c522784d5f6fa7f827edcee51ca0a53180d87185d6a6389c078b4" Oct 11 02:45:16 crc kubenswrapper[4743]: I1011 02:45:16.955677 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c257e6e461c522784d5f6fa7f827edcee51ca0a53180d87185d6a6389c078b4"} err="failed to get container status \"4c257e6e461c522784d5f6fa7f827edcee51ca0a53180d87185d6a6389c078b4\": rpc error: code = NotFound desc = could not find container \"4c257e6e461c522784d5f6fa7f827edcee51ca0a53180d87185d6a6389c078b4\": container with ID starting with 4c257e6e461c522784d5f6fa7f827edcee51ca0a53180d87185d6a6389c078b4 not found: ID does not exist" Oct 11 02:45:16 crc kubenswrapper[4743]: I1011 02:45:16.955698 4743 scope.go:117] "RemoveContainer" containerID="b14eef45adc24ab71e5e489cb756c50ef333de338083e06f845df2c840232e3e" Oct 11 02:45:16 crc kubenswrapper[4743]: E1011 02:45:16.955965 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b14eef45adc24ab71e5e489cb756c50ef333de338083e06f845df2c840232e3e\": container with ID starting with b14eef45adc24ab71e5e489cb756c50ef333de338083e06f845df2c840232e3e not found: ID does 
not exist" containerID="b14eef45adc24ab71e5e489cb756c50ef333de338083e06f845df2c840232e3e" Oct 11 02:45:16 crc kubenswrapper[4743]: I1011 02:45:16.955982 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b14eef45adc24ab71e5e489cb756c50ef333de338083e06f845df2c840232e3e"} err="failed to get container status \"b14eef45adc24ab71e5e489cb756c50ef333de338083e06f845df2c840232e3e\": rpc error: code = NotFound desc = could not find container \"b14eef45adc24ab71e5e489cb756c50ef333de338083e06f845df2c840232e3e\": container with ID starting with b14eef45adc24ab71e5e489cb756c50ef333de338083e06f845df2c840232e3e not found: ID does not exist" Oct 11 02:45:18 crc kubenswrapper[4743]: I1011 02:45:18.105471 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7510726-3230-4f35-a9ee-311df506cd70" path="/var/lib/kubelet/pods/d7510726-3230-4f35-a9ee-311df506cd70/volumes" Oct 11 02:45:44 crc kubenswrapper[4743]: I1011 02:45:44.457951 4743 patch_prober.go:28] interesting pod/machine-config-daemon-cvm72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 02:45:44 crc kubenswrapper[4743]: I1011 02:45:44.458513 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 02:45:58 crc kubenswrapper[4743]: I1011 02:45:58.804528 4743 scope.go:117] "RemoveContainer" containerID="0170fcfc69bfd13c70a7f3fc633a3f1458adda2257e9f3b70331ada44d0e01fe" Oct 11 02:46:14 crc kubenswrapper[4743]: I1011 02:46:14.458185 4743 patch_prober.go:28] interesting pod/machine-config-daemon-cvm72 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 02:46:14 crc kubenswrapper[4743]: I1011 02:46:14.458623 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 02:46:14 crc kubenswrapper[4743]: I1011 02:46:14.458670 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" Oct 11 02:46:14 crc kubenswrapper[4743]: I1011 02:46:14.459509 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9bad1aa167993888f3603cd19501049640480abd6eccd72436d7aac64304a3a5"} pod="openshift-machine-config-operator/machine-config-daemon-cvm72" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 11 02:46:14 crc kubenswrapper[4743]: I1011 02:46:14.459560 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" containerID="cri-o://9bad1aa167993888f3603cd19501049640480abd6eccd72436d7aac64304a3a5" gracePeriod=600 Oct 11 02:46:15 crc kubenswrapper[4743]: I1011 02:46:15.448024 4743 generic.go:334] "Generic (PLEG): container finished" podID="add92263-e252-446b-95de-092585b4357f" containerID="9bad1aa167993888f3603cd19501049640480abd6eccd72436d7aac64304a3a5" exitCode=0 Oct 11 02:46:15 crc kubenswrapper[4743]: I1011 02:46:15.448116 4743 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" event={"ID":"add92263-e252-446b-95de-092585b4357f","Type":"ContainerDied","Data":"9bad1aa167993888f3603cd19501049640480abd6eccd72436d7aac64304a3a5"} Oct 11 02:46:15 crc kubenswrapper[4743]: I1011 02:46:15.448836 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" event={"ID":"add92263-e252-446b-95de-092585b4357f","Type":"ContainerStarted","Data":"b346d18aff914ecb002e09f3c8ca75154604a876bdd7435ae937d7b6917314b0"} Oct 11 02:46:15 crc kubenswrapper[4743]: I1011 02:46:15.448886 4743 scope.go:117] "RemoveContainer" containerID="7d3e7e17173697b69ee1bb733cd6891f396ed2b62c761ae95305dcea918a8e2b" Oct 11 02:48:14 crc kubenswrapper[4743]: I1011 02:48:14.458415 4743 patch_prober.go:28] interesting pod/machine-config-daemon-cvm72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 02:48:14 crc kubenswrapper[4743]: I1011 02:48:14.459076 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 02:48:44 crc kubenswrapper[4743]: I1011 02:48:44.458687 4743 patch_prober.go:28] interesting pod/machine-config-daemon-cvm72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 02:48:44 crc kubenswrapper[4743]: I1011 02:48:44.459317 4743 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 02:49:05 crc kubenswrapper[4743]: I1011 02:49:05.108437 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8v6n8"] Oct 11 02:49:05 crc kubenswrapper[4743]: E1011 02:49:05.109538 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7bb91e0-822d-464b-b827-b5e8431a43d4" containerName="extract-content" Oct 11 02:49:05 crc kubenswrapper[4743]: I1011 02:49:05.109556 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7bb91e0-822d-464b-b827-b5e8431a43d4" containerName="extract-content" Oct 11 02:49:05 crc kubenswrapper[4743]: E1011 02:49:05.109569 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7510726-3230-4f35-a9ee-311df506cd70" containerName="registry-server" Oct 11 02:49:05 crc kubenswrapper[4743]: I1011 02:49:05.109577 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7510726-3230-4f35-a9ee-311df506cd70" containerName="registry-server" Oct 11 02:49:05 crc kubenswrapper[4743]: E1011 02:49:05.109597 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5464fb1d-4bf4-4e87-9ae4-9463cda9c553" containerName="collect-profiles" Oct 11 02:49:05 crc kubenswrapper[4743]: I1011 02:49:05.109605 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="5464fb1d-4bf4-4e87-9ae4-9463cda9c553" containerName="collect-profiles" Oct 11 02:49:05 crc kubenswrapper[4743]: E1011 02:49:05.109618 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7bb91e0-822d-464b-b827-b5e8431a43d4" containerName="extract-utilities" Oct 11 02:49:05 crc kubenswrapper[4743]: I1011 02:49:05.109626 4743 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d7bb91e0-822d-464b-b827-b5e8431a43d4" containerName="extract-utilities" Oct 11 02:49:05 crc kubenswrapper[4743]: E1011 02:49:05.109641 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7510726-3230-4f35-a9ee-311df506cd70" containerName="extract-utilities" Oct 11 02:49:05 crc kubenswrapper[4743]: I1011 02:49:05.109663 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7510726-3230-4f35-a9ee-311df506cd70" containerName="extract-utilities" Oct 11 02:49:05 crc kubenswrapper[4743]: E1011 02:49:05.109672 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7510726-3230-4f35-a9ee-311df506cd70" containerName="extract-content" Oct 11 02:49:05 crc kubenswrapper[4743]: I1011 02:49:05.109680 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7510726-3230-4f35-a9ee-311df506cd70" containerName="extract-content" Oct 11 02:49:05 crc kubenswrapper[4743]: E1011 02:49:05.109697 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7bb91e0-822d-464b-b827-b5e8431a43d4" containerName="registry-server" Oct 11 02:49:05 crc kubenswrapper[4743]: I1011 02:49:05.109707 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7bb91e0-822d-464b-b827-b5e8431a43d4" containerName="registry-server" Oct 11 02:49:05 crc kubenswrapper[4743]: I1011 02:49:05.110032 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7bb91e0-822d-464b-b827-b5e8431a43d4" containerName="registry-server" Oct 11 02:49:05 crc kubenswrapper[4743]: I1011 02:49:05.110063 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7510726-3230-4f35-a9ee-311df506cd70" containerName="registry-server" Oct 11 02:49:05 crc kubenswrapper[4743]: I1011 02:49:05.110090 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="5464fb1d-4bf4-4e87-9ae4-9463cda9c553" containerName="collect-profiles" Oct 11 02:49:05 crc kubenswrapper[4743]: I1011 02:49:05.112953 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8v6n8" Oct 11 02:49:05 crc kubenswrapper[4743]: I1011 02:49:05.132228 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8v6n8"] Oct 11 02:49:05 crc kubenswrapper[4743]: I1011 02:49:05.267108 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvp2l\" (UniqueName: \"kubernetes.io/projected/2c7cf1f2-5d43-4038-a3f3-f78a6859b47a-kube-api-access-mvp2l\") pod \"redhat-operators-8v6n8\" (UID: \"2c7cf1f2-5d43-4038-a3f3-f78a6859b47a\") " pod="openshift-marketplace/redhat-operators-8v6n8" Oct 11 02:49:05 crc kubenswrapper[4743]: I1011 02:49:05.267210 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c7cf1f2-5d43-4038-a3f3-f78a6859b47a-utilities\") pod \"redhat-operators-8v6n8\" (UID: \"2c7cf1f2-5d43-4038-a3f3-f78a6859b47a\") " pod="openshift-marketplace/redhat-operators-8v6n8" Oct 11 02:49:05 crc kubenswrapper[4743]: I1011 02:49:05.267289 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c7cf1f2-5d43-4038-a3f3-f78a6859b47a-catalog-content\") pod \"redhat-operators-8v6n8\" (UID: \"2c7cf1f2-5d43-4038-a3f3-f78a6859b47a\") " pod="openshift-marketplace/redhat-operators-8v6n8" Oct 11 02:49:05 crc kubenswrapper[4743]: I1011 02:49:05.369575 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvp2l\" (UniqueName: \"kubernetes.io/projected/2c7cf1f2-5d43-4038-a3f3-f78a6859b47a-kube-api-access-mvp2l\") pod \"redhat-operators-8v6n8\" (UID: \"2c7cf1f2-5d43-4038-a3f3-f78a6859b47a\") " pod="openshift-marketplace/redhat-operators-8v6n8" Oct 11 02:49:05 crc kubenswrapper[4743]: I1011 02:49:05.369675 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c7cf1f2-5d43-4038-a3f3-f78a6859b47a-utilities\") pod \"redhat-operators-8v6n8\" (UID: \"2c7cf1f2-5d43-4038-a3f3-f78a6859b47a\") " pod="openshift-marketplace/redhat-operators-8v6n8" Oct 11 02:49:05 crc kubenswrapper[4743]: I1011 02:49:05.369729 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c7cf1f2-5d43-4038-a3f3-f78a6859b47a-catalog-content\") pod \"redhat-operators-8v6n8\" (UID: \"2c7cf1f2-5d43-4038-a3f3-f78a6859b47a\") " pod="openshift-marketplace/redhat-operators-8v6n8" Oct 11 02:49:05 crc kubenswrapper[4743]: I1011 02:49:05.370414 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c7cf1f2-5d43-4038-a3f3-f78a6859b47a-catalog-content\") pod \"redhat-operators-8v6n8\" (UID: \"2c7cf1f2-5d43-4038-a3f3-f78a6859b47a\") " pod="openshift-marketplace/redhat-operators-8v6n8" Oct 11 02:49:05 crc kubenswrapper[4743]: I1011 02:49:05.370519 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c7cf1f2-5d43-4038-a3f3-f78a6859b47a-utilities\") pod \"redhat-operators-8v6n8\" (UID: \"2c7cf1f2-5d43-4038-a3f3-f78a6859b47a\") " pod="openshift-marketplace/redhat-operators-8v6n8" Oct 11 02:49:05 crc kubenswrapper[4743]: I1011 02:49:05.394683 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvp2l\" (UniqueName: \"kubernetes.io/projected/2c7cf1f2-5d43-4038-a3f3-f78a6859b47a-kube-api-access-mvp2l\") pod \"redhat-operators-8v6n8\" (UID: \"2c7cf1f2-5d43-4038-a3f3-f78a6859b47a\") " pod="openshift-marketplace/redhat-operators-8v6n8" Oct 11 02:49:05 crc kubenswrapper[4743]: I1011 02:49:05.443684 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8v6n8" Oct 11 02:49:06 crc kubenswrapper[4743]: I1011 02:49:06.022142 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8v6n8"] Oct 11 02:49:06 crc kubenswrapper[4743]: I1011 02:49:06.410038 4743 generic.go:334] "Generic (PLEG): container finished" podID="2c7cf1f2-5d43-4038-a3f3-f78a6859b47a" containerID="16d23f660b912944c1e87c9aa34ae1df4d228ba36633c97ef0a287bec97d0236" exitCode=0 Oct 11 02:49:06 crc kubenswrapper[4743]: I1011 02:49:06.410127 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8v6n8" event={"ID":"2c7cf1f2-5d43-4038-a3f3-f78a6859b47a","Type":"ContainerDied","Data":"16d23f660b912944c1e87c9aa34ae1df4d228ba36633c97ef0a287bec97d0236"} Oct 11 02:49:06 crc kubenswrapper[4743]: I1011 02:49:06.410399 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8v6n8" event={"ID":"2c7cf1f2-5d43-4038-a3f3-f78a6859b47a","Type":"ContainerStarted","Data":"9ec653c1f50b16cd7b1992b29ec3e9cc8e1007647fc6660d9ab395aec9aaf430"} Oct 11 02:49:07 crc kubenswrapper[4743]: I1011 02:49:07.425059 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8v6n8" event={"ID":"2c7cf1f2-5d43-4038-a3f3-f78a6859b47a","Type":"ContainerStarted","Data":"be4f372644884613013467bb5c686db00ba8a569d927c9ee7709f9c83c2917b2"} Oct 11 02:49:10 crc kubenswrapper[4743]: I1011 02:49:10.470511 4743 generic.go:334] "Generic (PLEG): container finished" podID="2c7cf1f2-5d43-4038-a3f3-f78a6859b47a" containerID="be4f372644884613013467bb5c686db00ba8a569d927c9ee7709f9c83c2917b2" exitCode=0 Oct 11 02:49:10 crc kubenswrapper[4743]: I1011 02:49:10.470592 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8v6n8" 
event={"ID":"2c7cf1f2-5d43-4038-a3f3-f78a6859b47a","Type":"ContainerDied","Data":"be4f372644884613013467bb5c686db00ba8a569d927c9ee7709f9c83c2917b2"} Oct 11 02:49:12 crc kubenswrapper[4743]: I1011 02:49:12.496538 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8v6n8" event={"ID":"2c7cf1f2-5d43-4038-a3f3-f78a6859b47a","Type":"ContainerStarted","Data":"258fe0b8200c99837d7a08ef30974b444c3fac33f7416c72e1863fe6d3de36e5"} Oct 11 02:49:12 crc kubenswrapper[4743]: I1011 02:49:12.516280 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8v6n8" podStartSLOduration=2.906685546 podStartE2EDuration="7.515528994s" podCreationTimestamp="2025-10-11 02:49:05 +0000 UTC" firstStartedPulling="2025-10-11 02:49:06.412204129 +0000 UTC m=+7041.065184526" lastFinishedPulling="2025-10-11 02:49:11.021047577 +0000 UTC m=+7045.674027974" observedRunningTime="2025-10-11 02:49:12.513267708 +0000 UTC m=+7047.166248105" watchObservedRunningTime="2025-10-11 02:49:12.515528994 +0000 UTC m=+7047.168509431" Oct 11 02:49:14 crc kubenswrapper[4743]: I1011 02:49:14.457749 4743 patch_prober.go:28] interesting pod/machine-config-daemon-cvm72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 02:49:14 crc kubenswrapper[4743]: I1011 02:49:14.458117 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 02:49:14 crc kubenswrapper[4743]: I1011 02:49:14.458164 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-cvm72" Oct 11 02:49:14 crc kubenswrapper[4743]: I1011 02:49:14.459323 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b346d18aff914ecb002e09f3c8ca75154604a876bdd7435ae937d7b6917314b0"} pod="openshift-machine-config-operator/machine-config-daemon-cvm72" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 11 02:49:14 crc kubenswrapper[4743]: I1011 02:49:14.459378 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" containerID="cri-o://b346d18aff914ecb002e09f3c8ca75154604a876bdd7435ae937d7b6917314b0" gracePeriod=600 Oct 11 02:49:14 crc kubenswrapper[4743]: E1011 02:49:14.598532 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:49:15 crc kubenswrapper[4743]: I1011 02:49:15.444594 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8v6n8" Oct 11 02:49:15 crc kubenswrapper[4743]: I1011 02:49:15.444646 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8v6n8" Oct 11 02:49:15 crc kubenswrapper[4743]: I1011 02:49:15.535384 4743 generic.go:334] "Generic (PLEG): container finished" podID="add92263-e252-446b-95de-092585b4357f" 
containerID="b346d18aff914ecb002e09f3c8ca75154604a876bdd7435ae937d7b6917314b0" exitCode=0 Oct 11 02:49:15 crc kubenswrapper[4743]: I1011 02:49:15.535433 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" event={"ID":"add92263-e252-446b-95de-092585b4357f","Type":"ContainerDied","Data":"b346d18aff914ecb002e09f3c8ca75154604a876bdd7435ae937d7b6917314b0"} Oct 11 02:49:15 crc kubenswrapper[4743]: I1011 02:49:15.535472 4743 scope.go:117] "RemoveContainer" containerID="9bad1aa167993888f3603cd19501049640480abd6eccd72436d7aac64304a3a5" Oct 11 02:49:15 crc kubenswrapper[4743]: I1011 02:49:15.536199 4743 scope.go:117] "RemoveContainer" containerID="b346d18aff914ecb002e09f3c8ca75154604a876bdd7435ae937d7b6917314b0" Oct 11 02:49:15 crc kubenswrapper[4743]: E1011 02:49:15.536537 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:49:16 crc kubenswrapper[4743]: I1011 02:49:16.506360 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8v6n8" podUID="2c7cf1f2-5d43-4038-a3f3-f78a6859b47a" containerName="registry-server" probeResult="failure" output=< Oct 11 02:49:16 crc kubenswrapper[4743]: timeout: failed to connect service ":50051" within 1s Oct 11 02:49:16 crc kubenswrapper[4743]: > Oct 11 02:49:25 crc kubenswrapper[4743]: I1011 02:49:25.528223 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8v6n8" Oct 11 02:49:25 crc kubenswrapper[4743]: I1011 02:49:25.590776 4743 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8v6n8" Oct 11 02:49:25 crc kubenswrapper[4743]: I1011 02:49:25.776135 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8v6n8"] Oct 11 02:49:26 crc kubenswrapper[4743]: I1011 02:49:26.688830 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8v6n8" podUID="2c7cf1f2-5d43-4038-a3f3-f78a6859b47a" containerName="registry-server" containerID="cri-o://258fe0b8200c99837d7a08ef30974b444c3fac33f7416c72e1863fe6d3de36e5" gracePeriod=2 Oct 11 02:49:27 crc kubenswrapper[4743]: I1011 02:49:27.095657 4743 scope.go:117] "RemoveContainer" containerID="b346d18aff914ecb002e09f3c8ca75154604a876bdd7435ae937d7b6917314b0" Oct 11 02:49:27 crc kubenswrapper[4743]: E1011 02:49:27.096208 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:49:27 crc kubenswrapper[4743]: I1011 02:49:27.333735 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8v6n8" Oct 11 02:49:27 crc kubenswrapper[4743]: I1011 02:49:27.459565 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvp2l\" (UniqueName: \"kubernetes.io/projected/2c7cf1f2-5d43-4038-a3f3-f78a6859b47a-kube-api-access-mvp2l\") pod \"2c7cf1f2-5d43-4038-a3f3-f78a6859b47a\" (UID: \"2c7cf1f2-5d43-4038-a3f3-f78a6859b47a\") " Oct 11 02:49:27 crc kubenswrapper[4743]: I1011 02:49:27.460109 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c7cf1f2-5d43-4038-a3f3-f78a6859b47a-utilities\") pod \"2c7cf1f2-5d43-4038-a3f3-f78a6859b47a\" (UID: \"2c7cf1f2-5d43-4038-a3f3-f78a6859b47a\") " Oct 11 02:49:27 crc kubenswrapper[4743]: I1011 02:49:27.460209 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c7cf1f2-5d43-4038-a3f3-f78a6859b47a-catalog-content\") pod \"2c7cf1f2-5d43-4038-a3f3-f78a6859b47a\" (UID: \"2c7cf1f2-5d43-4038-a3f3-f78a6859b47a\") " Oct 11 02:49:27 crc kubenswrapper[4743]: I1011 02:49:27.461342 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c7cf1f2-5d43-4038-a3f3-f78a6859b47a-utilities" (OuterVolumeSpecName: "utilities") pod "2c7cf1f2-5d43-4038-a3f3-f78a6859b47a" (UID: "2c7cf1f2-5d43-4038-a3f3-f78a6859b47a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 02:49:27 crc kubenswrapper[4743]: I1011 02:49:27.466481 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c7cf1f2-5d43-4038-a3f3-f78a6859b47a-kube-api-access-mvp2l" (OuterVolumeSpecName: "kube-api-access-mvp2l") pod "2c7cf1f2-5d43-4038-a3f3-f78a6859b47a" (UID: "2c7cf1f2-5d43-4038-a3f3-f78a6859b47a"). InnerVolumeSpecName "kube-api-access-mvp2l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 02:49:27 crc kubenswrapper[4743]: I1011 02:49:27.542643 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c7cf1f2-5d43-4038-a3f3-f78a6859b47a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2c7cf1f2-5d43-4038-a3f3-f78a6859b47a" (UID: "2c7cf1f2-5d43-4038-a3f3-f78a6859b47a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 02:49:27 crc kubenswrapper[4743]: I1011 02:49:27.563993 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c7cf1f2-5d43-4038-a3f3-f78a6859b47a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 11 02:49:27 crc kubenswrapper[4743]: I1011 02:49:27.564036 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvp2l\" (UniqueName: \"kubernetes.io/projected/2c7cf1f2-5d43-4038-a3f3-f78a6859b47a-kube-api-access-mvp2l\") on node \"crc\" DevicePath \"\"" Oct 11 02:49:27 crc kubenswrapper[4743]: I1011 02:49:27.564050 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c7cf1f2-5d43-4038-a3f3-f78a6859b47a-utilities\") on node \"crc\" DevicePath \"\"" Oct 11 02:49:27 crc kubenswrapper[4743]: I1011 02:49:27.702846 4743 generic.go:334] "Generic (PLEG): container finished" podID="2c7cf1f2-5d43-4038-a3f3-f78a6859b47a" containerID="258fe0b8200c99837d7a08ef30974b444c3fac33f7416c72e1863fe6d3de36e5" exitCode=0 Oct 11 02:49:27 crc kubenswrapper[4743]: I1011 02:49:27.703172 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8v6n8" event={"ID":"2c7cf1f2-5d43-4038-a3f3-f78a6859b47a","Type":"ContainerDied","Data":"258fe0b8200c99837d7a08ef30974b444c3fac33f7416c72e1863fe6d3de36e5"} Oct 11 02:49:27 crc kubenswrapper[4743]: I1011 02:49:27.703212 4743 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-8v6n8" event={"ID":"2c7cf1f2-5d43-4038-a3f3-f78a6859b47a","Type":"ContainerDied","Data":"9ec653c1f50b16cd7b1992b29ec3e9cc8e1007647fc6660d9ab395aec9aaf430"} Oct 11 02:49:27 crc kubenswrapper[4743]: I1011 02:49:27.703234 4743 scope.go:117] "RemoveContainer" containerID="258fe0b8200c99837d7a08ef30974b444c3fac33f7416c72e1863fe6d3de36e5" Oct 11 02:49:27 crc kubenswrapper[4743]: I1011 02:49:27.703372 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8v6n8" Oct 11 02:49:27 crc kubenswrapper[4743]: I1011 02:49:27.770242 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8v6n8"] Oct 11 02:49:27 crc kubenswrapper[4743]: I1011 02:49:27.771080 4743 scope.go:117] "RemoveContainer" containerID="be4f372644884613013467bb5c686db00ba8a569d927c9ee7709f9c83c2917b2" Oct 11 02:49:27 crc kubenswrapper[4743]: I1011 02:49:27.784056 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8v6n8"] Oct 11 02:49:27 crc kubenswrapper[4743]: I1011 02:49:27.805323 4743 scope.go:117] "RemoveContainer" containerID="16d23f660b912944c1e87c9aa34ae1df4d228ba36633c97ef0a287bec97d0236" Oct 11 02:49:27 crc kubenswrapper[4743]: I1011 02:49:27.874102 4743 scope.go:117] "RemoveContainer" containerID="258fe0b8200c99837d7a08ef30974b444c3fac33f7416c72e1863fe6d3de36e5" Oct 11 02:49:27 crc kubenswrapper[4743]: E1011 02:49:27.877381 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"258fe0b8200c99837d7a08ef30974b444c3fac33f7416c72e1863fe6d3de36e5\": container with ID starting with 258fe0b8200c99837d7a08ef30974b444c3fac33f7416c72e1863fe6d3de36e5 not found: ID does not exist" containerID="258fe0b8200c99837d7a08ef30974b444c3fac33f7416c72e1863fe6d3de36e5" Oct 11 02:49:27 crc kubenswrapper[4743]: I1011 02:49:27.877437 4743 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"258fe0b8200c99837d7a08ef30974b444c3fac33f7416c72e1863fe6d3de36e5"} err="failed to get container status \"258fe0b8200c99837d7a08ef30974b444c3fac33f7416c72e1863fe6d3de36e5\": rpc error: code = NotFound desc = could not find container \"258fe0b8200c99837d7a08ef30974b444c3fac33f7416c72e1863fe6d3de36e5\": container with ID starting with 258fe0b8200c99837d7a08ef30974b444c3fac33f7416c72e1863fe6d3de36e5 not found: ID does not exist" Oct 11 02:49:27 crc kubenswrapper[4743]: I1011 02:49:27.877470 4743 scope.go:117] "RemoveContainer" containerID="be4f372644884613013467bb5c686db00ba8a569d927c9ee7709f9c83c2917b2" Oct 11 02:49:27 crc kubenswrapper[4743]: E1011 02:49:27.877983 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be4f372644884613013467bb5c686db00ba8a569d927c9ee7709f9c83c2917b2\": container with ID starting with be4f372644884613013467bb5c686db00ba8a569d927c9ee7709f9c83c2917b2 not found: ID does not exist" containerID="be4f372644884613013467bb5c686db00ba8a569d927c9ee7709f9c83c2917b2" Oct 11 02:49:27 crc kubenswrapper[4743]: I1011 02:49:27.878014 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be4f372644884613013467bb5c686db00ba8a569d927c9ee7709f9c83c2917b2"} err="failed to get container status \"be4f372644884613013467bb5c686db00ba8a569d927c9ee7709f9c83c2917b2\": rpc error: code = NotFound desc = could not find container \"be4f372644884613013467bb5c686db00ba8a569d927c9ee7709f9c83c2917b2\": container with ID starting with be4f372644884613013467bb5c686db00ba8a569d927c9ee7709f9c83c2917b2 not found: ID does not exist" Oct 11 02:49:27 crc kubenswrapper[4743]: I1011 02:49:27.878031 4743 scope.go:117] "RemoveContainer" containerID="16d23f660b912944c1e87c9aa34ae1df4d228ba36633c97ef0a287bec97d0236" Oct 11 02:49:27 crc kubenswrapper[4743]: E1011 
02:49:27.878317 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16d23f660b912944c1e87c9aa34ae1df4d228ba36633c97ef0a287bec97d0236\": container with ID starting with 16d23f660b912944c1e87c9aa34ae1df4d228ba36633c97ef0a287bec97d0236 not found: ID does not exist" containerID="16d23f660b912944c1e87c9aa34ae1df4d228ba36633c97ef0a287bec97d0236" Oct 11 02:49:27 crc kubenswrapper[4743]: I1011 02:49:27.878348 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16d23f660b912944c1e87c9aa34ae1df4d228ba36633c97ef0a287bec97d0236"} err="failed to get container status \"16d23f660b912944c1e87c9aa34ae1df4d228ba36633c97ef0a287bec97d0236\": rpc error: code = NotFound desc = could not find container \"16d23f660b912944c1e87c9aa34ae1df4d228ba36633c97ef0a287bec97d0236\": container with ID starting with 16d23f660b912944c1e87c9aa34ae1df4d228ba36633c97ef0a287bec97d0236 not found: ID does not exist" Oct 11 02:49:28 crc kubenswrapper[4743]: I1011 02:49:28.103328 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c7cf1f2-5d43-4038-a3f3-f78a6859b47a" path="/var/lib/kubelet/pods/2c7cf1f2-5d43-4038-a3f3-f78a6859b47a/volumes" Oct 11 02:49:41 crc kubenswrapper[4743]: I1011 02:49:41.092212 4743 scope.go:117] "RemoveContainer" containerID="b346d18aff914ecb002e09f3c8ca75154604a876bdd7435ae937d7b6917314b0" Oct 11 02:49:41 crc kubenswrapper[4743]: E1011 02:49:41.093086 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:49:56 crc kubenswrapper[4743]: I1011 02:49:56.098677 
4743 scope.go:117] "RemoveContainer" containerID="b346d18aff914ecb002e09f3c8ca75154604a876bdd7435ae937d7b6917314b0" Oct 11 02:49:56 crc kubenswrapper[4743]: E1011 02:49:56.099531 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:50:09 crc kubenswrapper[4743]: I1011 02:50:09.092110 4743 scope.go:117] "RemoveContainer" containerID="b346d18aff914ecb002e09f3c8ca75154604a876bdd7435ae937d7b6917314b0" Oct 11 02:50:09 crc kubenswrapper[4743]: E1011 02:50:09.093155 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:50:22 crc kubenswrapper[4743]: I1011 02:50:22.091814 4743 scope.go:117] "RemoveContainer" containerID="b346d18aff914ecb002e09f3c8ca75154604a876bdd7435ae937d7b6917314b0" Oct 11 02:50:22 crc kubenswrapper[4743]: E1011 02:50:22.093106 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:50:33 crc kubenswrapper[4743]: I1011 
02:50:33.092100 4743 scope.go:117] "RemoveContainer" containerID="b346d18aff914ecb002e09f3c8ca75154604a876bdd7435ae937d7b6917314b0" Oct 11 02:50:33 crc kubenswrapper[4743]: E1011 02:50:33.092849 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:50:48 crc kubenswrapper[4743]: I1011 02:50:48.091803 4743 scope.go:117] "RemoveContainer" containerID="b346d18aff914ecb002e09f3c8ca75154604a876bdd7435ae937d7b6917314b0" Oct 11 02:50:48 crc kubenswrapper[4743]: E1011 02:50:48.092667 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:51:02 crc kubenswrapper[4743]: I1011 02:51:02.091451 4743 scope.go:117] "RemoveContainer" containerID="b346d18aff914ecb002e09f3c8ca75154604a876bdd7435ae937d7b6917314b0" Oct 11 02:51:02 crc kubenswrapper[4743]: E1011 02:51:02.092337 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:51:15 crc 
kubenswrapper[4743]: I1011 02:51:15.092323 4743 scope.go:117] "RemoveContainer" containerID="b346d18aff914ecb002e09f3c8ca75154604a876bdd7435ae937d7b6917314b0" Oct 11 02:51:15 crc kubenswrapper[4743]: E1011 02:51:15.093100 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:51:30 crc kubenswrapper[4743]: I1011 02:51:30.092204 4743 scope.go:117] "RemoveContainer" containerID="b346d18aff914ecb002e09f3c8ca75154604a876bdd7435ae937d7b6917314b0" Oct 11 02:51:30 crc kubenswrapper[4743]: E1011 02:51:30.093389 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:51:40 crc kubenswrapper[4743]: I1011 02:51:40.355382 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zzsm4"] Oct 11 02:51:40 crc kubenswrapper[4743]: E1011 02:51:40.357296 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c7cf1f2-5d43-4038-a3f3-f78a6859b47a" containerName="registry-server" Oct 11 02:51:40 crc kubenswrapper[4743]: I1011 02:51:40.357326 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c7cf1f2-5d43-4038-a3f3-f78a6859b47a" containerName="registry-server" Oct 11 02:51:40 crc kubenswrapper[4743]: E1011 02:51:40.357503 4743 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="2c7cf1f2-5d43-4038-a3f3-f78a6859b47a" containerName="extract-content" Oct 11 02:51:40 crc kubenswrapper[4743]: I1011 02:51:40.357533 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c7cf1f2-5d43-4038-a3f3-f78a6859b47a" containerName="extract-content" Oct 11 02:51:40 crc kubenswrapper[4743]: E1011 02:51:40.357606 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c7cf1f2-5d43-4038-a3f3-f78a6859b47a" containerName="extract-utilities" Oct 11 02:51:40 crc kubenswrapper[4743]: I1011 02:51:40.357624 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c7cf1f2-5d43-4038-a3f3-f78a6859b47a" containerName="extract-utilities" Oct 11 02:51:40 crc kubenswrapper[4743]: I1011 02:51:40.358290 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c7cf1f2-5d43-4038-a3f3-f78a6859b47a" containerName="registry-server" Oct 11 02:51:40 crc kubenswrapper[4743]: I1011 02:51:40.362055 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zzsm4" Oct 11 02:51:40 crc kubenswrapper[4743]: I1011 02:51:40.384677 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bde72df4-563d-43fb-b9e8-7937fb62a4ce-utilities\") pod \"redhat-marketplace-zzsm4\" (UID: \"bde72df4-563d-43fb-b9e8-7937fb62a4ce\") " pod="openshift-marketplace/redhat-marketplace-zzsm4" Oct 11 02:51:40 crc kubenswrapper[4743]: I1011 02:51:40.384835 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxcch\" (UniqueName: \"kubernetes.io/projected/bde72df4-563d-43fb-b9e8-7937fb62a4ce-kube-api-access-zxcch\") pod \"redhat-marketplace-zzsm4\" (UID: \"bde72df4-563d-43fb-b9e8-7937fb62a4ce\") " pod="openshift-marketplace/redhat-marketplace-zzsm4" Oct 11 02:51:40 crc kubenswrapper[4743]: I1011 02:51:40.384894 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bde72df4-563d-43fb-b9e8-7937fb62a4ce-catalog-content\") pod \"redhat-marketplace-zzsm4\" (UID: \"bde72df4-563d-43fb-b9e8-7937fb62a4ce\") " pod="openshift-marketplace/redhat-marketplace-zzsm4" Oct 11 02:51:40 crc kubenswrapper[4743]: I1011 02:51:40.406775 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zzsm4"] Oct 11 02:51:40 crc kubenswrapper[4743]: I1011 02:51:40.487732 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bde72df4-563d-43fb-b9e8-7937fb62a4ce-utilities\") pod \"redhat-marketplace-zzsm4\" (UID: \"bde72df4-563d-43fb-b9e8-7937fb62a4ce\") " pod="openshift-marketplace/redhat-marketplace-zzsm4" Oct 11 02:51:40 crc kubenswrapper[4743]: I1011 02:51:40.487920 4743 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-zxcch\" (UniqueName: \"kubernetes.io/projected/bde72df4-563d-43fb-b9e8-7937fb62a4ce-kube-api-access-zxcch\") pod \"redhat-marketplace-zzsm4\" (UID: \"bde72df4-563d-43fb-b9e8-7937fb62a4ce\") " pod="openshift-marketplace/redhat-marketplace-zzsm4" Oct 11 02:51:40 crc kubenswrapper[4743]: I1011 02:51:40.487957 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bde72df4-563d-43fb-b9e8-7937fb62a4ce-catalog-content\") pod \"redhat-marketplace-zzsm4\" (UID: \"bde72df4-563d-43fb-b9e8-7937fb62a4ce\") " pod="openshift-marketplace/redhat-marketplace-zzsm4" Oct 11 02:51:40 crc kubenswrapper[4743]: I1011 02:51:40.488454 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bde72df4-563d-43fb-b9e8-7937fb62a4ce-utilities\") pod \"redhat-marketplace-zzsm4\" (UID: \"bde72df4-563d-43fb-b9e8-7937fb62a4ce\") " pod="openshift-marketplace/redhat-marketplace-zzsm4" Oct 11 02:51:40 crc kubenswrapper[4743]: I1011 02:51:40.488465 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bde72df4-563d-43fb-b9e8-7937fb62a4ce-catalog-content\") pod \"redhat-marketplace-zzsm4\" (UID: \"bde72df4-563d-43fb-b9e8-7937fb62a4ce\") " pod="openshift-marketplace/redhat-marketplace-zzsm4" Oct 11 02:51:40 crc kubenswrapper[4743]: I1011 02:51:40.508079 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxcch\" (UniqueName: \"kubernetes.io/projected/bde72df4-563d-43fb-b9e8-7937fb62a4ce-kube-api-access-zxcch\") pod \"redhat-marketplace-zzsm4\" (UID: \"bde72df4-563d-43fb-b9e8-7937fb62a4ce\") " pod="openshift-marketplace/redhat-marketplace-zzsm4" Oct 11 02:51:40 crc kubenswrapper[4743]: I1011 02:51:40.688648 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zzsm4" Oct 11 02:51:41 crc kubenswrapper[4743]: I1011 02:51:41.176719 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zzsm4"] Oct 11 02:51:41 crc kubenswrapper[4743]: I1011 02:51:41.183269 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zzsm4" event={"ID":"bde72df4-563d-43fb-b9e8-7937fb62a4ce","Type":"ContainerStarted","Data":"71148b0c31fe8f1b022fdb1c9537b8529b4d848230f4e4132c09a61085f85898"} Oct 11 02:51:42 crc kubenswrapper[4743]: I1011 02:51:42.093366 4743 scope.go:117] "RemoveContainer" containerID="b346d18aff914ecb002e09f3c8ca75154604a876bdd7435ae937d7b6917314b0" Oct 11 02:51:42 crc kubenswrapper[4743]: E1011 02:51:42.094479 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:51:42 crc kubenswrapper[4743]: I1011 02:51:42.202826 4743 generic.go:334] "Generic (PLEG): container finished" podID="bde72df4-563d-43fb-b9e8-7937fb62a4ce" containerID="da63295ccdd6647eea6b1586ef116fdcf95251d6071389b2d05781b841e8c84e" exitCode=0 Oct 11 02:51:42 crc kubenswrapper[4743]: I1011 02:51:42.203226 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zzsm4" event={"ID":"bde72df4-563d-43fb-b9e8-7937fb62a4ce","Type":"ContainerDied","Data":"da63295ccdd6647eea6b1586ef116fdcf95251d6071389b2d05781b841e8c84e"} Oct 11 02:51:42 crc kubenswrapper[4743]: I1011 02:51:42.207845 4743 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 11 
02:51:43 crc kubenswrapper[4743]: I1011 02:51:43.215203 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zzsm4" event={"ID":"bde72df4-563d-43fb-b9e8-7937fb62a4ce","Type":"ContainerStarted","Data":"f868d8483bb31bf96b13e163677053d28209332ae0d6f128cca72269386e5f5f"} Oct 11 02:51:43 crc kubenswrapper[4743]: E1011 02:51:43.614608 4743 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbde72df4_563d_43fb_b9e8_7937fb62a4ce.slice/crio-f868d8483bb31bf96b13e163677053d28209332ae0d6f128cca72269386e5f5f.scope\": RecentStats: unable to find data in memory cache]" Oct 11 02:51:44 crc kubenswrapper[4743]: I1011 02:51:44.230121 4743 generic.go:334] "Generic (PLEG): container finished" podID="bde72df4-563d-43fb-b9e8-7937fb62a4ce" containerID="f868d8483bb31bf96b13e163677053d28209332ae0d6f128cca72269386e5f5f" exitCode=0 Oct 11 02:51:44 crc kubenswrapper[4743]: I1011 02:51:44.230213 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zzsm4" event={"ID":"bde72df4-563d-43fb-b9e8-7937fb62a4ce","Type":"ContainerDied","Data":"f868d8483bb31bf96b13e163677053d28209332ae0d6f128cca72269386e5f5f"} Oct 11 02:51:45 crc kubenswrapper[4743]: I1011 02:51:45.243188 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zzsm4" event={"ID":"bde72df4-563d-43fb-b9e8-7937fb62a4ce","Type":"ContainerStarted","Data":"de15386b60e2e1b4fd71f25ef25fb1fbe0e8c758eb4d07ce8e9562d001407f22"} Oct 11 02:51:45 crc kubenswrapper[4743]: I1011 02:51:45.268634 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zzsm4" podStartSLOduration=2.875303917 podStartE2EDuration="5.268615317s" podCreationTimestamp="2025-10-11 02:51:40 +0000 UTC" firstStartedPulling="2025-10-11 02:51:42.207369632 +0000 UTC 
m=+7196.860350069" lastFinishedPulling="2025-10-11 02:51:44.600681062 +0000 UTC m=+7199.253661469" observedRunningTime="2025-10-11 02:51:45.262077805 +0000 UTC m=+7199.915058202" watchObservedRunningTime="2025-10-11 02:51:45.268615317 +0000 UTC m=+7199.921595714" Oct 11 02:51:50 crc kubenswrapper[4743]: I1011 02:51:50.689103 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zzsm4" Oct 11 02:51:50 crc kubenswrapper[4743]: I1011 02:51:50.689812 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zzsm4" Oct 11 02:51:50 crc kubenswrapper[4743]: I1011 02:51:50.748756 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zzsm4" Oct 11 02:51:51 crc kubenswrapper[4743]: I1011 02:51:51.385129 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zzsm4" Oct 11 02:51:51 crc kubenswrapper[4743]: I1011 02:51:51.430303 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zzsm4"] Oct 11 02:51:53 crc kubenswrapper[4743]: I1011 02:51:53.092627 4743 scope.go:117] "RemoveContainer" containerID="b346d18aff914ecb002e09f3c8ca75154604a876bdd7435ae937d7b6917314b0" Oct 11 02:51:53 crc kubenswrapper[4743]: E1011 02:51:53.092942 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:51:53 crc kubenswrapper[4743]: I1011 02:51:53.331847 4743 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-marketplace/redhat-marketplace-zzsm4" podUID="bde72df4-563d-43fb-b9e8-7937fb62a4ce" containerName="registry-server" containerID="cri-o://de15386b60e2e1b4fd71f25ef25fb1fbe0e8c758eb4d07ce8e9562d001407f22" gracePeriod=2 Oct 11 02:51:53 crc kubenswrapper[4743]: I1011 02:51:53.853255 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zzsm4" Oct 11 02:51:53 crc kubenswrapper[4743]: I1011 02:51:53.947971 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bde72df4-563d-43fb-b9e8-7937fb62a4ce-utilities\") pod \"bde72df4-563d-43fb-b9e8-7937fb62a4ce\" (UID: \"bde72df4-563d-43fb-b9e8-7937fb62a4ce\") " Oct 11 02:51:53 crc kubenswrapper[4743]: I1011 02:51:53.948493 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bde72df4-563d-43fb-b9e8-7937fb62a4ce-catalog-content\") pod \"bde72df4-563d-43fb-b9e8-7937fb62a4ce\" (UID: \"bde72df4-563d-43fb-b9e8-7937fb62a4ce\") " Oct 11 02:51:53 crc kubenswrapper[4743]: I1011 02:51:53.948535 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zxcch\" (UniqueName: \"kubernetes.io/projected/bde72df4-563d-43fb-b9e8-7937fb62a4ce-kube-api-access-zxcch\") pod \"bde72df4-563d-43fb-b9e8-7937fb62a4ce\" (UID: \"bde72df4-563d-43fb-b9e8-7937fb62a4ce\") " Oct 11 02:51:53 crc kubenswrapper[4743]: I1011 02:51:53.948888 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bde72df4-563d-43fb-b9e8-7937fb62a4ce-utilities" (OuterVolumeSpecName: "utilities") pod "bde72df4-563d-43fb-b9e8-7937fb62a4ce" (UID: "bde72df4-563d-43fb-b9e8-7937fb62a4ce"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 02:51:53 crc kubenswrapper[4743]: I1011 02:51:53.951421 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bde72df4-563d-43fb-b9e8-7937fb62a4ce-utilities\") on node \"crc\" DevicePath \"\"" Oct 11 02:51:53 crc kubenswrapper[4743]: I1011 02:51:53.954248 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bde72df4-563d-43fb-b9e8-7937fb62a4ce-kube-api-access-zxcch" (OuterVolumeSpecName: "kube-api-access-zxcch") pod "bde72df4-563d-43fb-b9e8-7937fb62a4ce" (UID: "bde72df4-563d-43fb-b9e8-7937fb62a4ce"). InnerVolumeSpecName "kube-api-access-zxcch". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 02:51:53 crc kubenswrapper[4743]: I1011 02:51:53.962927 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bde72df4-563d-43fb-b9e8-7937fb62a4ce-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bde72df4-563d-43fb-b9e8-7937fb62a4ce" (UID: "bde72df4-563d-43fb-b9e8-7937fb62a4ce"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 02:51:54 crc kubenswrapper[4743]: I1011 02:51:54.053726 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bde72df4-563d-43fb-b9e8-7937fb62a4ce-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 11 02:51:54 crc kubenswrapper[4743]: I1011 02:51:54.053772 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zxcch\" (UniqueName: \"kubernetes.io/projected/bde72df4-563d-43fb-b9e8-7937fb62a4ce-kube-api-access-zxcch\") on node \"crc\" DevicePath \"\"" Oct 11 02:51:54 crc kubenswrapper[4743]: I1011 02:51:54.348552 4743 generic.go:334] "Generic (PLEG): container finished" podID="bde72df4-563d-43fb-b9e8-7937fb62a4ce" containerID="de15386b60e2e1b4fd71f25ef25fb1fbe0e8c758eb4d07ce8e9562d001407f22" exitCode=0 Oct 11 02:51:54 crc kubenswrapper[4743]: I1011 02:51:54.348619 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zzsm4" Oct 11 02:51:54 crc kubenswrapper[4743]: I1011 02:51:54.348640 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zzsm4" event={"ID":"bde72df4-563d-43fb-b9e8-7937fb62a4ce","Type":"ContainerDied","Data":"de15386b60e2e1b4fd71f25ef25fb1fbe0e8c758eb4d07ce8e9562d001407f22"} Oct 11 02:51:54 crc kubenswrapper[4743]: I1011 02:51:54.348964 4743 scope.go:117] "RemoveContainer" containerID="de15386b60e2e1b4fd71f25ef25fb1fbe0e8c758eb4d07ce8e9562d001407f22" Oct 11 02:51:54 crc kubenswrapper[4743]: I1011 02:51:54.348966 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zzsm4" event={"ID":"bde72df4-563d-43fb-b9e8-7937fb62a4ce","Type":"ContainerDied","Data":"71148b0c31fe8f1b022fdb1c9537b8529b4d848230f4e4132c09a61085f85898"} Oct 11 02:51:54 crc kubenswrapper[4743]: I1011 02:51:54.377722 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-zzsm4"] Oct 11 02:51:54 crc kubenswrapper[4743]: I1011 02:51:54.386793 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-zzsm4"] Oct 11 02:51:54 crc kubenswrapper[4743]: I1011 02:51:54.406683 4743 scope.go:117] "RemoveContainer" containerID="f868d8483bb31bf96b13e163677053d28209332ae0d6f128cca72269386e5f5f" Oct 11 02:51:54 crc kubenswrapper[4743]: I1011 02:51:54.431260 4743 scope.go:117] "RemoveContainer" containerID="da63295ccdd6647eea6b1586ef116fdcf95251d6071389b2d05781b841e8c84e" Oct 11 02:51:54 crc kubenswrapper[4743]: I1011 02:51:54.487454 4743 scope.go:117] "RemoveContainer" containerID="de15386b60e2e1b4fd71f25ef25fb1fbe0e8c758eb4d07ce8e9562d001407f22" Oct 11 02:51:54 crc kubenswrapper[4743]: E1011 02:51:54.488047 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de15386b60e2e1b4fd71f25ef25fb1fbe0e8c758eb4d07ce8e9562d001407f22\": container with ID starting with de15386b60e2e1b4fd71f25ef25fb1fbe0e8c758eb4d07ce8e9562d001407f22 not found: ID does not exist" containerID="de15386b60e2e1b4fd71f25ef25fb1fbe0e8c758eb4d07ce8e9562d001407f22" Oct 11 02:51:54 crc kubenswrapper[4743]: I1011 02:51:54.488085 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de15386b60e2e1b4fd71f25ef25fb1fbe0e8c758eb4d07ce8e9562d001407f22"} err="failed to get container status \"de15386b60e2e1b4fd71f25ef25fb1fbe0e8c758eb4d07ce8e9562d001407f22\": rpc error: code = NotFound desc = could not find container \"de15386b60e2e1b4fd71f25ef25fb1fbe0e8c758eb4d07ce8e9562d001407f22\": container with ID starting with de15386b60e2e1b4fd71f25ef25fb1fbe0e8c758eb4d07ce8e9562d001407f22 not found: ID does not exist" Oct 11 02:51:54 crc kubenswrapper[4743]: I1011 02:51:54.488111 4743 scope.go:117] "RemoveContainer" 
containerID="f868d8483bb31bf96b13e163677053d28209332ae0d6f128cca72269386e5f5f" Oct 11 02:51:54 crc kubenswrapper[4743]: E1011 02:51:54.488403 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f868d8483bb31bf96b13e163677053d28209332ae0d6f128cca72269386e5f5f\": container with ID starting with f868d8483bb31bf96b13e163677053d28209332ae0d6f128cca72269386e5f5f not found: ID does not exist" containerID="f868d8483bb31bf96b13e163677053d28209332ae0d6f128cca72269386e5f5f" Oct 11 02:51:54 crc kubenswrapper[4743]: I1011 02:51:54.488445 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f868d8483bb31bf96b13e163677053d28209332ae0d6f128cca72269386e5f5f"} err="failed to get container status \"f868d8483bb31bf96b13e163677053d28209332ae0d6f128cca72269386e5f5f\": rpc error: code = NotFound desc = could not find container \"f868d8483bb31bf96b13e163677053d28209332ae0d6f128cca72269386e5f5f\": container with ID starting with f868d8483bb31bf96b13e163677053d28209332ae0d6f128cca72269386e5f5f not found: ID does not exist" Oct 11 02:51:54 crc kubenswrapper[4743]: I1011 02:51:54.488462 4743 scope.go:117] "RemoveContainer" containerID="da63295ccdd6647eea6b1586ef116fdcf95251d6071389b2d05781b841e8c84e" Oct 11 02:51:54 crc kubenswrapper[4743]: E1011 02:51:54.488942 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da63295ccdd6647eea6b1586ef116fdcf95251d6071389b2d05781b841e8c84e\": container with ID starting with da63295ccdd6647eea6b1586ef116fdcf95251d6071389b2d05781b841e8c84e not found: ID does not exist" containerID="da63295ccdd6647eea6b1586ef116fdcf95251d6071389b2d05781b841e8c84e" Oct 11 02:51:54 crc kubenswrapper[4743]: I1011 02:51:54.489000 4743 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"da63295ccdd6647eea6b1586ef116fdcf95251d6071389b2d05781b841e8c84e"} err="failed to get container status \"da63295ccdd6647eea6b1586ef116fdcf95251d6071389b2d05781b841e8c84e\": rpc error: code = NotFound desc = could not find container \"da63295ccdd6647eea6b1586ef116fdcf95251d6071389b2d05781b841e8c84e\": container with ID starting with da63295ccdd6647eea6b1586ef116fdcf95251d6071389b2d05781b841e8c84e not found: ID does not exist" Oct 11 02:51:56 crc kubenswrapper[4743]: I1011 02:51:56.112040 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bde72df4-563d-43fb-b9e8-7937fb62a4ce" path="/var/lib/kubelet/pods/bde72df4-563d-43fb-b9e8-7937fb62a4ce/volumes" Oct 11 02:52:07 crc kubenswrapper[4743]: I1011 02:52:07.092781 4743 scope.go:117] "RemoveContainer" containerID="b346d18aff914ecb002e09f3c8ca75154604a876bdd7435ae937d7b6917314b0" Oct 11 02:52:07 crc kubenswrapper[4743]: E1011 02:52:07.094008 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:52:22 crc kubenswrapper[4743]: I1011 02:52:22.093384 4743 scope.go:117] "RemoveContainer" containerID="b346d18aff914ecb002e09f3c8ca75154604a876bdd7435ae937d7b6917314b0" Oct 11 02:52:22 crc kubenswrapper[4743]: E1011 02:52:22.094233 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:52:36 crc kubenswrapper[4743]: I1011 02:52:36.098100 4743 scope.go:117] "RemoveContainer" containerID="b346d18aff914ecb002e09f3c8ca75154604a876bdd7435ae937d7b6917314b0" Oct 11 02:52:36 crc kubenswrapper[4743]: E1011 02:52:36.098683 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:52:47 crc kubenswrapper[4743]: I1011 02:52:47.092048 4743 scope.go:117] "RemoveContainer" containerID="b346d18aff914ecb002e09f3c8ca75154604a876bdd7435ae937d7b6917314b0" Oct 11 02:52:47 crc kubenswrapper[4743]: E1011 02:52:47.092970 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:52:59 crc kubenswrapper[4743]: I1011 02:52:59.093185 4743 scope.go:117] "RemoveContainer" containerID="b346d18aff914ecb002e09f3c8ca75154604a876bdd7435ae937d7b6917314b0" Oct 11 02:52:59 crc kubenswrapper[4743]: E1011 02:52:59.095679 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:53:10 crc kubenswrapper[4743]: I1011 02:53:10.092990 4743 scope.go:117] "RemoveContainer" containerID="b346d18aff914ecb002e09f3c8ca75154604a876bdd7435ae937d7b6917314b0" Oct 11 02:53:10 crc kubenswrapper[4743]: E1011 02:53:10.093808 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:53:21 crc kubenswrapper[4743]: I1011 02:53:21.092613 4743 scope.go:117] "RemoveContainer" containerID="b346d18aff914ecb002e09f3c8ca75154604a876bdd7435ae937d7b6917314b0" Oct 11 02:53:21 crc kubenswrapper[4743]: E1011 02:53:21.093749 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:53:36 crc kubenswrapper[4743]: I1011 02:53:36.113511 4743 scope.go:117] "RemoveContainer" containerID="b346d18aff914ecb002e09f3c8ca75154604a876bdd7435ae937d7b6917314b0" Oct 11 02:53:36 crc kubenswrapper[4743]: E1011 02:53:36.114665 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:53:50 crc kubenswrapper[4743]: I1011 02:53:50.092399 4743 scope.go:117] "RemoveContainer" containerID="b346d18aff914ecb002e09f3c8ca75154604a876bdd7435ae937d7b6917314b0" Oct 11 02:53:50 crc kubenswrapper[4743]: E1011 02:53:50.093523 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:54:03 crc kubenswrapper[4743]: I1011 02:54:03.092087 4743 scope.go:117] "RemoveContainer" containerID="b346d18aff914ecb002e09f3c8ca75154604a876bdd7435ae937d7b6917314b0" Oct 11 02:54:03 crc kubenswrapper[4743]: E1011 02:54:03.092995 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 02:54:18 crc kubenswrapper[4743]: I1011 02:54:18.091364 4743 scope.go:117] "RemoveContainer" containerID="b346d18aff914ecb002e09f3c8ca75154604a876bdd7435ae937d7b6917314b0" Oct 11 02:54:18 crc kubenswrapper[4743]: I1011 02:54:18.933936 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" 
event={"ID":"add92263-e252-446b-95de-092585b4357f","Type":"ContainerStarted","Data":"85f2b99de1cc7ef2f34c96872391b70e6bcbca23b2568563cfde1c4d96fac977"} Oct 11 02:56:39 crc kubenswrapper[4743]: I1011 02:56:39.427194 4743 generic.go:334] "Generic (PLEG): container finished" podID="59e812a1-677a-4aca-bb9a-c4f0d166710a" containerID="c1417cf18987330c309aba79a1f08fe5bcc5e606328a37a1d6bfcb964e4d4210" exitCode=0 Oct 11 02:56:39 crc kubenswrapper[4743]: I1011 02:56:39.427258 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"59e812a1-677a-4aca-bb9a-c4f0d166710a","Type":"ContainerDied","Data":"c1417cf18987330c309aba79a1f08fe5bcc5e606328a37a1d6bfcb964e4d4210"} Oct 11 02:56:40 crc kubenswrapper[4743]: I1011 02:56:40.986598 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 11 02:56:41 crc kubenswrapper[4743]: I1011 02:56:41.048604 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"59e812a1-677a-4aca-bb9a-c4f0d166710a\" (UID: \"59e812a1-677a-4aca-bb9a-c4f0d166710a\") " Oct 11 02:56:41 crc kubenswrapper[4743]: I1011 02:56:41.048691 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/59e812a1-677a-4aca-bb9a-c4f0d166710a-ca-certs\") pod \"59e812a1-677a-4aca-bb9a-c4f0d166710a\" (UID: \"59e812a1-677a-4aca-bb9a-c4f0d166710a\") " Oct 11 02:56:41 crc kubenswrapper[4743]: I1011 02:56:41.048789 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/59e812a1-677a-4aca-bb9a-c4f0d166710a-config-data\") pod \"59e812a1-677a-4aca-bb9a-c4f0d166710a\" (UID: \"59e812a1-677a-4aca-bb9a-c4f0d166710a\") " Oct 11 02:56:41 crc kubenswrapper[4743]: I1011 02:56:41.048820 4743 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/59e812a1-677a-4aca-bb9a-c4f0d166710a-ssh-key\") pod \"59e812a1-677a-4aca-bb9a-c4f0d166710a\" (UID: \"59e812a1-677a-4aca-bb9a-c4f0d166710a\") " Oct 11 02:56:41 crc kubenswrapper[4743]: I1011 02:56:41.048885 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2tbht\" (UniqueName: \"kubernetes.io/projected/59e812a1-677a-4aca-bb9a-c4f0d166710a-kube-api-access-2tbht\") pod \"59e812a1-677a-4aca-bb9a-c4f0d166710a\" (UID: \"59e812a1-677a-4aca-bb9a-c4f0d166710a\") " Oct 11 02:56:41 crc kubenswrapper[4743]: I1011 02:56:41.048935 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/59e812a1-677a-4aca-bb9a-c4f0d166710a-openstack-config-secret\") pod \"59e812a1-677a-4aca-bb9a-c4f0d166710a\" (UID: \"59e812a1-677a-4aca-bb9a-c4f0d166710a\") " Oct 11 02:56:41 crc kubenswrapper[4743]: I1011 02:56:41.048964 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/59e812a1-677a-4aca-bb9a-c4f0d166710a-openstack-config\") pod \"59e812a1-677a-4aca-bb9a-c4f0d166710a\" (UID: \"59e812a1-677a-4aca-bb9a-c4f0d166710a\") " Oct 11 02:56:41 crc kubenswrapper[4743]: I1011 02:56:41.049212 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/59e812a1-677a-4aca-bb9a-c4f0d166710a-test-operator-ephemeral-temporary\") pod \"59e812a1-677a-4aca-bb9a-c4f0d166710a\" (UID: \"59e812a1-677a-4aca-bb9a-c4f0d166710a\") " Oct 11 02:56:41 crc kubenswrapper[4743]: I1011 02:56:41.049326 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: 
\"kubernetes.io/empty-dir/59e812a1-677a-4aca-bb9a-c4f0d166710a-test-operator-ephemeral-workdir\") pod \"59e812a1-677a-4aca-bb9a-c4f0d166710a\" (UID: \"59e812a1-677a-4aca-bb9a-c4f0d166710a\") " Oct 11 02:56:41 crc kubenswrapper[4743]: I1011 02:56:41.049948 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59e812a1-677a-4aca-bb9a-c4f0d166710a-config-data" (OuterVolumeSpecName: "config-data") pod "59e812a1-677a-4aca-bb9a-c4f0d166710a" (UID: "59e812a1-677a-4aca-bb9a-c4f0d166710a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 02:56:41 crc kubenswrapper[4743]: I1011 02:56:41.050200 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59e812a1-677a-4aca-bb9a-c4f0d166710a-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "59e812a1-677a-4aca-bb9a-c4f0d166710a" (UID: "59e812a1-677a-4aca-bb9a-c4f0d166710a"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 02:56:41 crc kubenswrapper[4743]: I1011 02:56:41.058012 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59e812a1-677a-4aca-bb9a-c4f0d166710a-kube-api-access-2tbht" (OuterVolumeSpecName: "kube-api-access-2tbht") pod "59e812a1-677a-4aca-bb9a-c4f0d166710a" (UID: "59e812a1-677a-4aca-bb9a-c4f0d166710a"). InnerVolumeSpecName "kube-api-access-2tbht". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 02:56:41 crc kubenswrapper[4743]: I1011 02:56:41.058266 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "test-operator-logs") pod "59e812a1-677a-4aca-bb9a-c4f0d166710a" (UID: "59e812a1-677a-4aca-bb9a-c4f0d166710a"). InnerVolumeSpecName "local-storage07-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 11 02:56:41 crc kubenswrapper[4743]: I1011 02:56:41.066802 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59e812a1-677a-4aca-bb9a-c4f0d166710a-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "59e812a1-677a-4aca-bb9a-c4f0d166710a" (UID: "59e812a1-677a-4aca-bb9a-c4f0d166710a"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 02:56:41 crc kubenswrapper[4743]: I1011 02:56:41.069643 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/59e812a1-677a-4aca-bb9a-c4f0d166710a-config-data\") on node \"crc\" DevicePath \"\"" Oct 11 02:56:41 crc kubenswrapper[4743]: I1011 02:56:41.069676 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2tbht\" (UniqueName: \"kubernetes.io/projected/59e812a1-677a-4aca-bb9a-c4f0d166710a-kube-api-access-2tbht\") on node \"crc\" DevicePath \"\"" Oct 11 02:56:41 crc kubenswrapper[4743]: I1011 02:56:41.069694 4743 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/59e812a1-677a-4aca-bb9a-c4f0d166710a-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Oct 11 02:56:41 crc kubenswrapper[4743]: I1011 02:56:41.069707 4743 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/59e812a1-677a-4aca-bb9a-c4f0d166710a-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Oct 11 02:56:41 crc kubenswrapper[4743]: I1011 02:56:41.069757 4743 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Oct 11 02:56:41 crc kubenswrapper[4743]: 
I1011 02:56:41.091157 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59e812a1-677a-4aca-bb9a-c4f0d166710a-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "59e812a1-677a-4aca-bb9a-c4f0d166710a" (UID: "59e812a1-677a-4aca-bb9a-c4f0d166710a"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 02:56:41 crc kubenswrapper[4743]: I1011 02:56:41.091588 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59e812a1-677a-4aca-bb9a-c4f0d166710a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "59e812a1-677a-4aca-bb9a-c4f0d166710a" (UID: "59e812a1-677a-4aca-bb9a-c4f0d166710a"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 02:56:41 crc kubenswrapper[4743]: I1011 02:56:41.108704 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59e812a1-677a-4aca-bb9a-c4f0d166710a-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "59e812a1-677a-4aca-bb9a-c4f0d166710a" (UID: "59e812a1-677a-4aca-bb9a-c4f0d166710a"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 02:56:41 crc kubenswrapper[4743]: I1011 02:56:41.130490 4743 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Oct 11 02:56:41 crc kubenswrapper[4743]: I1011 02:56:41.131849 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59e812a1-677a-4aca-bb9a-c4f0d166710a-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "59e812a1-677a-4aca-bb9a-c4f0d166710a" (UID: "59e812a1-677a-4aca-bb9a-c4f0d166710a"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 02:56:41 crc kubenswrapper[4743]: I1011 02:56:41.171420 4743 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Oct 11 02:56:41 crc kubenswrapper[4743]: I1011 02:56:41.171451 4743 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/59e812a1-677a-4aca-bb9a-c4f0d166710a-ca-certs\") on node \"crc\" DevicePath \"\"" Oct 11 02:56:41 crc kubenswrapper[4743]: I1011 02:56:41.171460 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/59e812a1-677a-4aca-bb9a-c4f0d166710a-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 11 02:56:41 crc kubenswrapper[4743]: I1011 02:56:41.171469 4743 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/59e812a1-677a-4aca-bb9a-c4f0d166710a-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 11 02:56:41 crc kubenswrapper[4743]: I1011 02:56:41.171480 4743 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/59e812a1-677a-4aca-bb9a-c4f0d166710a-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 11 02:56:41 crc kubenswrapper[4743]: I1011 02:56:41.452649 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"59e812a1-677a-4aca-bb9a-c4f0d166710a","Type":"ContainerDied","Data":"4cbcedab8669dc7e88e557e1077dcba47b8cba98b942e275eced85107128365a"} Oct 11 02:56:41 crc kubenswrapper[4743]: I1011 02:56:41.452941 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4cbcedab8669dc7e88e557e1077dcba47b8cba98b942e275eced85107128365a" Oct 11 02:56:41 crc kubenswrapper[4743]: I1011 02:56:41.452758 4743 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 11 02:56:44 crc kubenswrapper[4743]: I1011 02:56:44.458168 4743 patch_prober.go:28] interesting pod/machine-config-daemon-cvm72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 02:56:44 crc kubenswrapper[4743]: I1011 02:56:44.458653 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 02:56:50 crc kubenswrapper[4743]: I1011 02:56:50.489932 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 11 02:56:50 crc kubenswrapper[4743]: E1011 02:56:50.491251 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bde72df4-563d-43fb-b9e8-7937fb62a4ce" containerName="extract-utilities" Oct 11 02:56:50 crc kubenswrapper[4743]: I1011 02:56:50.491275 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="bde72df4-563d-43fb-b9e8-7937fb62a4ce" containerName="extract-utilities" Oct 11 02:56:50 crc kubenswrapper[4743]: E1011 02:56:50.491404 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bde72df4-563d-43fb-b9e8-7937fb62a4ce" containerName="extract-content" Oct 11 02:56:50 crc kubenswrapper[4743]: I1011 02:56:50.491418 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="bde72df4-563d-43fb-b9e8-7937fb62a4ce" containerName="extract-content" Oct 11 02:56:50 crc kubenswrapper[4743]: E1011 02:56:50.491443 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59e812a1-677a-4aca-bb9a-c4f0d166710a" 
containerName="tempest-tests-tempest-tests-runner" Oct 11 02:56:50 crc kubenswrapper[4743]: I1011 02:56:50.491455 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="59e812a1-677a-4aca-bb9a-c4f0d166710a" containerName="tempest-tests-tempest-tests-runner" Oct 11 02:56:50 crc kubenswrapper[4743]: E1011 02:56:50.491535 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bde72df4-563d-43fb-b9e8-7937fb62a4ce" containerName="registry-server" Oct 11 02:56:50 crc kubenswrapper[4743]: I1011 02:56:50.491548 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="bde72df4-563d-43fb-b9e8-7937fb62a4ce" containerName="registry-server" Oct 11 02:56:50 crc kubenswrapper[4743]: I1011 02:56:50.491952 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="bde72df4-563d-43fb-b9e8-7937fb62a4ce" containerName="registry-server" Oct 11 02:56:50 crc kubenswrapper[4743]: I1011 02:56:50.491975 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="59e812a1-677a-4aca-bb9a-c4f0d166710a" containerName="tempest-tests-tempest-tests-runner" Oct 11 02:56:50 crc kubenswrapper[4743]: I1011 02:56:50.493396 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 11 02:56:50 crc kubenswrapper[4743]: I1011 02:56:50.495723 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-xtpzb" Oct 11 02:56:50 crc kubenswrapper[4743]: I1011 02:56:50.502612 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 11 02:56:50 crc kubenswrapper[4743]: I1011 02:56:50.606008 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4wkn\" (UniqueName: \"kubernetes.io/projected/795afd2c-7ccb-435d-8cc6-6ef474ddf6e1-kube-api-access-k4wkn\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"795afd2c-7ccb-435d-8cc6-6ef474ddf6e1\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 11 02:56:50 crc kubenswrapper[4743]: I1011 02:56:50.606581 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"795afd2c-7ccb-435d-8cc6-6ef474ddf6e1\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 11 02:56:50 crc kubenswrapper[4743]: I1011 02:56:50.708124 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4wkn\" (UniqueName: \"kubernetes.io/projected/795afd2c-7ccb-435d-8cc6-6ef474ddf6e1-kube-api-access-k4wkn\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"795afd2c-7ccb-435d-8cc6-6ef474ddf6e1\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 11 02:56:50 crc kubenswrapper[4743]: I1011 02:56:50.708279 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage07-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"795afd2c-7ccb-435d-8cc6-6ef474ddf6e1\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 11 02:56:50 crc kubenswrapper[4743]: I1011 02:56:50.709399 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"795afd2c-7ccb-435d-8cc6-6ef474ddf6e1\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 11 02:56:50 crc kubenswrapper[4743]: I1011 02:56:50.740321 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4wkn\" (UniqueName: \"kubernetes.io/projected/795afd2c-7ccb-435d-8cc6-6ef474ddf6e1-kube-api-access-k4wkn\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"795afd2c-7ccb-435d-8cc6-6ef474ddf6e1\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 11 02:56:50 crc kubenswrapper[4743]: I1011 02:56:50.743319 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"795afd2c-7ccb-435d-8cc6-6ef474ddf6e1\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 11 02:56:50 crc kubenswrapper[4743]: I1011 02:56:50.817710 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 11 02:56:51 crc kubenswrapper[4743]: I1011 02:56:51.306519 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 11 02:56:51 crc kubenswrapper[4743]: W1011 02:56:51.316305 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod795afd2c_7ccb_435d_8cc6_6ef474ddf6e1.slice/crio-8f2dcf56a14752d21cf811952cda73f99e8a9603c200f95f8bda6058027b6bd1 WatchSource:0}: Error finding container 8f2dcf56a14752d21cf811952cda73f99e8a9603c200f95f8bda6058027b6bd1: Status 404 returned error can't find the container with id 8f2dcf56a14752d21cf811952cda73f99e8a9603c200f95f8bda6058027b6bd1 Oct 11 02:56:51 crc kubenswrapper[4743]: I1011 02:56:51.321209 4743 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 11 02:56:51 crc kubenswrapper[4743]: I1011 02:56:51.567295 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"795afd2c-7ccb-435d-8cc6-6ef474ddf6e1","Type":"ContainerStarted","Data":"8f2dcf56a14752d21cf811952cda73f99e8a9603c200f95f8bda6058027b6bd1"} Oct 11 02:56:53 crc kubenswrapper[4743]: I1011 02:56:53.589768 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"795afd2c-7ccb-435d-8cc6-6ef474ddf6e1","Type":"ContainerStarted","Data":"3e2933a70f01d04fa24bd183e42e2a00cfe48f895729e766b65364fa8652f890"} Oct 11 02:56:53 crc kubenswrapper[4743]: I1011 02:56:53.607687 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.381979539 podStartE2EDuration="3.607667824s" podCreationTimestamp="2025-10-11 02:56:50 +0000 UTC" 
firstStartedPulling="2025-10-11 02:56:51.321034441 +0000 UTC m=+7505.974014838" lastFinishedPulling="2025-10-11 02:56:52.546722736 +0000 UTC m=+7507.199703123" observedRunningTime="2025-10-11 02:56:53.604130616 +0000 UTC m=+7508.257111033" watchObservedRunningTime="2025-10-11 02:56:53.607667824 +0000 UTC m=+7508.260648221" Oct 11 02:57:14 crc kubenswrapper[4743]: I1011 02:57:14.458608 4743 patch_prober.go:28] interesting pod/machine-config-daemon-cvm72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 02:57:14 crc kubenswrapper[4743]: I1011 02:57:14.459247 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 02:57:16 crc kubenswrapper[4743]: I1011 02:57:16.388395 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-dmn4d/must-gather-svv82"] Oct 11 02:57:16 crc kubenswrapper[4743]: I1011 02:57:16.393172 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dmn4d/must-gather-svv82" Oct 11 02:57:16 crc kubenswrapper[4743]: I1011 02:57:16.397360 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-dmn4d"/"default-dockercfg-67g48" Oct 11 02:57:16 crc kubenswrapper[4743]: I1011 02:57:16.397956 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-dmn4d"/"kube-root-ca.crt" Oct 11 02:57:16 crc kubenswrapper[4743]: I1011 02:57:16.398308 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-dmn4d"/"openshift-service-ca.crt" Oct 11 02:57:16 crc kubenswrapper[4743]: I1011 02:57:16.412604 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-dmn4d/must-gather-svv82"] Oct 11 02:57:16 crc kubenswrapper[4743]: I1011 02:57:16.492296 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwnvm\" (UniqueName: \"kubernetes.io/projected/6ac11bdb-263b-4572-889e-311b00b61201-kube-api-access-kwnvm\") pod \"must-gather-svv82\" (UID: \"6ac11bdb-263b-4572-889e-311b00b61201\") " pod="openshift-must-gather-dmn4d/must-gather-svv82" Oct 11 02:57:16 crc kubenswrapper[4743]: I1011 02:57:16.492791 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6ac11bdb-263b-4572-889e-311b00b61201-must-gather-output\") pod \"must-gather-svv82\" (UID: \"6ac11bdb-263b-4572-889e-311b00b61201\") " pod="openshift-must-gather-dmn4d/must-gather-svv82" Oct 11 02:57:16 crc kubenswrapper[4743]: I1011 02:57:16.595408 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6ac11bdb-263b-4572-889e-311b00b61201-must-gather-output\") pod \"must-gather-svv82\" (UID: \"6ac11bdb-263b-4572-889e-311b00b61201\") " 
pod="openshift-must-gather-dmn4d/must-gather-svv82" Oct 11 02:57:16 crc kubenswrapper[4743]: I1011 02:57:16.595491 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwnvm\" (UniqueName: \"kubernetes.io/projected/6ac11bdb-263b-4572-889e-311b00b61201-kube-api-access-kwnvm\") pod \"must-gather-svv82\" (UID: \"6ac11bdb-263b-4572-889e-311b00b61201\") " pod="openshift-must-gather-dmn4d/must-gather-svv82" Oct 11 02:57:16 crc kubenswrapper[4743]: I1011 02:57:16.596060 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6ac11bdb-263b-4572-889e-311b00b61201-must-gather-output\") pod \"must-gather-svv82\" (UID: \"6ac11bdb-263b-4572-889e-311b00b61201\") " pod="openshift-must-gather-dmn4d/must-gather-svv82" Oct 11 02:57:16 crc kubenswrapper[4743]: I1011 02:57:16.618091 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwnvm\" (UniqueName: \"kubernetes.io/projected/6ac11bdb-263b-4572-889e-311b00b61201-kube-api-access-kwnvm\") pod \"must-gather-svv82\" (UID: \"6ac11bdb-263b-4572-889e-311b00b61201\") " pod="openshift-must-gather-dmn4d/must-gather-svv82" Oct 11 02:57:16 crc kubenswrapper[4743]: I1011 02:57:16.718253 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dmn4d/must-gather-svv82" Oct 11 02:57:17 crc kubenswrapper[4743]: I1011 02:57:17.233790 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-dmn4d/must-gather-svv82"] Oct 11 02:57:17 crc kubenswrapper[4743]: I1011 02:57:17.841331 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dmn4d/must-gather-svv82" event={"ID":"6ac11bdb-263b-4572-889e-311b00b61201","Type":"ContainerStarted","Data":"6862d86526f5a00023e610e6127cc69a27fae4520ad94d7cba5c550356867291"} Oct 11 02:57:25 crc kubenswrapper[4743]: I1011 02:57:25.926165 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dmn4d/must-gather-svv82" event={"ID":"6ac11bdb-263b-4572-889e-311b00b61201","Type":"ContainerStarted","Data":"c4099bc965f55fc6a5ff36fd5ffc6fef73824b3c110c4a6a35f80123916ace83"} Oct 11 02:57:25 crc kubenswrapper[4743]: I1011 02:57:25.926891 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dmn4d/must-gather-svv82" event={"ID":"6ac11bdb-263b-4572-889e-311b00b61201","Type":"ContainerStarted","Data":"b09072a149bbfd55f7798798a0fdf816b98d19e5b0821a0efe726be18ad892d6"} Oct 11 02:57:25 crc kubenswrapper[4743]: I1011 02:57:25.948426 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-dmn4d/must-gather-svv82" podStartSLOduration=1.9874557849999999 podStartE2EDuration="9.948403148s" podCreationTimestamp="2025-10-11 02:57:16 +0000 UTC" firstStartedPulling="2025-10-11 02:57:17.24298744 +0000 UTC m=+7531.895967837" lastFinishedPulling="2025-10-11 02:57:25.203934803 +0000 UTC m=+7539.856915200" observedRunningTime="2025-10-11 02:57:25.94366551 +0000 UTC m=+7540.596645907" watchObservedRunningTime="2025-10-11 02:57:25.948403148 +0000 UTC m=+7540.601383545" Oct 11 02:57:30 crc kubenswrapper[4743]: E1011 02:57:30.627011 4743 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 
38.102.83.106:56292->38.102.83.106:39201: write tcp 38.102.83.106:56292->38.102.83.106:39201: write: connection reset by peer Oct 11 02:57:31 crc kubenswrapper[4743]: I1011 02:57:31.742409 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-dmn4d/crc-debug-jzxhb"] Oct 11 02:57:31 crc kubenswrapper[4743]: I1011 02:57:31.744136 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dmn4d/crc-debug-jzxhb" Oct 11 02:57:31 crc kubenswrapper[4743]: I1011 02:57:31.755227 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/190b6487-b2c6-4bec-b39f-d55952738f06-host\") pod \"crc-debug-jzxhb\" (UID: \"190b6487-b2c6-4bec-b39f-d55952738f06\") " pod="openshift-must-gather-dmn4d/crc-debug-jzxhb" Oct 11 02:57:31 crc kubenswrapper[4743]: I1011 02:57:31.755509 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwg75\" (UniqueName: \"kubernetes.io/projected/190b6487-b2c6-4bec-b39f-d55952738f06-kube-api-access-cwg75\") pod \"crc-debug-jzxhb\" (UID: \"190b6487-b2c6-4bec-b39f-d55952738f06\") " pod="openshift-must-gather-dmn4d/crc-debug-jzxhb" Oct 11 02:57:31 crc kubenswrapper[4743]: I1011 02:57:31.858165 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/190b6487-b2c6-4bec-b39f-d55952738f06-host\") pod \"crc-debug-jzxhb\" (UID: \"190b6487-b2c6-4bec-b39f-d55952738f06\") " pod="openshift-must-gather-dmn4d/crc-debug-jzxhb" Oct 11 02:57:31 crc kubenswrapper[4743]: I1011 02:57:31.858673 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwg75\" (UniqueName: \"kubernetes.io/projected/190b6487-b2c6-4bec-b39f-d55952738f06-kube-api-access-cwg75\") pod \"crc-debug-jzxhb\" (UID: \"190b6487-b2c6-4bec-b39f-d55952738f06\") " 
pod="openshift-must-gather-dmn4d/crc-debug-jzxhb" Oct 11 02:57:31 crc kubenswrapper[4743]: I1011 02:57:31.859383 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/190b6487-b2c6-4bec-b39f-d55952738f06-host\") pod \"crc-debug-jzxhb\" (UID: \"190b6487-b2c6-4bec-b39f-d55952738f06\") " pod="openshift-must-gather-dmn4d/crc-debug-jzxhb" Oct 11 02:57:31 crc kubenswrapper[4743]: I1011 02:57:31.887160 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwg75\" (UniqueName: \"kubernetes.io/projected/190b6487-b2c6-4bec-b39f-d55952738f06-kube-api-access-cwg75\") pod \"crc-debug-jzxhb\" (UID: \"190b6487-b2c6-4bec-b39f-d55952738f06\") " pod="openshift-must-gather-dmn4d/crc-debug-jzxhb" Oct 11 02:57:32 crc kubenswrapper[4743]: I1011 02:57:32.062121 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dmn4d/crc-debug-jzxhb" Oct 11 02:57:32 crc kubenswrapper[4743]: W1011 02:57:32.104520 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod190b6487_b2c6_4bec_b39f_d55952738f06.slice/crio-6a3bfd8e47057022335e72fb4ba868fbf0fc15cc44c2b6176f11eb543e31a5d9 WatchSource:0}: Error finding container 6a3bfd8e47057022335e72fb4ba868fbf0fc15cc44c2b6176f11eb543e31a5d9: Status 404 returned error can't find the container with id 6a3bfd8e47057022335e72fb4ba868fbf0fc15cc44c2b6176f11eb543e31a5d9 Oct 11 02:57:33 crc kubenswrapper[4743]: I1011 02:57:33.003959 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dmn4d/crc-debug-jzxhb" event={"ID":"190b6487-b2c6-4bec-b39f-d55952738f06","Type":"ContainerStarted","Data":"6a3bfd8e47057022335e72fb4ba868fbf0fc15cc44c2b6176f11eb543e31a5d9"} Oct 11 02:57:44 crc kubenswrapper[4743]: I1011 02:57:44.157895 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-must-gather-dmn4d/crc-debug-jzxhb" event={"ID":"190b6487-b2c6-4bec-b39f-d55952738f06","Type":"ContainerStarted","Data":"9dc7fcc1d4ae9fb30dfa876b976bade81cea21a40e14d1cbec11aae0e91205d9"} Oct 11 02:57:44 crc kubenswrapper[4743]: I1011 02:57:44.177947 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-dmn4d/crc-debug-jzxhb" podStartSLOduration=2.077128279 podStartE2EDuration="13.177926117s" podCreationTimestamp="2025-10-11 02:57:31 +0000 UTC" firstStartedPulling="2025-10-11 02:57:32.108296537 +0000 UTC m=+7546.761276934" lastFinishedPulling="2025-10-11 02:57:43.209094375 +0000 UTC m=+7557.862074772" observedRunningTime="2025-10-11 02:57:44.171788045 +0000 UTC m=+7558.824768442" watchObservedRunningTime="2025-10-11 02:57:44.177926117 +0000 UTC m=+7558.830906514" Oct 11 02:57:44 crc kubenswrapper[4743]: I1011 02:57:44.458084 4743 patch_prober.go:28] interesting pod/machine-config-daemon-cvm72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 02:57:44 crc kubenswrapper[4743]: I1011 02:57:44.458399 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 02:57:44 crc kubenswrapper[4743]: I1011 02:57:44.458439 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" Oct 11 02:57:44 crc kubenswrapper[4743]: I1011 02:57:44.459236 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"85f2b99de1cc7ef2f34c96872391b70e6bcbca23b2568563cfde1c4d96fac977"} pod="openshift-machine-config-operator/machine-config-daemon-cvm72" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 11 02:57:44 crc kubenswrapper[4743]: I1011 02:57:44.459296 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" containerID="cri-o://85f2b99de1cc7ef2f34c96872391b70e6bcbca23b2568563cfde1c4d96fac977" gracePeriod=600 Oct 11 02:57:45 crc kubenswrapper[4743]: I1011 02:57:45.170198 4743 generic.go:334] "Generic (PLEG): container finished" podID="add92263-e252-446b-95de-092585b4357f" containerID="85f2b99de1cc7ef2f34c96872391b70e6bcbca23b2568563cfde1c4d96fac977" exitCode=0 Oct 11 02:57:45 crc kubenswrapper[4743]: I1011 02:57:45.170273 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" event={"ID":"add92263-e252-446b-95de-092585b4357f","Type":"ContainerDied","Data":"85f2b99de1cc7ef2f34c96872391b70e6bcbca23b2568563cfde1c4d96fac977"} Oct 11 02:57:45 crc kubenswrapper[4743]: I1011 02:57:45.170784 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" event={"ID":"add92263-e252-446b-95de-092585b4357f","Type":"ContainerStarted","Data":"5156e89fb638919c369d76a46669f0fc82773aee80ae1bf54eefcf957683ec61"} Oct 11 02:57:45 crc kubenswrapper[4743]: I1011 02:57:45.170805 4743 scope.go:117] "RemoveContainer" containerID="b346d18aff914ecb002e09f3c8ca75154604a876bdd7435ae937d7b6917314b0" Oct 11 02:58:33 crc kubenswrapper[4743]: I1011 02:58:33.742530 4743 generic.go:334] "Generic (PLEG): container finished" podID="190b6487-b2c6-4bec-b39f-d55952738f06" 
containerID="9dc7fcc1d4ae9fb30dfa876b976bade81cea21a40e14d1cbec11aae0e91205d9" exitCode=0 Oct 11 02:58:33 crc kubenswrapper[4743]: I1011 02:58:33.742612 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dmn4d/crc-debug-jzxhb" event={"ID":"190b6487-b2c6-4bec-b39f-d55952738f06","Type":"ContainerDied","Data":"9dc7fcc1d4ae9fb30dfa876b976bade81cea21a40e14d1cbec11aae0e91205d9"} Oct 11 02:58:34 crc kubenswrapper[4743]: I1011 02:58:34.904752 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dmn4d/crc-debug-jzxhb" Oct 11 02:58:34 crc kubenswrapper[4743]: I1011 02:58:34.947899 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-dmn4d/crc-debug-jzxhb"] Oct 11 02:58:34 crc kubenswrapper[4743]: I1011 02:58:34.958025 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-dmn4d/crc-debug-jzxhb"] Oct 11 02:58:34 crc kubenswrapper[4743]: I1011 02:58:34.978997 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cwg75\" (UniqueName: \"kubernetes.io/projected/190b6487-b2c6-4bec-b39f-d55952738f06-kube-api-access-cwg75\") pod \"190b6487-b2c6-4bec-b39f-d55952738f06\" (UID: \"190b6487-b2c6-4bec-b39f-d55952738f06\") " Oct 11 02:58:34 crc kubenswrapper[4743]: I1011 02:58:34.979125 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/190b6487-b2c6-4bec-b39f-d55952738f06-host\") pod \"190b6487-b2c6-4bec-b39f-d55952738f06\" (UID: \"190b6487-b2c6-4bec-b39f-d55952738f06\") " Oct 11 02:58:34 crc kubenswrapper[4743]: I1011 02:58:34.979253 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/190b6487-b2c6-4bec-b39f-d55952738f06-host" (OuterVolumeSpecName: "host") pod "190b6487-b2c6-4bec-b39f-d55952738f06" (UID: "190b6487-b2c6-4bec-b39f-d55952738f06"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 02:58:34 crc kubenswrapper[4743]: I1011 02:58:34.979645 4743 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/190b6487-b2c6-4bec-b39f-d55952738f06-host\") on node \"crc\" DevicePath \"\"" Oct 11 02:58:34 crc kubenswrapper[4743]: I1011 02:58:34.992283 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/190b6487-b2c6-4bec-b39f-d55952738f06-kube-api-access-cwg75" (OuterVolumeSpecName: "kube-api-access-cwg75") pod "190b6487-b2c6-4bec-b39f-d55952738f06" (UID: "190b6487-b2c6-4bec-b39f-d55952738f06"). InnerVolumeSpecName "kube-api-access-cwg75". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 02:58:35 crc kubenswrapper[4743]: I1011 02:58:35.082437 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cwg75\" (UniqueName: \"kubernetes.io/projected/190b6487-b2c6-4bec-b39f-d55952738f06-kube-api-access-cwg75\") on node \"crc\" DevicePath \"\"" Oct 11 02:58:35 crc kubenswrapper[4743]: I1011 02:58:35.761133 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a3bfd8e47057022335e72fb4ba868fbf0fc15cc44c2b6176f11eb543e31a5d9" Oct 11 02:58:35 crc kubenswrapper[4743]: I1011 02:58:35.761581 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dmn4d/crc-debug-jzxhb" Oct 11 02:58:35 crc kubenswrapper[4743]: E1011 02:58:35.933113 4743 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod190b6487_b2c6_4bec_b39f_d55952738f06.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod190b6487_b2c6_4bec_b39f_d55952738f06.slice/crio-6a3bfd8e47057022335e72fb4ba868fbf0fc15cc44c2b6176f11eb543e31a5d9\": RecentStats: unable to find data in memory cache]" Oct 11 02:58:36 crc kubenswrapper[4743]: I1011 02:58:36.111531 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="190b6487-b2c6-4bec-b39f-d55952738f06" path="/var/lib/kubelet/pods/190b6487-b2c6-4bec-b39f-d55952738f06/volumes" Oct 11 02:58:36 crc kubenswrapper[4743]: I1011 02:58:36.168669 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-dmn4d/crc-debug-lpqb5"] Oct 11 02:58:36 crc kubenswrapper[4743]: E1011 02:58:36.169684 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="190b6487-b2c6-4bec-b39f-d55952738f06" containerName="container-00" Oct 11 02:58:36 crc kubenswrapper[4743]: I1011 02:58:36.169717 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="190b6487-b2c6-4bec-b39f-d55952738f06" containerName="container-00" Oct 11 02:58:36 crc kubenswrapper[4743]: I1011 02:58:36.170034 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="190b6487-b2c6-4bec-b39f-d55952738f06" containerName="container-00" Oct 11 02:58:36 crc kubenswrapper[4743]: I1011 02:58:36.171113 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dmn4d/crc-debug-lpqb5" Oct 11 02:58:36 crc kubenswrapper[4743]: I1011 02:58:36.305726 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/25bc3b6e-679c-4905-bd80-b3d1afbf8523-host\") pod \"crc-debug-lpqb5\" (UID: \"25bc3b6e-679c-4905-bd80-b3d1afbf8523\") " pod="openshift-must-gather-dmn4d/crc-debug-lpqb5" Oct 11 02:58:36 crc kubenswrapper[4743]: I1011 02:58:36.305994 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s26gc\" (UniqueName: \"kubernetes.io/projected/25bc3b6e-679c-4905-bd80-b3d1afbf8523-kube-api-access-s26gc\") pod \"crc-debug-lpqb5\" (UID: \"25bc3b6e-679c-4905-bd80-b3d1afbf8523\") " pod="openshift-must-gather-dmn4d/crc-debug-lpqb5" Oct 11 02:58:36 crc kubenswrapper[4743]: I1011 02:58:36.408418 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/25bc3b6e-679c-4905-bd80-b3d1afbf8523-host\") pod \"crc-debug-lpqb5\" (UID: \"25bc3b6e-679c-4905-bd80-b3d1afbf8523\") " pod="openshift-must-gather-dmn4d/crc-debug-lpqb5" Oct 11 02:58:36 crc kubenswrapper[4743]: I1011 02:58:36.408603 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s26gc\" (UniqueName: \"kubernetes.io/projected/25bc3b6e-679c-4905-bd80-b3d1afbf8523-kube-api-access-s26gc\") pod \"crc-debug-lpqb5\" (UID: \"25bc3b6e-679c-4905-bd80-b3d1afbf8523\") " pod="openshift-must-gather-dmn4d/crc-debug-lpqb5" Oct 11 02:58:36 crc kubenswrapper[4743]: I1011 02:58:36.408612 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/25bc3b6e-679c-4905-bd80-b3d1afbf8523-host\") pod \"crc-debug-lpqb5\" (UID: \"25bc3b6e-679c-4905-bd80-b3d1afbf8523\") " pod="openshift-must-gather-dmn4d/crc-debug-lpqb5" Oct 11 02:58:36 crc 
kubenswrapper[4743]: I1011 02:58:36.428319 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s26gc\" (UniqueName: \"kubernetes.io/projected/25bc3b6e-679c-4905-bd80-b3d1afbf8523-kube-api-access-s26gc\") pod \"crc-debug-lpqb5\" (UID: \"25bc3b6e-679c-4905-bd80-b3d1afbf8523\") " pod="openshift-must-gather-dmn4d/crc-debug-lpqb5" Oct 11 02:58:36 crc kubenswrapper[4743]: I1011 02:58:36.489298 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dmn4d/crc-debug-lpqb5" Oct 11 02:58:36 crc kubenswrapper[4743]: I1011 02:58:36.771357 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dmn4d/crc-debug-lpqb5" event={"ID":"25bc3b6e-679c-4905-bd80-b3d1afbf8523","Type":"ContainerStarted","Data":"4fe2a0b945a5918863335bd2c3be7077767b20a4729112a641689ca1ccdd0c64"} Oct 11 02:58:37 crc kubenswrapper[4743]: I1011 02:58:37.781980 4743 generic.go:334] "Generic (PLEG): container finished" podID="25bc3b6e-679c-4905-bd80-b3d1afbf8523" containerID="d6de51480ebcaae0fb768156c5d7cd97e768d82d4a43b1bdf70f3871080fcd2e" exitCode=0 Oct 11 02:58:37 crc kubenswrapper[4743]: I1011 02:58:37.782044 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dmn4d/crc-debug-lpqb5" event={"ID":"25bc3b6e-679c-4905-bd80-b3d1afbf8523","Type":"ContainerDied","Data":"d6de51480ebcaae0fb768156c5d7cd97e768d82d4a43b1bdf70f3871080fcd2e"} Oct 11 02:58:38 crc kubenswrapper[4743]: I1011 02:58:38.901715 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dmn4d/crc-debug-lpqb5" Oct 11 02:58:38 crc kubenswrapper[4743]: I1011 02:58:38.968607 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/25bc3b6e-679c-4905-bd80-b3d1afbf8523-host\") pod \"25bc3b6e-679c-4905-bd80-b3d1afbf8523\" (UID: \"25bc3b6e-679c-4905-bd80-b3d1afbf8523\") " Oct 11 02:58:38 crc kubenswrapper[4743]: I1011 02:58:38.968807 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s26gc\" (UniqueName: \"kubernetes.io/projected/25bc3b6e-679c-4905-bd80-b3d1afbf8523-kube-api-access-s26gc\") pod \"25bc3b6e-679c-4905-bd80-b3d1afbf8523\" (UID: \"25bc3b6e-679c-4905-bd80-b3d1afbf8523\") " Oct 11 02:58:38 crc kubenswrapper[4743]: I1011 02:58:38.968885 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/25bc3b6e-679c-4905-bd80-b3d1afbf8523-host" (OuterVolumeSpecName: "host") pod "25bc3b6e-679c-4905-bd80-b3d1afbf8523" (UID: "25bc3b6e-679c-4905-bd80-b3d1afbf8523"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 02:58:38 crc kubenswrapper[4743]: I1011 02:58:38.969708 4743 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/25bc3b6e-679c-4905-bd80-b3d1afbf8523-host\") on node \"crc\" DevicePath \"\"" Oct 11 02:58:38 crc kubenswrapper[4743]: I1011 02:58:38.978223 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25bc3b6e-679c-4905-bd80-b3d1afbf8523-kube-api-access-s26gc" (OuterVolumeSpecName: "kube-api-access-s26gc") pod "25bc3b6e-679c-4905-bd80-b3d1afbf8523" (UID: "25bc3b6e-679c-4905-bd80-b3d1afbf8523"). InnerVolumeSpecName "kube-api-access-s26gc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 02:58:39 crc kubenswrapper[4743]: I1011 02:58:39.071406 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s26gc\" (UniqueName: \"kubernetes.io/projected/25bc3b6e-679c-4905-bd80-b3d1afbf8523-kube-api-access-s26gc\") on node \"crc\" DevicePath \"\"" Oct 11 02:58:39 crc kubenswrapper[4743]: I1011 02:58:39.817792 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dmn4d/crc-debug-lpqb5" event={"ID":"25bc3b6e-679c-4905-bd80-b3d1afbf8523","Type":"ContainerDied","Data":"4fe2a0b945a5918863335bd2c3be7077767b20a4729112a641689ca1ccdd0c64"} Oct 11 02:58:39 crc kubenswrapper[4743]: I1011 02:58:39.817835 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4fe2a0b945a5918863335bd2c3be7077767b20a4729112a641689ca1ccdd0c64" Oct 11 02:58:39 crc kubenswrapper[4743]: I1011 02:58:39.817940 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dmn4d/crc-debug-lpqb5" Oct 11 02:58:40 crc kubenswrapper[4743]: I1011 02:58:40.123052 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-dmn4d/crc-debug-lpqb5"] Oct 11 02:58:40 crc kubenswrapper[4743]: I1011 02:58:40.131994 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-dmn4d/crc-debug-lpqb5"] Oct 11 02:58:41 crc kubenswrapper[4743]: I1011 02:58:41.304652 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-dmn4d/crc-debug-g88hf"] Oct 11 02:58:41 crc kubenswrapper[4743]: E1011 02:58:41.305500 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25bc3b6e-679c-4905-bd80-b3d1afbf8523" containerName="container-00" Oct 11 02:58:41 crc kubenswrapper[4743]: I1011 02:58:41.305513 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="25bc3b6e-679c-4905-bd80-b3d1afbf8523" containerName="container-00" Oct 11 02:58:41 crc 
kubenswrapper[4743]: I1011 02:58:41.305713 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="25bc3b6e-679c-4905-bd80-b3d1afbf8523" containerName="container-00" Oct 11 02:58:41 crc kubenswrapper[4743]: I1011 02:58:41.306446 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dmn4d/crc-debug-g88hf" Oct 11 02:58:41 crc kubenswrapper[4743]: I1011 02:58:41.420784 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c4e313cf-8fa3-425f-ab21-8ad901774ab4-host\") pod \"crc-debug-g88hf\" (UID: \"c4e313cf-8fa3-425f-ab21-8ad901774ab4\") " pod="openshift-must-gather-dmn4d/crc-debug-g88hf" Oct 11 02:58:41 crc kubenswrapper[4743]: I1011 02:58:41.420955 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rh67l\" (UniqueName: \"kubernetes.io/projected/c4e313cf-8fa3-425f-ab21-8ad901774ab4-kube-api-access-rh67l\") pod \"crc-debug-g88hf\" (UID: \"c4e313cf-8fa3-425f-ab21-8ad901774ab4\") " pod="openshift-must-gather-dmn4d/crc-debug-g88hf" Oct 11 02:58:41 crc kubenswrapper[4743]: I1011 02:58:41.522846 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c4e313cf-8fa3-425f-ab21-8ad901774ab4-host\") pod \"crc-debug-g88hf\" (UID: \"c4e313cf-8fa3-425f-ab21-8ad901774ab4\") " pod="openshift-must-gather-dmn4d/crc-debug-g88hf" Oct 11 02:58:41 crc kubenswrapper[4743]: I1011 02:58:41.522981 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rh67l\" (UniqueName: \"kubernetes.io/projected/c4e313cf-8fa3-425f-ab21-8ad901774ab4-kube-api-access-rh67l\") pod \"crc-debug-g88hf\" (UID: \"c4e313cf-8fa3-425f-ab21-8ad901774ab4\") " pod="openshift-must-gather-dmn4d/crc-debug-g88hf" Oct 11 02:58:41 crc kubenswrapper[4743]: I1011 02:58:41.523066 4743 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c4e313cf-8fa3-425f-ab21-8ad901774ab4-host\") pod \"crc-debug-g88hf\" (UID: \"c4e313cf-8fa3-425f-ab21-8ad901774ab4\") " pod="openshift-must-gather-dmn4d/crc-debug-g88hf" Oct 11 02:58:41 crc kubenswrapper[4743]: I1011 02:58:41.547760 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rh67l\" (UniqueName: \"kubernetes.io/projected/c4e313cf-8fa3-425f-ab21-8ad901774ab4-kube-api-access-rh67l\") pod \"crc-debug-g88hf\" (UID: \"c4e313cf-8fa3-425f-ab21-8ad901774ab4\") " pod="openshift-must-gather-dmn4d/crc-debug-g88hf" Oct 11 02:58:41 crc kubenswrapper[4743]: I1011 02:58:41.625791 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dmn4d/crc-debug-g88hf" Oct 11 02:58:41 crc kubenswrapper[4743]: I1011 02:58:41.842274 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dmn4d/crc-debug-g88hf" event={"ID":"c4e313cf-8fa3-425f-ab21-8ad901774ab4","Type":"ContainerStarted","Data":"b9340faf275f40a3ecc18b9851c592041245102731c347649a835b86bd3498ed"} Oct 11 02:58:42 crc kubenswrapper[4743]: I1011 02:58:42.107049 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25bc3b6e-679c-4905-bd80-b3d1afbf8523" path="/var/lib/kubelet/pods/25bc3b6e-679c-4905-bd80-b3d1afbf8523/volumes" Oct 11 02:58:42 crc kubenswrapper[4743]: I1011 02:58:42.857208 4743 generic.go:334] "Generic (PLEG): container finished" podID="c4e313cf-8fa3-425f-ab21-8ad901774ab4" containerID="fce7ce31009961ce8fe97ae666b9afa55ae760ed6c760e7ce688a09aa5644ee0" exitCode=0 Oct 11 02:58:42 crc kubenswrapper[4743]: I1011 02:58:42.857259 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dmn4d/crc-debug-g88hf" 
event={"ID":"c4e313cf-8fa3-425f-ab21-8ad901774ab4","Type":"ContainerDied","Data":"fce7ce31009961ce8fe97ae666b9afa55ae760ed6c760e7ce688a09aa5644ee0"} Oct 11 02:58:42 crc kubenswrapper[4743]: I1011 02:58:42.917641 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-dmn4d/crc-debug-g88hf"] Oct 11 02:58:42 crc kubenswrapper[4743]: I1011 02:58:42.930515 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-dmn4d/crc-debug-g88hf"] Oct 11 02:58:44 crc kubenswrapper[4743]: I1011 02:58:44.000005 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dmn4d/crc-debug-g88hf" Oct 11 02:58:44 crc kubenswrapper[4743]: I1011 02:58:44.081602 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c4e313cf-8fa3-425f-ab21-8ad901774ab4-host\") pod \"c4e313cf-8fa3-425f-ab21-8ad901774ab4\" (UID: \"c4e313cf-8fa3-425f-ab21-8ad901774ab4\") " Oct 11 02:58:44 crc kubenswrapper[4743]: I1011 02:58:44.081796 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c4e313cf-8fa3-425f-ab21-8ad901774ab4-host" (OuterVolumeSpecName: "host") pod "c4e313cf-8fa3-425f-ab21-8ad901774ab4" (UID: "c4e313cf-8fa3-425f-ab21-8ad901774ab4"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 02:58:44 crc kubenswrapper[4743]: I1011 02:58:44.081967 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rh67l\" (UniqueName: \"kubernetes.io/projected/c4e313cf-8fa3-425f-ab21-8ad901774ab4-kube-api-access-rh67l\") pod \"c4e313cf-8fa3-425f-ab21-8ad901774ab4\" (UID: \"c4e313cf-8fa3-425f-ab21-8ad901774ab4\") " Oct 11 02:58:44 crc kubenswrapper[4743]: I1011 02:58:44.082473 4743 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c4e313cf-8fa3-425f-ab21-8ad901774ab4-host\") on node \"crc\" DevicePath \"\"" Oct 11 02:58:44 crc kubenswrapper[4743]: I1011 02:58:44.088089 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4e313cf-8fa3-425f-ab21-8ad901774ab4-kube-api-access-rh67l" (OuterVolumeSpecName: "kube-api-access-rh67l") pod "c4e313cf-8fa3-425f-ab21-8ad901774ab4" (UID: "c4e313cf-8fa3-425f-ab21-8ad901774ab4"). InnerVolumeSpecName "kube-api-access-rh67l". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 02:58:44 crc kubenswrapper[4743]: I1011 02:58:44.104615 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4e313cf-8fa3-425f-ab21-8ad901774ab4" path="/var/lib/kubelet/pods/c4e313cf-8fa3-425f-ab21-8ad901774ab4/volumes" Oct 11 02:58:44 crc kubenswrapper[4743]: I1011 02:58:44.185979 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rh67l\" (UniqueName: \"kubernetes.io/projected/c4e313cf-8fa3-425f-ab21-8ad901774ab4-kube-api-access-rh67l\") on node \"crc\" DevicePath \"\"" Oct 11 02:58:44 crc kubenswrapper[4743]: I1011 02:58:44.877693 4743 scope.go:117] "RemoveContainer" containerID="fce7ce31009961ce8fe97ae666b9afa55ae760ed6c760e7ce688a09aa5644ee0" Oct 11 02:58:44 crc kubenswrapper[4743]: I1011 02:58:44.877702 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dmn4d/crc-debug-g88hf" Oct 11 02:58:50 crc kubenswrapper[4743]: I1011 02:58:50.787788 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4n4jn"] Oct 11 02:58:50 crc kubenswrapper[4743]: E1011 02:58:50.789292 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4e313cf-8fa3-425f-ab21-8ad901774ab4" containerName="container-00" Oct 11 02:58:50 crc kubenswrapper[4743]: I1011 02:58:50.789357 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4e313cf-8fa3-425f-ab21-8ad901774ab4" containerName="container-00" Oct 11 02:58:50 crc kubenswrapper[4743]: I1011 02:58:50.789566 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4e313cf-8fa3-425f-ab21-8ad901774ab4" containerName="container-00" Oct 11 02:58:50 crc kubenswrapper[4743]: I1011 02:58:50.791738 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4n4jn" Oct 11 02:58:50 crc kubenswrapper[4743]: I1011 02:58:50.840196 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4n4jn"] Oct 11 02:58:50 crc kubenswrapper[4743]: I1011 02:58:50.851374 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4da7ac2f-f2f8-49cd-866c-51b54daa1b51-utilities\") pod \"community-operators-4n4jn\" (UID: \"4da7ac2f-f2f8-49cd-866c-51b54daa1b51\") " pod="openshift-marketplace/community-operators-4n4jn" Oct 11 02:58:50 crc kubenswrapper[4743]: I1011 02:58:50.851624 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5vf4\" (UniqueName: \"kubernetes.io/projected/4da7ac2f-f2f8-49cd-866c-51b54daa1b51-kube-api-access-v5vf4\") pod \"community-operators-4n4jn\" (UID: \"4da7ac2f-f2f8-49cd-866c-51b54daa1b51\") " 
pod="openshift-marketplace/community-operators-4n4jn" Oct 11 02:58:50 crc kubenswrapper[4743]: I1011 02:58:50.851944 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4da7ac2f-f2f8-49cd-866c-51b54daa1b51-catalog-content\") pod \"community-operators-4n4jn\" (UID: \"4da7ac2f-f2f8-49cd-866c-51b54daa1b51\") " pod="openshift-marketplace/community-operators-4n4jn" Oct 11 02:58:50 crc kubenswrapper[4743]: I1011 02:58:50.954217 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4da7ac2f-f2f8-49cd-866c-51b54daa1b51-utilities\") pod \"community-operators-4n4jn\" (UID: \"4da7ac2f-f2f8-49cd-866c-51b54daa1b51\") " pod="openshift-marketplace/community-operators-4n4jn" Oct 11 02:58:50 crc kubenswrapper[4743]: I1011 02:58:50.954298 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5vf4\" (UniqueName: \"kubernetes.io/projected/4da7ac2f-f2f8-49cd-866c-51b54daa1b51-kube-api-access-v5vf4\") pod \"community-operators-4n4jn\" (UID: \"4da7ac2f-f2f8-49cd-866c-51b54daa1b51\") " pod="openshift-marketplace/community-operators-4n4jn" Oct 11 02:58:50 crc kubenswrapper[4743]: I1011 02:58:50.954426 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4da7ac2f-f2f8-49cd-866c-51b54daa1b51-catalog-content\") pod \"community-operators-4n4jn\" (UID: \"4da7ac2f-f2f8-49cd-866c-51b54daa1b51\") " pod="openshift-marketplace/community-operators-4n4jn" Oct 11 02:58:50 crc kubenswrapper[4743]: I1011 02:58:50.955069 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4da7ac2f-f2f8-49cd-866c-51b54daa1b51-catalog-content\") pod \"community-operators-4n4jn\" (UID: \"4da7ac2f-f2f8-49cd-866c-51b54daa1b51\") " 
pod="openshift-marketplace/community-operators-4n4jn" Oct 11 02:58:50 crc kubenswrapper[4743]: I1011 02:58:50.955265 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4da7ac2f-f2f8-49cd-866c-51b54daa1b51-utilities\") pod \"community-operators-4n4jn\" (UID: \"4da7ac2f-f2f8-49cd-866c-51b54daa1b51\") " pod="openshift-marketplace/community-operators-4n4jn" Oct 11 02:58:50 crc kubenswrapper[4743]: I1011 02:58:50.976897 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5vf4\" (UniqueName: \"kubernetes.io/projected/4da7ac2f-f2f8-49cd-866c-51b54daa1b51-kube-api-access-v5vf4\") pod \"community-operators-4n4jn\" (UID: \"4da7ac2f-f2f8-49cd-866c-51b54daa1b51\") " pod="openshift-marketplace/community-operators-4n4jn" Oct 11 02:58:51 crc kubenswrapper[4743]: I1011 02:58:51.128080 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4n4jn" Oct 11 02:58:51 crc kubenswrapper[4743]: I1011 02:58:51.992260 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4n4jn"] Oct 11 02:58:52 crc kubenswrapper[4743]: I1011 02:58:52.995501 4743 generic.go:334] "Generic (PLEG): container finished" podID="4da7ac2f-f2f8-49cd-866c-51b54daa1b51" containerID="b90265755b866b9a0ddd74035e1780377d27807b4bbe81ab0e6ae61201f56b18" exitCode=0 Oct 11 02:58:52 crc kubenswrapper[4743]: I1011 02:58:52.995552 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4n4jn" event={"ID":"4da7ac2f-f2f8-49cd-866c-51b54daa1b51","Type":"ContainerDied","Data":"b90265755b866b9a0ddd74035e1780377d27807b4bbe81ab0e6ae61201f56b18"} Oct 11 02:58:52 crc kubenswrapper[4743]: I1011 02:58:52.996009 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4n4jn" 
event={"ID":"4da7ac2f-f2f8-49cd-866c-51b54daa1b51","Type":"ContainerStarted","Data":"2d8e824a02763f5b9632aeeaff0ebe7326623b5a3b2f00c697c7983eed08a32e"} Oct 11 02:58:53 crc kubenswrapper[4743]: I1011 02:58:53.168425 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2dfxt"] Oct 11 02:58:53 crc kubenswrapper[4743]: I1011 02:58:53.182816 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2dfxt"] Oct 11 02:58:53 crc kubenswrapper[4743]: I1011 02:58:53.187152 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2dfxt" Oct 11 02:58:53 crc kubenswrapper[4743]: I1011 02:58:53.330417 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a8e1204-b12f-465f-9cdc-9828e813199c-catalog-content\") pod \"certified-operators-2dfxt\" (UID: \"1a8e1204-b12f-465f-9cdc-9828e813199c\") " pod="openshift-marketplace/certified-operators-2dfxt" Oct 11 02:58:53 crc kubenswrapper[4743]: I1011 02:58:53.330493 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a8e1204-b12f-465f-9cdc-9828e813199c-utilities\") pod \"certified-operators-2dfxt\" (UID: \"1a8e1204-b12f-465f-9cdc-9828e813199c\") " pod="openshift-marketplace/certified-operators-2dfxt" Oct 11 02:58:53 crc kubenswrapper[4743]: I1011 02:58:53.330628 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffdqf\" (UniqueName: \"kubernetes.io/projected/1a8e1204-b12f-465f-9cdc-9828e813199c-kube-api-access-ffdqf\") pod \"certified-operators-2dfxt\" (UID: \"1a8e1204-b12f-465f-9cdc-9828e813199c\") " pod="openshift-marketplace/certified-operators-2dfxt" Oct 11 02:58:53 crc kubenswrapper[4743]: I1011 02:58:53.433266 
4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffdqf\" (UniqueName: \"kubernetes.io/projected/1a8e1204-b12f-465f-9cdc-9828e813199c-kube-api-access-ffdqf\") pod \"certified-operators-2dfxt\" (UID: \"1a8e1204-b12f-465f-9cdc-9828e813199c\") " pod="openshift-marketplace/certified-operators-2dfxt" Oct 11 02:58:53 crc kubenswrapper[4743]: I1011 02:58:53.433413 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a8e1204-b12f-465f-9cdc-9828e813199c-catalog-content\") pod \"certified-operators-2dfxt\" (UID: \"1a8e1204-b12f-465f-9cdc-9828e813199c\") " pod="openshift-marketplace/certified-operators-2dfxt" Oct 11 02:58:53 crc kubenswrapper[4743]: I1011 02:58:53.433495 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a8e1204-b12f-465f-9cdc-9828e813199c-utilities\") pod \"certified-operators-2dfxt\" (UID: \"1a8e1204-b12f-465f-9cdc-9828e813199c\") " pod="openshift-marketplace/certified-operators-2dfxt" Oct 11 02:58:53 crc kubenswrapper[4743]: I1011 02:58:53.434193 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a8e1204-b12f-465f-9cdc-9828e813199c-catalog-content\") pod \"certified-operators-2dfxt\" (UID: \"1a8e1204-b12f-465f-9cdc-9828e813199c\") " pod="openshift-marketplace/certified-operators-2dfxt" Oct 11 02:58:53 crc kubenswrapper[4743]: I1011 02:58:53.434350 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a8e1204-b12f-465f-9cdc-9828e813199c-utilities\") pod \"certified-operators-2dfxt\" (UID: \"1a8e1204-b12f-465f-9cdc-9828e813199c\") " pod="openshift-marketplace/certified-operators-2dfxt" Oct 11 02:58:53 crc kubenswrapper[4743]: I1011 02:58:53.451715 4743 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-ffdqf\" (UniqueName: \"kubernetes.io/projected/1a8e1204-b12f-465f-9cdc-9828e813199c-kube-api-access-ffdqf\") pod \"certified-operators-2dfxt\" (UID: \"1a8e1204-b12f-465f-9cdc-9828e813199c\") " pod="openshift-marketplace/certified-operators-2dfxt" Oct 11 02:58:53 crc kubenswrapper[4743]: I1011 02:58:53.512928 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2dfxt" Oct 11 02:58:54 crc kubenswrapper[4743]: W1011 02:58:54.076384 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a8e1204_b12f_465f_9cdc_9828e813199c.slice/crio-8e238d59cf20984de6c6961d67f1456147f4afba1701c088c35299dca2d98fe1 WatchSource:0}: Error finding container 8e238d59cf20984de6c6961d67f1456147f4afba1701c088c35299dca2d98fe1: Status 404 returned error can't find the container with id 8e238d59cf20984de6c6961d67f1456147f4afba1701c088c35299dca2d98fe1 Oct 11 02:58:54 crc kubenswrapper[4743]: I1011 02:58:54.083198 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2dfxt"] Oct 11 02:58:55 crc kubenswrapper[4743]: I1011 02:58:55.022188 4743 generic.go:334] "Generic (PLEG): container finished" podID="1a8e1204-b12f-465f-9cdc-9828e813199c" containerID="712562a79758daa737ff9682f5fc2d748697e60ce3862892bf7b5530ac0feb94" exitCode=0 Oct 11 02:58:55 crc kubenswrapper[4743]: I1011 02:58:55.022298 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2dfxt" event={"ID":"1a8e1204-b12f-465f-9cdc-9828e813199c","Type":"ContainerDied","Data":"712562a79758daa737ff9682f5fc2d748697e60ce3862892bf7b5530ac0feb94"} Oct 11 02:58:55 crc kubenswrapper[4743]: I1011 02:58:55.022547 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2dfxt" 
event={"ID":"1a8e1204-b12f-465f-9cdc-9828e813199c","Type":"ContainerStarted","Data":"8e238d59cf20984de6c6961d67f1456147f4afba1701c088c35299dca2d98fe1"} Oct 11 02:58:55 crc kubenswrapper[4743]: I1011 02:58:55.027339 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4n4jn" event={"ID":"4da7ac2f-f2f8-49cd-866c-51b54daa1b51","Type":"ContainerStarted","Data":"e35b840616968a95337b28c0779631356ae7d92e26be43681424613467107bac"} Oct 11 02:58:56 crc kubenswrapper[4743]: I1011 02:58:56.037456 4743 generic.go:334] "Generic (PLEG): container finished" podID="4da7ac2f-f2f8-49cd-866c-51b54daa1b51" containerID="e35b840616968a95337b28c0779631356ae7d92e26be43681424613467107bac" exitCode=0 Oct 11 02:58:56 crc kubenswrapper[4743]: I1011 02:58:56.037524 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4n4jn" event={"ID":"4da7ac2f-f2f8-49cd-866c-51b54daa1b51","Type":"ContainerDied","Data":"e35b840616968a95337b28c0779631356ae7d92e26be43681424613467107bac"} Oct 11 02:58:57 crc kubenswrapper[4743]: I1011 02:58:57.066441 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2dfxt" event={"ID":"1a8e1204-b12f-465f-9cdc-9828e813199c","Type":"ContainerStarted","Data":"938976c9f3608f30169b3bbf2aa8b9b14d943c382c948e10937640bcb768c0d0"} Oct 11 02:58:57 crc kubenswrapper[4743]: I1011 02:58:57.823312 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_b61fc8c0-014e-481a-b189-e554dced0696/aodh-api/0.log" Oct 11 02:58:57 crc kubenswrapper[4743]: I1011 02:58:57.877797 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_b61fc8c0-014e-481a-b189-e554dced0696/aodh-evaluator/0.log" Oct 11 02:58:58 crc kubenswrapper[4743]: I1011 02:58:58.037713 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_b61fc8c0-014e-481a-b189-e554dced0696/aodh-listener/0.log" Oct 11 02:58:58 crc 
kubenswrapper[4743]: I1011 02:58:58.077783 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4n4jn" event={"ID":"4da7ac2f-f2f8-49cd-866c-51b54daa1b51","Type":"ContainerStarted","Data":"7750aff9414ac20d78d4a99f790d1fd5d78b6acc8ea61a7dd37b1e515518e950"} Oct 11 02:58:58 crc kubenswrapper[4743]: I1011 02:58:58.103449 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4n4jn" podStartSLOduration=4.148441494 podStartE2EDuration="8.103424748s" podCreationTimestamp="2025-10-11 02:58:50 +0000 UTC" firstStartedPulling="2025-10-11 02:58:52.998211402 +0000 UTC m=+7627.651191799" lastFinishedPulling="2025-10-11 02:58:56.953194656 +0000 UTC m=+7631.606175053" observedRunningTime="2025-10-11 02:58:58.10068191 +0000 UTC m=+7632.753662307" watchObservedRunningTime="2025-10-11 02:58:58.103424748 +0000 UTC m=+7632.756405145" Oct 11 02:58:58 crc kubenswrapper[4743]: I1011 02:58:58.224063 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_b61fc8c0-014e-481a-b189-e554dced0696/aodh-notifier/0.log" Oct 11 02:58:58 crc kubenswrapper[4743]: I1011 02:58:58.248928 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-67c9948594-q58d2_9cd00887-e25b-4548-8084-5efab4f9cb27/barbican-api/0.log" Oct 11 02:58:58 crc kubenswrapper[4743]: I1011 02:58:58.351074 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-67c9948594-q58d2_9cd00887-e25b-4548-8084-5efab4f9cb27/barbican-api-log/0.log" Oct 11 02:58:58 crc kubenswrapper[4743]: I1011 02:58:58.466796 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-785cc87c98-slsn7_7f793c4b-6627-4c4b-9f2c-529641700221/barbican-keystone-listener/0.log" Oct 11 02:58:58 crc kubenswrapper[4743]: I1011 02:58:58.690038 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-worker-5f98bddd87-6kv6r_b8ff2f6e-e247-4ff9-9c07-8ba2bf4d5b1d/barbican-worker/0.log" Oct 11 02:58:58 crc kubenswrapper[4743]: I1011 02:58:58.696053 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-785cc87c98-slsn7_7f793c4b-6627-4c4b-9f2c-529641700221/barbican-keystone-listener-log/0.log" Oct 11 02:58:58 crc kubenswrapper[4743]: I1011 02:58:58.866986 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5f98bddd87-6kv6r_b8ff2f6e-e247-4ff9-9c07-8ba2bf4d5b1d/barbican-worker-log/0.log" Oct 11 02:58:58 crc kubenswrapper[4743]: I1011 02:58:58.998120 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-j85dc_5ce1ff59-69f9-466b-926d-4785eb4df84f/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Oct 11 02:58:59 crc kubenswrapper[4743]: I1011 02:58:59.091079 4743 generic.go:334] "Generic (PLEG): container finished" podID="1a8e1204-b12f-465f-9cdc-9828e813199c" containerID="938976c9f3608f30169b3bbf2aa8b9b14d943c382c948e10937640bcb768c0d0" exitCode=0 Oct 11 02:58:59 crc kubenswrapper[4743]: I1011 02:58:59.091117 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2dfxt" event={"ID":"1a8e1204-b12f-465f-9cdc-9828e813199c","Type":"ContainerDied","Data":"938976c9f3608f30169b3bbf2aa8b9b14d943c382c948e10937640bcb768c0d0"} Oct 11 02:58:59 crc kubenswrapper[4743]: I1011 02:58:59.272516 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0a98090b-ea2d-4b45-98ca-cdb8d619e42d/ceilometer-central-agent/0.log" Oct 11 02:58:59 crc kubenswrapper[4743]: I1011 02:58:59.388247 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0a98090b-ea2d-4b45-98ca-cdb8d619e42d/ceilometer-notification-agent/0.log" Oct 11 02:58:59 crc kubenswrapper[4743]: I1011 02:58:59.582891 4743 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0a98090b-ea2d-4b45-98ca-cdb8d619e42d/proxy-httpd/0.log" Oct 11 02:58:59 crc kubenswrapper[4743]: I1011 02:58:59.707743 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0a98090b-ea2d-4b45-98ca-cdb8d619e42d/sg-core/0.log" Oct 11 02:58:59 crc kubenswrapper[4743]: I1011 02:58:59.874309 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-edpm-deployment-openstack-edpm-ipam-vl6cd_2a7d527b-9f7c-40ec-8939-fbd2350a9ec3/ceph-client-edpm-deployment-openstack-edpm-ipam/0.log" Oct 11 02:59:00 crc kubenswrapper[4743]: I1011 02:59:00.044675 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8hnsg_3523070b-145c-4c82-9623-b4a9f2a32c11/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam/0.log" Oct 11 02:59:00 crc kubenswrapper[4743]: I1011 02:59:00.104039 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2dfxt" event={"ID":"1a8e1204-b12f-465f-9cdc-9828e813199c","Type":"ContainerStarted","Data":"3f5d8bf6044679b16527b8d03a96507dfadfa9d10e51a8ad8f32f5129807ad43"} Oct 11 02:59:00 crc kubenswrapper[4743]: I1011 02:59:00.306655 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_9f904889-28d2-4cfd-86ee-2e5841f9fc04/cinder-api-log/0.log" Oct 11 02:59:00 crc kubenswrapper[4743]: I1011 02:59:00.372109 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_9f904889-28d2-4cfd-86ee-2e5841f9fc04/cinder-api/0.log" Oct 11 02:59:00 crc kubenswrapper[4743]: I1011 02:59:00.656359 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_f333d397-070a-4624-8b2d-856964010b75/probe/0.log" Oct 11 02:59:00 crc kubenswrapper[4743]: I1011 02:59:00.704565 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cinder-backup-0_f333d397-070a-4624-8b2d-856964010b75/cinder-backup/0.log" Oct 11 02:59:00 crc kubenswrapper[4743]: I1011 02:59:00.943761 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_1570e831-5132-4e30-b791-6ac13faaeea4/cinder-scheduler/0.log" Oct 11 02:59:01 crc kubenswrapper[4743]: I1011 02:59:01.044037 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_1570e831-5132-4e30-b791-6ac13faaeea4/probe/0.log" Oct 11 02:59:01 crc kubenswrapper[4743]: I1011 02:59:01.128583 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4n4jn" Oct 11 02:59:01 crc kubenswrapper[4743]: I1011 02:59:01.128632 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4n4jn" Oct 11 02:59:01 crc kubenswrapper[4743]: I1011 02:59:01.185671 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_129685c1-9de5-4c18-9219-172fe359aa89/cinder-volume/0.log" Oct 11 02:59:01 crc kubenswrapper[4743]: I1011 02:59:01.235063 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_129685c1-9de5-4c18-9219-172fe359aa89/probe/0.log" Oct 11 02:59:01 crc kubenswrapper[4743]: I1011 02:59:01.405757 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-hc4vr_5205ba97-c5be-49b8-a4a6-2570d1b602d2/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 11 02:59:01 crc kubenswrapper[4743]: I1011 02:59:01.471110 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-dwkfr_43197ff3-1a5a-4c2f-a836-aa22d055d415/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 11 02:59:01 crc kubenswrapper[4743]: I1011 02:59:01.783245 4743 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-c8d8d886c-dsztl_1654f6a5-1abf-4c9e-b956-3bfc60c7077c/init/0.log" Oct 11 02:59:01 crc kubenswrapper[4743]: I1011 02:59:01.955613 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-c8d8d886c-dsztl_1654f6a5-1abf-4c9e-b956-3bfc60c7077c/init/0.log" Oct 11 02:59:01 crc kubenswrapper[4743]: I1011 02:59:01.998808 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-c8d8d886c-dsztl_1654f6a5-1abf-4c9e-b956-3bfc60c7077c/dnsmasq-dns/0.log" Oct 11 02:59:02 crc kubenswrapper[4743]: I1011 02:59:02.018586 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_2e1cee17-cf14-4bf2-bda0-2f651412f042/glance-httpd/0.log" Oct 11 02:59:02 crc kubenswrapper[4743]: I1011 02:59:02.179360 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_2e1cee17-cf14-4bf2-bda0-2f651412f042/glance-log/0.log" Oct 11 02:59:02 crc kubenswrapper[4743]: I1011 02:59:02.200977 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-4n4jn" podUID="4da7ac2f-f2f8-49cd-866c-51b54daa1b51" containerName="registry-server" probeResult="failure" output=< Oct 11 02:59:02 crc kubenswrapper[4743]: timeout: failed to connect service ":50051" within 1s Oct 11 02:59:02 crc kubenswrapper[4743]: > Oct 11 02:59:02 crc kubenswrapper[4743]: I1011 02:59:02.202668 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_d5ae3fcd-4a6d-42e5-9ab1-69aae9e8f964/glance-httpd/0.log" Oct 11 02:59:02 crc kubenswrapper[4743]: I1011 02:59:02.240000 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_d5ae3fcd-4a6d-42e5-9ab1-69aae9e8f964/glance-log/0.log" Oct 11 02:59:02 crc kubenswrapper[4743]: I1011 02:59:02.762722 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_heat-engine-57fbf8bd-mp4f8_ae21bf76-1584-4681-b679-29abbf1ef22a/heat-engine/0.log" Oct 11 02:59:03 crc kubenswrapper[4743]: I1011 02:59:03.198652 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-f46b79456-dm9d6_36f566d2-9c6b-4bc3-a1a3-47a11e6eee45/horizon/0.log" Oct 11 02:59:03 crc kubenswrapper[4743]: I1011 02:59:03.489003 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-7b7546fb69-4lvd7_db0cfd47-4287-4526-8e6b-0fd5bd770a1c/heat-cfnapi/0.log" Oct 11 02:59:03 crc kubenswrapper[4743]: I1011 02:59:03.513454 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2dfxt" Oct 11 02:59:03 crc kubenswrapper[4743]: I1011 02:59:03.513889 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2dfxt" Oct 11 02:59:03 crc kubenswrapper[4743]: I1011 02:59:03.543718 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-657c9dff6b-rgphg_de562f4f-80d8-407b-bf5a-9b584e013294/heat-api/0.log" Oct 11 02:59:03 crc kubenswrapper[4743]: I1011 02:59:03.688802 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-f46b79456-dm9d6_36f566d2-9c6b-4bc3-a1a3-47a11e6eee45/horizon-log/0.log" Oct 11 02:59:03 crc kubenswrapper[4743]: I1011 02:59:03.756422 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9_57fc6fbe-24cd-4185-a91e-dd39258e8d05/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Oct 11 02:59:03 crc kubenswrapper[4743]: I1011 02:59:03.760972 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-lj96c_09ec7d44-c723-4a16-a24e-d473280d1321/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 11 02:59:03 crc kubenswrapper[4743]: I1011 02:59:03.992141 4743 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29335801-cqb6t_90420843-1d2e-48e7-bec5-63cc4cd8557e/keystone-cron/0.log" Oct 11 02:59:04 crc kubenswrapper[4743]: I1011 02:59:04.225369 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_923b0fb7-1d93-491e-a1e0-73614b302fdb/kube-state-metrics/0.log" Oct 11 02:59:04 crc kubenswrapper[4743]: I1011 02:59:04.406596 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-5bd8997d9d-dpxn8_8f249d47-90a4-4fc9-8bb8-e61bc0143ae7/keystone-api/0.log" Oct 11 02:59:04 crc kubenswrapper[4743]: I1011 02:59:04.467139 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-nc8vt_6ea65353-b389-4222-8ff8-298d53283609/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Oct 11 02:59:04 crc kubenswrapper[4743]: I1011 02:59:04.498585 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_logging-edpm-deployment-openstack-edpm-ipam-bvnfw_abded9bf-eca7-43d5-bd5b-531d44751777/logging-edpm-deployment-openstack-edpm-ipam/0.log" Oct 11 02:59:04 crc kubenswrapper[4743]: I1011 02:59:04.565801 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-2dfxt" podUID="1a8e1204-b12f-465f-9cdc-9828e813199c" containerName="registry-server" probeResult="failure" output=< Oct 11 02:59:04 crc kubenswrapper[4743]: timeout: failed to connect service ":50051" within 1s Oct 11 02:59:04 crc kubenswrapper[4743]: > Oct 11 02:59:04 crc kubenswrapper[4743]: I1011 02:59:04.699172 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_7959191f-6ca5-4f63-84e9-815b7378c505/manila-api-log/0.log" Oct 11 02:59:04 crc kubenswrapper[4743]: I1011 02:59:04.825222 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_7959191f-6ca5-4f63-84e9-815b7378c505/manila-api/0.log" Oct 11 02:59:04 crc 
kubenswrapper[4743]: I1011 02:59:04.947263 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_00706963-dfb0-45d7-a0be-5875e1ae0a8f/probe/0.log" Oct 11 02:59:05 crc kubenswrapper[4743]: I1011 02:59:05.014972 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_00706963-dfb0-45d7-a0be-5875e1ae0a8f/manila-scheduler/0.log" Oct 11 02:59:05 crc kubenswrapper[4743]: I1011 02:59:05.100817 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_0c713aa6-9d10-4baa-855c-a05256d83be7/manila-share/0.log" Oct 11 02:59:05 crc kubenswrapper[4743]: I1011 02:59:05.150576 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_0c713aa6-9d10-4baa-855c-a05256d83be7/probe/0.log" Oct 11 02:59:05 crc kubenswrapper[4743]: I1011 02:59:05.380002 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mysqld-exporter-0_50798e93-52c7-4ee3-b94a-295fbcc7eeba/mysqld-exporter/0.log" Oct 11 02:59:05 crc kubenswrapper[4743]: I1011 02:59:05.826331 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5496cd5f5c-c9jx6_fdc6c654-6370-4cc4-99a1-c13dfd402b14/neutron-httpd/0.log" Oct 11 02:59:05 crc kubenswrapper[4743]: I1011 02:59:05.890002 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5496cd5f5c-c9jx6_fdc6c654-6370-4cc4-99a1-c13dfd402b14/neutron-api/0.log" Oct 11 02:59:06 crc kubenswrapper[4743]: I1011 02:59:06.056315 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx96l_d792f039-d865-44e3-9474-be444dee2d03/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Oct 11 02:59:06 crc kubenswrapper[4743]: I1011 02:59:06.748133 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell0-conductor-0_f2ddaae7-747a-4f05-bc0f-4f69fc15b816/nova-cell0-conductor-conductor/0.log" Oct 11 02:59:06 crc kubenswrapper[4743]: I1011 02:59:06.927108 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_41ea9288-7c98-4c3b-a903-76053391426e/nova-api-log/0.log" Oct 11 02:59:07 crc kubenswrapper[4743]: I1011 02:59:07.253878 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_6f8d6d52-4659-4bde-8eac-469d0008964d/nova-cell1-conductor-conductor/0.log" Oct 11 02:59:07 crc kubenswrapper[4743]: I1011 02:59:07.520950 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_41ea9288-7c98-4c3b-a903-76053391426e/nova-api-api/0.log" Oct 11 02:59:07 crc kubenswrapper[4743]: I1011 02:59:07.600056 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_d685eabb-511e-4604-9716-7676177726d6/nova-cell1-novncproxy-novncproxy/0.log" Oct 11 02:59:07 crc kubenswrapper[4743]: I1011 02:59:07.823398 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kt9ff_75f90fbf-75a7-4b2a-af1a-7693cebeaea3/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam/0.log" Oct 11 02:59:07 crc kubenswrapper[4743]: I1011 02:59:07.950723 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_10e4a6f0-05ef-4f39-96f6-1e44cd3753d4/nova-metadata-log/0.log" Oct 11 02:59:08 crc kubenswrapper[4743]: I1011 02:59:08.291522 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2dfxt" podStartSLOduration=10.698611605 podStartE2EDuration="15.291501178s" podCreationTimestamp="2025-10-11 02:58:53 +0000 UTC" firstStartedPulling="2025-10-11 02:58:55.02471598 +0000 UTC m=+7629.677696377" lastFinishedPulling="2025-10-11 02:58:59.617605553 +0000 UTC m=+7634.270585950" 
observedRunningTime="2025-10-11 02:59:00.134781367 +0000 UTC m=+7634.787761784" watchObservedRunningTime="2025-10-11 02:59:08.291501178 +0000 UTC m=+7642.944481575" Oct 11 02:59:08 crc kubenswrapper[4743]: I1011 02:59:08.305265 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-s4ztp"] Oct 11 02:59:08 crc kubenswrapper[4743]: I1011 02:59:08.307680 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_449c91f5-e998-4889-b148-30f334b03bc8/nova-scheduler-scheduler/0.log" Oct 11 02:59:08 crc kubenswrapper[4743]: I1011 02:59:08.308704 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-s4ztp" Oct 11 02:59:08 crc kubenswrapper[4743]: I1011 02:59:08.327187 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-s4ztp"] Oct 11 02:59:08 crc kubenswrapper[4743]: I1011 02:59:08.422798 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba03d73f-0a91-41e0-8042-ee5fa3c7f9d0-catalog-content\") pod \"redhat-operators-s4ztp\" (UID: \"ba03d73f-0a91-41e0-8042-ee5fa3c7f9d0\") " pod="openshift-marketplace/redhat-operators-s4ztp" Oct 11 02:59:08 crc kubenswrapper[4743]: I1011 02:59:08.422900 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txrwn\" (UniqueName: \"kubernetes.io/projected/ba03d73f-0a91-41e0-8042-ee5fa3c7f9d0-kube-api-access-txrwn\") pod \"redhat-operators-s4ztp\" (UID: \"ba03d73f-0a91-41e0-8042-ee5fa3c7f9d0\") " pod="openshift-marketplace/redhat-operators-s4ztp" Oct 11 02:59:08 crc kubenswrapper[4743]: I1011 02:59:08.423018 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/ba03d73f-0a91-41e0-8042-ee5fa3c7f9d0-utilities\") pod \"redhat-operators-s4ztp\" (UID: \"ba03d73f-0a91-41e0-8042-ee5fa3c7f9d0\") " pod="openshift-marketplace/redhat-operators-s4ztp" Oct 11 02:59:08 crc kubenswrapper[4743]: I1011 02:59:08.524849 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba03d73f-0a91-41e0-8042-ee5fa3c7f9d0-utilities\") pod \"redhat-operators-s4ztp\" (UID: \"ba03d73f-0a91-41e0-8042-ee5fa3c7f9d0\") " pod="openshift-marketplace/redhat-operators-s4ztp" Oct 11 02:59:08 crc kubenswrapper[4743]: I1011 02:59:08.525009 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba03d73f-0a91-41e0-8042-ee5fa3c7f9d0-catalog-content\") pod \"redhat-operators-s4ztp\" (UID: \"ba03d73f-0a91-41e0-8042-ee5fa3c7f9d0\") " pod="openshift-marketplace/redhat-operators-s4ztp" Oct 11 02:59:08 crc kubenswrapper[4743]: I1011 02:59:08.525039 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txrwn\" (UniqueName: \"kubernetes.io/projected/ba03d73f-0a91-41e0-8042-ee5fa3c7f9d0-kube-api-access-txrwn\") pod \"redhat-operators-s4ztp\" (UID: \"ba03d73f-0a91-41e0-8042-ee5fa3c7f9d0\") " pod="openshift-marketplace/redhat-operators-s4ztp" Oct 11 02:59:08 crc kubenswrapper[4743]: I1011 02:59:08.525877 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba03d73f-0a91-41e0-8042-ee5fa3c7f9d0-utilities\") pod \"redhat-operators-s4ztp\" (UID: \"ba03d73f-0a91-41e0-8042-ee5fa3c7f9d0\") " pod="openshift-marketplace/redhat-operators-s4ztp" Oct 11 02:59:08 crc kubenswrapper[4743]: I1011 02:59:08.527093 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/ba03d73f-0a91-41e0-8042-ee5fa3c7f9d0-catalog-content\") pod \"redhat-operators-s4ztp\" (UID: \"ba03d73f-0a91-41e0-8042-ee5fa3c7f9d0\") " pod="openshift-marketplace/redhat-operators-s4ztp" Oct 11 02:59:08 crc kubenswrapper[4743]: I1011 02:59:08.542159 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_f604069e-dff8-4f02-a5e8-d3ba38d87625/mysql-bootstrap/0.log" Oct 11 02:59:08 crc kubenswrapper[4743]: I1011 02:59:08.549317 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txrwn\" (UniqueName: \"kubernetes.io/projected/ba03d73f-0a91-41e0-8042-ee5fa3c7f9d0-kube-api-access-txrwn\") pod \"redhat-operators-s4ztp\" (UID: \"ba03d73f-0a91-41e0-8042-ee5fa3c7f9d0\") " pod="openshift-marketplace/redhat-operators-s4ztp" Oct 11 02:59:08 crc kubenswrapper[4743]: I1011 02:59:08.653139 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-s4ztp" Oct 11 02:59:08 crc kubenswrapper[4743]: I1011 02:59:08.735669 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_f604069e-dff8-4f02-a5e8-d3ba38d87625/mysql-bootstrap/0.log" Oct 11 02:59:08 crc kubenswrapper[4743]: I1011 02:59:08.803670 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_f604069e-dff8-4f02-a5e8-d3ba38d87625/galera/0.log" Oct 11 02:59:09 crc kubenswrapper[4743]: I1011 02:59:09.153285 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_32c52bf9-36b5-4a75-8991-e76f4dd87fb3/mysql-bootstrap/0.log" Oct 11 02:59:09 crc kubenswrapper[4743]: I1011 02:59:09.351768 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-s4ztp"] Oct 11 02:59:09 crc kubenswrapper[4743]: W1011 02:59:09.362520 4743 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba03d73f_0a91_41e0_8042_ee5fa3c7f9d0.slice/crio-13117f2179ee359e76f512d6bf397019e17a77e8e0b5582f2bb0578f4e91d485 WatchSource:0}: Error finding container 13117f2179ee359e76f512d6bf397019e17a77e8e0b5582f2bb0578f4e91d485: Status 404 returned error can't find the container with id 13117f2179ee359e76f512d6bf397019e17a77e8e0b5582f2bb0578f4e91d485 Oct 11 02:59:09 crc kubenswrapper[4743]: I1011 02:59:09.386543 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_32c52bf9-36b5-4a75-8991-e76f4dd87fb3/mysql-bootstrap/0.log" Oct 11 02:59:09 crc kubenswrapper[4743]: I1011 02:59:09.429778 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_32c52bf9-36b5-4a75-8991-e76f4dd87fb3/galera/0.log" Oct 11 02:59:09 crc kubenswrapper[4743]: I1011 02:59:09.699312 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_c02b1352-1ccf-4856-ad8e-328dab03135e/openstackclient/0.log" Oct 11 02:59:09 crc kubenswrapper[4743]: I1011 02:59:09.945897 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-fkcc9_145bdcc2-f62c-4dbf-bdb1-6ced4d65ba3a/openstack-network-exporter/0.log" Oct 11 02:59:10 crc kubenswrapper[4743]: I1011 02:59:10.170321 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-mwtxs_1ab33e99-afb5-4b67-89bd-a2eb540bf194/ovn-controller/0.log" Oct 11 02:59:10 crc kubenswrapper[4743]: I1011 02:59:10.199412 4743 generic.go:334] "Generic (PLEG): container finished" podID="ba03d73f-0a91-41e0-8042-ee5fa3c7f9d0" containerID="751150522a58a2a0c1464679283b3ed0a6b21980e9e2d83556afc270a1e50bca" exitCode=0 Oct 11 02:59:10 crc kubenswrapper[4743]: I1011 02:59:10.199456 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s4ztp" 
event={"ID":"ba03d73f-0a91-41e0-8042-ee5fa3c7f9d0","Type":"ContainerDied","Data":"751150522a58a2a0c1464679283b3ed0a6b21980e9e2d83556afc270a1e50bca"} Oct 11 02:59:10 crc kubenswrapper[4743]: I1011 02:59:10.199482 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s4ztp" event={"ID":"ba03d73f-0a91-41e0-8042-ee5fa3c7f9d0","Type":"ContainerStarted","Data":"13117f2179ee359e76f512d6bf397019e17a77e8e0b5582f2bb0578f4e91d485"} Oct 11 02:59:10 crc kubenswrapper[4743]: I1011 02:59:10.494643 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-6g6xb_ba2a34bc-8581-47ba-a9a9-5ba1ce8b6387/ovsdb-server-init/0.log" Oct 11 02:59:10 crc kubenswrapper[4743]: I1011 02:59:10.744570 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-6g6xb_ba2a34bc-8581-47ba-a9a9-5ba1ce8b6387/ovsdb-server-init/0.log" Oct 11 02:59:10 crc kubenswrapper[4743]: I1011 02:59:10.759305 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-6g6xb_ba2a34bc-8581-47ba-a9a9-5ba1ce8b6387/ovs-vswitchd/0.log" Oct 11 02:59:10 crc kubenswrapper[4743]: I1011 02:59:10.937377 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-6g6xb_ba2a34bc-8581-47ba-a9a9-5ba1ce8b6387/ovsdb-server/0.log" Oct 11 02:59:11 crc kubenswrapper[4743]: I1011 02:59:11.155368 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-w2vs8_00ade740-f798-4354-9e89-35aa325d8b92/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Oct 11 02:59:11 crc kubenswrapper[4743]: I1011 02:59:11.199030 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4n4jn" Oct 11 02:59:11 crc kubenswrapper[4743]: I1011 02:59:11.215155 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s4ztp" 
event={"ID":"ba03d73f-0a91-41e0-8042-ee5fa3c7f9d0","Type":"ContainerStarted","Data":"7eeffa298cf032952d74ffe29b42a956e1fc4ab609c7a03e52f76af87892c575"} Oct 11 02:59:11 crc kubenswrapper[4743]: I1011 02:59:11.274274 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4n4jn" Oct 11 02:59:11 crc kubenswrapper[4743]: I1011 02:59:11.341055 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_aae1bfd4-4ee9-4b40-bc1b-241275ef8097/openstack-network-exporter/0.log" Oct 11 02:59:11 crc kubenswrapper[4743]: I1011 02:59:11.428612 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_aae1bfd4-4ee9-4b40-bc1b-241275ef8097/ovn-northd/0.log" Oct 11 02:59:11 crc kubenswrapper[4743]: I1011 02:59:11.659497 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_7c8284ba-a2b2-4f9f-a692-b372e8294d6b/openstack-network-exporter/0.log" Oct 11 02:59:11 crc kubenswrapper[4743]: I1011 02:59:11.687207 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_10e4a6f0-05ef-4f39-96f6-1e44cd3753d4/nova-metadata-metadata/0.log" Oct 11 02:59:11 crc kubenswrapper[4743]: I1011 02:59:11.798963 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_7c8284ba-a2b2-4f9f-a692-b372e8294d6b/ovsdbserver-nb/0.log" Oct 11 02:59:11 crc kubenswrapper[4743]: I1011 02:59:11.982658 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_74c19249-ff95-4e49-96bb-1135e7aa1b08/openstack-network-exporter/0.log" Oct 11 02:59:12 crc kubenswrapper[4743]: I1011 02:59:12.130611 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_74c19249-ff95-4e49-96bb-1135e7aa1b08/ovsdbserver-sb/0.log" Oct 11 02:59:12 crc kubenswrapper[4743]: I1011 02:59:12.430250 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_placement-979bff964-bxbgb_8f846c8e-d28a-4a2e-a5b6-bfc739de275b/placement-api/0.log" Oct 11 02:59:12 crc kubenswrapper[4743]: I1011 02:59:12.561668 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-979bff964-bxbgb_8f846c8e-d28a-4a2e-a5b6-bfc739de275b/placement-log/0.log" Oct 11 02:59:12 crc kubenswrapper[4743]: I1011 02:59:12.622188 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_eed36ee9-8239-4139-97f3-0e7b2962f45b/init-config-reloader/0.log" Oct 11 02:59:12 crc kubenswrapper[4743]: I1011 02:59:12.856162 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_eed36ee9-8239-4139-97f3-0e7b2962f45b/init-config-reloader/0.log" Oct 11 02:59:12 crc kubenswrapper[4743]: I1011 02:59:12.918642 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_eed36ee9-8239-4139-97f3-0e7b2962f45b/config-reloader/0.log" Oct 11 02:59:12 crc kubenswrapper[4743]: I1011 02:59:12.960100 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_eed36ee9-8239-4139-97f3-0e7b2962f45b/prometheus/0.log" Oct 11 02:59:13 crc kubenswrapper[4743]: I1011 02:59:13.160463 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_eed36ee9-8239-4139-97f3-0e7b2962f45b/thanos-sidecar/0.log" Oct 11 02:59:13 crc kubenswrapper[4743]: I1011 02:59:13.216405 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_de84c29c-4168-4383-aadc-0d5cc0ba56f8/setup-container/0.log" Oct 11 02:59:13 crc kubenswrapper[4743]: I1011 02:59:13.430119 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_de84c29c-4168-4383-aadc-0d5cc0ba56f8/setup-container/0.log" Oct 11 02:59:13 crc kubenswrapper[4743]: I1011 02:59:13.458392 4743 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_de84c29c-4168-4383-aadc-0d5cc0ba56f8/rabbitmq/0.log" Oct 11 02:59:13 crc kubenswrapper[4743]: I1011 02:59:13.473240 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4n4jn"] Oct 11 02:59:13 crc kubenswrapper[4743]: I1011 02:59:13.473454 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4n4jn" podUID="4da7ac2f-f2f8-49cd-866c-51b54daa1b51" containerName="registry-server" containerID="cri-o://7750aff9414ac20d78d4a99f790d1fd5d78b6acc8ea61a7dd37b1e515518e950" gracePeriod=2 Oct 11 02:59:13 crc kubenswrapper[4743]: I1011 02:59:13.634332 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2dfxt" Oct 11 02:59:13 crc kubenswrapper[4743]: I1011 02:59:13.697733 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_38225901-8300-41cc-8e32-748b754660dc/setup-container/0.log" Oct 11 02:59:13 crc kubenswrapper[4743]: I1011 02:59:13.700219 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2dfxt" Oct 11 02:59:13 crc kubenswrapper[4743]: I1011 02:59:13.955572 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_38225901-8300-41cc-8e32-748b754660dc/setup-container/0.log" Oct 11 02:59:13 crc kubenswrapper[4743]: I1011 02:59:13.975680 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_38225901-8300-41cc-8e32-748b754660dc/rabbitmq/0.log" Oct 11 02:59:14 crc kubenswrapper[4743]: I1011 02:59:14.103904 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4n4jn" Oct 11 02:59:14 crc kubenswrapper[4743]: I1011 02:59:14.171363 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5vf4\" (UniqueName: \"kubernetes.io/projected/4da7ac2f-f2f8-49cd-866c-51b54daa1b51-kube-api-access-v5vf4\") pod \"4da7ac2f-f2f8-49cd-866c-51b54daa1b51\" (UID: \"4da7ac2f-f2f8-49cd-866c-51b54daa1b51\") " Oct 11 02:59:14 crc kubenswrapper[4743]: I1011 02:59:14.171467 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4da7ac2f-f2f8-49cd-866c-51b54daa1b51-catalog-content\") pod \"4da7ac2f-f2f8-49cd-866c-51b54daa1b51\" (UID: \"4da7ac2f-f2f8-49cd-866c-51b54daa1b51\") " Oct 11 02:59:14 crc kubenswrapper[4743]: I1011 02:59:14.171698 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4da7ac2f-f2f8-49cd-866c-51b54daa1b51-utilities\") pod \"4da7ac2f-f2f8-49cd-866c-51b54daa1b51\" (UID: \"4da7ac2f-f2f8-49cd-866c-51b54daa1b51\") " Oct 11 02:59:14 crc kubenswrapper[4743]: I1011 02:59:14.176620 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4da7ac2f-f2f8-49cd-866c-51b54daa1b51-utilities" (OuterVolumeSpecName: "utilities") pod "4da7ac2f-f2f8-49cd-866c-51b54daa1b51" (UID: "4da7ac2f-f2f8-49cd-866c-51b54daa1b51"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 02:59:14 crc kubenswrapper[4743]: I1011 02:59:14.220752 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4da7ac2f-f2f8-49cd-866c-51b54daa1b51-kube-api-access-v5vf4" (OuterVolumeSpecName: "kube-api-access-v5vf4") pod "4da7ac2f-f2f8-49cd-866c-51b54daa1b51" (UID: "4da7ac2f-f2f8-49cd-866c-51b54daa1b51"). InnerVolumeSpecName "kube-api-access-v5vf4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 02:59:14 crc kubenswrapper[4743]: I1011 02:59:14.222913 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-m57rp_dc1bb7f9-85e4-4c20-b91b-5dc32f87c1c4/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 11 02:59:14 crc kubenswrapper[4743]: I1011 02:59:14.272225 4743 generic.go:334] "Generic (PLEG): container finished" podID="4da7ac2f-f2f8-49cd-866c-51b54daa1b51" containerID="7750aff9414ac20d78d4a99f790d1fd5d78b6acc8ea61a7dd37b1e515518e950" exitCode=0 Oct 11 02:59:14 crc kubenswrapper[4743]: I1011 02:59:14.274032 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4n4jn" Oct 11 02:59:14 crc kubenswrapper[4743]: I1011 02:59:14.274843 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4n4jn" event={"ID":"4da7ac2f-f2f8-49cd-866c-51b54daa1b51","Type":"ContainerDied","Data":"7750aff9414ac20d78d4a99f790d1fd5d78b6acc8ea61a7dd37b1e515518e950"} Oct 11 02:59:14 crc kubenswrapper[4743]: I1011 02:59:14.274884 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4n4jn" event={"ID":"4da7ac2f-f2f8-49cd-866c-51b54daa1b51","Type":"ContainerDied","Data":"2d8e824a02763f5b9632aeeaff0ebe7326623b5a3b2f00c697c7983eed08a32e"} Oct 11 02:59:14 crc kubenswrapper[4743]: I1011 02:59:14.275048 4743 scope.go:117] "RemoveContainer" containerID="7750aff9414ac20d78d4a99f790d1fd5d78b6acc8ea61a7dd37b1e515518e950" Oct 11 02:59:14 crc kubenswrapper[4743]: I1011 02:59:14.330495 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4da7ac2f-f2f8-49cd-866c-51b54daa1b51-utilities\") on node \"crc\" DevicePath \"\"" Oct 11 02:59:14 crc kubenswrapper[4743]: I1011 02:59:14.331940 4743 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-v5vf4\" (UniqueName: \"kubernetes.io/projected/4da7ac2f-f2f8-49cd-866c-51b54daa1b51-kube-api-access-v5vf4\") on node \"crc\" DevicePath \"\"" Oct 11 02:59:14 crc kubenswrapper[4743]: I1011 02:59:14.357328 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-2fx2x_b9069bf9-41de-4faf-ad86-3913be33cb1a/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Oct 11 02:59:14 crc kubenswrapper[4743]: I1011 02:59:14.440443 4743 scope.go:117] "RemoveContainer" containerID="e35b840616968a95337b28c0779631356ae7d92e26be43681424613467107bac" Oct 11 02:59:14 crc kubenswrapper[4743]: I1011 02:59:14.759833 4743 scope.go:117] "RemoveContainer" containerID="b90265755b866b9a0ddd74035e1780377d27807b4bbe81ab0e6ae61201f56b18" Oct 11 02:59:14 crc kubenswrapper[4743]: I1011 02:59:14.944801 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-lrpfr_08cb63b9-9798-4e9f-9df8-7a1676dbe1f8/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 11 02:59:14 crc kubenswrapper[4743]: I1011 02:59:14.957136 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-7kvlt_45eba0ce-a54b-4530-a391-35572fb868aa/ssh-known-hosts-edpm-deployment/0.log" Oct 11 02:59:15 crc kubenswrapper[4743]: I1011 02:59:15.001135 4743 scope.go:117] "RemoveContainer" containerID="7750aff9414ac20d78d4a99f790d1fd5d78b6acc8ea61a7dd37b1e515518e950" Oct 11 02:59:15 crc kubenswrapper[4743]: E1011 02:59:15.003429 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7750aff9414ac20d78d4a99f790d1fd5d78b6acc8ea61a7dd37b1e515518e950\": container with ID starting with 7750aff9414ac20d78d4a99f790d1fd5d78b6acc8ea61a7dd37b1e515518e950 not found: ID does not exist" containerID="7750aff9414ac20d78d4a99f790d1fd5d78b6acc8ea61a7dd37b1e515518e950" Oct 11 02:59:15 crc 
kubenswrapper[4743]: I1011 02:59:15.003486 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7750aff9414ac20d78d4a99f790d1fd5d78b6acc8ea61a7dd37b1e515518e950"} err="failed to get container status \"7750aff9414ac20d78d4a99f790d1fd5d78b6acc8ea61a7dd37b1e515518e950\": rpc error: code = NotFound desc = could not find container \"7750aff9414ac20d78d4a99f790d1fd5d78b6acc8ea61a7dd37b1e515518e950\": container with ID starting with 7750aff9414ac20d78d4a99f790d1fd5d78b6acc8ea61a7dd37b1e515518e950 not found: ID does not exist" Oct 11 02:59:15 crc kubenswrapper[4743]: I1011 02:59:15.003517 4743 scope.go:117] "RemoveContainer" containerID="e35b840616968a95337b28c0779631356ae7d92e26be43681424613467107bac" Oct 11 02:59:15 crc kubenswrapper[4743]: E1011 02:59:15.003913 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e35b840616968a95337b28c0779631356ae7d92e26be43681424613467107bac\": container with ID starting with e35b840616968a95337b28c0779631356ae7d92e26be43681424613467107bac not found: ID does not exist" containerID="e35b840616968a95337b28c0779631356ae7d92e26be43681424613467107bac" Oct 11 02:59:15 crc kubenswrapper[4743]: I1011 02:59:15.003943 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e35b840616968a95337b28c0779631356ae7d92e26be43681424613467107bac"} err="failed to get container status \"e35b840616968a95337b28c0779631356ae7d92e26be43681424613467107bac\": rpc error: code = NotFound desc = could not find container \"e35b840616968a95337b28c0779631356ae7d92e26be43681424613467107bac\": container with ID starting with e35b840616968a95337b28c0779631356ae7d92e26be43681424613467107bac not found: ID does not exist" Oct 11 02:59:15 crc kubenswrapper[4743]: I1011 02:59:15.003963 4743 scope.go:117] "RemoveContainer" containerID="b90265755b866b9a0ddd74035e1780377d27807b4bbe81ab0e6ae61201f56b18" Oct 11 
02:59:15 crc kubenswrapper[4743]: E1011 02:59:15.007012 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b90265755b866b9a0ddd74035e1780377d27807b4bbe81ab0e6ae61201f56b18\": container with ID starting with b90265755b866b9a0ddd74035e1780377d27807b4bbe81ab0e6ae61201f56b18 not found: ID does not exist" containerID="b90265755b866b9a0ddd74035e1780377d27807b4bbe81ab0e6ae61201f56b18" Oct 11 02:59:15 crc kubenswrapper[4743]: I1011 02:59:15.007050 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b90265755b866b9a0ddd74035e1780377d27807b4bbe81ab0e6ae61201f56b18"} err="failed to get container status \"b90265755b866b9a0ddd74035e1780377d27807b4bbe81ab0e6ae61201f56b18\": rpc error: code = NotFound desc = could not find container \"b90265755b866b9a0ddd74035e1780377d27807b4bbe81ab0e6ae61201f56b18\": container with ID starting with b90265755b866b9a0ddd74035e1780377d27807b4bbe81ab0e6ae61201f56b18 not found: ID does not exist" Oct 11 02:59:15 crc kubenswrapper[4743]: I1011 02:59:15.062224 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4da7ac2f-f2f8-49cd-866c-51b54daa1b51-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4da7ac2f-f2f8-49cd-866c-51b54daa1b51" (UID: "4da7ac2f-f2f8-49cd-866c-51b54daa1b51"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 02:59:15 crc kubenswrapper[4743]: I1011 02:59:15.149964 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4da7ac2f-f2f8-49cd-866c-51b54daa1b51-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 11 02:59:15 crc kubenswrapper[4743]: I1011 02:59:15.163178 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-744b8cd687-p7lgl_68219217-d875-4eb2-9611-b9afb0f64c45/proxy-server/0.log" Oct 11 02:59:15 crc kubenswrapper[4743]: I1011 02:59:15.210850 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4n4jn"] Oct 11 02:59:15 crc kubenswrapper[4743]: I1011 02:59:15.225898 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4n4jn"] Oct 11 02:59:15 crc kubenswrapper[4743]: I1011 02:59:15.400810 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-8qvc2_5718aabd-82b4-4079-96f4-d241fb2c8efc/swift-ring-rebalance/0.log" Oct 11 02:59:15 crc kubenswrapper[4743]: I1011 02:59:15.462680 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-744b8cd687-p7lgl_68219217-d875-4eb2-9611-b9afb0f64c45/proxy-httpd/0.log" Oct 11 02:59:15 crc kubenswrapper[4743]: I1011 02:59:15.635047 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_082aa898-adc9-4e0d-a5e3-329d36f391aa/account-auditor/0.log" Oct 11 02:59:15 crc kubenswrapper[4743]: I1011 02:59:15.701817 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_082aa898-adc9-4e0d-a5e3-329d36f391aa/account-reaper/0.log" Oct 11 02:59:15 crc kubenswrapper[4743]: I1011 02:59:15.873202 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_082aa898-adc9-4e0d-a5e3-329d36f391aa/account-server/0.log" Oct 11 02:59:15 
crc kubenswrapper[4743]: I1011 02:59:15.907848 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_082aa898-adc9-4e0d-a5e3-329d36f391aa/account-replicator/0.log" Oct 11 02:59:15 crc kubenswrapper[4743]: I1011 02:59:15.953755 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_082aa898-adc9-4e0d-a5e3-329d36f391aa/container-auditor/0.log" Oct 11 02:59:16 crc kubenswrapper[4743]: I1011 02:59:16.106758 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4da7ac2f-f2f8-49cd-866c-51b54daa1b51" path="/var/lib/kubelet/pods/4da7ac2f-f2f8-49cd-866c-51b54daa1b51/volumes" Oct 11 02:59:16 crc kubenswrapper[4743]: I1011 02:59:16.111071 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2dfxt"] Oct 11 02:59:16 crc kubenswrapper[4743]: I1011 02:59:16.111287 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2dfxt" podUID="1a8e1204-b12f-465f-9cdc-9828e813199c" containerName="registry-server" containerID="cri-o://3f5d8bf6044679b16527b8d03a96507dfadfa9d10e51a8ad8f32f5129807ad43" gracePeriod=2 Oct 11 02:59:16 crc kubenswrapper[4743]: I1011 02:59:16.151119 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_082aa898-adc9-4e0d-a5e3-329d36f391aa/container-updater/0.log" Oct 11 02:59:16 crc kubenswrapper[4743]: I1011 02:59:16.191311 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_082aa898-adc9-4e0d-a5e3-329d36f391aa/container-server/0.log" Oct 11 02:59:16 crc kubenswrapper[4743]: I1011 02:59:16.243116 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_082aa898-adc9-4e0d-a5e3-329d36f391aa/container-replicator/0.log" Oct 11 02:59:16 crc kubenswrapper[4743]: I1011 02:59:16.299848 4743 generic.go:334] "Generic (PLEG): container finished" 
podID="ba03d73f-0a91-41e0-8042-ee5fa3c7f9d0" containerID="7eeffa298cf032952d74ffe29b42a956e1fc4ab609c7a03e52f76af87892c575" exitCode=0 Oct 11 02:59:16 crc kubenswrapper[4743]: I1011 02:59:16.299919 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s4ztp" event={"ID":"ba03d73f-0a91-41e0-8042-ee5fa3c7f9d0","Type":"ContainerDied","Data":"7eeffa298cf032952d74ffe29b42a956e1fc4ab609c7a03e52f76af87892c575"} Oct 11 02:59:16 crc kubenswrapper[4743]: I1011 02:59:16.430586 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_082aa898-adc9-4e0d-a5e3-329d36f391aa/object-expirer/0.log" Oct 11 02:59:16 crc kubenswrapper[4743]: I1011 02:59:16.474810 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_082aa898-adc9-4e0d-a5e3-329d36f391aa/object-auditor/0.log" Oct 11 02:59:16 crc kubenswrapper[4743]: I1011 02:59:16.486365 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_082aa898-adc9-4e0d-a5e3-329d36f391aa/object-replicator/0.log" Oct 11 02:59:16 crc kubenswrapper[4743]: I1011 02:59:16.638052 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_082aa898-adc9-4e0d-a5e3-329d36f391aa/object-server/0.log" Oct 11 02:59:16 crc kubenswrapper[4743]: I1011 02:59:16.711099 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_082aa898-adc9-4e0d-a5e3-329d36f391aa/object-updater/0.log" Oct 11 02:59:16 crc kubenswrapper[4743]: I1011 02:59:16.711549 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_082aa898-adc9-4e0d-a5e3-329d36f391aa/rsync/0.log" Oct 11 02:59:16 crc kubenswrapper[4743]: I1011 02:59:16.835002 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_082aa898-adc9-4e0d-a5e3-329d36f391aa/swift-recon-cron/0.log" Oct 11 02:59:17 crc kubenswrapper[4743]: I1011 02:59:17.089142 4743 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-4tnjd_7585335b-a755-40a1-b388-d90e2fa07121/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Oct 11 02:59:17 crc kubenswrapper[4743]: I1011 02:59:17.269565 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2dfxt" Oct 11 02:59:17 crc kubenswrapper[4743]: I1011 02:59:17.319690 4743 generic.go:334] "Generic (PLEG): container finished" podID="1a8e1204-b12f-465f-9cdc-9828e813199c" containerID="3f5d8bf6044679b16527b8d03a96507dfadfa9d10e51a8ad8f32f5129807ad43" exitCode=0 Oct 11 02:59:17 crc kubenswrapper[4743]: I1011 02:59:17.319884 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2dfxt" Oct 11 02:59:17 crc kubenswrapper[4743]: I1011 02:59:17.321165 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2dfxt" event={"ID":"1a8e1204-b12f-465f-9cdc-9828e813199c","Type":"ContainerDied","Data":"3f5d8bf6044679b16527b8d03a96507dfadfa9d10e51a8ad8f32f5129807ad43"} Oct 11 02:59:17 crc kubenswrapper[4743]: I1011 02:59:17.332779 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2dfxt" event={"ID":"1a8e1204-b12f-465f-9cdc-9828e813199c","Type":"ContainerDied","Data":"8e238d59cf20984de6c6961d67f1456147f4afba1701c088c35299dca2d98fe1"} Oct 11 02:59:17 crc kubenswrapper[4743]: I1011 02:59:17.333011 4743 scope.go:117] "RemoveContainer" containerID="3f5d8bf6044679b16527b8d03a96507dfadfa9d10e51a8ad8f32f5129807ad43" Oct 11 02:59:17 crc kubenswrapper[4743]: I1011 02:59:17.345969 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s4ztp" 
event={"ID":"ba03d73f-0a91-41e0-8042-ee5fa3c7f9d0","Type":"ContainerStarted","Data":"b0898e70662827dad40a2b7d76dcaaf1dab4a445e592c67d39f072a17b090b1f"} Oct 11 02:59:17 crc kubenswrapper[4743]: I1011 02:59:17.401310 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-s4ztp" podStartSLOduration=2.798639674 podStartE2EDuration="9.40129357s" podCreationTimestamp="2025-10-11 02:59:08 +0000 UTC" firstStartedPulling="2025-10-11 02:59:10.201664189 +0000 UTC m=+7644.854644596" lastFinishedPulling="2025-10-11 02:59:16.804318095 +0000 UTC m=+7651.457298492" observedRunningTime="2025-10-11 02:59:17.400300595 +0000 UTC m=+7652.053280992" watchObservedRunningTime="2025-10-11 02:59:17.40129357 +0000 UTC m=+7652.054273967" Oct 11 02:59:17 crc kubenswrapper[4743]: I1011 02:59:17.404742 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a8e1204-b12f-465f-9cdc-9828e813199c-utilities\") pod \"1a8e1204-b12f-465f-9cdc-9828e813199c\" (UID: \"1a8e1204-b12f-465f-9cdc-9828e813199c\") " Oct 11 02:59:17 crc kubenswrapper[4743]: I1011 02:59:17.405001 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ffdqf\" (UniqueName: \"kubernetes.io/projected/1a8e1204-b12f-465f-9cdc-9828e813199c-kube-api-access-ffdqf\") pod \"1a8e1204-b12f-465f-9cdc-9828e813199c\" (UID: \"1a8e1204-b12f-465f-9cdc-9828e813199c\") " Oct 11 02:59:17 crc kubenswrapper[4743]: I1011 02:59:17.405133 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a8e1204-b12f-465f-9cdc-9828e813199c-catalog-content\") pod \"1a8e1204-b12f-465f-9cdc-9828e813199c\" (UID: \"1a8e1204-b12f-465f-9cdc-9828e813199c\") " Oct 11 02:59:17 crc kubenswrapper[4743]: I1011 02:59:17.408447 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/1a8e1204-b12f-465f-9cdc-9828e813199c-utilities" (OuterVolumeSpecName: "utilities") pod "1a8e1204-b12f-465f-9cdc-9828e813199c" (UID: "1a8e1204-b12f-465f-9cdc-9828e813199c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 02:59:17 crc kubenswrapper[4743]: I1011 02:59:17.411148 4743 scope.go:117] "RemoveContainer" containerID="938976c9f3608f30169b3bbf2aa8b9b14d943c382c948e10937640bcb768c0d0" Oct 11 02:59:17 crc kubenswrapper[4743]: I1011 02:59:17.428650 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a8e1204-b12f-465f-9cdc-9828e813199c-kube-api-access-ffdqf" (OuterVolumeSpecName: "kube-api-access-ffdqf") pod "1a8e1204-b12f-465f-9cdc-9828e813199c" (UID: "1a8e1204-b12f-465f-9cdc-9828e813199c"). InnerVolumeSpecName "kube-api-access-ffdqf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 02:59:17 crc kubenswrapper[4743]: I1011 02:59:17.435766 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-power-monitoring-edpm-deployment-openstack-edpm-bwgfg_f10a464d-943b-4c74-88f8-7d76dbdac358/telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam/0.log" Oct 11 02:59:17 crc kubenswrapper[4743]: I1011 02:59:17.485166 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a8e1204-b12f-465f-9cdc-9828e813199c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1a8e1204-b12f-465f-9cdc-9828e813199c" (UID: "1a8e1204-b12f-465f-9cdc-9828e813199c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 02:59:17 crc kubenswrapper[4743]: I1011 02:59:17.500956 4743 scope.go:117] "RemoveContainer" containerID="712562a79758daa737ff9682f5fc2d748697e60ce3862892bf7b5530ac0feb94" Oct 11 02:59:17 crc kubenswrapper[4743]: I1011 02:59:17.507747 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a8e1204-b12f-465f-9cdc-9828e813199c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 11 02:59:17 crc kubenswrapper[4743]: I1011 02:59:17.507778 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a8e1204-b12f-465f-9cdc-9828e813199c-utilities\") on node \"crc\" DevicePath \"\"" Oct 11 02:59:17 crc kubenswrapper[4743]: I1011 02:59:17.507787 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ffdqf\" (UniqueName: \"kubernetes.io/projected/1a8e1204-b12f-465f-9cdc-9828e813199c-kube-api-access-ffdqf\") on node \"crc\" DevicePath \"\"" Oct 11 02:59:17 crc kubenswrapper[4743]: I1011 02:59:17.549649 4743 scope.go:117] "RemoveContainer" containerID="3f5d8bf6044679b16527b8d03a96507dfadfa9d10e51a8ad8f32f5129807ad43" Oct 11 02:59:17 crc kubenswrapper[4743]: E1011 02:59:17.550148 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f5d8bf6044679b16527b8d03a96507dfadfa9d10e51a8ad8f32f5129807ad43\": container with ID starting with 3f5d8bf6044679b16527b8d03a96507dfadfa9d10e51a8ad8f32f5129807ad43 not found: ID does not exist" containerID="3f5d8bf6044679b16527b8d03a96507dfadfa9d10e51a8ad8f32f5129807ad43" Oct 11 02:59:17 crc kubenswrapper[4743]: I1011 02:59:17.550190 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f5d8bf6044679b16527b8d03a96507dfadfa9d10e51a8ad8f32f5129807ad43"} err="failed to get container status 
\"3f5d8bf6044679b16527b8d03a96507dfadfa9d10e51a8ad8f32f5129807ad43\": rpc error: code = NotFound desc = could not find container \"3f5d8bf6044679b16527b8d03a96507dfadfa9d10e51a8ad8f32f5129807ad43\": container with ID starting with 3f5d8bf6044679b16527b8d03a96507dfadfa9d10e51a8ad8f32f5129807ad43 not found: ID does not exist" Oct 11 02:59:17 crc kubenswrapper[4743]: I1011 02:59:17.550215 4743 scope.go:117] "RemoveContainer" containerID="938976c9f3608f30169b3bbf2aa8b9b14d943c382c948e10937640bcb768c0d0" Oct 11 02:59:17 crc kubenswrapper[4743]: E1011 02:59:17.550583 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"938976c9f3608f30169b3bbf2aa8b9b14d943c382c948e10937640bcb768c0d0\": container with ID starting with 938976c9f3608f30169b3bbf2aa8b9b14d943c382c948e10937640bcb768c0d0 not found: ID does not exist" containerID="938976c9f3608f30169b3bbf2aa8b9b14d943c382c948e10937640bcb768c0d0" Oct 11 02:59:17 crc kubenswrapper[4743]: I1011 02:59:17.550627 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"938976c9f3608f30169b3bbf2aa8b9b14d943c382c948e10937640bcb768c0d0"} err="failed to get container status \"938976c9f3608f30169b3bbf2aa8b9b14d943c382c948e10937640bcb768c0d0\": rpc error: code = NotFound desc = could not find container \"938976c9f3608f30169b3bbf2aa8b9b14d943c382c948e10937640bcb768c0d0\": container with ID starting with 938976c9f3608f30169b3bbf2aa8b9b14d943c382c948e10937640bcb768c0d0 not found: ID does not exist" Oct 11 02:59:17 crc kubenswrapper[4743]: I1011 02:59:17.550655 4743 scope.go:117] "RemoveContainer" containerID="712562a79758daa737ff9682f5fc2d748697e60ce3862892bf7b5530ac0feb94" Oct 11 02:59:17 crc kubenswrapper[4743]: E1011 02:59:17.551137 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"712562a79758daa737ff9682f5fc2d748697e60ce3862892bf7b5530ac0feb94\": container with ID starting with 712562a79758daa737ff9682f5fc2d748697e60ce3862892bf7b5530ac0feb94 not found: ID does not exist" containerID="712562a79758daa737ff9682f5fc2d748697e60ce3862892bf7b5530ac0feb94" Oct 11 02:59:17 crc kubenswrapper[4743]: I1011 02:59:17.551173 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"712562a79758daa737ff9682f5fc2d748697e60ce3862892bf7b5530ac0feb94"} err="failed to get container status \"712562a79758daa737ff9682f5fc2d748697e60ce3862892bf7b5530ac0feb94\": rpc error: code = NotFound desc = could not find container \"712562a79758daa737ff9682f5fc2d748697e60ce3862892bf7b5530ac0feb94\": container with ID starting with 712562a79758daa737ff9682f5fc2d748697e60ce3862892bf7b5530ac0feb94 not found: ID does not exist" Oct 11 02:59:17 crc kubenswrapper[4743]: I1011 02:59:17.693940 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2dfxt"] Oct 11 02:59:17 crc kubenswrapper[4743]: I1011 02:59:17.722375 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2dfxt"] Oct 11 02:59:17 crc kubenswrapper[4743]: I1011 02:59:17.953543 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_795afd2c-7ccb-435d-8cc6-6ef474ddf6e1/test-operator-logs-container/0.log" Oct 11 02:59:18 crc kubenswrapper[4743]: I1011 02:59:18.103513 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a8e1204-b12f-465f-9cdc-9828e813199c" path="/var/lib/kubelet/pods/1a8e1204-b12f-465f-9cdc-9828e813199c/volumes" Oct 11 02:59:18 crc kubenswrapper[4743]: I1011 02:59:18.364298 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-x6f6h_cb69ff06-2c84-40a1-805b-349c4fbfe3ba/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 11 02:59:18 crc kubenswrapper[4743]: I1011 02:59:18.477878 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_59e812a1-677a-4aca-bb9a-c4f0d166710a/tempest-tests-tempest-tests-runner/0.log" Oct 11 02:59:18 crc kubenswrapper[4743]: I1011 02:59:18.660736 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-s4ztp" Oct 11 02:59:18 crc kubenswrapper[4743]: I1011 02:59:18.662132 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-s4ztp" Oct 11 02:59:19 crc kubenswrapper[4743]: I1011 02:59:19.712436 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-s4ztp" podUID="ba03d73f-0a91-41e0-8042-ee5fa3c7f9d0" containerName="registry-server" probeResult="failure" output=< Oct 11 02:59:19 crc kubenswrapper[4743]: timeout: failed to connect service ":50051" within 1s Oct 11 02:59:19 crc kubenswrapper[4743]: > Oct 11 02:59:25 crc kubenswrapper[4743]: I1011 02:59:25.750675 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_f1159224-8c5f-43ae-8aa3-ca628c69914e/memcached/0.log" Oct 11 02:59:29 crc kubenswrapper[4743]: I1011 02:59:29.738110 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-s4ztp" podUID="ba03d73f-0a91-41e0-8042-ee5fa3c7f9d0" containerName="registry-server" probeResult="failure" output=< Oct 11 02:59:29 crc kubenswrapper[4743]: timeout: failed to connect service ":50051" within 1s Oct 11 02:59:29 crc kubenswrapper[4743]: > Oct 11 02:59:38 crc kubenswrapper[4743]: I1011 02:59:38.705820 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-operators-s4ztp" Oct 11 02:59:38 crc kubenswrapper[4743]: I1011 02:59:38.756720 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-s4ztp" Oct 11 02:59:39 crc kubenswrapper[4743]: I1011 02:59:39.506547 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-s4ztp"] Oct 11 02:59:40 crc kubenswrapper[4743]: I1011 02:59:40.597527 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-s4ztp" podUID="ba03d73f-0a91-41e0-8042-ee5fa3c7f9d0" containerName="registry-server" containerID="cri-o://b0898e70662827dad40a2b7d76dcaaf1dab4a445e592c67d39f072a17b090b1f" gracePeriod=2 Oct 11 02:59:41 crc kubenswrapper[4743]: I1011 02:59:41.165869 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-s4ztp" Oct 11 02:59:41 crc kubenswrapper[4743]: I1011 02:59:41.257307 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txrwn\" (UniqueName: \"kubernetes.io/projected/ba03d73f-0a91-41e0-8042-ee5fa3c7f9d0-kube-api-access-txrwn\") pod \"ba03d73f-0a91-41e0-8042-ee5fa3c7f9d0\" (UID: \"ba03d73f-0a91-41e0-8042-ee5fa3c7f9d0\") " Oct 11 02:59:41 crc kubenswrapper[4743]: I1011 02:59:41.257352 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba03d73f-0a91-41e0-8042-ee5fa3c7f9d0-catalog-content\") pod \"ba03d73f-0a91-41e0-8042-ee5fa3c7f9d0\" (UID: \"ba03d73f-0a91-41e0-8042-ee5fa3c7f9d0\") " Oct 11 02:59:41 crc kubenswrapper[4743]: I1011 02:59:41.257446 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba03d73f-0a91-41e0-8042-ee5fa3c7f9d0-utilities\") pod \"ba03d73f-0a91-41e0-8042-ee5fa3c7f9d0\" (UID: 
\"ba03d73f-0a91-41e0-8042-ee5fa3c7f9d0\") " Oct 11 02:59:41 crc kubenswrapper[4743]: I1011 02:59:41.258292 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba03d73f-0a91-41e0-8042-ee5fa3c7f9d0-utilities" (OuterVolumeSpecName: "utilities") pod "ba03d73f-0a91-41e0-8042-ee5fa3c7f9d0" (UID: "ba03d73f-0a91-41e0-8042-ee5fa3c7f9d0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 02:59:41 crc kubenswrapper[4743]: I1011 02:59:41.264134 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba03d73f-0a91-41e0-8042-ee5fa3c7f9d0-kube-api-access-txrwn" (OuterVolumeSpecName: "kube-api-access-txrwn") pod "ba03d73f-0a91-41e0-8042-ee5fa3c7f9d0" (UID: "ba03d73f-0a91-41e0-8042-ee5fa3c7f9d0"). InnerVolumeSpecName "kube-api-access-txrwn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 02:59:41 crc kubenswrapper[4743]: I1011 02:59:41.347582 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba03d73f-0a91-41e0-8042-ee5fa3c7f9d0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ba03d73f-0a91-41e0-8042-ee5fa3c7f9d0" (UID: "ba03d73f-0a91-41e0-8042-ee5fa3c7f9d0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 02:59:41 crc kubenswrapper[4743]: I1011 02:59:41.360431 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba03d73f-0a91-41e0-8042-ee5fa3c7f9d0-utilities\") on node \"crc\" DevicePath \"\"" Oct 11 02:59:41 crc kubenswrapper[4743]: I1011 02:59:41.360498 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txrwn\" (UniqueName: \"kubernetes.io/projected/ba03d73f-0a91-41e0-8042-ee5fa3c7f9d0-kube-api-access-txrwn\") on node \"crc\" DevicePath \"\"" Oct 11 02:59:41 crc kubenswrapper[4743]: I1011 02:59:41.360511 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba03d73f-0a91-41e0-8042-ee5fa3c7f9d0-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 11 02:59:41 crc kubenswrapper[4743]: I1011 02:59:41.609735 4743 generic.go:334] "Generic (PLEG): container finished" podID="ba03d73f-0a91-41e0-8042-ee5fa3c7f9d0" containerID="b0898e70662827dad40a2b7d76dcaaf1dab4a445e592c67d39f072a17b090b1f" exitCode=0 Oct 11 02:59:41 crc kubenswrapper[4743]: I1011 02:59:41.609783 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s4ztp" event={"ID":"ba03d73f-0a91-41e0-8042-ee5fa3c7f9d0","Type":"ContainerDied","Data":"b0898e70662827dad40a2b7d76dcaaf1dab4a445e592c67d39f072a17b090b1f"} Oct 11 02:59:41 crc kubenswrapper[4743]: I1011 02:59:41.609811 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s4ztp" event={"ID":"ba03d73f-0a91-41e0-8042-ee5fa3c7f9d0","Type":"ContainerDied","Data":"13117f2179ee359e76f512d6bf397019e17a77e8e0b5582f2bb0578f4e91d485"} Oct 11 02:59:41 crc kubenswrapper[4743]: I1011 02:59:41.609829 4743 scope.go:117] "RemoveContainer" containerID="b0898e70662827dad40a2b7d76dcaaf1dab4a445e592c67d39f072a17b090b1f" Oct 11 02:59:41 crc kubenswrapper[4743]: I1011 02:59:41.609997 
4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-s4ztp" Oct 11 02:59:41 crc kubenswrapper[4743]: I1011 02:59:41.630020 4743 scope.go:117] "RemoveContainer" containerID="7eeffa298cf032952d74ffe29b42a956e1fc4ab609c7a03e52f76af87892c575" Oct 11 02:59:41 crc kubenswrapper[4743]: I1011 02:59:41.651465 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-s4ztp"] Oct 11 02:59:41 crc kubenswrapper[4743]: I1011 02:59:41.663194 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-s4ztp"] Oct 11 02:59:41 crc kubenswrapper[4743]: I1011 02:59:41.665086 4743 scope.go:117] "RemoveContainer" containerID="751150522a58a2a0c1464679283b3ed0a6b21980e9e2d83556afc270a1e50bca" Oct 11 02:59:41 crc kubenswrapper[4743]: I1011 02:59:41.708621 4743 scope.go:117] "RemoveContainer" containerID="b0898e70662827dad40a2b7d76dcaaf1dab4a445e592c67d39f072a17b090b1f" Oct 11 02:59:41 crc kubenswrapper[4743]: E1011 02:59:41.709061 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0898e70662827dad40a2b7d76dcaaf1dab4a445e592c67d39f072a17b090b1f\": container with ID starting with b0898e70662827dad40a2b7d76dcaaf1dab4a445e592c67d39f072a17b090b1f not found: ID does not exist" containerID="b0898e70662827dad40a2b7d76dcaaf1dab4a445e592c67d39f072a17b090b1f" Oct 11 02:59:41 crc kubenswrapper[4743]: I1011 02:59:41.709103 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0898e70662827dad40a2b7d76dcaaf1dab4a445e592c67d39f072a17b090b1f"} err="failed to get container status \"b0898e70662827dad40a2b7d76dcaaf1dab4a445e592c67d39f072a17b090b1f\": rpc error: code = NotFound desc = could not find container \"b0898e70662827dad40a2b7d76dcaaf1dab4a445e592c67d39f072a17b090b1f\": container with ID starting with 
b0898e70662827dad40a2b7d76dcaaf1dab4a445e592c67d39f072a17b090b1f not found: ID does not exist" Oct 11 02:59:41 crc kubenswrapper[4743]: I1011 02:59:41.709129 4743 scope.go:117] "RemoveContainer" containerID="7eeffa298cf032952d74ffe29b42a956e1fc4ab609c7a03e52f76af87892c575" Oct 11 02:59:41 crc kubenswrapper[4743]: E1011 02:59:41.709448 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7eeffa298cf032952d74ffe29b42a956e1fc4ab609c7a03e52f76af87892c575\": container with ID starting with 7eeffa298cf032952d74ffe29b42a956e1fc4ab609c7a03e52f76af87892c575 not found: ID does not exist" containerID="7eeffa298cf032952d74ffe29b42a956e1fc4ab609c7a03e52f76af87892c575" Oct 11 02:59:41 crc kubenswrapper[4743]: I1011 02:59:41.709497 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7eeffa298cf032952d74ffe29b42a956e1fc4ab609c7a03e52f76af87892c575"} err="failed to get container status \"7eeffa298cf032952d74ffe29b42a956e1fc4ab609c7a03e52f76af87892c575\": rpc error: code = NotFound desc = could not find container \"7eeffa298cf032952d74ffe29b42a956e1fc4ab609c7a03e52f76af87892c575\": container with ID starting with 7eeffa298cf032952d74ffe29b42a956e1fc4ab609c7a03e52f76af87892c575 not found: ID does not exist" Oct 11 02:59:41 crc kubenswrapper[4743]: I1011 02:59:41.709525 4743 scope.go:117] "RemoveContainer" containerID="751150522a58a2a0c1464679283b3ed0a6b21980e9e2d83556afc270a1e50bca" Oct 11 02:59:41 crc kubenswrapper[4743]: E1011 02:59:41.709820 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"751150522a58a2a0c1464679283b3ed0a6b21980e9e2d83556afc270a1e50bca\": container with ID starting with 751150522a58a2a0c1464679283b3ed0a6b21980e9e2d83556afc270a1e50bca not found: ID does not exist" containerID="751150522a58a2a0c1464679283b3ed0a6b21980e9e2d83556afc270a1e50bca" Oct 11 02:59:41 crc 
kubenswrapper[4743]: I1011 02:59:41.709848 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"751150522a58a2a0c1464679283b3ed0a6b21980e9e2d83556afc270a1e50bca"} err="failed to get container status \"751150522a58a2a0c1464679283b3ed0a6b21980e9e2d83556afc270a1e50bca\": rpc error: code = NotFound desc = could not find container \"751150522a58a2a0c1464679283b3ed0a6b21980e9e2d83556afc270a1e50bca\": container with ID starting with 751150522a58a2a0c1464679283b3ed0a6b21980e9e2d83556afc270a1e50bca not found: ID does not exist" Oct 11 02:59:42 crc kubenswrapper[4743]: I1011 02:59:42.105402 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba03d73f-0a91-41e0-8042-ee5fa3c7f9d0" path="/var/lib/kubelet/pods/ba03d73f-0a91-41e0-8042-ee5fa3c7f9d0/volumes" Oct 11 02:59:44 crc kubenswrapper[4743]: I1011 02:59:44.458248 4743 patch_prober.go:28] interesting pod/machine-config-daemon-cvm72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 02:59:44 crc kubenswrapper[4743]: I1011 02:59:44.458609 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 02:59:46 crc kubenswrapper[4743]: I1011 02:59:46.845409 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_361c556ca48d23f340d126ae2019f729a5671f38f3b0288da3af0a85856dmz6_61bd1af6-1438-46b0-8762-6ee4abb576cd/util/0.log" Oct 11 02:59:47 crc kubenswrapper[4743]: I1011 02:59:47.073249 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_361c556ca48d23f340d126ae2019f729a5671f38f3b0288da3af0a85856dmz6_61bd1af6-1438-46b0-8762-6ee4abb576cd/pull/0.log" Oct 11 02:59:47 crc kubenswrapper[4743]: I1011 02:59:47.079332 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_361c556ca48d23f340d126ae2019f729a5671f38f3b0288da3af0a85856dmz6_61bd1af6-1438-46b0-8762-6ee4abb576cd/util/0.log" Oct 11 02:59:47 crc kubenswrapper[4743]: I1011 02:59:47.151610 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_361c556ca48d23f340d126ae2019f729a5671f38f3b0288da3af0a85856dmz6_61bd1af6-1438-46b0-8762-6ee4abb576cd/pull/0.log" Oct 11 02:59:47 crc kubenswrapper[4743]: I1011 02:59:47.280883 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_361c556ca48d23f340d126ae2019f729a5671f38f3b0288da3af0a85856dmz6_61bd1af6-1438-46b0-8762-6ee4abb576cd/extract/0.log" Oct 11 02:59:47 crc kubenswrapper[4743]: I1011 02:59:47.283423 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_361c556ca48d23f340d126ae2019f729a5671f38f3b0288da3af0a85856dmz6_61bd1af6-1438-46b0-8762-6ee4abb576cd/pull/0.log" Oct 11 02:59:47 crc kubenswrapper[4743]: I1011 02:59:47.325339 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_361c556ca48d23f340d126ae2019f729a5671f38f3b0288da3af0a85856dmz6_61bd1af6-1438-46b0-8762-6ee4abb576cd/util/0.log" Oct 11 02:59:47 crc kubenswrapper[4743]: I1011 02:59:47.459764 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-64f84fcdbb-nfnkk_ed4aa42c-bd83-4fa8-99f2-5d7cde436979/kube-rbac-proxy/0.log" Oct 11 02:59:47 crc kubenswrapper[4743]: I1011 02:59:47.564748 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-64f84fcdbb-nfnkk_ed4aa42c-bd83-4fa8-99f2-5d7cde436979/manager/0.log" Oct 11 02:59:47 crc 
kubenswrapper[4743]: I1011 02:59:47.636901 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-59cdc64769-lbnjc_088785b6-72f1-472b-accb-fef0261e024b/kube-rbac-proxy/0.log" Oct 11 02:59:47 crc kubenswrapper[4743]: I1011 02:59:47.756220 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-59cdc64769-lbnjc_088785b6-72f1-472b-accb-fef0261e024b/manager/0.log" Oct 11 02:59:47 crc kubenswrapper[4743]: I1011 02:59:47.856997 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-687df44cdb-95j97_01d65505-63e1-4355-a1dc-675d22f5bdea/manager/0.log" Oct 11 02:59:47 crc kubenswrapper[4743]: I1011 02:59:47.876207 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-687df44cdb-95j97_01d65505-63e1-4355-a1dc-675d22f5bdea/kube-rbac-proxy/0.log" Oct 11 02:59:48 crc kubenswrapper[4743]: I1011 02:59:48.018975 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-7bb46cd7d-d58jj_3f7d0e6e-1b92-48de-910b-ef415fac5e7c/kube-rbac-proxy/0.log" Oct 11 02:59:48 crc kubenswrapper[4743]: I1011 02:59:48.153362 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-7bb46cd7d-d58jj_3f7d0e6e-1b92-48de-910b-ef415fac5e7c/manager/0.log" Oct 11 02:59:48 crc kubenswrapper[4743]: I1011 02:59:48.204237 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-6d9967f8dd-sjsvr_a2f6fc09-a2cb-46e0-9fe6-e5dad1025bfe/kube-rbac-proxy/0.log" Oct 11 02:59:48 crc kubenswrapper[4743]: I1011 02:59:48.347908 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_heat-operator-controller-manager-6d9967f8dd-sjsvr_a2f6fc09-a2cb-46e0-9fe6-e5dad1025bfe/manager/0.log" Oct 11 02:59:48 crc kubenswrapper[4743]: I1011 02:59:48.382197 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d74794d9b-5t64t_547f1b93-dffd-4c63-964a-e3ea6d29970e/kube-rbac-proxy/0.log" Oct 11 02:59:48 crc kubenswrapper[4743]: I1011 02:59:48.449902 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d74794d9b-5t64t_547f1b93-dffd-4c63-964a-e3ea6d29970e/manager/0.log" Oct 11 02:59:48 crc kubenswrapper[4743]: I1011 02:59:48.600270 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-585fc5b659-6pvq9_f8621ecf-ab1f-40e5-9dd9-4d9d6fd8563b/kube-rbac-proxy/0.log" Oct 11 02:59:48 crc kubenswrapper[4743]: I1011 02:59:48.811070 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-585fc5b659-6pvq9_f8621ecf-ab1f-40e5-9dd9-4d9d6fd8563b/manager/0.log" Oct 11 02:59:48 crc kubenswrapper[4743]: I1011 02:59:48.848761 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-74cb5cbc49-zrk9n_53fca326-a309-4ff5-b52f-8b547496c069/kube-rbac-proxy/0.log" Oct 11 02:59:48 crc kubenswrapper[4743]: I1011 02:59:48.878871 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-74cb5cbc49-zrk9n_53fca326-a309-4ff5-b52f-8b547496c069/manager/0.log" Oct 11 02:59:49 crc kubenswrapper[4743]: I1011 02:59:49.029773 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-ddb98f99b-cbc66_2726f212-a3ba-48cb-a96f-8d5f117a7f5e/kube-rbac-proxy/0.log" Oct 11 02:59:49 crc kubenswrapper[4743]: I1011 02:59:49.208000 4743 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-ddb98f99b-cbc66_2726f212-a3ba-48cb-a96f-8d5f117a7f5e/manager/0.log" Oct 11 02:59:49 crc kubenswrapper[4743]: I1011 02:59:49.317671 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-59578bc799-vvcjm_af96a190-49a7-4179-be4f-4a636d004cd0/kube-rbac-proxy/0.log" Oct 11 02:59:49 crc kubenswrapper[4743]: I1011 02:59:49.328464 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-59578bc799-vvcjm_af96a190-49a7-4179-be4f-4a636d004cd0/manager/0.log" Oct 11 02:59:49 crc kubenswrapper[4743]: I1011 02:59:49.467890 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-5777b4f897-qs4hf_be590325-60d2-4f91-9e78-a5520788cfed/kube-rbac-proxy/0.log" Oct 11 02:59:49 crc kubenswrapper[4743]: I1011 02:59:49.589003 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-5777b4f897-qs4hf_be590325-60d2-4f91-9e78-a5520788cfed/manager/0.log" Oct 11 02:59:49 crc kubenswrapper[4743]: I1011 02:59:49.669557 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-797d478b46-2htrs_035ca3e9-4fd6-4cb1-8e82-8ebcf39146a3/kube-rbac-proxy/0.log" Oct 11 02:59:49 crc kubenswrapper[4743]: I1011 02:59:49.748437 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-797d478b46-2htrs_035ca3e9-4fd6-4cb1-8e82-8ebcf39146a3/manager/0.log" Oct 11 02:59:49 crc kubenswrapper[4743]: I1011 02:59:49.797899 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-57bb74c7bf-d29v4_4be72879-00a9-4253-9ad9-c266c32b968e/kube-rbac-proxy/0.log" Oct 11 02:59:50 crc 
kubenswrapper[4743]: I1011 02:59:50.008207 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-57bb74c7bf-d29v4_4be72879-00a9-4253-9ad9-c266c32b968e/manager/0.log" Oct 11 02:59:50 crc kubenswrapper[4743]: I1011 02:59:50.071013 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-6d7c7ddf95-nt865_a79f9419-0d04-41bb-b1ab-1615888819df/manager/0.log" Oct 11 02:59:50 crc kubenswrapper[4743]: I1011 02:59:50.083206 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-6d7c7ddf95-nt865_a79f9419-0d04-41bb-b1ab-1615888819df/kube-rbac-proxy/0.log" Oct 11 02:59:50 crc kubenswrapper[4743]: I1011 02:59:50.251758 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6cc7fb757dddnmv_6b8570e7-9b29-4a55-95e3-a3a588ba4083/kube-rbac-proxy/0.log" Oct 11 02:59:50 crc kubenswrapper[4743]: I1011 02:59:50.315649 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6cc7fb757dddnmv_6b8570e7-9b29-4a55-95e3-a3a588ba4083/manager/0.log" Oct 11 02:59:50 crc kubenswrapper[4743]: I1011 02:59:50.492010 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6bc9d748dc-5vs6z_cec9bfbd-c515-4bcc-8bf7-63648ecd230b/kube-rbac-proxy/0.log" Oct 11 02:59:50 crc kubenswrapper[4743]: I1011 02:59:50.584985 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-79cb6b48d5-wqg8k_9607e624-b661-41bd-bfaf-ceb7e552fbf2/kube-rbac-proxy/0.log" Oct 11 02:59:50 crc kubenswrapper[4743]: I1011 02:59:50.736080 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-79cb6b48d5-wqg8k_9607e624-b661-41bd-bfaf-ceb7e552fbf2/operator/0.log" Oct 11 02:59:50 crc kubenswrapper[4743]: I1011 02:59:50.819823 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-848vd_bd73b279-e2a4-4500-aed1-70c73212cba1/registry-server/0.log" Oct 11 02:59:51 crc kubenswrapper[4743]: I1011 02:59:51.019441 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-869cc7797f-vt9w7_c39ea94c-0f30-4b04-8a87-848ee9a62740/kube-rbac-proxy/0.log" Oct 11 02:59:51 crc kubenswrapper[4743]: I1011 02:59:51.076074 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-869cc7797f-vt9w7_c39ea94c-0f30-4b04-8a87-848ee9a62740/manager/0.log" Oct 11 02:59:51 crc kubenswrapper[4743]: I1011 02:59:51.101943 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-664664cb68-6pzz7_884ddb30-3a64-4bf2-83ac-3acc83e8bd96/kube-rbac-proxy/0.log" Oct 11 02:59:51 crc kubenswrapper[4743]: I1011 02:59:51.302133 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-664664cb68-6pzz7_884ddb30-3a64-4bf2-83ac-3acc83e8bd96/manager/0.log" Oct 11 02:59:51 crc kubenswrapper[4743]: I1011 02:59:51.515363 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-w94wj_fd811aaa-ab9b-4d34-a268-ecbfa76bf43a/operator/0.log" Oct 11 02:59:51 crc kubenswrapper[4743]: I1011 02:59:51.621885 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f4d5dfdc6-cxzw6_36bd30aa-037d-4e7d-ae0f-fb53fe20f812/manager/0.log" Oct 11 02:59:51 crc kubenswrapper[4743]: I1011 02:59:51.708745 4743 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f4d5dfdc6-cxzw6_36bd30aa-037d-4e7d-ae0f-fb53fe20f812/kube-rbac-proxy/0.log" Oct 11 02:59:52 crc kubenswrapper[4743]: I1011 02:59:52.030809 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6bc9d748dc-5vs6z_cec9bfbd-c515-4bcc-8bf7-63648ecd230b/manager/0.log" Oct 11 02:59:52 crc kubenswrapper[4743]: I1011 02:59:52.110432 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-ffcdd6c94-9qjkk_962796b2-2bc0-4db5-84be-df36bbc28121/kube-rbac-proxy/0.log" Oct 11 02:59:52 crc kubenswrapper[4743]: I1011 02:59:52.119177 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-679ff79844-2dvm2_7967fda3-d5ca-4e28-878e-e50017efc60f/kube-rbac-proxy/0.log" Oct 11 02:59:52 crc kubenswrapper[4743]: I1011 02:59:52.277517 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-ffcdd6c94-9qjkk_962796b2-2bc0-4db5-84be-df36bbc28121/manager/0.log" Oct 11 02:59:52 crc kubenswrapper[4743]: I1011 02:59:52.376682 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-646675d848-ctp5x_f1a6e436-6a36-45a8-a033-f8d307ba12bd/kube-rbac-proxy/0.log" Oct 11 02:59:52 crc kubenswrapper[4743]: I1011 02:59:52.390748 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-646675d848-ctp5x_f1a6e436-6a36-45a8-a033-f8d307ba12bd/manager/0.log" Oct 11 02:59:52 crc kubenswrapper[4743]: I1011 02:59:52.458429 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-679ff79844-2dvm2_7967fda3-d5ca-4e28-878e-e50017efc60f/manager/0.log" Oct 11 03:00:00 crc kubenswrapper[4743]: 
I1011 03:00:00.163319 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29335860-lcktr"] Oct 11 03:00:00 crc kubenswrapper[4743]: E1011 03:00:00.164235 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba03d73f-0a91-41e0-8042-ee5fa3c7f9d0" containerName="registry-server" Oct 11 03:00:00 crc kubenswrapper[4743]: I1011 03:00:00.164247 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba03d73f-0a91-41e0-8042-ee5fa3c7f9d0" containerName="registry-server" Oct 11 03:00:00 crc kubenswrapper[4743]: E1011 03:00:00.164262 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba03d73f-0a91-41e0-8042-ee5fa3c7f9d0" containerName="extract-utilities" Oct 11 03:00:00 crc kubenswrapper[4743]: I1011 03:00:00.164270 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba03d73f-0a91-41e0-8042-ee5fa3c7f9d0" containerName="extract-utilities" Oct 11 03:00:00 crc kubenswrapper[4743]: E1011 03:00:00.164287 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4da7ac2f-f2f8-49cd-866c-51b54daa1b51" containerName="extract-content" Oct 11 03:00:00 crc kubenswrapper[4743]: I1011 03:00:00.164294 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="4da7ac2f-f2f8-49cd-866c-51b54daa1b51" containerName="extract-content" Oct 11 03:00:00 crc kubenswrapper[4743]: E1011 03:00:00.164312 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a8e1204-b12f-465f-9cdc-9828e813199c" containerName="registry-server" Oct 11 03:00:00 crc kubenswrapper[4743]: I1011 03:00:00.164317 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a8e1204-b12f-465f-9cdc-9828e813199c" containerName="registry-server" Oct 11 03:00:00 crc kubenswrapper[4743]: E1011 03:00:00.164332 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4da7ac2f-f2f8-49cd-866c-51b54daa1b51" containerName="registry-server" Oct 11 03:00:00 crc kubenswrapper[4743]: I1011 
03:00:00.164337 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="4da7ac2f-f2f8-49cd-866c-51b54daa1b51" containerName="registry-server" Oct 11 03:00:00 crc kubenswrapper[4743]: E1011 03:00:00.164356 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a8e1204-b12f-465f-9cdc-9828e813199c" containerName="extract-utilities" Oct 11 03:00:00 crc kubenswrapper[4743]: I1011 03:00:00.164361 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a8e1204-b12f-465f-9cdc-9828e813199c" containerName="extract-utilities" Oct 11 03:00:00 crc kubenswrapper[4743]: E1011 03:00:00.164372 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a8e1204-b12f-465f-9cdc-9828e813199c" containerName="extract-content" Oct 11 03:00:00 crc kubenswrapper[4743]: I1011 03:00:00.164379 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a8e1204-b12f-465f-9cdc-9828e813199c" containerName="extract-content" Oct 11 03:00:00 crc kubenswrapper[4743]: E1011 03:00:00.164397 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba03d73f-0a91-41e0-8042-ee5fa3c7f9d0" containerName="extract-content" Oct 11 03:00:00 crc kubenswrapper[4743]: I1011 03:00:00.164402 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba03d73f-0a91-41e0-8042-ee5fa3c7f9d0" containerName="extract-content" Oct 11 03:00:00 crc kubenswrapper[4743]: E1011 03:00:00.164414 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4da7ac2f-f2f8-49cd-866c-51b54daa1b51" containerName="extract-utilities" Oct 11 03:00:00 crc kubenswrapper[4743]: I1011 03:00:00.164420 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="4da7ac2f-f2f8-49cd-866c-51b54daa1b51" containerName="extract-utilities" Oct 11 03:00:00 crc kubenswrapper[4743]: I1011 03:00:00.164629 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a8e1204-b12f-465f-9cdc-9828e813199c" containerName="registry-server" Oct 11 03:00:00 crc kubenswrapper[4743]: I1011 
03:00:00.164652 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="4da7ac2f-f2f8-49cd-866c-51b54daa1b51" containerName="registry-server" Oct 11 03:00:00 crc kubenswrapper[4743]: I1011 03:00:00.164672 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba03d73f-0a91-41e0-8042-ee5fa3c7f9d0" containerName="registry-server" Oct 11 03:00:00 crc kubenswrapper[4743]: I1011 03:00:00.165405 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29335860-lcktr" Oct 11 03:00:00 crc kubenswrapper[4743]: I1011 03:00:00.181336 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29335860-lcktr"] Oct 11 03:00:00 crc kubenswrapper[4743]: I1011 03:00:00.182638 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 11 03:00:00 crc kubenswrapper[4743]: I1011 03:00:00.183003 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 11 03:00:00 crc kubenswrapper[4743]: I1011 03:00:00.258707 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxhdh\" (UniqueName: \"kubernetes.io/projected/b42ff236-4435-4986-b89d-67da6c4a0fbf-kube-api-access-rxhdh\") pod \"collect-profiles-29335860-lcktr\" (UID: \"b42ff236-4435-4986-b89d-67da6c4a0fbf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335860-lcktr" Oct 11 03:00:00 crc kubenswrapper[4743]: I1011 03:00:00.258869 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b42ff236-4435-4986-b89d-67da6c4a0fbf-config-volume\") pod \"collect-profiles-29335860-lcktr\" (UID: \"b42ff236-4435-4986-b89d-67da6c4a0fbf\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29335860-lcktr" Oct 11 03:00:00 crc kubenswrapper[4743]: I1011 03:00:00.258917 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b42ff236-4435-4986-b89d-67da6c4a0fbf-secret-volume\") pod \"collect-profiles-29335860-lcktr\" (UID: \"b42ff236-4435-4986-b89d-67da6c4a0fbf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335860-lcktr" Oct 11 03:00:00 crc kubenswrapper[4743]: I1011 03:00:00.360887 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b42ff236-4435-4986-b89d-67da6c4a0fbf-config-volume\") pod \"collect-profiles-29335860-lcktr\" (UID: \"b42ff236-4435-4986-b89d-67da6c4a0fbf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335860-lcktr" Oct 11 03:00:00 crc kubenswrapper[4743]: I1011 03:00:00.360969 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b42ff236-4435-4986-b89d-67da6c4a0fbf-secret-volume\") pod \"collect-profiles-29335860-lcktr\" (UID: \"b42ff236-4435-4986-b89d-67da6c4a0fbf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335860-lcktr" Oct 11 03:00:00 crc kubenswrapper[4743]: I1011 03:00:00.361128 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxhdh\" (UniqueName: \"kubernetes.io/projected/b42ff236-4435-4986-b89d-67da6c4a0fbf-kube-api-access-rxhdh\") pod \"collect-profiles-29335860-lcktr\" (UID: \"b42ff236-4435-4986-b89d-67da6c4a0fbf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335860-lcktr" Oct 11 03:00:00 crc kubenswrapper[4743]: I1011 03:00:00.361878 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/b42ff236-4435-4986-b89d-67da6c4a0fbf-config-volume\") pod \"collect-profiles-29335860-lcktr\" (UID: \"b42ff236-4435-4986-b89d-67da6c4a0fbf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335860-lcktr" Oct 11 03:00:00 crc kubenswrapper[4743]: I1011 03:00:00.367581 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b42ff236-4435-4986-b89d-67da6c4a0fbf-secret-volume\") pod \"collect-profiles-29335860-lcktr\" (UID: \"b42ff236-4435-4986-b89d-67da6c4a0fbf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335860-lcktr" Oct 11 03:00:00 crc kubenswrapper[4743]: I1011 03:00:00.377769 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxhdh\" (UniqueName: \"kubernetes.io/projected/b42ff236-4435-4986-b89d-67da6c4a0fbf-kube-api-access-rxhdh\") pod \"collect-profiles-29335860-lcktr\" (UID: \"b42ff236-4435-4986-b89d-67da6c4a0fbf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29335860-lcktr" Oct 11 03:00:00 crc kubenswrapper[4743]: I1011 03:00:00.512361 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29335860-lcktr" Oct 11 03:00:01 crc kubenswrapper[4743]: I1011 03:00:01.034459 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29335860-lcktr"] Oct 11 03:00:01 crc kubenswrapper[4743]: I1011 03:00:01.817848 4743 generic.go:334] "Generic (PLEG): container finished" podID="b42ff236-4435-4986-b89d-67da6c4a0fbf" containerID="2b78e5c2fe55f3878f2a4dd2010d3f4173bf44ec72d7cdf4d7f0b3e59511483a" exitCode=0 Oct 11 03:00:01 crc kubenswrapper[4743]: I1011 03:00:01.817962 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29335860-lcktr" event={"ID":"b42ff236-4435-4986-b89d-67da6c4a0fbf","Type":"ContainerDied","Data":"2b78e5c2fe55f3878f2a4dd2010d3f4173bf44ec72d7cdf4d7f0b3e59511483a"} Oct 11 03:00:01 crc kubenswrapper[4743]: I1011 03:00:01.818259 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29335860-lcktr" event={"ID":"b42ff236-4435-4986-b89d-67da6c4a0fbf","Type":"ContainerStarted","Data":"6facc8189b57a13fcfecf08a77afed34314206e4033654c43e47b8b97211195e"} Oct 11 03:00:03 crc kubenswrapper[4743]: I1011 03:00:03.287342 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29335860-lcktr" Oct 11 03:00:03 crc kubenswrapper[4743]: I1011 03:00:03.433753 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b42ff236-4435-4986-b89d-67da6c4a0fbf-secret-volume\") pod \"b42ff236-4435-4986-b89d-67da6c4a0fbf\" (UID: \"b42ff236-4435-4986-b89d-67da6c4a0fbf\") " Oct 11 03:00:03 crc kubenswrapper[4743]: I1011 03:00:03.433899 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b42ff236-4435-4986-b89d-67da6c4a0fbf-config-volume\") pod \"b42ff236-4435-4986-b89d-67da6c4a0fbf\" (UID: \"b42ff236-4435-4986-b89d-67da6c4a0fbf\") " Oct 11 03:00:03 crc kubenswrapper[4743]: I1011 03:00:03.434001 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxhdh\" (UniqueName: \"kubernetes.io/projected/b42ff236-4435-4986-b89d-67da6c4a0fbf-kube-api-access-rxhdh\") pod \"b42ff236-4435-4986-b89d-67da6c4a0fbf\" (UID: \"b42ff236-4435-4986-b89d-67da6c4a0fbf\") " Oct 11 03:00:03 crc kubenswrapper[4743]: I1011 03:00:03.435795 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b42ff236-4435-4986-b89d-67da6c4a0fbf-config-volume" (OuterVolumeSpecName: "config-volume") pod "b42ff236-4435-4986-b89d-67da6c4a0fbf" (UID: "b42ff236-4435-4986-b89d-67da6c4a0fbf"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 11 03:00:03 crc kubenswrapper[4743]: I1011 03:00:03.440096 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b42ff236-4435-4986-b89d-67da6c4a0fbf-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b42ff236-4435-4986-b89d-67da6c4a0fbf" (UID: "b42ff236-4435-4986-b89d-67da6c4a0fbf"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 03:00:03 crc kubenswrapper[4743]: I1011 03:00:03.447891 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b42ff236-4435-4986-b89d-67da6c4a0fbf-kube-api-access-rxhdh" (OuterVolumeSpecName: "kube-api-access-rxhdh") pod "b42ff236-4435-4986-b89d-67da6c4a0fbf" (UID: "b42ff236-4435-4986-b89d-67da6c4a0fbf"). InnerVolumeSpecName "kube-api-access-rxhdh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 03:00:03 crc kubenswrapper[4743]: I1011 03:00:03.536415 4743 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b42ff236-4435-4986-b89d-67da6c4a0fbf-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 11 03:00:03 crc kubenswrapper[4743]: I1011 03:00:03.536454 4743 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b42ff236-4435-4986-b89d-67da6c4a0fbf-config-volume\") on node \"crc\" DevicePath \"\"" Oct 11 03:00:03 crc kubenswrapper[4743]: I1011 03:00:03.536463 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxhdh\" (UniqueName: \"kubernetes.io/projected/b42ff236-4435-4986-b89d-67da6c4a0fbf-kube-api-access-rxhdh\") on node \"crc\" DevicePath \"\"" Oct 11 03:00:03 crc kubenswrapper[4743]: I1011 03:00:03.846509 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29335860-lcktr" event={"ID":"b42ff236-4435-4986-b89d-67da6c4a0fbf","Type":"ContainerDied","Data":"6facc8189b57a13fcfecf08a77afed34314206e4033654c43e47b8b97211195e"} Oct 11 03:00:03 crc kubenswrapper[4743]: I1011 03:00:03.846545 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6facc8189b57a13fcfecf08a77afed34314206e4033654c43e47b8b97211195e" Oct 11 03:00:03 crc kubenswrapper[4743]: I1011 03:00:03.846601 4743 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29335860-lcktr" Oct 11 03:00:04 crc kubenswrapper[4743]: I1011 03:00:04.411767 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29335815-q7tlq"] Oct 11 03:00:04 crc kubenswrapper[4743]: I1011 03:00:04.422470 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29335815-q7tlq"] Oct 11 03:00:06 crc kubenswrapper[4743]: I1011 03:00:06.104581 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da838f8d-81f6-4239-8b89-607a7dcace70" path="/var/lib/kubelet/pods/da838f8d-81f6-4239-8b89-607a7dcace70/volumes" Oct 11 03:00:09 crc kubenswrapper[4743]: I1011 03:00:09.213466 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-rt2nd_5690cb54-1fbe-4d33-a809-b7bdca4df6c0/control-plane-machine-set-operator/0.log" Oct 11 03:00:09 crc kubenswrapper[4743]: I1011 03:00:09.359624 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-tljzf_79462f0e-13e0-4ee7-af5f-02e6e5cd849d/kube-rbac-proxy/0.log" Oct 11 03:00:09 crc kubenswrapper[4743]: I1011 03:00:09.385231 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-tljzf_79462f0e-13e0-4ee7-af5f-02e6e5cd849d/machine-api-operator/0.log" Oct 11 03:00:14 crc kubenswrapper[4743]: I1011 03:00:14.458155 4743 patch_prober.go:28] interesting pod/machine-config-daemon-cvm72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 03:00:14 crc kubenswrapper[4743]: I1011 03:00:14.458752 4743 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 03:00:21 crc kubenswrapper[4743]: I1011 03:00:21.666923 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-4xbkk_3862bd0e-7310-4227-9d4f-8eb551293343/cert-manager-controller/0.log" Oct 11 03:00:21 crc kubenswrapper[4743]: I1011 03:00:21.803520 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-gf47h_5796afa2-031a-4046-b0ed-d2f728e700db/cert-manager-cainjector/0.log" Oct 11 03:00:21 crc kubenswrapper[4743]: I1011 03:00:21.877704 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-xdlw4_60052a4b-2c20-4f20-b109-ca070b9e11e6/cert-manager-webhook/0.log" Oct 11 03:00:34 crc kubenswrapper[4743]: I1011 03:00:34.178395 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6b874cbd85-gvckr_f12fff02-4cd4-437d-b704-99766f165f0e/nmstate-console-plugin/0.log" Oct 11 03:00:34 crc kubenswrapper[4743]: I1011 03:00:34.380536 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-j4sk5_8e3fe60f-ace7-445d-8994-03a95ff90479/nmstate-handler/0.log" Oct 11 03:00:34 crc kubenswrapper[4743]: I1011 03:00:34.440485 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-2v928_c2323cd1-eebb-46d7-9393-586093c921f1/kube-rbac-proxy/0.log" Oct 11 03:00:34 crc kubenswrapper[4743]: I1011 03:00:34.495882 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-2v928_c2323cd1-eebb-46d7-9393-586093c921f1/nmstate-metrics/0.log" Oct 11 03:00:34 crc kubenswrapper[4743]: 
I1011 03:00:34.684625 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-858ddd8f98-bmzmc_0389ac5e-634b-4dd6-a9a8-084cb349b29e/nmstate-operator/0.log" Oct 11 03:00:34 crc kubenswrapper[4743]: I1011 03:00:34.739310 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6cdbc54649-nl9th_6d192208-f92a-4299-866d-14cf8ecffe17/nmstate-webhook/0.log" Oct 11 03:00:44 crc kubenswrapper[4743]: I1011 03:00:44.458417 4743 patch_prober.go:28] interesting pod/machine-config-daemon-cvm72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 03:00:44 crc kubenswrapper[4743]: I1011 03:00:44.459062 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 03:00:44 crc kubenswrapper[4743]: I1011 03:00:44.459107 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" Oct 11 03:00:44 crc kubenswrapper[4743]: I1011 03:00:44.459984 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5156e89fb638919c369d76a46669f0fc82773aee80ae1bf54eefcf957683ec61"} pod="openshift-machine-config-operator/machine-config-daemon-cvm72" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 11 03:00:44 crc kubenswrapper[4743]: I1011 03:00:44.460036 4743 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" containerID="cri-o://5156e89fb638919c369d76a46669f0fc82773aee80ae1bf54eefcf957683ec61" gracePeriod=600 Oct 11 03:00:44 crc kubenswrapper[4743]: E1011 03:00:44.592161 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 03:00:45 crc kubenswrapper[4743]: I1011 03:00:45.314896 4743 generic.go:334] "Generic (PLEG): container finished" podID="add92263-e252-446b-95de-092585b4357f" containerID="5156e89fb638919c369d76a46669f0fc82773aee80ae1bf54eefcf957683ec61" exitCode=0 Oct 11 03:00:45 crc kubenswrapper[4743]: I1011 03:00:45.314944 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" event={"ID":"add92263-e252-446b-95de-092585b4357f","Type":"ContainerDied","Data":"5156e89fb638919c369d76a46669f0fc82773aee80ae1bf54eefcf957683ec61"} Oct 11 03:00:45 crc kubenswrapper[4743]: I1011 03:00:45.315018 4743 scope.go:117] "RemoveContainer" containerID="85f2b99de1cc7ef2f34c96872391b70e6bcbca23b2568563cfde1c4d96fac977" Oct 11 03:00:45 crc kubenswrapper[4743]: I1011 03:00:45.315955 4743 scope.go:117] "RemoveContainer" containerID="5156e89fb638919c369d76a46669f0fc82773aee80ae1bf54eefcf957683ec61" Oct 11 03:00:45 crc kubenswrapper[4743]: E1011 03:00:45.316317 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 03:00:46 crc kubenswrapper[4743]: I1011 03:00:46.675378 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-7857f779b4-t484n_d60b36a6-87ff-4325-a245-43b3dea4cfaf/manager/0.log" Oct 11 03:00:46 crc kubenswrapper[4743]: I1011 03:00:46.688702 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-7857f779b4-t484n_d60b36a6-87ff-4325-a245-43b3dea4cfaf/kube-rbac-proxy/0.log" Oct 11 03:00:57 crc kubenswrapper[4743]: I1011 03:00:57.091953 4743 scope.go:117] "RemoveContainer" containerID="5156e89fb638919c369d76a46669f0fc82773aee80ae1bf54eefcf957683ec61" Oct 11 03:00:57 crc kubenswrapper[4743]: E1011 03:00:57.093565 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 03:00:59 crc kubenswrapper[4743]: I1011 03:00:59.200040 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_cluster-logging-operator-8958c8b87-zqct6_e4f8060c-3f3e-4e5d-85c5-3c344322869a/cluster-logging-operator/0.log" Oct 11 03:00:59 crc kubenswrapper[4743]: I1011 03:00:59.398475 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_collector-fdslp_15f63aaf-1998-4daa-8ebf-1f9455b483e5/collector/0.log" Oct 11 03:00:59 crc kubenswrapper[4743]: I1011 03:00:59.444046 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-logging_logging-loki-compactor-0_03508b98-c7f8-4ffd-9417-074307cd588e/loki-compactor/0.log" Oct 11 03:00:59 crc kubenswrapper[4743]: I1011 03:00:59.451382 4743 scope.go:117] "RemoveContainer" containerID="09108208f1ab277a815b8850e624542cec2f1d5bfd37faf75b8ac7c49c434ce7" Oct 11 03:00:59 crc kubenswrapper[4743]: I1011 03:00:59.660242 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-c45fcc855-8mtnp_60332ecb-34a5-4628-9311-4469d823f589/gateway/0.log" Oct 11 03:00:59 crc kubenswrapper[4743]: I1011 03:00:59.670517 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-distributor-6f5f7fff97-72rvc_1f759884-04cd-4b18-90dd-9b4745c12ba7/loki-distributor/0.log" Oct 11 03:00:59 crc kubenswrapper[4743]: I1011 03:00:59.672630 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-c45fcc855-8mtnp_60332ecb-34a5-4628-9311-4469d823f589/opa/0.log" Oct 11 03:00:59 crc kubenswrapper[4743]: I1011 03:00:59.808536 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-c45fcc855-ddvcq_443c6346-a364-4e67-8c53-2bcd9b1f0927/gateway/0.log" Oct 11 03:00:59 crc kubenswrapper[4743]: I1011 03:00:59.861578 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-c45fcc855-ddvcq_443c6346-a364-4e67-8c53-2bcd9b1f0927/opa/0.log" Oct 11 03:00:59 crc kubenswrapper[4743]: I1011 03:00:59.996796 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-index-gateway-0_a85d8329-17fe-4d06-b45e-d410514cc210/loki-index-gateway/0.log" Oct 11 03:01:00 crc kubenswrapper[4743]: I1011 03:01:00.108969 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-ingester-0_bf0c8290-1118-4dbf-a638-bde5c07bdaab/loki-ingester/0.log" Oct 11 03:01:00 crc kubenswrapper[4743]: I1011 
03:01:00.154399 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29335861-6fjkl"] Oct 11 03:01:00 crc kubenswrapper[4743]: E1011 03:01:00.155336 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b42ff236-4435-4986-b89d-67da6c4a0fbf" containerName="collect-profiles" Oct 11 03:01:00 crc kubenswrapper[4743]: I1011 03:01:00.155356 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="b42ff236-4435-4986-b89d-67da6c4a0fbf" containerName="collect-profiles" Oct 11 03:01:00 crc kubenswrapper[4743]: I1011 03:01:00.155699 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="b42ff236-4435-4986-b89d-67da6c4a0fbf" containerName="collect-profiles" Oct 11 03:01:00 crc kubenswrapper[4743]: I1011 03:01:00.156797 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29335861-6fjkl" Oct 11 03:01:00 crc kubenswrapper[4743]: I1011 03:01:00.173521 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29335861-6fjkl"] Oct 11 03:01:00 crc kubenswrapper[4743]: I1011 03:01:00.228162 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-querier-5d954896cf-xd7bv_7b393f9c-b255-4cac-96f2-3d5861cc7cce/loki-querier/0.log" Oct 11 03:01:00 crc kubenswrapper[4743]: I1011 03:01:00.314584 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-query-frontend-6fbbbc8b7d-zzmff_58a37184-c341-454d-b33e-a2af6dc56af3/loki-query-frontend/0.log" Oct 11 03:01:00 crc kubenswrapper[4743]: I1011 03:01:00.345686 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb164eff-c00c-4764-bb12-e371aaa02860-combined-ca-bundle\") pod \"keystone-cron-29335861-6fjkl\" (UID: \"eb164eff-c00c-4764-bb12-e371aaa02860\") " pod="openstack/keystone-cron-29335861-6fjkl" Oct 11 03:01:00 crc 
kubenswrapper[4743]: I1011 03:01:00.345736 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/eb164eff-c00c-4764-bb12-e371aaa02860-fernet-keys\") pod \"keystone-cron-29335861-6fjkl\" (UID: \"eb164eff-c00c-4764-bb12-e371aaa02860\") " pod="openstack/keystone-cron-29335861-6fjkl" Oct 11 03:01:00 crc kubenswrapper[4743]: I1011 03:01:00.345832 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb164eff-c00c-4764-bb12-e371aaa02860-config-data\") pod \"keystone-cron-29335861-6fjkl\" (UID: \"eb164eff-c00c-4764-bb12-e371aaa02860\") " pod="openstack/keystone-cron-29335861-6fjkl" Oct 11 03:01:00 crc kubenswrapper[4743]: I1011 03:01:00.345873 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knntc\" (UniqueName: \"kubernetes.io/projected/eb164eff-c00c-4764-bb12-e371aaa02860-kube-api-access-knntc\") pod \"keystone-cron-29335861-6fjkl\" (UID: \"eb164eff-c00c-4764-bb12-e371aaa02860\") " pod="openstack/keystone-cron-29335861-6fjkl" Oct 11 03:01:00 crc kubenswrapper[4743]: I1011 03:01:00.447760 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb164eff-c00c-4764-bb12-e371aaa02860-combined-ca-bundle\") pod \"keystone-cron-29335861-6fjkl\" (UID: \"eb164eff-c00c-4764-bb12-e371aaa02860\") " pod="openstack/keystone-cron-29335861-6fjkl" Oct 11 03:01:00 crc kubenswrapper[4743]: I1011 03:01:00.447807 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/eb164eff-c00c-4764-bb12-e371aaa02860-fernet-keys\") pod \"keystone-cron-29335861-6fjkl\" (UID: \"eb164eff-c00c-4764-bb12-e371aaa02860\") " pod="openstack/keystone-cron-29335861-6fjkl" Oct 11 03:01:00 crc 
kubenswrapper[4743]: I1011 03:01:00.447962 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb164eff-c00c-4764-bb12-e371aaa02860-config-data\") pod \"keystone-cron-29335861-6fjkl\" (UID: \"eb164eff-c00c-4764-bb12-e371aaa02860\") " pod="openstack/keystone-cron-29335861-6fjkl" Oct 11 03:01:00 crc kubenswrapper[4743]: I1011 03:01:00.448004 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knntc\" (UniqueName: \"kubernetes.io/projected/eb164eff-c00c-4764-bb12-e371aaa02860-kube-api-access-knntc\") pod \"keystone-cron-29335861-6fjkl\" (UID: \"eb164eff-c00c-4764-bb12-e371aaa02860\") " pod="openstack/keystone-cron-29335861-6fjkl" Oct 11 03:01:00 crc kubenswrapper[4743]: I1011 03:01:00.456915 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb164eff-c00c-4764-bb12-e371aaa02860-combined-ca-bundle\") pod \"keystone-cron-29335861-6fjkl\" (UID: \"eb164eff-c00c-4764-bb12-e371aaa02860\") " pod="openstack/keystone-cron-29335861-6fjkl" Oct 11 03:01:00 crc kubenswrapper[4743]: I1011 03:01:00.460554 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/eb164eff-c00c-4764-bb12-e371aaa02860-fernet-keys\") pod \"keystone-cron-29335861-6fjkl\" (UID: \"eb164eff-c00c-4764-bb12-e371aaa02860\") " pod="openstack/keystone-cron-29335861-6fjkl" Oct 11 03:01:00 crc kubenswrapper[4743]: I1011 03:01:00.464601 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb164eff-c00c-4764-bb12-e371aaa02860-config-data\") pod \"keystone-cron-29335861-6fjkl\" (UID: \"eb164eff-c00c-4764-bb12-e371aaa02860\") " pod="openstack/keystone-cron-29335861-6fjkl" Oct 11 03:01:00 crc kubenswrapper[4743]: I1011 03:01:00.468284 4743 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-knntc\" (UniqueName: \"kubernetes.io/projected/eb164eff-c00c-4764-bb12-e371aaa02860-kube-api-access-knntc\") pod \"keystone-cron-29335861-6fjkl\" (UID: \"eb164eff-c00c-4764-bb12-e371aaa02860\") " pod="openstack/keystone-cron-29335861-6fjkl" Oct 11 03:01:00 crc kubenswrapper[4743]: I1011 03:01:00.483580 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29335861-6fjkl" Oct 11 03:01:01 crc kubenswrapper[4743]: I1011 03:01:01.043943 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29335861-6fjkl"] Oct 11 03:01:01 crc kubenswrapper[4743]: I1011 03:01:01.535327 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29335861-6fjkl" event={"ID":"eb164eff-c00c-4764-bb12-e371aaa02860","Type":"ContainerStarted","Data":"e991f644d2fe2e6916092ef4fb0fea6d6d06600dddc87f689777a8b61e848618"} Oct 11 03:01:01 crc kubenswrapper[4743]: I1011 03:01:01.535651 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29335861-6fjkl" event={"ID":"eb164eff-c00c-4764-bb12-e371aaa02860","Type":"ContainerStarted","Data":"7351aa4f481e650bdc7beccbc368353d295d64ad9ea3d01b602ce0bc75ffac0a"} Oct 11 03:01:01 crc kubenswrapper[4743]: I1011 03:01:01.557636 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29335861-6fjkl" podStartSLOduration=1.557618135 podStartE2EDuration="1.557618135s" podCreationTimestamp="2025-10-11 03:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 03:01:01.554852356 +0000 UTC m=+7756.207832763" watchObservedRunningTime="2025-10-11 03:01:01.557618135 +0000 UTC m=+7756.210598532" Oct 11 03:01:05 crc kubenswrapper[4743]: I1011 03:01:05.576058 4743 generic.go:334] "Generic (PLEG): container finished" 
podID="eb164eff-c00c-4764-bb12-e371aaa02860" containerID="e991f644d2fe2e6916092ef4fb0fea6d6d06600dddc87f689777a8b61e848618" exitCode=0 Oct 11 03:01:05 crc kubenswrapper[4743]: I1011 03:01:05.576151 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29335861-6fjkl" event={"ID":"eb164eff-c00c-4764-bb12-e371aaa02860","Type":"ContainerDied","Data":"e991f644d2fe2e6916092ef4fb0fea6d6d06600dddc87f689777a8b61e848618"} Oct 11 03:01:06 crc kubenswrapper[4743]: I1011 03:01:06.989810 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29335861-6fjkl" Oct 11 03:01:07 crc kubenswrapper[4743]: I1011 03:01:07.010223 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb164eff-c00c-4764-bb12-e371aaa02860-config-data\") pod \"eb164eff-c00c-4764-bb12-e371aaa02860\" (UID: \"eb164eff-c00c-4764-bb12-e371aaa02860\") " Oct 11 03:01:07 crc kubenswrapper[4743]: I1011 03:01:07.010458 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-knntc\" (UniqueName: \"kubernetes.io/projected/eb164eff-c00c-4764-bb12-e371aaa02860-kube-api-access-knntc\") pod \"eb164eff-c00c-4764-bb12-e371aaa02860\" (UID: \"eb164eff-c00c-4764-bb12-e371aaa02860\") " Oct 11 03:01:07 crc kubenswrapper[4743]: I1011 03:01:07.010551 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/eb164eff-c00c-4764-bb12-e371aaa02860-fernet-keys\") pod \"eb164eff-c00c-4764-bb12-e371aaa02860\" (UID: \"eb164eff-c00c-4764-bb12-e371aaa02860\") " Oct 11 03:01:07 crc kubenswrapper[4743]: I1011 03:01:07.010689 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb164eff-c00c-4764-bb12-e371aaa02860-combined-ca-bundle\") pod 
\"eb164eff-c00c-4764-bb12-e371aaa02860\" (UID: \"eb164eff-c00c-4764-bb12-e371aaa02860\") " Oct 11 03:01:07 crc kubenswrapper[4743]: I1011 03:01:07.018005 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb164eff-c00c-4764-bb12-e371aaa02860-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "eb164eff-c00c-4764-bb12-e371aaa02860" (UID: "eb164eff-c00c-4764-bb12-e371aaa02860"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 03:01:07 crc kubenswrapper[4743]: I1011 03:01:07.018136 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb164eff-c00c-4764-bb12-e371aaa02860-kube-api-access-knntc" (OuterVolumeSpecName: "kube-api-access-knntc") pod "eb164eff-c00c-4764-bb12-e371aaa02860" (UID: "eb164eff-c00c-4764-bb12-e371aaa02860"). InnerVolumeSpecName "kube-api-access-knntc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 03:01:07 crc kubenswrapper[4743]: I1011 03:01:07.055026 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb164eff-c00c-4764-bb12-e371aaa02860-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eb164eff-c00c-4764-bb12-e371aaa02860" (UID: "eb164eff-c00c-4764-bb12-e371aaa02860"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 03:01:07 crc kubenswrapper[4743]: I1011 03:01:07.107877 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb164eff-c00c-4764-bb12-e371aaa02860-config-data" (OuterVolumeSpecName: "config-data") pod "eb164eff-c00c-4764-bb12-e371aaa02860" (UID: "eb164eff-c00c-4764-bb12-e371aaa02860"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 11 03:01:07 crc kubenswrapper[4743]: I1011 03:01:07.113445 4743 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/eb164eff-c00c-4764-bb12-e371aaa02860-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 11 03:01:07 crc kubenswrapper[4743]: I1011 03:01:07.113478 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb164eff-c00c-4764-bb12-e371aaa02860-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 11 03:01:07 crc kubenswrapper[4743]: I1011 03:01:07.113490 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb164eff-c00c-4764-bb12-e371aaa02860-config-data\") on node \"crc\" DevicePath \"\"" Oct 11 03:01:07 crc kubenswrapper[4743]: I1011 03:01:07.113500 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-knntc\" (UniqueName: \"kubernetes.io/projected/eb164eff-c00c-4764-bb12-e371aaa02860-kube-api-access-knntc\") on node \"crc\" DevicePath \"\"" Oct 11 03:01:07 crc kubenswrapper[4743]: I1011 03:01:07.604528 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29335861-6fjkl" event={"ID":"eb164eff-c00c-4764-bb12-e371aaa02860","Type":"ContainerDied","Data":"7351aa4f481e650bdc7beccbc368353d295d64ad9ea3d01b602ce0bc75ffac0a"} Oct 11 03:01:07 crc kubenswrapper[4743]: I1011 03:01:07.604570 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7351aa4f481e650bdc7beccbc368353d295d64ad9ea3d01b602ce0bc75ffac0a" Oct 11 03:01:07 crc kubenswrapper[4743]: I1011 03:01:07.604601 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29335861-6fjkl" Oct 11 03:01:08 crc kubenswrapper[4743]: I1011 03:01:08.092136 4743 scope.go:117] "RemoveContainer" containerID="5156e89fb638919c369d76a46669f0fc82773aee80ae1bf54eefcf957683ec61" Oct 11 03:01:08 crc kubenswrapper[4743]: E1011 03:01:08.092704 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 03:01:14 crc kubenswrapper[4743]: I1011 03:01:14.320828 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-bwqtl_3b7ecaea-d42f-44b0-a181-3c61cf45bde2/kube-rbac-proxy/0.log" Oct 11 03:01:14 crc kubenswrapper[4743]: I1011 03:01:14.618570 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-v524d_95e8f67c-537e-4744-a3c2-7dd93084f455/cp-frr-files/0.log" Oct 11 03:01:14 crc kubenswrapper[4743]: I1011 03:01:14.621411 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-bwqtl_3b7ecaea-d42f-44b0-a181-3c61cf45bde2/controller/0.log" Oct 11 03:01:14 crc kubenswrapper[4743]: I1011 03:01:14.755585 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-v524d_95e8f67c-537e-4744-a3c2-7dd93084f455/cp-frr-files/0.log" Oct 11 03:01:14 crc kubenswrapper[4743]: I1011 03:01:14.829759 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-v524d_95e8f67c-537e-4744-a3c2-7dd93084f455/cp-reloader/0.log" Oct 11 03:01:14 crc kubenswrapper[4743]: I1011 03:01:14.832245 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-v524d_95e8f67c-537e-4744-a3c2-7dd93084f455/cp-reloader/0.log" Oct 11 03:01:14 crc kubenswrapper[4743]: I1011 03:01:14.867868 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-v524d_95e8f67c-537e-4744-a3c2-7dd93084f455/cp-metrics/0.log" Oct 11 03:01:15 crc kubenswrapper[4743]: I1011 03:01:15.055338 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-v524d_95e8f67c-537e-4744-a3c2-7dd93084f455/cp-frr-files/0.log" Oct 11 03:01:15 crc kubenswrapper[4743]: I1011 03:01:15.066380 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-v524d_95e8f67c-537e-4744-a3c2-7dd93084f455/cp-reloader/0.log" Oct 11 03:01:15 crc kubenswrapper[4743]: I1011 03:01:15.145691 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-v524d_95e8f67c-537e-4744-a3c2-7dd93084f455/cp-metrics/0.log" Oct 11 03:01:15 crc kubenswrapper[4743]: I1011 03:01:15.155554 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-v524d_95e8f67c-537e-4744-a3c2-7dd93084f455/cp-metrics/0.log" Oct 11 03:01:15 crc kubenswrapper[4743]: I1011 03:01:15.376897 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-v524d_95e8f67c-537e-4744-a3c2-7dd93084f455/cp-reloader/0.log" Oct 11 03:01:15 crc kubenswrapper[4743]: I1011 03:01:15.391845 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-v524d_95e8f67c-537e-4744-a3c2-7dd93084f455/controller/0.log" Oct 11 03:01:15 crc kubenswrapper[4743]: I1011 03:01:15.403049 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-v524d_95e8f67c-537e-4744-a3c2-7dd93084f455/cp-metrics/0.log" Oct 11 03:01:15 crc kubenswrapper[4743]: I1011 03:01:15.433442 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-v524d_95e8f67c-537e-4744-a3c2-7dd93084f455/cp-frr-files/0.log" Oct 11 03:01:15 crc kubenswrapper[4743]: I1011 03:01:15.580883 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-v524d_95e8f67c-537e-4744-a3c2-7dd93084f455/kube-rbac-proxy/0.log" Oct 11 03:01:15 crc kubenswrapper[4743]: I1011 03:01:15.662434 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-v524d_95e8f67c-537e-4744-a3c2-7dd93084f455/frr-metrics/0.log" Oct 11 03:01:15 crc kubenswrapper[4743]: I1011 03:01:15.687127 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-v524d_95e8f67c-537e-4744-a3c2-7dd93084f455/kube-rbac-proxy-frr/0.log" Oct 11 03:01:15 crc kubenswrapper[4743]: I1011 03:01:15.971464 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-v524d_95e8f67c-537e-4744-a3c2-7dd93084f455/reloader/0.log" Oct 11 03:01:16 crc kubenswrapper[4743]: I1011 03:01:16.119985 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-jct7h_2772ef34-f307-4a91-8f2f-28e3b22375a0/frr-k8s-webhook-server/0.log" Oct 11 03:01:16 crc kubenswrapper[4743]: I1011 03:01:16.361250 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7969b47488-dm7g4_ef8e01c5-e132-4e07-9ce3-9a5578548ad7/manager/0.log" Oct 11 03:01:16 crc kubenswrapper[4743]: I1011 03:01:16.475512 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-79c6c9bd96-5sx97_3cf19c00-f066-4814-9134-4a6d4aed88a7/webhook-server/0.log" Oct 11 03:01:16 crc kubenswrapper[4743]: I1011 03:01:16.689946 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-9g4hk_83bddd85-204d-438d-a29f-e7fca659542a/kube-rbac-proxy/0.log" Oct 11 03:01:17 crc kubenswrapper[4743]: I1011 03:01:17.310935 4743 
log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-9g4hk_83bddd85-204d-438d-a29f-e7fca659542a/speaker/0.log" Oct 11 03:01:17 crc kubenswrapper[4743]: I1011 03:01:17.717732 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-v524d_95e8f67c-537e-4744-a3c2-7dd93084f455/frr/0.log" Oct 11 03:01:22 crc kubenswrapper[4743]: I1011 03:01:22.092006 4743 scope.go:117] "RemoveContainer" containerID="5156e89fb638919c369d76a46669f0fc82773aee80ae1bf54eefcf957683ec61" Oct 11 03:01:22 crc kubenswrapper[4743]: E1011 03:01:22.092841 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 03:01:30 crc kubenswrapper[4743]: I1011 03:01:30.031378 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37ds656x_2d977ae5-754b-436a-87b9-b0618947c353/util/0.log" Oct 11 03:01:30 crc kubenswrapper[4743]: I1011 03:01:30.250511 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37ds656x_2d977ae5-754b-436a-87b9-b0618947c353/pull/0.log" Oct 11 03:01:30 crc kubenswrapper[4743]: I1011 03:01:30.255959 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37ds656x_2d977ae5-754b-436a-87b9-b0618947c353/util/0.log" Oct 11 03:01:30 crc kubenswrapper[4743]: I1011 03:01:30.306014 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37ds656x_2d977ae5-754b-436a-87b9-b0618947c353/pull/0.log" Oct 11 03:01:30 crc kubenswrapper[4743]: I1011 03:01:30.491734 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37ds656x_2d977ae5-754b-436a-87b9-b0618947c353/util/0.log" Oct 11 03:01:30 crc kubenswrapper[4743]: I1011 03:01:30.502077 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37ds656x_2d977ae5-754b-436a-87b9-b0618947c353/pull/0.log" Oct 11 03:01:30 crc kubenswrapper[4743]: I1011 03:01:30.528970 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37ds656x_2d977ae5-754b-436a-87b9-b0618947c353/extract/0.log" Oct 11 03:01:30 crc kubenswrapper[4743]: I1011 03:01:30.679480 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d228vb8_69d0b6c4-03c5-4968-9dfb-0b6b2774954a/util/0.log" Oct 11 03:01:30 crc kubenswrapper[4743]: I1011 03:01:30.877056 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d228vb8_69d0b6c4-03c5-4968-9dfb-0b6b2774954a/pull/0.log" Oct 11 03:01:30 crc kubenswrapper[4743]: I1011 03:01:30.882338 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d228vb8_69d0b6c4-03c5-4968-9dfb-0b6b2774954a/pull/0.log" Oct 11 03:01:30 crc kubenswrapper[4743]: I1011 03:01:30.909051 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d228vb8_69d0b6c4-03c5-4968-9dfb-0b6b2774954a/util/0.log" Oct 11 
03:01:31 crc kubenswrapper[4743]: I1011 03:01:31.103462 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d228vb8_69d0b6c4-03c5-4968-9dfb-0b6b2774954a/pull/0.log" Oct 11 03:01:31 crc kubenswrapper[4743]: I1011 03:01:31.112658 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d228vb8_69d0b6c4-03c5-4968-9dfb-0b6b2774954a/util/0.log" Oct 11 03:01:31 crc kubenswrapper[4743]: I1011 03:01:31.127331 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d228vb8_69d0b6c4-03c5-4968-9dfb-0b6b2774954a/extract/0.log" Oct 11 03:01:31 crc kubenswrapper[4743]: I1011 03:01:31.458402 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddtlkt_f424c028-a0f6-4327-90f7-338a9d21043c/util/0.log" Oct 11 03:01:31 crc kubenswrapper[4743]: I1011 03:01:31.697839 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddtlkt_f424c028-a0f6-4327-90f7-338a9d21043c/util/0.log" Oct 11 03:01:31 crc kubenswrapper[4743]: I1011 03:01:31.721649 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddtlkt_f424c028-a0f6-4327-90f7-338a9d21043c/pull/0.log" Oct 11 03:01:31 crc kubenswrapper[4743]: I1011 03:01:31.764816 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddtlkt_f424c028-a0f6-4327-90f7-338a9d21043c/pull/0.log" Oct 11 03:01:31 crc kubenswrapper[4743]: I1011 03:01:31.934081 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddtlkt_f424c028-a0f6-4327-90f7-338a9d21043c/pull/0.log" Oct 11 03:01:31 crc kubenswrapper[4743]: I1011 03:01:31.985014 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddtlkt_f424c028-a0f6-4327-90f7-338a9d21043c/util/0.log" Oct 11 03:01:31 crc kubenswrapper[4743]: I1011 03:01:31.991089 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddtlkt_f424c028-a0f6-4327-90f7-338a9d21043c/extract/0.log" Oct 11 03:01:32 crc kubenswrapper[4743]: I1011 03:01:32.133080 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa6018n9vc_3f25a7dd-4148-4251-9896-0d781682a3c3/util/0.log" Oct 11 03:01:32 crc kubenswrapper[4743]: I1011 03:01:32.368350 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa6018n9vc_3f25a7dd-4148-4251-9896-0d781682a3c3/util/0.log" Oct 11 03:01:32 crc kubenswrapper[4743]: I1011 03:01:32.371268 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa6018n9vc_3f25a7dd-4148-4251-9896-0d781682a3c3/pull/0.log" Oct 11 03:01:32 crc kubenswrapper[4743]: I1011 03:01:32.414828 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa6018n9vc_3f25a7dd-4148-4251-9896-0d781682a3c3/pull/0.log" Oct 11 03:01:32 crc kubenswrapper[4743]: I1011 03:01:32.597696 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa6018n9vc_3f25a7dd-4148-4251-9896-0d781682a3c3/pull/0.log" Oct 11 
03:01:32 crc kubenswrapper[4743]: I1011 03:01:32.622094 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa6018n9vc_3f25a7dd-4148-4251-9896-0d781682a3c3/util/0.log" Oct 11 03:01:32 crc kubenswrapper[4743]: I1011 03:01:32.626296 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa6018n9vc_3f25a7dd-4148-4251-9896-0d781682a3c3/extract/0.log" Oct 11 03:01:32 crc kubenswrapper[4743]: I1011 03:01:32.777465 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wqwgk_3543dfa9-ce3b-48b3-bc13-70ade6294a3b/extract-utilities/0.log" Oct 11 03:01:33 crc kubenswrapper[4743]: I1011 03:01:33.027484 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wqwgk_3543dfa9-ce3b-48b3-bc13-70ade6294a3b/extract-content/0.log" Oct 11 03:01:33 crc kubenswrapper[4743]: I1011 03:01:33.034400 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wqwgk_3543dfa9-ce3b-48b3-bc13-70ade6294a3b/extract-content/0.log" Oct 11 03:01:33 crc kubenswrapper[4743]: I1011 03:01:33.074901 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wqwgk_3543dfa9-ce3b-48b3-bc13-70ade6294a3b/extract-utilities/0.log" Oct 11 03:01:33 crc kubenswrapper[4743]: I1011 03:01:33.224040 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wqwgk_3543dfa9-ce3b-48b3-bc13-70ade6294a3b/extract-utilities/0.log" Oct 11 03:01:33 crc kubenswrapper[4743]: I1011 03:01:33.276928 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wqwgk_3543dfa9-ce3b-48b3-bc13-70ade6294a3b/extract-content/0.log" Oct 11 03:01:33 crc kubenswrapper[4743]: I1011 03:01:33.438218 
4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-l6gl2_8b619db5-930f-4298-9d1d-2c74a9e60783/extract-utilities/0.log" Oct 11 03:01:33 crc kubenswrapper[4743]: I1011 03:01:33.633809 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-l6gl2_8b619db5-930f-4298-9d1d-2c74a9e60783/extract-content/0.log" Oct 11 03:01:33 crc kubenswrapper[4743]: I1011 03:01:33.637054 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-l6gl2_8b619db5-930f-4298-9d1d-2c74a9e60783/extract-utilities/0.log" Oct 11 03:01:33 crc kubenswrapper[4743]: I1011 03:01:33.652203 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-l6gl2_8b619db5-930f-4298-9d1d-2c74a9e60783/extract-content/0.log" Oct 11 03:01:33 crc kubenswrapper[4743]: I1011 03:01:33.926193 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-l6gl2_8b619db5-930f-4298-9d1d-2c74a9e60783/extract-utilities/0.log" Oct 11 03:01:34 crc kubenswrapper[4743]: I1011 03:01:34.003549 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-l6gl2_8b619db5-930f-4298-9d1d-2c74a9e60783/extract-content/0.log" Oct 11 03:01:34 crc kubenswrapper[4743]: I1011 03:01:34.208959 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6ntgk_7c5ed10c-1531-47da-8329-5589a12da9ac/util/0.log" Oct 11 03:01:34 crc kubenswrapper[4743]: I1011 03:01:34.414729 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6ntgk_7c5ed10c-1531-47da-8329-5589a12da9ac/util/0.log" Oct 11 03:01:34 crc kubenswrapper[4743]: I1011 03:01:34.441339 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6ntgk_7c5ed10c-1531-47da-8329-5589a12da9ac/pull/0.log" Oct 11 03:01:34 crc kubenswrapper[4743]: I1011 03:01:34.444363 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6ntgk_7c5ed10c-1531-47da-8329-5589a12da9ac/pull/0.log" Oct 11 03:01:34 crc kubenswrapper[4743]: I1011 03:01:34.490807 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wqwgk_3543dfa9-ce3b-48b3-bc13-70ade6294a3b/registry-server/0.log" Oct 11 03:01:34 crc kubenswrapper[4743]: I1011 03:01:34.688970 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6ntgk_7c5ed10c-1531-47da-8329-5589a12da9ac/util/0.log" Oct 11 03:01:34 crc kubenswrapper[4743]: I1011 03:01:34.740604 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6ntgk_7c5ed10c-1531-47da-8329-5589a12da9ac/extract/0.log" Oct 11 03:01:34 crc kubenswrapper[4743]: I1011 03:01:34.830684 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6ntgk_7c5ed10c-1531-47da-8329-5589a12da9ac/pull/0.log" Oct 11 03:01:34 crc kubenswrapper[4743]: I1011 03:01:34.934150 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-cpcdj_ecba19cf-13a3-40ee-8d5a-17af54a79caa/marketplace-operator/0.log" Oct 11 03:01:35 crc kubenswrapper[4743]: I1011 03:01:35.030505 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ttsr5_3bd2c6b7-4918-4cb2-abcd-efa1523befe0/extract-utilities/0.log" Oct 11 03:01:35 crc kubenswrapper[4743]: I1011 03:01:35.196382 4743 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-l6gl2_8b619db5-930f-4298-9d1d-2c74a9e60783/registry-server/0.log" Oct 11 03:01:35 crc kubenswrapper[4743]: I1011 03:01:35.280736 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ttsr5_3bd2c6b7-4918-4cb2-abcd-efa1523befe0/extract-utilities/0.log" Oct 11 03:01:35 crc kubenswrapper[4743]: I1011 03:01:35.306757 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ttsr5_3bd2c6b7-4918-4cb2-abcd-efa1523befe0/extract-content/0.log" Oct 11 03:01:35 crc kubenswrapper[4743]: I1011 03:01:35.326824 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ttsr5_3bd2c6b7-4918-4cb2-abcd-efa1523befe0/extract-content/0.log" Oct 11 03:01:35 crc kubenswrapper[4743]: I1011 03:01:35.575614 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vs4lk_a6061b50-92be-4942-b961-a094b28b50a9/extract-utilities/0.log" Oct 11 03:01:35 crc kubenswrapper[4743]: I1011 03:01:35.612515 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ttsr5_3bd2c6b7-4918-4cb2-abcd-efa1523befe0/extract-utilities/0.log" Oct 11 03:01:35 crc kubenswrapper[4743]: I1011 03:01:35.643508 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ttsr5_3bd2c6b7-4918-4cb2-abcd-efa1523befe0/extract-content/0.log" Oct 11 03:01:35 crc kubenswrapper[4743]: I1011 03:01:35.839390 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vs4lk_a6061b50-92be-4942-b961-a094b28b50a9/extract-utilities/0.log" Oct 11 03:01:35 crc kubenswrapper[4743]: I1011 03:01:35.847908 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-vs4lk_a6061b50-92be-4942-b961-a094b28b50a9/extract-content/0.log" Oct 11 03:01:35 crc kubenswrapper[4743]: I1011 03:01:35.851952 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ttsr5_3bd2c6b7-4918-4cb2-abcd-efa1523befe0/registry-server/0.log" Oct 11 03:01:35 crc kubenswrapper[4743]: I1011 03:01:35.890647 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vs4lk_a6061b50-92be-4942-b961-a094b28b50a9/extract-content/0.log" Oct 11 03:01:36 crc kubenswrapper[4743]: I1011 03:01:36.018633 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vs4lk_a6061b50-92be-4942-b961-a094b28b50a9/extract-utilities/0.log" Oct 11 03:01:36 crc kubenswrapper[4743]: I1011 03:01:36.019135 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vs4lk_a6061b50-92be-4942-b961-a094b28b50a9/extract-content/0.log" Oct 11 03:01:36 crc kubenswrapper[4743]: I1011 03:01:36.948734 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vs4lk_a6061b50-92be-4942-b961-a094b28b50a9/registry-server/0.log" Oct 11 03:01:37 crc kubenswrapper[4743]: I1011 03:01:37.091752 4743 scope.go:117] "RemoveContainer" containerID="5156e89fb638919c369d76a46669f0fc82773aee80ae1bf54eefcf957683ec61" Oct 11 03:01:37 crc kubenswrapper[4743]: E1011 03:01:37.092090 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 03:01:49 crc 
kubenswrapper[4743]: I1011 03:01:49.092270 4743 scope.go:117] "RemoveContainer" containerID="5156e89fb638919c369d76a46669f0fc82773aee80ae1bf54eefcf957683ec61" Oct 11 03:01:49 crc kubenswrapper[4743]: E1011 03:01:49.094136 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 03:01:49 crc kubenswrapper[4743]: I1011 03:01:49.230121 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-7c8cf85677-4ftxm_9918e6d2-f9d4-4c2f-93ef-cc952577182b/prometheus-operator/0.log" Oct 11 03:01:49 crc kubenswrapper[4743]: I1011 03:01:49.479467 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-55f6745849-6fxfg_e62c0910-f3a4-4c85-9ad5-88f6fa5262df/prometheus-operator-admission-webhook/0.log" Oct 11 03:01:49 crc kubenswrapper[4743]: I1011 03:01:49.527702 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-55f6745849-chpjb_97d6cff1-f86b-4110-9c64-907a97ea4ceb/prometheus-operator-admission-webhook/0.log" Oct 11 03:01:49 crc kubenswrapper[4743]: I1011 03:01:49.707005 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-ui-dashboards-6584dc9448-tzxl5_c4bbc3ff-d45e-46ac-a8fc-6b75e1f5d342/observability-ui-dashboards/0.log" Oct 11 03:01:49 crc kubenswrapper[4743]: I1011 03:01:49.740158 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-cc5f78dfc-sgfrq_c4642856-48b2-4843-a11e-1a207a8c8efc/operator/0.log" Oct 11 03:01:49 
crc kubenswrapper[4743]: I1011 03:01:49.913133 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-54bc95c9fb-bdtnb_fe87b24f-db4d-49cd-a2de-ab949443ecea/perses-operator/0.log" Oct 11 03:01:51 crc kubenswrapper[4743]: I1011 03:01:51.483929 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-k5txp"] Oct 11 03:01:51 crc kubenswrapper[4743]: E1011 03:01:51.484534 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb164eff-c00c-4764-bb12-e371aaa02860" containerName="keystone-cron" Oct 11 03:01:51 crc kubenswrapper[4743]: I1011 03:01:51.484548 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb164eff-c00c-4764-bb12-e371aaa02860" containerName="keystone-cron" Oct 11 03:01:51 crc kubenswrapper[4743]: I1011 03:01:51.484821 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb164eff-c00c-4764-bb12-e371aaa02860" containerName="keystone-cron" Oct 11 03:01:51 crc kubenswrapper[4743]: I1011 03:01:51.486732 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k5txp" Oct 11 03:01:51 crc kubenswrapper[4743]: I1011 03:01:51.513938 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k5txp"] Oct 11 03:01:51 crc kubenswrapper[4743]: I1011 03:01:51.573954 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hm5c6\" (UniqueName: \"kubernetes.io/projected/9183d067-0293-4cfb-b274-8aaff386b391-kube-api-access-hm5c6\") pod \"redhat-marketplace-k5txp\" (UID: \"9183d067-0293-4cfb-b274-8aaff386b391\") " pod="openshift-marketplace/redhat-marketplace-k5txp" Oct 11 03:01:51 crc kubenswrapper[4743]: I1011 03:01:51.574116 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9183d067-0293-4cfb-b274-8aaff386b391-catalog-content\") pod \"redhat-marketplace-k5txp\" (UID: \"9183d067-0293-4cfb-b274-8aaff386b391\") " pod="openshift-marketplace/redhat-marketplace-k5txp" Oct 11 03:01:51 crc kubenswrapper[4743]: I1011 03:01:51.574139 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9183d067-0293-4cfb-b274-8aaff386b391-utilities\") pod \"redhat-marketplace-k5txp\" (UID: \"9183d067-0293-4cfb-b274-8aaff386b391\") " pod="openshift-marketplace/redhat-marketplace-k5txp" Oct 11 03:01:51 crc kubenswrapper[4743]: I1011 03:01:51.675784 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9183d067-0293-4cfb-b274-8aaff386b391-catalog-content\") pod \"redhat-marketplace-k5txp\" (UID: \"9183d067-0293-4cfb-b274-8aaff386b391\") " pod="openshift-marketplace/redhat-marketplace-k5txp" Oct 11 03:01:51 crc kubenswrapper[4743]: I1011 03:01:51.675834 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9183d067-0293-4cfb-b274-8aaff386b391-utilities\") pod \"redhat-marketplace-k5txp\" (UID: \"9183d067-0293-4cfb-b274-8aaff386b391\") " pod="openshift-marketplace/redhat-marketplace-k5txp" Oct 11 03:01:51 crc kubenswrapper[4743]: I1011 03:01:51.676038 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hm5c6\" (UniqueName: \"kubernetes.io/projected/9183d067-0293-4cfb-b274-8aaff386b391-kube-api-access-hm5c6\") pod \"redhat-marketplace-k5txp\" (UID: \"9183d067-0293-4cfb-b274-8aaff386b391\") " pod="openshift-marketplace/redhat-marketplace-k5txp" Oct 11 03:01:51 crc kubenswrapper[4743]: I1011 03:01:51.676490 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9183d067-0293-4cfb-b274-8aaff386b391-catalog-content\") pod \"redhat-marketplace-k5txp\" (UID: \"9183d067-0293-4cfb-b274-8aaff386b391\") " pod="openshift-marketplace/redhat-marketplace-k5txp" Oct 11 03:01:51 crc kubenswrapper[4743]: I1011 03:01:51.676574 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9183d067-0293-4cfb-b274-8aaff386b391-utilities\") pod \"redhat-marketplace-k5txp\" (UID: \"9183d067-0293-4cfb-b274-8aaff386b391\") " pod="openshift-marketplace/redhat-marketplace-k5txp" Oct 11 03:01:51 crc kubenswrapper[4743]: I1011 03:01:51.698504 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hm5c6\" (UniqueName: \"kubernetes.io/projected/9183d067-0293-4cfb-b274-8aaff386b391-kube-api-access-hm5c6\") pod \"redhat-marketplace-k5txp\" (UID: \"9183d067-0293-4cfb-b274-8aaff386b391\") " pod="openshift-marketplace/redhat-marketplace-k5txp" Oct 11 03:01:51 crc kubenswrapper[4743]: I1011 03:01:51.807845 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k5txp" Oct 11 03:01:52 crc kubenswrapper[4743]: I1011 03:01:52.324469 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k5txp"] Oct 11 03:01:53 crc kubenswrapper[4743]: I1011 03:01:53.085282 4743 generic.go:334] "Generic (PLEG): container finished" podID="9183d067-0293-4cfb-b274-8aaff386b391" containerID="09d45d0890d509286b222d72a7706cb1da8c53d0352e58e4b6b2b03a5604f65e" exitCode=0 Oct 11 03:01:53 crc kubenswrapper[4743]: I1011 03:01:53.085406 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k5txp" event={"ID":"9183d067-0293-4cfb-b274-8aaff386b391","Type":"ContainerDied","Data":"09d45d0890d509286b222d72a7706cb1da8c53d0352e58e4b6b2b03a5604f65e"} Oct 11 03:01:53 crc kubenswrapper[4743]: I1011 03:01:53.085646 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k5txp" event={"ID":"9183d067-0293-4cfb-b274-8aaff386b391","Type":"ContainerStarted","Data":"8be4348869d03fa9896f0f00ecb5a9ba3ba46bad3d62be72214eceda02eafc68"} Oct 11 03:01:53 crc kubenswrapper[4743]: I1011 03:01:53.087973 4743 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 11 03:01:55 crc kubenswrapper[4743]: I1011 03:01:55.110916 4743 generic.go:334] "Generic (PLEG): container finished" podID="9183d067-0293-4cfb-b274-8aaff386b391" containerID="ae87351241c450973e6ccaf8faed7fd882030b06db047f6833ce53284665e9ec" exitCode=0 Oct 11 03:01:55 crc kubenswrapper[4743]: I1011 03:01:55.111004 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k5txp" event={"ID":"9183d067-0293-4cfb-b274-8aaff386b391","Type":"ContainerDied","Data":"ae87351241c450973e6ccaf8faed7fd882030b06db047f6833ce53284665e9ec"} Oct 11 03:01:56 crc kubenswrapper[4743]: I1011 03:01:56.127008 4743 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-marketplace-k5txp" event={"ID":"9183d067-0293-4cfb-b274-8aaff386b391","Type":"ContainerStarted","Data":"b0810b47ac3b2fec2a092001b170f2152fdc6afc1ee76887d7310702eed4c554"} Oct 11 03:01:56 crc kubenswrapper[4743]: I1011 03:01:56.146000 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-k5txp" podStartSLOduration=2.503174535 podStartE2EDuration="5.145978788s" podCreationTimestamp="2025-10-11 03:01:51 +0000 UTC" firstStartedPulling="2025-10-11 03:01:53.087181963 +0000 UTC m=+7807.740162360" lastFinishedPulling="2025-10-11 03:01:55.729986206 +0000 UTC m=+7810.382966613" observedRunningTime="2025-10-11 03:01:56.143087876 +0000 UTC m=+7810.796068283" watchObservedRunningTime="2025-10-11 03:01:56.145978788 +0000 UTC m=+7810.798959195" Oct 11 03:02:01 crc kubenswrapper[4743]: I1011 03:02:01.808129 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-k5txp" Oct 11 03:02:01 crc kubenswrapper[4743]: I1011 03:02:01.808630 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-k5txp" Oct 11 03:02:01 crc kubenswrapper[4743]: I1011 03:02:01.869799 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-k5txp" Oct 11 03:02:02 crc kubenswrapper[4743]: I1011 03:02:02.092133 4743 scope.go:117] "RemoveContainer" containerID="5156e89fb638919c369d76a46669f0fc82773aee80ae1bf54eefcf957683ec61" Oct 11 03:02:02 crc kubenswrapper[4743]: E1011 03:02:02.092455 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 03:02:02 crc kubenswrapper[4743]: I1011 03:02:02.240476 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-k5txp" Oct 11 03:02:02 crc kubenswrapper[4743]: I1011 03:02:02.325840 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-k5txp"] Oct 11 03:02:03 crc kubenswrapper[4743]: I1011 03:02:03.429324 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-7857f779b4-t484n_d60b36a6-87ff-4325-a245-43b3dea4cfaf/kube-rbac-proxy/0.log" Oct 11 03:02:03 crc kubenswrapper[4743]: I1011 03:02:03.460087 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-7857f779b4-t484n_d60b36a6-87ff-4325-a245-43b3dea4cfaf/manager/0.log" Oct 11 03:02:04 crc kubenswrapper[4743]: I1011 03:02:04.205837 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-k5txp" podUID="9183d067-0293-4cfb-b274-8aaff386b391" containerName="registry-server" containerID="cri-o://b0810b47ac3b2fec2a092001b170f2152fdc6afc1ee76887d7310702eed4c554" gracePeriod=2 Oct 11 03:02:04 crc kubenswrapper[4743]: I1011 03:02:04.856849 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k5txp" Oct 11 03:02:05 crc kubenswrapper[4743]: I1011 03:02:05.006980 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9183d067-0293-4cfb-b274-8aaff386b391-catalog-content\") pod \"9183d067-0293-4cfb-b274-8aaff386b391\" (UID: \"9183d067-0293-4cfb-b274-8aaff386b391\") " Oct 11 03:02:05 crc kubenswrapper[4743]: I1011 03:02:05.007217 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hm5c6\" (UniqueName: \"kubernetes.io/projected/9183d067-0293-4cfb-b274-8aaff386b391-kube-api-access-hm5c6\") pod \"9183d067-0293-4cfb-b274-8aaff386b391\" (UID: \"9183d067-0293-4cfb-b274-8aaff386b391\") " Oct 11 03:02:05 crc kubenswrapper[4743]: I1011 03:02:05.007251 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9183d067-0293-4cfb-b274-8aaff386b391-utilities\") pod \"9183d067-0293-4cfb-b274-8aaff386b391\" (UID: \"9183d067-0293-4cfb-b274-8aaff386b391\") " Oct 11 03:02:05 crc kubenswrapper[4743]: I1011 03:02:05.008050 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9183d067-0293-4cfb-b274-8aaff386b391-utilities" (OuterVolumeSpecName: "utilities") pod "9183d067-0293-4cfb-b274-8aaff386b391" (UID: "9183d067-0293-4cfb-b274-8aaff386b391"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 03:02:05 crc kubenswrapper[4743]: I1011 03:02:05.022280 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9183d067-0293-4cfb-b274-8aaff386b391-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9183d067-0293-4cfb-b274-8aaff386b391" (UID: "9183d067-0293-4cfb-b274-8aaff386b391"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 03:02:05 crc kubenswrapper[4743]: I1011 03:02:05.028712 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9183d067-0293-4cfb-b274-8aaff386b391-kube-api-access-hm5c6" (OuterVolumeSpecName: "kube-api-access-hm5c6") pod "9183d067-0293-4cfb-b274-8aaff386b391" (UID: "9183d067-0293-4cfb-b274-8aaff386b391"). InnerVolumeSpecName "kube-api-access-hm5c6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 03:02:05 crc kubenswrapper[4743]: I1011 03:02:05.109684 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9183d067-0293-4cfb-b274-8aaff386b391-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 11 03:02:05 crc kubenswrapper[4743]: I1011 03:02:05.109725 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hm5c6\" (UniqueName: \"kubernetes.io/projected/9183d067-0293-4cfb-b274-8aaff386b391-kube-api-access-hm5c6\") on node \"crc\" DevicePath \"\"" Oct 11 03:02:05 crc kubenswrapper[4743]: I1011 03:02:05.109740 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9183d067-0293-4cfb-b274-8aaff386b391-utilities\") on node \"crc\" DevicePath \"\"" Oct 11 03:02:05 crc kubenswrapper[4743]: I1011 03:02:05.220334 4743 generic.go:334] "Generic (PLEG): container finished" podID="9183d067-0293-4cfb-b274-8aaff386b391" containerID="b0810b47ac3b2fec2a092001b170f2152fdc6afc1ee76887d7310702eed4c554" exitCode=0 Oct 11 03:02:05 crc kubenswrapper[4743]: I1011 03:02:05.220378 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k5txp" event={"ID":"9183d067-0293-4cfb-b274-8aaff386b391","Type":"ContainerDied","Data":"b0810b47ac3b2fec2a092001b170f2152fdc6afc1ee76887d7310702eed4c554"} Oct 11 03:02:05 crc kubenswrapper[4743]: I1011 03:02:05.220409 4743 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k5txp" Oct 11 03:02:05 crc kubenswrapper[4743]: I1011 03:02:05.220442 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k5txp" event={"ID":"9183d067-0293-4cfb-b274-8aaff386b391","Type":"ContainerDied","Data":"8be4348869d03fa9896f0f00ecb5a9ba3ba46bad3d62be72214eceda02eafc68"} Oct 11 03:02:05 crc kubenswrapper[4743]: I1011 03:02:05.220474 4743 scope.go:117] "RemoveContainer" containerID="b0810b47ac3b2fec2a092001b170f2152fdc6afc1ee76887d7310702eed4c554" Oct 11 03:02:05 crc kubenswrapper[4743]: I1011 03:02:05.246393 4743 scope.go:117] "RemoveContainer" containerID="ae87351241c450973e6ccaf8faed7fd882030b06db047f6833ce53284665e9ec" Oct 11 03:02:05 crc kubenswrapper[4743]: I1011 03:02:05.270212 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-k5txp"] Oct 11 03:02:05 crc kubenswrapper[4743]: I1011 03:02:05.285268 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-k5txp"] Oct 11 03:02:05 crc kubenswrapper[4743]: I1011 03:02:05.290414 4743 scope.go:117] "RemoveContainer" containerID="09d45d0890d509286b222d72a7706cb1da8c53d0352e58e4b6b2b03a5604f65e" Oct 11 03:02:05 crc kubenswrapper[4743]: I1011 03:02:05.331333 4743 scope.go:117] "RemoveContainer" containerID="b0810b47ac3b2fec2a092001b170f2152fdc6afc1ee76887d7310702eed4c554" Oct 11 03:02:05 crc kubenswrapper[4743]: E1011 03:02:05.332146 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0810b47ac3b2fec2a092001b170f2152fdc6afc1ee76887d7310702eed4c554\": container with ID starting with b0810b47ac3b2fec2a092001b170f2152fdc6afc1ee76887d7310702eed4c554 not found: ID does not exist" containerID="b0810b47ac3b2fec2a092001b170f2152fdc6afc1ee76887d7310702eed4c554" Oct 11 03:02:05 crc kubenswrapper[4743]: I1011 
03:02:05.332272 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0810b47ac3b2fec2a092001b170f2152fdc6afc1ee76887d7310702eed4c554"} err="failed to get container status \"b0810b47ac3b2fec2a092001b170f2152fdc6afc1ee76887d7310702eed4c554\": rpc error: code = NotFound desc = could not find container \"b0810b47ac3b2fec2a092001b170f2152fdc6afc1ee76887d7310702eed4c554\": container with ID starting with b0810b47ac3b2fec2a092001b170f2152fdc6afc1ee76887d7310702eed4c554 not found: ID does not exist" Oct 11 03:02:05 crc kubenswrapper[4743]: I1011 03:02:05.332374 4743 scope.go:117] "RemoveContainer" containerID="ae87351241c450973e6ccaf8faed7fd882030b06db047f6833ce53284665e9ec" Oct 11 03:02:05 crc kubenswrapper[4743]: E1011 03:02:05.332737 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae87351241c450973e6ccaf8faed7fd882030b06db047f6833ce53284665e9ec\": container with ID starting with ae87351241c450973e6ccaf8faed7fd882030b06db047f6833ce53284665e9ec not found: ID does not exist" containerID="ae87351241c450973e6ccaf8faed7fd882030b06db047f6833ce53284665e9ec" Oct 11 03:02:05 crc kubenswrapper[4743]: I1011 03:02:05.332840 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae87351241c450973e6ccaf8faed7fd882030b06db047f6833ce53284665e9ec"} err="failed to get container status \"ae87351241c450973e6ccaf8faed7fd882030b06db047f6833ce53284665e9ec\": rpc error: code = NotFound desc = could not find container \"ae87351241c450973e6ccaf8faed7fd882030b06db047f6833ce53284665e9ec\": container with ID starting with ae87351241c450973e6ccaf8faed7fd882030b06db047f6833ce53284665e9ec not found: ID does not exist" Oct 11 03:02:05 crc kubenswrapper[4743]: I1011 03:02:05.332929 4743 scope.go:117] "RemoveContainer" containerID="09d45d0890d509286b222d72a7706cb1da8c53d0352e58e4b6b2b03a5604f65e" Oct 11 03:02:05 crc 
kubenswrapper[4743]: E1011 03:02:05.333756 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09d45d0890d509286b222d72a7706cb1da8c53d0352e58e4b6b2b03a5604f65e\": container with ID starting with 09d45d0890d509286b222d72a7706cb1da8c53d0352e58e4b6b2b03a5604f65e not found: ID does not exist" containerID="09d45d0890d509286b222d72a7706cb1da8c53d0352e58e4b6b2b03a5604f65e" Oct 11 03:02:05 crc kubenswrapper[4743]: I1011 03:02:05.333898 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09d45d0890d509286b222d72a7706cb1da8c53d0352e58e4b6b2b03a5604f65e"} err="failed to get container status \"09d45d0890d509286b222d72a7706cb1da8c53d0352e58e4b6b2b03a5604f65e\": rpc error: code = NotFound desc = could not find container \"09d45d0890d509286b222d72a7706cb1da8c53d0352e58e4b6b2b03a5604f65e\": container with ID starting with 09d45d0890d509286b222d72a7706cb1da8c53d0352e58e4b6b2b03a5604f65e not found: ID does not exist" Oct 11 03:02:06 crc kubenswrapper[4743]: I1011 03:02:06.106531 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9183d067-0293-4cfb-b274-8aaff386b391" path="/var/lib/kubelet/pods/9183d067-0293-4cfb-b274-8aaff386b391/volumes" Oct 11 03:02:15 crc kubenswrapper[4743]: I1011 03:02:15.094691 4743 scope.go:117] "RemoveContainer" containerID="5156e89fb638919c369d76a46669f0fc82773aee80ae1bf54eefcf957683ec61" Oct 11 03:02:15 crc kubenswrapper[4743]: E1011 03:02:15.095474 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 03:02:22 crc 
kubenswrapper[4743]: E1011 03:02:22.928940 4743 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.106:45582->38.102.83.106:39201: write tcp 38.102.83.106:45582->38.102.83.106:39201: write: broken pipe Oct 11 03:02:27 crc kubenswrapper[4743]: I1011 03:02:27.092062 4743 scope.go:117] "RemoveContainer" containerID="5156e89fb638919c369d76a46669f0fc82773aee80ae1bf54eefcf957683ec61" Oct 11 03:02:27 crc kubenswrapper[4743]: E1011 03:02:27.094171 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 03:02:39 crc kubenswrapper[4743]: I1011 03:02:39.093421 4743 scope.go:117] "RemoveContainer" containerID="5156e89fb638919c369d76a46669f0fc82773aee80ae1bf54eefcf957683ec61" Oct 11 03:02:39 crc kubenswrapper[4743]: E1011 03:02:39.097504 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 03:02:50 crc kubenswrapper[4743]: I1011 03:02:50.111483 4743 scope.go:117] "RemoveContainer" containerID="5156e89fb638919c369d76a46669f0fc82773aee80ae1bf54eefcf957683ec61" Oct 11 03:02:50 crc kubenswrapper[4743]: E1011 03:02:50.112533 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 03:03:05 crc kubenswrapper[4743]: I1011 03:03:05.092844 4743 scope.go:117] "RemoveContainer" containerID="5156e89fb638919c369d76a46669f0fc82773aee80ae1bf54eefcf957683ec61" Oct 11 03:03:05 crc kubenswrapper[4743]: E1011 03:03:05.093631 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 03:03:20 crc kubenswrapper[4743]: I1011 03:03:20.095232 4743 scope.go:117] "RemoveContainer" containerID="5156e89fb638919c369d76a46669f0fc82773aee80ae1bf54eefcf957683ec61" Oct 11 03:03:20 crc kubenswrapper[4743]: E1011 03:03:20.096446 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 03:03:33 crc kubenswrapper[4743]: I1011 03:03:33.092087 4743 scope.go:117] "RemoveContainer" containerID="5156e89fb638919c369d76a46669f0fc82773aee80ae1bf54eefcf957683ec61" Oct 11 03:03:33 crc kubenswrapper[4743]: E1011 03:03:33.092829 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 03:03:48 crc kubenswrapper[4743]: I1011 03:03:48.093425 4743 scope.go:117] "RemoveContainer" containerID="5156e89fb638919c369d76a46669f0fc82773aee80ae1bf54eefcf957683ec61" Oct 11 03:03:48 crc kubenswrapper[4743]: E1011 03:03:48.094111 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 03:03:59 crc kubenswrapper[4743]: I1011 03:03:59.570340 4743 scope.go:117] "RemoveContainer" containerID="9dc7fcc1d4ae9fb30dfa876b976bade81cea21a40e14d1cbec11aae0e91205d9" Oct 11 03:04:00 crc kubenswrapper[4743]: I1011 03:04:00.091696 4743 scope.go:117] "RemoveContainer" containerID="5156e89fb638919c369d76a46669f0fc82773aee80ae1bf54eefcf957683ec61" Oct 11 03:04:00 crc kubenswrapper[4743]: E1011 03:04:00.091992 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 03:04:01 crc kubenswrapper[4743]: I1011 03:04:01.513279 4743 generic.go:334] "Generic (PLEG): container finished" podID="6ac11bdb-263b-4572-889e-311b00b61201" 
containerID="b09072a149bbfd55f7798798a0fdf816b98d19e5b0821a0efe726be18ad892d6" exitCode=0 Oct 11 03:04:01 crc kubenswrapper[4743]: I1011 03:04:01.513349 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dmn4d/must-gather-svv82" event={"ID":"6ac11bdb-263b-4572-889e-311b00b61201","Type":"ContainerDied","Data":"b09072a149bbfd55f7798798a0fdf816b98d19e5b0821a0efe726be18ad892d6"} Oct 11 03:04:01 crc kubenswrapper[4743]: I1011 03:04:01.514298 4743 scope.go:117] "RemoveContainer" containerID="b09072a149bbfd55f7798798a0fdf816b98d19e5b0821a0efe726be18ad892d6" Oct 11 03:04:01 crc kubenswrapper[4743]: I1011 03:04:01.806546 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-dmn4d_must-gather-svv82_6ac11bdb-263b-4572-889e-311b00b61201/gather/0.log" Oct 11 03:04:10 crc kubenswrapper[4743]: I1011 03:04:10.074737 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-dmn4d/must-gather-svv82"] Oct 11 03:04:10 crc kubenswrapper[4743]: I1011 03:04:10.075487 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-dmn4d/must-gather-svv82" podUID="6ac11bdb-263b-4572-889e-311b00b61201" containerName="copy" containerID="cri-o://c4099bc965f55fc6a5ff36fd5ffc6fef73824b3c110c4a6a35f80123916ace83" gracePeriod=2 Oct 11 03:04:10 crc kubenswrapper[4743]: I1011 03:04:10.089250 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-dmn4d/must-gather-svv82"] Oct 11 03:04:10 crc kubenswrapper[4743]: I1011 03:04:10.586457 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-dmn4d_must-gather-svv82_6ac11bdb-263b-4572-889e-311b00b61201/copy/0.log" Oct 11 03:04:10 crc kubenswrapper[4743]: I1011 03:04:10.587338 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dmn4d/must-gather-svv82" Oct 11 03:04:10 crc kubenswrapper[4743]: I1011 03:04:10.614950 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-dmn4d_must-gather-svv82_6ac11bdb-263b-4572-889e-311b00b61201/copy/0.log" Oct 11 03:04:10 crc kubenswrapper[4743]: I1011 03:04:10.615366 4743 generic.go:334] "Generic (PLEG): container finished" podID="6ac11bdb-263b-4572-889e-311b00b61201" containerID="c4099bc965f55fc6a5ff36fd5ffc6fef73824b3c110c4a6a35f80123916ace83" exitCode=143 Oct 11 03:04:10 crc kubenswrapper[4743]: I1011 03:04:10.615418 4743 scope.go:117] "RemoveContainer" containerID="c4099bc965f55fc6a5ff36fd5ffc6fef73824b3c110c4a6a35f80123916ace83" Oct 11 03:04:10 crc kubenswrapper[4743]: I1011 03:04:10.615476 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dmn4d/must-gather-svv82" Oct 11 03:04:10 crc kubenswrapper[4743]: I1011 03:04:10.648873 4743 scope.go:117] "RemoveContainer" containerID="b09072a149bbfd55f7798798a0fdf816b98d19e5b0821a0efe726be18ad892d6" Oct 11 03:04:10 crc kubenswrapper[4743]: I1011 03:04:10.692469 4743 scope.go:117] "RemoveContainer" containerID="c4099bc965f55fc6a5ff36fd5ffc6fef73824b3c110c4a6a35f80123916ace83" Oct 11 03:04:10 crc kubenswrapper[4743]: E1011 03:04:10.692961 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4099bc965f55fc6a5ff36fd5ffc6fef73824b3c110c4a6a35f80123916ace83\": container with ID starting with c4099bc965f55fc6a5ff36fd5ffc6fef73824b3c110c4a6a35f80123916ace83 not found: ID does not exist" containerID="c4099bc965f55fc6a5ff36fd5ffc6fef73824b3c110c4a6a35f80123916ace83" Oct 11 03:04:10 crc kubenswrapper[4743]: I1011 03:04:10.692999 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4099bc965f55fc6a5ff36fd5ffc6fef73824b3c110c4a6a35f80123916ace83"} err="failed 
to get container status \"c4099bc965f55fc6a5ff36fd5ffc6fef73824b3c110c4a6a35f80123916ace83\": rpc error: code = NotFound desc = could not find container \"c4099bc965f55fc6a5ff36fd5ffc6fef73824b3c110c4a6a35f80123916ace83\": container with ID starting with c4099bc965f55fc6a5ff36fd5ffc6fef73824b3c110c4a6a35f80123916ace83 not found: ID does not exist" Oct 11 03:04:10 crc kubenswrapper[4743]: I1011 03:04:10.693028 4743 scope.go:117] "RemoveContainer" containerID="b09072a149bbfd55f7798798a0fdf816b98d19e5b0821a0efe726be18ad892d6" Oct 11 03:04:10 crc kubenswrapper[4743]: E1011 03:04:10.693324 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b09072a149bbfd55f7798798a0fdf816b98d19e5b0821a0efe726be18ad892d6\": container with ID starting with b09072a149bbfd55f7798798a0fdf816b98d19e5b0821a0efe726be18ad892d6 not found: ID does not exist" containerID="b09072a149bbfd55f7798798a0fdf816b98d19e5b0821a0efe726be18ad892d6" Oct 11 03:04:10 crc kubenswrapper[4743]: I1011 03:04:10.693443 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b09072a149bbfd55f7798798a0fdf816b98d19e5b0821a0efe726be18ad892d6"} err="failed to get container status \"b09072a149bbfd55f7798798a0fdf816b98d19e5b0821a0efe726be18ad892d6\": rpc error: code = NotFound desc = could not find container \"b09072a149bbfd55f7798798a0fdf816b98d19e5b0821a0efe726be18ad892d6\": container with ID starting with b09072a149bbfd55f7798798a0fdf816b98d19e5b0821a0efe726be18ad892d6 not found: ID does not exist" Oct 11 03:04:10 crc kubenswrapper[4743]: I1011 03:04:10.721501 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwnvm\" (UniqueName: \"kubernetes.io/projected/6ac11bdb-263b-4572-889e-311b00b61201-kube-api-access-kwnvm\") pod \"6ac11bdb-263b-4572-889e-311b00b61201\" (UID: \"6ac11bdb-263b-4572-889e-311b00b61201\") " Oct 11 03:04:10 crc kubenswrapper[4743]: 
I1011 03:04:10.721559 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6ac11bdb-263b-4572-889e-311b00b61201-must-gather-output\") pod \"6ac11bdb-263b-4572-889e-311b00b61201\" (UID: \"6ac11bdb-263b-4572-889e-311b00b61201\") " Oct 11 03:04:10 crc kubenswrapper[4743]: I1011 03:04:10.728486 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ac11bdb-263b-4572-889e-311b00b61201-kube-api-access-kwnvm" (OuterVolumeSpecName: "kube-api-access-kwnvm") pod "6ac11bdb-263b-4572-889e-311b00b61201" (UID: "6ac11bdb-263b-4572-889e-311b00b61201"). InnerVolumeSpecName "kube-api-access-kwnvm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 03:04:10 crc kubenswrapper[4743]: I1011 03:04:10.823767 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwnvm\" (UniqueName: \"kubernetes.io/projected/6ac11bdb-263b-4572-889e-311b00b61201-kube-api-access-kwnvm\") on node \"crc\" DevicePath \"\"" Oct 11 03:04:10 crc kubenswrapper[4743]: I1011 03:04:10.933335 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ac11bdb-263b-4572-889e-311b00b61201-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "6ac11bdb-263b-4572-889e-311b00b61201" (UID: "6ac11bdb-263b-4572-889e-311b00b61201"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 03:04:11 crc kubenswrapper[4743]: I1011 03:04:11.031611 4743 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6ac11bdb-263b-4572-889e-311b00b61201-must-gather-output\") on node \"crc\" DevicePath \"\"" Oct 11 03:04:11 crc kubenswrapper[4743]: I1011 03:04:11.092440 4743 scope.go:117] "RemoveContainer" containerID="5156e89fb638919c369d76a46669f0fc82773aee80ae1bf54eefcf957683ec61" Oct 11 03:04:11 crc kubenswrapper[4743]: E1011 03:04:11.092741 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 03:04:12 crc kubenswrapper[4743]: I1011 03:04:12.108070 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ac11bdb-263b-4572-889e-311b00b61201" path="/var/lib/kubelet/pods/6ac11bdb-263b-4572-889e-311b00b61201/volumes" Oct 11 03:04:26 crc kubenswrapper[4743]: I1011 03:04:26.101573 4743 scope.go:117] "RemoveContainer" containerID="5156e89fb638919c369d76a46669f0fc82773aee80ae1bf54eefcf957683ec61" Oct 11 03:04:26 crc kubenswrapper[4743]: E1011 03:04:26.104118 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 03:04:41 crc kubenswrapper[4743]: I1011 03:04:41.092854 4743 
scope.go:117] "RemoveContainer" containerID="5156e89fb638919c369d76a46669f0fc82773aee80ae1bf54eefcf957683ec61" Oct 11 03:04:41 crc kubenswrapper[4743]: E1011 03:04:41.093522 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 03:04:56 crc kubenswrapper[4743]: I1011 03:04:56.111728 4743 scope.go:117] "RemoveContainer" containerID="5156e89fb638919c369d76a46669f0fc82773aee80ae1bf54eefcf957683ec61" Oct 11 03:04:56 crc kubenswrapper[4743]: E1011 03:04:56.117813 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 03:04:59 crc kubenswrapper[4743]: I1011 03:04:59.636384 4743 scope.go:117] "RemoveContainer" containerID="d6de51480ebcaae0fb768156c5d7cd97e768d82d4a43b1bdf70f3871080fcd2e" Oct 11 03:05:08 crc kubenswrapper[4743]: I1011 03:05:08.091588 4743 scope.go:117] "RemoveContainer" containerID="5156e89fb638919c369d76a46669f0fc82773aee80ae1bf54eefcf957683ec61" Oct 11 03:05:08 crc kubenswrapper[4743]: E1011 03:05:08.093547 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 03:05:09 crc kubenswrapper[4743]: I1011 03:05:09.542902 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-rvqc2/must-gather-ns9w6"] Oct 11 03:05:09 crc kubenswrapper[4743]: E1011 03:05:09.553327 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ac11bdb-263b-4572-889e-311b00b61201" containerName="copy" Oct 11 03:05:09 crc kubenswrapper[4743]: I1011 03:05:09.553345 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ac11bdb-263b-4572-889e-311b00b61201" containerName="copy" Oct 11 03:05:09 crc kubenswrapper[4743]: E1011 03:05:09.553360 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ac11bdb-263b-4572-889e-311b00b61201" containerName="gather" Oct 11 03:05:09 crc kubenswrapper[4743]: I1011 03:05:09.553365 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ac11bdb-263b-4572-889e-311b00b61201" containerName="gather" Oct 11 03:05:09 crc kubenswrapper[4743]: E1011 03:05:09.553374 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9183d067-0293-4cfb-b274-8aaff386b391" containerName="extract-content" Oct 11 03:05:09 crc kubenswrapper[4743]: I1011 03:05:09.553381 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="9183d067-0293-4cfb-b274-8aaff386b391" containerName="extract-content" Oct 11 03:05:09 crc kubenswrapper[4743]: E1011 03:05:09.553391 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9183d067-0293-4cfb-b274-8aaff386b391" containerName="registry-server" Oct 11 03:05:09 crc kubenswrapper[4743]: I1011 03:05:09.553396 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="9183d067-0293-4cfb-b274-8aaff386b391" containerName="registry-server" Oct 11 03:05:09 crc kubenswrapper[4743]: E1011 03:05:09.553421 
4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9183d067-0293-4cfb-b274-8aaff386b391" containerName="extract-utilities" Oct 11 03:05:09 crc kubenswrapper[4743]: I1011 03:05:09.553429 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="9183d067-0293-4cfb-b274-8aaff386b391" containerName="extract-utilities" Oct 11 03:05:09 crc kubenswrapper[4743]: I1011 03:05:09.553648 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ac11bdb-263b-4572-889e-311b00b61201" containerName="gather" Oct 11 03:05:09 crc kubenswrapper[4743]: I1011 03:05:09.553664 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="9183d067-0293-4cfb-b274-8aaff386b391" containerName="registry-server" Oct 11 03:05:09 crc kubenswrapper[4743]: I1011 03:05:09.553683 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ac11bdb-263b-4572-889e-311b00b61201" containerName="copy" Oct 11 03:05:09 crc kubenswrapper[4743]: I1011 03:05:09.554837 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rvqc2/must-gather-ns9w6" Oct 11 03:05:09 crc kubenswrapper[4743]: I1011 03:05:09.559209 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-rvqc2"/"kube-root-ca.crt" Oct 11 03:05:09 crc kubenswrapper[4743]: I1011 03:05:09.559598 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-rvqc2"/"openshift-service-ca.crt" Oct 11 03:05:09 crc kubenswrapper[4743]: I1011 03:05:09.565371 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-rvqc2/must-gather-ns9w6"] Oct 11 03:05:09 crc kubenswrapper[4743]: I1011 03:05:09.594926 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4p5r\" (UniqueName: \"kubernetes.io/projected/f7694f07-e76c-472a-9ac4-7c606ca08f28-kube-api-access-x4p5r\") pod \"must-gather-ns9w6\" (UID: \"f7694f07-e76c-472a-9ac4-7c606ca08f28\") " pod="openshift-must-gather-rvqc2/must-gather-ns9w6" Oct 11 03:05:09 crc kubenswrapper[4743]: I1011 03:05:09.595530 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f7694f07-e76c-472a-9ac4-7c606ca08f28-must-gather-output\") pod \"must-gather-ns9w6\" (UID: \"f7694f07-e76c-472a-9ac4-7c606ca08f28\") " pod="openshift-must-gather-rvqc2/must-gather-ns9w6" Oct 11 03:05:09 crc kubenswrapper[4743]: I1011 03:05:09.696453 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4p5r\" (UniqueName: \"kubernetes.io/projected/f7694f07-e76c-472a-9ac4-7c606ca08f28-kube-api-access-x4p5r\") pod \"must-gather-ns9w6\" (UID: \"f7694f07-e76c-472a-9ac4-7c606ca08f28\") " pod="openshift-must-gather-rvqc2/must-gather-ns9w6" Oct 11 03:05:09 crc kubenswrapper[4743]: I1011 03:05:09.696527 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f7694f07-e76c-472a-9ac4-7c606ca08f28-must-gather-output\") pod \"must-gather-ns9w6\" (UID: \"f7694f07-e76c-472a-9ac4-7c606ca08f28\") " pod="openshift-must-gather-rvqc2/must-gather-ns9w6" Oct 11 03:05:09 crc kubenswrapper[4743]: I1011 03:05:09.697059 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f7694f07-e76c-472a-9ac4-7c606ca08f28-must-gather-output\") pod \"must-gather-ns9w6\" (UID: \"f7694f07-e76c-472a-9ac4-7c606ca08f28\") " pod="openshift-must-gather-rvqc2/must-gather-ns9w6" Oct 11 03:05:09 crc kubenswrapper[4743]: I1011 03:05:09.715255 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4p5r\" (UniqueName: \"kubernetes.io/projected/f7694f07-e76c-472a-9ac4-7c606ca08f28-kube-api-access-x4p5r\") pod \"must-gather-ns9w6\" (UID: \"f7694f07-e76c-472a-9ac4-7c606ca08f28\") " pod="openshift-must-gather-rvqc2/must-gather-ns9w6" Oct 11 03:05:09 crc kubenswrapper[4743]: I1011 03:05:09.888767 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rvqc2/must-gather-ns9w6" Oct 11 03:05:10 crc kubenswrapper[4743]: I1011 03:05:10.407253 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-rvqc2/must-gather-ns9w6"] Oct 11 03:05:11 crc kubenswrapper[4743]: I1011 03:05:11.284106 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rvqc2/must-gather-ns9w6" event={"ID":"f7694f07-e76c-472a-9ac4-7c606ca08f28","Type":"ContainerStarted","Data":"ede8ed1b5cfbb64a76ed455f957807cfe6ea6b7af9c3977ae980a81eb9fe7b79"} Oct 11 03:05:11 crc kubenswrapper[4743]: I1011 03:05:11.284432 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rvqc2/must-gather-ns9w6" event={"ID":"f7694f07-e76c-472a-9ac4-7c606ca08f28","Type":"ContainerStarted","Data":"9fa23bd7fa6e38fcaee07f378bf8e4521ce45a7ee0dd7b32573a76ac02e46437"} Oct 11 03:05:11 crc kubenswrapper[4743]: I1011 03:05:11.284448 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rvqc2/must-gather-ns9w6" event={"ID":"f7694f07-e76c-472a-9ac4-7c606ca08f28","Type":"ContainerStarted","Data":"418da09532cfda5c29f94fa54289483425cd92f875fe47149000c7710f6e5b99"} Oct 11 03:05:11 crc kubenswrapper[4743]: I1011 03:05:11.313482 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-rvqc2/must-gather-ns9w6" podStartSLOduration=2.313463742 podStartE2EDuration="2.313463742s" podCreationTimestamp="2025-10-11 03:05:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 03:05:11.304059579 +0000 UTC m=+8005.957039976" watchObservedRunningTime="2025-10-11 03:05:11.313463742 +0000 UTC m=+8005.966444139" Oct 11 03:05:15 crc kubenswrapper[4743]: I1011 03:05:15.150571 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-rvqc2/crc-debug-4sgb6"] Oct 11 03:05:15 crc kubenswrapper[4743]: 
I1011 03:05:15.153084 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rvqc2/crc-debug-4sgb6" Oct 11 03:05:15 crc kubenswrapper[4743]: I1011 03:05:15.156620 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-rvqc2"/"default-dockercfg-cjq9f" Oct 11 03:05:15 crc kubenswrapper[4743]: I1011 03:05:15.237936 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/04f57e67-7f28-4950-8221-55ed092efeb5-host\") pod \"crc-debug-4sgb6\" (UID: \"04f57e67-7f28-4950-8221-55ed092efeb5\") " pod="openshift-must-gather-rvqc2/crc-debug-4sgb6" Oct 11 03:05:15 crc kubenswrapper[4743]: I1011 03:05:15.238067 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qwpr\" (UniqueName: \"kubernetes.io/projected/04f57e67-7f28-4950-8221-55ed092efeb5-kube-api-access-6qwpr\") pod \"crc-debug-4sgb6\" (UID: \"04f57e67-7f28-4950-8221-55ed092efeb5\") " pod="openshift-must-gather-rvqc2/crc-debug-4sgb6" Oct 11 03:05:15 crc kubenswrapper[4743]: I1011 03:05:15.340376 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/04f57e67-7f28-4950-8221-55ed092efeb5-host\") pod \"crc-debug-4sgb6\" (UID: \"04f57e67-7f28-4950-8221-55ed092efeb5\") " pod="openshift-must-gather-rvqc2/crc-debug-4sgb6" Oct 11 03:05:15 crc kubenswrapper[4743]: I1011 03:05:15.340519 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qwpr\" (UniqueName: \"kubernetes.io/projected/04f57e67-7f28-4950-8221-55ed092efeb5-kube-api-access-6qwpr\") pod \"crc-debug-4sgb6\" (UID: \"04f57e67-7f28-4950-8221-55ed092efeb5\") " pod="openshift-must-gather-rvqc2/crc-debug-4sgb6" Oct 11 03:05:15 crc kubenswrapper[4743]: I1011 03:05:15.340517 4743 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/04f57e67-7f28-4950-8221-55ed092efeb5-host\") pod \"crc-debug-4sgb6\" (UID: \"04f57e67-7f28-4950-8221-55ed092efeb5\") " pod="openshift-must-gather-rvqc2/crc-debug-4sgb6" Oct 11 03:05:15 crc kubenswrapper[4743]: I1011 03:05:15.361414 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qwpr\" (UniqueName: \"kubernetes.io/projected/04f57e67-7f28-4950-8221-55ed092efeb5-kube-api-access-6qwpr\") pod \"crc-debug-4sgb6\" (UID: \"04f57e67-7f28-4950-8221-55ed092efeb5\") " pod="openshift-must-gather-rvqc2/crc-debug-4sgb6" Oct 11 03:05:15 crc kubenswrapper[4743]: I1011 03:05:15.481152 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rvqc2/crc-debug-4sgb6" Oct 11 03:05:15 crc kubenswrapper[4743]: W1011 03:05:15.530439 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04f57e67_7f28_4950_8221_55ed092efeb5.slice/crio-1e4c7e94fcd0215bd0df81558f6e7e7215fa563b182ddded59834e6924de0c02 WatchSource:0}: Error finding container 1e4c7e94fcd0215bd0df81558f6e7e7215fa563b182ddded59834e6924de0c02: Status 404 returned error can't find the container with id 1e4c7e94fcd0215bd0df81558f6e7e7215fa563b182ddded59834e6924de0c02 Oct 11 03:05:16 crc kubenswrapper[4743]: I1011 03:05:16.343602 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rvqc2/crc-debug-4sgb6" event={"ID":"04f57e67-7f28-4950-8221-55ed092efeb5","Type":"ContainerStarted","Data":"3327c6af83fa2b5d64c505bc8644955fea821ba5bd794bbfd4d631229ab3e40d"} Oct 11 03:05:16 crc kubenswrapper[4743]: I1011 03:05:16.345083 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rvqc2/crc-debug-4sgb6" event={"ID":"04f57e67-7f28-4950-8221-55ed092efeb5","Type":"ContainerStarted","Data":"1e4c7e94fcd0215bd0df81558f6e7e7215fa563b182ddded59834e6924de0c02"} Oct 
11 03:05:16 crc kubenswrapper[4743]: I1011 03:05:16.377494 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-rvqc2/crc-debug-4sgb6" podStartSLOduration=1.377472918 podStartE2EDuration="1.377472918s" podCreationTimestamp="2025-10-11 03:05:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-11 03:05:16.37068732 +0000 UTC m=+8011.023667727" watchObservedRunningTime="2025-10-11 03:05:16.377472918 +0000 UTC m=+8011.030453315" Oct 11 03:05:20 crc kubenswrapper[4743]: I1011 03:05:20.092400 4743 scope.go:117] "RemoveContainer" containerID="5156e89fb638919c369d76a46669f0fc82773aee80ae1bf54eefcf957683ec61" Oct 11 03:05:20 crc kubenswrapper[4743]: E1011 03:05:20.093182 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 03:05:34 crc kubenswrapper[4743]: I1011 03:05:34.099623 4743 scope.go:117] "RemoveContainer" containerID="5156e89fb638919c369d76a46669f0fc82773aee80ae1bf54eefcf957683ec61" Oct 11 03:05:34 crc kubenswrapper[4743]: E1011 03:05:34.100609 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 03:05:45 crc kubenswrapper[4743]: I1011 03:05:45.092635 4743 scope.go:117] 
"RemoveContainer" containerID="5156e89fb638919c369d76a46669f0fc82773aee80ae1bf54eefcf957683ec61" Oct 11 03:05:45 crc kubenswrapper[4743]: I1011 03:05:45.623772 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" event={"ID":"add92263-e252-446b-95de-092585b4357f","Type":"ContainerStarted","Data":"62fef52b9bc704a24e77d3838ec6b6ea99e34891429f790a37c7da87d4fa7a3f"} Oct 11 03:06:01 crc kubenswrapper[4743]: I1011 03:06:01.803087 4743 generic.go:334] "Generic (PLEG): container finished" podID="04f57e67-7f28-4950-8221-55ed092efeb5" containerID="3327c6af83fa2b5d64c505bc8644955fea821ba5bd794bbfd4d631229ab3e40d" exitCode=0 Oct 11 03:06:01 crc kubenswrapper[4743]: I1011 03:06:01.803168 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rvqc2/crc-debug-4sgb6" event={"ID":"04f57e67-7f28-4950-8221-55ed092efeb5","Type":"ContainerDied","Data":"3327c6af83fa2b5d64c505bc8644955fea821ba5bd794bbfd4d631229ab3e40d"} Oct 11 03:06:02 crc kubenswrapper[4743]: I1011 03:06:02.958835 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rvqc2/crc-debug-4sgb6" Oct 11 03:06:02 crc kubenswrapper[4743]: I1011 03:06:02.993218 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qwpr\" (UniqueName: \"kubernetes.io/projected/04f57e67-7f28-4950-8221-55ed092efeb5-kube-api-access-6qwpr\") pod \"04f57e67-7f28-4950-8221-55ed092efeb5\" (UID: \"04f57e67-7f28-4950-8221-55ed092efeb5\") " Oct 11 03:06:02 crc kubenswrapper[4743]: I1011 03:06:02.993509 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/04f57e67-7f28-4950-8221-55ed092efeb5-host\") pod \"04f57e67-7f28-4950-8221-55ed092efeb5\" (UID: \"04f57e67-7f28-4950-8221-55ed092efeb5\") " Oct 11 03:06:02 crc kubenswrapper[4743]: I1011 03:06:02.993612 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/04f57e67-7f28-4950-8221-55ed092efeb5-host" (OuterVolumeSpecName: "host") pod "04f57e67-7f28-4950-8221-55ed092efeb5" (UID: "04f57e67-7f28-4950-8221-55ed092efeb5"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 03:06:02 crc kubenswrapper[4743]: I1011 03:06:02.994130 4743 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/04f57e67-7f28-4950-8221-55ed092efeb5-host\") on node \"crc\" DevicePath \"\"" Oct 11 03:06:02 crc kubenswrapper[4743]: I1011 03:06:02.995626 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-rvqc2/crc-debug-4sgb6"] Oct 11 03:06:02 crc kubenswrapper[4743]: I1011 03:06:02.999587 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04f57e67-7f28-4950-8221-55ed092efeb5-kube-api-access-6qwpr" (OuterVolumeSpecName: "kube-api-access-6qwpr") pod "04f57e67-7f28-4950-8221-55ed092efeb5" (UID: "04f57e67-7f28-4950-8221-55ed092efeb5"). 
InnerVolumeSpecName "kube-api-access-6qwpr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 03:06:03 crc kubenswrapper[4743]: I1011 03:06:03.007999 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-rvqc2/crc-debug-4sgb6"] Oct 11 03:06:03 crc kubenswrapper[4743]: I1011 03:06:03.095939 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qwpr\" (UniqueName: \"kubernetes.io/projected/04f57e67-7f28-4950-8221-55ed092efeb5-kube-api-access-6qwpr\") on node \"crc\" DevicePath \"\"" Oct 11 03:06:03 crc kubenswrapper[4743]: I1011 03:06:03.831301 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e4c7e94fcd0215bd0df81558f6e7e7215fa563b182ddded59834e6924de0c02" Oct 11 03:06:03 crc kubenswrapper[4743]: I1011 03:06:03.831714 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rvqc2/crc-debug-4sgb6" Oct 11 03:06:04 crc kubenswrapper[4743]: I1011 03:06:04.105149 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04f57e67-7f28-4950-8221-55ed092efeb5" path="/var/lib/kubelet/pods/04f57e67-7f28-4950-8221-55ed092efeb5/volumes" Oct 11 03:06:04 crc kubenswrapper[4743]: I1011 03:06:04.194032 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-rvqc2/crc-debug-hzz2b"] Oct 11 03:06:04 crc kubenswrapper[4743]: E1011 03:06:04.195253 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04f57e67-7f28-4950-8221-55ed092efeb5" containerName="container-00" Oct 11 03:06:04 crc kubenswrapper[4743]: I1011 03:06:04.195353 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="04f57e67-7f28-4950-8221-55ed092efeb5" containerName="container-00" Oct 11 03:06:04 crc kubenswrapper[4743]: I1011 03:06:04.195675 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="04f57e67-7f28-4950-8221-55ed092efeb5" containerName="container-00" Oct 11 03:06:04 crc 
kubenswrapper[4743]: I1011 03:06:04.196482 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rvqc2/crc-debug-hzz2b" Oct 11 03:06:04 crc kubenswrapper[4743]: I1011 03:06:04.199093 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-rvqc2"/"default-dockercfg-cjq9f" Oct 11 03:06:04 crc kubenswrapper[4743]: I1011 03:06:04.219044 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcngv\" (UniqueName: \"kubernetes.io/projected/1bd8aec5-f490-4c6d-af51-3d7bf4580cba-kube-api-access-xcngv\") pod \"crc-debug-hzz2b\" (UID: \"1bd8aec5-f490-4c6d-af51-3d7bf4580cba\") " pod="openshift-must-gather-rvqc2/crc-debug-hzz2b" Oct 11 03:06:04 crc kubenswrapper[4743]: I1011 03:06:04.219168 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1bd8aec5-f490-4c6d-af51-3d7bf4580cba-host\") pod \"crc-debug-hzz2b\" (UID: \"1bd8aec5-f490-4c6d-af51-3d7bf4580cba\") " pod="openshift-must-gather-rvqc2/crc-debug-hzz2b" Oct 11 03:06:04 crc kubenswrapper[4743]: I1011 03:06:04.321349 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcngv\" (UniqueName: \"kubernetes.io/projected/1bd8aec5-f490-4c6d-af51-3d7bf4580cba-kube-api-access-xcngv\") pod \"crc-debug-hzz2b\" (UID: \"1bd8aec5-f490-4c6d-af51-3d7bf4580cba\") " pod="openshift-must-gather-rvqc2/crc-debug-hzz2b" Oct 11 03:06:04 crc kubenswrapper[4743]: I1011 03:06:04.321462 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1bd8aec5-f490-4c6d-af51-3d7bf4580cba-host\") pod \"crc-debug-hzz2b\" (UID: \"1bd8aec5-f490-4c6d-af51-3d7bf4580cba\") " pod="openshift-must-gather-rvqc2/crc-debug-hzz2b" Oct 11 03:06:04 crc kubenswrapper[4743]: I1011 03:06:04.321670 4743 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1bd8aec5-f490-4c6d-af51-3d7bf4580cba-host\") pod \"crc-debug-hzz2b\" (UID: \"1bd8aec5-f490-4c6d-af51-3d7bf4580cba\") " pod="openshift-must-gather-rvqc2/crc-debug-hzz2b" Oct 11 03:06:04 crc kubenswrapper[4743]: I1011 03:06:04.348753 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcngv\" (UniqueName: \"kubernetes.io/projected/1bd8aec5-f490-4c6d-af51-3d7bf4580cba-kube-api-access-xcngv\") pod \"crc-debug-hzz2b\" (UID: \"1bd8aec5-f490-4c6d-af51-3d7bf4580cba\") " pod="openshift-must-gather-rvqc2/crc-debug-hzz2b" Oct 11 03:06:04 crc kubenswrapper[4743]: I1011 03:06:04.514265 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rvqc2/crc-debug-hzz2b" Oct 11 03:06:04 crc kubenswrapper[4743]: I1011 03:06:04.841653 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rvqc2/crc-debug-hzz2b" event={"ID":"1bd8aec5-f490-4c6d-af51-3d7bf4580cba","Type":"ContainerStarted","Data":"d487ea63a094267e60c62f4218d63453314befae8b468f2841ec932749c5021c"} Oct 11 03:06:05 crc kubenswrapper[4743]: I1011 03:06:05.852535 4743 generic.go:334] "Generic (PLEG): container finished" podID="1bd8aec5-f490-4c6d-af51-3d7bf4580cba" containerID="70a777b6891e48fef92e453fae0a18e8f55e591276bd27ab54f2fb51bf835e7d" exitCode=0 Oct 11 03:06:05 crc kubenswrapper[4743]: I1011 03:06:05.852632 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rvqc2/crc-debug-hzz2b" event={"ID":"1bd8aec5-f490-4c6d-af51-3d7bf4580cba","Type":"ContainerDied","Data":"70a777b6891e48fef92e453fae0a18e8f55e591276bd27ab54f2fb51bf835e7d"} Oct 11 03:06:06 crc kubenswrapper[4743]: I1011 03:06:06.998973 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rvqc2/crc-debug-hzz2b" Oct 11 03:06:07 crc kubenswrapper[4743]: I1011 03:06:07.076424 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcngv\" (UniqueName: \"kubernetes.io/projected/1bd8aec5-f490-4c6d-af51-3d7bf4580cba-kube-api-access-xcngv\") pod \"1bd8aec5-f490-4c6d-af51-3d7bf4580cba\" (UID: \"1bd8aec5-f490-4c6d-af51-3d7bf4580cba\") " Oct 11 03:06:07 crc kubenswrapper[4743]: I1011 03:06:07.076575 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1bd8aec5-f490-4c6d-af51-3d7bf4580cba-host\") pod \"1bd8aec5-f490-4c6d-af51-3d7bf4580cba\" (UID: \"1bd8aec5-f490-4c6d-af51-3d7bf4580cba\") " Oct 11 03:06:07 crc kubenswrapper[4743]: I1011 03:06:07.077087 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1bd8aec5-f490-4c6d-af51-3d7bf4580cba-host" (OuterVolumeSpecName: "host") pod "1bd8aec5-f490-4c6d-af51-3d7bf4580cba" (UID: "1bd8aec5-f490-4c6d-af51-3d7bf4580cba"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 03:06:07 crc kubenswrapper[4743]: I1011 03:06:07.077501 4743 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1bd8aec5-f490-4c6d-af51-3d7bf4580cba-host\") on node \"crc\" DevicePath \"\"" Oct 11 03:06:07 crc kubenswrapper[4743]: I1011 03:06:07.084221 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bd8aec5-f490-4c6d-af51-3d7bf4580cba-kube-api-access-xcngv" (OuterVolumeSpecName: "kube-api-access-xcngv") pod "1bd8aec5-f490-4c6d-af51-3d7bf4580cba" (UID: "1bd8aec5-f490-4c6d-af51-3d7bf4580cba"). InnerVolumeSpecName "kube-api-access-xcngv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 03:06:07 crc kubenswrapper[4743]: I1011 03:06:07.178265 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcngv\" (UniqueName: \"kubernetes.io/projected/1bd8aec5-f490-4c6d-af51-3d7bf4580cba-kube-api-access-xcngv\") on node \"crc\" DevicePath \"\"" Oct 11 03:06:07 crc kubenswrapper[4743]: I1011 03:06:07.877964 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rvqc2/crc-debug-hzz2b" event={"ID":"1bd8aec5-f490-4c6d-af51-3d7bf4580cba","Type":"ContainerDied","Data":"d487ea63a094267e60c62f4218d63453314befae8b468f2841ec932749c5021c"} Oct 11 03:06:07 crc kubenswrapper[4743]: I1011 03:06:07.878275 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d487ea63a094267e60c62f4218d63453314befae8b468f2841ec932749c5021c" Oct 11 03:06:07 crc kubenswrapper[4743]: I1011 03:06:07.878047 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rvqc2/crc-debug-hzz2b" Oct 11 03:06:08 crc kubenswrapper[4743]: I1011 03:06:08.306434 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-rvqc2/crc-debug-hzz2b"] Oct 11 03:06:08 crc kubenswrapper[4743]: I1011 03:06:08.315796 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-rvqc2/crc-debug-hzz2b"] Oct 11 03:06:09 crc kubenswrapper[4743]: I1011 03:06:09.549711 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-rvqc2/crc-debug-pmgkl"] Oct 11 03:06:09 crc kubenswrapper[4743]: E1011 03:06:09.550431 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bd8aec5-f490-4c6d-af51-3d7bf4580cba" containerName="container-00" Oct 11 03:06:09 crc kubenswrapper[4743]: I1011 03:06:09.550444 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bd8aec5-f490-4c6d-af51-3d7bf4580cba" containerName="container-00" Oct 11 03:06:09 crc 
kubenswrapper[4743]: I1011 03:06:09.550682 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bd8aec5-f490-4c6d-af51-3d7bf4580cba" containerName="container-00" Oct 11 03:06:09 crc kubenswrapper[4743]: I1011 03:06:09.551443 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rvqc2/crc-debug-pmgkl" Oct 11 03:06:09 crc kubenswrapper[4743]: I1011 03:06:09.553310 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-rvqc2"/"default-dockercfg-cjq9f" Oct 11 03:06:09 crc kubenswrapper[4743]: I1011 03:06:09.630010 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjhkd\" (UniqueName: \"kubernetes.io/projected/3f0e7754-386f-4be3-8108-ac89f6b627d6-kube-api-access-vjhkd\") pod \"crc-debug-pmgkl\" (UID: \"3f0e7754-386f-4be3-8108-ac89f6b627d6\") " pod="openshift-must-gather-rvqc2/crc-debug-pmgkl" Oct 11 03:06:09 crc kubenswrapper[4743]: I1011 03:06:09.630178 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3f0e7754-386f-4be3-8108-ac89f6b627d6-host\") pod \"crc-debug-pmgkl\" (UID: \"3f0e7754-386f-4be3-8108-ac89f6b627d6\") " pod="openshift-must-gather-rvqc2/crc-debug-pmgkl" Oct 11 03:06:09 crc kubenswrapper[4743]: I1011 03:06:09.732806 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3f0e7754-386f-4be3-8108-ac89f6b627d6-host\") pod \"crc-debug-pmgkl\" (UID: \"3f0e7754-386f-4be3-8108-ac89f6b627d6\") " pod="openshift-must-gather-rvqc2/crc-debug-pmgkl" Oct 11 03:06:09 crc kubenswrapper[4743]: I1011 03:06:09.732937 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3f0e7754-386f-4be3-8108-ac89f6b627d6-host\") pod \"crc-debug-pmgkl\" (UID: 
\"3f0e7754-386f-4be3-8108-ac89f6b627d6\") " pod="openshift-must-gather-rvqc2/crc-debug-pmgkl" Oct 11 03:06:09 crc kubenswrapper[4743]: I1011 03:06:09.733075 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjhkd\" (UniqueName: \"kubernetes.io/projected/3f0e7754-386f-4be3-8108-ac89f6b627d6-kube-api-access-vjhkd\") pod \"crc-debug-pmgkl\" (UID: \"3f0e7754-386f-4be3-8108-ac89f6b627d6\") " pod="openshift-must-gather-rvqc2/crc-debug-pmgkl" Oct 11 03:06:09 crc kubenswrapper[4743]: I1011 03:06:09.754171 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjhkd\" (UniqueName: \"kubernetes.io/projected/3f0e7754-386f-4be3-8108-ac89f6b627d6-kube-api-access-vjhkd\") pod \"crc-debug-pmgkl\" (UID: \"3f0e7754-386f-4be3-8108-ac89f6b627d6\") " pod="openshift-must-gather-rvqc2/crc-debug-pmgkl" Oct 11 03:06:09 crc kubenswrapper[4743]: I1011 03:06:09.870369 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rvqc2/crc-debug-pmgkl" Oct 11 03:06:10 crc kubenswrapper[4743]: I1011 03:06:10.113478 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bd8aec5-f490-4c6d-af51-3d7bf4580cba" path="/var/lib/kubelet/pods/1bd8aec5-f490-4c6d-af51-3d7bf4580cba/volumes" Oct 11 03:06:10 crc kubenswrapper[4743]: I1011 03:06:10.918465 4743 generic.go:334] "Generic (PLEG): container finished" podID="3f0e7754-386f-4be3-8108-ac89f6b627d6" containerID="63d180420a3d54ef69e82bb8672ee2445ff67b254243edfd5d93716280c4b98e" exitCode=0 Oct 11 03:06:10 crc kubenswrapper[4743]: I1011 03:06:10.918642 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rvqc2/crc-debug-pmgkl" event={"ID":"3f0e7754-386f-4be3-8108-ac89f6b627d6","Type":"ContainerDied","Data":"63d180420a3d54ef69e82bb8672ee2445ff67b254243edfd5d93716280c4b98e"} Oct 11 03:06:10 crc kubenswrapper[4743]: I1011 03:06:10.919034 4743 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-must-gather-rvqc2/crc-debug-pmgkl" event={"ID":"3f0e7754-386f-4be3-8108-ac89f6b627d6","Type":"ContainerStarted","Data":"37d0df44333b6063672819e2518690f9ebdb684185793db5039904c7f842d942"} Oct 11 03:06:10 crc kubenswrapper[4743]: I1011 03:06:10.953499 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-rvqc2/crc-debug-pmgkl"] Oct 11 03:06:10 crc kubenswrapper[4743]: I1011 03:06:10.963282 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-rvqc2/crc-debug-pmgkl"] Oct 11 03:06:12 crc kubenswrapper[4743]: I1011 03:06:12.055626 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rvqc2/crc-debug-pmgkl" Oct 11 03:06:12 crc kubenswrapper[4743]: I1011 03:06:12.186460 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3f0e7754-386f-4be3-8108-ac89f6b627d6-host\") pod \"3f0e7754-386f-4be3-8108-ac89f6b627d6\" (UID: \"3f0e7754-386f-4be3-8108-ac89f6b627d6\") " Oct 11 03:06:12 crc kubenswrapper[4743]: I1011 03:06:12.186535 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjhkd\" (UniqueName: \"kubernetes.io/projected/3f0e7754-386f-4be3-8108-ac89f6b627d6-kube-api-access-vjhkd\") pod \"3f0e7754-386f-4be3-8108-ac89f6b627d6\" (UID: \"3f0e7754-386f-4be3-8108-ac89f6b627d6\") " Oct 11 03:06:12 crc kubenswrapper[4743]: I1011 03:06:12.186582 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3f0e7754-386f-4be3-8108-ac89f6b627d6-host" (OuterVolumeSpecName: "host") pod "3f0e7754-386f-4be3-8108-ac89f6b627d6" (UID: "3f0e7754-386f-4be3-8108-ac89f6b627d6"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 11 03:06:12 crc kubenswrapper[4743]: I1011 03:06:12.187139 4743 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3f0e7754-386f-4be3-8108-ac89f6b627d6-host\") on node \"crc\" DevicePath \"\"" Oct 11 03:06:12 crc kubenswrapper[4743]: I1011 03:06:12.192018 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f0e7754-386f-4be3-8108-ac89f6b627d6-kube-api-access-vjhkd" (OuterVolumeSpecName: "kube-api-access-vjhkd") pod "3f0e7754-386f-4be3-8108-ac89f6b627d6" (UID: "3f0e7754-386f-4be3-8108-ac89f6b627d6"). InnerVolumeSpecName "kube-api-access-vjhkd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 03:06:12 crc kubenswrapper[4743]: I1011 03:06:12.287844 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjhkd\" (UniqueName: \"kubernetes.io/projected/3f0e7754-386f-4be3-8108-ac89f6b627d6-kube-api-access-vjhkd\") on node \"crc\" DevicePath \"\"" Oct 11 03:06:12 crc kubenswrapper[4743]: I1011 03:06:12.935987 4743 scope.go:117] "RemoveContainer" containerID="63d180420a3d54ef69e82bb8672ee2445ff67b254243edfd5d93716280c4b98e" Oct 11 03:06:12 crc kubenswrapper[4743]: I1011 03:06:12.936111 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rvqc2/crc-debug-pmgkl" Oct 11 03:06:14 crc kubenswrapper[4743]: I1011 03:06:14.104775 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f0e7754-386f-4be3-8108-ac89f6b627d6" path="/var/lib/kubelet/pods/3f0e7754-386f-4be3-8108-ac89f6b627d6/volumes" Oct 11 03:06:37 crc kubenswrapper[4743]: I1011 03:06:37.018084 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_b61fc8c0-014e-481a-b189-e554dced0696/aodh-evaluator/0.log" Oct 11 03:06:37 crc kubenswrapper[4743]: I1011 03:06:37.038704 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_b61fc8c0-014e-481a-b189-e554dced0696/aodh-api/0.log" Oct 11 03:06:37 crc kubenswrapper[4743]: I1011 03:06:37.194317 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_b61fc8c0-014e-481a-b189-e554dced0696/aodh-listener/0.log" Oct 11 03:06:37 crc kubenswrapper[4743]: I1011 03:06:37.228643 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_b61fc8c0-014e-481a-b189-e554dced0696/aodh-notifier/0.log" Oct 11 03:06:37 crc kubenswrapper[4743]: I1011 03:06:37.382303 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-67c9948594-q58d2_9cd00887-e25b-4548-8084-5efab4f9cb27/barbican-api/0.log" Oct 11 03:06:37 crc kubenswrapper[4743]: I1011 03:06:37.466686 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-67c9948594-q58d2_9cd00887-e25b-4548-8084-5efab4f9cb27/barbican-api-log/0.log" Oct 11 03:06:37 crc kubenswrapper[4743]: I1011 03:06:37.567097 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-785cc87c98-slsn7_7f793c4b-6627-4c4b-9f2c-529641700221/barbican-keystone-listener/0.log" Oct 11 03:06:37 crc kubenswrapper[4743]: I1011 03:06:37.767377 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-keystone-listener-785cc87c98-slsn7_7f793c4b-6627-4c4b-9f2c-529641700221/barbican-keystone-listener-log/0.log" Oct 11 03:06:37 crc kubenswrapper[4743]: I1011 03:06:37.799164 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5f98bddd87-6kv6r_b8ff2f6e-e247-4ff9-9c07-8ba2bf4d5b1d/barbican-worker/0.log" Oct 11 03:06:37 crc kubenswrapper[4743]: I1011 03:06:37.975976 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5f98bddd87-6kv6r_b8ff2f6e-e247-4ff9-9c07-8ba2bf4d5b1d/barbican-worker-log/0.log" Oct 11 03:06:38 crc kubenswrapper[4743]: I1011 03:06:38.018881 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-j85dc_5ce1ff59-69f9-466b-926d-4785eb4df84f/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Oct 11 03:06:38 crc kubenswrapper[4743]: I1011 03:06:38.278707 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0a98090b-ea2d-4b45-98ca-cdb8d619e42d/ceilometer-central-agent/0.log" Oct 11 03:06:38 crc kubenswrapper[4743]: I1011 03:06:38.325909 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0a98090b-ea2d-4b45-98ca-cdb8d619e42d/ceilometer-notification-agent/0.log" Oct 11 03:06:38 crc kubenswrapper[4743]: I1011 03:06:38.467630 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0a98090b-ea2d-4b45-98ca-cdb8d619e42d/proxy-httpd/0.log" Oct 11 03:06:38 crc kubenswrapper[4743]: I1011 03:06:38.494412 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0a98090b-ea2d-4b45-98ca-cdb8d619e42d/sg-core/0.log" Oct 11 03:06:38 crc kubenswrapper[4743]: I1011 03:06:38.639343 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceph-client-edpm-deployment-openstack-edpm-ipam-vl6cd_2a7d527b-9f7c-40ec-8939-fbd2350a9ec3/ceph-client-edpm-deployment-openstack-edpm-ipam/0.log" Oct 11 03:06:38 crc kubenswrapper[4743]: I1011 03:06:38.774403 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-8hnsg_3523070b-145c-4c82-9623-b4a9f2a32c11/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam/0.log" Oct 11 03:06:39 crc kubenswrapper[4743]: I1011 03:06:39.054500 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_9f904889-28d2-4cfd-86ee-2e5841f9fc04/cinder-api-log/0.log" Oct 11 03:06:39 crc kubenswrapper[4743]: I1011 03:06:39.100572 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_9f904889-28d2-4cfd-86ee-2e5841f9fc04/cinder-api/0.log" Oct 11 03:06:39 crc kubenswrapper[4743]: I1011 03:06:39.672419 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_f333d397-070a-4624-8b2d-856964010b75/probe/0.log" Oct 11 03:06:39 crc kubenswrapper[4743]: I1011 03:06:39.773610 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_f333d397-070a-4624-8b2d-856964010b75/cinder-backup/0.log" Oct 11 03:06:39 crc kubenswrapper[4743]: I1011 03:06:39.997559 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_1570e831-5132-4e30-b791-6ac13faaeea4/cinder-scheduler/0.log" Oct 11 03:06:40 crc kubenswrapper[4743]: I1011 03:06:40.085526 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_1570e831-5132-4e30-b791-6ac13faaeea4/probe/0.log" Oct 11 03:06:40 crc kubenswrapper[4743]: I1011 03:06:40.295502 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_129685c1-9de5-4c18-9219-172fe359aa89/cinder-volume/0.log" Oct 11 03:06:40 crc kubenswrapper[4743]: I1011 03:06:40.341321 4743 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-hc4vr_5205ba97-c5be-49b8-a4a6-2570d1b602d2/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 11 03:06:40 crc kubenswrapper[4743]: I1011 03:06:40.351538 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_129685c1-9de5-4c18-9219-172fe359aa89/probe/0.log" Oct 11 03:06:40 crc kubenswrapper[4743]: I1011 03:06:40.547372 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-dwkfr_43197ff3-1a5a-4c2f-a836-aa22d055d415/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 11 03:06:40 crc kubenswrapper[4743]: I1011 03:06:40.736791 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-c8d8d886c-dsztl_1654f6a5-1abf-4c9e-b956-3bfc60c7077c/init/0.log" Oct 11 03:06:40 crc kubenswrapper[4743]: I1011 03:06:40.870960 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-c8d8d886c-dsztl_1654f6a5-1abf-4c9e-b956-3bfc60c7077c/init/0.log" Oct 11 03:06:40 crc kubenswrapper[4743]: I1011 03:06:40.966191 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-c8d8d886c-dsztl_1654f6a5-1abf-4c9e-b956-3bfc60c7077c/dnsmasq-dns/0.log" Oct 11 03:06:41 crc kubenswrapper[4743]: I1011 03:06:41.079342 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_2e1cee17-cf14-4bf2-bda0-2f651412f042/glance-httpd/0.log" Oct 11 03:06:41 crc kubenswrapper[4743]: I1011 03:06:41.083243 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_2e1cee17-cf14-4bf2-bda0-2f651412f042/glance-log/0.log" Oct 11 03:06:41 crc kubenswrapper[4743]: I1011 03:06:41.266069 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-internal-api-0_d5ae3fcd-4a6d-42e5-9ab1-69aae9e8f964/glance-httpd/0.log" Oct 11 03:06:41 crc kubenswrapper[4743]: I1011 03:06:41.375460 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_d5ae3fcd-4a6d-42e5-9ab1-69aae9e8f964/glance-log/0.log" Oct 11 03:06:41 crc kubenswrapper[4743]: I1011 03:06:41.841417 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-57fbf8bd-mp4f8_ae21bf76-1584-4681-b679-29abbf1ef22a/heat-engine/0.log" Oct 11 03:06:42 crc kubenswrapper[4743]: I1011 03:06:42.208124 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-f46b79456-dm9d6_36f566d2-9c6b-4bc3-a1a3-47a11e6eee45/horizon/0.log" Oct 11 03:06:42 crc kubenswrapper[4743]: I1011 03:06:42.721262 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-tb9g9_57fc6fbe-24cd-4185-a91e-dd39258e8d05/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Oct 11 03:06:42 crc kubenswrapper[4743]: I1011 03:06:42.782717 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-657c9dff6b-rgphg_de562f4f-80d8-407b-bf5a-9b584e013294/heat-api/0.log" Oct 11 03:06:42 crc kubenswrapper[4743]: I1011 03:06:42.792426 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-7b7546fb69-4lvd7_db0cfd47-4287-4526-8e6b-0fd5bd770a1c/heat-cfnapi/0.log" Oct 11 03:06:42 crc kubenswrapper[4743]: I1011 03:06:42.906559 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-f46b79456-dm9d6_36f566d2-9c6b-4bc3-a1a3-47a11e6eee45/horizon-log/0.log" Oct 11 03:06:43 crc kubenswrapper[4743]: I1011 03:06:43.207165 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-lj96c_09ec7d44-c723-4a16-a24e-d473280d1321/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 
11 03:06:43 crc kubenswrapper[4743]: I1011 03:06:43.406546 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29335801-cqb6t_90420843-1d2e-48e7-bec5-63cc4cd8557e/keystone-cron/0.log" Oct 11 03:06:43 crc kubenswrapper[4743]: I1011 03:06:43.645032 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29335861-6fjkl_eb164eff-c00c-4764-bb12-e371aaa02860/keystone-cron/0.log" Oct 11 03:06:43 crc kubenswrapper[4743]: I1011 03:06:43.803399 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-5bd8997d9d-dpxn8_8f249d47-90a4-4fc9-8bb8-e61bc0143ae7/keystone-api/0.log" Oct 11 03:06:43 crc kubenswrapper[4743]: I1011 03:06:43.803453 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_923b0fb7-1d93-491e-a1e0-73614b302fdb/kube-state-metrics/0.log" Oct 11 03:06:43 crc kubenswrapper[4743]: I1011 03:06:43.977356 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-nc8vt_6ea65353-b389-4222-8ff8-298d53283609/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Oct 11 03:06:43 crc kubenswrapper[4743]: I1011 03:06:43.999300 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_logging-edpm-deployment-openstack-edpm-ipam-bvnfw_abded9bf-eca7-43d5-bd5b-531d44751777/logging-edpm-deployment-openstack-edpm-ipam/0.log" Oct 11 03:06:44 crc kubenswrapper[4743]: I1011 03:06:44.230594 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_7959191f-6ca5-4f63-84e9-815b7378c505/manila-api-log/0.log" Oct 11 03:06:44 crc kubenswrapper[4743]: I1011 03:06:44.341868 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_7959191f-6ca5-4f63-84e9-815b7378c505/manila-api/0.log" Oct 11 03:06:44 crc kubenswrapper[4743]: I1011 03:06:44.438928 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_manila-scheduler-0_00706963-dfb0-45d7-a0be-5875e1ae0a8f/manila-scheduler/0.log" Oct 11 03:06:44 crc kubenswrapper[4743]: I1011 03:06:44.463076 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_00706963-dfb0-45d7-a0be-5875e1ae0a8f/probe/0.log" Oct 11 03:06:44 crc kubenswrapper[4743]: I1011 03:06:44.605123 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_0c713aa6-9d10-4baa-855c-a05256d83be7/probe/0.log" Oct 11 03:06:44 crc kubenswrapper[4743]: I1011 03:06:44.668397 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_0c713aa6-9d10-4baa-855c-a05256d83be7/manila-share/0.log" Oct 11 03:06:44 crc kubenswrapper[4743]: I1011 03:06:44.926102 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mysqld-exporter-0_50798e93-52c7-4ee3-b94a-295fbcc7eeba/mysqld-exporter/0.log" Oct 11 03:06:45 crc kubenswrapper[4743]: I1011 03:06:45.336154 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5496cd5f5c-c9jx6_fdc6c654-6370-4cc4-99a1-c13dfd402b14/neutron-httpd/0.log" Oct 11 03:06:45 crc kubenswrapper[4743]: I1011 03:06:45.391542 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5496cd5f5c-c9jx6_fdc6c654-6370-4cc4-99a1-c13dfd402b14/neutron-api/0.log" Oct 11 03:06:45 crc kubenswrapper[4743]: I1011 03:06:45.565340 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx96l_d792f039-d865-44e3-9474-be444dee2d03/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Oct 11 03:06:46 crc kubenswrapper[4743]: I1011 03:06:46.304051 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_f2ddaae7-747a-4f05-bc0f-4f69fc15b816/nova-cell0-conductor-conductor/0.log" Oct 11 03:06:46 crc kubenswrapper[4743]: I1011 03:06:46.548842 4743 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_41ea9288-7c98-4c3b-a903-76053391426e/nova-api-log/0.log" Oct 11 03:06:47 crc kubenswrapper[4743]: I1011 03:06:47.051324 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_6f8d6d52-4659-4bde-8eac-469d0008964d/nova-cell1-conductor-conductor/0.log" Oct 11 03:06:47 crc kubenswrapper[4743]: I1011 03:06:47.196124 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_41ea9288-7c98-4c3b-a903-76053391426e/nova-api-api/0.log" Oct 11 03:06:47 crc kubenswrapper[4743]: I1011 03:06:47.397483 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_d685eabb-511e-4604-9716-7676177726d6/nova-cell1-novncproxy-novncproxy/0.log" Oct 11 03:06:47 crc kubenswrapper[4743]: I1011 03:06:47.490608 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kt9ff_75f90fbf-75a7-4b2a-af1a-7693cebeaea3/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam/0.log" Oct 11 03:06:47 crc kubenswrapper[4743]: I1011 03:06:47.673017 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_10e4a6f0-05ef-4f39-96f6-1e44cd3753d4/nova-metadata-log/0.log" Oct 11 03:06:48 crc kubenswrapper[4743]: I1011 03:06:48.156323 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_449c91f5-e998-4889-b148-30f334b03bc8/nova-scheduler-scheduler/0.log" Oct 11 03:06:48 crc kubenswrapper[4743]: I1011 03:06:48.347699 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_f604069e-dff8-4f02-a5e8-d3ba38d87625/mysql-bootstrap/0.log" Oct 11 03:06:48 crc kubenswrapper[4743]: I1011 03:06:48.545171 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_f604069e-dff8-4f02-a5e8-d3ba38d87625/mysql-bootstrap/0.log" Oct 11 
03:06:48 crc kubenswrapper[4743]: I1011 03:06:48.594386 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_f604069e-dff8-4f02-a5e8-d3ba38d87625/galera/0.log" Oct 11 03:06:48 crc kubenswrapper[4743]: I1011 03:06:48.846892 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_32c52bf9-36b5-4a75-8991-e76f4dd87fb3/mysql-bootstrap/0.log" Oct 11 03:06:49 crc kubenswrapper[4743]: I1011 03:06:49.094040 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_32c52bf9-36b5-4a75-8991-e76f4dd87fb3/galera/0.log" Oct 11 03:06:49 crc kubenswrapper[4743]: I1011 03:06:49.099241 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_32c52bf9-36b5-4a75-8991-e76f4dd87fb3/mysql-bootstrap/0.log" Oct 11 03:06:49 crc kubenswrapper[4743]: I1011 03:06:49.339018 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_c02b1352-1ccf-4856-ad8e-328dab03135e/openstackclient/0.log" Oct 11 03:06:49 crc kubenswrapper[4743]: I1011 03:06:49.548910 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-fkcc9_145bdcc2-f62c-4dbf-bdb1-6ced4d65ba3a/openstack-network-exporter/0.log" Oct 11 03:06:49 crc kubenswrapper[4743]: I1011 03:06:49.859905 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-mwtxs_1ab33e99-afb5-4b67-89bd-a2eb540bf194/ovn-controller/0.log" Oct 11 03:06:50 crc kubenswrapper[4743]: I1011 03:06:50.370896 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-6g6xb_ba2a34bc-8581-47ba-a9a9-5ba1ce8b6387/ovsdb-server-init/0.log" Oct 11 03:06:50 crc kubenswrapper[4743]: I1011 03:06:50.471883 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-6g6xb_ba2a34bc-8581-47ba-a9a9-5ba1ce8b6387/ovsdb-server-init/0.log" Oct 11 03:06:50 crc 
kubenswrapper[4743]: I1011 03:06:50.602832 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-6g6xb_ba2a34bc-8581-47ba-a9a9-5ba1ce8b6387/ovs-vswitchd/0.log" Oct 11 03:06:50 crc kubenswrapper[4743]: I1011 03:06:50.668254 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-6g6xb_ba2a34bc-8581-47ba-a9a9-5ba1ce8b6387/ovsdb-server/0.log" Oct 11 03:06:50 crc kubenswrapper[4743]: I1011 03:06:50.927433 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-w2vs8_00ade740-f798-4354-9e89-35aa325d8b92/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Oct 11 03:06:51 crc kubenswrapper[4743]: I1011 03:06:51.125710 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_aae1bfd4-4ee9-4b40-bc1b-241275ef8097/openstack-network-exporter/0.log" Oct 11 03:06:51 crc kubenswrapper[4743]: I1011 03:06:51.199554 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_10e4a6f0-05ef-4f39-96f6-1e44cd3753d4/nova-metadata-metadata/0.log" Oct 11 03:06:51 crc kubenswrapper[4743]: I1011 03:06:51.293148 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_aae1bfd4-4ee9-4b40-bc1b-241275ef8097/ovn-northd/0.log" Oct 11 03:06:51 crc kubenswrapper[4743]: I1011 03:06:51.447964 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_7c8284ba-a2b2-4f9f-a692-b372e8294d6b/openstack-network-exporter/0.log" Oct 11 03:06:51 crc kubenswrapper[4743]: I1011 03:06:51.502096 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_7c8284ba-a2b2-4f9f-a692-b372e8294d6b/ovsdbserver-nb/0.log" Oct 11 03:06:51 crc kubenswrapper[4743]: I1011 03:06:51.634875 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_74c19249-ff95-4e49-96bb-1135e7aa1b08/openstack-network-exporter/0.log" Oct 
11 03:06:51 crc kubenswrapper[4743]: I1011 03:06:51.713982 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_74c19249-ff95-4e49-96bb-1135e7aa1b08/ovsdbserver-sb/0.log" Oct 11 03:06:52 crc kubenswrapper[4743]: I1011 03:06:52.046991 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-979bff964-bxbgb_8f846c8e-d28a-4a2e-a5b6-bfc739de275b/placement-api/0.log" Oct 11 03:06:52 crc kubenswrapper[4743]: I1011 03:06:52.165518 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-979bff964-bxbgb_8f846c8e-d28a-4a2e-a5b6-bfc739de275b/placement-log/0.log" Oct 11 03:06:52 crc kubenswrapper[4743]: I1011 03:06:52.269938 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_eed36ee9-8239-4139-97f3-0e7b2962f45b/init-config-reloader/0.log" Oct 11 03:06:52 crc kubenswrapper[4743]: I1011 03:06:52.485886 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_eed36ee9-8239-4139-97f3-0e7b2962f45b/init-config-reloader/0.log" Oct 11 03:06:52 crc kubenswrapper[4743]: I1011 03:06:52.532324 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_eed36ee9-8239-4139-97f3-0e7b2962f45b/prometheus/0.log" Oct 11 03:06:52 crc kubenswrapper[4743]: I1011 03:06:52.542576 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_eed36ee9-8239-4139-97f3-0e7b2962f45b/config-reloader/0.log" Oct 11 03:06:52 crc kubenswrapper[4743]: I1011 03:06:52.726990 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_eed36ee9-8239-4139-97f3-0e7b2962f45b/thanos-sidecar/0.log" Oct 11 03:06:52 crc kubenswrapper[4743]: I1011 03:06:52.838869 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_rabbitmq-cell1-server-0_de84c29c-4168-4383-aadc-0d5cc0ba56f8/setup-container/0.log" Oct 11 03:06:53 crc kubenswrapper[4743]: I1011 03:06:53.042444 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_de84c29c-4168-4383-aadc-0d5cc0ba56f8/setup-container/0.log" Oct 11 03:06:53 crc kubenswrapper[4743]: I1011 03:06:53.073080 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_de84c29c-4168-4383-aadc-0d5cc0ba56f8/rabbitmq/0.log" Oct 11 03:06:53 crc kubenswrapper[4743]: I1011 03:06:53.274371 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_38225901-8300-41cc-8e32-748b754660dc/setup-container/0.log" Oct 11 03:06:53 crc kubenswrapper[4743]: I1011 03:06:53.515280 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_38225901-8300-41cc-8e32-748b754660dc/setup-container/0.log" Oct 11 03:06:53 crc kubenswrapper[4743]: I1011 03:06:53.556699 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_38225901-8300-41cc-8e32-748b754660dc/rabbitmq/0.log" Oct 11 03:06:53 crc kubenswrapper[4743]: I1011 03:06:53.775154 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-m57rp_dc1bb7f9-85e4-4c20-b91b-5dc32f87c1c4/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 11 03:06:53 crc kubenswrapper[4743]: I1011 03:06:53.795680 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-2fx2x_b9069bf9-41de-4faf-ad86-3913be33cb1a/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Oct 11 03:06:53 crc kubenswrapper[4743]: I1011 03:06:53.995514 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-lrpfr_08cb63b9-9798-4e9f-9df8-7a1676dbe1f8/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 11 03:06:54 crc kubenswrapper[4743]: I1011 03:06:54.395259 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-7kvlt_45eba0ce-a54b-4530-a391-35572fb868aa/ssh-known-hosts-edpm-deployment/0.log" Oct 11 03:06:54 crc kubenswrapper[4743]: I1011 03:06:54.681874 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-744b8cd687-p7lgl_68219217-d875-4eb2-9611-b9afb0f64c45/proxy-server/0.log" Oct 11 03:06:54 crc kubenswrapper[4743]: I1011 03:06:54.823795 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-744b8cd687-p7lgl_68219217-d875-4eb2-9611-b9afb0f64c45/proxy-httpd/0.log" Oct 11 03:06:54 crc kubenswrapper[4743]: I1011 03:06:54.878941 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-8qvc2_5718aabd-82b4-4079-96f4-d241fb2c8efc/swift-ring-rebalance/0.log" Oct 11 03:06:55 crc kubenswrapper[4743]: I1011 03:06:55.111639 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_082aa898-adc9-4e0d-a5e3-329d36f391aa/account-reaper/0.log" Oct 11 03:06:55 crc kubenswrapper[4743]: I1011 03:06:55.126414 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_082aa898-adc9-4e0d-a5e3-329d36f391aa/account-auditor/0.log" Oct 11 03:06:55 crc kubenswrapper[4743]: I1011 03:06:55.326499 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_082aa898-adc9-4e0d-a5e3-329d36f391aa/account-replicator/0.log" Oct 11 03:06:55 crc kubenswrapper[4743]: I1011 03:06:55.334307 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_082aa898-adc9-4e0d-a5e3-329d36f391aa/container-auditor/0.log" Oct 11 03:06:55 crc kubenswrapper[4743]: I1011 
03:06:55.358514 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_082aa898-adc9-4e0d-a5e3-329d36f391aa/account-server/0.log" Oct 11 03:06:55 crc kubenswrapper[4743]: I1011 03:06:55.530291 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_082aa898-adc9-4e0d-a5e3-329d36f391aa/container-server/0.log" Oct 11 03:06:55 crc kubenswrapper[4743]: I1011 03:06:55.633986 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_082aa898-adc9-4e0d-a5e3-329d36f391aa/container-updater/0.log" Oct 11 03:06:55 crc kubenswrapper[4743]: I1011 03:06:55.638731 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_082aa898-adc9-4e0d-a5e3-329d36f391aa/container-replicator/0.log" Oct 11 03:06:55 crc kubenswrapper[4743]: I1011 03:06:55.761575 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_082aa898-adc9-4e0d-a5e3-329d36f391aa/object-auditor/0.log" Oct 11 03:06:55 crc kubenswrapper[4743]: I1011 03:06:55.854711 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_082aa898-adc9-4e0d-a5e3-329d36f391aa/object-expirer/0.log" Oct 11 03:06:55 crc kubenswrapper[4743]: I1011 03:06:55.916865 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_082aa898-adc9-4e0d-a5e3-329d36f391aa/object-replicator/0.log" Oct 11 03:06:56 crc kubenswrapper[4743]: I1011 03:06:56.006059 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_082aa898-adc9-4e0d-a5e3-329d36f391aa/object-server/0.log" Oct 11 03:06:56 crc kubenswrapper[4743]: I1011 03:06:56.112692 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_082aa898-adc9-4e0d-a5e3-329d36f391aa/object-updater/0.log" Oct 11 03:06:56 crc kubenswrapper[4743]: I1011 03:06:56.136289 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_082aa898-adc9-4e0d-a5e3-329d36f391aa/rsync/0.log" Oct 11 03:06:56 crc kubenswrapper[4743]: I1011 03:06:56.260520 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_082aa898-adc9-4e0d-a5e3-329d36f391aa/swift-recon-cron/0.log" Oct 11 03:06:56 crc kubenswrapper[4743]: I1011 03:06:56.405895 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-4tnjd_7585335b-a755-40a1-b388-d90e2fa07121/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Oct 11 03:06:56 crc kubenswrapper[4743]: I1011 03:06:56.598294 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-power-monitoring-edpm-deployment-openstack-edpm-bwgfg_f10a464d-943b-4c74-88f8-7d76dbdac358/telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam/0.log" Oct 11 03:06:56 crc kubenswrapper[4743]: I1011 03:06:56.926566 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_795afd2c-7ccb-435d-8cc6-6ef474ddf6e1/test-operator-logs-container/0.log" Oct 11 03:06:57 crc kubenswrapper[4743]: I1011 03:06:57.192612 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-x6f6h_cb69ff06-2c84-40a1-805b-349c4fbfe3ba/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 11 03:06:57 crc kubenswrapper[4743]: I1011 03:06:57.679576 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_59e812a1-677a-4aca-bb9a-c4f0d166710a/tempest-tests-tempest-tests-runner/0.log" Oct 11 03:07:06 crc kubenswrapper[4743]: I1011 03:07:06.913065 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_f1159224-8c5f-43ae-8aa3-ca628c69914e/memcached/0.log" Oct 11 03:07:24 crc kubenswrapper[4743]: I1011 03:07:24.010424 4743 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack-operators_361c556ca48d23f340d126ae2019f729a5671f38f3b0288da3af0a85856dmz6_61bd1af6-1438-46b0-8762-6ee4abb576cd/util/0.log" Oct 11 03:07:24 crc kubenswrapper[4743]: I1011 03:07:24.185110 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_361c556ca48d23f340d126ae2019f729a5671f38f3b0288da3af0a85856dmz6_61bd1af6-1438-46b0-8762-6ee4abb576cd/pull/0.log" Oct 11 03:07:24 crc kubenswrapper[4743]: I1011 03:07:24.190035 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_361c556ca48d23f340d126ae2019f729a5671f38f3b0288da3af0a85856dmz6_61bd1af6-1438-46b0-8762-6ee4abb576cd/util/0.log" Oct 11 03:07:24 crc kubenswrapper[4743]: I1011 03:07:24.246353 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_361c556ca48d23f340d126ae2019f729a5671f38f3b0288da3af0a85856dmz6_61bd1af6-1438-46b0-8762-6ee4abb576cd/pull/0.log" Oct 11 03:07:24 crc kubenswrapper[4743]: I1011 03:07:24.452006 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_361c556ca48d23f340d126ae2019f729a5671f38f3b0288da3af0a85856dmz6_61bd1af6-1438-46b0-8762-6ee4abb576cd/util/0.log" Oct 11 03:07:24 crc kubenswrapper[4743]: I1011 03:07:24.453578 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_361c556ca48d23f340d126ae2019f729a5671f38f3b0288da3af0a85856dmz6_61bd1af6-1438-46b0-8762-6ee4abb576cd/extract/0.log" Oct 11 03:07:24 crc kubenswrapper[4743]: I1011 03:07:24.477078 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_361c556ca48d23f340d126ae2019f729a5671f38f3b0288da3af0a85856dmz6_61bd1af6-1438-46b0-8762-6ee4abb576cd/pull/0.log" Oct 11 03:07:24 crc kubenswrapper[4743]: I1011 03:07:24.670035 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-64f84fcdbb-nfnkk_ed4aa42c-bd83-4fa8-99f2-5d7cde436979/kube-rbac-proxy/0.log" Oct 11 03:07:24 
crc kubenswrapper[4743]: I1011 03:07:24.729967 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-59cdc64769-lbnjc_088785b6-72f1-472b-accb-fef0261e024b/kube-rbac-proxy/0.log" Oct 11 03:07:24 crc kubenswrapper[4743]: I1011 03:07:24.798215 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-64f84fcdbb-nfnkk_ed4aa42c-bd83-4fa8-99f2-5d7cde436979/manager/0.log" Oct 11 03:07:24 crc kubenswrapper[4743]: I1011 03:07:24.935717 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-59cdc64769-lbnjc_088785b6-72f1-472b-accb-fef0261e024b/manager/0.log" Oct 11 03:07:24 crc kubenswrapper[4743]: I1011 03:07:24.958641 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-687df44cdb-95j97_01d65505-63e1-4355-a1dc-675d22f5bdea/kube-rbac-proxy/0.log" Oct 11 03:07:24 crc kubenswrapper[4743]: I1011 03:07:24.989953 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-687df44cdb-95j97_01d65505-63e1-4355-a1dc-675d22f5bdea/manager/0.log" Oct 11 03:07:25 crc kubenswrapper[4743]: I1011 03:07:25.110112 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-7bb46cd7d-d58jj_3f7d0e6e-1b92-48de-910b-ef415fac5e7c/kube-rbac-proxy/0.log" Oct 11 03:07:25 crc kubenswrapper[4743]: I1011 03:07:25.210358 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-7bb46cd7d-d58jj_3f7d0e6e-1b92-48de-910b-ef415fac5e7c/manager/0.log" Oct 11 03:07:25 crc kubenswrapper[4743]: I1011 03:07:25.313804 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_heat-operator-controller-manager-6d9967f8dd-sjsvr_a2f6fc09-a2cb-46e0-9fe6-e5dad1025bfe/kube-rbac-proxy/0.log" Oct 11 03:07:25 crc kubenswrapper[4743]: I1011 03:07:25.424448 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d74794d9b-5t64t_547f1b93-dffd-4c63-964a-e3ea6d29970e/kube-rbac-proxy/0.log" Oct 11 03:07:25 crc kubenswrapper[4743]: I1011 03:07:25.458114 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-6d9967f8dd-sjsvr_a2f6fc09-a2cb-46e0-9fe6-e5dad1025bfe/manager/0.log" Oct 11 03:07:25 crc kubenswrapper[4743]: I1011 03:07:25.508708 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d74794d9b-5t64t_547f1b93-dffd-4c63-964a-e3ea6d29970e/manager/0.log" Oct 11 03:07:25 crc kubenswrapper[4743]: I1011 03:07:25.624236 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-585fc5b659-6pvq9_f8621ecf-ab1f-40e5-9dd9-4d9d6fd8563b/kube-rbac-proxy/0.log" Oct 11 03:07:25 crc kubenswrapper[4743]: I1011 03:07:25.820100 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-585fc5b659-6pvq9_f8621ecf-ab1f-40e5-9dd9-4d9d6fd8563b/manager/0.log" Oct 11 03:07:25 crc kubenswrapper[4743]: I1011 03:07:25.869355 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-74cb5cbc49-zrk9n_53fca326-a309-4ff5-b52f-8b547496c069/manager/0.log" Oct 11 03:07:25 crc kubenswrapper[4743]: I1011 03:07:25.872147 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-74cb5cbc49-zrk9n_53fca326-a309-4ff5-b52f-8b547496c069/kube-rbac-proxy/0.log" Oct 11 03:07:26 crc kubenswrapper[4743]: I1011 03:07:26.004865 4743 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-ddb98f99b-cbc66_2726f212-a3ba-48cb-a96f-8d5f117a7f5e/kube-rbac-proxy/0.log" Oct 11 03:07:26 crc kubenswrapper[4743]: I1011 03:07:26.133537 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-ddb98f99b-cbc66_2726f212-a3ba-48cb-a96f-8d5f117a7f5e/manager/0.log" Oct 11 03:07:26 crc kubenswrapper[4743]: I1011 03:07:26.148213 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-59578bc799-vvcjm_af96a190-49a7-4179-be4f-4a636d004cd0/kube-rbac-proxy/0.log" Oct 11 03:07:26 crc kubenswrapper[4743]: I1011 03:07:26.252139 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-59578bc799-vvcjm_af96a190-49a7-4179-be4f-4a636d004cd0/manager/0.log" Oct 11 03:07:26 crc kubenswrapper[4743]: I1011 03:07:26.336779 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-5777b4f897-qs4hf_be590325-60d2-4f91-9e78-a5520788cfed/kube-rbac-proxy/0.log" Oct 11 03:07:26 crc kubenswrapper[4743]: I1011 03:07:26.358247 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-5777b4f897-qs4hf_be590325-60d2-4f91-9e78-a5520788cfed/manager/0.log" Oct 11 03:07:26 crc kubenswrapper[4743]: I1011 03:07:26.507653 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-797d478b46-2htrs_035ca3e9-4fd6-4cb1-8e82-8ebcf39146a3/kube-rbac-proxy/0.log" Oct 11 03:07:26 crc kubenswrapper[4743]: I1011 03:07:26.559195 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-797d478b46-2htrs_035ca3e9-4fd6-4cb1-8e82-8ebcf39146a3/manager/0.log" Oct 11 03:07:26 crc 
kubenswrapper[4743]: I1011 03:07:26.628554 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-57bb74c7bf-d29v4_4be72879-00a9-4253-9ad9-c266c32b968e/kube-rbac-proxy/0.log" Oct 11 03:07:26 crc kubenswrapper[4743]: I1011 03:07:26.760171 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-57bb74c7bf-d29v4_4be72879-00a9-4253-9ad9-c266c32b968e/manager/0.log" Oct 11 03:07:26 crc kubenswrapper[4743]: I1011 03:07:26.824221 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-6d7c7ddf95-nt865_a79f9419-0d04-41bb-b1ab-1615888819df/manager/0.log" Oct 11 03:07:26 crc kubenswrapper[4743]: I1011 03:07:26.838804 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-6d7c7ddf95-nt865_a79f9419-0d04-41bb-b1ab-1615888819df/kube-rbac-proxy/0.log" Oct 11 03:07:26 crc kubenswrapper[4743]: I1011 03:07:26.958961 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6cc7fb757dddnmv_6b8570e7-9b29-4a55-95e3-a3a588ba4083/kube-rbac-proxy/0.log" Oct 11 03:07:27 crc kubenswrapper[4743]: I1011 03:07:27.028323 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6cc7fb757dddnmv_6b8570e7-9b29-4a55-95e3-a3a588ba4083/manager/0.log" Oct 11 03:07:27 crc kubenswrapper[4743]: I1011 03:07:27.111426 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6bc9d748dc-5vs6z_cec9bfbd-c515-4bcc-8bf7-63648ecd230b/kube-rbac-proxy/0.log" Oct 11 03:07:27 crc kubenswrapper[4743]: I1011 03:07:27.312241 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-79cb6b48d5-wqg8k_9607e624-b661-41bd-bfaf-ceb7e552fbf2/kube-rbac-proxy/0.log" Oct 11 03:07:27 crc kubenswrapper[4743]: I1011 03:07:27.475640 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-79cb6b48d5-wqg8k_9607e624-b661-41bd-bfaf-ceb7e552fbf2/operator/0.log" Oct 11 03:07:27 crc kubenswrapper[4743]: I1011 03:07:27.502458 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-848vd_bd73b279-e2a4-4500-aed1-70c73212cba1/registry-server/0.log" Oct 11 03:07:27 crc kubenswrapper[4743]: I1011 03:07:27.648630 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-869cc7797f-vt9w7_c39ea94c-0f30-4b04-8a87-848ee9a62740/kube-rbac-proxy/0.log" Oct 11 03:07:27 crc kubenswrapper[4743]: I1011 03:07:27.708094 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-869cc7797f-vt9w7_c39ea94c-0f30-4b04-8a87-848ee9a62740/manager/0.log" Oct 11 03:07:27 crc kubenswrapper[4743]: I1011 03:07:27.882660 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-664664cb68-6pzz7_884ddb30-3a64-4bf2-83ac-3acc83e8bd96/kube-rbac-proxy/0.log" Oct 11 03:07:27 crc kubenswrapper[4743]: I1011 03:07:27.905584 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-664664cb68-6pzz7_884ddb30-3a64-4bf2-83ac-3acc83e8bd96/manager/0.log" Oct 11 03:07:28 crc kubenswrapper[4743]: I1011 03:07:28.079349 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-w94wj_fd811aaa-ab9b-4d34-a268-ecbfa76bf43a/operator/0.log" Oct 11 03:07:28 crc kubenswrapper[4743]: I1011 03:07:28.247849 4743 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f4d5dfdc6-cxzw6_36bd30aa-037d-4e7d-ae0f-fb53fe20f812/kube-rbac-proxy/0.log" Oct 11 03:07:28 crc kubenswrapper[4743]: I1011 03:07:28.247847 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f4d5dfdc6-cxzw6_36bd30aa-037d-4e7d-ae0f-fb53fe20f812/manager/0.log" Oct 11 03:07:28 crc kubenswrapper[4743]: I1011 03:07:28.395849 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-679ff79844-2dvm2_7967fda3-d5ca-4e28-878e-e50017efc60f/kube-rbac-proxy/0.log" Oct 11 03:07:28 crc kubenswrapper[4743]: I1011 03:07:28.572757 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-ffcdd6c94-9qjkk_962796b2-2bc0-4db5-84be-df36bbc28121/kube-rbac-proxy/0.log" Oct 11 03:07:28 crc kubenswrapper[4743]: I1011 03:07:28.664674 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-ffcdd6c94-9qjkk_962796b2-2bc0-4db5-84be-df36bbc28121/manager/0.log" Oct 11 03:07:28 crc kubenswrapper[4743]: I1011 03:07:28.779047 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-646675d848-ctp5x_f1a6e436-6a36-45a8-a033-f8d307ba12bd/kube-rbac-proxy/0.log" Oct 11 03:07:28 crc kubenswrapper[4743]: I1011 03:07:28.899075 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-679ff79844-2dvm2_7967fda3-d5ca-4e28-878e-e50017efc60f/manager/0.log" Oct 11 03:07:28 crc kubenswrapper[4743]: I1011 03:07:28.958183 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6bc9d748dc-5vs6z_cec9bfbd-c515-4bcc-8bf7-63648ecd230b/manager/0.log" Oct 11 03:07:28 crc 
kubenswrapper[4743]: I1011 03:07:28.965529 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-646675d848-ctp5x_f1a6e436-6a36-45a8-a033-f8d307ba12bd/manager/0.log" Oct 11 03:07:43 crc kubenswrapper[4743]: I1011 03:07:43.771623 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-rt2nd_5690cb54-1fbe-4d33-a809-b7bdca4df6c0/control-plane-machine-set-operator/0.log" Oct 11 03:07:43 crc kubenswrapper[4743]: I1011 03:07:43.972897 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-tljzf_79462f0e-13e0-4ee7-af5f-02e6e5cd849d/machine-api-operator/0.log" Oct 11 03:07:43 crc kubenswrapper[4743]: I1011 03:07:43.973217 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-tljzf_79462f0e-13e0-4ee7-af5f-02e6e5cd849d/kube-rbac-proxy/0.log" Oct 11 03:07:55 crc kubenswrapper[4743]: I1011 03:07:55.301374 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-4xbkk_3862bd0e-7310-4227-9d4f-8eb551293343/cert-manager-controller/0.log" Oct 11 03:07:55 crc kubenswrapper[4743]: I1011 03:07:55.420227 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-gf47h_5796afa2-031a-4046-b0ed-d2f728e700db/cert-manager-cainjector/0.log" Oct 11 03:07:55 crc kubenswrapper[4743]: I1011 03:07:55.516036 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-xdlw4_60052a4b-2c20-4f20-b109-ca070b9e11e6/cert-manager-webhook/0.log" Oct 11 03:08:06 crc kubenswrapper[4743]: I1011 03:08:06.628746 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6b874cbd85-gvckr_f12fff02-4cd4-437d-b704-99766f165f0e/nmstate-console-plugin/0.log" Oct 11 
03:08:06 crc kubenswrapper[4743]: I1011 03:08:06.800151 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-j4sk5_8e3fe60f-ace7-445d-8994-03a95ff90479/nmstate-handler/0.log" Oct 11 03:08:06 crc kubenswrapper[4743]: I1011 03:08:06.809976 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-2v928_c2323cd1-eebb-46d7-9393-586093c921f1/nmstate-metrics/0.log" Oct 11 03:08:06 crc kubenswrapper[4743]: I1011 03:08:06.810055 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-2v928_c2323cd1-eebb-46d7-9393-586093c921f1/kube-rbac-proxy/0.log" Oct 11 03:08:06 crc kubenswrapper[4743]: I1011 03:08:06.984063 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-858ddd8f98-bmzmc_0389ac5e-634b-4dd6-a9a8-084cb349b29e/nmstate-operator/0.log" Oct 11 03:08:07 crc kubenswrapper[4743]: I1011 03:08:07.056274 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6cdbc54649-nl9th_6d192208-f92a-4299-866d-14cf8ecffe17/nmstate-webhook/0.log" Oct 11 03:08:14 crc kubenswrapper[4743]: I1011 03:08:14.458536 4743 patch_prober.go:28] interesting pod/machine-config-daemon-cvm72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 03:08:14 crc kubenswrapper[4743]: I1011 03:08:14.459146 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 03:08:18 crc kubenswrapper[4743]: I1011 03:08:18.177268 4743 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-7857f779b4-t484n_d60b36a6-87ff-4325-a245-43b3dea4cfaf/kube-rbac-proxy/0.log" Oct 11 03:08:18 crc kubenswrapper[4743]: I1011 03:08:18.240847 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-7857f779b4-t484n_d60b36a6-87ff-4325-a245-43b3dea4cfaf/manager/0.log" Oct 11 03:08:30 crc kubenswrapper[4743]: I1011 03:08:30.767433 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_cluster-logging-operator-8958c8b87-zqct6_e4f8060c-3f3e-4e5d-85c5-3c344322869a/cluster-logging-operator/0.log" Oct 11 03:08:30 crc kubenswrapper[4743]: I1011 03:08:30.899049 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_collector-fdslp_15f63aaf-1998-4daa-8ebf-1f9455b483e5/collector/0.log" Oct 11 03:08:30 crc kubenswrapper[4743]: I1011 03:08:30.978156 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-compactor-0_03508b98-c7f8-4ffd-9417-074307cd588e/loki-compactor/0.log" Oct 11 03:08:31 crc kubenswrapper[4743]: I1011 03:08:31.078408 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-distributor-6f5f7fff97-72rvc_1f759884-04cd-4b18-90dd-9b4745c12ba7/loki-distributor/0.log" Oct 11 03:08:31 crc kubenswrapper[4743]: I1011 03:08:31.153732 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-c45fcc855-8mtnp_60332ecb-34a5-4628-9311-4469d823f589/gateway/0.log" Oct 11 03:08:31 crc kubenswrapper[4743]: I1011 03:08:31.181750 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-c45fcc855-8mtnp_60332ecb-34a5-4628-9311-4469d823f589/opa/0.log" Oct 11 03:08:31 crc kubenswrapper[4743]: I1011 03:08:31.283397 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-logging_logging-loki-gateway-c45fcc855-ddvcq_443c6346-a364-4e67-8c53-2bcd9b1f0927/gateway/0.log" Oct 11 03:08:31 crc kubenswrapper[4743]: I1011 03:08:31.345805 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-c45fcc855-ddvcq_443c6346-a364-4e67-8c53-2bcd9b1f0927/opa/0.log" Oct 11 03:08:31 crc kubenswrapper[4743]: I1011 03:08:31.456154 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-index-gateway-0_a85d8329-17fe-4d06-b45e-d410514cc210/loki-index-gateway/0.log" Oct 11 03:08:31 crc kubenswrapper[4743]: I1011 03:08:31.614040 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-ingester-0_bf0c8290-1118-4dbf-a638-bde5c07bdaab/loki-ingester/0.log" Oct 11 03:08:31 crc kubenswrapper[4743]: I1011 03:08:31.669747 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-querier-5d954896cf-xd7bv_7b393f9c-b255-4cac-96f2-3d5861cc7cce/loki-querier/0.log" Oct 11 03:08:31 crc kubenswrapper[4743]: I1011 03:08:31.843285 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-query-frontend-6fbbbc8b7d-zzmff_58a37184-c341-454d-b33e-a2af6dc56af3/loki-query-frontend/0.log" Oct 11 03:08:44 crc kubenswrapper[4743]: I1011 03:08:44.159585 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-bwqtl_3b7ecaea-d42f-44b0-a181-3c61cf45bde2/kube-rbac-proxy/0.log" Oct 11 03:08:44 crc kubenswrapper[4743]: I1011 03:08:44.430198 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-v524d_95e8f67c-537e-4744-a3c2-7dd93084f455/cp-frr-files/0.log" Oct 11 03:08:44 crc kubenswrapper[4743]: I1011 03:08:44.457707 4743 patch_prober.go:28] interesting pod/machine-config-daemon-cvm72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 03:08:44 crc kubenswrapper[4743]: I1011 03:08:44.457772 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 03:08:44 crc kubenswrapper[4743]: I1011 03:08:44.471792 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-bwqtl_3b7ecaea-d42f-44b0-a181-3c61cf45bde2/controller/0.log" Oct 11 03:08:44 crc kubenswrapper[4743]: I1011 03:08:44.613372 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-v524d_95e8f67c-537e-4744-a3c2-7dd93084f455/cp-frr-files/0.log" Oct 11 03:08:44 crc kubenswrapper[4743]: I1011 03:08:44.663791 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-v524d_95e8f67c-537e-4744-a3c2-7dd93084f455/cp-reloader/0.log" Oct 11 03:08:44 crc kubenswrapper[4743]: I1011 03:08:44.696040 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-v524d_95e8f67c-537e-4744-a3c2-7dd93084f455/cp-reloader/0.log" Oct 11 03:08:44 crc kubenswrapper[4743]: I1011 03:08:44.719645 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-v524d_95e8f67c-537e-4744-a3c2-7dd93084f455/cp-metrics/0.log" Oct 11 03:08:44 crc kubenswrapper[4743]: I1011 03:08:44.872372 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-v524d_95e8f67c-537e-4744-a3c2-7dd93084f455/cp-metrics/0.log" Oct 11 03:08:44 crc kubenswrapper[4743]: I1011 03:08:44.875300 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-v524d_95e8f67c-537e-4744-a3c2-7dd93084f455/cp-reloader/0.log" Oct 11 03:08:44 crc kubenswrapper[4743]: I1011 03:08:44.884410 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-v524d_95e8f67c-537e-4744-a3c2-7dd93084f455/cp-frr-files/0.log" Oct 11 03:08:44 crc kubenswrapper[4743]: I1011 03:08:44.919393 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-v524d_95e8f67c-537e-4744-a3c2-7dd93084f455/cp-metrics/0.log" Oct 11 03:08:45 crc kubenswrapper[4743]: I1011 03:08:45.081786 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-v524d_95e8f67c-537e-4744-a3c2-7dd93084f455/cp-reloader/0.log" Oct 11 03:08:45 crc kubenswrapper[4743]: I1011 03:08:45.124119 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-v524d_95e8f67c-537e-4744-a3c2-7dd93084f455/cp-metrics/0.log" Oct 11 03:08:45 crc kubenswrapper[4743]: I1011 03:08:45.124682 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-v524d_95e8f67c-537e-4744-a3c2-7dd93084f455/cp-frr-files/0.log" Oct 11 03:08:45 crc kubenswrapper[4743]: I1011 03:08:45.136900 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-v524d_95e8f67c-537e-4744-a3c2-7dd93084f455/controller/0.log" Oct 11 03:08:45 crc kubenswrapper[4743]: I1011 03:08:45.339893 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-v524d_95e8f67c-537e-4744-a3c2-7dd93084f455/kube-rbac-proxy/0.log" Oct 11 03:08:45 crc kubenswrapper[4743]: I1011 03:08:45.357271 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-v524d_95e8f67c-537e-4744-a3c2-7dd93084f455/frr-metrics/0.log" Oct 11 03:08:45 crc kubenswrapper[4743]: I1011 03:08:45.411147 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-v524d_95e8f67c-537e-4744-a3c2-7dd93084f455/kube-rbac-proxy-frr/0.log" Oct 11 03:08:45 crc kubenswrapper[4743]: I1011 03:08:45.618837 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-v524d_95e8f67c-537e-4744-a3c2-7dd93084f455/reloader/0.log" Oct 11 03:08:45 crc kubenswrapper[4743]: I1011 03:08:45.669502 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-jct7h_2772ef34-f307-4a91-8f2f-28e3b22375a0/frr-k8s-webhook-server/0.log" Oct 11 03:08:45 crc kubenswrapper[4743]: I1011 03:08:45.907185 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7969b47488-dm7g4_ef8e01c5-e132-4e07-9ce3-9a5578548ad7/manager/0.log" Oct 11 03:08:46 crc kubenswrapper[4743]: I1011 03:08:46.043424 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-79c6c9bd96-5sx97_3cf19c00-f066-4814-9134-4a6d4aed88a7/webhook-server/0.log" Oct 11 03:08:46 crc kubenswrapper[4743]: I1011 03:08:46.136983 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-9g4hk_83bddd85-204d-438d-a29f-e7fca659542a/kube-rbac-proxy/0.log" Oct 11 03:08:46 crc kubenswrapper[4743]: I1011 03:08:46.905585 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-9g4hk_83bddd85-204d-438d-a29f-e7fca659542a/speaker/0.log" Oct 11 03:08:47 crc kubenswrapper[4743]: I1011 03:08:47.721772 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-v524d_95e8f67c-537e-4744-a3c2-7dd93084f455/frr/0.log" Oct 11 03:08:58 crc kubenswrapper[4743]: I1011 03:08:58.673340 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37ds656x_2d977ae5-754b-436a-87b9-b0618947c353/util/0.log" Oct 11 03:08:58 crc kubenswrapper[4743]: 
I1011 03:08:58.843473 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37ds656x_2d977ae5-754b-436a-87b9-b0618947c353/pull/0.log" Oct 11 03:08:58 crc kubenswrapper[4743]: I1011 03:08:58.867388 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37ds656x_2d977ae5-754b-436a-87b9-b0618947c353/util/0.log" Oct 11 03:08:58 crc kubenswrapper[4743]: I1011 03:08:58.880146 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37ds656x_2d977ae5-754b-436a-87b9-b0618947c353/pull/0.log" Oct 11 03:08:59 crc kubenswrapper[4743]: I1011 03:08:59.027962 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37ds656x_2d977ae5-754b-436a-87b9-b0618947c353/pull/0.log" Oct 11 03:08:59 crc kubenswrapper[4743]: I1011 03:08:59.053707 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37ds656x_2d977ae5-754b-436a-87b9-b0618947c353/util/0.log" Oct 11 03:08:59 crc kubenswrapper[4743]: I1011 03:08:59.059852 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0999b653a84702c2c11c13a5389e053aa7f0501a9a88eec9014235a37ds656x_2d977ae5-754b-436a-87b9-b0618947c353/extract/0.log" Oct 11 03:08:59 crc kubenswrapper[4743]: I1011 03:08:59.437589 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d228vb8_69d0b6c4-03c5-4968-9dfb-0b6b2774954a/util/0.log" Oct 11 03:08:59 crc kubenswrapper[4743]: I1011 03:08:59.540458 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d228vb8_69d0b6c4-03c5-4968-9dfb-0b6b2774954a/util/0.log" Oct 11 03:08:59 crc kubenswrapper[4743]: I1011 03:08:59.568674 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d228vb8_69d0b6c4-03c5-4968-9dfb-0b6b2774954a/pull/0.log" Oct 11 03:08:59 crc kubenswrapper[4743]: I1011 03:08:59.615970 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d228vb8_69d0b6c4-03c5-4968-9dfb-0b6b2774954a/pull/0.log" Oct 11 03:08:59 crc kubenswrapper[4743]: I1011 03:08:59.726589 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d228vb8_69d0b6c4-03c5-4968-9dfb-0b6b2774954a/util/0.log" Oct 11 03:08:59 crc kubenswrapper[4743]: I1011 03:08:59.741544 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d228vb8_69d0b6c4-03c5-4968-9dfb-0b6b2774954a/extract/0.log" Oct 11 03:08:59 crc kubenswrapper[4743]: I1011 03:08:59.775371 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d228vb8_69d0b6c4-03c5-4968-9dfb-0b6b2774954a/pull/0.log" Oct 11 03:08:59 crc kubenswrapper[4743]: I1011 03:08:59.925122 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddtlkt_f424c028-a0f6-4327-90f7-338a9d21043c/util/0.log" Oct 11 03:09:00 crc kubenswrapper[4743]: I1011 03:09:00.122779 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddtlkt_f424c028-a0f6-4327-90f7-338a9d21043c/util/0.log" Oct 11 
03:09:00 crc kubenswrapper[4743]: I1011 03:09:00.129104 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddtlkt_f424c028-a0f6-4327-90f7-338a9d21043c/pull/0.log" Oct 11 03:09:00 crc kubenswrapper[4743]: I1011 03:09:00.139788 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddtlkt_f424c028-a0f6-4327-90f7-338a9d21043c/pull/0.log" Oct 11 03:09:00 crc kubenswrapper[4743]: I1011 03:09:00.271029 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddtlkt_f424c028-a0f6-4327-90f7-338a9d21043c/util/0.log" Oct 11 03:09:00 crc kubenswrapper[4743]: I1011 03:09:00.289934 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddtlkt_f424c028-a0f6-4327-90f7-338a9d21043c/pull/0.log" Oct 11 03:09:00 crc kubenswrapper[4743]: I1011 03:09:00.315548 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddtlkt_f424c028-a0f6-4327-90f7-338a9d21043c/extract/0.log" Oct 11 03:09:00 crc kubenswrapper[4743]: I1011 03:09:00.451349 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa6018n9vc_3f25a7dd-4148-4251-9896-0d781682a3c3/util/0.log" Oct 11 03:09:00 crc kubenswrapper[4743]: I1011 03:09:00.660192 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa6018n9vc_3f25a7dd-4148-4251-9896-0d781682a3c3/util/0.log" Oct 11 03:09:00 crc kubenswrapper[4743]: I1011 03:09:00.674108 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa6018n9vc_3f25a7dd-4148-4251-9896-0d781682a3c3/pull/0.log" Oct 11 03:09:00 crc kubenswrapper[4743]: I1011 03:09:00.681604 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa6018n9vc_3f25a7dd-4148-4251-9896-0d781682a3c3/pull/0.log" Oct 11 03:09:00 crc kubenswrapper[4743]: I1011 03:09:00.863568 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa6018n9vc_3f25a7dd-4148-4251-9896-0d781682a3c3/pull/0.log" Oct 11 03:09:00 crc kubenswrapper[4743]: I1011 03:09:00.876578 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa6018n9vc_3f25a7dd-4148-4251-9896-0d781682a3c3/util/0.log" Oct 11 03:09:00 crc kubenswrapper[4743]: I1011 03:09:00.882402 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_b750ce2fcb78a523ce3c4c91d54ad8430abb37e936593acebfbbbfa6018n9vc_3f25a7dd-4148-4251-9896-0d781682a3c3/extract/0.log" Oct 11 03:09:01 crc kubenswrapper[4743]: I1011 03:09:01.062244 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wqwgk_3543dfa9-ce3b-48b3-bc13-70ade6294a3b/extract-utilities/0.log" Oct 11 03:09:01 crc kubenswrapper[4743]: I1011 03:09:01.222120 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wqwgk_3543dfa9-ce3b-48b3-bc13-70ade6294a3b/extract-content/0.log" Oct 11 03:09:01 crc kubenswrapper[4743]: I1011 03:09:01.290685 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wqwgk_3543dfa9-ce3b-48b3-bc13-70ade6294a3b/extract-utilities/0.log" Oct 11 03:09:01 crc kubenswrapper[4743]: I1011 03:09:01.293660 4743 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wqwgk_3543dfa9-ce3b-48b3-bc13-70ade6294a3b/extract-content/0.log" Oct 11 03:09:01 crc kubenswrapper[4743]: I1011 03:09:01.469716 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wqwgk_3543dfa9-ce3b-48b3-bc13-70ade6294a3b/extract-utilities/0.log" Oct 11 03:09:01 crc kubenswrapper[4743]: I1011 03:09:01.535302 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wqwgk_3543dfa9-ce3b-48b3-bc13-70ade6294a3b/extract-content/0.log" Oct 11 03:09:01 crc kubenswrapper[4743]: I1011 03:09:01.764965 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-l6gl2_8b619db5-930f-4298-9d1d-2c74a9e60783/extract-utilities/0.log" Oct 11 03:09:01 crc kubenswrapper[4743]: I1011 03:09:01.992146 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-l6gl2_8b619db5-930f-4298-9d1d-2c74a9e60783/extract-content/0.log" Oct 11 03:09:02 crc kubenswrapper[4743]: I1011 03:09:02.001778 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-l6gl2_8b619db5-930f-4298-9d1d-2c74a9e60783/extract-utilities/0.log" Oct 11 03:09:02 crc kubenswrapper[4743]: I1011 03:09:02.048000 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-l6gl2_8b619db5-930f-4298-9d1d-2c74a9e60783/extract-content/0.log" Oct 11 03:09:02 crc kubenswrapper[4743]: I1011 03:09:02.333813 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-l6gl2_8b619db5-930f-4298-9d1d-2c74a9e60783/extract-utilities/0.log" Oct 11 03:09:02 crc kubenswrapper[4743]: I1011 03:09:02.359916 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-l6gl2_8b619db5-930f-4298-9d1d-2c74a9e60783/extract-content/0.log" Oct 11 03:09:02 crc kubenswrapper[4743]: I1011 03:09:02.441284 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wqwgk_3543dfa9-ce3b-48b3-bc13-70ade6294a3b/registry-server/0.log" Oct 11 03:09:02 crc kubenswrapper[4743]: I1011 03:09:02.665167 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6ntgk_7c5ed10c-1531-47da-8329-5589a12da9ac/util/0.log" Oct 11 03:09:02 crc kubenswrapper[4743]: I1011 03:09:02.782769 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6ntgk_7c5ed10c-1531-47da-8329-5589a12da9ac/util/0.log" Oct 11 03:09:02 crc kubenswrapper[4743]: I1011 03:09:02.856990 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6ntgk_7c5ed10c-1531-47da-8329-5589a12da9ac/pull/0.log" Oct 11 03:09:02 crc kubenswrapper[4743]: I1011 03:09:02.858198 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6ntgk_7c5ed10c-1531-47da-8329-5589a12da9ac/pull/0.log" Oct 11 03:09:03 crc kubenswrapper[4743]: I1011 03:09:03.105244 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6ntgk_7c5ed10c-1531-47da-8329-5589a12da9ac/pull/0.log" Oct 11 03:09:03 crc kubenswrapper[4743]: I1011 03:09:03.133589 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6ntgk_7c5ed10c-1531-47da-8329-5589a12da9ac/extract/0.log" Oct 11 03:09:03 crc kubenswrapper[4743]: I1011 
03:09:03.206236 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6ntgk_7c5ed10c-1531-47da-8329-5589a12da9ac/util/0.log" Oct 11 03:09:03 crc kubenswrapper[4743]: I1011 03:09:03.336063 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-cpcdj_ecba19cf-13a3-40ee-8d5a-17af54a79caa/marketplace-operator/0.log" Oct 11 03:09:03 crc kubenswrapper[4743]: I1011 03:09:03.406029 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ttsr5_3bd2c6b7-4918-4cb2-abcd-efa1523befe0/extract-utilities/0.log" Oct 11 03:09:03 crc kubenswrapper[4743]: I1011 03:09:03.542061 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-l6gl2_8b619db5-930f-4298-9d1d-2c74a9e60783/registry-server/0.log" Oct 11 03:09:03 crc kubenswrapper[4743]: I1011 03:09:03.595439 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ttsr5_3bd2c6b7-4918-4cb2-abcd-efa1523befe0/extract-content/0.log" Oct 11 03:09:03 crc kubenswrapper[4743]: I1011 03:09:03.635208 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ttsr5_3bd2c6b7-4918-4cb2-abcd-efa1523befe0/extract-utilities/0.log" Oct 11 03:09:03 crc kubenswrapper[4743]: I1011 03:09:03.651060 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ttsr5_3bd2c6b7-4918-4cb2-abcd-efa1523befe0/extract-content/0.log" Oct 11 03:09:03 crc kubenswrapper[4743]: I1011 03:09:03.815481 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ttsr5_3bd2c6b7-4918-4cb2-abcd-efa1523befe0/extract-utilities/0.log" Oct 11 03:09:03 crc kubenswrapper[4743]: I1011 03:09:03.870786 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-vs4lk_a6061b50-92be-4942-b961-a094b28b50a9/extract-utilities/0.log" Oct 11 03:09:03 crc kubenswrapper[4743]: I1011 03:09:03.904399 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ttsr5_3bd2c6b7-4918-4cb2-abcd-efa1523befe0/extract-content/0.log" Oct 11 03:09:04 crc kubenswrapper[4743]: I1011 03:09:04.071404 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ttsr5_3bd2c6b7-4918-4cb2-abcd-efa1523befe0/registry-server/0.log" Oct 11 03:09:04 crc kubenswrapper[4743]: I1011 03:09:04.124662 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vs4lk_a6061b50-92be-4942-b961-a094b28b50a9/extract-content/0.log" Oct 11 03:09:04 crc kubenswrapper[4743]: I1011 03:09:04.136825 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vs4lk_a6061b50-92be-4942-b961-a094b28b50a9/extract-content/0.log" Oct 11 03:09:04 crc kubenswrapper[4743]: I1011 03:09:04.147222 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vs4lk_a6061b50-92be-4942-b961-a094b28b50a9/extract-utilities/0.log" Oct 11 03:09:04 crc kubenswrapper[4743]: I1011 03:09:04.355777 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vs4lk_a6061b50-92be-4942-b961-a094b28b50a9/extract-utilities/0.log" Oct 11 03:09:04 crc kubenswrapper[4743]: I1011 03:09:04.380054 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vs4lk_a6061b50-92be-4942-b961-a094b28b50a9/extract-content/0.log" Oct 11 03:09:05 crc kubenswrapper[4743]: I1011 03:09:05.427968 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vs4lk_a6061b50-92be-4942-b961-a094b28b50a9/registry-server/0.log" Oct 11 
03:09:13 crc kubenswrapper[4743]: I1011 03:09:13.409615 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-b5lt8"] Oct 11 03:09:13 crc kubenswrapper[4743]: E1011 03:09:13.410744 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f0e7754-386f-4be3-8108-ac89f6b627d6" containerName="container-00" Oct 11 03:09:13 crc kubenswrapper[4743]: I1011 03:09:13.410762 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f0e7754-386f-4be3-8108-ac89f6b627d6" containerName="container-00" Oct 11 03:09:13 crc kubenswrapper[4743]: I1011 03:09:13.411115 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f0e7754-386f-4be3-8108-ac89f6b627d6" containerName="container-00" Oct 11 03:09:13 crc kubenswrapper[4743]: I1011 03:09:13.413319 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b5lt8" Oct 11 03:09:13 crc kubenswrapper[4743]: I1011 03:09:13.431149 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b5lt8"] Oct 11 03:09:13 crc kubenswrapper[4743]: I1011 03:09:13.468283 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/168210fd-ccce-47e3-8a6d-1681a30b2c90-utilities\") pod \"certified-operators-b5lt8\" (UID: \"168210fd-ccce-47e3-8a6d-1681a30b2c90\") " pod="openshift-marketplace/certified-operators-b5lt8" Oct 11 03:09:13 crc kubenswrapper[4743]: I1011 03:09:13.468344 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cj788\" (UniqueName: \"kubernetes.io/projected/168210fd-ccce-47e3-8a6d-1681a30b2c90-kube-api-access-cj788\") pod \"certified-operators-b5lt8\" (UID: \"168210fd-ccce-47e3-8a6d-1681a30b2c90\") " pod="openshift-marketplace/certified-operators-b5lt8" Oct 11 03:09:13 crc kubenswrapper[4743]: I1011 
03:09:13.468508 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/168210fd-ccce-47e3-8a6d-1681a30b2c90-catalog-content\") pod \"certified-operators-b5lt8\" (UID: \"168210fd-ccce-47e3-8a6d-1681a30b2c90\") " pod="openshift-marketplace/certified-operators-b5lt8" Oct 11 03:09:13 crc kubenswrapper[4743]: I1011 03:09:13.570458 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/168210fd-ccce-47e3-8a6d-1681a30b2c90-catalog-content\") pod \"certified-operators-b5lt8\" (UID: \"168210fd-ccce-47e3-8a6d-1681a30b2c90\") " pod="openshift-marketplace/certified-operators-b5lt8" Oct 11 03:09:13 crc kubenswrapper[4743]: I1011 03:09:13.570588 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/168210fd-ccce-47e3-8a6d-1681a30b2c90-utilities\") pod \"certified-operators-b5lt8\" (UID: \"168210fd-ccce-47e3-8a6d-1681a30b2c90\") " pod="openshift-marketplace/certified-operators-b5lt8" Oct 11 03:09:13 crc kubenswrapper[4743]: I1011 03:09:13.570626 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cj788\" (UniqueName: \"kubernetes.io/projected/168210fd-ccce-47e3-8a6d-1681a30b2c90-kube-api-access-cj788\") pod \"certified-operators-b5lt8\" (UID: \"168210fd-ccce-47e3-8a6d-1681a30b2c90\") " pod="openshift-marketplace/certified-operators-b5lt8" Oct 11 03:09:13 crc kubenswrapper[4743]: I1011 03:09:13.571275 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/168210fd-ccce-47e3-8a6d-1681a30b2c90-utilities\") pod \"certified-operators-b5lt8\" (UID: \"168210fd-ccce-47e3-8a6d-1681a30b2c90\") " pod="openshift-marketplace/certified-operators-b5lt8" Oct 11 03:09:13 crc kubenswrapper[4743]: I1011 03:09:13.571404 
4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/168210fd-ccce-47e3-8a6d-1681a30b2c90-catalog-content\") pod \"certified-operators-b5lt8\" (UID: \"168210fd-ccce-47e3-8a6d-1681a30b2c90\") " pod="openshift-marketplace/certified-operators-b5lt8" Oct 11 03:09:13 crc kubenswrapper[4743]: I1011 03:09:13.591884 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cj788\" (UniqueName: \"kubernetes.io/projected/168210fd-ccce-47e3-8a6d-1681a30b2c90-kube-api-access-cj788\") pod \"certified-operators-b5lt8\" (UID: \"168210fd-ccce-47e3-8a6d-1681a30b2c90\") " pod="openshift-marketplace/certified-operators-b5lt8" Oct 11 03:09:13 crc kubenswrapper[4743]: I1011 03:09:13.740807 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b5lt8" Oct 11 03:09:14 crc kubenswrapper[4743]: I1011 03:09:14.458207 4743 patch_prober.go:28] interesting pod/machine-config-daemon-cvm72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 03:09:14 crc kubenswrapper[4743]: I1011 03:09:14.458586 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 03:09:14 crc kubenswrapper[4743]: I1011 03:09:14.458633 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" Oct 11 03:09:14 crc kubenswrapper[4743]: I1011 03:09:14.462822 4743 kuberuntime_manager.go:1027] "Message for Container of 
pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"62fef52b9bc704a24e77d3838ec6b6ea99e34891429f790a37c7da87d4fa7a3f"} pod="openshift-machine-config-operator/machine-config-daemon-cvm72" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 11 03:09:14 crc kubenswrapper[4743]: I1011 03:09:14.463200 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" containerID="cri-o://62fef52b9bc704a24e77d3838ec6b6ea99e34891429f790a37c7da87d4fa7a3f" gracePeriod=600 Oct 11 03:09:14 crc kubenswrapper[4743]: I1011 03:09:14.752678 4743 generic.go:334] "Generic (PLEG): container finished" podID="add92263-e252-446b-95de-092585b4357f" containerID="62fef52b9bc704a24e77d3838ec6b6ea99e34891429f790a37c7da87d4fa7a3f" exitCode=0 Oct 11 03:09:14 crc kubenswrapper[4743]: I1011 03:09:14.753015 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" event={"ID":"add92263-e252-446b-95de-092585b4357f","Type":"ContainerDied","Data":"62fef52b9bc704a24e77d3838ec6b6ea99e34891429f790a37c7da87d4fa7a3f"} Oct 11 03:09:14 crc kubenswrapper[4743]: I1011 03:09:14.753051 4743 scope.go:117] "RemoveContainer" containerID="5156e89fb638919c369d76a46669f0fc82773aee80ae1bf54eefcf957683ec61" Oct 11 03:09:14 crc kubenswrapper[4743]: I1011 03:09:14.855353 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b5lt8"] Oct 11 03:09:15 crc kubenswrapper[4743]: I1011 03:09:15.786628 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" event={"ID":"add92263-e252-446b-95de-092585b4357f","Type":"ContainerStarted","Data":"0a303352eb43c1d43d7696b12e7ddb4c2224822a23445b7dd5541ffcab230f46"} Oct 11 03:09:15 
crc kubenswrapper[4743]: I1011 03:09:15.791025 4743 generic.go:334] "Generic (PLEG): container finished" podID="168210fd-ccce-47e3-8a6d-1681a30b2c90" containerID="1b7bcc291a38fa416910ecd8b0cc92907c5c8667a253d043d5ae8b9f3bdd7381" exitCode=0 Oct 11 03:09:15 crc kubenswrapper[4743]: I1011 03:09:15.791089 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b5lt8" event={"ID":"168210fd-ccce-47e3-8a6d-1681a30b2c90","Type":"ContainerDied","Data":"1b7bcc291a38fa416910ecd8b0cc92907c5c8667a253d043d5ae8b9f3bdd7381"} Oct 11 03:09:15 crc kubenswrapper[4743]: I1011 03:09:15.791112 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b5lt8" event={"ID":"168210fd-ccce-47e3-8a6d-1681a30b2c90","Type":"ContainerStarted","Data":"9fd7e36318a510149656f26415b4fde908585c250ba254b798eafb1735ef29a1"} Oct 11 03:09:15 crc kubenswrapper[4743]: I1011 03:09:15.795199 4743 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 11 03:09:16 crc kubenswrapper[4743]: I1011 03:09:16.801541 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b5lt8" event={"ID":"168210fd-ccce-47e3-8a6d-1681a30b2c90","Type":"ContainerStarted","Data":"25fffcb2c01fec10f80a2029dc7e02e5150857a80ee44043fc48eda0a04d0e33"} Oct 11 03:09:17 crc kubenswrapper[4743]: I1011 03:09:17.114736 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-7c8cf85677-4ftxm_9918e6d2-f9d4-4c2f-93ef-cc952577182b/prometheus-operator/0.log" Oct 11 03:09:17 crc kubenswrapper[4743]: I1011 03:09:17.337385 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-55f6745849-6fxfg_e62c0910-f3a4-4c85-9ad5-88f6fa5262df/prometheus-operator-admission-webhook/0.log" Oct 11 03:09:17 crc kubenswrapper[4743]: I1011 03:09:17.468156 4743 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-55f6745849-chpjb_97d6cff1-f86b-4110-9c64-907a97ea4ceb/prometheus-operator-admission-webhook/0.log" Oct 11 03:09:17 crc kubenswrapper[4743]: I1011 03:09:17.590999 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-cc5f78dfc-sgfrq_c4642856-48b2-4843-a11e-1a207a8c8efc/operator/0.log" Oct 11 03:09:17 crc kubenswrapper[4743]: I1011 03:09:17.697203 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-ui-dashboards-6584dc9448-tzxl5_c4bbc3ff-d45e-46ac-a8fc-6b75e1f5d342/observability-ui-dashboards/0.log" Oct 11 03:09:17 crc kubenswrapper[4743]: I1011 03:09:17.787400 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-54bc95c9fb-bdtnb_fe87b24f-db4d-49cd-a2de-ab949443ecea/perses-operator/0.log" Oct 11 03:09:18 crc kubenswrapper[4743]: I1011 03:09:18.831025 4743 generic.go:334] "Generic (PLEG): container finished" podID="168210fd-ccce-47e3-8a6d-1681a30b2c90" containerID="25fffcb2c01fec10f80a2029dc7e02e5150857a80ee44043fc48eda0a04d0e33" exitCode=0 Oct 11 03:09:18 crc kubenswrapper[4743]: I1011 03:09:18.831070 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b5lt8" event={"ID":"168210fd-ccce-47e3-8a6d-1681a30b2c90","Type":"ContainerDied","Data":"25fffcb2c01fec10f80a2029dc7e02e5150857a80ee44043fc48eda0a04d0e33"} Oct 11 03:09:19 crc kubenswrapper[4743]: I1011 03:09:19.853102 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b5lt8" event={"ID":"168210fd-ccce-47e3-8a6d-1681a30b2c90","Type":"ContainerStarted","Data":"2d82a74101134592e8cd5f9310e2e879b5cca1dab49c3576b3a0635deb979813"} Oct 11 03:09:19 crc kubenswrapper[4743]: I1011 03:09:19.872166 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/certified-operators-b5lt8" podStartSLOduration=3.375797031 podStartE2EDuration="6.872143242s" podCreationTimestamp="2025-10-11 03:09:13 +0000 UTC" firstStartedPulling="2025-10-11 03:09:15.79497283 +0000 UTC m=+8250.447953227" lastFinishedPulling="2025-10-11 03:09:19.291319041 +0000 UTC m=+8253.944299438" observedRunningTime="2025-10-11 03:09:19.869806494 +0000 UTC m=+8254.522786901" watchObservedRunningTime="2025-10-11 03:09:19.872143242 +0000 UTC m=+8254.525123639" Oct 11 03:09:23 crc kubenswrapper[4743]: I1011 03:09:23.741533 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-b5lt8" Oct 11 03:09:23 crc kubenswrapper[4743]: I1011 03:09:23.741920 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-b5lt8" Oct 11 03:09:23 crc kubenswrapper[4743]: I1011 03:09:23.823791 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-b5lt8" Oct 11 03:09:27 crc kubenswrapper[4743]: I1011 03:09:27.056234 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5jckn"] Oct 11 03:09:27 crc kubenswrapper[4743]: I1011 03:09:27.060597 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5jckn" Oct 11 03:09:27 crc kubenswrapper[4743]: I1011 03:09:27.072093 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5jckn"] Oct 11 03:09:27 crc kubenswrapper[4743]: I1011 03:09:27.186485 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49afed2e-0a2a-491c-a6ba-caac5035ab42-utilities\") pod \"community-operators-5jckn\" (UID: \"49afed2e-0a2a-491c-a6ba-caac5035ab42\") " pod="openshift-marketplace/community-operators-5jckn" Oct 11 03:09:27 crc kubenswrapper[4743]: I1011 03:09:27.187108 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49afed2e-0a2a-491c-a6ba-caac5035ab42-catalog-content\") pod \"community-operators-5jckn\" (UID: \"49afed2e-0a2a-491c-a6ba-caac5035ab42\") " pod="openshift-marketplace/community-operators-5jckn" Oct 11 03:09:27 crc kubenswrapper[4743]: I1011 03:09:27.187265 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jmpk\" (UniqueName: \"kubernetes.io/projected/49afed2e-0a2a-491c-a6ba-caac5035ab42-kube-api-access-9jmpk\") pod \"community-operators-5jckn\" (UID: \"49afed2e-0a2a-491c-a6ba-caac5035ab42\") " pod="openshift-marketplace/community-operators-5jckn" Oct 11 03:09:27 crc kubenswrapper[4743]: I1011 03:09:27.289273 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49afed2e-0a2a-491c-a6ba-caac5035ab42-utilities\") pod \"community-operators-5jckn\" (UID: \"49afed2e-0a2a-491c-a6ba-caac5035ab42\") " pod="openshift-marketplace/community-operators-5jckn" Oct 11 03:09:27 crc kubenswrapper[4743]: I1011 03:09:27.289476 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49afed2e-0a2a-491c-a6ba-caac5035ab42-catalog-content\") pod \"community-operators-5jckn\" (UID: \"49afed2e-0a2a-491c-a6ba-caac5035ab42\") " pod="openshift-marketplace/community-operators-5jckn" Oct 11 03:09:27 crc kubenswrapper[4743]: I1011 03:09:27.289519 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jmpk\" (UniqueName: \"kubernetes.io/projected/49afed2e-0a2a-491c-a6ba-caac5035ab42-kube-api-access-9jmpk\") pod \"community-operators-5jckn\" (UID: \"49afed2e-0a2a-491c-a6ba-caac5035ab42\") " pod="openshift-marketplace/community-operators-5jckn" Oct 11 03:09:27 crc kubenswrapper[4743]: I1011 03:09:27.290316 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49afed2e-0a2a-491c-a6ba-caac5035ab42-utilities\") pod \"community-operators-5jckn\" (UID: \"49afed2e-0a2a-491c-a6ba-caac5035ab42\") " pod="openshift-marketplace/community-operators-5jckn" Oct 11 03:09:27 crc kubenswrapper[4743]: I1011 03:09:27.290395 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49afed2e-0a2a-491c-a6ba-caac5035ab42-catalog-content\") pod \"community-operators-5jckn\" (UID: \"49afed2e-0a2a-491c-a6ba-caac5035ab42\") " pod="openshift-marketplace/community-operators-5jckn" Oct 11 03:09:27 crc kubenswrapper[4743]: I1011 03:09:27.314820 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jmpk\" (UniqueName: \"kubernetes.io/projected/49afed2e-0a2a-491c-a6ba-caac5035ab42-kube-api-access-9jmpk\") pod \"community-operators-5jckn\" (UID: \"49afed2e-0a2a-491c-a6ba-caac5035ab42\") " pod="openshift-marketplace/community-operators-5jckn" Oct 11 03:09:27 crc kubenswrapper[4743]: I1011 03:09:27.392700 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5jckn" Oct 11 03:09:27 crc kubenswrapper[4743]: I1011 03:09:27.940533 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5jckn"] Oct 11 03:09:28 crc kubenswrapper[4743]: I1011 03:09:28.941849 4743 generic.go:334] "Generic (PLEG): container finished" podID="49afed2e-0a2a-491c-a6ba-caac5035ab42" containerID="a166f758b776f2c49a80ca48c2225952de56922ebe497bf21d5b31fc1cd72d03" exitCode=0 Oct 11 03:09:28 crc kubenswrapper[4743]: I1011 03:09:28.941957 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5jckn" event={"ID":"49afed2e-0a2a-491c-a6ba-caac5035ab42","Type":"ContainerDied","Data":"a166f758b776f2c49a80ca48c2225952de56922ebe497bf21d5b31fc1cd72d03"} Oct 11 03:09:28 crc kubenswrapper[4743]: I1011 03:09:28.943339 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5jckn" event={"ID":"49afed2e-0a2a-491c-a6ba-caac5035ab42","Type":"ContainerStarted","Data":"2e47a9188f8059f19f18e946c06e15812c034ce9b51a5735544b3c6e889b6503"} Oct 11 03:09:30 crc kubenswrapper[4743]: I1011 03:09:30.963115 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5jckn" event={"ID":"49afed2e-0a2a-491c-a6ba-caac5035ab42","Type":"ContainerStarted","Data":"5282000bdafe39d6482b921e0393b88f4cc4d030835200c826b5baf8e83dd1db"} Oct 11 03:09:31 crc kubenswrapper[4743]: I1011 03:09:31.975099 4743 generic.go:334] "Generic (PLEG): container finished" podID="49afed2e-0a2a-491c-a6ba-caac5035ab42" containerID="5282000bdafe39d6482b921e0393b88f4cc4d030835200c826b5baf8e83dd1db" exitCode=0 Oct 11 03:09:31 crc kubenswrapper[4743]: I1011 03:09:31.975146 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5jckn" 
event={"ID":"49afed2e-0a2a-491c-a6ba-caac5035ab42","Type":"ContainerDied","Data":"5282000bdafe39d6482b921e0393b88f4cc4d030835200c826b5baf8e83dd1db"} Oct 11 03:09:32 crc kubenswrapper[4743]: I1011 03:09:32.249336 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-7857f779b4-t484n_d60b36a6-87ff-4325-a245-43b3dea4cfaf/manager/0.log" Oct 11 03:09:32 crc kubenswrapper[4743]: I1011 03:09:32.267316 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-7857f779b4-t484n_d60b36a6-87ff-4325-a245-43b3dea4cfaf/kube-rbac-proxy/0.log" Oct 11 03:09:32 crc kubenswrapper[4743]: I1011 03:09:32.986657 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5jckn" event={"ID":"49afed2e-0a2a-491c-a6ba-caac5035ab42","Type":"ContainerStarted","Data":"029929de6f2104d83edbe6364b173b4849def920af14bfc614e244c4b527def7"} Oct 11 03:09:33 crc kubenswrapper[4743]: I1011 03:09:33.008971 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5jckn" podStartSLOduration=2.373439513 podStartE2EDuration="6.008948778s" podCreationTimestamp="2025-10-11 03:09:27 +0000 UTC" firstStartedPulling="2025-10-11 03:09:28.944958833 +0000 UTC m=+8263.597939240" lastFinishedPulling="2025-10-11 03:09:32.580468108 +0000 UTC m=+8267.233448505" observedRunningTime="2025-10-11 03:09:33.001946474 +0000 UTC m=+8267.654926871" watchObservedRunningTime="2025-10-11 03:09:33.008948778 +0000 UTC m=+8267.661929175" Oct 11 03:09:33 crc kubenswrapper[4743]: I1011 03:09:33.796154 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-b5lt8" Oct 11 03:09:34 crc kubenswrapper[4743]: I1011 03:09:34.437080 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-b5lt8"] Oct 11 
03:09:34 crc kubenswrapper[4743]: I1011 03:09:34.438239 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-b5lt8" podUID="168210fd-ccce-47e3-8a6d-1681a30b2c90" containerName="registry-server" containerID="cri-o://2d82a74101134592e8cd5f9310e2e879b5cca1dab49c3576b3a0635deb979813" gracePeriod=2 Oct 11 03:09:34 crc kubenswrapper[4743]: I1011 03:09:34.960381 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b5lt8" Oct 11 03:09:35 crc kubenswrapper[4743]: I1011 03:09:35.009143 4743 generic.go:334] "Generic (PLEG): container finished" podID="168210fd-ccce-47e3-8a6d-1681a30b2c90" containerID="2d82a74101134592e8cd5f9310e2e879b5cca1dab49c3576b3a0635deb979813" exitCode=0 Oct 11 03:09:35 crc kubenswrapper[4743]: I1011 03:09:35.009181 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b5lt8" event={"ID":"168210fd-ccce-47e3-8a6d-1681a30b2c90","Type":"ContainerDied","Data":"2d82a74101134592e8cd5f9310e2e879b5cca1dab49c3576b3a0635deb979813"} Oct 11 03:09:35 crc kubenswrapper[4743]: I1011 03:09:35.009209 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b5lt8" event={"ID":"168210fd-ccce-47e3-8a6d-1681a30b2c90","Type":"ContainerDied","Data":"9fd7e36318a510149656f26415b4fde908585c250ba254b798eafb1735ef29a1"} Oct 11 03:09:35 crc kubenswrapper[4743]: I1011 03:09:35.009227 4743 scope.go:117] "RemoveContainer" containerID="2d82a74101134592e8cd5f9310e2e879b5cca1dab49c3576b3a0635deb979813" Oct 11 03:09:35 crc kubenswrapper[4743]: I1011 03:09:35.009297 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-b5lt8" Oct 11 03:09:35 crc kubenswrapper[4743]: I1011 03:09:35.036745 4743 scope.go:117] "RemoveContainer" containerID="25fffcb2c01fec10f80a2029dc7e02e5150857a80ee44043fc48eda0a04d0e33" Oct 11 03:09:35 crc kubenswrapper[4743]: I1011 03:09:35.058449 4743 scope.go:117] "RemoveContainer" containerID="1b7bcc291a38fa416910ecd8b0cc92907c5c8667a253d043d5ae8b9f3bdd7381" Oct 11 03:09:35 crc kubenswrapper[4743]: I1011 03:09:35.061523 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/168210fd-ccce-47e3-8a6d-1681a30b2c90-utilities\") pod \"168210fd-ccce-47e3-8a6d-1681a30b2c90\" (UID: \"168210fd-ccce-47e3-8a6d-1681a30b2c90\") " Oct 11 03:09:35 crc kubenswrapper[4743]: I1011 03:09:35.061665 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cj788\" (UniqueName: \"kubernetes.io/projected/168210fd-ccce-47e3-8a6d-1681a30b2c90-kube-api-access-cj788\") pod \"168210fd-ccce-47e3-8a6d-1681a30b2c90\" (UID: \"168210fd-ccce-47e3-8a6d-1681a30b2c90\") " Oct 11 03:09:35 crc kubenswrapper[4743]: I1011 03:09:35.061954 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/168210fd-ccce-47e3-8a6d-1681a30b2c90-catalog-content\") pod \"168210fd-ccce-47e3-8a6d-1681a30b2c90\" (UID: \"168210fd-ccce-47e3-8a6d-1681a30b2c90\") " Oct 11 03:09:35 crc kubenswrapper[4743]: I1011 03:09:35.063382 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/168210fd-ccce-47e3-8a6d-1681a30b2c90-utilities" (OuterVolumeSpecName: "utilities") pod "168210fd-ccce-47e3-8a6d-1681a30b2c90" (UID: "168210fd-ccce-47e3-8a6d-1681a30b2c90"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 03:09:35 crc kubenswrapper[4743]: I1011 03:09:35.070134 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/168210fd-ccce-47e3-8a6d-1681a30b2c90-kube-api-access-cj788" (OuterVolumeSpecName: "kube-api-access-cj788") pod "168210fd-ccce-47e3-8a6d-1681a30b2c90" (UID: "168210fd-ccce-47e3-8a6d-1681a30b2c90"). InnerVolumeSpecName "kube-api-access-cj788". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 03:09:35 crc kubenswrapper[4743]: I1011 03:09:35.107456 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/168210fd-ccce-47e3-8a6d-1681a30b2c90-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "168210fd-ccce-47e3-8a6d-1681a30b2c90" (UID: "168210fd-ccce-47e3-8a6d-1681a30b2c90"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 03:09:35 crc kubenswrapper[4743]: I1011 03:09:35.164530 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cj788\" (UniqueName: \"kubernetes.io/projected/168210fd-ccce-47e3-8a6d-1681a30b2c90-kube-api-access-cj788\") on node \"crc\" DevicePath \"\"" Oct 11 03:09:35 crc kubenswrapper[4743]: I1011 03:09:35.164563 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/168210fd-ccce-47e3-8a6d-1681a30b2c90-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 11 03:09:35 crc kubenswrapper[4743]: I1011 03:09:35.164572 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/168210fd-ccce-47e3-8a6d-1681a30b2c90-utilities\") on node \"crc\" DevicePath \"\"" Oct 11 03:09:35 crc kubenswrapper[4743]: I1011 03:09:35.167728 4743 scope.go:117] "RemoveContainer" containerID="2d82a74101134592e8cd5f9310e2e879b5cca1dab49c3576b3a0635deb979813" Oct 11 03:09:35 crc kubenswrapper[4743]: E1011 
03:09:35.168498 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d82a74101134592e8cd5f9310e2e879b5cca1dab49c3576b3a0635deb979813\": container with ID starting with 2d82a74101134592e8cd5f9310e2e879b5cca1dab49c3576b3a0635deb979813 not found: ID does not exist" containerID="2d82a74101134592e8cd5f9310e2e879b5cca1dab49c3576b3a0635deb979813" Oct 11 03:09:35 crc kubenswrapper[4743]: I1011 03:09:35.168528 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d82a74101134592e8cd5f9310e2e879b5cca1dab49c3576b3a0635deb979813"} err="failed to get container status \"2d82a74101134592e8cd5f9310e2e879b5cca1dab49c3576b3a0635deb979813\": rpc error: code = NotFound desc = could not find container \"2d82a74101134592e8cd5f9310e2e879b5cca1dab49c3576b3a0635deb979813\": container with ID starting with 2d82a74101134592e8cd5f9310e2e879b5cca1dab49c3576b3a0635deb979813 not found: ID does not exist" Oct 11 03:09:35 crc kubenswrapper[4743]: I1011 03:09:35.168547 4743 scope.go:117] "RemoveContainer" containerID="25fffcb2c01fec10f80a2029dc7e02e5150857a80ee44043fc48eda0a04d0e33" Oct 11 03:09:35 crc kubenswrapper[4743]: E1011 03:09:35.168954 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25fffcb2c01fec10f80a2029dc7e02e5150857a80ee44043fc48eda0a04d0e33\": container with ID starting with 25fffcb2c01fec10f80a2029dc7e02e5150857a80ee44043fc48eda0a04d0e33 not found: ID does not exist" containerID="25fffcb2c01fec10f80a2029dc7e02e5150857a80ee44043fc48eda0a04d0e33" Oct 11 03:09:35 crc kubenswrapper[4743]: I1011 03:09:35.169000 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25fffcb2c01fec10f80a2029dc7e02e5150857a80ee44043fc48eda0a04d0e33"} err="failed to get container status \"25fffcb2c01fec10f80a2029dc7e02e5150857a80ee44043fc48eda0a04d0e33\": rpc 
error: code = NotFound desc = could not find container \"25fffcb2c01fec10f80a2029dc7e02e5150857a80ee44043fc48eda0a04d0e33\": container with ID starting with 25fffcb2c01fec10f80a2029dc7e02e5150857a80ee44043fc48eda0a04d0e33 not found: ID does not exist" Oct 11 03:09:35 crc kubenswrapper[4743]: I1011 03:09:35.169026 4743 scope.go:117] "RemoveContainer" containerID="1b7bcc291a38fa416910ecd8b0cc92907c5c8667a253d043d5ae8b9f3bdd7381" Oct 11 03:09:35 crc kubenswrapper[4743]: E1011 03:09:35.170027 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b7bcc291a38fa416910ecd8b0cc92907c5c8667a253d043d5ae8b9f3bdd7381\": container with ID starting with 1b7bcc291a38fa416910ecd8b0cc92907c5c8667a253d043d5ae8b9f3bdd7381 not found: ID does not exist" containerID="1b7bcc291a38fa416910ecd8b0cc92907c5c8667a253d043d5ae8b9f3bdd7381" Oct 11 03:09:35 crc kubenswrapper[4743]: I1011 03:09:35.170067 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b7bcc291a38fa416910ecd8b0cc92907c5c8667a253d043d5ae8b9f3bdd7381"} err="failed to get container status \"1b7bcc291a38fa416910ecd8b0cc92907c5c8667a253d043d5ae8b9f3bdd7381\": rpc error: code = NotFound desc = could not find container \"1b7bcc291a38fa416910ecd8b0cc92907c5c8667a253d043d5ae8b9f3bdd7381\": container with ID starting with 1b7bcc291a38fa416910ecd8b0cc92907c5c8667a253d043d5ae8b9f3bdd7381 not found: ID does not exist" Oct 11 03:09:35 crc kubenswrapper[4743]: I1011 03:09:35.342338 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-b5lt8"] Oct 11 03:09:35 crc kubenswrapper[4743]: I1011 03:09:35.350874 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-b5lt8"] Oct 11 03:09:36 crc kubenswrapper[4743]: I1011 03:09:36.107669 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="168210fd-ccce-47e3-8a6d-1681a30b2c90" path="/var/lib/kubelet/pods/168210fd-ccce-47e3-8a6d-1681a30b2c90/volumes" Oct 11 03:09:37 crc kubenswrapper[4743]: I1011 03:09:37.393626 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5jckn" Oct 11 03:09:37 crc kubenswrapper[4743]: I1011 03:09:37.393966 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5jckn" Oct 11 03:09:38 crc kubenswrapper[4743]: I1011 03:09:38.450531 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-5jckn" podUID="49afed2e-0a2a-491c-a6ba-caac5035ab42" containerName="registry-server" probeResult="failure" output=< Oct 11 03:09:38 crc kubenswrapper[4743]: timeout: failed to connect service ":50051" within 1s Oct 11 03:09:38 crc kubenswrapper[4743]: > Oct 11 03:09:38 crc kubenswrapper[4743]: E1011 03:09:38.928635 4743 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.106:53676->38.102.83.106:39201: write tcp 38.102.83.106:53676->38.102.83.106:39201: write: broken pipe Oct 11 03:09:47 crc kubenswrapper[4743]: I1011 03:09:47.444572 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5jckn" Oct 11 03:09:47 crc kubenswrapper[4743]: I1011 03:09:47.525511 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5jckn" Oct 11 03:09:47 crc kubenswrapper[4743]: I1011 03:09:47.681693 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5jckn"] Oct 11 03:09:49 crc kubenswrapper[4743]: I1011 03:09:49.206932 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5jckn" podUID="49afed2e-0a2a-491c-a6ba-caac5035ab42" 
containerName="registry-server" containerID="cri-o://029929de6f2104d83edbe6364b173b4849def920af14bfc614e244c4b527def7" gracePeriod=2 Oct 11 03:09:49 crc kubenswrapper[4743]: I1011 03:09:49.897225 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5jckn" Oct 11 03:09:50 crc kubenswrapper[4743]: I1011 03:09:50.013231 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49afed2e-0a2a-491c-a6ba-caac5035ab42-catalog-content\") pod \"49afed2e-0a2a-491c-a6ba-caac5035ab42\" (UID: \"49afed2e-0a2a-491c-a6ba-caac5035ab42\") " Oct 11 03:09:50 crc kubenswrapper[4743]: I1011 03:09:50.013695 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49afed2e-0a2a-491c-a6ba-caac5035ab42-utilities\") pod \"49afed2e-0a2a-491c-a6ba-caac5035ab42\" (UID: \"49afed2e-0a2a-491c-a6ba-caac5035ab42\") " Oct 11 03:09:50 crc kubenswrapper[4743]: I1011 03:09:50.013947 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jmpk\" (UniqueName: \"kubernetes.io/projected/49afed2e-0a2a-491c-a6ba-caac5035ab42-kube-api-access-9jmpk\") pod \"49afed2e-0a2a-491c-a6ba-caac5035ab42\" (UID: \"49afed2e-0a2a-491c-a6ba-caac5035ab42\") " Oct 11 03:09:50 crc kubenswrapper[4743]: I1011 03:09:50.015071 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49afed2e-0a2a-491c-a6ba-caac5035ab42-utilities" (OuterVolumeSpecName: "utilities") pod "49afed2e-0a2a-491c-a6ba-caac5035ab42" (UID: "49afed2e-0a2a-491c-a6ba-caac5035ab42"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 03:09:50 crc kubenswrapper[4743]: I1011 03:09:50.015801 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49afed2e-0a2a-491c-a6ba-caac5035ab42-utilities\") on node \"crc\" DevicePath \"\"" Oct 11 03:09:50 crc kubenswrapper[4743]: I1011 03:09:50.023419 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49afed2e-0a2a-491c-a6ba-caac5035ab42-kube-api-access-9jmpk" (OuterVolumeSpecName: "kube-api-access-9jmpk") pod "49afed2e-0a2a-491c-a6ba-caac5035ab42" (UID: "49afed2e-0a2a-491c-a6ba-caac5035ab42"). InnerVolumeSpecName "kube-api-access-9jmpk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 03:09:50 crc kubenswrapper[4743]: I1011 03:09:50.086951 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49afed2e-0a2a-491c-a6ba-caac5035ab42-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "49afed2e-0a2a-491c-a6ba-caac5035ab42" (UID: "49afed2e-0a2a-491c-a6ba-caac5035ab42"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 03:09:50 crc kubenswrapper[4743]: I1011 03:09:50.119104 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49afed2e-0a2a-491c-a6ba-caac5035ab42-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 11 03:09:50 crc kubenswrapper[4743]: I1011 03:09:50.119137 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jmpk\" (UniqueName: \"kubernetes.io/projected/49afed2e-0a2a-491c-a6ba-caac5035ab42-kube-api-access-9jmpk\") on node \"crc\" DevicePath \"\"" Oct 11 03:09:50 crc kubenswrapper[4743]: I1011 03:09:50.216713 4743 generic.go:334] "Generic (PLEG): container finished" podID="49afed2e-0a2a-491c-a6ba-caac5035ab42" containerID="029929de6f2104d83edbe6364b173b4849def920af14bfc614e244c4b527def7" exitCode=0 Oct 11 03:09:50 crc kubenswrapper[4743]: I1011 03:09:50.216757 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5jckn" event={"ID":"49afed2e-0a2a-491c-a6ba-caac5035ab42","Type":"ContainerDied","Data":"029929de6f2104d83edbe6364b173b4849def920af14bfc614e244c4b527def7"} Oct 11 03:09:50 crc kubenswrapper[4743]: I1011 03:09:50.216786 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5jckn" Oct 11 03:09:50 crc kubenswrapper[4743]: I1011 03:09:50.216808 4743 scope.go:117] "RemoveContainer" containerID="029929de6f2104d83edbe6364b173b4849def920af14bfc614e244c4b527def7" Oct 11 03:09:50 crc kubenswrapper[4743]: I1011 03:09:50.216797 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5jckn" event={"ID":"49afed2e-0a2a-491c-a6ba-caac5035ab42","Type":"ContainerDied","Data":"2e47a9188f8059f19f18e946c06e15812c034ce9b51a5735544b3c6e889b6503"} Oct 11 03:09:50 crc kubenswrapper[4743]: I1011 03:09:50.250518 4743 scope.go:117] "RemoveContainer" containerID="5282000bdafe39d6482b921e0393b88f4cc4d030835200c826b5baf8e83dd1db" Oct 11 03:09:50 crc kubenswrapper[4743]: I1011 03:09:50.281938 4743 scope.go:117] "RemoveContainer" containerID="a166f758b776f2c49a80ca48c2225952de56922ebe497bf21d5b31fc1cd72d03" Oct 11 03:09:50 crc kubenswrapper[4743]: I1011 03:09:50.288896 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5jckn"] Oct 11 03:09:50 crc kubenswrapper[4743]: I1011 03:09:50.304403 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5jckn"] Oct 11 03:09:50 crc kubenswrapper[4743]: I1011 03:09:50.385499 4743 scope.go:117] "RemoveContainer" containerID="029929de6f2104d83edbe6364b173b4849def920af14bfc614e244c4b527def7" Oct 11 03:09:50 crc kubenswrapper[4743]: E1011 03:09:50.389039 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"029929de6f2104d83edbe6364b173b4849def920af14bfc614e244c4b527def7\": container with ID starting with 029929de6f2104d83edbe6364b173b4849def920af14bfc614e244c4b527def7 not found: ID does not exist" containerID="029929de6f2104d83edbe6364b173b4849def920af14bfc614e244c4b527def7" Oct 11 03:09:50 crc kubenswrapper[4743]: I1011 03:09:50.389084 4743 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"029929de6f2104d83edbe6364b173b4849def920af14bfc614e244c4b527def7"} err="failed to get container status \"029929de6f2104d83edbe6364b173b4849def920af14bfc614e244c4b527def7\": rpc error: code = NotFound desc = could not find container \"029929de6f2104d83edbe6364b173b4849def920af14bfc614e244c4b527def7\": container with ID starting with 029929de6f2104d83edbe6364b173b4849def920af14bfc614e244c4b527def7 not found: ID does not exist" Oct 11 03:09:50 crc kubenswrapper[4743]: I1011 03:09:50.389107 4743 scope.go:117] "RemoveContainer" containerID="5282000bdafe39d6482b921e0393b88f4cc4d030835200c826b5baf8e83dd1db" Oct 11 03:09:50 crc kubenswrapper[4743]: E1011 03:09:50.391727 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5282000bdafe39d6482b921e0393b88f4cc4d030835200c826b5baf8e83dd1db\": container with ID starting with 5282000bdafe39d6482b921e0393b88f4cc4d030835200c826b5baf8e83dd1db not found: ID does not exist" containerID="5282000bdafe39d6482b921e0393b88f4cc4d030835200c826b5baf8e83dd1db" Oct 11 03:09:50 crc kubenswrapper[4743]: I1011 03:09:50.391772 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5282000bdafe39d6482b921e0393b88f4cc4d030835200c826b5baf8e83dd1db"} err="failed to get container status \"5282000bdafe39d6482b921e0393b88f4cc4d030835200c826b5baf8e83dd1db\": rpc error: code = NotFound desc = could not find container \"5282000bdafe39d6482b921e0393b88f4cc4d030835200c826b5baf8e83dd1db\": container with ID starting with 5282000bdafe39d6482b921e0393b88f4cc4d030835200c826b5baf8e83dd1db not found: ID does not exist" Oct 11 03:09:50 crc kubenswrapper[4743]: I1011 03:09:50.391796 4743 scope.go:117] "RemoveContainer" containerID="a166f758b776f2c49a80ca48c2225952de56922ebe497bf21d5b31fc1cd72d03" Oct 11 03:09:50 crc kubenswrapper[4743]: E1011 
03:09:50.395985 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a166f758b776f2c49a80ca48c2225952de56922ebe497bf21d5b31fc1cd72d03\": container with ID starting with a166f758b776f2c49a80ca48c2225952de56922ebe497bf21d5b31fc1cd72d03 not found: ID does not exist" containerID="a166f758b776f2c49a80ca48c2225952de56922ebe497bf21d5b31fc1cd72d03" Oct 11 03:09:50 crc kubenswrapper[4743]: I1011 03:09:50.396028 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a166f758b776f2c49a80ca48c2225952de56922ebe497bf21d5b31fc1cd72d03"} err="failed to get container status \"a166f758b776f2c49a80ca48c2225952de56922ebe497bf21d5b31fc1cd72d03\": rpc error: code = NotFound desc = could not find container \"a166f758b776f2c49a80ca48c2225952de56922ebe497bf21d5b31fc1cd72d03\": container with ID starting with a166f758b776f2c49a80ca48c2225952de56922ebe497bf21d5b31fc1cd72d03 not found: ID does not exist" Oct 11 03:09:52 crc kubenswrapper[4743]: I1011 03:09:52.107109 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49afed2e-0a2a-491c-a6ba-caac5035ab42" path="/var/lib/kubelet/pods/49afed2e-0a2a-491c-a6ba-caac5035ab42/volumes" Oct 11 03:10:19 crc kubenswrapper[4743]: I1011 03:10:19.647018 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-42jjb"] Oct 11 03:10:19 crc kubenswrapper[4743]: E1011 03:10:19.649757 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="168210fd-ccce-47e3-8a6d-1681a30b2c90" containerName="extract-utilities" Oct 11 03:10:19 crc kubenswrapper[4743]: I1011 03:10:19.649869 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="168210fd-ccce-47e3-8a6d-1681a30b2c90" containerName="extract-utilities" Oct 11 03:10:19 crc kubenswrapper[4743]: E1011 03:10:19.649950 4743 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="168210fd-ccce-47e3-8a6d-1681a30b2c90" containerName="registry-server" Oct 11 03:10:19 crc kubenswrapper[4743]: I1011 03:10:19.650019 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="168210fd-ccce-47e3-8a6d-1681a30b2c90" containerName="registry-server" Oct 11 03:10:19 crc kubenswrapper[4743]: E1011 03:10:19.650114 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49afed2e-0a2a-491c-a6ba-caac5035ab42" containerName="extract-content" Oct 11 03:10:19 crc kubenswrapper[4743]: I1011 03:10:19.650174 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="49afed2e-0a2a-491c-a6ba-caac5035ab42" containerName="extract-content" Oct 11 03:10:19 crc kubenswrapper[4743]: E1011 03:10:19.650244 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49afed2e-0a2a-491c-a6ba-caac5035ab42" containerName="registry-server" Oct 11 03:10:19 crc kubenswrapper[4743]: I1011 03:10:19.650304 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="49afed2e-0a2a-491c-a6ba-caac5035ab42" containerName="registry-server" Oct 11 03:10:19 crc kubenswrapper[4743]: E1011 03:10:19.650366 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49afed2e-0a2a-491c-a6ba-caac5035ab42" containerName="extract-utilities" Oct 11 03:10:19 crc kubenswrapper[4743]: I1011 03:10:19.650424 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="49afed2e-0a2a-491c-a6ba-caac5035ab42" containerName="extract-utilities" Oct 11 03:10:19 crc kubenswrapper[4743]: E1011 03:10:19.650487 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="168210fd-ccce-47e3-8a6d-1681a30b2c90" containerName="extract-content" Oct 11 03:10:19 crc kubenswrapper[4743]: I1011 03:10:19.650547 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="168210fd-ccce-47e3-8a6d-1681a30b2c90" containerName="extract-content" Oct 11 03:10:19 crc kubenswrapper[4743]: I1011 03:10:19.650948 4743 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="168210fd-ccce-47e3-8a6d-1681a30b2c90" containerName="registry-server" Oct 11 03:10:19 crc kubenswrapper[4743]: I1011 03:10:19.651065 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="49afed2e-0a2a-491c-a6ba-caac5035ab42" containerName="registry-server" Oct 11 03:10:19 crc kubenswrapper[4743]: I1011 03:10:19.653107 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-42jjb" Oct 11 03:10:19 crc kubenswrapper[4743]: I1011 03:10:19.657267 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-42jjb"] Oct 11 03:10:19 crc kubenswrapper[4743]: I1011 03:10:19.747668 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04d6a586-789e-47a9-85d1-d3c091ff5f68-catalog-content\") pod \"redhat-operators-42jjb\" (UID: \"04d6a586-789e-47a9-85d1-d3c091ff5f68\") " pod="openshift-marketplace/redhat-operators-42jjb" Oct 11 03:10:19 crc kubenswrapper[4743]: I1011 03:10:19.748045 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04d6a586-789e-47a9-85d1-d3c091ff5f68-utilities\") pod \"redhat-operators-42jjb\" (UID: \"04d6a586-789e-47a9-85d1-d3c091ff5f68\") " pod="openshift-marketplace/redhat-operators-42jjb" Oct 11 03:10:19 crc kubenswrapper[4743]: I1011 03:10:19.748402 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jv495\" (UniqueName: \"kubernetes.io/projected/04d6a586-789e-47a9-85d1-d3c091ff5f68-kube-api-access-jv495\") pod \"redhat-operators-42jjb\" (UID: \"04d6a586-789e-47a9-85d1-d3c091ff5f68\") " pod="openshift-marketplace/redhat-operators-42jjb" Oct 11 03:10:19 crc kubenswrapper[4743]: I1011 03:10:19.851091 4743 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-jv495\" (UniqueName: \"kubernetes.io/projected/04d6a586-789e-47a9-85d1-d3c091ff5f68-kube-api-access-jv495\") pod \"redhat-operators-42jjb\" (UID: \"04d6a586-789e-47a9-85d1-d3c091ff5f68\") " pod="openshift-marketplace/redhat-operators-42jjb" Oct 11 03:10:19 crc kubenswrapper[4743]: I1011 03:10:19.851253 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04d6a586-789e-47a9-85d1-d3c091ff5f68-catalog-content\") pod \"redhat-operators-42jjb\" (UID: \"04d6a586-789e-47a9-85d1-d3c091ff5f68\") " pod="openshift-marketplace/redhat-operators-42jjb" Oct 11 03:10:19 crc kubenswrapper[4743]: I1011 03:10:19.851297 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04d6a586-789e-47a9-85d1-d3c091ff5f68-utilities\") pod \"redhat-operators-42jjb\" (UID: \"04d6a586-789e-47a9-85d1-d3c091ff5f68\") " pod="openshift-marketplace/redhat-operators-42jjb" Oct 11 03:10:19 crc kubenswrapper[4743]: I1011 03:10:19.852395 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04d6a586-789e-47a9-85d1-d3c091ff5f68-utilities\") pod \"redhat-operators-42jjb\" (UID: \"04d6a586-789e-47a9-85d1-d3c091ff5f68\") " pod="openshift-marketplace/redhat-operators-42jjb" Oct 11 03:10:19 crc kubenswrapper[4743]: I1011 03:10:19.852391 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04d6a586-789e-47a9-85d1-d3c091ff5f68-catalog-content\") pod \"redhat-operators-42jjb\" (UID: \"04d6a586-789e-47a9-85d1-d3c091ff5f68\") " pod="openshift-marketplace/redhat-operators-42jjb" Oct 11 03:10:19 crc kubenswrapper[4743]: I1011 03:10:19.877034 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jv495\" (UniqueName: 
\"kubernetes.io/projected/04d6a586-789e-47a9-85d1-d3c091ff5f68-kube-api-access-jv495\") pod \"redhat-operators-42jjb\" (UID: \"04d6a586-789e-47a9-85d1-d3c091ff5f68\") " pod="openshift-marketplace/redhat-operators-42jjb" Oct 11 03:10:20 crc kubenswrapper[4743]: I1011 03:10:20.003896 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-42jjb" Oct 11 03:10:20 crc kubenswrapper[4743]: I1011 03:10:20.511246 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-42jjb"] Oct 11 03:10:20 crc kubenswrapper[4743]: I1011 03:10:20.538753 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-42jjb" event={"ID":"04d6a586-789e-47a9-85d1-d3c091ff5f68","Type":"ContainerStarted","Data":"2e5cee0d0ccb0a8d1e9216b9b2824f1315b89f1dfefd3cf1ac237bff669bd05b"} Oct 11 03:10:21 crc kubenswrapper[4743]: I1011 03:10:21.551211 4743 generic.go:334] "Generic (PLEG): container finished" podID="04d6a586-789e-47a9-85d1-d3c091ff5f68" containerID="42a5f23d1d220d5c79dd71593dbb30a2816a16c5a29e56cbc034de8407b0ca9d" exitCode=0 Oct 11 03:10:21 crc kubenswrapper[4743]: I1011 03:10:21.551701 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-42jjb" event={"ID":"04d6a586-789e-47a9-85d1-d3c091ff5f68","Type":"ContainerDied","Data":"42a5f23d1d220d5c79dd71593dbb30a2816a16c5a29e56cbc034de8407b0ca9d"} Oct 11 03:10:22 crc kubenswrapper[4743]: I1011 03:10:22.565130 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-42jjb" event={"ID":"04d6a586-789e-47a9-85d1-d3c091ff5f68","Type":"ContainerStarted","Data":"ec2a6b42eb9af786e726f27068d718e078ad6c39b566be5a81feb6eef40a77c3"} Oct 11 03:10:29 crc kubenswrapper[4743]: I1011 03:10:29.635820 4743 generic.go:334] "Generic (PLEG): container finished" podID="04d6a586-789e-47a9-85d1-d3c091ff5f68" 
containerID="ec2a6b42eb9af786e726f27068d718e078ad6c39b566be5a81feb6eef40a77c3" exitCode=0 Oct 11 03:10:29 crc kubenswrapper[4743]: I1011 03:10:29.635935 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-42jjb" event={"ID":"04d6a586-789e-47a9-85d1-d3c091ff5f68","Type":"ContainerDied","Data":"ec2a6b42eb9af786e726f27068d718e078ad6c39b566be5a81feb6eef40a77c3"} Oct 11 03:10:30 crc kubenswrapper[4743]: I1011 03:10:30.646960 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-42jjb" event={"ID":"04d6a586-789e-47a9-85d1-d3c091ff5f68","Type":"ContainerStarted","Data":"348dacb6bd5c633c56d27258edae7df418b21809625db250c8b312e8ddfda154"} Oct 11 03:10:30 crc kubenswrapper[4743]: I1011 03:10:30.671158 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-42jjb" podStartSLOduration=3.018544123 podStartE2EDuration="11.671140483s" podCreationTimestamp="2025-10-11 03:10:19 +0000 UTC" firstStartedPulling="2025-10-11 03:10:21.557510568 +0000 UTC m=+8316.210490965" lastFinishedPulling="2025-10-11 03:10:30.210106928 +0000 UTC m=+8324.863087325" observedRunningTime="2025-10-11 03:10:30.663981016 +0000 UTC m=+8325.316961413" watchObservedRunningTime="2025-10-11 03:10:30.671140483 +0000 UTC m=+8325.324120880" Oct 11 03:10:40 crc kubenswrapper[4743]: I1011 03:10:40.004025 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-42jjb" Oct 11 03:10:40 crc kubenswrapper[4743]: I1011 03:10:40.004642 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-42jjb" Oct 11 03:10:40 crc kubenswrapper[4743]: I1011 03:10:40.050286 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-42jjb" Oct 11 03:10:40 crc kubenswrapper[4743]: I1011 03:10:40.867929 4743 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-42jjb" Oct 11 03:10:40 crc kubenswrapper[4743]: I1011 03:10:40.914752 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-42jjb"] Oct 11 03:10:42 crc kubenswrapper[4743]: I1011 03:10:42.773048 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-42jjb" podUID="04d6a586-789e-47a9-85d1-d3c091ff5f68" containerName="registry-server" containerID="cri-o://348dacb6bd5c633c56d27258edae7df418b21809625db250c8b312e8ddfda154" gracePeriod=2 Oct 11 03:10:43 crc kubenswrapper[4743]: I1011 03:10:43.331199 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-42jjb" Oct 11 03:10:43 crc kubenswrapper[4743]: I1011 03:10:43.468814 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jv495\" (UniqueName: \"kubernetes.io/projected/04d6a586-789e-47a9-85d1-d3c091ff5f68-kube-api-access-jv495\") pod \"04d6a586-789e-47a9-85d1-d3c091ff5f68\" (UID: \"04d6a586-789e-47a9-85d1-d3c091ff5f68\") " Oct 11 03:10:43 crc kubenswrapper[4743]: I1011 03:10:43.469183 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04d6a586-789e-47a9-85d1-d3c091ff5f68-catalog-content\") pod \"04d6a586-789e-47a9-85d1-d3c091ff5f68\" (UID: \"04d6a586-789e-47a9-85d1-d3c091ff5f68\") " Oct 11 03:10:43 crc kubenswrapper[4743]: I1011 03:10:43.469224 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04d6a586-789e-47a9-85d1-d3c091ff5f68-utilities\") pod \"04d6a586-789e-47a9-85d1-d3c091ff5f68\" (UID: \"04d6a586-789e-47a9-85d1-d3c091ff5f68\") " Oct 11 03:10:43 crc kubenswrapper[4743]: I1011 03:10:43.470033 4743 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04d6a586-789e-47a9-85d1-d3c091ff5f68-utilities" (OuterVolumeSpecName: "utilities") pod "04d6a586-789e-47a9-85d1-d3c091ff5f68" (UID: "04d6a586-789e-47a9-85d1-d3c091ff5f68"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 03:10:43 crc kubenswrapper[4743]: I1011 03:10:43.478184 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04d6a586-789e-47a9-85d1-d3c091ff5f68-kube-api-access-jv495" (OuterVolumeSpecName: "kube-api-access-jv495") pod "04d6a586-789e-47a9-85d1-d3c091ff5f68" (UID: "04d6a586-789e-47a9-85d1-d3c091ff5f68"). InnerVolumeSpecName "kube-api-access-jv495". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 03:10:43 crc kubenswrapper[4743]: I1011 03:10:43.553021 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04d6a586-789e-47a9-85d1-d3c091ff5f68-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "04d6a586-789e-47a9-85d1-d3c091ff5f68" (UID: "04d6a586-789e-47a9-85d1-d3c091ff5f68"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 03:10:43 crc kubenswrapper[4743]: I1011 03:10:43.572205 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04d6a586-789e-47a9-85d1-d3c091ff5f68-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 11 03:10:43 crc kubenswrapper[4743]: I1011 03:10:43.572446 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04d6a586-789e-47a9-85d1-d3c091ff5f68-utilities\") on node \"crc\" DevicePath \"\"" Oct 11 03:10:43 crc kubenswrapper[4743]: I1011 03:10:43.572516 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jv495\" (UniqueName: \"kubernetes.io/projected/04d6a586-789e-47a9-85d1-d3c091ff5f68-kube-api-access-jv495\") on node \"crc\" DevicePath \"\"" Oct 11 03:10:43 crc kubenswrapper[4743]: I1011 03:10:43.786079 4743 generic.go:334] "Generic (PLEG): container finished" podID="04d6a586-789e-47a9-85d1-d3c091ff5f68" containerID="348dacb6bd5c633c56d27258edae7df418b21809625db250c8b312e8ddfda154" exitCode=0 Oct 11 03:10:43 crc kubenswrapper[4743]: I1011 03:10:43.786128 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-42jjb" event={"ID":"04d6a586-789e-47a9-85d1-d3c091ff5f68","Type":"ContainerDied","Data":"348dacb6bd5c633c56d27258edae7df418b21809625db250c8b312e8ddfda154"} Oct 11 03:10:43 crc kubenswrapper[4743]: I1011 03:10:43.786142 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-42jjb" Oct 11 03:10:43 crc kubenswrapper[4743]: I1011 03:10:43.786157 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-42jjb" event={"ID":"04d6a586-789e-47a9-85d1-d3c091ff5f68","Type":"ContainerDied","Data":"2e5cee0d0ccb0a8d1e9216b9b2824f1315b89f1dfefd3cf1ac237bff669bd05b"} Oct 11 03:10:43 crc kubenswrapper[4743]: I1011 03:10:43.786175 4743 scope.go:117] "RemoveContainer" containerID="348dacb6bd5c633c56d27258edae7df418b21809625db250c8b312e8ddfda154" Oct 11 03:10:43 crc kubenswrapper[4743]: I1011 03:10:43.805028 4743 scope.go:117] "RemoveContainer" containerID="ec2a6b42eb9af786e726f27068d718e078ad6c39b566be5a81feb6eef40a77c3" Oct 11 03:10:43 crc kubenswrapper[4743]: I1011 03:10:43.823523 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-42jjb"] Oct 11 03:10:43 crc kubenswrapper[4743]: I1011 03:10:43.832262 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-42jjb"] Oct 11 03:10:43 crc kubenswrapper[4743]: I1011 03:10:43.849351 4743 scope.go:117] "RemoveContainer" containerID="42a5f23d1d220d5c79dd71593dbb30a2816a16c5a29e56cbc034de8407b0ca9d" Oct 11 03:10:43 crc kubenswrapper[4743]: I1011 03:10:43.890034 4743 scope.go:117] "RemoveContainer" containerID="348dacb6bd5c633c56d27258edae7df418b21809625db250c8b312e8ddfda154" Oct 11 03:10:43 crc kubenswrapper[4743]: E1011 03:10:43.890458 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"348dacb6bd5c633c56d27258edae7df418b21809625db250c8b312e8ddfda154\": container with ID starting with 348dacb6bd5c633c56d27258edae7df418b21809625db250c8b312e8ddfda154 not found: ID does not exist" containerID="348dacb6bd5c633c56d27258edae7df418b21809625db250c8b312e8ddfda154" Oct 11 03:10:43 crc kubenswrapper[4743]: I1011 03:10:43.890491 4743 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"348dacb6bd5c633c56d27258edae7df418b21809625db250c8b312e8ddfda154"} err="failed to get container status \"348dacb6bd5c633c56d27258edae7df418b21809625db250c8b312e8ddfda154\": rpc error: code = NotFound desc = could not find container \"348dacb6bd5c633c56d27258edae7df418b21809625db250c8b312e8ddfda154\": container with ID starting with 348dacb6bd5c633c56d27258edae7df418b21809625db250c8b312e8ddfda154 not found: ID does not exist" Oct 11 03:10:43 crc kubenswrapper[4743]: I1011 03:10:43.890514 4743 scope.go:117] "RemoveContainer" containerID="ec2a6b42eb9af786e726f27068d718e078ad6c39b566be5a81feb6eef40a77c3" Oct 11 03:10:43 crc kubenswrapper[4743]: E1011 03:10:43.890923 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec2a6b42eb9af786e726f27068d718e078ad6c39b566be5a81feb6eef40a77c3\": container with ID starting with ec2a6b42eb9af786e726f27068d718e078ad6c39b566be5a81feb6eef40a77c3 not found: ID does not exist" containerID="ec2a6b42eb9af786e726f27068d718e078ad6c39b566be5a81feb6eef40a77c3" Oct 11 03:10:43 crc kubenswrapper[4743]: I1011 03:10:43.890977 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec2a6b42eb9af786e726f27068d718e078ad6c39b566be5a81feb6eef40a77c3"} err="failed to get container status \"ec2a6b42eb9af786e726f27068d718e078ad6c39b566be5a81feb6eef40a77c3\": rpc error: code = NotFound desc = could not find container \"ec2a6b42eb9af786e726f27068d718e078ad6c39b566be5a81feb6eef40a77c3\": container with ID starting with ec2a6b42eb9af786e726f27068d718e078ad6c39b566be5a81feb6eef40a77c3 not found: ID does not exist" Oct 11 03:10:43 crc kubenswrapper[4743]: I1011 03:10:43.891004 4743 scope.go:117] "RemoveContainer" containerID="42a5f23d1d220d5c79dd71593dbb30a2816a16c5a29e56cbc034de8407b0ca9d" Oct 11 03:10:43 crc kubenswrapper[4743]: E1011 
03:10:43.891604 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42a5f23d1d220d5c79dd71593dbb30a2816a16c5a29e56cbc034de8407b0ca9d\": container with ID starting with 42a5f23d1d220d5c79dd71593dbb30a2816a16c5a29e56cbc034de8407b0ca9d not found: ID does not exist" containerID="42a5f23d1d220d5c79dd71593dbb30a2816a16c5a29e56cbc034de8407b0ca9d" Oct 11 03:10:43 crc kubenswrapper[4743]: I1011 03:10:43.891633 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42a5f23d1d220d5c79dd71593dbb30a2816a16c5a29e56cbc034de8407b0ca9d"} err="failed to get container status \"42a5f23d1d220d5c79dd71593dbb30a2816a16c5a29e56cbc034de8407b0ca9d\": rpc error: code = NotFound desc = could not find container \"42a5f23d1d220d5c79dd71593dbb30a2816a16c5a29e56cbc034de8407b0ca9d\": container with ID starting with 42a5f23d1d220d5c79dd71593dbb30a2816a16c5a29e56cbc034de8407b0ca9d not found: ID does not exist" Oct 11 03:10:44 crc kubenswrapper[4743]: I1011 03:10:44.105088 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04d6a586-789e-47a9-85d1-d3c091ff5f68" path="/var/lib/kubelet/pods/04d6a586-789e-47a9-85d1-d3c091ff5f68/volumes" Oct 11 03:11:14 crc kubenswrapper[4743]: I1011 03:11:14.458023 4743 patch_prober.go:28] interesting pod/machine-config-daemon-cvm72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 03:11:14 crc kubenswrapper[4743]: I1011 03:11:14.458620 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Oct 11 03:11:39 crc kubenswrapper[4743]: I1011 03:11:39.409187 4743 generic.go:334] "Generic (PLEG): container finished" podID="f7694f07-e76c-472a-9ac4-7c606ca08f28" containerID="9fa23bd7fa6e38fcaee07f378bf8e4521ce45a7ee0dd7b32573a76ac02e46437" exitCode=0 Oct 11 03:11:39 crc kubenswrapper[4743]: I1011 03:11:39.409285 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rvqc2/must-gather-ns9w6" event={"ID":"f7694f07-e76c-472a-9ac4-7c606ca08f28","Type":"ContainerDied","Data":"9fa23bd7fa6e38fcaee07f378bf8e4521ce45a7ee0dd7b32573a76ac02e46437"} Oct 11 03:11:39 crc kubenswrapper[4743]: I1011 03:11:39.411262 4743 scope.go:117] "RemoveContainer" containerID="9fa23bd7fa6e38fcaee07f378bf8e4521ce45a7ee0dd7b32573a76ac02e46437" Oct 11 03:11:40 crc kubenswrapper[4743]: I1011 03:11:40.302048 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-rvqc2_must-gather-ns9w6_f7694f07-e76c-472a-9ac4-7c606ca08f28/gather/0.log" Oct 11 03:11:44 crc kubenswrapper[4743]: I1011 03:11:44.458722 4743 patch_prober.go:28] interesting pod/machine-config-daemon-cvm72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 03:11:44 crc kubenswrapper[4743]: I1011 03:11:44.459403 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 03:11:53 crc kubenswrapper[4743]: I1011 03:11:53.043021 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-rvqc2/must-gather-ns9w6"] Oct 11 03:11:53 crc kubenswrapper[4743]: I1011 03:11:53.045584 4743 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-rvqc2/must-gather-ns9w6" podUID="f7694f07-e76c-472a-9ac4-7c606ca08f28" containerName="copy" containerID="cri-o://ede8ed1b5cfbb64a76ed455f957807cfe6ea6b7af9c3977ae980a81eb9fe7b79" gracePeriod=2 Oct 11 03:11:53 crc kubenswrapper[4743]: I1011 03:11:53.069931 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-rvqc2/must-gather-ns9w6"] Oct 11 03:11:53 crc kubenswrapper[4743]: I1011 03:11:53.556068 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-rvqc2_must-gather-ns9w6_f7694f07-e76c-472a-9ac4-7c606ca08f28/copy/0.log" Oct 11 03:11:53 crc kubenswrapper[4743]: I1011 03:11:53.557078 4743 generic.go:334] "Generic (PLEG): container finished" podID="f7694f07-e76c-472a-9ac4-7c606ca08f28" containerID="ede8ed1b5cfbb64a76ed455f957807cfe6ea6b7af9c3977ae980a81eb9fe7b79" exitCode=143 Oct 11 03:11:53 crc kubenswrapper[4743]: I1011 03:11:53.557133 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="418da09532cfda5c29f94fa54289483425cd92f875fe47149000c7710f6e5b99" Oct 11 03:11:53 crc kubenswrapper[4743]: I1011 03:11:53.627910 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-rvqc2_must-gather-ns9w6_f7694f07-e76c-472a-9ac4-7c606ca08f28/copy/0.log" Oct 11 03:11:53 crc kubenswrapper[4743]: I1011 03:11:53.628297 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rvqc2/must-gather-ns9w6" Oct 11 03:11:53 crc kubenswrapper[4743]: I1011 03:11:53.774908 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4p5r\" (UniqueName: \"kubernetes.io/projected/f7694f07-e76c-472a-9ac4-7c606ca08f28-kube-api-access-x4p5r\") pod \"f7694f07-e76c-472a-9ac4-7c606ca08f28\" (UID: \"f7694f07-e76c-472a-9ac4-7c606ca08f28\") " Oct 11 03:11:53 crc kubenswrapper[4743]: I1011 03:11:53.775121 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f7694f07-e76c-472a-9ac4-7c606ca08f28-must-gather-output\") pod \"f7694f07-e76c-472a-9ac4-7c606ca08f28\" (UID: \"f7694f07-e76c-472a-9ac4-7c606ca08f28\") " Oct 11 03:11:53 crc kubenswrapper[4743]: I1011 03:11:53.782052 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7694f07-e76c-472a-9ac4-7c606ca08f28-kube-api-access-x4p5r" (OuterVolumeSpecName: "kube-api-access-x4p5r") pod "f7694f07-e76c-472a-9ac4-7c606ca08f28" (UID: "f7694f07-e76c-472a-9ac4-7c606ca08f28"). InnerVolumeSpecName "kube-api-access-x4p5r". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 03:11:53 crc kubenswrapper[4743]: I1011 03:11:53.878329 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4p5r\" (UniqueName: \"kubernetes.io/projected/f7694f07-e76c-472a-9ac4-7c606ca08f28-kube-api-access-x4p5r\") on node \"crc\" DevicePath \"\"" Oct 11 03:11:53 crc kubenswrapper[4743]: I1011 03:11:53.966300 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7694f07-e76c-472a-9ac4-7c606ca08f28-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "f7694f07-e76c-472a-9ac4-7c606ca08f28" (UID: "f7694f07-e76c-472a-9ac4-7c606ca08f28"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 03:11:53 crc kubenswrapper[4743]: I1011 03:11:53.982348 4743 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f7694f07-e76c-472a-9ac4-7c606ca08f28-must-gather-output\") on node \"crc\" DevicePath \"\"" Oct 11 03:11:54 crc kubenswrapper[4743]: I1011 03:11:54.106300 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7694f07-e76c-472a-9ac4-7c606ca08f28" path="/var/lib/kubelet/pods/f7694f07-e76c-472a-9ac4-7c606ca08f28/volumes" Oct 11 03:11:54 crc kubenswrapper[4743]: I1011 03:11:54.564982 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rvqc2/must-gather-ns9w6" Oct 11 03:11:59 crc kubenswrapper[4743]: I1011 03:11:59.986675 4743 scope.go:117] "RemoveContainer" containerID="ede8ed1b5cfbb64a76ed455f957807cfe6ea6b7af9c3977ae980a81eb9fe7b79" Oct 11 03:12:00 crc kubenswrapper[4743]: I1011 03:12:00.010533 4743 scope.go:117] "RemoveContainer" containerID="3327c6af83fa2b5d64c505bc8644955fea821ba5bd794bbfd4d631229ab3e40d" Oct 11 03:12:00 crc kubenswrapper[4743]: I1011 03:12:00.034165 4743 scope.go:117] "RemoveContainer" containerID="9fa23bd7fa6e38fcaee07f378bf8e4521ce45a7ee0dd7b32573a76ac02e46437" Oct 11 03:12:08 crc kubenswrapper[4743]: I1011 03:12:08.051468 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9t6gw"] Oct 11 03:12:08 crc kubenswrapper[4743]: E1011 03:12:08.053122 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04d6a586-789e-47a9-85d1-d3c091ff5f68" containerName="extract-content" Oct 11 03:12:08 crc kubenswrapper[4743]: I1011 03:12:08.053140 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="04d6a586-789e-47a9-85d1-d3c091ff5f68" containerName="extract-content" Oct 11 03:12:08 crc kubenswrapper[4743]: E1011 03:12:08.053163 4743 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="f7694f07-e76c-472a-9ac4-7c606ca08f28" containerName="copy" Oct 11 03:12:08 crc kubenswrapper[4743]: I1011 03:12:08.053169 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7694f07-e76c-472a-9ac4-7c606ca08f28" containerName="copy" Oct 11 03:12:08 crc kubenswrapper[4743]: E1011 03:12:08.053185 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04d6a586-789e-47a9-85d1-d3c091ff5f68" containerName="registry-server" Oct 11 03:12:08 crc kubenswrapper[4743]: I1011 03:12:08.053193 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="04d6a586-789e-47a9-85d1-d3c091ff5f68" containerName="registry-server" Oct 11 03:12:08 crc kubenswrapper[4743]: E1011 03:12:08.053213 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7694f07-e76c-472a-9ac4-7c606ca08f28" containerName="gather" Oct 11 03:12:08 crc kubenswrapper[4743]: I1011 03:12:08.053222 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7694f07-e76c-472a-9ac4-7c606ca08f28" containerName="gather" Oct 11 03:12:08 crc kubenswrapper[4743]: E1011 03:12:08.053268 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04d6a586-789e-47a9-85d1-d3c091ff5f68" containerName="extract-utilities" Oct 11 03:12:08 crc kubenswrapper[4743]: I1011 03:12:08.053275 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="04d6a586-789e-47a9-85d1-d3c091ff5f68" containerName="extract-utilities" Oct 11 03:12:08 crc kubenswrapper[4743]: I1011 03:12:08.053513 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7694f07-e76c-472a-9ac4-7c606ca08f28" containerName="gather" Oct 11 03:12:08 crc kubenswrapper[4743]: I1011 03:12:08.053544 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7694f07-e76c-472a-9ac4-7c606ca08f28" containerName="copy" Oct 11 03:12:08 crc kubenswrapper[4743]: I1011 03:12:08.053554 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="04d6a586-789e-47a9-85d1-d3c091ff5f68" 
containerName="registry-server" Oct 11 03:12:08 crc kubenswrapper[4743]: I1011 03:12:08.055554 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9t6gw" Oct 11 03:12:08 crc kubenswrapper[4743]: I1011 03:12:08.085443 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9t6gw"] Oct 11 03:12:08 crc kubenswrapper[4743]: I1011 03:12:08.135359 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d263a06d-37f7-439f-b6f6-fd984922a05b-catalog-content\") pod \"redhat-marketplace-9t6gw\" (UID: \"d263a06d-37f7-439f-b6f6-fd984922a05b\") " pod="openshift-marketplace/redhat-marketplace-9t6gw" Oct 11 03:12:08 crc kubenswrapper[4743]: I1011 03:12:08.135558 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d263a06d-37f7-439f-b6f6-fd984922a05b-utilities\") pod \"redhat-marketplace-9t6gw\" (UID: \"d263a06d-37f7-439f-b6f6-fd984922a05b\") " pod="openshift-marketplace/redhat-marketplace-9t6gw" Oct 11 03:12:08 crc kubenswrapper[4743]: I1011 03:12:08.135594 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7g56q\" (UniqueName: \"kubernetes.io/projected/d263a06d-37f7-439f-b6f6-fd984922a05b-kube-api-access-7g56q\") pod \"redhat-marketplace-9t6gw\" (UID: \"d263a06d-37f7-439f-b6f6-fd984922a05b\") " pod="openshift-marketplace/redhat-marketplace-9t6gw" Oct 11 03:12:08 crc kubenswrapper[4743]: I1011 03:12:08.237741 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d263a06d-37f7-439f-b6f6-fd984922a05b-catalog-content\") pod \"redhat-marketplace-9t6gw\" (UID: \"d263a06d-37f7-439f-b6f6-fd984922a05b\") " 
pod="openshift-marketplace/redhat-marketplace-9t6gw" Oct 11 03:12:08 crc kubenswrapper[4743]: I1011 03:12:08.237894 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d263a06d-37f7-439f-b6f6-fd984922a05b-utilities\") pod \"redhat-marketplace-9t6gw\" (UID: \"d263a06d-37f7-439f-b6f6-fd984922a05b\") " pod="openshift-marketplace/redhat-marketplace-9t6gw" Oct 11 03:12:08 crc kubenswrapper[4743]: I1011 03:12:08.237939 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7g56q\" (UniqueName: \"kubernetes.io/projected/d263a06d-37f7-439f-b6f6-fd984922a05b-kube-api-access-7g56q\") pod \"redhat-marketplace-9t6gw\" (UID: \"d263a06d-37f7-439f-b6f6-fd984922a05b\") " pod="openshift-marketplace/redhat-marketplace-9t6gw" Oct 11 03:12:08 crc kubenswrapper[4743]: I1011 03:12:08.238438 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d263a06d-37f7-439f-b6f6-fd984922a05b-catalog-content\") pod \"redhat-marketplace-9t6gw\" (UID: \"d263a06d-37f7-439f-b6f6-fd984922a05b\") " pod="openshift-marketplace/redhat-marketplace-9t6gw" Oct 11 03:12:08 crc kubenswrapper[4743]: I1011 03:12:08.238553 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d263a06d-37f7-439f-b6f6-fd984922a05b-utilities\") pod \"redhat-marketplace-9t6gw\" (UID: \"d263a06d-37f7-439f-b6f6-fd984922a05b\") " pod="openshift-marketplace/redhat-marketplace-9t6gw" Oct 11 03:12:08 crc kubenswrapper[4743]: I1011 03:12:08.261983 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7g56q\" (UniqueName: \"kubernetes.io/projected/d263a06d-37f7-439f-b6f6-fd984922a05b-kube-api-access-7g56q\") pod \"redhat-marketplace-9t6gw\" (UID: \"d263a06d-37f7-439f-b6f6-fd984922a05b\") " 
pod="openshift-marketplace/redhat-marketplace-9t6gw" Oct 11 03:12:08 crc kubenswrapper[4743]: I1011 03:12:08.385765 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9t6gw" Oct 11 03:12:08 crc kubenswrapper[4743]: I1011 03:12:08.865183 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9t6gw"] Oct 11 03:12:08 crc kubenswrapper[4743]: W1011 03:12:08.872357 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd263a06d_37f7_439f_b6f6_fd984922a05b.slice/crio-a7426137d1321cb67b339ed53e6a74912ca3df3211199c46f5761ce39576d540 WatchSource:0}: Error finding container a7426137d1321cb67b339ed53e6a74912ca3df3211199c46f5761ce39576d540: Status 404 returned error can't find the container with id a7426137d1321cb67b339ed53e6a74912ca3df3211199c46f5761ce39576d540 Oct 11 03:12:09 crc kubenswrapper[4743]: I1011 03:12:09.718633 4743 generic.go:334] "Generic (PLEG): container finished" podID="d263a06d-37f7-439f-b6f6-fd984922a05b" containerID="12638d3ec088659161f9afcafd9ddf652e9c745b314a0c34b3b0ae6a26529bb2" exitCode=0 Oct 11 03:12:09 crc kubenswrapper[4743]: I1011 03:12:09.719189 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9t6gw" event={"ID":"d263a06d-37f7-439f-b6f6-fd984922a05b","Type":"ContainerDied","Data":"12638d3ec088659161f9afcafd9ddf652e9c745b314a0c34b3b0ae6a26529bb2"} Oct 11 03:12:09 crc kubenswrapper[4743]: I1011 03:12:09.719664 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9t6gw" event={"ID":"d263a06d-37f7-439f-b6f6-fd984922a05b","Type":"ContainerStarted","Data":"a7426137d1321cb67b339ed53e6a74912ca3df3211199c46f5761ce39576d540"} Oct 11 03:12:10 crc kubenswrapper[4743]: I1011 03:12:10.735766 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-9t6gw" event={"ID":"d263a06d-37f7-439f-b6f6-fd984922a05b","Type":"ContainerStarted","Data":"bbca5f9c82196df70873d0a554dddbe5e46bb7a356b479f686420fa7f4891295"} Oct 11 03:12:11 crc kubenswrapper[4743]: I1011 03:12:11.749312 4743 generic.go:334] "Generic (PLEG): container finished" podID="d263a06d-37f7-439f-b6f6-fd984922a05b" containerID="bbca5f9c82196df70873d0a554dddbe5e46bb7a356b479f686420fa7f4891295" exitCode=0 Oct 11 03:12:11 crc kubenswrapper[4743]: I1011 03:12:11.749427 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9t6gw" event={"ID":"d263a06d-37f7-439f-b6f6-fd984922a05b","Type":"ContainerDied","Data":"bbca5f9c82196df70873d0a554dddbe5e46bb7a356b479f686420fa7f4891295"} Oct 11 03:12:12 crc kubenswrapper[4743]: I1011 03:12:12.760882 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9t6gw" event={"ID":"d263a06d-37f7-439f-b6f6-fd984922a05b","Type":"ContainerStarted","Data":"638309eac22eab7c935575d59eea475bcc25786b58bfa125be2a170d2fc128f4"} Oct 11 03:12:14 crc kubenswrapper[4743]: I1011 03:12:14.458542 4743 patch_prober.go:28] interesting pod/machine-config-daemon-cvm72 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 11 03:12:14 crc kubenswrapper[4743]: I1011 03:12:14.458912 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 11 03:12:14 crc kubenswrapper[4743]: I1011 03:12:14.458964 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-cvm72" Oct 11 03:12:14 crc kubenswrapper[4743]: I1011 03:12:14.459889 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0a303352eb43c1d43d7696b12e7ddb4c2224822a23445b7dd5541ffcab230f46"} pod="openshift-machine-config-operator/machine-config-daemon-cvm72" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 11 03:12:14 crc kubenswrapper[4743]: I1011 03:12:14.459958 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" containerName="machine-config-daemon" containerID="cri-o://0a303352eb43c1d43d7696b12e7ddb4c2224822a23445b7dd5541ffcab230f46" gracePeriod=600 Oct 11 03:12:14 crc kubenswrapper[4743]: E1011 03:12:14.605252 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 03:12:14 crc kubenswrapper[4743]: I1011 03:12:14.784976 4743 generic.go:334] "Generic (PLEG): container finished" podID="add92263-e252-446b-95de-092585b4357f" containerID="0a303352eb43c1d43d7696b12e7ddb4c2224822a23445b7dd5541ffcab230f46" exitCode=0 Oct 11 03:12:14 crc kubenswrapper[4743]: I1011 03:12:14.785026 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" event={"ID":"add92263-e252-446b-95de-092585b4357f","Type":"ContainerDied","Data":"0a303352eb43c1d43d7696b12e7ddb4c2224822a23445b7dd5541ffcab230f46"} Oct 11 03:12:14 crc 
kubenswrapper[4743]: I1011 03:12:14.785066 4743 scope.go:117] "RemoveContainer" containerID="62fef52b9bc704a24e77d3838ec6b6ea99e34891429f790a37c7da87d4fa7a3f" Oct 11 03:12:14 crc kubenswrapper[4743]: I1011 03:12:14.785996 4743 scope.go:117] "RemoveContainer" containerID="0a303352eb43c1d43d7696b12e7ddb4c2224822a23445b7dd5541ffcab230f46" Oct 11 03:12:14 crc kubenswrapper[4743]: E1011 03:12:14.786354 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 03:12:14 crc kubenswrapper[4743]: I1011 03:12:14.810749 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9t6gw" podStartSLOduration=4.342155353 podStartE2EDuration="6.810730625s" podCreationTimestamp="2025-10-11 03:12:08 +0000 UTC" firstStartedPulling="2025-10-11 03:12:09.720977882 +0000 UTC m=+8424.373958279" lastFinishedPulling="2025-10-11 03:12:12.189553114 +0000 UTC m=+8426.842533551" observedRunningTime="2025-10-11 03:12:12.780723642 +0000 UTC m=+8427.433704039" watchObservedRunningTime="2025-10-11 03:12:14.810730625 +0000 UTC m=+8429.463711023" Oct 11 03:12:18 crc kubenswrapper[4743]: I1011 03:12:18.386772 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9t6gw" Oct 11 03:12:18 crc kubenswrapper[4743]: I1011 03:12:18.387149 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9t6gw" Oct 11 03:12:18 crc kubenswrapper[4743]: I1011 03:12:18.465802 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-9t6gw" Oct 11 03:12:18 crc kubenswrapper[4743]: I1011 03:12:18.903895 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9t6gw" Oct 11 03:12:18 crc kubenswrapper[4743]: I1011 03:12:18.970784 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9t6gw"] Oct 11 03:12:20 crc kubenswrapper[4743]: I1011 03:12:20.855700 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9t6gw" podUID="d263a06d-37f7-439f-b6f6-fd984922a05b" containerName="registry-server" containerID="cri-o://638309eac22eab7c935575d59eea475bcc25786b58bfa125be2a170d2fc128f4" gracePeriod=2 Oct 11 03:12:21 crc kubenswrapper[4743]: I1011 03:12:21.335560 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9t6gw" Oct 11 03:12:21 crc kubenswrapper[4743]: I1011 03:12:21.444682 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d263a06d-37f7-439f-b6f6-fd984922a05b-utilities\") pod \"d263a06d-37f7-439f-b6f6-fd984922a05b\" (UID: \"d263a06d-37f7-439f-b6f6-fd984922a05b\") " Oct 11 03:12:21 crc kubenswrapper[4743]: I1011 03:12:21.444950 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7g56q\" (UniqueName: \"kubernetes.io/projected/d263a06d-37f7-439f-b6f6-fd984922a05b-kube-api-access-7g56q\") pod \"d263a06d-37f7-439f-b6f6-fd984922a05b\" (UID: \"d263a06d-37f7-439f-b6f6-fd984922a05b\") " Oct 11 03:12:21 crc kubenswrapper[4743]: I1011 03:12:21.445039 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d263a06d-37f7-439f-b6f6-fd984922a05b-catalog-content\") pod 
\"d263a06d-37f7-439f-b6f6-fd984922a05b\" (UID: \"d263a06d-37f7-439f-b6f6-fd984922a05b\") " Oct 11 03:12:21 crc kubenswrapper[4743]: I1011 03:12:21.445697 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d263a06d-37f7-439f-b6f6-fd984922a05b-utilities" (OuterVolumeSpecName: "utilities") pod "d263a06d-37f7-439f-b6f6-fd984922a05b" (UID: "d263a06d-37f7-439f-b6f6-fd984922a05b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 03:12:21 crc kubenswrapper[4743]: I1011 03:12:21.453106 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d263a06d-37f7-439f-b6f6-fd984922a05b-kube-api-access-7g56q" (OuterVolumeSpecName: "kube-api-access-7g56q") pod "d263a06d-37f7-439f-b6f6-fd984922a05b" (UID: "d263a06d-37f7-439f-b6f6-fd984922a05b"). InnerVolumeSpecName "kube-api-access-7g56q". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 11 03:12:21 crc kubenswrapper[4743]: I1011 03:12:21.457054 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d263a06d-37f7-439f-b6f6-fd984922a05b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d263a06d-37f7-439f-b6f6-fd984922a05b" (UID: "d263a06d-37f7-439f-b6f6-fd984922a05b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 11 03:12:21 crc kubenswrapper[4743]: I1011 03:12:21.547450 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d263a06d-37f7-439f-b6f6-fd984922a05b-utilities\") on node \"crc\" DevicePath \"\"" Oct 11 03:12:21 crc kubenswrapper[4743]: I1011 03:12:21.547488 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7g56q\" (UniqueName: \"kubernetes.io/projected/d263a06d-37f7-439f-b6f6-fd984922a05b-kube-api-access-7g56q\") on node \"crc\" DevicePath \"\"" Oct 11 03:12:21 crc kubenswrapper[4743]: I1011 03:12:21.547501 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d263a06d-37f7-439f-b6f6-fd984922a05b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 11 03:12:21 crc kubenswrapper[4743]: I1011 03:12:21.868122 4743 generic.go:334] "Generic (PLEG): container finished" podID="d263a06d-37f7-439f-b6f6-fd984922a05b" containerID="638309eac22eab7c935575d59eea475bcc25786b58bfa125be2a170d2fc128f4" exitCode=0 Oct 11 03:12:21 crc kubenswrapper[4743]: I1011 03:12:21.868192 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9t6gw" event={"ID":"d263a06d-37f7-439f-b6f6-fd984922a05b","Type":"ContainerDied","Data":"638309eac22eab7c935575d59eea475bcc25786b58bfa125be2a170d2fc128f4"} Oct 11 03:12:21 crc kubenswrapper[4743]: I1011 03:12:21.868253 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9t6gw" event={"ID":"d263a06d-37f7-439f-b6f6-fd984922a05b","Type":"ContainerDied","Data":"a7426137d1321cb67b339ed53e6a74912ca3df3211199c46f5761ce39576d540"} Oct 11 03:12:21 crc kubenswrapper[4743]: I1011 03:12:21.868196 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9t6gw" Oct 11 03:12:21 crc kubenswrapper[4743]: I1011 03:12:21.868277 4743 scope.go:117] "RemoveContainer" containerID="638309eac22eab7c935575d59eea475bcc25786b58bfa125be2a170d2fc128f4" Oct 11 03:12:21 crc kubenswrapper[4743]: I1011 03:12:21.909011 4743 scope.go:117] "RemoveContainer" containerID="bbca5f9c82196df70873d0a554dddbe5e46bb7a356b479f686420fa7f4891295" Oct 11 03:12:21 crc kubenswrapper[4743]: I1011 03:12:21.918785 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9t6gw"] Oct 11 03:12:21 crc kubenswrapper[4743]: I1011 03:12:21.928024 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9t6gw"] Oct 11 03:12:21 crc kubenswrapper[4743]: I1011 03:12:21.935542 4743 scope.go:117] "RemoveContainer" containerID="12638d3ec088659161f9afcafd9ddf652e9c745b314a0c34b3b0ae6a26529bb2" Oct 11 03:12:22 crc kubenswrapper[4743]: I1011 03:12:22.007357 4743 scope.go:117] "RemoveContainer" containerID="638309eac22eab7c935575d59eea475bcc25786b58bfa125be2a170d2fc128f4" Oct 11 03:12:22 crc kubenswrapper[4743]: E1011 03:12:22.008046 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"638309eac22eab7c935575d59eea475bcc25786b58bfa125be2a170d2fc128f4\": container with ID starting with 638309eac22eab7c935575d59eea475bcc25786b58bfa125be2a170d2fc128f4 not found: ID does not exist" containerID="638309eac22eab7c935575d59eea475bcc25786b58bfa125be2a170d2fc128f4" Oct 11 03:12:22 crc kubenswrapper[4743]: I1011 03:12:22.008096 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"638309eac22eab7c935575d59eea475bcc25786b58bfa125be2a170d2fc128f4"} err="failed to get container status \"638309eac22eab7c935575d59eea475bcc25786b58bfa125be2a170d2fc128f4\": rpc error: code = NotFound desc = could not find container 
\"638309eac22eab7c935575d59eea475bcc25786b58bfa125be2a170d2fc128f4\": container with ID starting with 638309eac22eab7c935575d59eea475bcc25786b58bfa125be2a170d2fc128f4 not found: ID does not exist" Oct 11 03:12:22 crc kubenswrapper[4743]: I1011 03:12:22.008124 4743 scope.go:117] "RemoveContainer" containerID="bbca5f9c82196df70873d0a554dddbe5e46bb7a356b479f686420fa7f4891295" Oct 11 03:12:22 crc kubenswrapper[4743]: E1011 03:12:22.008429 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbca5f9c82196df70873d0a554dddbe5e46bb7a356b479f686420fa7f4891295\": container with ID starting with bbca5f9c82196df70873d0a554dddbe5e46bb7a356b479f686420fa7f4891295 not found: ID does not exist" containerID="bbca5f9c82196df70873d0a554dddbe5e46bb7a356b479f686420fa7f4891295" Oct 11 03:12:22 crc kubenswrapper[4743]: I1011 03:12:22.008467 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbca5f9c82196df70873d0a554dddbe5e46bb7a356b479f686420fa7f4891295"} err="failed to get container status \"bbca5f9c82196df70873d0a554dddbe5e46bb7a356b479f686420fa7f4891295\": rpc error: code = NotFound desc = could not find container \"bbca5f9c82196df70873d0a554dddbe5e46bb7a356b479f686420fa7f4891295\": container with ID starting with bbca5f9c82196df70873d0a554dddbe5e46bb7a356b479f686420fa7f4891295 not found: ID does not exist" Oct 11 03:12:22 crc kubenswrapper[4743]: I1011 03:12:22.008493 4743 scope.go:117] "RemoveContainer" containerID="12638d3ec088659161f9afcafd9ddf652e9c745b314a0c34b3b0ae6a26529bb2" Oct 11 03:12:22 crc kubenswrapper[4743]: E1011 03:12:22.008724 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12638d3ec088659161f9afcafd9ddf652e9c745b314a0c34b3b0ae6a26529bb2\": container with ID starting with 12638d3ec088659161f9afcafd9ddf652e9c745b314a0c34b3b0ae6a26529bb2 not found: ID does not exist" 
containerID="12638d3ec088659161f9afcafd9ddf652e9c745b314a0c34b3b0ae6a26529bb2" Oct 11 03:12:22 crc kubenswrapper[4743]: I1011 03:12:22.008749 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12638d3ec088659161f9afcafd9ddf652e9c745b314a0c34b3b0ae6a26529bb2"} err="failed to get container status \"12638d3ec088659161f9afcafd9ddf652e9c745b314a0c34b3b0ae6a26529bb2\": rpc error: code = NotFound desc = could not find container \"12638d3ec088659161f9afcafd9ddf652e9c745b314a0c34b3b0ae6a26529bb2\": container with ID starting with 12638d3ec088659161f9afcafd9ddf652e9c745b314a0c34b3b0ae6a26529bb2 not found: ID does not exist" Oct 11 03:12:22 crc kubenswrapper[4743]: I1011 03:12:22.109155 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d263a06d-37f7-439f-b6f6-fd984922a05b" path="/var/lib/kubelet/pods/d263a06d-37f7-439f-b6f6-fd984922a05b/volumes" Oct 11 03:12:30 crc kubenswrapper[4743]: I1011 03:12:30.092586 4743 scope.go:117] "RemoveContainer" containerID="0a303352eb43c1d43d7696b12e7ddb4c2224822a23445b7dd5541ffcab230f46" Oct 11 03:12:30 crc kubenswrapper[4743]: E1011 03:12:30.093480 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 03:12:42 crc kubenswrapper[4743]: I1011 03:12:42.092506 4743 scope.go:117] "RemoveContainer" containerID="0a303352eb43c1d43d7696b12e7ddb4c2224822a23445b7dd5541ffcab230f46" Oct 11 03:12:42 crc kubenswrapper[4743]: E1011 03:12:42.093346 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 03:12:56 crc kubenswrapper[4743]: I1011 03:12:56.103242 4743 scope.go:117] "RemoveContainer" containerID="0a303352eb43c1d43d7696b12e7ddb4c2224822a23445b7dd5541ffcab230f46" Oct 11 03:12:56 crc kubenswrapper[4743]: E1011 03:12:56.104197 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 03:13:00 crc kubenswrapper[4743]: I1011 03:13:00.197458 4743 scope.go:117] "RemoveContainer" containerID="70a777b6891e48fef92e453fae0a18e8f55e591276bd27ab54f2fb51bf835e7d" Oct 11 03:13:08 crc kubenswrapper[4743]: I1011 03:13:08.092061 4743 scope.go:117] "RemoveContainer" containerID="0a303352eb43c1d43d7696b12e7ddb4c2224822a23445b7dd5541ffcab230f46" Oct 11 03:13:08 crc kubenswrapper[4743]: E1011 03:13:08.092889 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f" Oct 11 03:13:23 crc kubenswrapper[4743]: I1011 03:13:23.092567 4743 scope.go:117] "RemoveContainer" containerID="0a303352eb43c1d43d7696b12e7ddb4c2224822a23445b7dd5541ffcab230f46" Oct 11 03:13:23 crc 
kubenswrapper[4743]: E1011 03:13:23.093900 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cvm72_openshift-machine-config-operator(add92263-e252-446b-95de-092585b4357f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cvm72" podUID="add92263-e252-446b-95de-092585b4357f"